Description
Impulsive Control in Continuous and Discrete-Continuous Systems is an up-to-date introduction to the theory of impulsive control in nonlinear systems. This is a new branch of optimal control theory, closely connected to the theory of hybrid systems. The text introduces the reader to optimal control problems with discontinuous solutions and presents a new and effective method of discontinuous time transformation. With numerous examples, illustrations, and applied problems arising in the area of observation control, the book is suitable as a textbook or reference for a senior or graduate-level course on the subject, as well as a reference for researchers in related fields.
Contents
1. Introduction.
2. Discrete-continuous systems with impulse control.
3. Optimal impulse control problem with restricted number of impulses.
4. Representation of generalized solutions via differential equations with measures.
5. Optimal control problems within the class of generalized solutions.
6. Optimality conditions in control problems within the class of generalized solutions.
7. Observation control problems in discrete-continuous stochastic systems.
8. Appendix: Differential equations with measures.
Bibliography.
Index.