Optimal Impulsive Control explores the class of impulsive dynamic optimization problems: problems that arise because many conventional optimal control problems have no solution in the classical setting, a situation highly relevant to engineering applications. The absence of a classical solution naturally calls for the so-called extension, or relaxation, of the problem, leading to the notion of a generalized solution, which encompasses generalized controls and trajectories; in this book, several extensions of optimal control problems are considered within the framework of optimal impulsive control theory. In this framework, feasible arcs are permitted to have jumps, whereas conventional absolutely continuous trajectories may fail to exist.
The authors draw together various results of their own, centered on necessary conditions of optimality in the form of Pontryagin's maximum principle and on existence theorems, which together form a substantial body of optimal impulsive control theory. They present this theory in a unified framework, introducing the different paradigmatic problems in increasing order of complexity: the extensions progress from the simplest case, given by linear control systems, to the most general case of a fully nonlinear differential control system with state constraints.
Since the mathematical models presented in Optimal Impulsive Control arise in various engineering applications, the book will be of interest to both academic researchers and practising engineers.