Aims & Learning Objectives:
Aims: Four concepts underpin control theory: controllability, observability, stabilizability and optimality. The first two form the focus of the Year 3/4 course on linear control theory; in this course, the latter two notions, stabilizability and optimality, are developed. Together, the courses on linear control theory and nonlinear & optimal control provide a firm foundation for participating in theoretical and practical developments in an active and expanding discipline.
Learning Objectives: To present concepts and results pertaining to robustness, stabilization and optimization of (nonlinear) finite-dimensional control systems in a rigorous manner. Emphasis is placed on optimization, leading to familiarity with both the Bellman-Hamilton-Jacobi approach and the maximum principle of Pontryagin, together with their applications.
Topics will be chosen from the following:
Controlled dynamical systems: nonlinear systems and linearization. Stability and robustness. Stabilization by feedback. Lyapunov-based design methods. Stability radii. Small-gain theorem. Optimal control. Value function. The Bellman-Hamilton-Jacobi equation. Verification theorem. Quadratic-cost control problem for linear systems. Riccati equations. The Pontryagin maximum principle and transversality conditions (a dynamic programming derivation of a restricted version and statement of the general result with applications). Proof of the maximum principle for the linear time-optimal control problem.
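To give a concrete flavour of the quadratic-cost control problem and Riccati equations listed above: for a scalar linear system the algebraic Riccati equation can be solved in closed form. The sketch below is purely illustrative and not part of the course material; the helper name `scalar_lqr` and the example plant are assumptions made for the example.

```python
import math

def scalar_lqr(a, b, q, r):
    """For the scalar system x' = a*x + b*u with cost
    integral of (q*x^2 + r*u^2) dt, solve the scalar algebraic
    Riccati equation  2*a*p - (b**2 / r)*p**2 + q = 0  for the
    stabilizing root p > 0, and return (p, k) where the optimal
    feedback is u = -k*x with k = b*p/r."""
    # Quadratic in p: (b^2/r) p^2 - 2a p - q = 0; take the positive root.
    s = b * b / r
    p = (a + math.sqrt(a * a + s * q)) / s
    k = b * p / r
    return p, k

# Illustrative plant (assumed for the example): an unstable system
# x' = x + u with unit state and control weights.
p, k = scalar_lqr(a=1.0, b=1.0, q=1.0, r=1.0)
# Closed-loop dynamics x' = (a - b*k) x; stabilization requires a - b*k < 0.
closed_loop = 1.0 - 1.0 * k
```

For this plant, p = 1 + sqrt(2), so the optimal gain places the closed-loop pole at 1 - k = -sqrt(2), illustrating how optimality (minimizing the quadratic cost) delivers stabilization as a by-product.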