Optimal Control Systems

This course serves as an introduction to optimal control theory for linear and nonlinear systems.

TEXT: “Optimal Control Theory: An Introduction,” by D.E. Kirk, Prentice-Hall, Englewood Cliffs, NJ, 1970.

  • Optimal Control Examples
  • Review of Function Optimization
    • Global Extremum
    • Little o and Big O Functions
    • Vector Calculus
    • Constraints and Lagrange Multipliers (first-order conditions summarized after the outline)
  • Calculus of Variations
    • Perturbations in Functions
    • Necessary Conditions
    • First Problem of Calculus of Variations
    • Boundary Condition Problems
    • Piecewise-Smooth Extremals
    • Sufficient Conditions for Local Minima
  • Hamiltonian Formulation (key necessary conditions summarized after the outline)
    • Euler-Lagrange Equations
    • Transversality Conditions
    • Maximum Principle
  • Linear Quadratic Regulator
    • Matrix Riccati Equation (numerical sketch after the outline)
  • Computational Techniques
    • Gradient Method
    • Shooting Methods (numerical sketch after the outline)
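
The “Constraints and Lagrange Multipliers” topic reduces to the standard first-order conditions; the statement below is a generic reminder, not taken from the course notes:

  Minimize f(x) subject to g(x) = 0. With the Lagrangian
      L(x, \lambda) = f(x) + \lambda^T g(x),
  a constrained local extremum x^* with a regular constraint must satisfy
      \nabla_x L(x^*, \lambda^*) = \nabla f(x^*) + \nabla g(x^*)^T \lambda^* = 0,   g(x^*) = 0.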
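
For the “Calculus of Variations” and “Hamiltonian Formulation” topics, the key necessary conditions (stated here for a fixed-endpoint problem, roughly in Kirk's notation; sign conventions vary across texts) are:

  Cost functional:
      J(x) = \int_{t_0}^{t_f} g(x(t), \dot{x}(t), t) \, dt
  Euler-Lagrange equation (necessary condition for an extremal):
      \frac{\partial g}{\partial x} - \frac{d}{dt}\left( \frac{\partial g}{\partial \dot{x}} \right) = 0
  With dynamics \dot{x} = a(x, u, t) and Hamiltonian H(x, u, p, t) = g(x, u, t) + p^T a(x, u, t),
  an optimal trajectory satisfies
      \dot{x}^* = \partial H / \partial p,    \dot{p}^* = -\partial H / \partial x,
      H(x^*, u^*, p^*, t) \le H(x^*, u, p^*, t) for all admissible u.
  (Kirk states the last condition as a minimum principle; with the opposite sign convention on the costate it becomes the classical Pontryagin maximum principle.)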
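
For the “Linear Quadratic Regulator” and “Matrix Riccati Equation” topics, a minimal numerical sketch is given below. It assumes NumPy/SciPy are available and uses a double-integrator plant purely as an illustration; it is not part of the course materials.

  import numpy as np
  from scipy.linalg import solve_continuous_are

  # Double-integrator plant: x1' = x2, x2' = u
  A = np.array([[0.0, 1.0],
                [0.0, 0.0]])
  B = np.array([[0.0],
                [1.0]])
  Q = np.diag([1.0, 1.0])   # state weighting
  R = np.array([[1.0]])     # control weighting

  # P solves the algebraic Riccati equation A'P + PA - PB R^{-1} B'P + Q = 0
  P = solve_continuous_are(A, B, Q, R)

  # Optimal state feedback u = -Kx with K = R^{-1} B' P
  K = np.linalg.solve(R, B.T @ P)
  print("P =", P)
  print("K =", K)

  # The closed-loop matrix A - BK should have all eigenvalues in the open left half-plane
  print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))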
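
For the “Shooting Methods” topic, the sketch below applies simple shooting to the two-point boundary-value problem produced by the necessary conditions for a minimum-energy double-integrator transfer. The model, tolerances, and the use of SciPy's solve_ivp and fsolve are illustrative assumptions, not the course's own code.

  import numpy as np
  from scipy.integrate import solve_ivp
  from scipy.optimize import fsolve

  # Problem: minimize 0.5 * integral of u^2 over [0, 1]
  # subject to x1' = x2, x2' = u, x(0) = (0, 0), x(1) = (1, 0).
  # The necessary conditions give u = -p2, with costates p1' = 0, p2' = -p1.

  def dynamics(t, y):
      x1, x2, p1, p2 = y
      u = -p2                   # stationarity: dH/du = u + p2 = 0
      return [x2, u, 0.0, -p1]  # state and costate equations

  def terminal_error(p0):
      # Integrate forward with guessed initial costates p0 = (p1(0), p2(0))
      sol = solve_ivp(dynamics, (0.0, 1.0), [0.0, 0.0, p0[0], p0[1]], rtol=1e-8)
      return [sol.y[0, -1] - 1.0, sol.y[1, -1] - 0.0]  # terminal-state mismatch

  # Adjust the unknown initial costates until the terminal conditions are met
  p0_star = fsolve(terminal_error, [0.0, 0.0])
  print("initial costates:", p0_star)          # analytically (-12, -6) for this problem
  print("residual:", terminal_error(p0_star))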

Years Taught: Spring '01