How do I approach Optimal Control?

You might want to check Lawrence Evans' notes on optimal control: An Introduction to Mathematical Optimal Control Theory [pdf].


My field is mathematical programming, and I tend to look at optimal control as just optimization with ODEs in the constraint set; that is, it is the optimization of dynamic systems. I would start by studying some optimization theory (not linear programs but nonlinear programs, NLPs) and getting an intuitive feel for the motivations behind stationarity and optimality conditions; that will lead naturally into optimal control theory.
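To make "optimization with ODEs in the constraint set" concrete, here is a minimal sketch of direct transcription: a toy problem (of my own choosing, not from any particular text) where we minimize control effort for the scalar dynamics x'(t) = u(t) with x(0) = 0 and x(1) = 1, discretize the ODE with forward Euler, and hand the result to a general-purpose NLP solver. The problem, step count, and solver choice are all assumptions for illustration.

```python
# Direct transcription of a toy optimal control problem into an NLP:
#   minimize  J = integral of u(t)^2 dt  on [0, 1]
#   subject to  x'(t) = u(t),  x(0) = 0,  x(1) = 1.
# The analytic optimum is the constant control u(t) = 1.
import numpy as np
from scipy.optimize import minimize

N = 50             # number of time steps (arbitrary choice)
h = 1.0 / N        # step size on [0, 1]

def objective(z):
    # z packs the trajectory: states x[0..N], then controls u[0..N-1]
    u = z[N + 1:]
    return h * np.sum(u**2)   # Riemann-sum approximation of J

def dynamics_residual(z):
    # Forward-Euler version of x' = u:  x[k+1] - x[k] - h*u[k] = 0
    x, u = z[:N + 1], z[N + 1:]
    return x[1:] - x[:-1] - h * u

constraints = [
    {"type": "eq", "fun": dynamics_residual},        # discretized ODE
    {"type": "eq", "fun": lambda z: z[0]},           # x(0) = 0
    {"type": "eq", "fun": lambda z: z[N] - 1.0},     # x(1) = 1
]

z0 = np.zeros(2 * N + 1)   # initial guess: all zeros
result = minimize(objective, z0, constraints=constraints, method="SLSQP")
u_opt = result.x[N + 1:]
print("max |u - 1| =", np.max(np.abs(u_opt - 1.0)))  # should be ~0
```

The point of the sketch is only that, after discretization, the ODE becomes a block of equality constraints like any other, which is exactly how the NLP view of optimal control works.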

I should mention there is another facet of optimal control, related to control systems. The systems considered there are discrete-time (as opposed to the continuous-time setting of Pontryagin's Maximum Principle, PMP), so they are governed by difference equations instead of differential equations. Examples of optimal control laws in this latter sense are the Linear Quadratic Regulator (LQR), Linear Quadratic Gaussian (LQG) control, and Model Predictive Control (MPC). It is this latter type of optimal control that is actually applied in industry; Pontryagin's principle, while useful for analysis, is generally intractable for real-time application to nontrivial plants.
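For a feel of how small the discrete-time LQR computation is in practice, here is a minimal sketch: solve the discrete algebraic Riccati equation for a plant and form the state-feedback gain. The double-integrator plant and the cost weights are hypothetical choices for illustration.

```python
# Discrete-time LQR for x[k+1] = A x[k] + B u[k],
# minimizing  sum_k  x'Qx + u'Ru.
import numpy as np
from scipy.linalg import solve_discrete_are

dt = 0.1   # sample time (arbitrary)
A = np.array([[1.0, dt],
              [0.0, 1.0]])          # discretized double integrator
B = np.array([[0.5 * dt**2],
              [dt]])

Q = np.eye(2)            # state cost weight
R = np.array([[1.0]])    # control cost weight

# P solves the discrete algebraic Riccati equation
P = solve_discrete_are(A, B, Q, R)

# Optimal feedback law u[k] = -K x[k], with K = (R + B'PB)^{-1} B'PA
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# The closed loop A - BK should be stable (spectral radius < 1)
eigs = np.linalg.eigvals(A - B @ K)
print("K =", K)
print("closed-loop |eigenvalues| =", np.abs(eigs))
```

This offline Riccati solve plus a matrix multiply per step is why LQR (and its receding-horizon cousin MPC) is what actually ships, where applying PMP would mean solving a two-point boundary value problem online.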


I recommend Eduardo Sontag's Mathematical Control Theory without hesitation. It explains the basics of control theory, optimal control included, as mathematicians see it; it is geared towards advanced undergraduates but useful for all. The easier-to-read books are by and for engineers (nothing against them; I'm one), but if you want a mathematical text that gives the whole story, I suggest you look at Sontag's.