Control / Feedback Theory

I am more interested in the engineering perspective on this topic, but I realize that it is fundamentally a very interesting mathematical topic as well. At an introductory level, the two perspectives should be quite similar anyway. So, what are some good introductory texts on control/feedback theory for an advanced undergraduate or early graduate student?

Thanks!


Solution 1:

Edit: The European Journal of Control (2007) published an 11-page article, "In Control, Almost from the Beginning Until the Day After Tomorrow" by Jan C. Willems, that gives you a very good perspective on the history of the field and its present situation:

"We have recently seen a strong growth in the number of applications. Especially model predictive control appears to be a leading circle of ideas here. For my own taste, it has perhaps too little system theory and too much brute force computation in it, but MPC is an area where essentially all aspects of the field, from modeling to optimal control, and from observers to identification and adaptation, are in synergy with computer control and numerical mathematics."

My own view is somewhat outdated. Having worked on control in electrical power plants and the process industries, but never in academia, I could point you to the best German book on the subject (Otto Föllinger, Regelungstechnik), published 30 years ago, but that is perhaps not what you need.


I suggest these:

  • Feedback Systems: An Introduction for Scientists and Engineers, Åström, Karl Johan and Murray, Richard M., Princeton University Press, Princeton, 2008

  • Mathematical Control Theory: Deterministic Finite Dimensional Systems, Eduardo D. Sontag, Second Edition, Springer, New York, 1998

  • Feedback Control Theory, John Doyle, Bruce Francis, Allen Tannenbaum, Macmillan Publishing Co., 1990
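As a small taste of the central idea those texts develop, here is a minimal sketch (the plant, gain, and all numbers are my own illustrative choices, not taken from any of the books) of proportional feedback stabilizing an unstable first-order plant, simulated with a forward-Euler step:

```python
# Minimal illustration of negative feedback: the first-order plant
# dx/dt = a*x + u with a > 0 is unstable in open loop, but the
# proportional law u = -k*(x - r) stabilizes it whenever k > a.
# All names and numbers here are illustrative, not from the cited books.

def simulate(k, a=1.0, r=1.0, x0=0.0, dt=0.001, steps=10_000):
    """Forward-Euler simulation of dx/dt = a*x - k*(x - r)."""
    x = x0
    for _ in range(steps):
        u = -k * (x - r)        # proportional feedback term
        x += dt * (a * x + u)   # Euler integration step
    return x

# Setting dx/dt = 0 gives the closed-loop steady state
# x* = k*r/(k - a); with k = 5, a = 1, r = 1 that is 1.25.
print(simulate(k=5.0))
```

Note the characteristic trade-off even in this toy example: a larger gain `k` moves the steady state closer to the reference `r` and speeds up convergence, but a real plant with delays or unmodeled dynamics would eventually go unstable for large `k`; that tension is much of what these books are about.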

Solution 2:

  • "Feedback for physicists: A tutorial essay on control" (Rev Mod Phys 77 pp783-836, or free pdf downloadable here)

Solution 3:

Classic and sufficient for beginners:

  • Feedback Control of Dynamic Systems, G. F. Franklin

Some classic advanced books:

  • Robust and Optimal Control, K. Zhou

  • Applied Optimal Control, A. E. Bryson

  • Nonlinear Systems, Hassan K. Khalil

Solution 4:

One possible way of analyzing optimal control problems is via Markov Decision Processes. For an introductory view I recommend Sutton and Barto's "Reinforcement Learning: An Introduction" (freely available online).

For more details and theory, two books by Dimitri Bertsekas: Dynamic Programming and Stochastic Control, and Approximate Dynamic Programming.

Bertsekas's webpage also has some interesting material: http://web.mit.edu/dimitrib/www/home.html

Bruno