
  

Optimal Control: A Review

Author(s):

V. Nevistic
Conference/Journal:

vol. AUT97-05
Abstract:

Basic principles behind optimization and optimal control theory are introduced. A general statement of the optimal control problem is given, and various approaches to its solution are described. Since the concept of optimal control design is intimately related to the solution of the Hamilton-Jacobi-Bellman optimization equation and dynamic programming, attention is centered upon this approach. The interconnection with other optimal control methods based on the classical variational approach, such as the Pontryagin Minimum Principle and the Euler-Lagrange equations, as well as the fundamental distinctions between them, are then discussed in detail.
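As a concrete illustration of the HJB/dynamic-programming connection described in the abstract (this example is not taken from the report itself): for the linear-quadratic regulator (LQR), the HJB equation admits a quadratic value function V(x) = xᵀPx and reduces to the algebraic Riccati equation for P. A minimal sketch, assuming a double-integrator plant:

```python
# Illustrative LQR sketch: the HJB equation for a linear system with
# quadratic cost reduces to the algebraic Riccati equation
#   A'P + P A - P B R^{-1} B' P + Q = 0,
# and the optimal feedback is u = -Kx with K = R^{-1} B' P.
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: x1' = x2, x2' = u (chosen here for illustration)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state cost  x' Q x
R = np.array([[1.0]])  # input cost  u' R u

# Solve the continuous-time algebraic Riccati equation for P
P = solve_continuous_are(A, B, Q, R)

# Optimal state-feedback gain K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)
print(K)  # for this system K = [1, sqrt(3)]
```

The closed-loop matrix A − BK is then Hurwitz, i.e. the optimal feedback stabilizes the system, which is the standard guarantee the dynamic-programming solution provides in the LQR case.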

Year:

1997
Type of Publication:

Technical Report
No files available for download.
% Autogenerated BibTeX entry
@TechReport { Xxx:1997:IFA_1450,
    author={V. Nevistic},
    title={{Optimal Control: A Review}},
    institution={},
    year={1997},
    number={},
    address={},
    url={http://control.ee.ethz.ch/index.cgi?page=publications;action=details;id=1450}
}