Abstract (EN):
: In this article, we address the infinite-horizon problem of optimizing a given performance criterion by choosing control strategies whose trajectories are asymptotically stable. In a first stage, we state and discuss sufficient conditions of optimality in the form of a Hamilton-Jacobi-Bellman equation. Then, based on these, we present necessary conditions of optimality in the form of a maximum principle and show how it can be derived from an auxiliary optimal control problem with mixed constraints.
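For orientation only, a standard discounted infinite-horizon Hamilton-Jacobi-Bellman equation of the general kind referred to in the abstract can be sketched as below; the dynamics f, running cost \ell, control set U, and discount rate \lambda are illustrative assumptions and are not taken from the paper, whose exact criterion and sign conventions may differ.

```latex
% Sketch of a standard discounted infinite-horizon HJB equation (illustrative;
% V is the value function, f the assumed dynamics, \ell the running cost,
% U the control set, and \lambda > 0 a discount rate -- all hypothetical here).
\begin{equation*}
  \lambda V(x) \;=\; \sup_{u \in U} \Big\{ \ell(x,u) + \nabla V(x) \cdot f(x,u) \Big\},
  \qquad x \in \mathbb{R}^n .
\end{equation*}
```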
Language:
English
Type (Professor's evaluation):
Scientific
No. of pages:
4
License type: