Abstract (EN):
An optimization problem is addressed for systems that are subject to perturbations and admit both controlled discrete state observations and impulsive control actions. The dynamics are given by a differential inclusion ẋ(t) ∈ F(t, x(t)), and the optimal strategy is the minimizer of the cost under the worst viable perturbation process (a minimax formulation). Such a formulation is appropriate when the noise is poorly known but bounded, and when observing the system, like changing its state, carries a positive, non-negligible cost. Since observation and control actions occur at stopping times relative to the σ-algebra generated by the state values at previous observation times (which are the only known state values), this formulation admits a discrete-time analogue, which is exploited. An algorithm based on sufficient conditions for optimality is presented.
Language:
English
Type (Professor's evaluation):
Scientific