Abstract (EN):
In the present paper, we analyze the stability of equilibria of impulsive control systems whose
dynamics is determined by a differential inclusion driven by a vector-valued measure. The notion
of solution given in [1] makes the stabilization problem meaningful under very general assumptions
on the problem data. This notion of solution has the important property
that it covers systems whose singular dynamics does not satisfy the so-called Frobenius condition.
It turns out that, for each admissible solution, the arc joining the endpoints of each discontinuity
is determined by the singular dynamics. Note that the above-mentioned notion of solution is motivated
by practical engineering considerations. For important classes of applied problems, it is of interest
to control a dynamical system that can operate in several viable configurations. Although the
transitions between configurations, modelled by jumps (discontinuities) of the trajectories, are
unproductive and their duration is negligible, their nature can affect the general properties of the
system. Therefore, it is advisable to consider the jump dynamics as an integral part of the dynamical
optimization problem.
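As an illustration only, with F denoting the ordinary dynamics, G the singular (jump) dynamics, and μ the vector-valued control measure (these symbols are assumed here for exposition and need not match the notation of [1]), such measure-driven systems are often written schematically as

  dx(t) ∈ F(x(t)) dt + G(x(t)) μ(dt),   x(0) = x0,

where, at each atom of μ, the arc joining the one-sided limits of the trajectory is generated by the singular dynamics G alone; this is a sketch of the standard measure-driven setting and may differ in detail from the precise construction of [1].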
This class of problems arises in various applications, such as finance, the mechanics of vibro-impact
systems, renewable resource management, and aerospace navigation, where solutions are sought in
the set of control processes with trajectories of bounded variation. This has naturally given
an impetus to the recent rapid development of the theory of such systems and numerical schemes
implementing the control strategies.
There is a wide literature on the stability of ordinary control systems ẋ = f(x, u), x(0) = x0, or,
in terms of differential inclusions, ẋ ∈ F(x) (for a detailed list, see, e.g., [2–4], and a brief survey
can be found in [5]).
The stability conditions were stated in [5] in terms of a controllable Lyapunov pair of functions
satisfying the uniform decay condition. This is because these conditions were obtained
by applying ordinary stability theory to a standard problem produced by a reparametrization of
the original control system. However, these conditions are inapplicable in numerous cases. Therefore,
we weaken this result and extend the notion of a controllable Lyapunov pair of functions so that
V is allowed to increase at the jumps. The price we pay for this approach is that we have to consider
only control problems with a control measure such that either the total variation of its singular
component is finite, or its total variation over any finite interval tends to zero as the left endpoint
of that interval tends to infinity. Although this setting might seem restrictive, it is rather general
from the viewpoint of applications.
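For orientation only (written for smooth V, with notation assumed here rather than taken from [5]): a controllable Lyapunov pair (V, W) for a differential inclusion ẋ ∈ F(x) is typically a pair of nonnegative functions, with W vanishing only on the equilibrium set, such that for every state x

  inf { ⟨∇V(x), f⟩ : f ∈ F(x) } ≤ −W(x),

so that V can be made to decay along some admissible trajectory. The extension discussed above relaxes this monotone behaviour by permitting V to increase at the jumps of the trajectory.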
Language:
English
Type (Professor's evaluation):
Scientific
No. of pages:
9