Transactions of the Society of Instrument and Control Engineers
Online ISSN : 1883-8189
Print ISSN : 0453-4654
ISSN-L : 0453-4654
An Approximate Method of Optimal Control for Nonlinear Dynamical Systems with State-Dependent Noise
Yoshifumi SUNAHARA, Hideaki TORII

1970 Volume 6 Issue 3 Pages 238-246

Abstract
The purpose of this paper is to establish a method of finding the sub-optimal control for nonlinear dynamical systems subjected to state-dependent noise.
Guided by the basic notion of state-variable representation in control theory, a mathematical model of the dynamical system considered here is given by a nonlinear vector stochastic differential equation of Stratonovich type.
First, for the purpose of establishing an approximate model of the system, a method of stochastic linearization is demonstrated within a Markovian framework. The method linearizes the nonlinear drift term by replacing it with a linear function whose coefficients are determined by a least-squares error criterion. The linearized drift term is thus specified by coefficients that depend on both the conditional mean and the associated error covariance.
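The least-squares linearization described above can be sketched as follows for a scalar state. This is a minimal illustration, not the authors' formulation: it assumes a Gaussian state density with mean m and variance p, and estimates the intercept a = E[f(x)] and slope k = Cov(f(x), x)/p by Monte Carlo; the function name and parameters are illustrative.

```python
import numpy as np

def stochastic_linearize(f, m, p, n_samples=200_000, seed=0):
    """Least-squares (statistical) linearization of a scalar drift f
    about a Gaussian state with mean m and variance p:
        f(x) ~ a + k * (x - m),
    where a = E[f(x)] and k = Cov(f(x), x) / p minimize the
    mean-square error E[(f(x) - a - k*(x - m))^2].
    """
    rng = np.random.default_rng(seed)
    x = m + np.sqrt(p) * rng.standard_normal(n_samples)
    fx = f(x)
    a = fx.mean()                    # E[f(x)]
    k = np.mean(fx * (x - m)) / p    # Cov(f(x), x) / Var(x)
    return a, k

# Example: cubic drift f(x) = x^3 with x ~ N(1, 0.25).
# Closed form for comparison: a = m^3 + 3*m*p = 1.75, k = 3*(m^2 + p) = 3.75.
a, k = stochastic_linearize(lambda x: x**3, m=1.0, p=0.25)
```

Note how both coefficients depend on the mean m and the variance p, which is exactly the dependence on the conditional mean and error covariance stated in the abstract.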
Secondly, by using the quasi-linear model and following the method of Dynamic Programming, an approximate optimal control strategy is presented for a quadratic cost functional.
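For a linear (or quasi-linear) model with quadratic cost, dynamic programming reduces to a backward Riccati recursion. The sketch below shows that structure in discrete time; it is an assumption-laden stand-in for the paper's continuous-time development, and the matrices A, B, Q, R, Qf are illustrative.

```python
import numpy as np

def lqr_backward(A, B, Q, R, Qf, N):
    """Dynamic-programming solution of the finite-horizon quadratic cost
        J = x_N' Qf x_N + sum_{k<N} (x_k' Q x_k + u_k' R u_k)
    for the linear model x_{k+1} = A x_k + B u_k.
    Returns time-varying feedback gains with u_k = -K_k x_k.
    """
    P = Qf
    gains = []
    for _ in range(N):
        # Minimizing the stage cost plus cost-to-go gives the gain:
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update of the cost-to-go matrix:
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]  # gains[k] applies at stage k

# Illustrative double-integrator example with sampling step 0.1.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
gains = lqr_backward(A, B, Q=np.eye(2), R=np.array([[1.0]]),
                     Qf=np.eye(2), N=50)
```

In the stochastic-linearization setting, A would itself depend on the conditional mean and covariance, so the gains are computed along the estimated trajectory rather than offline.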
Finally, quantitative aspects of the sample-path behavior of the optimal control signal are shown by digital simulation studies, in comparison with the corresponding results obtained when the mathematical model is given by a stochastic differential equation of Ito type.
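With state-dependent noise, the Ito and Stratonovich interpretations of the same equation produce different sample paths. The following minimal scalar sketch (not the authors' simulation setup; the coefficients a and b are illustrative) simulates dx = -a x dt + b x dW under both interpretations, using the same Wiener increments, by rewriting the Stratonovich equation as an Ito equation with the corrected drift (-a + b^2/2) x.

```python
import numpy as np

def simulate(a=1.0, b=0.5, x0=1.0, T=1.0, n=1000, seed=1):
    """Euler-Maruyama sample paths of dx = -a x dt + b x dW with the
    state-dependent noise term read in the Ito and the Stratonovich
    sense.  The Stratonovich path uses the equivalent Ito drift
    (-a + b^2/2) x, so the gap between the paths isolates the effect
    of the interpretation of the noise term.
    """
    rng = np.random.default_rng(seed)
    dt = T / n
    dW = np.sqrt(dt) * rng.standard_normal(n)   # shared Wiener increments
    x_ito = np.empty(n + 1)
    x_str = np.empty(n + 1)
    x_ito[0] = x_str[0] = x0
    for k in range(n):
        x_ito[k + 1] = x_ito[k] + (-a * x_ito[k]) * dt + b * x_ito[k] * dW[k]
        x_str[k + 1] = x_str[k] + (-a + 0.5 * b**2) * x_str[k] * dt \
                       + b * x_str[k] * dW[k]
    return x_ito, x_str

x_ito, x_str = simulate()
```

For this linear-in-the-state noise, the two exact solutions differ by the deterministic factor exp(b^2 T / 2), which the simulated paths reproduce approximately.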
© The Society of Instrument and Control Engineers (SICE)