Abstract
In this paper, we consider an optimal control problem for a linear discrete-time system with stochastic parameters. Whereas traditional stochastic optimal control theory treats only systems with deterministic parameters driven by stochastic noise, this paper focuses on systems whose parameters are themselves stochastic and uncertain. We derive an optimal control law for a novel cost function by which the designer can adjust the trade-off between the average and the variance of the states. Finally, a numerical simulation demonstrates the effectiveness of the proposed method.
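The abstract does not specify the system model or the cost function, so the following is only a rough, hypothetical sketch of the setting it describes: a scalar discrete-time system x_{k+1} = a_k x_k + u_k whose parameter a_k is random (assumed Gaussian here purely for illustration), controlled by a simple linear feedback u_k = -g x_k. The script estimates the mean and variance of the state by Monte Carlo and evaluates a weighted cost combining the two; the gains and the weight are arbitrary illustration values, not the control law derived in the paper.

```python
import numpy as np

# Hypothetical illustration only: scalar linear discrete-time system with a
# stochastic parameter a_k, i.e. x_{k+1} = a_k * x_k + u_k, under a linear
# state feedback u_k = -g * x_k.  Neither the model nor the gains come from
# the paper; they merely illustrate the mean/variance statistics that the
# abstract's cost function weighs against each other.

rng = np.random.default_rng(0)

def simulate(gain, a_mean=1.2, a_std=0.3, x0=1.0, horizon=30, trials=5000):
    """Monte Carlo estimate of the per-step mean and variance of the state."""
    x = np.full(trials, x0)
    means, variances = [], []
    for _ in range(horizon):
        a = rng.normal(a_mean, a_std, size=trials)   # stochastic parameter a_k
        x = a * x - gain * x                         # x_{k+1} = a_k x_k + u_k
        means.append(x.mean())
        variances.append(x.var())
    return np.array(means), np.array(variances)

lam = 0.5  # illustrative weight between the mean and variance terms
for gain in (0.9, 1.2, 1.5):
    m, v = simulate(gain)
    cost = np.sum(m**2 + lam * v)  # weighted mean/variance cost over the horizon
    print(f"gain={gain:.1f}  sum mean^2={np.sum(m**2):.4f}  "
          f"sum var={np.sum(v):.4f}  weighted cost={cost:.4f}")
```

Varying the weight lam changes how strongly the state's dispersion is penalized relative to its average behavior, which is the kind of design freedom the abstract attributes to the proposed cost function.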