Transactions of the Institute of Systems, Control and Information Engineers (システム制御情報学会論文誌)
Online ISSN : 2185-811X
Print ISSN : 1342-5668
ISSN-L : 1342-5668
Paper
Analysis of the L2 Induced Norm for Nonnegative Inputs and Its Application to the Stability Analysis of Recurrent Neural Networks
本岡 駿人, 蛯原 義雄
Journal / Open access

2022, Vol. 35, No. 2, pp. 29-37

Abstract

A recurrent neural network (RNN) is a class of deep neural networks that is able to imitate the behavior of dynamical systems thanks to its feedback mechanism. However, the feedback mechanism may cause network instability, and hence the stability analysis of RNNs has been an important issue. From a control-theoretic viewpoint, we can readily apply the small gain theorem to the stability analysis of an RNN by representing it as a feedback connection of a linear time-invariant (LTI) system and a static nonlinear activation function, typically a rectified linear unit (ReLU). Nonetheless, the standard small gain theorem leads to conservative results, since it does not take into account the important property that the ReLU returns only nonnegative signals. This motivates us to analyze the L2 induced norm of LTI systems for nonnegative input signals, which we refer to as the L2+ induced norm in this paper. We characterize an upper bound of the L2+ induced norm by copositive programming, and then derive a numerically tractable semidefinite programming problem for the computation of a (loosened) upper bound. We finally derive an L2+-induced-norm-based small gain theorem for the stability analysis of RNNs and illustrate its effectiveness by numerical examples.
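To fix ideas, the central quantity can be stated as follows (a paraphrase of the abstract, with notation introduced here for illustration): for a stable LTI system G, the L2+ induced norm is

\|G\|_{2+} := \sup_{0 \neq w \in L_{2+}} \|Gw\|_2 / \|w\|_2, where L_{2+} := \{ w \in L_2 : w(t) \ge 0 for all t \}.

Since the supremum is taken over a subset of L2, we always have \|G\|_{2+} \le \|G\|_\infty, so a small-gain condition of the form \|G\|_{2+} < 1 is never more conservative than the standard H-infinity-norm-based condition \|G\|_\infty < 1.

The sketch below shows, in Python with cvxpy, one standard way such a bound becomes a tractable SDP: the bounded real lemma LMI is augmented with a symmetric entrywise-nonnegative multiplier N, i.e., the usual positive-semidefinite-plus-nonnegative inner approximation of the copositive cone. This is a minimal illustration consistent with the abstract's outline, not the paper's exact formulation; the function name and the example data are hypothetical.

import cvxpy as cp
import numpy as np

def l2plus_norm_upper_bound(A, B, C, D):
    """SDP upper bound on the L2+ induced norm of dx/dt = Ax + Bw, z = Cx + Dw.

    Assumes A is Hurwitz. Fixing N = 0 recovers the standard
    bounded-real-lemma bound on the H-infinity norm.
    """
    n, m = B.shape
    P = cp.Variable((n, n), symmetric=True)   # storage function x'Px
    N = cp.Variable((m, m), symmetric=True)   # multiplier exploiting w(t) >= 0
    gam2 = cp.Variable(nonneg=True)           # gamma^2
    # Dissipation LMI: for all x and all w,
    #   d/dt(x'Px) + z'z - gamma^2 w'w <= -w'Nw,
    # and the right-hand side is <= 0 whenever w >= 0, since N has
    # nonnegative entries. Integrating gives ||z||_2 <= gamma ||w||_2
    # for every nonnegative input w.
    L = cp.bmat([
        [A.T @ P + P @ A + C.T @ C, P @ B + C.T @ D],
        [B.T @ P + D.T @ C, D.T @ D - gam2 * np.eye(m) + N],
    ])
    constraints = [P >> 1e-8 * np.eye(n), L << 0, N >= 0]
    cp.Problem(cp.Minimize(gam2), constraints).solve(solver=cp.SCS)
    return float(np.sqrt(gam2.value))

# Hypothetical two-input example. With N = 0 the result would be the plain
# H-infinity bound; optimizing over N can only lower it.
A = np.array([[-1.0, 0.8], [0.0, -2.0]])
B = np.eye(2)
C = np.array([[1.0, -1.0]])
D = np.zeros((1, 2))
print(l2plus_norm_upper_bound(A, B, C, D))

How tight this loosened bound is depends on how well the PSD-plus-entrywise-nonnegative decomposition approximates the copositive cone, which is exactly the gap between the copositive program and the tractable SDP mentioned in the abstract.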

© 2022 The Institute of Systems, Control and Information Engineers