Network embedding is one of the approaches to effectively analyzing network data. Almost all existing network embedding methods adopt shallow models and lack the deep architectures commonly used in deep learning studies. However, shallow models cannot capture the highly non-linear network structures often observed in real-world, complex networks. To address this problem, Structural Deep Network Embedding (SDNE) was proposed as a deep model for network embedding. In this paper, we focus on the Generative Stochastic Network (GSN), an extension of the Autoencoder, for network embedding. GSN robustly captures latent features of data by adding random noise during learning, and its framework for capturing the latent structure of a network is similar to that of SDNE. As the target of this study, we focus on time-dependent networks. To address the dependency between time intervals and to capture the tendency of the previous time interval, we propose time-dependent pretraining, which uses the parameters learned in the previous time interval as the initial state for learning in the current time interval. In the experiments, we use time-dependent financial network data, where each node (or vertex) represents a bank and each link (or directed edge) represents a per-month transaction between a pair of banks, resulting in a series of per-month networks.
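To make the two ideas in the abstract concrete, the following is a minimal sketch, not the paper's implementation: a plain denoising autoencoder with tied weights stands in for the GSN-style embedding model (noise is injected into the input at every training step), and the proposed time-dependent pretraining is illustrated by carrying the parameters learned on one month's network over as the initial state for the next month. The class name, hyperparameters (embedding dimension, noise level, learning rate), and the toy adjacency matrices are all illustrative assumptions.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class NoisyAutoencoder:
    """Denoising-autoencoder stand-in for the GSN-style embedding model (assumption)."""

    def __init__(self, n_nodes, dim, rng, params=None):
        self.rng = rng
        if params is None:
            # Random initialization (first time interval).
            self.W = rng.normal(0.0, 0.1, size=(n_nodes, dim))
            self.b_h = np.zeros(dim)
            self.b_o = np.zeros(n_nodes)
        else:
            # Time-dependent pretraining: warm-start from the previous interval's parameters.
            self.W, self.b_h, self.b_o = (p.copy() for p in params)

    def fit(self, A, epochs=200, lr=0.05, noise=0.1):
        """A: adjacency matrix of one per-month network (rows = node neighbourhood vectors)."""
        n = A.shape[0]
        for _ in range(epochs):
            X = A + self.rng.normal(0.0, noise, size=A.shape)  # inject random noise
            H = sigmoid(X @ self.W + self.b_h)                 # encode
            R = sigmoid(H @ self.W.T + self.b_o)               # decode (tied weights)
            E = (R - A) * R * (1.0 - R)                        # output-layer delta
            dH = (E @ self.W) * H * (1.0 - H)                  # hidden-layer delta
            grad_W = X.T @ dH + E.T @ H                        # gradient for the shared weights
            self.W -= lr * grad_W / n
            self.b_h -= lr * dH.mean(axis=0)
            self.b_o -= lr * E.mean(axis=0)
        return self

    def embed(self, A):
        """Per-node embedding: the hidden representation of each neighbourhood vector."""
        return sigmoid(A @ self.W + self.b_h)

    def params(self):
        return self.W, self.b_h, self.b_o


rng = np.random.default_rng(0)
# Toy stand-in for the series of per-month transaction networks (30 banks, 3 months).
monthly_networks = [(rng.random((30, 30)) < 0.1).astype(float) for _ in range(3)]

prev_params = None
for A in monthly_networks:
    model = NoisyAutoencoder(A.shape[0], dim=8, rng=rng, params=prev_params).fit(A)
    embedding = model.embed(A)    # embedding of this month's network
    prev_params = model.params()  # carried over as the next month's initial state
```

In this sketch the only difference between training the first month and any later month is the `params` argument: with time-dependent pretraining, each interval starts from the previous interval's learned weights instead of a random initialization, so the embedding of month t reflects the tendency of month t-1.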