IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Regular Section
A Transformer-Based Fully Trainable Point Process
Hirotaka HACHIYA, Fumiya NISHIZAWA

2025 Volume E108.D Issue 6 Pages 583-592

Abstract

When applying a point process to a real-world problem, an appropriate intensity function must be designed using prior physical and mathematical knowledge. To reduce this burden, a Transformer-based partially trainable model was previously proposed, which adaptively extracts a sequence feature from the past event sequence using a self-attention mechanism. However, because this feature vector is merely a transformed version of the latest event, and the intensity function given the feature vector is still modeled by hand, both the approximated intensity function and the predicted next event depend strongly on the latest event. To overcome these problems, a novel Transformer-based fully trainable point process (Transformer-FTPP) is proposed. In this model, multiple trainable vectors are transformed through an encoder-decoder Transformer architecture to extract vectors that represent the past sequence and serve as candidates for future events. This enables an adaptive and general approximation of the intensity function and prediction of the next event. The effectiveness of the proposed method is demonstrated experimentally on synthetic and real-world event data.
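As a rough illustration of the mechanism described in the abstract, the following is a minimal NumPy sketch, not the authors' implementation: a set of trainable query vectors (here randomly initialized; in practice learned by gradient descent) attends to encoder outputs over the past event sequence, and a simple softplus head maps each resulting candidate vector to a positive intensity value. All names, dimensions, and the intensity head are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention (single head, no projections, for brevity)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    return softmax(scores, axis=-1) @ V

d = 8                                 # embedding dimension (hypothetical)
events = rng.normal(size=(5, d))      # embeddings of 5 past events

# Encoder: self-attention over the past event sequence
enc = attention(events, events, events)

# Decoder: M trainable query vectors, not tied to the latest event,
# attend to the encoded sequence to produce future event candidate vectors
M = 3
queries = rng.normal(size=(M, d))     # would be learned parameters
candidates = attention(queries, enc, enc)

# Intensity head: map each candidate vector to a positive rate via softplus
w = rng.normal(size=d)
intensity = np.log1p(np.exp(candidates @ w))  # shape (M,), strictly positive
print(intensity.shape)
```

Because the queries are free parameters rather than a transform of the latest event, the resulting intensity values are no longer forced to depend solely on that event, which is the key point of the fully trainable design.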

© 2025 The Institute of Electronics, Information and Communication Engineers