Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
A Comprehensive Analysis of PMI-based Models for Measuring Semantic Differences
Taichi Aida, Mamoru Komachi, Toshinobu Ogiso, Hiroya Takamura, Daichi Mochihashi

2023 Volume 30 Issue 2 Pages 275-303

Abstract

The task of detecting words whose meanings differ across corpora is typically addressed with word representations such as Word2Vec or BERT. However, in the real-world settings where linguists and sociologists apply these techniques, abundant computing resources are rarely available. In this paper, we extend an existing CPU-trainable model that trains vectors for all time periods simultaneously. Experimental results demonstrate that the extended models achieved results comparable or superior to strong baselines on English corpora, SemEval-2020 Task 1, and Japanese corpora. Furthermore, we compared the training time of each model and conducted a comprehensive analysis of the Japanese corpora.
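As a rough illustration of the PMI-based approach the abstract refers to (and not the authors' extended model, which trains vectors for all time periods jointly), the sketch below builds positive PMI (PPMI) vectors independently for two corpora over a shared vocabulary and scores a word's semantic difference by the cosine distance between its vectors. The corpora, vocabulary, and function names here are hypothetical placeholders.

    # Minimal illustrative sketch (assumption: PPMI vectors per corpus, compared
    # by cosine distance; this is not the model proposed in the paper).
    from collections import Counter
    import math

    def ppmi_vectors(corpus, vocab, window=5):
        """Build sparse positive-PMI row vectors over a fixed vocabulary."""
        word_counts = Counter()
        pair_counts = Counter()
        for sentence in corpus:
            tokens = [t for t in sentence if t in vocab]
            word_counts.update(tokens)
            for i, w in enumerate(tokens):
                context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
                for c in context:
                    pair_counts[(w, c)] += 1
        total_pairs = sum(pair_counts.values())
        total_words = sum(word_counts.values())
        vectors = {w: {} for w in vocab}
        for (w, c), n_wc in pair_counts.items():
            pmi = math.log((n_wc / total_pairs) /
                           ((word_counts[w] / total_words) * (word_counts[c] / total_words)))
            if pmi > 0:  # keep positive values only (PPMI)
                vectors[w][c] = pmi
        return vectors

    def cosine_distance(u, v):
        """Cosine distance between two sparse vectors given as {context: weight}."""
        dot = sum(u[k] * v.get(k, 0.0) for k in u)
        norm_u = math.sqrt(sum(x * x for x in u.values()))
        norm_v = math.sqrt(sum(x * x for x in v.values()))
        if norm_u == 0.0 or norm_v == 0.0:
            return 1.0
        return 1.0 - dot / (norm_u * norm_v)

    # Hypothetical usage: corpus_a and corpus_b are tokenized sentences from two
    # time periods; words with the largest distance are semantic-change candidates.
    corpus_a = [["the", "mouse", "ran", "across", "the", "floor"]]
    corpus_b = [["click", "the", "mouse", "to", "open", "the", "file"]]
    vocab = {"mouse", "the", "ran", "floor", "click", "open", "file", "across", "to"}
    vec_a = ppmi_vectors(corpus_a, vocab)
    vec_b = ppmi_vectors(corpus_b, vocab)
    print(cosine_distance(vec_a["mouse"], vec_b["mouse"]))

Computing the two PPMI matrices separately, as above, is only a baseline-style comparison; the CPU-trainable model described in the abstract avoids the need to align independently trained spaces by training all time periods at once.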

© 2023 The Association for Natural Language Processing