Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
Paper
Unsupervised All-words WSD Using Synonyms and Embeddings
Rui Suzuki, Kanako Komiya, Masayuki Asahara, Minoru Sasaki, Hiroyuki Shinnou

2019 Volume 26 Issue 2 Pages 361-379

Abstract

All-words word-sense disambiguation (all-words WSD) involves identifying the senses of all words in a document. Since a word's sense depends on its context, such as the surrounding words, similar words are believed to have similar sets of surrounding words. Therefore, we predict target word senses by calculating Euclidean distances between the surrounding word vectors of the target words and those of their synonyms, using word embeddings. In addition, we replace word tokens in the corpus with their concept tags, that is, the article numbers of the Word List by Semantic Principles, using the prediction results. We then create concept embeddings from the concept tag sequence and predict the senses of the target words using the distances between surrounding word vectors, which consist of the word and concept embeddings. This paper shows that concept embeddings improved the performance of Japanese all-words WSD.
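The core prediction step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the toy embeddings, vocabulary, and sense inventory are invented for demonstration, the context is averaged into a single vector for simplicity, and each sense is represented here directly by its synonyms' embeddings rather than by the synonyms' own surrounding-word vectors.

```python
import numpy as np

# Toy word embeddings (hypothetical; the paper trains embeddings on a
# Japanese corpus). Each word maps to a small dense vector.
EMBEDDINGS = {
    "river":   np.array([0.8, 0.2, 0.1]),
    "water":   np.array([0.7, 0.3, 0.0]),
    "shore":   np.array([0.85, 0.15, 0.05]),
    "money":   np.array([0.1, 0.9, 0.2]),
    "deposit": np.array([0.2, 0.8, 0.3]),
}

def context_vector(surrounding_words):
    """Average the embeddings of the words surrounding the target word."""
    vecs = [EMBEDDINGS[w] for w in surrounding_words if w in EMBEDDINGS]
    return np.mean(vecs, axis=0)

def predict_sense(surrounding_words, sense_synonyms):
    """Choose the sense whose synonym is nearest, by Euclidean distance,
    to the averaged surrounding-word vector of the target word."""
    ctx = context_vector(surrounding_words)
    best_sense, best_dist = None, float("inf")
    for sense, synonyms in sense_synonyms.items():
        for syn in synonyms:
            dist = np.linalg.norm(ctx - EMBEDDINGS[syn])
            if dist < best_dist:
                best_sense, best_dist = sense, dist
    return best_sense

# Example: disambiguating "bank" from its context words. With the toy
# vectors above, the geographic synonym "shore" lies closest.
sense = predict_sense(
    ["river", "water"],
    {"financial": ["money", "deposit"], "geographic": ["shore"]},
)
print(sense)  # → geographic
```

The same nearest-neighbour step can then be repeated with concatenated word-and-concept vectors once the corpus has been re-tagged and concept embeddings have been trained, as the abstract describes.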

© 2019 The Association for Natural Language Processing