Information and Media Technologies
Online ISSN: 1881-0896
ISSN-L: 1881-0896
Media (processing) and Interaction
A Generative Dependency N-gram Language Model: Unsupervised Parameter Estimation and Application
Chenchen Ding, Mikio Yamamoto

2014 Volume 9 Issue 4 Pages 857-885

Abstract

We design a language model based on a generative dependency structure for sentences. The parameters of the model are the probabilities of dependency N-grams, which are composed of lexical words with four types of extra tags that model dependency relations and valence. We further propose an unsupervised expectation-maximization algorithm for parameter estimation, in which all possible dependency structures of a sentence are considered. As the algorithm is language-independent, it can be applied to a raw corpus in any language, without part-of-speech annotation, a treebank, or a trained parser. We conducted experiments on four languages, i.e., English, German, Spanish, and Japanese, to illustrate the applicability and properties of the proposed approach. We further apply the approach to a Chinese microblog data set to extract and investigate Internet-based, non-standard lexical dependency features of user-generated content.
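The core idea of the abstract, EM in which all possible dependency structures of a sentence contribute expected counts, can be illustrated with a deliberately simplified sketch. This is a hypothetical toy, not the authors' model: it replaces dependency N-grams with valence tags by plain head-dependent bigrams, and it enumerates dependency trees by brute force (feasible only for very short sentences), where the paper presumably uses an efficient dynamic-programming formulation. All names (`valid_trees`, `em_step`, the `<ROOT>` token, the 1e-3 smoothing constant) are assumptions for illustration.

```python
import itertools
from collections import defaultdict

def valid_trees(n):
    """Enumerate head assignments for n words (head[i] in {0..n}, 0 = ROOT)
    that are acyclic, i.e., every word's head chain reaches ROOT."""
    for heads in itertools.product(range(n + 1), repeat=n):
        ok = True
        for i in range(1, n + 1):
            seen, j = set(), i
            while j != 0:
                if j in seen:          # cycle detected
                    ok = False
                    break
                seen.add(j)
                j = heads[j - 1]
            if not ok:
                break
        if ok:
            yield heads

def em_step(sentences, prob):
    """One EM iteration: E-step collects expected head-dependent bigram
    counts over all dependency trees of each sentence, weighted by the
    tree's posterior; M-step renormalizes per head word."""
    counts = defaultdict(float)
    for words in sentences:
        trees = list(valid_trees(len(words)))
        scores = []
        for heads in trees:
            s = 1.0
            for i, h in enumerate(heads, start=1):
                head_word = "<ROOT>" if h == 0 else words[h - 1]
                s *= prob.get((head_word, words[i - 1]), 1e-3)  # smoothed
            scores.append(s)
        z = sum(scores)
        for heads, s in zip(trees, scores):
            w = s / z  # posterior of this tree given the sentence
            for i, h in enumerate(heads, start=1):
                head_word = "<ROOT>" if h == 0 else words[h - 1]
                counts[(head_word, words[i - 1])] += w
    totals = defaultdict(float)
    for (h, d), c in counts.items():
        totals[h] += c
    return {(h, d): c / totals[h] for (h, d), c in counts.items()}
```

Starting from an empty (uniform, via the smoothing default) parameter table and iterating `em_step` on raw sentences mirrors the unsupervised setting described in the abstract: no annotation or parser is needed, only the corpus itself.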

© 2014 The Association for Natural Language Processing