Higher Brain Function Research
Online ISSN : 1880-6716
Print ISSN : 0285-9513
ISSN-L : 0285-9513
Seminar
Introduction to neural networks—How do neural networks read words?—
Itaru F. Tatsumi

2000 Volume 20 Issue 3 Pages 222-233

Abstract
    Recent developments in neural networks for reading words, such as the framework called the “triangle model” proposed by Seidenberg and McClelland (1989), make it possible to simulate the reading processes not only of normal subjects but also of patients with various dyslexias (or alexias), such as deep, surface, and phonological dyslexia. In this paper I first outline dual-route models, which, in contrast to the triangle model, possess lexical and non-lexical (rule) systems. I then give a general description of a single-route model (the triangle model): the network's constituent “unit,” the basic structure of a three-layered feedforward network, the learning procedure called back-propagation, and so on. This type of model is called a connectionist model. Interestingly, the neural network has a high ability to read words and nonwords despite its simple structure, and it sometimes shows unpredictable, intuition-like behaviors. While most research is conducted in English-speaking countries, it has been shown that, even in Japanese, which has a more complex character system than English, the triangle framework can simulate the reading performance of normal and dyslexic people on Kana and Kanji words and nonwords. Moreover, I note that the point at issue lies not in the number of routes in the models but in the “rule/lexicon” versus “consistency” accounts of reading. The connectionists' challenge to the rule/lexicon approach now extends to spelling, verb inflection (generation of the past tense of verbs), and generative grammar (the stronghold of the rule/lexicon approach).
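The building blocks the abstract names (units, a three-layered feedforward network, back-propagation) can be illustrated with a minimal sketch. This is not the Seidenberg and McClelland (1989) model itself: the toy orthography-to-phonology patterns, layer sizes, and learning rate below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Each "unit" computes a weighted sum of its inputs squashed by a sigmoid.
    return 1.0 / (1.0 + np.exp(-x))

# Toy task (hypothetical data): map 4-bit "orthographic" input patterns
# to 2-bit "phonological" output patterns.
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0]], dtype=float)
Y = np.array([[0, 1],
              [1, 0],
              [1, 1],
              [0, 0]], dtype=float)

n_in, n_hid, n_out = 4, 8, 2          # three layers: input, hidden, output
W1 = rng.normal(0, 0.5, (n_in, n_hid))   # input-to-hidden weights
W2 = rng.normal(0, 0.5, (n_hid, n_out))  # hidden-to-output weights
lr = 1.0

for epoch in range(5000):
    # Forward pass through the three layers.
    h = sigmoid(X @ W1)                       # hidden-unit activations
    y = sigmoid(h @ W2)                       # output-unit activations
    # Back-propagation: push the output error back toward the input layer.
    err_out = (y - Y) * y * (1 - y)           # delta at the output layer
    err_hid = (err_out @ W2.T) * h * (1 - h)  # delta at the hidden layer
    W2 -= lr * h.T @ err_out
    W1 -= lr * X.T @ err_hid

print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))
```

After training, the network reproduces the target patterns for the four toy "words"; the same machinery, scaled up, is what lets triangle-model networks generalize to nonwords.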
© 2000 by Japan Society for Higher Brain Dysfunction ( founded as Japanese Society of Aphasiology in 1977 )