This paper reviews the history of cognitive studies of thinking from a dynamical point of view. In the early 1970s, researchers took a formal approach to thinking, modeling its processes as applications of domain-independent formal rules. In the 1980s, however, various studies revealed that human thinking is best characterized as a knowledge-dependent process. Although knowledge plays a critical role, this approach had difficulty dealing with the flexible use of knowledge, its origins, and interaction with the external environment. In the 1990s, the dynamics of thinking increasingly came within the scope of cognitive science, by virtue of biological approaches such as cognitive neuroscience, evolutionary psychology, and extended connectionism, as well as research on analogy, creative thinking, and scientific reasoning. Finally, methodological issues in further developing the dynamical approach are discussed.
Connectionism is an approach to understanding the mechanisms of human cognition using simulated networks of neuron-like processing units. In this article, I report on recent progress in connectionist models that simulate empirical data from natural cognitive tasks: visual word perception, memory, word naming, understanding word meanings, speech perception and production, sentence understanding, and reasoning. I also summarize the advantages and disadvantages of these connectionist models. In particular, the problems of representing structured information in distributed form and of performing tasks that require variable binding in connectionist networks are discussed from several perspectives. I argue that connectionist computer simulation offers significant benefits for current research in cognitive science, and that connectionist modeling is likely to have an important influence on future studies. The question of how the human brain efficiently realizes and learns symbols and rules through parallel distributed processing remains one of the great intellectual problems of our time.
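To make the core idea of "neuron-like processing units" concrete, here is a minimal sketch of a single unit trained with the classic perceptron learning rule on logical OR. This is an illustrative toy, not any of the specific models surveyed above; the task, weights, and iteration count are all assumptions chosen for simplicity.

```python
# A single neuron-like unit: a weighted sum passed through a threshold.
# Learning adjusts the weights from examples (here, logical OR),
# rather than applying an explicit symbolic rule.
weights = [0.0, 0.0]
bias = 0.0
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

def predict(x):
    # Fire (output 1) if the weighted input exceeds the threshold.
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

# Perceptron rule: nudge each weight by the prediction error times the input.
# For a linearly separable task like OR, this is guaranteed to converge.
for _ in range(25):
    for x, target in data:
        error = target - predict(x)
        weights[0] += error * x[0]
        weights[1] += error * x[1]
        bias += error

# After training, the unit classifies all four input patterns correctly.
```

A single unit cannot learn non-separable tasks such as XOR; handling those requires hidden layers trained by backpropagation, which is where the questions of distributed representation and variable binding raised above become pressing.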
This paper traces developments in the theory of generative grammar over the last 40 years, with particular attention to the theoretically important changes made in each of the four decades. The significance for cognitive science of linguistic research based on the Minimalist Program is also discussed.
Natural language understanding systems have been developed since the invention of the computer in the 20th century. This paper surveys these systems from the viewpoint of language cognition. Although many computer systems for natural language processing were neither very successful nor cognitively plausible, some, such as Winograd's and Schank's, were able to understand texts, though their topics were bounded to small worlds. The paper argues that this difficulty is caused by a crucial feature of natural language: its dependence on context. That is, the meanings of words in a text are determined only when its context is given. This difficulty cannot be overcome simply by improvements in computer speed or memory capacity. The last half of this paper introduces a cognitive-systems approach to natural language understanding. The first part of this research is an electronic concept dictionary constructed from a large-scale association experiment, which elicited more than one hundred thousand words from human memory. The other part is a pulse neural network system on which the concept dictionary is implemented. The neural network system is applied to the analysis of metaphorical expressions. The results show a promising perspective for the cognitive approach to natural language processing systems.
This paper provides an overview of cognitive neuroscience in the 20th century and suggests some directions for new research strategies in the 21st century. In the 20th century, cognitive neuroscience in the broad sense was composed of neurophysiology, psychophysiology, and brain modeling, which represent the research strategies of microscopic analysis, behavioral studies, and computational theories, respectively. Each discipline made great progress on its own, but it was only in the last few years of the 20th century that the related disciplines began to interact with one another. For cognitive neuroscience in the 21st century, we expect neuroscience and behavioral science to fuse through modeling studies. A model is an abstraction of a physical phenomenon. A brain architecture model that integrates abstracted neuroscientific evidence with abstracted behavioral-scientific evidence will be the key to fusing the evidence and interpretations of both fields. It is suggested that understanding the process of language acquisition, i.e., of both lexicon and grammar, as a process of neural circuit development will be one candidate for verifying the biological feasibility of such a brain model. Accordingly, training not only in one's own field but also in a wide range of related disciplines will be indispensable for researchers in cognitive neuroscience in the 21st century.
Donald A. Norman has been a prominent figure in the history of Cognitive Science in the latter half of the twentieth century. In this paper, we discuss how he influenced and shaped major research trends in Cognitive Science and related areas by reviewing his work to date, and we consider possible future developments in his work and in Cognitive Science research.
This study compared Cosmides's (1989) social contract theory with Cheng & Holyoak's (1985) pragmatic reasoning schema theory as accounts of the thematic content effect in the Wason selection task. The former explains the effect in terms of an innate algorithm, whereas the latter explains it in terms of learned schemata. Cosmides prepared a “switched rule,” in which an antecedent and a consequent in the original conditional rule were interchanged so that the innate algorithm could not be applied while the pragmatic reasoning schema could. She found that the effect disappeared, and concluded that the effect is not produced by the pragmatic reasoning schema. However, the context attached to the switched rule revealed that this rule could hardly be interpreted as a permission rule, which could invoke a pragmatic reasoning schema. We made small modifications in Cosmides's switched rule and its context so that this rule could be interpreted as an obligation rule, which could also invoke a pragmatic reasoning schema. The cost-benefit relation in the context was held essentially unchanged so that the innate algorithm could not be applied. As a result, the thematic content effect appeared for the switched rule as well. This finding favors the pragmatic reasoning schema theory over the social contract theory, and thus largely reduces the plausibility of the hereditary account of reasoning ability. It was stressed that especially strong evidence is needed for hereditary accounts because they might well be utilized to justify social prejudice and discrimination.
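The normative logic underlying the selection task can be sketched in a few lines. This uses the standard abstract version of the task (letters and numbers), not Cosmides's thematic materials: only cards showing the antecedent P, or the negated consequent not-Q, can falsify "if P then Q", so those are the logically correct selections.

```python
# Abstract Wason selection task: rule "if a card has a vowel on one
# side, it has an even number on the other". Which visible faces must
# be turned over to test the rule? Only a face that could hide a
# counterexample: P visible (hidden side might be odd), or not-Q
# visible (an odd number whose hidden side might be a vowel).

def is_vowel(face):
    return face in "AEIOU"

def is_odd_number(face):
    return face.isdigit() and int(face) % 2 == 1

def must_turn(face):
    # P visible, or not-Q visible, could each falsify the conditional.
    return is_vowel(face) or is_odd_number(face)

cards = ["E", "K", "4", "7"]
selection = [c for c in cards if must_turn(c)]  # ["E", "7"]
```

Most participants instead choose the P card and the Q card ("E" and "4"); the thematic content effect discussed above is the finding that framing the same conditional as a social rule dramatically raises the rate of the logically correct P and not-Q selection.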