Abstract
This paper proposes a new method of feature generation and association based on attention. Attention to the features of patterns plays an important role in human pattern recognition, and many computational models have been proposed to realize it. In most of these models, however, the attention functions are fixed a priori by the designer. In the proposed model, the attention functions are organized automatically: they are realized by the features acquired through learning. The model is a two-layered hierarchical network consisting of an associative memory layer and a feature layer. The connections between the layers represent the features of the input patterns; they are generated automatically, and their weights are updated by a modified Hebbian learning rule. When a pattern is presented to the input layer, its features are extracted in a bottom-up procedure. The connections corresponding to the most excited neuron in the feature layer are then used as attention points in a top-down procedure. Through these procedures, the system can associate appropriate patterns.
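To make the bottom-up/top-down interplay described above concrete, the following is a minimal sketch in Python of a generic two-layer scheme of this kind. The class name, the simple winner-take-all choice of the most excited feature neuron, and the weight update shown here are illustrative assumptions; the paper's actual modified Hebbian rule and network dynamics are defined in the body of the paper, not reproduced here.

```python
import numpy as np

class AttentionAssociator:
    """Illustrative sketch: an input (associative memory) layer fully
    connected to a feature layer. The connection weights of each feature
    neuron stand for one learned feature of the input patterns."""

    def __init__(self, n_input, n_features, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Rows of W are the feature vectors (connections to the input layer).
        self.W = rng.uniform(0.0, 0.1, size=(n_features, n_input))
        self.lr = lr

    def bottom_up(self, x):
        """Bottom-up procedure: excite the feature layer with pattern x and
        return the index of the most excited feature neuron."""
        activation = self.W @ x
        return int(np.argmax(activation))

    def learn(self, x):
        """Hebbian-style update (a stand-in for the paper's modified rule):
        move the winning feature's connection weights toward the pattern."""
        j = self.bottom_up(x)
        self.W[j] += self.lr * (x - self.W[j])
        return j

    def top_down(self, x):
        """Top-down procedure: use the winner's connections as attention
        points and return the attended (associated) pattern."""
        j = self.bottom_up(x)
        attention = self.W[j]          # learned feature used as attention points
        return attention * x           # attend to the matching parts of x


if __name__ == "__main__":
    # Toy usage: two binary patterns, repeatedly presented for learning.
    patterns = np.array([[1, 1, 0, 0, 1, 0],
                         [0, 0, 1, 1, 0, 1]], dtype=float)
    net = AttentionAssociator(n_input=6, n_features=4)
    for _ in range(20):
        for p in patterns:
            net.learn(p)
    noisy = np.array([1, 1, 0, 0, 0, 0], dtype=float)   # degraded input
    print("attended recall:", np.round(net.top_down(noisy), 2))
```

The sketch only shows the control flow implied by the abstract: features emerge in the connection weights through repeated presentation, the most excited feature neuron is selected bottom-up, and its weights are reused top-down as attention points over the input.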