Abstract
We show that the inductive capability, or generalization ability, of a connectionist learning system depends on its network architecture. Conventional network architectures are shown to be inadequate for representing and learning the complex nonlinear structure of continuous mappings. We propose a new network architecture, the high-order functional network, which uses nonmonotonic functional units as input units. A high-order functional network trained with backpropagation is shown to generalize and infer the nonlinear structure relating continuous variables. Nonlinear mappings are characterized by the number and positions of their extreme points and by the curvatures at those points. We show that the combination of high-order functional input units and sigmoid-type hidden units enables the network to realize and acquire a proper internal representation that extracts these features of the task domain.
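A minimal runnable sketch of the idea described above, assuming that the high-order functional input units are fixed nonmonotonic power expansions (x, x^2, x^3) of a scalar input feeding a single sigmoid hidden layer trained with plain gradient-descent backpropagation on squared error. The expansion choice, layer sizes, learning rate, and target function are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def expand(x):
    """High-order functional input units: fixed nonlinear expansions of x
    (assumed here to be the powers x, x^2, x^3)."""
    return np.stack([x, x**2, x**3], axis=1)       # shape (N, 3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Target: a continuous nonmonotonic mapping with two extreme points on [-1, 1]
x = np.linspace(-1.0, 1.0, 200)
y = (x**3 - 0.5 * x)[:, None]

X = expand(x)                                      # (200, 3)
H, lr = 8, 0.1                                     # hidden size, learning rate
W1 = rng.normal(0.0, 0.5, (3, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)

for epoch in range(5000):
    h = sigmoid(X @ W1 + b1)                       # sigmoid-type hidden units
    out = h @ W2 + b2                              # linear output unit
    err = out - y
    # Backpropagation of mean-squared error
    g_out = 2.0 * err / len(x)
    g_W2 = h.T @ g_out;           g_b2 = g_out.sum(0)
    g_h = (g_out @ W2.T) * h * (1.0 - h)
    g_W1 = X.T @ g_h;             g_b1 = g_h.sum(0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1

final = sigmoid(X @ W1 + b1) @ W2 + b2
print("final MSE:", float(np.mean((final - y) ** 2)))
```

Because the input layer already supplies nonmonotonic basis functions, the sigmoid hidden units only need to combine them, which is one way to read the abstract's claim that this combination yields an internal representation capturing the extreme points and curvatures of the mapping.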