Several linear methods have been proposed for discriminating functional data, such as the filtering method, the regularization method, and functional linear discriminant analysis. However, when the training sample is small, or when it cannot be assumed that the data in each class are drawn from multivariate normal populations with equal covariance matrices, linear methods are not guaranteed to perform well. In this paper, nonlinear discriminant analysis methods based on the principal component expansion of functional data are introduced. These are called functional subspace methods, and they include the functional subspace method and the functional CLAFIC method. Unlike linear discriminant analysis methods, these procedures require no knowledge of the underlying population distributions. Furthermore, they can be carried out more rapidly than traditional subspace methods, because they require only a small amount of training data and can be computed in lower dimensions. To show that the functional subspace methods are effective, they are compared with the filtering method. The results show that the functional subspace methods discriminate well; in particular, when the training sample is small, they give more stable results than the filtering method.
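To illustrate the classification principle behind subspace methods, here is a minimal sketch of a CLAFIC-style classifier on finite-dimensional vectors (not the paper's functional version, and the class names and parameters are illustrative assumptions): each class is represented by the span of its leading principal directions, and a new sample is assigned to the class whose subspace captures the largest share of its squared norm.

```python
# Hedged sketch of a CLAFIC-style subspace classifier; classes and
# parameter choices here are illustrative, not the paper's method.
import numpy as np

class ClaficClassifier:
    def __init__(self, n_components=2):
        self.n_components = n_components
        self.bases = {}  # class label -> orthonormal basis, shape (d, n_components)

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            # Left singular vectors of the class data give that class's
            # principal directions (classical CLAFIC uses the raw
            # second-moment matrix, so no mean-centering here).
            U, _, _ = np.linalg.svd(Xc.T, full_matrices=False)
            self.bases[label] = U[:, :self.n_components]
        return self

    def predict(self, X):
        labels = list(self.bases)
        # Squared norm of the projection of each sample onto each
        # class subspace; pick the class with the largest projection.
        scores = np.stack(
            [np.sum((X @ self.bases[l]) ** 2, axis=1) for l in labels],
            axis=1,
        )
        return np.array(labels)[np.argmax(scores, axis=1)]
```

In the functional setting described in the abstract, the same decision rule would be applied to the coefficients of a truncated principal component expansion of each curve, which is what keeps the computation low-dimensional even for small training samples.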