This paper evaluates three incremental matrix approximation algorithms, Brute Force, Incremental Singular Value Decomposition (iSVD), and Frequent Directions (FD), in terms of running time and accuracy. The authors assume that feature vectors arrive from a data stream one by one, and each algorithm incrementally updates the SVD computed so far. In experiments on a 256-by-7291 dense matrix from a handwritten digits dataset, iSVD ran faster than the brute-force algorithm. Importantly, since iSVD is a straightforward incremental SVD-updating technique, it always produced a nearly optimal low-rank approximation. By contrast, FD shrinks the original 256-by-7291 matrix to a small 256-by-l sketch and provides approximate singular vectors. These singular vectors were nearly optimal for estimating the best low-rank approximation of the original matrix, and the running time was much faster than that of iSVD. The paper concludes by discussing possible applications and future directions for these incremental algorithms.
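To make the FD sketching step concrete, the following is a minimal sketch of the standard Frequent Directions procedure (Liberty's doubled-buffer variant) in NumPy. It is an illustration under stated assumptions, not the paper's implementation: it processes a row stream, whereas the paper streams 256-dimensional column vectors of a 256-by-7291 matrix, which is equivalent to running FD on the rows of the transpose. The function name and parameter `ell` (the sketch size l) are chosen here for illustration.

```python
import numpy as np

def frequent_directions(rows, ell):
    """Maintain a 2*ell-row sketch B of a row stream such that
    B^T B approximates A^T A for the full data matrix A."""
    B = None
    for x in rows:
        x = np.asarray(x, dtype=float)
        if B is None:
            # double-size buffer: shrink only once every ell insertions
            B = np.zeros((2 * ell, x.size))
        zero_rows = np.where(~B.any(axis=1))[0]
        if zero_rows.size == 0:
            # buffer full: SVD the sketch, then subtract the ell-th
            # squared singular value from all squared singular values
            U, s, Vt = np.linalg.svd(B, full_matrices=False)
            s2 = np.maximum(s ** 2 - s[ell - 1] ** 2, 0.0)
            B = np.sqrt(s2)[:, None] * Vt  # at least ell rows become zero
            zero_rows = np.where(~B.any(axis=1))[0]
        B[zero_rows[0]] = x  # insert the new vector into an empty row
    return B
```

With this variant, the sketch satisfies the usual FD guarantee that the spectral norm of A^T A - B^T B is at most the squared Frobenius norm of A divided by ell, which is why the sketch's singular vectors give a near-optimal low-rank approximation.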