Abstract

Proceedings Abstracts of the Twenty-Fourth International Joint Conference on Artificial Intelligence

Compressed Spectral Regression for Efficient Nonlinear Dimensionality Reduction / 3359
Deng Cai

Spectral dimensionality reduction methods have recently emerged as powerful tools for various applications in pattern recognition, data mining, and computer vision. These methods use the information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal the low-dimensional structure of high-dimensional data. One limitation shared by these spectral dimensionality reduction methods is their high computational complexity: they all need to construct a data affinity matrix and compute its top eigenvectors, which leads to O(n²) computational complexity, where n is the number of samples. Moreover, when the data have a highly non-linear distribution, some linear methods have to be performed in a reproducing kernel Hilbert space (leading to the corresponding kernel methods) to learn an effective non-linear mapping, and the computational complexity of these kernel methods is O(n³). In this paper, we propose a novel nonlinear dimensionality reduction algorithm, called Compressed Spectral Regression, with O(n) computational complexity. Extensive experiments on data clustering demonstrate the effectiveness and efficiency of the proposed approach.
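
To make the O(n²) bottleneck mentioned above concrete, the following is a minimal sketch of the baseline spectral embedding pipeline the abstract describes (dense affinity matrix plus top eigenvectors), not the proposed Compressed Spectral Regression algorithm; the Gaussian affinity, the function name spectral_embedding, and the parameter sigma are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.linalg import eigsh

def spectral_embedding(X, n_components=2, sigma=1.0):
    """Baseline spectral embedding sketch: build a dense affinity matrix
    and take the top eigenvectors of its symmetric normalization.
    Building W alone costs O(n^2) time and memory, which is the
    bottleneck discussed in the abstract."""
    # Dense Gaussian affinity: W_ij = exp(-||x_i - x_j||^2 / (2 * sigma^2))
    sq_dists = cdist(X, X, metric="sqeuclidean")
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))

    # Symmetric normalization: D^{-1/2} W D^{-1/2}
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    W_norm = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Top eigenvectors reveal the low-dimensional structure
    _, eigvecs = eigsh(W_norm, k=n_components, which="LM")
    return eigvecs

# Usage: embed 1,000 random 50-dimensional points into 2 dimensions
X = np.random.rand(1000, 50)
Y = spectral_embedding(X, n_components=2)
print(Y.shape)  # (1000, 2)
```

In this baseline, both the n-by-n affinity matrix and the eigendecomposition scale at least quadratically with the number of samples; the paper's contribution is an algorithm that avoids this cost and runs in O(n).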