Learning the Optimal Neighborhood Kernel for Classification

Kernel methods have been applied successfully in many domains. The kernel matrix plays a central role in kernel-based learning methods, but the ideal kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the ideal kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix, generated from the specific application, is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.

Jun Liu, Jianhui Chen, Songcan Chen, Jieping Ye
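The sketch below is a minimal illustration of the idea summarized in the abstract, not the authors' exact algorithm: it assumes a bias-free SVM dual coupled with a Frobenius-norm penalty rho * ||G - K||_F^2 that ties the learned neighborhood kernel G to the pre-specified kernel K. Under that assumption, eliminating G yields a rank-one update G = K + (1/(4*rho)) * v v^T with v = alpha * y, and a quartic objective in alpha that is handled here with a simple projected (constrained) gradient step; the level method and the exact constraints of the paper are not reproduced. All names, constants, and the objective form are assumptions made for illustration.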
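```python
import numpy as np


def learn_neighborhood_kernel(K, y, C=1.0, rho=1.0, lr=1e-3, n_iter=1000):
    """Illustrative sketch (assumed formulation, not the paper's exact method).

    K : (n, n) pre-specified kernel matrix
    y : (n,) labels in {-1, +1}
    Returns the learned neighborhood kernel G and the dual variables alpha.
    """
    n = K.shape[0]
    alpha = np.full(n, 0.01)
    for _ in range(n_iter):
        v = alpha * y
        # Assumed quartic objective in alpha:
        #   sum(alpha) - 0.5 * v^T K v - (1 / (16 * rho)) * (v^T v)^2
        grad = 1.0 - y * (K @ v) - (1.0 / (4.0 * rho)) * (v @ v) * (y * v)
        # Projected gradient ascent: project back onto the box constraints [0, C]^n.
        alpha = np.clip(alpha + lr * grad, 0.0, C)
    v = alpha * y
    # Learned kernel = pre-specified kernel + rank-one matrix (assumed closed form).
    G = K + (1.0 / (4.0 * rho)) * np.outer(v, v)
    return G, alpha
```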