Two dimensional Large Margin Nearest Neighbor for Matrix Classification
Kun Song, Feiping Nie, Junwei Han
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2751-2757.
https://doi.org/10.24963/ijcai.2017/383
Matrices are a common form of data encountered in a wide range of real-world applications, and classifying such data is an important research topic. In this paper, we propose a novel distance metric learning method, named two-dimensional large margin nearest neighbor (2DLMNN), for improving the performance of the k-nearest-neighbor (KNN) classifier in matrix classification. In the proposed method, left and right projection matrices are employed to define a matrix-based Mahalanobis distance, which is used to construct an objective that separates points in different classes by a large margin. These two projection matrices contain far fewer parameters than the vector-based counterpart, so our method reduces the risk of overfitting. We also introduce a framework for solving the proposed 2DLMNN, and analyze its convergence behavior, initialization, and parameter determination. Compared with vector-based methods, 2DLMNN performs better for matrix data classification. Promising experimental results on several data sets demonstrate the effectiveness of our method.
Keywords:
Machine Learning: Data Mining
Machine Learning: Machine Learning
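The abstract's central construction is a matrix-based Mahalanobis distance built from left and right projection matrices. A minimal NumPy sketch of that idea follows, assuming the bilinear form d(X, Y) = ||L (X − Y) R||_F², which matches the description above; the function name, dimensions, and random data are illustrative, not taken from the paper.

```python
import numpy as np

def matrix_mahalanobis_dist(X, Y, L, R):
    """Matrix-based Mahalanobis distance d(X, Y) = ||L (X - Y) R||_F^2,
    where L is a left projection and R is a right projection matrix.
    (Illustrative sketch; the paper learns L and R from labeled data.)"""
    D = L @ (X - Y) @ R          # project the difference matrix from both sides
    return float(np.sum(D ** 2))  # squared Frobenius norm

# Toy example with 5x4 matrix samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))
Y = rng.standard_normal((5, 4))
L = rng.standard_normal((3, 5))  # left projection: 3*5 = 15 parameters
R = rng.standard_normal((4, 2))  # right projection: 4*2 = 8 parameters
d = matrix_mahalanobis_dist(X, Y, L, R)
```

Note the parameter-count argument from the abstract: L and R together hold 3·5 + 4·2 = 23 parameters here, whereas a vector-based metric on the flattened 20-dimensional samples with a comparable 6-dimensional projection would need 6·20 = 120, which is why the matrix form carries a lower overfitting risk.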