Multi-Level Metric Learning via Smoothed Wasserstein Distance
Jie Xu, Lei Luo, Cheng Deng, Heng Huang
Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence
Main track. Pages 2919-2925.
https://doi.org/10.24963/ijcai.2018/405
Traditional metric learning methods aim to learn a single Mahalanobis distance matrix M, which, however, is not discriminative enough to characterize complex and heterogeneous data. Moreover, if the descriptors of the data are not strictly aligned, the Mahalanobis distance fails to exploit the relations among them. To tackle these problems, we propose a multi-level metric learning method that uses a smoothed Wasserstein distance to characterize the discrepancy between any two samples, with a Mahalanobis distance serving as the ground metric. Since the smoothed Wasserstein distance provides not only a distance value but also a flow network indicating how probability mass is optimally transported between bins, it is effective for comparing two samples whether or not they are aligned. In addition, to make full use of the global and local structures in the data features, we model the commonalities shared across classes through a common distance matrix and the class-specific idiosyncrasies through additional auxiliary distance matrices. An efficient algorithm is developed to solve the proposed model. Experimental evaluations on four standard databases show that our method clearly outperforms other state-of-the-art methods.
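The core computation the abstract describes can be sketched numerically: an entropy-smoothed Wasserstein distance computed by Sinkhorn iterations, with a squared Mahalanobis distance as the ground cost. This is a minimal illustration, not the paper's algorithm; the function names, the regularization strength `lam`, and the fixed iteration count are assumptions made here for the sketch.

```python
import numpy as np

def mahalanobis_cost(X, Y, M):
    # Pairwise ground costs between the descriptor bins of two samples:
    # C[i, j] = (x_i - y_j)^T M (x_i - y_j), with M positive semidefinite.
    diff = X[:, None, :] - Y[None, :, :]          # shape (n, m, d)
    return np.einsum('ijd,de,ije->ij', diff, M, diff)

def smoothed_wasserstein(a, b, C, lam=1.0, n_iter=200):
    # Entropy-smoothed Wasserstein distance via Sinkhorn iterations.
    # a, b: probability masses on the bins of the two samples; C: ground cost.
    # lam is the (assumed) smoothing parameter; larger lam -> less smoothing.
    K = np.exp(-lam * C)                          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    T = u[:, None] * K * v[None, :]               # transport plan (the flow network)
    return np.sum(T * C), T

# Usage: compare two samples with different numbers of descriptor bins,
# which a plain Mahalanobis distance between fixed positions could not align.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(4, 3)), rng.normal(size=(5, 3))
M = np.eye(3)                                     # identity ground metric for the demo
a, b = np.full(4, 0.25), np.full(5, 0.2)
dist, T = smoothed_wasserstein(a, b, mahalanobis_cost(X, Y, M))
```

Because the transport plan `T` is returned alongside the distance value, one can inspect which bins of one sample were matched to which bins of the other, which is the "flow network" interpretation the abstract highlights.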
Keywords: Machine Learning; Computer Vision