Measuring the Discrepancy between Conditional Distributions: Methods, Properties and Applications
Shujian Yu, Ammar Shaker, Francesco Alesiani, Jose Principe
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 2777-2784.
https://doi.org/10.24963/ijcai.2020/385
We propose a simple yet powerful test statistic to quantify the discrepancy between two conditional distributions. The new statistic avoids explicit estimation of the underlying distributions in high-dimensional space; instead, it operates on the cone of symmetric positive semidefinite (SPS) matrices using the Bregman matrix divergence. Moreover, it inherits the merits of the correntropy function to explicitly incorporate high-order statistics of the data. We present the properties of our new statistic and illustrate its connections to prior art. Finally, we demonstrate its utility and advantage on three different machine learning problems: multi-task learning over graphs, concept drift detection, and information-theoretic feature selection. Code for our statistic is available at https://bit.ly/BregmanCorrentropy.
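To make the two main ingredients concrete, the following is a minimal sketch, not the authors' released implementation: it builds Gaussian-kernel (correntropy-style) Gram matrices from two samples and compares them with the von Neumann divergence, one member of the Bregman matrix divergence family on the SPS cone. The function names, the kernel bandwidth `sigma`, and the eigenvalue floor `eps` are illustrative choices, not from the paper.

```python
import numpy as np

def gram_correntropy(x, sigma=1.0):
    # Gaussian-kernel Gram matrix: entry (i, j) is the correntropy-style
    # similarity exp(-||x_i - x_j||^2 / (2 sigma^2)) between sample pairs.
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def _logm_spd(a, eps=1e-10):
    # Matrix logarithm of a symmetric PSD matrix via eigendecomposition,
    # flooring eigenvalues at eps for numerical stability near the
    # boundary of the SPS cone.
    w, v = np.linalg.eigh(a)
    w = np.maximum(w, eps)
    return (v * np.log(w)) @ v.T

def von_neumann_divergence(a, b):
    # Bregman matrix divergence generated by the von Neumann entropy:
    # D(A || B) = tr(A log A - A log B - A + B); zero iff A == B.
    return float(np.trace(a @ (_logm_spd(a) - _logm_spd(b)) - a + b))
```

Usage: compute `von_neumann_divergence(gram_correntropy(x), gram_correntropy(y))` for two samples `x`, `y`; a larger value indicates a larger distributional discrepancy, and no density in the high-dimensional input space is ever estimated.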
Keywords:
Machine Learning: Time-series; Data Streams
Machine Learning: Transfer, Adaptation, Multi-task Learning
Data Mining: Theoretical Foundations