Proceedings Abstracts of the Twenty-Fifth International Joint Conference on Artificial Intelligence

Distance-Preserving Probabilistic Embeddings with Side Information: Variational Bayesian Multidimensional Scaling Gaussian Process (p. 2011)
Harold Soh

Embeddings or vector representations of objects have been used with remarkable success in various machine learning and AI tasks — from dimensionality reduction and data visualization, to vision and natural language processing. In this work, we seek probabilistic embeddings that faithfully represent observed relationships between objects (e.g., physical distances, preferences). We derive a novel variational Bayesian variant of multidimensional scaling that (i) provides a posterior distribution over latent points without computationally heavy Markov chain Monte Carlo (MCMC) sampling, and (ii) can leverage existing side information using sparse Gaussian processes (GPs) to learn a nonlinear mapping to the embedding. By partitioning entities, our method naturally handles incomplete side information from multiple domains, e.g., in product recommendation where ratings are available, but not all users and items have associated profiles. Furthermore, the derived approximate bounds can be used to discover the intrinsic dimensionality of the data and limit embedding complexity. We demonstrate the effectiveness of our methods empirically on three synthetic problems and on the real-world tasks of political unfolding analysis and multi-sensor localization.
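For context, the deterministic baseline that the variational Bayesian variant generalizes is classical (metric) multidimensional scaling, which recovers point coordinates whose pairwise Euclidean distances match a given distance matrix. The following is a minimal NumPy sketch of that baseline only — it is not the paper's variational method, and the function name and toy data are illustrative assumptions:

```python
import numpy as np

def classical_mds(D, d=2):
    """Classical (metric) MDS: embed n points in d dimensions so that
    pairwise Euclidean distances approximate the entries of D.
    (Illustrative baseline; the paper adds a variational Bayesian
    posterior over the latent points on top of this idea.)"""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]              # keep top-d components
    L = np.sqrt(np.maximum(w[idx], 0.0))       # clip tiny negatives from noise
    return V[:, idx] * L                       # n x d coordinate matrix

# Toy example (assumed data): three collinear points at 0, 3, 7 on a line.
pts = np.array([[0.0], [3.0], [7.0]])
D = np.abs(pts - pts.T)                        # exact pairwise distances
X = classical_mds(D, d=1)
Dhat = np.abs(X - X.T)                         # distances of the embedding
print(np.allclose(D, Dhat))                    # recovered up to reflection/shift
```

Because the input distances here are exactly Euclidean in one dimension, the one-dimensional embedding reproduces them perfectly; on noisy or non-Euclidean data the top-d eigenvalues also hint at the intrinsic dimensionality, the quantity the paper's approximate bounds estimate probabilistically.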