Efficient Inference for Untied MLNs

Somdeb Sarkhel, Deepak Venugopal, Nicholas Ruozzi, Vibhav Gogate

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 4617-4624. https://doi.org/10.24963/ijcai.2017/644

We address the problem of scaling up local-search and sampling-based inference in Markov logic networks (MLNs) that have large shared sub-structures but no (or few) tied weights. Such untied MLNs are ubiquitous in practical applications. However, they have very few symmetries, and as a result lifted inference algorithms, the dominant approach for scaling up inference, perform poorly on them. The key idea in our approach is to reduce the hard, time-consuming sub-task in sampling algorithms (computing the sum of weights of features that satisfy a full assignment) to the problem of computing a set of partition functions of graphical models, each defined over the logical variables in a first-order formula. The importance of this reduction is that when the treewidth of all the graphical models is small, it yields an order-of-magnitude speedup. When the treewidth is large, we propose an over-symmetric approximation and experimentally demonstrate that it is both fast and accurate.
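The reduction described above can be illustrated on a toy example. The sketch below, with entirely hypothetical atom names and domain, counts the groundings of a conjunctive formula Smokes(x) ∧ Friends(x, y) that are satisfied by a fixed full assignment. Treating each atom's truth values under that assignment as a factor over the formula's logical variables, the count equals the partition function of a small graphical model, which variable elimination computes in time exponential only in the model's treewidth rather than by enumerating all groundings. This is only a minimal sketch of the general idea, not the paper's algorithm.

```python
from itertools import product

# Hypothetical domain and a full assignment (a "world") to the ground
# atoms of Smokes(x) and Friends(x, y).
domain = ["A", "B", "C"]
smokes = {"A": 1, "B": 0, "C": 1}
friends = {(a, b): int(a != b) for a in domain for b in domain}

# Brute force: enumerate all groundings (x, y) of Smokes(x) & Friends(x, y)
# and count the satisfied ones.
brute = sum(smokes[x] * friends[(x, y)] for x, y in product(domain, domain))

# Reduction: each atom contributes a factor over the logical variables
# {x, y}; the satisfied-grounding count is the partition function of this
# tiny graphical model. Variable elimination (eliminate y, then x) avoids
# enumerating the full cross product.
msg_to_x = {x: sum(friends[(x, y)] for y in domain) for x in domain}  # eliminate y
partition_fn = sum(smokes[x] * msg_to_x[x] for x in domain)           # eliminate x

print(brute, partition_fn)  # the two counts agree
```

For weighted formulas, the same computation with weights in place of 0/1 indicators yields the sum of weights of satisfied groundings, which is the quantity the sampling algorithms repeatedly need.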
Keywords:
Uncertainty in AI: Approximate Probabilistic Inference
Uncertainty in AI: Graphical Models
Uncertainty in AI: Relational Inference