Computing Explanations in Problem Solving: A Review of Formal Approaches
Barry O'Sullivan and Ulrich Junker
Explanation generation is an essential feature of intelligent systems. Explanations are important in interactive systems such as product configurators, personal assistants, expert systems, and model development tools. Explanation techniques are also used to improve the quality of results in test generation, software verification, and decomposition methods in combinatorial optimization.
This tutorial develops a general theory of explanation generation in problem solving, which unifies existing methods and is valid for a large range of AI problems (product configuration, multi-criteria optimization, constraint satisfaction, satisfiability, recommender systems, case-based reasoning, diagnosis, debugging, ontological reasoning, etc.). Only basic background knowledge of AI is required.
The tutorial will characterize problems that need explanations while analyzing the explanation requirements of different types of users. We will introduce different notions of explanation such as minimal and preferred explanations, abstraction and refinement, representative explanations, explanations of solutions, and explanations of optimality. The computation of one, several, or all explanations can be achieved by parameterized algorithms, which work with arbitrary problem solvers such as constraint solvers. The tutorial studies the relationship between these new algorithms and classic methods. It concludes with a list of successful applications of explanation generation, ranging from product configurators and recommender systems to debuggers, planning, and the web.
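To give a flavor of such parameterized algorithms, the following is a minimal sketch of the QuickXplain-style recursion for computing a minimal (preferred) explanation of inconsistency. The `consistent` predicate stands in for an arbitrary problem solver; the toy checker and the integer "constraints" below are illustrative assumptions, not part of the tutorial material.

```python
def quickxplain(background, constraints, consistent):
    """Return a minimal subset of `constraints` that, together with
    `background`, is inconsistent (a minimal conflict / explanation).
    `consistent` is any solver-provided predicate on constraint lists."""
    if not constraints or consistent(background + constraints):
        return None  # no conflict to explain

    def qx(base, delta, cs):
        # If the newly added constraints `delta` already make `base`
        # inconsistent, no constraint from `cs` is needed.
        if delta and not consistent(base):
            return []
        if len(cs) == 1:
            return list(cs)
        # Split, then explain each half against the other (divide and conquer).
        k = len(cs) // 2
        c1, c2 = cs[:k], cs[k:]
        d2 = qx(base + c1, c1, c2)
        d1 = qx(base + d2, d2, c1)
        return d1 + d2

    return qx(background, background, constraints)


# Toy example: constraints are integers; a set is inconsistent
# exactly when it contains both 1 and -1 (a hypothetical checker).
def toy_consistent(cs):
    return not (1 in cs and -1 in cs)

conflict = quickxplain([], [2, 1, 3, -1, 4], toy_consistent)
# The minimal conflict is {1, -1}; irrelevant constraints 2, 3, 4 are pruned.
```

The recursion performs O(k log n) consistency checks for a conflict of size k among n constraints, which is why it scales to solver-backed checks; ordering the constraints by preference yields the preferred explanations discussed in the tutorial.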
Barry O'Sullivan is Associate Director of the Cork Constraint Computation Centre at University College Cork. He is the President of the Association for Constraint Programming, Chairman of the Artificial Intelligence Association of Ireland, and Coordinator of the ERCIM Working Group on Constraints. His research interests are constraint programming and optimisation.
Ulrich Junker received his Ph.D. from the University of Kaiserslautern (1992). He has significant experience in explanation generation, starting with research on truth maintenance systems and model-based diagnosis. His pioneering work on QuickXplain paved new directions for explanation generation in constraint programming, product configuration, and other fields. He is a distinguished scientist at ILOG.