Guest Talk: Gabriele Kern-Isberner: Towards Lifted Inference: Counting Strategies for Relational Maximum Entropy Reasoning
Wednesday, November 04, 2020, 10:30am
Location: Online session
Speaker: Gabriele Kern-Isberner
The principle of maximum entropy (MaxEnt) constitutes a meaningful methodology for drawing nonmonotonic inferences from probabilistic conditional knowledge, as it satisfies some fundamental principles of commonsense reasoning. As with other approaches to probabilistic reasoning in relational settings, straightforward maximum entropy computations suffer from an exponential dependence on the size of the underlying domain, which can lead to intractability in many cases. To overcome this problem, we adopt techniques from weighted first-order model counting (WFOMC) which exploit symmetries and interchangeabilities among the domain elements in order to count models of sentences more efficiently. To meet the requirements of our formalization of knowledge by conditionals, we assign a type to the counted models which captures the three-valued evaluation of the conditionals, i.e., the conditional structure of the models. We present the resulting variant of model counting, which we call 'typed model counting', and discuss its benefits by means of some illustrative examples.
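The counting ideas in the abstract can be illustrated with a minimal sketch (not taken from the talk; the predicates `smokes`/`cancer`, the function names, and the specific counting formulas are illustrative assumptions). Brute-force enumeration over a domain of n elements visits 2^(2n) interpretations, whereas exploiting the interchangeability of domain elements reduces the count to a closed form; the typed variant additionally partitions models by how many elements verify, falsify, or are not affected by a conditional:

```python
from itertools import product
from math import comb

def brute_force_count(n):
    """Count interpretations over a domain of n elements satisfying the
    universally quantified sentence: forall x, smokes(x) -> cancer(x).
    Enumerates all 2^(2n) interpretations -- exponential in the domain size."""
    count = 0
    for bits in product([False, True], repeat=2 * n):
        smokes, cancer = bits[:n], bits[n:]
        if all((not s) or c for s, c in zip(smokes, cancer)):
            count += 1
    return count

def lifted_count(n):
    """Exploit interchangeability of domain elements: each element
    independently has 3 of its 4 local truth assignments satisfying
    smokes(x) -> cancer(x), so the model count factorizes as 3^n."""
    return 3 ** n

def typed_count(n, k, m):
    """Sketch of a 'type' in the sense of a three-valued conditional
    evaluation: count interpretations in which the conditional
    (cancer(x) | smokes(x)) is verified by exactly k elements
    (smokes and cancer), falsified by exactly m elements (smokes and
    not cancer), and not applicable to the remaining n-k-m elements
    (not smokes; cancer is then unconstrained, giving 2 local choices)."""
    return comb(n, k) * comb(n - k, m) * 2 ** (n - k - m)
```

Summing `typed_count` over all type vectors recovers the total number of interpretations (4^n), and restricting to falsifier-free types (m = 0) recovers the lifted model count (3^n), mirroring how typed counts refine plain model counts.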