Hybrid Probabilistic Inference with Algebraic and Logical Constraints

In this tutorial we study probabilistic inference in the presence of algebraic and logical constraints. We cover the theoretical foundations and computational challenges of probabilistic inference in constrained settings, while also exploring highly relevant applications, such as probabilistic formal verification of hybrid systems and learning models that satisfy constraints by construction.

With the increasing pervasiveness of AI and machine learning in our day-to-day lives, there is an urgent need for systems that are not only intelligent but also safe, fair and verifiable. Deploying such systems in the real world often amounts to performing probabilistic reasoning in domains that exhibit both continuous (i.e. quantitative) and discrete (i.e. qualitative or logical) traits. Such domains are often referred to as hybrid. Safety and fairness are then enforced by constraining the hybrid domain with algebraic or logical constraints, which further complicates probabilistic inference. Moreover, logical and algebraic constraints can arise from the environment itself, such as laws of physics, or be instrumental in characterizing the "correct" behaviour of the system, such as safety properties that must be satisfied.

In this tutorial we present weighted model integration (WMI), a framework that reduces probabilistic inference in hybrid domains to computing the weighted volume of the models of discrete-continuous logical formulas. It thereby constitutes a generalization of the already widely deployed weighted model counting (WMC) framework. In the first part of the tutorial, we incrementally build up the theoretical background of WMI -- a journey that starts from purely logical reasoning and leads all the way up to probabilistic reasoning in hybrid domains, with pointers to the challenges involved and connections to WMC. In the second part, we provide a practical perspective on probabilistic inference in hybrid domains by discussing applications of WMI, as well as describing the most promising openly available tools and approaches.
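To make the reduction concrete, here is a minimal, self-contained sketch of a toy WMI problem solved by naive Monte Carlo integration. The formula, weight function, and solver below are illustrative assumptions for this page (they are not taken from the tutorial material or from pywmi's API): the problem has one Boolean variable b, one real variable x on [0, 1], the formula b ∨ (x ≥ 0.5), and the weight 2x if b is true and 1 otherwise.

```python
import random

# Hypothetical toy WMI instance (illustrative only, not pywmi's API):
#   Boolean b, real x in [0, 1]
#   phi(x, b) = b or (x >= 0.5)
#   w(x, b)   = 2*x if b else 1
def phi(x, b):
    return b or x >= 0.5

def weight(x, b):
    return 2.0 * x if b else 1.0

def wmi(formula, w, n_samples=200_000, seed=0):
    """Monte Carlo estimate of WMI: sum, over the Boolean assignments,
    of the integral of w over the models of the formula."""
    rng = random.Random(seed)
    total = 0.0
    for b in (True, False):          # enumerate the discrete assignments
        acc = 0.0
        for _ in range(n_samples):   # integrate over the [0, 1] box
            x = rng.random()
            if formula(x, b):
                acc += w(x, b)
        total += acc / n_samples     # the bounding box has volume 1
    return total

Z = wmi(phi, weight)  # exact value: 1 (b true) + 0.5 (b false) = 1.5
query = wmi(lambda x, b: phi(x, b) and x >= 0.75, weight)
print(Z, query / Z)   # conditional probability P(x >= 0.75) ~ 0.458
```

A conditional probability query is answered, exactly as in WMC, as the ratio of two weighted model integrals: the WMI of the formula conjoined with the query over the WMI of the formula alone. Exact WMI solvers replace the sampling loop with symbolic or numeric integration over the polytopes induced by the SMT(LRA) atoms.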

Program

Date: Sunday, 24th of July 2022
Room: Schubert 2

14:00-15:00  Background, Weighted Model Integration, Exact solvers
15:00-15:30  Coffee break
15:30-16:30  Approximate solvers, Applications, Future Work

Thanks for attending!
slides | pywmi package

References

  • Vaishak Belle, Andrea Passerini, and Guy Van den Broeck. Probabilistic Inference in Hybrid Domains by Weighted Model Integration. In IJCAI, 2015.
  • Paolo Morettin, Andrea Passerini, and Roberto Sebastiani. Efficient Weighted Model Integration via SMT-Based Predicate Abstraction. In IJCAI, 2017.
  • Samuel Kolb, Martin Mladenov, Scott Sanner, Vaishak Belle, and Kristian Kersting. Efficient Symbolic Integration for Probabilistic Inference. In IJCAI, 2018.
  • Pedro Zuidberg Dos Martires, Anton Dries, and Luc De Raedt. Exact and Approximate Weighted Model Integration with Probability Density Functions Using Knowledge Compilation. In AAAI, 2019.
  • Samuel Kolb, Pedro Zuidberg Dos Martires, and Luc De Raedt. How to Exploit Structure While Solving Weighted Model Integration Problems. In UAI, 2019.
  • Vincent Derkinderen, Evert Heylen, Pedro Zuidberg Dos Martires, Samuel Kolb, and Luc De Raedt. Ordering Variables for Weighted Model Integration. In UAI, 2020.
  • Vaishak Belle, Guy Van den Broeck, and Andrea Passerini. Hashing-Based Approximate Probabilistic Inference in Hybrid Domains. In UAI, 2015.
  • Ralph Abboud, Ismail Ilkan Ceylan, and Radoslav Dimitrov. Approximate Weighted Model Integration on DNF Structures. Artificial Intelligence, page 103753, 2022.
  • Samuel Kolb, Stefano Teso, Andrea Passerini, and Luc De Raedt. Learning SMT(LRA) Constraints Using SMT Solvers. In IJCAI, 2018.
  • Paolo Morettin, Samuel Kolb, Stefano Teso, and Andrea Passerini. Learning Weighted Model Integration Distributions. In AAAI, 2020.
  • Paolo Morettin, Pedro Zuidberg Dos Martires, Samuel Kolb, and Andrea Passerini. Hybrid Probabilistic Inference with Logical and Algebraic Constraints: A Survey. In IJCAI, 2021.

Paolo Morettin

KU Leuven | Leuven AI Institute

Paolo Morettin's interests lie at the intersection of machine learning and logic. In particular, he has been working on probabilistic reasoning and learning over hybrid continuous/logical domains. He has investigated the use of SMT technology for marginal inference, how to learn constrained hybrid models from unlabelled data, and which classes of problems admit tractable inference. Recently, he contributed to characterizing the classes of parametric models that can be learned optimally via maximum-likelihood estimation.

Pedro Zuidberg Dos Martires

Örebro University

Pedro Zuidberg Dos Martires has been studying weighted model integration from a knowledge representation perspective, which allowed him to formulate WMI as a generalization of algebraic model counting and standard knowledge compilation techniques. Together with Samuel Kolb (and collaborators) he has pushed the frontier of inference algorithms based on probabilistic circuits, introducing novel concepts such as lambda-SMT and variable orderings for SMT formulas. Furthermore, he has pioneered the use of Monte Carlo techniques to compute integrals in the weighted model integration setting, enabling the computation of high-dimensional integrals. Together with the other tutorial presenters he has authored a software toolbox comprising various WMI solvers, as well as the field's first survey. His ultimate goal is to leverage weighted model integration to power the inference engines of probabilistic programming languages in the discrete-continuous domain -- an endeavor currently in full swing.

Samuel Kolb

KU Leuven | Leuven AI Institute

Samuel Kolb was awarded a PhD scholarship from the Research Foundation -- Flanders (FWO), a selective scholarship for fundamental research, to work on constraint learning, probabilistic inference methods and constrained prediction. He completed his PhD in December 2019, receiving the rare summa cum laude distinction. He has published more than 11 papers at top-tier conferences and journals (Machine Learning Journal, AAAI, IJCAI, CIKM, UAI), and has been actively involved with the ERC Advanced Grant SYNTH project on automated data science, led by his supervisor Prof. Luc De Raedt.
In 2018 Samuel, along with Martin Mladenov, Scott Sanner, Vaishak Belle and Kristian Kersting, introduced the use of knowledge compilation techniques for WMI, i.e., hybrid constrained probabilistic reasoning. Together with Pedro Zuidberg Dos Martires and others, he has since co-authored multiple important contributions that established knowledge compilation as one of the main techniques for solving exact WMI problems. He led the development of a framework for hybrid constrained probabilistic inference (http://pywmi.org) that aims to make solvers and tools more accessible to researchers. Together with Paolo Morettin and others, he combined his expertise on constraint learning with probabilistic inference to learn probabilistic machine learning models with explicit constraints from unlabeled data.
Learning hybrid constrained distributions from data allows WMI to be used not only as an "assembly language" for higher-order probabilistic inference models but also as a target for probabilistic machine learning approaches. In 2021 he was awarded a grant (spin-off mandate) from VLAIO to apply the fundamental techniques developed throughout his career to commercial applications. Since then, he has focused on translating the research on probabilistic hybrid inference into solutions for real-world decision problems that involve probabilistic machine learning models.

Andrea Passerini

University of Trento

Andrea Passerini's research focuses on the combination of learning, reasoning and optimization. He pioneered the combination of machine learning and SMT in a number of early works on interactive optimization and structured-output prediction, and later introduced Learning Modulo Theory as a learning framework formalizing this combination. Together with Vaishak Belle and Guy Van den Broeck, Andrea Passerini coined the term Weighted Model Integration (WMI) to denote the extension of Weighted Model Counting to hybrid domains, in a seminal paper that started the line of work on WMI. Later contributions to WMI include approximate and special-purpose methods, exact methods based on predicate abstraction, WMI learning, as well as a WMI toolbox and a survey.