## Foundations of Probabilistic Logic Programming

### Languages, Semantics, Inference and Learning

Book title: Foundations of Probabilistic Logic Programming

Languages, Semantics, Inference and Learning

**Second edition**

Author: Fabrizio Riguzzi, University of Ferrara, Italy

With a Foreword by Agostino Dovier, University of Udine, Italy

Publisher: River Publishers

Series: River Publishers Series in Software Engineering

ISBN: 9788770227193

2022

Sample content: Foreword, Preface to the 2nd Edition, Preface, Chapter 2

## Abstract

Probabilistic logic programming extends logic programming by enabling the representation of uncertain information by means of probability theory. Probabilistic logic programming is at the intersection of two wider research fields: the integration of logic and probability and probabilistic programming. Logic enables the representation of complex relations among entities, while probability theory is useful for modeling uncertainty over attributes and relations. Combining the two is a very active field of study. Probabilistic programming extends programming languages with probabilistic primitives that can be used to write complex probabilistic models. Algorithms for inference and learning tasks are then provided automatically by the system. Probabilistic logic programming is at the same time a logic language, with its knowledge representation capabilities, and a Turing-complete language, with its computation capabilities, thus providing the best of both worlds.

Since its birth, the field of probabilistic logic programming has seen a steady increase of activity, with many proposals for languages and algorithms for inference and learning. This book aims at providing an overview of the field with a special emphasis on languages under the distribution semantics, one of the most influential approaches. The book presents the main ideas for semantics, inference, and learning and highlights connections between the methods. Many of the examples in the book include a link to a page of the web application http://cplint.eu where the code can be run online.
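To give a flavor of the distribution semantics the book focuses on, here is a minimal Python sketch (not from the book; the program, probabilities, and function names are illustrative). A ProbLog-style program such as `0.7::flu. 0.2::hay_fever. sneezing :- flu. sneezing :- hay_fever.` defines a distribution over "worlds", one per truth assignment to the probabilistic facts; the probability of a query is the total probability of the worlds where it holds, here estimated by sampling.

```python
import random

# Probabilistic facts of the illustrative program:
# 0.7::flu.  0.2::hay_fever.
FACTS = {"flu": 0.7, "hay_fever": 0.2}

def sample_world(rng):
    # Under the distribution semantics, each probabilistic fact is an
    # independent Boolean choice; a world is one joint assignment.
    return {fact: rng.random() < p for fact, p in FACTS.items()}

def sneezing(world):
    # Rules: sneezing :- flu.  sneezing :- hay_fever.
    return world["flu"] or world["hay_fever"]

def estimate(n=100_000, seed=42):
    # Monte Carlo estimate of P(sneezing): fraction of sampled
    # worlds in which the query succeeds.
    rng = random.Random(seed)
    hits = sum(sneezing(sample_world(rng)) for _ in range(n))
    return hits / n

# Exact value: 1 - (1 - 0.7) * (1 - 0.2) = 0.76
print(estimate())
```

Exact inference systems such as those described in the book compute this probability symbolically (e.g., via knowledge compilation) rather than by sampling; the sketch only illustrates what the semantics assigns probability to.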

This 2nd edition aims at reporting the most exciting novelties in the field since the publication of the 1st edition. The semantics for hybrid programs with function symbols was placed on a sound footing. Probabilistic Answer Set Programming gained a lot of interest together with studies on the complexity of inference. Algorithms for solving the MPE and MAP tasks are now available. Inference for hybrid programs has changed dramatically with the introduction of Weighted Model Integration.

With respect to learning, the first approaches to neuro-symbolic integration have appeared, together with algorithms for learning the structure of hybrid programs. Moreover, given the cost of learning PLPs, various works proposed language restrictions to speed up learning and improve its scaling.

**Keywords:**

Probabilistic logic programming, statistical relational learning, statistical relational artificial intelligence, distribution semantics, graphical models, artificial intelligence, machine learning

### Table of Contents

- Foreword by Agostino Dovier
- Preface to the 2nd Edition
- Preface
- Acknowledgements

1 Preliminaries

1.1 Orders, Lattices, Ordinals

1.2 Mappings and Fixpoints

1.3 Logic Programming

1.4 Semantics for Normal Logic Programs

1.4.1 Program Completion

1.4.2 Well-Founded Semantics

1.4.3 Stable Model Semantics

1.5 Probability Theory

1.6 Probabilistic Graphical Models

2 Probabilistic Logic Programming Languages

2.1 Languages with the Distribution Semantics

2.1.1 Logic Programs with Annotated Disjunctions

2.1.2 ProbLog

2.1.3 Probabilistic Horn Abduction

2.1.4 PRISM

2.2 The Distribution Semantics for Programs Without Function Symbols

2.3 Examples of Programs

2.4 Equivalence of Expressive Power

2.5 Translation into Bayesian Networks

2.6 Generality of the Distribution Semantics

2.7 Extensions of the Distribution Semantics

2.8 CP-Logic

2.9 KBMC Probabilistic Logic Programming Languages

2.9.1 Bayesian Logic Programs

2.9.2 CLP(BN)

2.9.3 The Prolog Factor Language

2.10 Other Semantics for Probabilistic Logic Programming

2.10.1 Stochastic Logic Programs

2.10.2 ProPPR

2.11 Other Semantics for Probabilistic Logics

2.11.1 Nilsson’s Probabilistic Logic

2.11.2 Markov Logic Networks

2.11.2.1 Encoding Markov Logic Networks with Probabilistic Logic Programming

2.11.3 Annotated Probabilistic Logic Programs

3 Semantics with Function Symbols

3.1 The Distribution Semantics for Programs with Function Symbols

3.2 Infinite Covering Set of Explanations

3.3 Comparison with Sato and Kameya’s Definition

4 Hybrid Programs

4.1 Hybrid ProbLog

4.2 Distributional Clauses

4.3 Extended PRISM

4.4 cplint Hybrid Programs

4.5 Probabilistic Constraint Logic Programming

4.5.1 Dealing with Imprecise Probability Distributions

5 Semantics for Hybrid Programs with Function Symbols

5.1 Examples of PCLP with Function Symbols

5.2 Preliminaries

5.3 The Semantics of PCLP is Well-Defined

6 Probabilistic Answer Set Programming

6.1 A Semantics for Unsound Programs

6.2 Features of Answer Set Programming

6.3 Probabilistic Answer Set Programming

7 Complexity of Inference

7.1 Inference Tasks

7.2 Background on Complexity Theory

7.3 Complexity for Nonprobabilistic Inference

7.4 Complexity for Probabilistic Programs

7.4.1 Complexity for Acyclic and Locally Stratified Programs

7.4.2 Complexity Results from [Mauá and Cozman, 2020]

8 Exact Inference

8.1 PRISM

8.2 Knowledge Compilation

8.3 ProbLog1

8.4 cplint

8.5 SLGAD

8.6 PITA

8.7 ProbLog2

8.8 TP Compilation

8.9 MPE and MAP

8.9.1 MAP and MPE in ProbLog

8.9.2 MAP and MPE in PITA

8.10 Modeling Assumptions in PITA

8.10.1 PITA(OPT)

8.10.2 VIT with PITA

8.11 Inference for Queries with an Infinite Number of Explanations

9 Lifted Inference

9.1 Preliminaries on Lifted Inference

9.1.1 Variable Elimination

9.1.2 GC-FOVE

9.2 LP2

9.2.1 Translating ProbLog into PFL

9.3 Lifted Inference with Aggregation Parfactors

9.4 Weighted First-Order Model Counting

9.5 Cyclic Logic Programs

9.6 Comparison of the Approaches

10 Approximate Inference

10.1 ProbLog1

10.1.1 Iterative Deepening

10.1.2 k-best

10.1.3 Monte Carlo

10.2 MCINTYRE

10.3 Approximate Inference for Queries with an Infinite Number of Explanations

10.4 Conditional Approximate Inference

10.5 k-Optimal

10.6 Explanation-Based Approximate Weighted Model Counting

10.7 Approximate Inference with TP-compilation

11 Non-Standard Inference

11.1 Possibilistic Logic Programming

11.2 Decision-Theoretic ProbLog

11.3 Algebraic ProbLog

12 Inference for Hybrid Programs

12.1 Inference for Extended PRISM

12.2 Inference with Weighted Model Integration

12.2.1 Weighted Model Integration

12.2.2 Algebraic Model Counting

12.2.2.1 The Probability Density Semiring and WMI

12.2.2.2 Symbo

12.2.2.3 Sampo

12.3 Approximate Inference by Sampling for Hybrid Programs

12.4 Approximate Inference with Bounded Error for Hybrid Programs

12.5 Approximate Inference for the DISTR and EXP Tasks

13 Parameter Learning

13.1 PRISM Parameter Learning

13.2 LLPAD and ALLPAD Parameter Learning

13.3 LeProbLog

13.4 EMBLEM

13.5 ProbLog2 Parameter Learning

13.6 Parameter Learning for Hybrid Programs

13.7 DeepProbLog

13.7.1 DeepProbLog Inference

13.7.2 Learning in DeepProbLog

14 Structure Learning

14.1 Inductive Logic Programming

14.2 LLPAD and ALLPAD Structure Learning

14.3 ProbLog Theory Compression

14.4 ProbFOIL and ProbFOIL+

14.5 SLIPCOVER

14.5.1 The Language Bias

14.5.2 Description of the Algorithm

14.5.2.1 Function INITIALBEAMS

14.5.2.2 Beam Search with Clause Refinements

14.5.3 Execution Example

14.6 Learning the Structure of Hybrid Programs

14.7 Scaling PILP

14.7.1 LIFTCOVER

14.7.1.1 Liftable PLP

14.7.1.2 Parameter Learning

14.7.1.3 Structure Learning

14.7.2 SLEAHP

14.7.2.1 Hierarchical Probabilistic Logic Programs

14.7.2.2 Parameter Learning

14.7.2.3 Structure Learning

14.8 Examples of Datasets

15 cplint Examples

15.1 cplint Commands

15.2 Natural Language Processing

15.2.1 Probabilistic Context-Free Grammars

15.2.2 Probabilistic Left Corner Grammars

15.2.3 Hidden Markov Models

15.3 Drawing Binary Decision Diagrams

15.4 Gaussian Processes

15.5 Dirichlet Processes

15.5.1 The Stick-Breaking Process

15.5.2 The Chinese Restaurant Process

15.5.3 Mixture Model

15.6 Bayesian Estimation

15.7 Kalman Filter

15.8 Stochastic Logic Programs

15.9 Tile Map Generation

15.10 Markov Logic Networks

15.11 Truel

15.12 Coupon Collector Problem

15.13 One-Dimensional Random Walk

15.14 Latent Dirichlet Allocation

15.15 The Indian GPA Problem

15.16 Bongard Problems

16 Conclusions

Bibliography

## First Edition

ISBN: 9788770220187

e-ISBN: 9788770220170

Get it from:

- the publisher
- Amazon.com
- Amazon.eu
- electronic version from Google Play
- other sellers on Google Books

Sample content: Table of contents, Foreword, Preface, Chapter 2

LaTeX sources of slides for some of the chapters are available from GitHub.

## Abstract

Probabilistic Logic Programming extends Logic Programming by enabling the representation of uncertain information. Probabilistic Logic Programming is at the intersection of two wider research fields: the integration of logic and probability and Probabilistic Programming.

Logic enables the representation of complex relations among entities, while probability theory is useful for modeling uncertainty over attributes and relations. Combining the two is a very active field of study. Probabilistic Programming extends programming languages with probabilistic primitives that can be used to write complex probabilistic models. Algorithms for the inference and learning tasks are then provided automatically by the system.

Probabilistic Logic programming is at the same time a logic language, with its knowledge representation capabilities, and a Turing complete language, with its computation capabilities, thus providing the best of both worlds.

Since its birth, the field of Probabilistic Logic Programming has seen a steady increase of activity, with many proposals for languages and algorithms for inference and learning. Foundations of Probabilistic Logic Programming aims at providing an overview of the field with a special emphasis on languages under the Distribution Semantics, one of the most influential approaches. The book presents the main ideas for semantics, inference, and learning and highlights connections between the methods.

Many examples of the book include a link to a page of the web application http://cplint.eu where the code can be run online.

**Keywords:**

Probabilistic logic programming, statistical relational learning, statistical relational artificial intelligence, distribution semantics, graphical models, artificial intelligence, machine learning

### Table of Contents

- Foreword by Agostino Dovier
- Preface
- Acknowledgements

- 1 Preliminaries
- 1.1 Orders, Lattices, Ordinals
- 1.2 Mappings and Fixpoints
- 1.3 Logic Programming
- 1.4 Semantics for Normal Logic Programs
- 1.4.1 Program Completion
- 1.4.2 Well-Founded Semantics
- 1.4.3 Stable Model Semantics

- 1.5 Probability Theory
- 1.6 Probabilistic Graphical Models

- 2 Probabilistic Logic Programming Languages
- 2.1 Languages with the Distribution Semantics
- 2.1.1 Logic Programs with Annotated Disjunctions
- 2.1.2 ProbLog
- 2.1.3 Probabilistic Horn Abduction
- 2.1.4 PRISM

- 2.2 The Distribution Semantics for Programs Without Function Symbols
- 2.3 Examples of Programs
- 2.4 Equivalence of Expressive Power
- 2.5 Translation to Bayesian Networks
- 2.6 Generality of the Distribution Semantics
- 2.7 Extensions of the Distribution Semantics
- 2.8 CP-Logic
- 2.9 Semantics for Non-Sound Programs
- 2.10 KBMC Probabilistic Logic Programming Languages
- 2.10.1 Bayesian Logic Programs
- 2.10.2 CLP(BN)
- 2.10.3 The Prolog Factor Language

- 2.11 Other Semantics for Probabilistic Logic Programming
- 2.11.1 Stochastic Logic Programs
- 2.11.2 ProPPR

- 2.12 Other Semantics for Probabilistic Logics
- 2.12.1 Nilsson’s Probabilistic Logic
- 2.12.2 Markov Logic Networks
- 2.12.2.1 Encoding Markov Logic Networks with Probabilistic Logic Programming

- 2.12.3 Annotated Probabilistic Logic Programs

- 3 Semantics with Function Symbols
- 3.1 The Distribution Semantics for Programs with Function Symbols
- 3.2 Infinite Covering Set of Explanations
- 3.3 Comparison with Sato and Kameya’s Definition

- 4 Semantics for Hybrid Programs
- 4.1 Hybrid ProbLog
- 4.2 Distributional Clauses
- 4.3 Extended PRISM
- 4.4 cplint Hybrid Programs
- 4.5 Probabilistic Constraint Logic Programming
- 4.5.1 Dealing with Imprecise Probability Distributions

- 5 Exact Inference
- 5.1 PRISM
- 5.2 Knowledge Compilation
- 5.3 ProbLog1
- 5.4 cplint
- 5.5 SLGAD
- 5.6 PITA
- 5.7 ProbLog2
- 5.8 TP Compilation
- 5.9 Modeling Assumptions in PITA
- 5.9.1 PITA(OPT)
- 5.9.2 MPE with PITA

- 5.10 Inference for Queries with an Infinite Number of Explanations
- 5.11 Inference for Hybrid Programs

- 6 Lifted Inference
- 6.1 Preliminaries on Lifted Inference
- 6.1.1 Variable Elimination
- 6.1.2 GC-FOVE

- 6.2 LP2
- 6.2.1 Translating ProbLog into PFL

- 6.3 Lifted Inference with Aggregation Parfactors
- 6.4 Weighted First-Order Model Counting
- 6.5 Cyclic Logic Programs
- 6.6 Comparison of the Approaches

- 7 Approximate Inference
- 7.1 ProbLog1
- 7.1.1 Iterative Deepening
- 7.1.2 k-best
- 7.1.3 Monte Carlo

- 7.2 MCINTYRE
- 7.3 Approximate Inference for Queries with an Infinite Number of Explanations
- 7.4 Conditional Approximate Inference
- 7.5 Approximate Inference by Sampling for Hybrid Programs
- 7.6 Approximate Inference with Bounded Error for Hybrid Programs
- 7.7 k-Optimal
- 7.8 Explanation-Based Approximate Weighted Model Counting
- 7.9 Approximate Inference with TP-compilation
- 7.10 DISTR and EXP Tasks

- 8 Non-Standard Inference
- 8.1 Possibilistic Logic Programming
- 8.2 Decision-Theoretic ProbLog
- 8.3 Algebraic ProbLog

- 9 Parameter Learning
- 9.1 PRISM Parameter Learning
- 9.2 LLPAD and ALLPAD Parameter Learning
- 9.3 LeProbLog
- 9.4 EMBLEM
- 9.5 ProbLog2 Parameter Learning
- 9.6 Parameter Learning for Hybrid Programs

- 10 Structure Learning
- 10.1 Inductive Logic Programming
- 10.2 LLPAD and ALLPAD Structure Learning
- 10.3 ProbLog Theory Compression
- 10.4 ProbFOIL and ProbFOIL+
- 10.5 SLIPCOVER
- 10.5.1 The Language Bias
- 10.5.2 Description of the Algorithm
- 10.5.2.1 Function INITIALBEAMS
- 10.5.2.2 Beam Search with Clause Refinements

- 10.5.3 Execution Example

- 10.6 Examples of Datasets

- 11 cplint Examples
- 11.1 cplint Commands
- 11.2 Natural Language Processing
- 11.2.1 Probabilistic Context-Free Grammars
- 11.2.2 Probabilistic Left Corner Grammars
- 11.2.3 Hidden Markov Models

- 11.3 Drawing Binary Decision Diagrams
- 11.4 Gaussian Processes
- 11.5 Dirichlet Processes
- 11.5.1 The Stick-Breaking Process
- 11.5.2 The Chinese Restaurant Process
- 11.5.3 Mixture Model

- 11.6 Bayesian Estimation
- 11.7 Kalman Filter
- 11.8 Stochastic Logic Programs
- 11.9 Tile Map Generation
- 11.10 Markov Logic Networks
- 11.11 Truel
- 11.12 Coupon Collector Problem
- 11.13 One-Dimensional Random Walk
- 11.14 Latent Dirichlet Allocation
- 11.15 The Indian GPA Problem
- 11.16 Bongard Problems

- 12 Conclusions

- References