- Markov logic networks (MLNs) combine first-order logic and probabilistic graphical models by assigning weights to first-order logic formulas. This lets a logical knowledge base express soft constraints that can be violated at a cost, rather than hard rules.
- MLNs can be represented as templates for Markov networks, where each grounding of a first-order formula corresponds to a feature. Inference involves extracting a minimal grounded subnetwork and using MCMC methods. Learning maximizes the pseudo-likelihood of the data.
- The speaker experiments with MLNs on a university database, finding they outperform hand-built knowledge bases, ILP, and other statistical relational learning systems on tasks like predicting student-advisor relationships.
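The weighted-formula semantics in the first bullet can be sketched concretely. An MLN defines a distribution P(x) = (1/Z) exp(Σᵢ wᵢ nᵢ(x)), where nᵢ(x) counts the true groundings of formula i in world x. The toy code below (names and the weight value are illustrative, not from the talk) enumerates all worlds for a single constant and a single weighted formula, showing that the formula-violating world becomes less probable rather than impossible:

```python
import itertools
import math

# Illustrative sketch: one constant, one weighted formula
# "Smokes(A) => Cancer(A)" with an assumed weight of 1.5.
w = 1.5

def n_true(world):
    """Count true groundings of the formula in this world (0 or 1 here)."""
    smokes, cancer = world
    return 1 if (not smokes) or cancer else 0  # truth value of the implication

# A world assigns a truth value to each ground atom: (Smokes(A), Cancer(A)).
worlds = list(itertools.product([False, True], repeat=2))
unnorm = {world: math.exp(w * n_true(world)) for world in worlds}
Z = sum(unnorm.values())                      # partition function
probs = {world: u / Z for world, u in unnorm.items()}

# The world violating the formula (Smokes=True, Cancer=False) is the
# least likely one, but it still has nonzero probability: a soft constraint.
least_likely = min(probs, key=probs.get)
```

With a hard rule the violating world would have probability zero; here its probability shrinks smoothly as the weight grows, which is the key difference the summary describes.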
1. Learning, Logic, and Probability: A Unified View Pedro Domingos Dept. Computer Science & Eng. University of Washington (Joint work with Stanley Kok, Matt Richardson and Parag Singla)
14. Example of First-Order KB. Friends either both smoke or both don't smoke. Smoking causes cancer.
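In first-order notation, the two English statements on this slide are commonly written as the following formulas (this rendering is a standard reading of the slide, not text from it):

```latex
\forall x\; \mathrm{Smokes}(x) \Rightarrow \mathrm{Cancer}(x)
\qquad
\forall x, y\; \mathrm{Friends}(x,y) \Rightarrow \bigl(\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y)\bigr)
```

In an MLN each of these formulas would carry a weight, so worlds where a smoker has no cancer, or where friends disagree on smoking, are penalized rather than ruled out.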
19. Example of an MLN. Suppose we have two constants: Anna (A) and Bob (B). Ground atoms so far: Cancer(A), Smokes(A), Smokes(B), Cancer(B).
20. Example of an MLN. Suppose we have two constants: Anna (A) and Bob (B). Ground atoms: Cancer(A), Smokes(A), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B), Smokes(B), Cancer(B).
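The grounding step these slides illustrate can be sketched mechanically. With the two constants Anna (A) and Bob (B), each formula from the knowledge base yields one ground feature per way of substituting constants for its variables (the string representations below are illustrative):

```python
import itertools

# The two constants from the slide.
constants = ["A", "B"]

# Formula 1: Smokes(x) => Cancer(x)  ->  one grounding per constant.
f1_groundings = [f"Smokes({x}) => Cancer({x})" for x in constants]

# Formula 2: Friends(x,y) => (Smokes(x) <=> Smokes(y))
#            ->  one grounding per ordered pair of constants.
f2_groundings = [
    f"Friends({x},{y}) => (Smokes({x}) <=> Smokes({y}))"
    for x, y in itertools.product(constants, repeat=2)
]

# Ground atoms match the slide: Smokes(A), Smokes(B), Cancer(A), Cancer(B),
# plus the four Friends atoms -> 8 nodes in the ground Markov network.
print(len(f1_groundings))  # 2
print(len(f2_groundings))  # 4
```

Each grounding becomes a feature of the ground Markov network sharing its formula's weight, which is why the slides show the same template instantiated over all eight ground atoms.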