Parameter Learning From Partial Interpretations
Go to the directory ~/yap-6/packages/ProbLog/problog_examples/ and start YAP. Then type :- [alarm]. This query loads the example file below where the fact probabilities are unknown.
t(0.1) :: burglary.
t(0.3) :: earthquake.
t(_) :: hears_alarm(_Person).

myclause(person(mary), true).
myclause(person(john), true).
myclause(alarm, burglary).
myclause(alarm, earthquake).
myclause(calls(Person), (person(Person), alarm, hears_alarm(Person))).
Instead of a probability, every fact has a t( ) prefix. The t stands for tunable and indicates that ProbLog should learn the probability. The number between the parentheses is the ground truth probability. It is ignored by the learning algorithm, and if you do not know the ground truth, you can write t(_). After learning, the ground truth is used to estimate the distance between the learned model parameters and the ground truth model parameters.
Each clause in the background knowledge has to be written using the myclause/2 predicate, whose first argument is the head and whose second argument is the body of the clause. For instance, myclause(calls(Person), (person(Person),alarm,hears_alarm(Person))) stands for calls(Person) :- person(Person), alarm, hears_alarm(Person).
Furthermore, the example file contains two training examples:
example(1).
example(2).
known(1, alarm, true).
known(2, earthquake, false).
known(2, calls(mary), true).
The first training example specifies that alarm is true and everything else is unknown. The second example specifies earthquake as false, calls(mary) as true, and everything else as unknown.
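To make the evidence format concrete, here is a hypothetical third training example in which john calls and no burglary occurred; the identifier 3 and the chosen atoms are illustrative and not part of the original example file:

example(3).
known(3, calls(john), true).
known(3, burglary, false).

Each known/3 fact ties an observed truth value for one atom to the example identifier given in its first argument; any atom without a known/3 fact for that identifier remains unobserved.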
After the example file is loaded, you can start the learning algorithm by typing :- do_learning(10). where 10 is the number of iterations you want to perform. Alternatively, you can use :- do_learning(N,Epsilon). and learning will stop after N iterations or once the difference in log likelihood between two iterations drops below Epsilon, whichever happens first.
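Putting the steps together, a complete session might look as follows; the iteration count 100 and the threshold 0.001 are illustrative values, not recommendations:

:- [alarm].
:- do_learning(100, 0.001).

With the two-argument form, a loose model may finish early once the log likelihood stabilizes, while a hard model simply runs all 100 iterations.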
Afterwards you can quit YAP and go to the folder ~/yap-6/packages/ProbLog/problog_examples/. There you will find the file log.dat, which contains the log likelihood (LLH) on the training and test set for every iteration, the timings, and some metrics in CSV format. The files factprobs_N.pl contain the fact probabilities after the Nth iteration, and the files predictions_N.pl contain the estimated probabilities for each training and test example; by default, these files are generated only every 5th iteration.
Also, the folder output is created as a subfolder of the directory where YAP was started. To change this location, use the corresponding flag (see below).
Settings for Learning (problog-lfi.yap)
For the learning module, use :- problog_flags. to get an overview of all options, :- set_problog_flag(Name,Value). to change an option, and :- problog_flag(Name,Value). to obtain the current value of a flag.
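For instance, one could inspect the flags and then adjust the output location along these lines. The flag name output_directory is an assumption made here for illustration; check the list printed by :- problog_flags. for the actual name of the flag controlling the output folder:

:- problog_flags.
:- set_problog_flag(output_directory, 'my_results').   % hypothetical flag name
:- problog_flag(output_directory, Value).

The last query unifies Value with the current setting, which is a convenient way to confirm a change before starting a long learning run.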
Continue with the tutorial on decision-theoretic ProbLog (DTProbLog).