Monday 4 October 2010, at 16h30 in Celestijnenlaan 200A, Room 05.001
Generative Learning in the Context of the Distribution Semantics by
Means of BDDs
By Ingo Thon (PhD student, DTAI)
The distribution semantics provides a general framework for specifying probabilistic models: whenever a model can be cast into this framework, it is guaranteed that the model defines a probability distribution. PRISM, ProbLog and CP(T)-Logic are well-known examples of languages based on the distribution semantics, but Bayesian networks and stochastic context-free grammars can also be seen as instances of models defined in the distribution semantics.
Generally speaking, a model in the distribution semantics consists of two parts: (a) a set of probabilistic switches and (b) a deterministic model defining a mapping from these switches into an output space.
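To make this two-part structure concrete, here is a minimal illustrative sketch (names and the specific switches are hypothetical, not from the talk): independent probabilistic switches, a deterministic mapping into an output space, and the induced output distribution obtained by summing the probabilities of all total choices that map to each output.

```python
import itertools

# (a) Probabilistic switches: independent Boolean random variables,
# each with its probability of being true. These two switches and
# their values are purely illustrative.
switches = {"s1": 0.6, "s2": 0.3}

def deterministic_map(state):
    # (b) Deterministic part: maps a truth assignment to the switches
    # into the output space {"a", "b"}.
    return "a" if state["s1"] or state["s2"] else "b"

def output_distribution(switches, det_map):
    # Enumerate all 2^n total choices; each choice has probability equal
    # to the product of its switch probabilities, and the deterministic
    # map decides which output that probability mass contributes to.
    dist = {}
    names = list(switches)
    for values in itertools.product([True, False], repeat=len(names)):
        state = dict(zip(names, values))
        p = 1.0
        for n in names:
            p *= switches[n] if state[n] else 1.0 - switches[n]
        out = det_map(state)
        dist[out] = dist.get(out, 0.0) + p
    return dist

print(output_distribution(switches, deterministic_map))
```

Because the switch probabilities multiply over each total choice and every choice maps to exactly one output, the resulting output probabilities always sum to one, which is what the distribution semantics guarantees.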
In the first part of this talk I will present an algorithm for generative learning in the context of the distribution semantics. The algorithm uses an Expectation Maximization scheme to optimize the log-likelihood of the training data given the learned model.
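The EM scheme can be sketched in a toy form. The sketch below is an assumption-laden illustration, not the speaker's BDD-based algorithm: it enumerates all total choices explicitly (which BDDs avoid), computes in the E-step the posterior expected number of times each switch was true given each observed output, and renormalizes these counts in the M-step.

```python
import itertools

def em(observations, det_map, names, iters=50):
    # Hypothetical EM sketch for switch parameters when only the
    # deterministic output is observed; switch states are hidden.
    probs = {n: 0.5 for n in names}  # initial switch parameters
    states = [dict(zip(names, v))
              for v in itertools.product([True, False], repeat=len(names))]
    for _ in range(iters):
        expected = {n: 0.0 for n in names}
        total = 0
        for out in observations:
            # E-step: restrict to total choices consistent with the
            # observation and weight them by their current probability.
            consistent = [s for s in states if det_map(s) == out]
            weights = []
            for s in consistent:
                w = 1.0
                for n in names:
                    w *= probs[n] if s[n] else 1.0 - probs[n]
                weights.append(w)
            z = sum(weights)
            if z == 0:
                continue
            total += 1
            for s, w in zip(consistent, weights):
                for n in names:
                    if s[n]:
                        expected[n] += w / z
        # M-step: new parameter = expected count of "true" per observation.
        probs = {n: expected[n] / total for n in names}
    return probs

# Usage with a single fully observed switch: EM recovers the
# empirical frequency of "heads".
coin_map = lambda s: "heads" if s["s1"] else "tails"
obs = ["heads"] * 7 + ["tails"] * 3
print(em(obs, coin_map, ["s1"]))
```

In this degenerate fully observed case EM converges in one iteration to the empirical frequency; with genuinely hidden switches the iterations matter, and the BDD representation discussed in the talk makes the E-step tractable without enumerating all total choices.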
The first version of this algorithm was introduced for learning a model of stochastic relational processes (Thon et al., ECML 08). The second part of the talk will consist of several applications within this context, namely building a model of an online game and of the behavior of users in chat rooms.
Independently, a slightly extended version has been presented (Sato et al., ILP late-breaking papers 08). We extended this algorithm further (Gutmann et al., unpublished) to a more general version that is able to learn the parameters of ProbLog programs. I will show several examples of learning ProbLog programs that define a distribution over Herbrand interpretations.