We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we are encouraging all institute members to make their preprints available there. An institute repository in its previous form is therefore no longer necessary. The preprints published to date will remain available here, but no new preprints will be added.
MiS Preprint
81/2013
Inhomogeneous Parsimonious Markov Models
Ralf Eggeling, André Gohr, Pierre-Yves Bourguignon, Edgar Wingender and Ivo Grosse
Abstract
We introduce inhomogeneous parsimonious Markov models for modeling statistical patterns in discrete sequences. These models are based on parsimonious context trees, which are a generalization of context trees, and thus generalize variable order Markov models. We follow a Bayesian approach consisting of structure learning and parameter learning. Structure learning is a challenging problem due to the super-exponential number of possible tree structures, so we describe an exact and efficient dynamic programming algorithm for finding the optimal tree structures. We apply the model and the learning algorithm to the problem of modeling binding sites of the human transcription factor C/EBP, and find an increased prediction performance compared to fixed order and variable order Markov models. We investigate the reason for this improvement and find several instances of context-specific dependencies that can be captured by parsimonious context trees but not by traditional context trees.
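To illustrate the context-grouping idea mentioned in the abstract, the following minimal Python sketch shows a toy inhomogeneous model in which each sequence position has its own first-order component, and the conditioning symbol is first collapsed into a context group (a cell of a partition of the alphabet) so that several contexts share parameters. This is not the authors' implementation or their learning algorithm; the partitions and probabilities are hypothetical values chosen only for illustration.

# Toy sketch of position-specific models with context grouping
# (hypothetical example, not the authors' code or learned parameters).

DNA = "ACGT"

# Position-specific partitions of the alphabet: at position i, the preceding
# symbol is mapped to one of these groups before conditioning.
partitions = [
    None,                      # position 0 has no context
    [{"A", "G"}, {"C", "T"}],  # purines vs. pyrimidines
    [{"A"}, {"C", "G", "T"}],  # A vs. everything else
]

# Position- and group-specific conditional distributions over DNA
# (hypothetical numbers).
probs = [
    {None: {"A": 0.4, "C": 0.1, "G": 0.1, "T": 0.4}},
    {0: {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
     1: {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}},
    {0: {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},
     1: {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1}},
]

def group_index(partition, symbol):
    """Return the index of the context group containing `symbol`."""
    for idx, group in enumerate(partition):
        if symbol in group:
            return idx
    raise ValueError(f"symbol {symbol!r} not covered by partition")

def likelihood(sequence):
    """Likelihood of a length-3 sequence under the toy model above."""
    p = probs[0][None][sequence[0]]
    for i in range(1, len(sequence)):
        g = group_index(partitions[i], sequence[i - 1])
        p *= probs[i][g][sequence[i]]
    return p

print(likelihood("ACG"))  # 0.4 * 0.1 * 0.7 = 0.028

Because several preceding symbols fall into the same group, the number of free parameters per position lies between that of a fixed order model (one distribution per full context) and an order-0 model (a single distribution), which is the parsimony the models exploit.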
Ralf Eggeling, André Gohr, Pierre-Yves Bourguignon, Edgar Wingender and Ivo Grosse
Inhomogeneous parsimonious Markov models
In: Machine Learning and Knowledge Discovery in Databases: European Conference, ECML PKDD 2013, Prague, Czech Republic, September 23-27, 2013, Proceedings, Part 1. Hendrik Blockeel et al. (eds.). Berlin: Springer, 2013, pp. 321-336 (Lecture Notes in Artificial Intelligence; 8188)