Kurucan, Mehmet, Özbaltan, Mete, Schewe, Sven and Wojtczak, Dominik (2022) Hidden 1-Counter Markov Models and How to Learn Them. In: Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22), 23-29 July 2022, Vienna, Austria.
Text: IJCAI22.pdf - Author Accepted Manuscript (718kB)
Abstract
We introduce hidden 1-counter Markov models (H1MMs) as an attractive sweet spot between standard hidden Markov models (HMMs) and probabilistic context-free grammars (PCFGs). Both HMMs and PCFGs have a variety of applications, e.g., speech recognition, anomaly detection, and bioinformatics. PCFGs are more expressive than HMMs and are, e.g., better suited for studying protein folding or natural language processing. However, they suffer from slow parameter fitting, which is cubic in the observation sequence length, whereas the same process for HMMs is only linear using the well-known forward-backward algorithm. We argue that adding an integer counter to each state of an HMM, e.g., representing the number of clients waiting in a queue, brings its expressivity closer to that of PCFGs. At the same time, we show that parameter fitting for such a model is computationally inexpensive: it is bi-linear in the length of the observation sequence and the maximal counter value, which grows more slowly than the observation length. The resulting model of H1MMs allows us to combine the best of both worlds: more expressivity with faster parameter fitting.
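To make the complexity claim in the abstract concrete, here is a minimal, hypothetical sketch of a forward pass over an HMM whose states carry a bounded integer counter. This is not the paper's algorithm or notation: the parameter names (`init`, `trans`, `emit`, `max_counter`), the counter updates restricted to {-1, 0, +1}, the clamping of the counter to [0, max_counter], and the unconditional acceptance at the end are all illustrative assumptions.

```python
import numpy as np

# Hypothetical forward pass for a counter-augmented HMM (H1MM-style).
# Assumed shapes (all names are illustrative, not the paper's):
#   init[s]          - initial state distribution (counter starts at 0)
#   trans[s, d, s']  - prob. of moving from state s to s' while changing
#                      the counter by delta in {-1, 0, +1} (d indexes 0..2)
#   emit[s, o]       - prob. of emitting observation symbol o in state s

def h1mm_forward(obs, init, trans, emit, max_counter):
    n_states = init.shape[0]
    deltas = (-1, 0, +1)
    # alpha[s, c] = prob. of the observed prefix, ending in state s with
    # counter value c (counter kept within [0, max_counter] by assumption).
    alpha = np.zeros((n_states, max_counter + 1))
    alpha[:, 0] = init * emit[:, obs[0]]
    for o in obs[1:]:
        new_alpha = np.zeros_like(alpha)
        for di, d in enumerate(deltas):
            for c in range(max_counter + 1):
                c_next = c + d
                if 0 <= c_next <= max_counter:
                    # sum over source states: alpha[s, c] * trans[s, di, s']
                    new_alpha[:, c_next] += alpha[:, c] @ trans[:, di, :]
        alpha = new_alpha * emit[:, o][:, None]
    # Total likelihood, ignoring any final-counter acceptance condition.
    return alpha.sum()

# Tiny usage example: 2 states, 2 observation symbols, counter bound 3.
rng = np.random.default_rng(0)
init = np.array([0.6, 0.4])
trans = rng.dirichlet(np.ones(6), size=2).reshape(2, 3, 2)  # rows sum to 1
emit = rng.dirichlet(np.ones(2), size=2)
print(h1mm_forward([0, 1, 1, 0], init, trans, emit, max_counter=3))
```

Each time step performs three matrix-vector products per counter value, so the total work grows like (sequence length) x (max counter value) x |states|^2, which matches the bi-linear behaviour the abstract describes.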
| Item Type: | Conference or Workshop Item (Unspecified) |
| --- | --- |
| Uncontrolled Keywords: | 31 Biological Sciences, 3102 Bioinformatics and Computational Biology, Bioengineering, Generic health relevance |
| Divisions: | Faculty of Science and Engineering > School of Electrical Engineering, Electronics and Computer Science |
| Depositing User: | Symplectic Admin |
| Date Deposited: | 04 May 2022 09:11 |
| Last Modified: | 18 Jul 2024 20:26 |
| DOI: | 10.24963/ijcai.2022/673 |
| Related URLs: | |
| URI: | https://livrepository.liverpool.ac.uk/id/eprint/3154266 |