Tree-Structured Neural Topic Model



Isonuma, Masaru, Mori, Junichiro, Bollegala, Danushka ORCID: 0000-0003-4476-7003 and Sakata, Ichiro
(2020) Tree-Structured Neural Topic Model. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, July 2020, Seattle, USA.

ACL2020.pdf - Author Accepted Manuscript (509kB)

Abstract

This paper presents a tree-structured neural topic model, which has a topic distribution over a tree with an infinite number of branches. Our model parameterizes an unbounded ancestral and fraternal topic distribution by applying doubly-recurrent neural networks. With the help of autoencoding variational Bayes, our model improves data scalability and achieves competitive performance when inducing latent topics and tree structures, compared with a prior tree-structured topic model (Blei et al., 2010). This work extends the tree-structured topic model so that it can be incorporated into neural models for downstream tasks.
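
The following is a minimal, illustrative sketch of the two ingredients named in the abstract: a doubly-recurrent cell that derives a child topic embedding from its parent (ancestral) and elder sibling (fraternal) states, and a stick-breaking step that turns per-node scores into a normalized topic distribution over a truncated set of branches. It assumes PyTorch; the class and function names and the fixed sibling count are hypothetical choices for this sketch, not the authors' released implementation.

```python
# Hypothetical sketch of a doubly-recurrent topic-embedding cell and a
# stick-breaking topic distribution, loosely following the abstract.
import torch
import torch.nn as nn


class DoublyRecurrentCell(nn.Module):
    """Combines an ancestral and a fraternal hidden state into a topic embedding."""

    def __init__(self, dim: int):
        super().__init__()
        self.ancestral = nn.Linear(dim, dim)  # parent -> child transition
        self.fraternal = nn.Linear(dim, dim)  # elder sibling -> younger sibling

    def forward(self, h_parent: torch.Tensor, h_sibling: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.ancestral(h_parent) + self.fraternal(h_sibling))


def stick_breaking(scores: torch.Tensor) -> torch.Tensor:
    """Map unbounded sibling scores (batch, n_siblings) to proportions summing to one.

    The last sibling absorbs the remaining length of the stick, which truncates
    the conceptually unbounded set of branches for this sketch.
    """
    v = torch.sigmoid(scores)
    v = torch.cat([v[:, :-1], torch.ones_like(v[:, -1:])], dim=1)
    one_minus = torch.cumprod(1.0 - v, dim=1)
    remaining = torch.cat([torch.ones_like(v[:, :1]), one_minus[:, :-1]], dim=1)
    return v * remaining


if __name__ == "__main__":
    dim, n_children, batch = 16, 3, 2
    cell = DoublyRecurrentCell(dim)

    # Unroll the root's children along the fraternal chain: each child is
    # conditioned on the root (ancestral) and on the previous sibling (fraternal).
    h_root = torch.zeros(batch, dim)
    h_sib = torch.zeros(batch, dim)
    children = []
    for _ in range(n_children):
        h_sib = cell(h_root, h_sib)
        children.append(h_sib)

    # A document-level latent vector (as produced by an amortized encoder under
    # autoencoding variational Bayes) scores each child; stick-breaking turns
    # the scores into a distribution over the root's children.
    x = torch.randn(batch, dim)
    scores = torch.stack([(x * h).sum(-1) for h in children], dim=1)
    pi = stick_breaking(scores)
    print(pi, pi.sum(dim=1))  # each row sums to 1
```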

Item Type: Conference or Workshop Item (Unspecified)
Depositing User: Symplectic Admin
Date Deposited: 13 May 2020 10:23
Last Modified: 15 Mar 2024 02:26
DOI: 10.18653/v1/2020.acl-main.73
URI: https://livrepository.liverpool.ac.uk/id/eprint/3085260