Title: A Lifelong Learning Topic Model Structured Using Latent Embeddings

Abstract: We propose a latent-embedding-structured lifelong learning topic model, called the LLT model, to discover coherent topics from a corpus. Specifically, we exploit latent word embeddings to structure our model and mine word-correlation knowledge to assist in topic modeling. During each learning iteration, our model learns new word embeddings based on the topics generated in the previous iteration. Experimental results demonstrate that the LLT model generates more coherent topics than state-of-the-art methods.
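The abstract describes an iterative loop: fit topics, re-learn word embeddings from those topics, and feed the resulting word-correlation knowledge into the next iteration. A minimal toy sketch of that loop is below; the function names (`fit_topics`, `learn_embeddings`, `mine_correlations`) and the simplistic topic/embedding rules are illustrative assumptions, not the authors' actual method or API.

```python
# Hypothetical sketch of the lifelong-learning loop from the abstract.
# All modeling choices here are deliberately toy-sized stand-ins.
from collections import defaultdict

def fit_topics(docs, knowledge, n_topics=2):
    """Toy 'topic model': bucket words by length parity, but let
    word-correlation knowledge pull correlated words into one topic."""
    topic_of = {}
    for doc in docs:
        for w in doc:
            t = len(w) % n_topics
            for v in knowledge.get(w, ()):   # follow known correlations
                if v in topic_of:
                    t = topic_of[v]
            topic_of[w] = t
    topics = defaultdict(list)
    for w, t in topic_of.items():
        topics[t].append(w)
    return dict(topics)

def learn_embeddings(topics):
    """Toy 'embedding': represent each word by its topic id."""
    return {w: t for t, words in topics.items() for w in words}

def mine_correlations(emb):
    """Words with identical embeddings become correlation knowledge."""
    buckets = defaultdict(set)
    for w, e in emb.items():
        buckets[e].add(w)
    return {w: bucket - {w} for bucket in buckets.values() for w in bucket}

docs = [["apple", "banana"], ["banana", "cherry"], ["dog", "cat"]]
knowledge = {}
for _ in range(3):                      # lifelong-learning iterations
    topics = fit_topics(docs, knowledge)
    emb = learn_embeddings(topics)
    knowledge = mine_correlations(emb)  # carried into the next iteration
```

The key structural point this sketch mirrors is that each iteration's topics are the training signal for the next iteration's embeddings, so knowledge accumulates across passes rather than being fixed up front.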
Year: 2017
DOI: 10.1109/ICSC.2017.15
Venue: 2017 IEEE 11th International Conference on Semantic Computing (ICSC)
Keywords: Lifelong learning, Topic modeling, Latent embeddings
Field: Computer science, Exploit, Artificial intelligence, Natural language processing, Topic model, Lifelong learning, Machine learning
DocType: Conference
ISSN: 2325-6516
ISBN: 978-1-5090-4285-2
Citations: 0
PageRank: 0.34
References: 4
Authors: 4
Authors (in order)
1. Mingyang Xu
2. Ruixin Yang
3. Steve Harenberg
4. Nagiza F. Samatova