Semantic N-Gram Topic Modeling

Authors

  • Pooja Kherwa Maharaja Surajmal Institute of Technology
  • Poonam Bansal Maharaja Surajmal Institute of Technology

DOI:

https://doi.org/10.4108/eai.13-7-2018.163131

Keywords:

Topic Modeling, Latent Dirichlet Allocation, Pointwise Mutual Information, Bag of Words, Coherence, Perplexity

Abstract

In this paper, a novel approach for effective topic modeling is presented. The approach differs from traditional vector-space topic modeling, which follows the Bag of Words (BOW) representation. The novelty of our approach lies in a phrase-based vector space: measures such as pointwise mutual information (PMI) and log-frequency-based mutual dependency (LGMD) are applied to score each phrase's suitability for a particular topic, and the best semantic N-gram phrases and terms are retained for subsequent topic modeling. In our experiments, the proposed semantic N-gram topic modeling is compared with collocation Latent Dirichlet Allocation (coll-LDA) and with the most widely used state-of-the-art topic modeling technique, Latent Dirichlet Allocation (LDA). Evaluation shows that perplexity improves drastically and that the coherence score improves significantly, particularly for short-text data sets such as movie reviews and political blogs.
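As a rough illustration of the pipeline the abstract describes, the sketch below selects bigram phrases by PMI, merges them into the token stream, trains LDA, and reports perplexity and coherence using NLTK and Gensim. It is a minimal sketch, not the authors' implementation: the toy corpus, the `min_freq`/`top_k` thresholds, the bigram-only phrases, and the u_mass coherence measure are illustrative assumptions, and the LGMD measure from the paper is not reproduced here.

```python
# Minimal sketch (assumptions noted above): PMI-based bigram selection,
# phrase merging, LDA training, and perplexity/coherence evaluation.
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder
from gensim.corpora import Dictionary
from gensim.models import CoherenceModel, LdaModel


def pmi_phrases(tokens, min_freq=2, top_k=50):
    """Rank candidate bigram phrases by pointwise mutual information (PMI)."""
    finder = BigramCollocationFinder.from_words(tokens)
    finder.apply_freq_filter(min_freq)          # drop rare bigrams
    return finder.nbest(BigramAssocMeasures.pmi, top_k)


def merge_phrases(doc, phrases):
    """Rewrite a tokenized document so selected bigrams become single tokens."""
    phrase_set, merged, i = set(phrases), [], 0
    while i < len(doc):
        if i + 1 < len(doc) and (doc[i], doc[i + 1]) in phrase_set:
            merged.append(doc[i] + "_" + doc[i + 1])
            i += 2
        else:
            merged.append(doc[i])
            i += 1
    return merged


# Toy short-text corpus (stands in for movie reviews / political blogs).
docs = [
    ["new", "york", "critics", "praised", "the", "new", "movie"],
    ["new", "york", "audiences", "enjoyed", "the", "film"],
    ["political", "blogs", "covered", "the", "new", "york", "premiere"],
    ["political", "blogs", "reviewed", "the", "film"],
]

phrases = pmi_phrases([t for d in docs for t in d])
phrased_docs = [merge_phrases(d, phrases) for d in docs]

dictionary = Dictionary(phrased_docs)
corpus = [dictionary.doc2bow(d) for d in phrased_docs]
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
               passes=10, random_state=0)

# Gensim's log_perplexity returns a per-word likelihood bound; the paper
# reports perplexity (lower is better) and coherence (higher is better).
print("log perplexity bound:", lda.log_perplexity(corpus))
umass = CoherenceModel(model=lda, corpus=corpus, dictionary=dictionary,
                       coherence="u_mass")
print("coherence (u_mass):", umass.get_coherence())
```

The design idea illustrated here is that high-PMI bigrams are folded into the vocabulary as single tokens, so the downstream LDA model works over semantic phrases rather than a pure bag of unigrams.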

Published

11-02-2020

How to Cite

1. Kherwa P, Bansal P. Semantic N-Gram Topic Modeling. EAI Endorsed Scal Inf Syst [Internet]. 2020 Feb. 11 [cited 2024 Nov. 13];7(26):e7. Available from: https://publications.eai.eu/index.php/sis/article/view/2122