Survey On Term Weighting Using Coherent Clustering In Topic Modelling
Manisha N. Amnerkar, Ashwini Tikle
Keywords: Topic modeling, Term weighting, Informative word, Conditional entropy.
Topic models often produce incoherent topics filled with noisy words. The reason is that all words in topic modelling carry the same weight. Highly frequent words dominate the top topic-word lists, but many of them are meaningless, e.g., domain-specific stopwords. To address this issue, in this paper we investigate how to weight words, and develop a straightforward but effective term weighting scheme, namely entropy weighting (EW). The proposed EW scheme is based on conditional entropy measured over word co-occurrences. Compared with existing term weighting schemes, the highlight of EW is that it automatically rewards informative words. For more robust word weights, we further suggest a combined form of EW (CEW) that integrates two existing weighting schemes. Essentially, our CEW assigns meaningless words lower weights and informative words higher weights, leading to more coherent topics during topic model inference. We apply CEW to DMM and LDA, and evaluate it on topic quality, document clustering, and classification tasks over 8 real-world datasets. Experimental results show that weighting words can effectively improve topic modelling performance on both short texts and normal long texts. More importantly, the proposed CEW significantly outperforms existing term weighting schemes, since it further considers which words are informative.
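The abstract does not spell out the exact EW formula, so the following is only a minimal illustrative sketch of the general idea: estimate, for each word w, the conditional distribution of its co-occurring words p(v | w), compute the conditional entropy H(V | w), and map it to a weight. The function name, the document-level co-occurrence counting, and the direction of the mapping (lower entropy, i.e., a more focused co-occurrence profile, mapped to a higher weight) are all assumptions of this sketch, not the authors' exact scheme.

```python
import math
from collections import Counter, defaultdict
from itertools import combinations

def entropy_weights(docs):
    """Illustrative conditional-entropy term weighting (assumed formulation).

    docs: list of tokenised documents (lists of words).
    Returns a dict mapping each word w to a weight in (0, 1], derived
    from the entropy of its co-occurrence distribution p(v | w).
    """
    # Count document-level co-occurrences (each unordered pair once per doc).
    cooc = defaultdict(Counter)
    for doc in docs:
        for a, b in combinations(set(doc), 2):
            cooc[a][b] += 1
            cooc[b][a] += 1

    weights = {}
    for w, ctr in cooc.items():
        total = sum(ctr.values())
        # Conditional entropy H(V | w) over co-occurring words.
        h = -sum((c / total) * math.log(c / total) for c in ctr.values())
        # Assumed mapping: lower entropy -> higher weight (more informative).
        weights[w] = 1.0 / (1.0 + h)
    return weights
```

In this toy formulation a word that co-occurs with a small, focused vocabulary gets a weight close to 1, while a word that co-occurs broadly (a stopword-like pattern) is pushed toward lower weight; the resulting weights could then scale the word counts fed into a DMM or LDA sampler.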
Article Details
Unique Paper ID: 148352

Publication Volume & Issue: Volume 6, Issue 1

Page(s): 402 - 405