Academic literature on the topic 'Dirichlet allocation'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Dirichlet allocation.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Dirichlet allocation"

1

Du, Lan, Wray Buntine, Huidong Jin, and Changyou Chen. "Sequential latent Dirichlet allocation." Knowledge and Information Systems 31, no. 3 (2011): 475–503. http://dx.doi.org/10.1007/s10115-011-0425-1.

2

Schwarz, Carlo. "Ldagibbs: A Command for Topic Modeling in Stata Using Latent Dirichlet Allocation." Stata Journal: Promoting communications on statistics and Stata 18, no. 1 (2018): 101–17. http://dx.doi.org/10.1177/1536867x1801800107.

Abstract:
In this article, I introduce the ldagibbs command, which implements latent Dirichlet allocation in Stata. Latent Dirichlet allocation is the most popular machine-learning topic model. Topic models automatically cluster text documents into a user-chosen number of topics. Latent Dirichlet allocation represents each document as a probability distribution over topics and represents each topic as a probability distribution over words. Therefore, latent Dirichlet allocation provides a way to analyze the content of large unclassified text data and an alternative to predefined document classifications.
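The generative view in this abstract (each document a distribution over topics, each topic a distribution over words) can be sketched directly. Below is a minimal illustration of LDA's generative process in Python with NumPy; the corpus sizes and hyperparameter values are made up for the example and are not taken from the cited paper or its Stata implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_topics, vocab_size, n_docs, doc_len = 3, 8, 5, 20
alpha, beta = 0.5, 0.1  # Dirichlet hyperparameters (illustrative values)

# Each topic is a probability distribution over the vocabulary.
topic_word = rng.dirichlet([beta] * vocab_size, size=n_topics)

docs = []
for _ in range(n_docs):
    # Each document is a probability distribution over topics.
    doc_topic = rng.dirichlet([alpha] * n_topics)
    words = []
    for _ in range(doc_len):
        z = rng.choice(n_topics, p=doc_topic)        # draw a topic for this word slot
        w = rng.choice(vocab_size, p=topic_word[z])  # draw a word from that topic
        words.append(w)
    docs.append(words)

print(len(docs), len(docs[0]))  # 5 documents of 20 word tokens each
```

Fitting LDA, as ldagibbs does, means inverting this process: inferring `doc_topic` and `topic_word` from the observed word counts alone.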
3

Yoshida, Takahiro, Ryohei Hisano, and Takaaki Ohnishi. "Gaussian hierarchical latent Dirichlet allocation: Bringing polysemy back." PLOS ONE 18, no. 7 (2023): e0288274. http://dx.doi.org/10.1371/journal.pone.0288274.

Abstract:
Topic models are widely used to discover the latent representation of a set of documents. The two canonical models are latent Dirichlet allocation and Gaussian latent Dirichlet allocation, where the former uses multinomial distributions over words and the latter uses multivariate Gaussian distributions over pre-trained word embedding vectors as the latent topic representations. Compared with latent Dirichlet allocation, Gaussian latent Dirichlet allocation is limited in the sense that it does not capture the polysemy of a word such as "bank." In this paper, we show that Gaussian…
4

Archambeau, Cedric, Balaji Lakshminarayanan, and Guillaume Bouchard. "Latent IBP Compound Dirichlet Allocation." IEEE Transactions on Pattern Analysis and Machine Intelligence 37, no. 2 (2015): 321–33. http://dx.doi.org/10.1109/tpami.2014.2313122.

5

Pion-Tonachini, Luca, Scott Makeig, and Ken Kreutz-Delgado. "Crowd labeling latent Dirichlet allocation." Knowledge and Information Systems 53, no. 3 (2017): 749–65. http://dx.doi.org/10.1007/s10115-017-1053-1.

6

Ramyadharshni, S. S., and P. Pabitha. "Topic Categorization on Social Network Using Latent Dirichlet Allocation." Bonfring International Journal of Software Engineering and Soft Computing 8, no. 2 (2018): 16–20. http://dx.doi.org/10.9756/bijsesc.8390.

7

Syed, Shaheen, and Marco Spruit. "Exploring Symmetrical and Asymmetrical Dirichlet Priors for Latent Dirichlet Allocation." International Journal of Semantic Computing 12, no. 03 (2018): 399–423. http://dx.doi.org/10.1142/s1793351x18400184.

Abstract:
Latent Dirichlet Allocation (LDA) has gained much attention from researchers and is increasingly being applied to uncover underlying semantic structures from a variety of corpora. However, nearly all researchers use symmetrical Dirichlet priors, often unaware of the underlying practical implications that they bear. This research is the first to explore symmetrical and asymmetrical Dirichlet priors on topic coherence and human topic ranking when uncovering latent semantic structures from scientific research articles. More specifically, we examine the practical effects of several classes of Dirichlet…
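The distinction this abstract draws between symmetrical and asymmetrical Dirichlet priors can be made concrete by sampling document-topic mixtures from both. A rough sketch, assuming NumPy and illustrative concentration values (not the classes of priors studied in the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
K = 4  # number of topics (illustrative)

# Symmetric prior: every topic is exchangeable a priori,
# so sampled mixtures are centered on the uniform distribution.
sym = rng.dirichlet([0.1] * K, size=10_000)

# Asymmetric prior: topic 0 is favored a priori, which can model
# a corpus-wide "background" topic that most documents contain.
asym_alpha = np.array([8.0, 1.0, 0.5, 0.5])
asym = rng.dirichlet(asym_alpha, size=10_000)

print(sym.mean(axis=0))   # roughly uniform, near 1/K = 0.25 per topic
print(asym.mean(axis=0))  # skewed toward topic 0, near alpha / alpha.sum()
```

The mean of the Dirichlet is alpha / alpha.sum(), so the asymmetric prior pulls every document's mixture toward the favored topic before any data are observed, which is exactly the kind of practical implication the paper investigates.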
8

Li, Gen, and Hazri Jamil. "Teacher professional learning community and interdisciplinary collaborative teaching path under the informationization basic education model." Yugoslav Journal of Operations Research, no. 00 (2024): 29. http://dx.doi.org/10.2298/yjor2403029l.

Abstract:
The construction of a learning community cannot be separated from the participation of information technology. The current teacher learning community has problems of low interaction efficiency and insufficient enthusiasm for group cooperative teaching. This study adopts the latent Dirichlet allocation method to process text data generated by teacher interaction from the evolution of knowledge topics in the learning community network space. At the same time, the interaction data of the network community learning space is used to extract the interaction characteristics between teachers, and a co…
9

Garg, Mohit, and Priya Rangra. "Bibliometric Analysis of Latent Dirichlet Allocation." DESIDOC Journal of Library & Information Technology 42, no. 2 (2022): 105–13. http://dx.doi.org/10.14429/djlit.42.2.17307.

Abstract:
Latent Dirichlet Allocation (LDA) has emerged as an important algorithm in big data analysis that finds the group of topics in the text data. It posits that each text document consists of a group of topics, and each topic is a mixture of words related to it. With the emergence of a plethora of text data, LDA has become a popular algorithm for topic modeling among researchers from different domains. Therefore, it is essential to understand the trends of LDA research. Bibliometric techniques are established methods to study the research progress of a topic. In this study, bibliographic data…
10

Chauhan, Uttam, and Apurva Shah. "Topic Modeling Using Latent Dirichlet allocation." ACM Computing Surveys 54, no. 7 (2022): 1–35. http://dx.doi.org/10.1145/3462478.

Abstract:
We are not able to deal with a mammoth text corpus without summarizing it into a relatively small subset. A computational tool is needed to understand such a gigantic pool of text. Probabilistic topic modeling discovers and explains an enormous collection of documents by reducing them to a topical subspace. In this work, we study the background and advancement of topic modeling techniques. We first introduce the preliminaries of the topic modeling techniques and review their extensions and variations, such as topic modeling over various domains, hierarchical topic modeling, word embedding…