Results 51 to 60 of about 64,982

The supervised hierarchical Dirichlet process [PDF]

open access: yes, 2014
We propose the supervised hierarchical Dirichlet process (sHDP), a nonparametric generative model for the joint distribution of a group of observations and a response variable directly associated with that whole group. We compare the sHDP with another leading method for regression on grouped data, the supervised latent Dirichlet allocation (sLDA) model.
arxiv   +1 more source

Augmenting word2vec with latent Dirichlet allocation within a clinical application [PDF]

open access: yes, arXiv, 2018
This paper presents three hybrid models that directly combine latent Dirichlet allocation and word embedding for distinguishing between speakers with and without Alzheimer's disease from transcripts of picture descriptions. Two of our models achieve F-scores above the current state of the art for automatic methods on the DementiaBank dataset.
arxiv  
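The snippet does not spell out how the two representations are fused, but one common way to build such a hybrid is to concatenate a transcript's LDA topic distribution with the mean of its word2vec embeddings and feed the result to a classifier. The Python sketch below illustrates that assumed setup with gensim and scikit-learn; the toy documents, labels, and dimensions are placeholders, not the paper's configuration.

import numpy as np
from gensim.corpora import Dictionary
from gensim.models import LdaModel, Word2Vec
from sklearn.linear_model import LogisticRegression

# Toy corpus standing in for picture-description transcripts (hypothetical data).
docs = [["the", "boy", "reaches", "for", "the", "cookie"],
        ["water", "overflows", "from", "the", "sink"]]
labels = [1, 0]  # placeholder labels standing in for AD / control

dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]

# Topic model and word embeddings trained on the same corpus.
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=4, passes=10, random_state=0)
w2v = Word2Vec(sentences=docs, vector_size=50, min_count=1, seed=0)

def hybrid_features(doc, bow):
    # LDA part: dense vector of topic proportions for the document.
    topics = np.zeros(lda.num_topics)
    for k, p in lda.get_document_topics(bow, minimum_probability=0.0):
        topics[k] = p
    # word2vec part: mean embedding of the document's words.
    emb = np.mean([w2v.wv[w] for w in doc if w in w2v.wv], axis=0)
    return np.concatenate([topics, emb])

X = np.vstack([hybrid_features(d, b) for d, b in zip(docs, corpus)])
clf = LogisticRegression(max_iter=1000).fit(X, labels)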

Communication-Efficient Parallel Belief Propagation for Latent Dirichlet Allocation [PDF]

open access: yes, arXiv, 2012
This paper presents a novel communication-efficient parallel belief propagation (CE-PBP) algorithm for training latent Dirichlet allocation (LDA). Based on the synchronous belief propagation (BP) algorithm, we first develop a parallel belief propagation (PBP) algorithm for a parallel architecture. Because the extensive communication delay often causes ...
arxiv  

Modeling Word Relatedness in Latent Dirichlet Allocation [PDF]

open access: yes, arXiv, 2014
The standard LDA model suffers from the problem that the topic assignment of each word is independent, so word correlation is neglected. To address this problem, in this paper we propose Word Related Latent Dirichlet Allocation (WR-LDA), which incorporates word correlation into the LDA topic model.
arxiv  

Incremental Variational Inference for Latent Dirichlet Allocation [PDF]

open access: yes, arXiv, 2015
We introduce incremental variational inference and apply it to latent Dirichlet allocation (LDA). Incremental variational inference is inspired by incremental EM and provides an alternative to stochastic variational inference. Incremental LDA can process massive document collections, does not require setting a learning rate, converges faster to a local ...
arxiv  
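As a rough sketch of the incremental idea (assumed details, not the paper's exact algorithm): the global topic-word variational parameters are kept as an exact sum of per-document expected counts, so revisiting a document replaces its previous contribution instead of taking a learning-rate-weighted step as stochastic variational inference does.

import numpy as np
from scipy.special import digamma

K, V, alpha, eta = 5, 1000, 0.1, 0.01        # illustrative sizes and hyperparameters
lam = np.full((K, V), eta)                   # global variational topic-word parameters
doc_contrib = {}                             # per-document expected counts seen so far

def e_step(word_ids, counts, n_iter=20):
    """Standard per-document mean-field updates (Blei et al., 2003)."""
    gamma = np.full(K, alpha + counts.sum() / K)
    expElogbeta = np.exp(digamma(lam[:, word_ids]) - digamma(lam.sum(1))[:, None])
    for _ in range(n_iter):
        expElogtheta = np.exp(digamma(gamma) - digamma(gamma.sum()))
        phi = expElogtheta[:, None] * expElogbeta
        phi /= phi.sum(0, keepdims=True)     # each token's topic responsibilities sum to 1
        gamma = alpha + (phi * counts).sum(1)
    return phi * counts                      # expected topic-word counts for this document

def incremental_update(doc_id, word_ids, counts):
    """Replace the document's old contribution to the global statistics; no learning rate."""
    global lam
    stats = np.zeros((K, V))
    stats[:, word_ids] = e_step(word_ids, counts)
    old = doc_contrib.get(doc_id, np.zeros((K, V)))
    lam = lam - old + stats
    doc_contrib[doc_id] = stats              # a real implementation would store this sparsely

# Example pass over a toy document: word ids present in it and their counts.
incremental_update(0, np.array([3, 17, 42]), np.array([2.0, 1.0, 5.0]))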

Exact marginal inference in Latent Dirichlet Allocation [PDF]

open access: yes, arXiv, 2020
Assume we have potential "causes" $z\in Z$, which produce "events" $w$ with known probabilities $\beta(w|z)$. We observe $w_1, w_2, \ldots, w_n$; what can we say about the distribution of the causes? A Bayesian estimate will assume a prior on distributions over $Z$ (we assume a Dirichlet prior) and calculate a posterior.
arxiv  
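Written out for concreteness (this is just the standard form of the Bayesian estimate described above, not a quotation from the paper): with a Dirichlet prior $\mathrm{Dir}(\theta;\alpha)$ on the distribution $\theta$ over causes, the posterior after observing $w_1,\ldots,w_n$ is

$p(\theta \mid w_1,\ldots,w_n) \;\propto\; \mathrm{Dir}(\theta;\alpha)\,\prod_{i=1}^{n}\sum_{z\in Z}\theta_z\,\beta(w_i\mid z),$

which is exactly the per-document marginal that arises in LDA once the topic-word probabilities $\beta$ are fixed.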

Latent Beta-Liouville Probabilistic Modeling for Bursty Topic Discovery in Textual Data

open access: yes, Proceedings of the International Florida Artificial Intelligence Research Society Conference
Topic modeling has become a fundamental technique for uncovering latent thematic structures within large collections of textual data. However, conventional models often struggle to capture the burstiness of topics.
Shadan Ghadimi   +2 more
doaj   +1 more source

Blocking Collapsed Gibbs Sampler for Latent Dirichlet Allocation Models [PDF]

open access: yes, arXiv, 2016
The latent Dirichlet allocation (LDA) model is a widely-used latent variable model in machine learning for text analysis. Inference for this model typically involves a single-site collapsed Gibbs sampling step for latent variables associated with observations.
arxiv  
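For reference, the single-site collapsed Gibbs step mentioned here is the standard LDA update that resamples one token's topic at a time; with symmetric hyperparameters $\alpha$ and $\beta$, vocabulary size $V$, and counts $n$ taken with token $(d,i)$ excluded, it is

$p(z_{d,i}=k \mid z_{\neg(d,i)}, w) \;\propto\; \bigl(n_{d,k}^{\neg(d,i)} + \alpha\bigr)\,\frac{n_{k,w_{d,i}}^{\neg(d,i)} + \beta}{n_{k,\cdot}^{\neg(d,i)} + V\beta}.$

A blocking scheme, as the title indicates, instead resamples groups of these latent variables jointly.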

Hyperspectral Unmixing with Endmember Variability using Partial Membership Latent Dirichlet Allocation [PDF]

open access: yes, arXiv, 2016
The application of Partial Membership Latent Dirichlet Allocation (PM-LDA) for hyperspectral endmember estimation and spectral unmixing is presented. PM-LDA provides a model for hyperspectral image analysis that accounts for spectral variability and incorporates spatial information through the use of superpixel-based 'documents.' In our application of ...
arxiv  

Source Code Comments: Overlooked in the Realm of Code Clone Detection [PDF]

open access: yes, arXiv, 2020
Reusing code can produce duplicate or near-duplicate code clones in code repositories. Current code clone detection techniques, like Program Dependence Graphs, rely on code structure and their dependencies to detect clones. These techniques are expensive, using large amounts of processing power, time, and memory.
arxiv  
