Proceedings of the 23rd international conference on Machine learning - ICML '06, 2006
Some models of textual corpora employ text generation methods involving n-gram statistics, while others use latent topic variables inferred using the "bag-of-words" assumption, in which word order is ignored. Previously, these methods have not been combined.
Proceedings of the 25th annual international ACM SIGIR conference on Research and development in information retrieval, 2002
In this paper, we present a method based on document probes to quantify and diagnose topic structure, distinguishing topics as monolithic, structured, or diffuse. The method also yields a structure analysis that can be used directly to optimize filter (classifier) creation.
David A. Evans +2 more
Research topics and trends in the maritime transport: A structural topic model
2021
Xiwen Bai +4 more
Proceedings of the 8th ACM Conference on Web Science, 2016
In this tutorial, we teach the intuition and the assumptions behind topic models. Topic models explain the co-occurrences of words in documents by extracting sets of semantically related words, called topics. These topics are semantically coherent and can be interpreted by humans. Starting with the most popular topic model, Latent Dirichlet Allocation (LDA), …
Christoph Carl Kling +3 more
Tracking urban geo-topics based on dynamic topic model
Computers, Environment and Urban Systems, 2020
Fang Yao, Yan Wang
Antibody–drug conjugates: Smart chemotherapy delivery across tumor histologies
CA: A Cancer Journal for Clinicians, 2022
Paolo Tarantino +2 more
Analyzing scientific research topics in manufacturing field using a topic model
Computers & Industrial Engineering, 2019
Hui Xiong +3 more

