Results 21 to 30 of about 547,938
Development of a Conceptual Model for Accelerated Project Prioritization after Disaster Event [PDF]
There is a need for rapid and responsive infrastructure repair and construction after natural disaster events such as hurricanes, wildfires, and tornadoes.
Martins Claudia et al.
Source: DOAJ
Classification of Russian Texts by Genres Based on Modern Embeddings and Rhythm
The article investigates modern vector text models for solving the problem of genre classification of Russian-language texts. Models include ELMo embeddings, BERT language model with pre-training and a complex of numerical rhythm features based on lexico- ...
Ksenia Vladimirovna Lagutina
Source: DOAJ
Table Search Using a Deep Contextualized Language Model
Pretrained contextualized language models such as BERT have achieved impressive results on various natural language processing benchmarks. Benefiting from multiple pretraining tasks and large scale training corpora, pretrained models can capture complex ...
Auer Sören et al.
Source: CORE
Someone's opinion on a product or service, expressed through a review, is quite important to the owner or to potential customers. However, the large number of reviews makes it difficult for them to analyze the information contained in ...
Putri Rizki Amalia, Edi Winarko
Source: DOAJ
How to Fine-Tune BERT for Text Classification?
Language model pre-training has proven to be useful in learning universal language representations. As a state-of-the-art pre-trained language model, BERT (Bidirectional Encoder Representations from Transformers) has achieved amazing results in ...
Huang, Xuanjing, et al.
Source: CORE
Aspect-level sentiment classification, a significant task in fine-grained sentiment analysis, aims to identify the sentiment expressed toward each aspect of a given sentence. The existing methods combine global features and local structures to ...
Subo Wei et al.
Source: DOAJ
Neural Language Models for Nineteenth-Century English
We present four types of neural language models trained on a large historical dataset of books in English, published between 1760 and 1900 and comprising ≈5.1 billion tokens.
Kasra Hosseini et al.
Source: DOAJ
ParsBERT: Transformer-based Model for Persian Language Understanding
The surge of pre-trained language models has begun a new era in the field of Natural Language Processing (NLP) by allowing us to build powerful language models.
Farahani, Marzieh, et al.
Source: CORE
Nietzsche, immortality, singularity and eternal recurrence [PDF]
Joan Copjec has shown that modernity is privy to a notion of immortality all its own – one that differs fundamentally from any counterpart entertained in Greek antiquity or the Christian Middle Ages.
Olivier, Bert
Source: CORE
Objective While extensive research revealed that interleukin (IL)-1β contributes to insulin resistance (IR) development, the role of IL-1α in obesity and IR was scarcely studied.
Tal Almog et al.
Source: DOAJ