Results 41 to 50 of about 10,485,189

Neural Language Models for Nineteenth-Century English

open access: yesJournal of Open Humanities Data, 2021
We present four types of neural language models trained on a large historical dataset of books in English, published between 1760 and 1900 and comprising ≈5.1 billion tokens.
Kasra Hosseini   +3 more
doaj   +1 more source
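
As a minimal usage sketch, such a historical masked language model could be queried through the Hugging Face transformers fill-mask pipeline; the model identifier below is a hypothetical placeholder, not necessarily the ID under which the authors' models are released.

```python
# Querying a historical masked language model with the fill-mask pipeline.
# The model identifier is a placeholder (assumption), not the published ID.
from transformers import pipeline

MODEL_ID = "some-org/bert-1760-1900"  # hypothetical historical BERT

fill_mask = pipeline("fill-mask", model=MODEL_ID)

# Ask the model to complete a sentence in a nineteenth-century register.
for prediction in fill_mask("The carriage stopped before the [MASK]."):
    print(f"{prediction['token_str']!r}  score={prediction['score']:.3f}")
```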

Table Search Using a Deep Contextualized Language Model

open access: yes, 2020
Pretrained contextualized language models such as BERT have achieved impressive results on various natural language processing benchmarks. Benefiting from multiple pretraining tasks and large-scale training corpora, pretrained models can capture complex ...
Auer Sören   +5 more
core   +1 more source
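
One common way to apply a BERT-style model to table search is to linearize the table into text and score the (query, table) pair with a cross-encoder. The sketch below illustrates that general idea under assumed choices (the linearization scheme, base model, and untrained scoring head are not necessarily those used in the paper).

```python
# Rough sketch of query-table relevance scoring with a BERT cross-encoder.
# Linearization and scoring head are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1)

def linearize(table):
    """Flatten a table (list of rows) into a single string of cells."""
    return " ; ".join(" | ".join(str(cell) for cell in row) for row in table)

query = "population of european capitals"
table = [["City", "Population"], ["Paris", "2.1M"], ["Berlin", "3.6M"]]

inputs = tokenizer(query, linearize(table), return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze()  # relevance score (untrained head)
print(float(score))
```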

I-BERT: Integer-only BERT Quantization

open access: yes, 2021
Transformer-based models, like BERT and RoBERTa, have achieved state-of-the-art results in many natural language processing tasks. However, their memory footprint, inference latency, and power consumption are prohibitive for efficient inference at the edge, and even at the data center.
Kim, Sehoon   +4 more
openaire   +2 more sources
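
The core idea behind integer-only inference can be illustrated with a simplified sketch: symmetrically quantize weights and activations to int8, accumulate the matmul in int32, and fold the scales back in at the end. I-BERT additionally replaces GELU, softmax, and LayerNorm with integer approximations, which this toy example omits.

```python
# Simplified illustration of integer quantization + integer matmul.
# Not the full I-BERT kernel set; GELU/softmax/LayerNorm approximations omitted.
import numpy as np

def quantize_sym(x, num_bits=8):
    """Symmetric per-tensor quantization: returns int8 values and a scale."""
    qmax = 2 ** (num_bits - 1) - 1                  # 127 for int8
    scale = np.max(np.abs(x)) / qmax
    q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8)).astype(np.float32)      # weight matrix
x = rng.normal(size=(8,)).astype(np.float32)        # activation vector

qW, sW = quantize_sym(W)
qx, sx = quantize_sym(x)

# Integer matmul with int32 accumulation, then a single rescale.
acc = qW.astype(np.int32) @ qx.astype(np.int32)
y_quant = acc * (sW * sx)

print("float result     :", W @ x)
print("quantized result :", y_quant)
```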

Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction [PDF]

open access: yesnpj Digital Medicine, 2020
Deep learning (DL)-based predictive models from electronic health records (EHRs) deliver impressive performance in many clinical tasks. Large training cohorts, however, are often required by these models to achieve high accuracy, hindering the adoption ...
L. Rasmy   +4 more
semanticscholar   +1 more source
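
A toy sketch of the underlying idea: treat each diagnosis code in a patient's record as a token, embed the code sequence, and run it through a Transformer encoder with a prediction head. The vocabulary size, layer sizes, and pretraining objectives of the actual Med-BERT model are assumptions here, not reproductions of the paper's setup.

```python
# Toy Transformer over diagnosis-code sequences for disease-risk prediction.
# Sizes and architecture are illustrative assumptions, not Med-BERT's.
import torch
import torch.nn as nn

VOCAB_SIZE = 1000   # number of distinct diagnosis codes (assumed)
MAX_LEN = 64        # maximum number of codes per patient (assumed)

class CodeSequenceClassifier(nn.Module):
    def __init__(self, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model)
        self.pos = nn.Embedding(MAX_LEN, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)            # binary disease-risk logit

    def forward(self, codes):                        # codes: (batch, seq_len)
        positions = torch.arange(codes.size(1), device=codes.device)
        h = self.encoder(self.embed(codes) + self.pos(positions))
        return self.head(h.mean(dim=1)).squeeze(-1)  # pooled logit per patient

model = CodeSequenceClassifier()
fake_patients = torch.randint(0, VOCAB_SIZE, (8, 32))  # 8 patients, 32 codes each
print(model(fake_patients).shape)                       # torch.Size([8])
```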

Look at the First Sentence: Position Bias in Question Answering

open access: yes, 2020
Many extractive question answering models are trained to predict start and end positions of answers. The choice of predicting answers as positions is mainly due to its simplicity and effectiveness. In this study, we hypothesize that when the distribution ...
Kang, Jaewoo   +4 more
core   +1 more source
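
The kind of skew the paper studies can be checked directly by tallying where answers begin within their passages in a SQuAD-format file. The script below is a rough diagnostic under simplifying assumptions: the file path is a placeholder, and it bins by character offset rather than by sentence or word position as the paper does.

```python
# Tally relative answer-start positions in a SQuAD-style dataset (rough check).
# File path is a placeholder; binning by character offset is a simplification.
import json
from collections import Counter

with open("train-v1.1.json") as f:          # placeholder path
    squad = json.load(f)

buckets = Counter()
for article in squad["data"]:
    for para in article["paragraphs"]:
        length = max(len(para["context"]), 1)
        for qa in para["qas"]:
            for ans in qa["answers"]:
                rel = ans["answer_start"] / length   # relative position in [0, 1)
                buckets[int(rel * 10)] += 1          # decile of the passage

for decile in range(10):
    print(f"{decile*10:>3}-{decile*10+9}% of passage: {buckets[decile]} answers")
```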

Patient Knowledge Distillation for BERT Model Compression [PDF]

open access: yesConference on Empirical Methods in Natural Language Processing, 2019
Pre-trained language models such as BERT have proven to be highly effective for natural language processing (NLP) tasks. However, the high demand for computing resources in training such models hinders their application in practice. In order to alleviate ...
S. Sun, Yu Cheng, Zhe Gan, Jingjing Liu
semanticscholar   +1 more source
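
A sketch of the kind of objective used for "patient" distillation: the student matches the teacher's softened output distribution and, in addition, the normalized [CLS] hidden states of a subset of teacher layers. The layer mapping, loss weights, and temperature below are illustrative assumptions, not the paper's exact settings.

```python
# Sketch of a patient knowledge distillation loss:
# cross-entropy + softened-logit KL + MSE over intermediate [CLS] states.
import torch
import torch.nn.functional as F

def patient_kd_loss(student_logits, teacher_logits, labels,
                    student_cls, teacher_cls,   # lists of (batch, hidden) [CLS] states
                    alpha=0.5, beta=100.0, T=2.0):
    # Standard cross-entropy on the gold labels.
    ce = F.cross_entropy(student_logits, labels)
    # Distillation term: KL between softened teacher and student distributions.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  F.softmax(teacher_logits / T, dim=-1),
                  reduction="batchmean") * T * T
    # "Patient" term: MSE between normalized intermediate [CLS] representations.
    pt = sum(F.mse_loss(F.normalize(s, dim=-1), F.normalize(t, dim=-1))
             for s, t in zip(student_cls, teacher_cls))
    return (1 - alpha) * ce + alpha * kd + beta * pt
```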

MaterialBERT for natural language processing of materials science texts

open access: yesScience and Technology of Advanced Materials: Methods, 2022
A BERT (Bidirectional Encoder Representations from Transformers) model, which we named “MaterialBERT”, has been generated using scientific papers from a wide area of materials science as a corpus.
Michiko Yoshitake   +3 more
doaj   +1 more source
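
A minimal sketch of how such a domain BERT could be used to embed materials-science text, here by mean-pooling the final hidden states; the model identifier is a hypothetical placeholder, not necessarily the name under which MaterialBERT is distributed.

```python
# Mean-pooled sentence embeddings from a domain BERT.
# Model identifier is a placeholder (assumption).
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "some-org/materialbert-base"   # hypothetical identifier
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

sentences = [
    "The perovskite exhibits a high dielectric constant at room temperature.",
    "Annealing at 600 C improved the crystallinity of the thin film.",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state      # (batch, seq, hidden)
mask = inputs["attention_mask"].unsqueeze(-1)       # ignore padding tokens
embeddings = (hidden * mask).sum(1) / mask.sum(1)   # mean-pooled sentence vectors
print(embeddings.shape)
```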

Sentiment Analysis of Twitter's Opinion on The Russia and Ukraine War Using Bert

open access: yesJurnal Riset Informatika, 2022
The war between Russia and Ukraine has undeniably affected many aspects of life around the world. Its impact is reflected in what people write on various social media platforms, one of which is Twitter.
Muhammad Fahmi Julianto   +2 more
doaj   +1 more source
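
The classification step in this kind of study can be sketched with the transformers sentiment-analysis pipeline; the model identifier below is a placeholder, and the paper's own preprocessing, labels, and fine-tuning setup are not reproduced.

```python
# Scoring tweets with a fine-tuned BERT sentiment model via the pipeline API.
# Model identifier is a placeholder (assumption).
from transformers import pipeline

classifier = pipeline("sentiment-analysis",
                      model="some-org/bert-tweet-sentiment")  # hypothetical ID

tweets = [
    "Praying for peace and an end to the war.",
    "The sanctions are making everything more expensive.",
]
for tweet, result in zip(tweets, classifier(tweets)):
    print(f"{result['label']:<10} {result['score']:.3f}  {tweet}")
```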

Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets [PDF]

open access: yesBioNLP@ACL, 2019
Inspired by the success of the General Language Understanding Evaluation benchmark, we introduce the Biomedical Language Understanding Evaluation (BLUE) benchmark to facilitate research in the development of pre-training language representations in the ...
Yifan Peng, Shankai Yan, Zhiyong Lu
semanticscholar   +1 more source
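
A condensed sketch of what a BLUE-style evaluation boils down to: fine-tuning a pretrained encoder on a labeled biomedical sentence-pair task. The base model, toy example, and single optimization step shown here are assumptions for illustration; the benchmark's actual datasets and protocol are not reproduced.

```python
# One fine-tuning step of a sequence-pair classifier (illustrative only).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "bert-base-uncased"   # a biomedical BERT could be swapped in here
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=2)

pairs = [("Aspirin reduces fever.", "Acetylsalicylic acid is an antipyretic.")]
labels = torch.tensor([1])       # toy label: 1 = related

batch = tokenizer([p[0] for p in pairs], [p[1] for p in pairs],
                  padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

outputs = model(**batch, labels=labels)   # returns loss and logits
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))
```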

Interleukin-1α deficiency reduces adiposity, glucose intolerance and hepatic de-novo lipogenesis in diet-induced obese mice

open access: yesBMJ Open Diabetes Research & Care, 2019
Objective: While extensive research has revealed that interleukin (IL)-1β contributes to the development of insulin resistance (IR), the role of IL-1α in obesity and IR has scarcely been studied.
Tal Almog   +13 more
doaj   +1 more source
