Results 21 to 30 of about 10,485,189

w2v-BERT: Combining Contrastive Learning and Masked Language Modeling for Self-Supervised Speech Pre-Training [PDF]

open access: yes; Automatic Speech Recognition & Understanding, 2021
Motivated by the success of masked language modeling (MLM) in pre-training natural language processing models, we propose w2v-BERT, which explores MLM for self-supervised speech representation learning.
Yu-An Chung   +6 more
semanticscholar   +1 more source
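
A minimal sketch of the kind of combined objective the title describes: a contrastive loss that matches masked speech frames to their quantized targets, plus an MLM-style cross-entropy over a discrete codebook. The shapes, names, and in-batch-negatives scheme below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def w2v_bert_style_loss(context, quantized, codebook_ids, mlm_logits, mask,
                        temperature=0.1):
    """context:      (B, T, D) encoder outputs
    quantized:    (B, T, D) quantizer outputs (contrastive targets)
    codebook_ids: (B, T)    discrete ids predicted by the MLM head
    mlm_logits:   (B, T, V) MLM-head logits over the codebook
    mask:         (B, T)    bool, True where a frame was masked."""
    # Contrastive term: each masked frame should match its own quantized
    # target, with the other masked frames serving as in-batch negatives.
    c = F.normalize(context[mask], dim=-1)      # (N, D)
    q = F.normalize(quantized[mask], dim=-1)    # (N, D)
    logits = c @ q.t() / temperature            # (N, N)
    labels = torch.arange(logits.size(0), device=logits.device)
    contrastive = F.cross_entropy(logits, labels)
    # MLM term: predict the discrete codebook id of each masked frame.
    mlm = F.cross_entropy(mlm_logits[mask], codebook_ids[mask])
    return contrastive + mlm
```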

BERT Rediscovers the Classical NLP Pipeline [PDF]

open access: yes; Annual Meeting of the Association for Computational Linguistics, 2019
Pre-trained text encoders have rapidly advanced the state of the art on many NLP tasks. We focus on one such model, BERT, and aim to quantify where linguistic information is captured within the network.
Ian Tenney, Dipanjan Das, Ellie Pavlick
semanticscholar   +1 more source
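
The question of where linguistic information sits in the network is commonly studied with per-layer probing classifiers. Below is a hedged sketch of that idea, not the paper's exact edge-probing protocol: extract each layer's hidden states from bert-base-uncased and fit a linear probe per layer; the layer where probe accuracy peaks is where the property is most accessible.

```python
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)

def per_layer_features(sentences):
    """One (num_sentences, hidden) matrix per layer, using the [CLS] vector."""
    enc = tok(sentences, padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states  # embeddings + one entry per layer
    return [h[:, 0, :].numpy() for h in hidden]

def probe_score(features, labels):
    # Fit-and-score on the same data for brevity; a real probe would
    # report accuracy on a held-out split.
    return LogisticRegression(max_iter=1000).fit(features, labels).score(features, labels)

# for i, f in enumerate(per_layer_features(sentences)):
#     print(f"layer {i}: {probe_score(f, labels):.3f}")
```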

Language-agnostic BERT Sentence Embedding [PDF]

open access: yes; Annual Meeting of the Association for Computational Linguistics, 2020
While BERT is an effective method for learning monolingual sentence embeddings for semantic similarity and embedding-based transfer learning, BERT-based cross-lingual sentence embeddings have yet to be explored.
Fangxiaoyu Feng   +4 more
semanticscholar   +1 more source
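
For readers who want to try the released model: the checkpoint is distributed through sentence-transformers (the model id "sentence-transformers/LaBSE" is assumed here). Translation pairs should land close together in the shared embedding space.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/LaBSE")
emb = model.encode(
    ["A cat sits on the mat.", "Eine Katze sitzt auf der Matte."],
    normalize_embeddings=True,
)
# Cross-lingual translation pairs should score high cosine similarity.
print(util.cos_sim(emb[0], emb[1]))
```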

FinEst BERT and CroSloEngual BERT [PDF]

open access: yes, 2020
Large pretrained masked language models have become state-of-the-art solutions for many NLP problems. Research has mostly focused on the English language, though. While massively multilingual models exist, studies have shown that monolingual models produce much better results.
Ulčar, Matej, Robnik-Šikonja, Marko
openaire   +3 more sources

Effects of hydrogel-encapsulated bacteria on the healing efficiency and compressive strength of concrete

open access: yes; Journal of Road Engineering, 2023
Microbial-induced calcium carbonate precipitation is a promising technology for self-healing concrete due to its capability to seal microcracks. The main goal of this study was to evaluate the effects of adding hydrogel-encapsulated bacteria on the ...
Ricardo Hungria   +2 more
doaj   +1 more source

Is BERT Really Robust? A Strong Baseline for Natural Language Attack on Text Classification and Entailment [PDF]

open access: yes; AAAI Conference on Artificial Intelligence, 2019
Machine learning algorithms are often vulnerable to adversarial examples that differ imperceptibly from their original counterparts but can fool state-of-the-art models. It is helpful to evaluate or even improve the robustness of these models ...
Di Jin   +3 more
semanticscholar   +1 more source
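
The attack family this paper benchmarks (rank words by importance, then substitute synonyms until the label flips) can be outlined in a few lines. This is an illustrative skeleton, not the authors' code; predict_proba and get_synonyms are hypothetical stand-ins for a target classifier and a synonym lookup such as counter-fitted embedding neighbours.

```python
def attack(words, true_label, predict_proba, get_synonyms):
    """words: list of tokens; predict_proba: tokens -> list of class probabilities."""
    base = predict_proba(words)[true_label]
    # Importance of a word = confidence drop when that word is deleted.
    importance = [
        base - predict_proba(words[:i] + words[i + 1:])[true_label]
        for i in range(len(words))
    ]
    # Replace the most important words first, keeping any swap that hurts.
    for i in sorted(range(len(words)), key=lambda i: -importance[i]):
        for syn in get_synonyms(words[i]):
            candidate = words[:i] + [syn] + words[i + 1:]
            probs = predict_proba(candidate)
            if max(range(len(probs)), key=probs.__getitem__) != true_label:
                return candidate              # label flipped: adversarial example
            if probs[true_label] < base:
                words, base = candidate, probs[true_label]
    return None                               # attack failed within the budget
```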

MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices [PDF]

open access: yes; Annual Meeting of the Association for Computational Linguistics, 2020
Natural Language Processing (NLP) has recently achieved great success by using huge pre-trained models with hundreds of millions of parameters. However, these models suffer from heavy model sizes and high latency such that they cannot be deployed to ...
Zhiqing Sun   +5 more
semanticscholar   +1 more source
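
The released checkpoint is small enough to check the size claim directly. A hedged example via Hugging Face transformers, assuming the model id "google/mobilebert-uncased":

```python
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("google/mobilebert-uncased")
model = AutoModel.from_pretrained("google/mobilebert-uncased")

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # roughly 25M, vs ~110M for BERT-base
```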

Seasonal variation of aerosol water uptake and its impact on the direct radiative effect at Ny-Ålesund, Svalbard [PDF]

open access: yes; Atmospheric Chemistry and Physics, 2014
In this study we investigated the impact of water uptake by aerosol particles in the ambient atmosphere on their optical properties and their direct radiative effect (ADRE, W m⁻²) in the Arctic at Ny-Ålesund, Svalbard, during 2008.
N. Rastak   +10 more
doaj   +1 more source

A Lite Romanian BERT: ALR-BERT

open access: yes; Computers, 2022
Large-scale pre-trained language representations and their promising performance in various downstream applications have become an area of interest in the field of natural language processing (NLP). There has been considerable interest in further increasing model size in order to surpass the best previously reported results.
Dragoş Constantin Nicolae   +2 more
openaire   +3 more sources

Importance of aerosol composition and mixing state for cloud droplet activation over the Arctic pack ice in summer [PDF]

open access: yes; Atmospheric Chemistry and Physics, 2015
Concentrations of cloud condensation nuclei (CCN) were measured throughout an expedition by icebreaker around the central Arctic Ocean, including a 3-week ice drift operation at 87° N, from 3 August to 9 September 2008.
C. Leck, E. Svensson
doaj   +1 more source
