A Primer in BERTology: What We Know About How BERT Works [PDF]
Transformer-based models have pushed state of the art in many areas of NLP, but our understanding of what is behind their success is still limited. This paper is the first survey of over 150 studies of the popular BERT model.
Olga Kovaleva, Anna Rumshisky
exaly +2 more sources
FakeBERT: Fake news detection in social media with a BERT-based deep learning approach
In the modern era of computing, the news ecosystem has transformed from traditional print media to social media outlets. Social media platforms let us consume news much faster, but their less restricted editing results in the spread of fake news at ...
Rohit Kumar Kaliyar +1 more
exaly +2 more sources
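FakeBERT itself layers a deeper classifier on top of BERT embeddings; as a rough illustration of the BERT-based classification setup it builds on (not the authors' architecture), here is a minimal sketch using the Hugging Face transformers library. The untrained binary head and the example headline are illustrative only.

```python
# A minimal sketch of BERT-based fake-news classification, NOT the paper's
# FakeBERT architecture. The classification head below is randomly
# initialized and would need fine-tuning on a labeled news corpus.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # two classes: real vs. fake
)
model.eval()

headline = "Scientists confirm the moon is made of cheese."  # toy input
inputs = tokenizer(headline, return_tensors="pt",
                   truncation=True, max_length=128)
with torch.no_grad():
    logits = model(**inputs).logits
print(torch.softmax(logits, dim=-1))  # [P(real), P(fake)] after fine-tuning
```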
ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT
Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 2020
Recent progress in Natural Language Understanding (NLU) is driving fast-paced advances in Information Retrieval (IR), largely owed to fine-tuning deep language models (LMs) for document ranking.
O. Khattab, M. Zaharia
semanticscholar +1 more source
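The "late interaction" in the title defers query-document matching to a cheap MaxSim step over pre-computed per-token embeddings. A minimal sketch of that scoring rule, assuming the token embeddings have already been produced by a BERT encoder and L2-normalized (shapes and names here are illustrative, not the authors' API):

```python
# MaxSim late interaction: for every query token, take its best match over
# all document tokens, then sum those maxima. A sketch of the scoring rule,
# not the ColBERT codebase.
import torch

def maxsim_score(q_emb: torch.Tensor, d_emb: torch.Tensor) -> torch.Tensor:
    # q_emb: [q_len, dim], d_emb: [d_len, dim], both L2-normalized
    sim = q_emb @ d_emb.T               # [q_len, d_len] token-pair similarities
    return sim.max(dim=1).values.sum()  # max over doc tokens, sum over query

q = torch.nn.functional.normalize(torch.randn(32, 128), dim=-1)   # toy query
d = torch.nn.functional.normalize(torch.randn(180, 128), dim=-1)  # toy document
print(maxsim_score(q, d))
```

Because document embeddings can be indexed offline, only the cheap MaxSim step runs at query time, which is what makes the approach efficient as well as effective.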
Utilizing BERT for Information Retrieval: Survey, Applications, Resources, and Challenges [PDF]
Recent years have witnessed a substantial increase in the use of deep learning to solve various natural language processing (NLP) problems. Early deep learning models were constrained by their sequential or unidirectional nature, such that they struggled ...
Junmei Wang
exaly +2 more sources
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
North American Chapter of the Association for Computational Linguistics, 2019
We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models (Peters et al., 2018a; Radford et al., 2018), BERT is designed to pre ...
Jacob Devlin +3 more
semanticscholar +1 more source
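The bidirectional pre-training the abstract refers to is the masked-language-model objective: predict a held-out token from both its left and right context. A minimal sketch probing the released bert-base-uncased checkpoint through the transformers library:

```python
# Fill a [MASK] token using bidirectional context; a sketch against the
# publicly released bert-base-uncased checkpoint.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and read off the top predictions, which draw
# on context from both sides of the mask.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_pos].topk(3).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))  # e.g. ['paris', ...]
```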
Publicly Available Clinical BERT Embeddings
Proceedings of the 2nd Clinical Natural Language Processing Workshop, 2019
Contextual word embedding models such as ELMo and BERT have dramatically improved performance for many natural language processing (NLP) tasks in recent months.
Emily Alsentzer +6 more
semanticscholar +1 more source
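The embeddings the paper releases are ordinary BERT checkpoints, so they load like any other model; a minimal sketch, assuming the weights are still hosted on the Hugging Face hub under the ID emilyalsentzer/Bio_ClinicalBERT:

```python
# Extract contextual embeddings from a clinical note; the model ID is the
# one under which the Bio+Clinical BERT weights were published, assumed
# still available on the Hugging Face hub.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model.eval()

note = "Patient denies chest pain but reports shortness of breath."
inputs = tokenizer(note, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # [1, seq_len, 768]
print(hidden.shape)  # one contextual vector per wordpiece token
```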
Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021
Pre-Trained Models (PTMs) can learn general knowledge representations and perform well in Natural Language Processing (NLP) tasks. For the Chinese language, several PTMs are developed, however, most existing methods concentrate on modern Chinese and are not ideal for processing classical Chinese due to the differences in grammars and semantics between ...
Zijing Ji +3 more
openaire +1 more source
BERT: A Tool for Behavioral Regression Testing
Proceedings of the eighteenth ACM SIGSOFT international symposium on Foundations of software engineering, 2010
During maintenance, software is modified and evolved to enhance its functionality, eliminate faults, and adapt it to changed or new platforms. In this demo, we present BERT, a tool for helping developers identify regression faults that they may have introduced when modifying their code.
Wei Jin, Alessandro Orso, Tao Xie
openaire +1 more source
BERT: BEhavioral Regression Testing
Proceedings of the 2008 international workshop on dynamic analysis: held in conjunction with the ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2008), 2008
During maintenance, it is common to run the new version of a program against its existing test suite to check whether the modifications in the program introduced unforeseen side effects. Although this kind of regression testing can be effective in identifying some change-related faults, it is limited by the quality of the existing test suite.
Alessandro Orso, Tao Xie
openaire +1 more source
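The underlying idea, running the old and new versions on the same inputs and flagging any behavioral differences, is easy to sketch. The snippet below illustrates the technique in general, not the BERT tool itself, and both functions are hypothetical:

```python
# Behavioral regression testing in miniature: compare two versions of a
# function on many generated inputs and report where their behavior
# diverges. Illustrative only; not the BERT tool.
import random

def price_v1(qty: int) -> float:   # old version
    return qty * 9.99

def price_v2(qty: int) -> float:   # modified version under test
    total = qty * 9.99
    return total * 0.9 if qty > 10 else total  # new bulk discount

random.seed(0)
diffs = []
for _ in range(1000):              # generated inputs, beyond the test suite
    qty = random.randint(0, 100)
    old, new = price_v1(qty), price_v2(qty)
    if old != new:
        diffs.append((qty, old, new))

# Each difference is either an intended change or a regression fault;
# the developer triages the report to decide which.
print(f"{len(diffs)} behavioral differences found")
```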

