Results 21 to 30 of about 52,931 (300)
A commentary of GPT-3 in MIT Technology Review 2021
Through the development of large-scale natural language models with writing and dialogue capabilities, artificial intelligence (AI) has taken a significant stride towards better natural language understanding (NLU) and human-computer interaction (HCI ...
Min Zhang, Juntao Li
doaj +1 more source
GPT-3: What’s it good for? [PDF]
GPT-3 made the mainstream media headlines this year, generating far more interest than we’d normally expect of a technical advance in NLP. People are fascinated by its ability to produce apparently novel text that reads as if it was written by a human. But what kind of practical applications can we expect to see, and can they be trusted?
openaire +1 more source
Is GPT-3 a Good Data Annotator?
Data annotation is the process of labeling data that could be used to train machine learning models. Having high-quality annotation is crucial, as it allows the model to learn the relationship between the input data and the desired output. GPT-3, a large-scale language model developed by OpenAI, has demonstrated impressive zero- and few-shot ...
Ding, Bosheng +6 more
openaire +2 more sources
GPT‐3: Its Nature, Scope, Limits, and Consequences [PDF]
In this commentary, we discuss the nature of reversible and irreversible questions, that is, questions that may enable one to identify the nature of the source of their answers. We then introduce GPT-3, a third-generation, autoregressive language model that uses deep learning to produce human-like texts, and use the previous distinction to analyse it ...
Luciano Floridi, Massimo Chiriatti
openaire +2 more sources
Fake and Real Tweet Classification Using a Pre-Trained GPT-3 Approach [PDF]
The widespread utilization of social media has precipitated a notable upsurge in the dissemination of inaccurate information. This underscores the urgency to counteract the propagation of falsehoods and decrease the reliance on these platforms as sources
Delveen Luqman Abd Alnabi
doaj +1 more source
Atom clusters and vibrational excitations in chemically-disordered Pt3Fe-57 [PDF]
Inelastic nuclear resonant scattering spectra of Fe-57 atoms were measured on crystalline alloys of Pt3Fe-57 that were chemically disordered, partially ordered, and L1(2) ordered. Phonon partial density of states curves for Fe-57 were obtained from these
Alp, E. E. +6 more
core +1 more source
Fine-Tuning GPT-3 for Russian Text Summarization [PDF]
Automatic summarization techniques aim to shorten and generalize information given in the text while preserving its core message and the most relevant ideas. This task can be approached and treated with a variety of methods, however, not many attempts have been made to produce solutions specifically for the Russian language despite existing ...
Nikolich Alexandr +4 more
openaire +3 more sources
Detecting Hate Speech with GPT-3
Sophisticated language models such as OpenAI's GPT-3 can generate hateful text that targets marginalized groups. Given this capacity, we are interested in whether large language models can be used to identify hate speech and classify text as sexist or racist.
Chiu, Ke-Li +2 more
openaire +2 more sources
Sidik Cepat Biokatalisasi Air Asam Tambang pada Lahan Bekas Tambang Batubara (Rapid Assessment of Acid Mine Drainage on Ex-coal Mining Land) [PDF]
One of the most difficult obstacles in rehabilitating ex-mining land is acid mine drainage (AMD). AMD arises from the oxidation of sulfide minerals, which releases sulfate and can drastically lower soil pH.
Lantifasari, R. (Ratnawati) +2 more
core +1 more source
What Makes Good In-Context Examples for GPT-3?
GPT-3 has attracted lots of attention due to its superior performance across a wide range of NLP tasks, especially with its powerful and versatile in-context few-shot learning ability. Despite its success, we found that the empirical results of GPT-3 depend heavily on the choice of in-context examples. In this work, we investigate whether there are
Liu, Jiachang +5 more
openaire +2 more sources