Results 11 to 20 of about 3,008,888
Large Language Models
In the latest edition of Stats, STAT!, Fralick and colleagues explain the statistics behind large language models - used in chatbots like ChatGPT and Bard. While these new tools may seem remarkably intelligent, at their core they just assemble sentences based on statistics from large amounts of text.
Fralick, Michael +6 more
+9 more sources
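As a minimal illustration of that claim (not part of the cited article), the sketch below builds a toy bigram model: it counts which word follows which in a tiny made-up corpus and samples continuations from those counts. The corpus, the starting word, and the continuation length are all assumed for illustration; real LLMs use neural networks over tokens, but the "assemble sentences from statistics" principle is the same.

```python
import random
from collections import defaultdict, Counter

# Toy corpus (assumed for illustration only).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each other word.
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    counts = bigram_counts[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation starting from "the".
word, sentence = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    sentence.append(word)
print(" ".join(sentence))
```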
Hagedorn transition, vortices and D0 branes: Lessons from 2+1 confining strings [PDF]
We study the behaviour of the Polyakov confining string in the Georgi-Glashow model in three dimensions near the confining-deconfining phase transition described in hep-th/0010201.
Kogan, Ian I. +2 more
core +2 more sources
Large language models in medicine
Large language models (LLMs) can respond to free-text queries without being specifically trained in the task in question, causing excitement and concern about their use in healthcare settings. ChatGPT is a generative artificial intelligence (AI) chatbot produced through sophisticated fine-tuning of an LLM, and other tools are emerging through similar ...
Thirunavukarasu, Arun James +5 more
openaire +2 more sources
Precursors and BRST Symmetry [PDF]
In the AdS/CFT correspondence, bulk information appears to be encoded in the CFT in a redundant way. A local bulk field corresponds to many different non-local CFT operators (precursors).
de Boer, Jan +3 more
core +4 more sources
Slim Embedding Layers for Recurrent Neural Language Models
Recurrent neural language models are the state-of-the-art models for language modeling. When the vocabulary size is large, the space taken to store the model parameters becomes the bottleneck for the use of recurrent neural language models. In this paper, ...
Kulhanek, Raymond +4 more
core +1 more source
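A back-of-the-envelope sketch of the bottleneck this snippet describes (the vocabulary size, hidden size, and LSTM layout below are assumed for illustration, not taken from the paper): with a large vocabulary, the input and output embedding matrices dwarf the recurrent weights.

```python
vocab_size = 800_000      # assumed large vocabulary
hidden_size = 1_024       # assumed embedding/hidden dimension
bytes_per_param = 4       # float32

embedding_params = 2 * vocab_size * hidden_size          # input + output embeddings
recurrent_params = 4 * (2 * hidden_size) * hidden_size   # rough size of one LSTM layer (biases ignored)

print(f"embedding parameters: {embedding_params / 1e9:.2f} B "
      f"({embedding_params * bytes_per_param / 2**30:.1f} GiB)")
print(f"recurrent parameters: {recurrent_params / 1e6:.1f} M "
      f"({recurrent_params * bytes_per_param / 2**20:.1f} MiB)")
```

With these assumed numbers the embeddings take gigabytes while the recurrent weights take tens of megabytes, which is the imbalance that slim embedding layers target.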
ABSTRACT Background/Objectives: Osteosarcoma is a radioresistant tumor that may benefit from stereotactic body radiation therapy (SBRT) for locoregional control in metastatic/recurrent disease. We report institutional practice patterns, outcomes, toxicity, and failures in osteosarcoma patients treated with SBRT.
Jenna Kocsis +13 more
wiley +1 more source
ChatLLM network: More brains, more intelligence
Dialogue-based language models mark a huge milestone in the field of artificial intelligence, with their impressive ability to interact with users and to handle a series of challenging tasks prompted by customized instructions.
Rui Hao +5 more
doaj +1 more source
Enhancing Personalized Fitness: Integrating Large Language Model [PDF]
This paper explores the integration of Large Language Models (LLMs) into workout planning and personal training to meet the growing demand for personalized fitness solutions.
Pardhi Praful R. +4 more
doaj +1 more source
Role play with large language models
As dialogue agents become increasingly human-like in their performance, it is imperative that we develop effective ways to describe their behaviour in high-level terms without falling into the trap of anthropomorphism. In this paper, we foreground the concept of role-play.
Murray Shanahan +2 more
openaire +6 more sources
ABSTRACT Ongoing evidence indicates an increased risk of sarcopenic obesity among children and young people (CYP) with acute lymphoblastic leukemia (ALL), often beginning early in treatment and persisting into survivorship. This review evaluates current literature on body composition in CYP with ALL during and after treatment.
Lina A. Zahed +5 more
wiley +1 more source

