Results 81 to 90 of about 186,001 (296)
Large language models (LLMs) have achieved remarkable success across various domains, but effectively incorporating complex and potentially noisy user timeline data into LLMs remains a challenge. Current approaches often involve translating user timelines into text descriptions before feeding them to LLMs, which can be inefficient and may not fully ...
Lin Ning +8 more
openaire +2 more sources
On Affine Logic and Łukasiewicz Logic [PDF]
The multi-valued logic of Łukasiewicz is a substructural logic that has been widely studied and has many interesting properties. It is classical, in the sense that it admits the axiom schema of double negation, [DNE]. However, our understanding of Ł...
Arthan, Rob, Oliva, Paulo
core
Machine Learning for Green Solvents: Assessment, Selection and Substitution
Environmental regulations have intensified demand for green solvents, but discovery is limited by Solvent Selection Guides (SSGs) that quantify solvent sustainability. By training a machine learning model on the GlaxoSmithKline SSG, GreenSolventDB, a database of sustainability metrics for 10,189 solvents, is developed. Integrated with Hansen solubility metrics, ...
Rohan Datta +4 more
wiley +1 more source
This paper analyses the latest novel by Rocco Tanica, Non siamo mai stati sulla Terra, published by il Saggiatore in 2022, the first Italian novel to be written in collaboration with AI.
Maira Martini
doaj +1 more source
Linearizing and Forecasting: A Reservoir Computing Route to Digital Twins of the Brain
A new approach uses simple neural networks to create digital twins of brain activity, capturing how different patterns unfold over time. The method generates and recovers key dynamics even from noisy data. When applied to fMRI, it predicts brain signals and reveals distinctive activity patterns across regions and individuals, opening possibilities for ...
Gabriele Di Antonio +3 more
wiley +1 more source
Advanced methods for knowledge injection in large language models
Transformer-based language models have revolutionized Natural Language Processing tasks, with advancements in language modeling techniques. Current transformer architectures utilize attention mechanisms to model text dependencies effectively.
N. I. Kulin, S. B. Muravyov
doaj +1 more source
LLM Chemistry Estimation for Multi-LLM Recommendation
Multi-LLM collaboration promises accurate, robust, and context-aware solutions, yet existing approaches rely on implicit selection and output assessment without analyzing whether collaborating models truly complement or conflict. We introduce LLM Chemistry -- a framework that measures when LLM combinations exhibit synergistic or antagonistic behaviors ...
Sanchez, Huascar, Hitaj, Briland
openaire +2 more sources
Urban wage premia, cost of living, and collective bargaining [PDF]
In this paper, we estimate the urban wage premia (UWP) in Italy, with its economy characterized by the interplay between collective bargaining and spatial heterogeneity in the cost of living.
Belloc, Marianna +2 more
core
A Perspective on Interactive Theorem Provers in Physics
In an interactive theorem prover (ITP), one can write mathematical definitions, theorems, and proofs, and the correctness of those results is automatically checked. This perspective goes over the best usage of ITPs within physics and motivates the open-source, community-run project PhysLean, the aim of which is to be a library for digitalized physics
Joseph Tooby‐Smith
wiley +1 more source
Language models can be trained and customized by anyone using effective fine-tuning methods, making them versatile tools across various domains. While metrics like loss and accuracy are commonly used to assess the performance of language models, the ...
Emir Öztürk
doaj +1 more source