Results 181 to 190 of about 616,595 (258)
Large Language Model‐Based Chatbots in Higher Education
The use of large language models (LLMs) in higher education can facilitate personalized learning experiences, advance asynchronous learning, and support instructors, students, and researchers across diverse fields. The development of regulations and guidelines that address ethical and legal issues is essential to ensure safe and responsible adoption.
Defne Yigci +4 more
wiley +1 more source
Invisible lines of inequality: intersections of gender, motherhood, and work-based discrimination in Bulgaria. [PDF]
Spasova L.
europepmc +1 more source
Minority Languages and Territorial Rights. Global Encyclopedia of Territorial Rights
openaire +2 more sources
We demonstrate the direct‐laser patterning of a gold thin film on polymethyl methacrylate to fabricate a temperature sensor for dentures. The temperature sensor‐embedded smart dentures are evaluated in an oral environment, enabling in‐situ monitoring for elderly healthcare.
Han Ku Nam +7 more
wiley +1 more source
Large language models reflect the ideology of their creators. [PDF]
Buyl M +10 more
europepmc +1 more source
A Variational Autoencoder + Deep Deterministic Policy Gradient pipeline addresses low‐light failures of infrared depth sensing for indoor robot navigation. Stage 1 pretrains an attention‐enhanced Variational Autoencoder (Convolutional Block Attention Module + Feature Pyramid Network) to map dark depth frames to a well‐lit reconstruction, yielding a 128‐D latent code ...
Uiseok Lee +7 more
wiley +1 more source
Examining contraception-related discourse on social media after the Dobbs v. Jackson Women's Health Organization Supreme Court decision: a textual analysis of user-generated content on X (formerly Twitter). [PDF]
Ujah OI, Nnorom OC, Swomen HE.
europepmc +1 more source
Understanding Spanglish and Flemch: A Comparative Analysis of American and Belgian Language Politics [PDF]
Wattal, Urvashi
core +1 more source
FTGRN introduces an LLM‐enhanced framework for gene regulatory network inference through a two‐stage workflow. It combines a Transformer‐based model, pretrained on GPT‐4 derived gene embeddings and regulatory knowledge, with a fine‐tuning stage utilizing single‐cell RNA‐seq data.
Guangzheng Weng +7 more
wiley +1 more source

