Results 11 to 20 of about 771,003 (326)
In the community of Pumapuquio, in the district of San Jerónimo, province of Andahuaylas, the traditional way of shearing llamas still survives; it is carried out with a special ceremony, since these camelids formed part of Inca animal husbandry.
Carlos A. Vivanco Flores
openaire +3 more sources
Code Llama: Open Foundation Models for Code [PDF]
We release Code Llama, a family of large language models for code based on Llama 2 providing state-of-the-art performance among open models, infilling capabilities, support for large input contexts, and zero-shot instruction following ability for ...
Adi, Yossi+24 more
core +2 more sources
LLaMA-Adapter: Efficient Fine-tuning of Language Models with Zero-init Attention [PDF]
We present LLaMA-Adapter, a lightweight adaptation method to efficiently fine-tune LLaMA into an instruction-following model. Using 52K self-instruct demonstrations, LLaMA-Adapter introduces only 1.2M learnable parameters on top of the frozen LLaMA 7B model ...
Gao, Peng+8 more
core +2 more sources
Hypoosmotic Test Combined with Coomassie Blue Staining in Llama Sperm [PDF]
The integrity of the plasma and acrosomal membranes is of great importance for determining the fertilizing capacity of spermatozoa. The objectives of this work were: 1) to evaluate the functionality of the plasma membrane over time
Bertuzzi, Mariana Lucía+3 more
core +3 more sources
Video-LLaMA: An Instruction-tuned Audio-Visual Language Model for Video Understanding [PDF]
We present Video-LLaMA, a multi-modal framework that empowers Large Language Models (LLMs) with the capability of understanding both the visual and auditory content of a video.
Hang Zhang, Xin Li, Lidong Bing
semanticscholar +1 more source
LLaMA-Adapter V2: Parameter-Efficient Visual Instruction Model [PDF]
How to efficiently transform large language models (LLMs) into instruction followers has recently become a popular research direction, while training LLMs for multi-modal reasoning remains less explored.
Peng Gao+11 more
semanticscholar +1 more source
Efficient and Effective Text Encoding for Chinese LLaMA and Alpaca [PDF]
Large Language Models (LLMs), such as ChatGPT and GPT-4, have dramatically transformed natural language processing research and shown promising strides towards Artificial General Intelligence (AGI).
Yiming Cui, Ziqing Yang, Xin Yao
semanticscholar +1 more source
Fine-Tuning LLaMA for Multi-Stage Text Retrieval [PDF]
While large language models (LLMs) have shown impressive NLP capabilities, existing IR applications mainly focus on prompting LLMs to generate query expansions or generating permutations for listwise reranking. In this study, we leverage LLMs directly to
Xueguang Ma+4 more
semanticscholar +1 more source
Learning from the llama: on the broad contours of cultural contributions and geographic expansion [PDF]
The llama (Lama glama) is the largest domesticated animal species from South America and is today found worldwide. Andean peoples have used the llama for millennia for meat, wool, packing, and spiritual purposes. In order to know the history of the llama, we must
Emily Wakild
doaj +1 more source
ChatDoctor: A Medical Chat Model Fine-Tuned on a Large Language Model Meta-AI (LLaMA) Using Medical Domain Knowledge [PDF]
Objective The primary aim of this research was to address the limitations observed in the medical knowledge of prevalent large language models (LLMs) such as ChatGPT, by creating a specialized language model with enhanced accuracy in medical advice ...
Yunxiang Li+5 more
semanticscholar +1 more source