Results 11 to 20 of about 771,003

Llama rutuy

open access: yes · Allpanchis, 1971
In the community of Pumapuquio, in the district of San Jerónimo, province of Andahuaylas, the traditional way of shearing llamas still survives; it is carried out with a special ceremony, since these camelids formed part of Inca livestock raising.
Carlos A. Vivanco Flores
openaire   +3 more sources

Code Llama: Open Foundation Models for Code [PDF]

open access: yes · arXiv.org, 2023
We release Code Llama, a family of large language models for code based on Llama 2 providing state-of-the-art performance among open models, infilling capabilities, support for large input contexts, and zero-shot instruction following ability for ...
Adi, Yossi   +24 more
core   +2 more sources

LLaMA-Adapter: Efficient Fine-tuning of Language Models with Zero-init Attention [PDF]

open access: yes · arXiv.org, 2023
We present LLaMA-Adapter, a lightweight adaption method to efficiently fine-tune LLaMA into an instruction-following model. Using 52K self-instruct demonstrations, LLaMA-Adapter only introduces 1.2M learnable parameters upon the frozen LLaMA 7B model ...
Gao, Peng   +8 more
core   +2 more sources

Hypoosmotic Test Combined with Coomassie Blue Staining in Llama Sperm [PDF]

open access: yes · SPERMOVA, 2019
The integrity of the plasma and acrosomal membranes is of great importance for determining the fertilizing capacity of spermatozoa. The objectives of this work were: 1) to evaluate the functionality of the plasma membrane over time
Bertuzzi, Mariana Lucía   +3 more
core   +3 more sources

Video-LLaMA: An Instruction-tuned Audio-Visual Language Model for Video Understanding [PDF]

open access: yes · Conference on Empirical Methods in Natural Language Processing, 2023
We present Video-LLaMA, a multi-modal framework that empowers Large Language Models (LLMs) with the capability of understanding both visual and auditory content in video.
Hang Zhang, Xin Li, Lidong Bing
semanticscholar   +1 more source

LLaMA-Adapter V2: Parameter-Efficient Visual Instruction Model [PDF]

open access: yes · arXiv.org, 2023
How to efficiently transform large language models (LLMs) into instruction followers has recently become a popular research direction, while training LLMs for multi-modal reasoning remains less explored.
Peng Gao   +11 more
semanticscholar   +1 more source

Efficient and Effective Text Encoding for Chinese LLaMA and Alpaca [PDF]

open access: yes · arXiv.org, 2023
Large Language Models (LLMs), such as ChatGPT and GPT-4, have dramatically transformed natural language processing research and shown promising strides towards Artificial General Intelligence (AGI).
Yiming Cui, Ziqing Yang, Xin Yao
semanticscholar   +1 more source

Fine-Tuning LLaMA for Multi-Stage Text Retrieval [PDF]

open access: yes · Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 2023
While large language models (LLMs) have shown impressive NLP capabilities, existing IR applications mainly focus on prompting LLMs to generate query expansions or permutations for listwise reranking. In this study, we leverage LLMs directly to
Xueguang Ma   +4 more
semanticscholar   +1 more source

Learning from the llama: on the broad contours of cultural contributions and geographic expansion [PDF]

open access: yes · História, Ciências, Saúde: Manguinhos, 2022
The llama (Lama glama) is the largest domesticated animal species from South America and is today found worldwide. Andean peoples have used the llama for millennia for meat, wool, packing, spiritual purposes, and more. In order to know the history of the llama, we must
Emily Wakild
doaj   +1 more source

ChatDoctor: A Medical Chat Model Fine-Tuned on a Large Language Model Meta-AI (LLaMA) Using Medical Domain Knowledge [PDF]

open access: yes · Cureus, 2023
Objective The primary aim of this research was to address the limitations observed in the medical knowledge of prevalent large language models (LLMs) such as ChatGPT, by creating a specialized language model with enhanced accuracy in medical advice ...
Yunxiang Li   +5 more
semanticscholar   +1 more source
