Results 301 to 310 of about 335,632
Some of the following articles may not be open access.

Practical Pulse-Shaping Waveforms for Reduced-Cyclic-Prefix OTFS

IEEE Transactions on Vehicular Technology, 2019
In this paper, we model $M \times N$ orthogonal time frequency space modulation (OTFS) over a $P$-path doubly dispersive channel with delays less than $\tau_{\max}$ and Doppler shifts in the range $(\nu_{\min}, \nu_{\max})$.
P. Raviteja et al.
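For context, the sparse delay-Doppler representation of a $P$-path doubly dispersive channel is commonly written as follows; this is the standard textbook form implied by the snippet above, assumed here rather than quoted from the paper:

    \[
      h(\tau, \nu) = \sum_{i=1}^{P} h_i \, \delta(\tau - \tau_i) \, \delta(\nu - \nu_i),
      \qquad 0 \le \tau_i \le \tau_{\max}, \quad \nu_{\min} < \nu_i < \nu_{\max},
    \]

where $h_i$, $\tau_i$ and $\nu_i$ are the complex gain, delay, and Doppler shift of the $i$-th path.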

The First Few Tokens Are All You Need: An Efficient and Effective Unsupervised Prefix Fine-Tuning Method for Reasoning Models

arXiv.org
Improving the reasoning capabilities of large language models (LLMs) typically requires supervised fine-tuning with labeled data or computationally expensive sampling.
Ke Ji et al.
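The snippet above only states the motivation, so as a rough illustration of what restricting fine-tuning to "the first few tokens" of a response could look like, here is a hypothetical PyTorch-style sketch; the function name, the cutoff k, and the loss-masking rule are assumptions made for illustration, not the paper's method:

    import torch.nn.functional as F

    def prefix_token_loss(logits, target_ids, prompt_len, k=8):
        # logits:     (seq_len, vocab) next-token logits for one sequence,
        #             already aligned so logits[t] predicts target_ids[t]
        # target_ids: (seq_len,) token ids of prompt + model-generated response
        # prompt_len: number of prompt tokens (no loss on the prompt itself)
        # k:          hypothetical cutoff - train only on the first k response tokens
        start = prompt_len
        end = min(prompt_len + k, target_ids.shape[0])
        # Cross-entropy restricted to the response prefix; later response
        # tokens contribute nothing to the gradient.
        return F.cross_entropy(logits[start:end], target_ids[start:end])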

ChunkAttention: Efficient Self-Attention with Prefix-Aware KV Cache and Two-Phase Partition

Annual Meeting of the Association for Computational Linguistics
Self-attention is an essential component of large language models (LLM) but a significant source of inference latency for long sequences. In multi-tenant LLM serving scenarios, the compute and memory operation cost of self-attention can be optimized by ...
Lu Ye, Ze Tao, Yong Huang, Yang Li
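As a toy illustration of the general idea behind a prefix-aware KV cache (requests that share a prompt prefix reuse the same cached key/value blocks), here is a small dictionary-based sketch; the class name, chunk size, and keying scheme are assumptions for illustration, not ChunkAttention's actual prefix tree or kernels:

    class PrefixChunkCache:
        # Toy prefix-keyed KV cache: the i-th chunk of a prompt is keyed by the
        # entire prefix up to that chunk, so two prompts map to the same keys
        # exactly where their prefixes agree and can reuse those KV blocks.
        def __init__(self, chunk_size=64):
            self.chunk_size = chunk_size
            self.store = {}  # prefix key -> cached KV block

        def _chunk_keys(self, token_ids):
            return [tuple(token_ids[:end])
                    for end in range(self.chunk_size, len(token_ids) + 1, self.chunk_size)]

        def lookup(self, token_ids):
            # Return cached blocks for the longest run of already-seen chunks.
            hits = []
            for key in self._chunk_keys(token_ids):
                if key not in self.store:
                    break
                hits.append(self.store[key])
            return hits

        def insert(self, token_ids, kv_blocks):
            # kv_blocks[i] holds the KV tensors for the i-th chunk of this prompt.
            for key, block in zip(self._chunk_keys(token_ids), kv_blocks):
                self.store[key] = block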

BatchLLM: Optimizing Large Batched LLM Inference with Global Prefix Sharing and Throughput-oriented Token Batching

arXiv.org
Large language models (LLMs) increasingly play an important role in a wide range of information processing and management tasks. Many of these tasks are performed in large batches or even offline, and the key performance indicator for them is throughput ...
Zhen Zheng et al.
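The snippet here describes the throughput-oriented setting rather than the mechanism, but the "global prefix sharing" in the title can be illustrated with a toy scheduler that buckets requests by a shared leading prompt segment so the shared prefix is processed once per group; the prefix length, request format, and grouping rule below are assumptions, not the paper's algorithm:

    from collections import defaultdict

    def group_by_shared_prefix(requests, prefix_len=32):
        # requests: list of dicts with a "token_ids" field (assumed shape).
        groups = defaultdict(list)
        for req in requests:
            groups[tuple(req["token_ids"][:prefix_len])].append(req)
        # Larger groups first: they amortise the shared-prefix work the most.
        return sorted(groups.values(), key=len, reverse=True)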

PreFix

PERV, 2018
In modern datacenter networks (DCNs), failures of network devices are the norm rather than the exception, and many research efforts have focused on dealing with failures after they happen.
Shenglin Zhang et al.

An Ultra-Fast Parallel Prefix Adder

IEEE Symposium on Computer Arithmetic, 2019
Parallel prefix adders are arguably the most commonly used arithmetic units. They have been extensively investigated at the architecture, register-transfer (RTL), gate, circuit, and layout levels, giving rise to a plethora of ...
K. S. Pandey et al.
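For readers unfamiliar with the term, a parallel prefix adder obtains all carries by combining per-bit generate/propagate signals with an associative prefix operator. The behavioural Python sketch below uses Kogge-Stone-style spans of 1, 2, 4, ... bits and is only a reference model, not the adder proposed in this paper:

    def prefix_add(a, b, width=16):
        # Per-bit generate (both bits 1) and propagate (exactly one bit 1) signals.
        g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]
        p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(width)]
        G, P = g[:], p[:]
        span = 1
        while span < width:
            # Prefix combine: (G, P)[i] = (G, P)[i] o (G, P)[i - span].
            G_new, P_new = G[:], P[:]
            for i in range(span, width):
                G_new[i] = G[i] | (P[i] & G[i - span])
                P_new[i] = P[i] & P[i - span]
            G, P = G_new, P_new
            span *= 2
        # With carry-in 0, the carry into bit i is the group generate of bits 0..i-1.
        carries = [0] + G[:width - 1]
        bits = [p[i] ^ carries[i] for i in range(width)]
        return sum(bit << i for i, bit in enumerate(bits))

For example, prefix_add(1234, 4321) returns 5555 with the default 16-bit width.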

When morphological structure overrides meaning: evidence from German prefix and particle verbs

Language, Cognition and Neuroscience, 2018
A key question in the study of lexical processing has been whether the semantic transparency of multimorphemic words affects processing. Previous studies of English and French prefixed words have found that words with greater semantic transparency show ...
Eva Smolka, G. Libben, W. Dressler

PrefixSpan: mining sequential patterns efficiently by prefix-projected pattern growth

Proceedings of the International Conference on Data Engineering (ICDE), 2001
J. Pei et al.
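No abstract is shown for this entry, so as a reminder of what prefix-projected pattern growth does, here is a heavily condensed sketch restricted to sequences of single items; the full algorithm also handles itemset elements and pseudo-projection, so this is an illustration rather than the paper's implementation:

    def prefixspan(sequences, min_support, prefix=None):
        # Grow frequent sequential patterns by extending the current prefix with
        # every frequent item, then recursing on the prefix-projected database.
        prefix = prefix or []
        patterns = []
        counts = {}
        for seq in sequences:
            for item in set(seq):                 # support = number of sequences containing item
                counts[item] = counts.get(item, 0) + 1
        for item, count in sorted(counts.items()):
            if count < min_support:
                continue
            new_prefix = prefix + [item]
            patterns.append((new_prefix, count))
            # Project: keep only the suffix after the first occurrence of item.
            projected = [seq[seq.index(item) + 1:] for seq in sequences if item in seq]
            patterns.extend(prefixspan(projected, min_support, new_prefix))
        return patterns

For example, prefixspan([["a","b","c"], ["a","c"], ["b","c"]], 2) returns the frequent patterns ["a"], ["a","c"], ["b"], ["b","c"] and ["c"], each with its support count.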

Current and future cancer staging after neoadjuvant treatment for solid tumors

CA: A Cancer Journal for Clinicians, 2021
James D. Brierley, MB, FRCP, et al.
