Results 301 to 310 of about 2,720,618

Gray code based gradient-free optimization algorithm for parameterized quantum circuit

Chinese Physics B, 2023 (open access: hybrid)
A Gray code based gradient-free optimization (GCO) algorithm is proposed to update the parameters of parameterized quantum circuits (PQCs) in this work. Each parameter of the PQC is encoded as a binary string, called a gene, and a genetic method is adopted to select the offspring.
Anqi Zhang (张安琪)   +2 more
openaire   +2 more sources
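
To make the encoding described in this entry concrete, here is a minimal Python sketch that maps a continuous PQC rotation angle to an n-bit Gray-coded "gene" and back. The bit width, parameter range, and function names are illustrative assumptions, not the paper's implementation.

    import math

    def binary_to_gray(b: int) -> int:
        """Convert a plain binary integer to its Gray-code representation."""
        return b ^ (b >> 1)

    def gray_to_binary(g: int) -> int:
        """Convert a Gray-code integer back to plain binary."""
        b = 0
        while g:
            b ^= g
            g >>= 1
        return b

    def encode_parameter(theta: float, n_bits: int = 8) -> int:
        """Quantize an angle in [0, 2*pi) to an n-bit Gray-coded gene (illustrative)."""
        levels = 1 << n_bits
        index = int(theta / (2 * math.pi) * levels) % levels
        return binary_to_gray(index)

    def decode_parameter(gene: int, n_bits: int = 8) -> float:
        """Map an n-bit Gray-coded gene back to an angle in [0, 2*pi)."""
        levels = 1 << n_bits
        return gray_to_binary(gene) / levels * 2 * math.pi

    theta = 1.234
    gene = encode_parameter(theta)
    print(f"{gene:08b}", round(decode_parameter(gene), 3))

The appeal of Gray coding in this setting is that adjacent quantization levels differ in a single bit, so flipping one bit of a gene moves the underlying angle by only one quantization step.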

g-binary: A New Non-parameterized Code for Improved Inverted File Compression

International Conference on Database and Expert Systems Applications, 2003
The inverted file is a popular and efficient method for indexing text databases and is being used widely in information retrieval applications. As a result, the research literature is rich in models (global and local) that describe and compress inverted file indexes. Global models compress the entire inverted file index using the same method and can be ...
Ilias Nitsos   +2 more
openaire   +2 more sources
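
The g-binary code itself is not spelled out in this snippet. As a stand-in, the Python sketch below uses Elias gamma coding, a classic non-parameterized integer code, applied to the d-gaps of a sorted postings list; it illustrates the general setting (gap-encode, then apply a code that needs no tuned parameter), not g-binary's actual codewords. The function names are my own.

    def elias_gamma_encode(n: int) -> str:
        """Elias gamma code for a positive integer: a unary length prefix
        followed by the binary representation (returned as a bit string)."""
        assert n >= 1
        binary = bin(n)[2:]
        return "0" * (len(binary) - 1) + binary

    def encode_postings(postings: list[int]) -> str:
        """Gap-encode a sorted postings list and concatenate the codewords."""
        gaps = [postings[0]] + [b - a for a, b in zip(postings, postings[1:])]
        return "".join(elias_gamma_encode(g) for g in gaps)

    # Document ids 3, 7, 8, 20 become gaps 3, 4, 1, 12; small gaps get short codewords.
    print(encode_postings([3, 7, 8, 20]))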

Parametric coding of speech signals

2009 International Conference on Ultra Modern Telecommunications & Workshops, 2009
In this paper an FS1015 LPC coder has been designed using Matlab to produce intelligible speech. The paper focuses on the different methods of implementation, which are compared to determine which gives the best performance. The coder has been tested on Hindi speech.
Vivek Kumar Sehgal   +5 more
openaire   +1 more source
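
The FS1015 coder itself is not reproduced here; the sketch below shows only the core analysis step that LPC-family coders share, in Python with NumPy: estimating order-10 prediction coefficients for one frame via the autocorrelation method and Levinson-Durbin recursion. The frame length, order, and synthetic test signal are illustrative choices, not the paper's Matlab design.

    import numpy as np

    def lpc_coefficients(frame: np.ndarray, order: int = 10) -> np.ndarray:
        """Estimate LPC coefficients for one speech frame (autocorrelation
        method + Levinson-Durbin recursion)."""
        frame = frame * np.hamming(len(frame))            # taper frame edges
        r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        a = np.zeros(order + 1)
        a[0] = 1.0
        err = r[0]
        for i in range(1, order + 1):
            k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err   # reflection coefficient
            a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1]
            err *= 1.0 - k * k
        return a

    # Synthetic test frame: white noise shaped by a two-pole resonance.
    rng = np.random.default_rng(0)
    frame = rng.standard_normal(240)
    for n in range(2, len(frame)):
        frame[n] += 1.6 * frame[n - 1] - 0.8 * frame[n - 2]
    print(np.round(lpc_coefficients(frame), 2))   # roughly [1, -1.6, 0.8, 0, ...]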

Generalized Hamming weights and some parameterized codes

Discrete Mathematics, 2016
González Sarabia, Manuel   +1 more
openaire   +1 more source
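
No abstract is shown for this record. For context on the title only (not taken from the paper's text), the standard definition is: for a linear code $C$ and $1 \le r \le \dim C$, the $r$-th generalized Hamming weight is

    $d_r(C) \;=\; \min\{\, |\mathrm{supp}(D)| \;:\; D \text{ a linear subcode of } C,\ \dim D = r \,\}$,

where $\mathrm{supp}(D)$ is the set of coordinate positions in which some codeword of $D$ is nonzero; $d_1(C)$ recovers the usual minimum distance.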

Pion cross section parameterizations for space radiation codes

Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms, 2013
The space radiation environment consists of energetic particles that originate from the Sun and from sources outside the solar system. It is necessary to understand how these particles interact with materials to design effective radiation shielding.
Charles M. Werneth   +2 more
openaire   +1 more source

Simplifying the Parameterization of Real-Coded Evolutionary Algorithms

Critical Transitions in Water and Environmental Resources Management, 2004
This paper demonstrates how existing parameterization techniques for binary-coded genetic algorithms can be extended to real-coded evolutionary algorithms. An easy-to-implement framework for automating parameter setting for real-coded evolutionary algorithms is demonstrated in this study using Differential Evolution (DE), a real-coded evolutionary ...
Patrick M. Reed, Satoshi Yamaguchi
openaire   +1 more source
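
The paper's parameter-setting framework is not shown in this snippet; below is a minimal Python sketch of the classic DE/rand/1/bin scheme it builds on, with population size, F, CR, and the test function chosen arbitrarily for illustration.

    import numpy as np

    def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=100, seed=0):
        """Minimal DE/rand/1/bin: mutate with a scaled difference vector,
        apply binomial crossover, then keep the better of trial and parent."""
        rng = np.random.default_rng(seed)
        low, high = np.asarray(bounds, dtype=float).T
        dim = len(low)
        pop = rng.uniform(low, high, size=(pop_size, dim))
        fitness = np.array([f(x) for x in pop])
        for _ in range(generations):
            for i in range(pop_size):
                candidates = [j for j in range(pop_size) if j != i]
                a, b, c = pop[rng.choice(candidates, 3, replace=False)]
                mutant = np.clip(a + F * (b - c), low, high)
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True          # at least one gene from the mutant
                trial = np.where(cross, mutant, pop[i])
                f_trial = f(trial)
                if f_trial <= fitness[i]:                # greedy replacement
                    pop[i], fitness[i] = trial, f_trial
        best = int(np.argmin(fitness))
        return pop[best], fitness[best]

    # Minimize the 2-D sphere function on [-5, 5]^2.
    x_best, f_best = differential_evolution(lambda x: float(np.sum(x ** 2)), [(-5, 5), (-5, 5)])
    print(np.round(x_best, 4), round(f_best, 8))

The values hard-coded here (F, CR, population size) are exactly the kind of parameters the paper's framework is meant to set automatically.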

Parameterized Intractability of Even Set and Shortest Vector Problem from Gap-ETH

Electron. Colloquium Comput. Complex., 2018
The $k$-Even Set problem is a parameterized variant of the Minimum Distance Problem of linear codes over $\mathbb F_2$, which can be stated as follows: given a generator matrix $\mathbf A$ and an integer $k$, determine whether the code generated by ...
Arnab Bhattacharyya   +3 more
semanticscholar   +1 more source
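
To make the stated decision problem concrete, here is a brute-force Python check: given a generator matrix $\mathbf A$ over $\mathbb F_2$ and a bound k, it asks whether some nonzero codeword has Hamming weight at most k. It enumerates all $2^m$ messages, so it only illustrates the definition; the paper concerns the parameterized hardness of doing substantially better.

    import itertools
    import numpy as np

    def has_light_codeword(A: np.ndarray, k: int) -> bool:
        """Decide by brute force whether some nonzero codeword x*A (mod 2)
        has Hamming weight <= k; A is an m x n generator matrix over F_2."""
        m, _ = A.shape
        for message in itertools.product((0, 1), repeat=m):
            if not any(message):
                continue                                  # skip the zero codeword
            codeword = np.mod(np.array(message) @ A, 2)
            if codeword.sum() <= k:
                return True
        return False

    # Generator matrix of the [7,4] Hamming code, whose minimum distance is 3.
    A = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    print(has_light_codeword(A, 2), has_light_codeword(A, 3))   # False, True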

Flexible Parameterization of XOR based Codes for Distributed Storage

2008 Seventh IEEE International Symposium on Network Computing and Applications, 2008
Distributed storage systems apply erasure-tolerant codes to guarantee reliable access to data despite failures of storage resources. While many codes can be mapped to XOR operations and efficiently implemented on common microprocessors, only a limited number of codes are usually implemented in any given system (out of a wide variety of different codes).
Peter Sobe, Kathrin Peter
openaire   +1 more source
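
The paper's flexible parameterization machinery is not shown here; the Python sketch below only illustrates the primitive it refers to, building erasure tolerance from XOR operations: a single parity block over three data blocks (RAID-4 style) lets any one lost block be rebuilt. Block contents and sizes are arbitrary.

    def xor_blocks(*blocks: bytes) -> bytes:
        """XOR equal-length byte blocks together (the basic operation that
        XOR-based erasure codes are built from)."""
        result = bytearray(len(blocks[0]))
        for block in blocks:
            for i, byte in enumerate(block):
                result[i] ^= byte
        return bytes(result)

    # Three data blocks stored on three nodes, plus one XOR parity block.
    d1, d2, d3 = b"AAAAAAAA", b"BBBBBBBB", b"CCCCCCCC"
    parity = xor_blocks(d1, d2, d3)

    # If the node holding d2 fails, its block is rebuilt from the survivors.
    recovered = xor_blocks(d1, d3, parity)
    assert recovered == d2
    print("recovered:", recovered)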

Large Language Diffusion Models

arXiv.org
The capabilities of large language models (LLMs) are widely regarded as relying on autoregressive models (ARMs). We challenge this notion by introducing LLaDA, a diffusion model trained from scratch under the pre-training and supervised fine-tuning (SFT) paradigm.
Shen Nie   +9 more
semanticscholar   +1 more source
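
The snippet does not describe the training mechanism. Purely as a toy illustration of how a diffusion-style language model can be framed (an assumption for illustration, not the LLaDA recipe), the Python sketch below applies a discrete masking "forward process" to a token sequence and shows the shape of a reverse step: proposing tokens for all masked positions in parallel rather than left to right.

    import random

    MASK = "<mask>"

    def forward_mask(tokens: list[str], t: float, rng: random.Random) -> list[str]:
        """Toy forward process: independently replace each token with a mask
        symbol with probability t (t=0 keeps the text, t=1 masks everything)."""
        return [MASK if rng.random() < t else tok for tok in tokens]

    def denoise_step(masked: list[str], predict) -> list[str]:
        """One reverse step: take a proposal for every position at once and
        fill in only the masked slots, instead of generating left to right."""
        proposals = predict(masked)                # one proposal per position
        return [p if tok == MASK else tok for tok, p in zip(masked, proposals)]

    rng = random.Random(0)
    tokens = "the cat sat on the mat".split()
    noisy = forward_mask(tokens, t=0.5, rng=rng)

    # Placeholder "model": a real diffusion LM would be a Transformer trained
    # to recover the original tokens from the masked sequence.
    dummy_predict = lambda seq: ["the"] * len(seq)
    print(noisy)
    print(denoise_step(noisy, dummy_predict))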

Mercury: Ultra-Fast Language Models Based on Diffusion

arXiv.org
We present Mercury, a new generation of commercial-scale large language models (LLMs) based on diffusion. These models are parameterized via the Transformer architecture and trained to predict multiple tokens in parallel.
Samar Khanna   +11 more
semanticscholar   +1 more source
