Results 211 to 220 of about 60,295
Some of the following articles may not be open access.

Recurrent Learning Vector Quantization

2021
Learning Vector Quantization (LVQ) methods have been popular choices of classification model ever since their introduction by T. Kohonen in the 90s. These days, LVQ is combined with deep learning methods to provide powerful yet interpretable machine-learning solutions to some of the most challenging computational problems.
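The LVQ family surveyed in these results shares one core step: find the prototype nearest to a training sample, then attract it if the labels match and repel it otherwise. A minimal LVQ1 update might look like this (an illustrative sketch; the function name and learning rate are our own choices, not taken from any listed paper):

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, lr=0.05):
    """One LVQ1 update: attract the winning (nearest) prototype if its
    label matches y, repel it otherwise. Mutates `prototypes` in place
    and returns the winner's index."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    winner = int(np.argmin(dists))
    sign = 1.0 if proto_labels[winner] == y else -1.0
    prototypes[winner] += sign * lr * (x - prototypes[winner])
    return winner
```

The learned prototypes remain points in the input space, which is what makes LVQ models directly interpretable.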

Fuzzy algorithms for learning vector quantization

IEEE Transactions on Neural Networks, 1996
This paper presents the development of fuzzy algorithms for learning vector quantization (FALVQ). These algorithms are derived by minimizing the weighted sum of the squared Euclidean distances between an input vector, which represents a feature vector, and the weight vectors of a competitive learning vector quantization (LVQ) network, which represent ...
N. B. Karayiannis, P. I. Pai
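The objective the abstract describes, a weighted sum of squared Euclidean distances between an input vector and the network's weight vectors, can be sketched as below. The weight array `u` is a stand-in for FALVQ's membership-style weighting functions, which the paper derives and we do not reproduce here:

```python
import numpy as np

def weighted_quantization_cost(x, W, u):
    """Weighted sum of squared Euclidean distances between input x
    (shape (d,)) and prototype rows of W (shape (c, d)), weighted by
    u (shape (c,)). `u` stands in for FALVQ's membership weights."""
    return float(np.sum(u * np.sum((W - x) ** 2, axis=1)))
```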

Learning vector quantization: The dynamics of winner-takes-all algorithms

Neurocomputing, 2006
Winner-Takes-All (WTA) prescriptions for learning vector quantization (LVQ) are studied in the framework of a model situation: two competing prototype vectors are updated according to a sequence of example data drawn from a mixture of Gaussians. The theory of on-line learning allows for an exact mathematical description of the training dynamics, even ...
Michael Biehl et al.
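A toy version of the model situation above: two competing prototypes trained winner-takes-all on samples drawn from a two-component Gaussian mixture. The means, spread, and learning rate below are assumed for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([[-1.0, 0.0], [1.0, 0.0]])   # two competing prototypes
eta = 0.05                                 # assumed learning rate
for _ in range(2000):
    c = rng.integers(2)                    # pick a mixture component
    mean = np.array([-2.0, 0.0]) if c == 0 else np.array([2.0, 0.0])
    x = rng.normal(loc=mean, scale=0.5)    # draw one example
    k = int(np.argmin(np.linalg.norm(w - x, axis=1)))  # WTA winner
    w[k] += eta * (x - w[k])               # only the winner is updated
```

Under this prescription each prototype drifts toward the mean of the cluster it keeps winning, which is the kind of dynamics the on-line learning theory describes exactly.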

Noise Fuzzy Learning Vector Quantization

Key Engineering Materials, 2010
Fuzzy learning vector quantization (FLVQ) benefits from using the membership values from fuzzy c-means (FCM) as learning rates, which lets it overcome several problems of learning vector quantization (LVQ). However, FLVQ is sensitive to noise because it is an FCM-based algorithm, and FCM itself is sensitive to noise.
Xiao Hong Wu, Bin Wu, Jie Wen Zhao
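The FCM membership values that FLVQ reuses as learning rates can be computed with the standard fuzzy c-means membership formula; the fuzzifier `m` and the small `eps` guard below are our own choices:

```python
import numpy as np

def fcm_memberships(x, V, m=2.0, eps=1e-12):
    """Fuzzy c-means memberships of point x w.r.t. prototype rows V.
    Closer prototypes get larger memberships; the values sum to 1.
    FLVQ uses these as per-prototype learning rates."""
    d2 = np.sum((V - x) ** 2, axis=1) + eps   # guard exact coincidences
    inv = d2 ** (-1.0 / (m - 1.0))
    return inv / inv.sum()
```

Because the memberships are driven purely by distances, an outlier still receives nonzero membership everywhere, which is one way to see the noise sensitivity the abstract points out.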

Dual Weight Learning Vector Quantization

2008 9th International Conference on Signal Processing, 2008
A new learning vector quantization (LVQ) approach, called dual weight learning vector quantization (DWLVQ), is presented in this paper. The basic idea is to introduce an additional weight (namely the importance vector) for each feature of the reference vectors, indicating the importance of that feature during classification.
Chuanfeng Lv et al.
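The idea of attaching a per-feature importance vector can be sketched as a weighted distance used when selecting the winning reference vector (illustrative names only; the paper's update rules for the importance vector are not reproduced here):

```python
import numpy as np

def importance_weighted_dist(x, w, importance):
    """Squared Euclidean distance in which each feature's contribution
    is scaled by its importance weight, so less relevant features count
    less when picking the winning reference vector."""
    return float(np.sum(importance * (x - w) ** 2))
```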

Improving Dynamic Learning Vector Quantization

18th International Conference on Pattern Recognition (ICPR'06), 2006
We introduce several improvements to our previously proposed Dynamic Learning Vector Quantization algorithm, tackling the two major problems of such networks: neuron over-splitting and neuron distribution in the feature space. We suggest explicitly estimating the potential improvement in recognition rate achievable by splitting neurons in those ...
Claudio De Stefano et al.

Expansive and Competitive Learning for Vector Quantization

Neural Processing Letters, 2002
J. Muñoz Pérez et al.

Federated Learning Vector Quantization

ESANN 2021 proceedings, 2021
Johannes Brinkrolf, Barbara Hammer

Fuzzy-Kernel Learning Vector Quantization

2004
This paper presents an unsupervised fuzzy-kernel learning vector quantization algorithm called FKLVQ. FKLVQ is a batch clustering network that fuses batch learning, fuzzy membership functions, and kernel-induced distance measures.
Daoqiang Zhang et al.
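A kernel-induced distance of the kind FKLVQ relies on can be written, for a Gaussian kernel, as ||φ(x) − φ(v)||² = K(x,x) + K(v,v) − 2K(x,v) = 2 − 2K(x,v), since K(x,x) = 1. A sketch, with the bandwidth `gamma` as an assumed parameter:

```python
import numpy as np

def kernel_dist_sq(x, v, gamma=1.0):
    """Squared distance in the feature space induced by the Gaussian
    kernel K(x, v) = exp(-gamma * ||x - v||^2):
    ||phi(x) - phi(v)||^2 = 2 - 2 * K(x, v)."""
    k = np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(v)) ** 2))
    return float(2.0 - 2.0 * k)
```

Replacing the Euclidean distance with such a kernel-induced measure lets a quantization scheme separate clusters that are not linearly separable in the input space.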

A dynamic approach to learning vector quantization

Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004., 2004
Learning Vector Quantization networks are generally considered a powerful pattern recognition tool. Their main drawback, however, is the competitive learning algorithm they are based on, which suffers from the so-called underutilized (or dead) unit problem.
Claudio De Stefano et al.
