Results 271 to 280 of about 8,464,096 (315)
Some of the following articles may not be open access.
A Unified Continual Learning Framework with General Parameter-Efficient Tuning
IEEE International Conference on Computer Vision, 2023
The "pre-training → downstream adaptation" paradigm presents both new opportunities and challenges for Continual Learning (CL). Although the recent state of the art in CL is achieved through the Parameter-Efficient-Tuning (PET) adaptation paradigm, only prompt has ...
Qiankun Gao +6 more
semanticscholar +1 more source
Multiobjectivization for classifier parameter tuning
Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation, 2013
We present a multiobjectivization approach to the parameter tuning of RBF networks and multilayer perceptrons. The approach works by adding two new objectives -- maximization of the kappa statistic and minimization of the root mean square error -- to the originally single-objective problem of minimizing the classification error of the model.
Pilát, M., Neruda, R. (Roman)
openaire +2 more sources
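The two added objectives named in this abstract, the kappa statistic and the root mean square error, can be computed directly. The sketch below is illustrative only: the function names, label sequences, and the sign convention for turning kappa-maximization into minimization are assumptions, not details from the paper.

```python
import math
from collections import Counter

def cohens_kappa(y_true, y_pred):
    """Agreement between two label sequences, corrected for chance."""
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    labels = set(y_true) | set(y_pred)
    expected = sum(true_counts[l] * pred_counts[l] for l in labels) / n ** 2
    return (observed - expected) / (1 - expected)

def rmse(y_true, y_score):
    """Root mean square error between labels and model scores."""
    return math.sqrt(sum((t - s) ** 2 for t, s in zip(y_true, y_score)) / len(y_true))

# A candidate parameter setting is then scored on both objectives;
# kappa is negated so that both objectives are minimized.
y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]
y_score = [0.2, 0.6, 0.8, 0.9, 0.4, 0.1]
objectives = (-cohens_kappa(y_true, y_pred), rmse(y_true, y_score))
```

A multiobjective tuner would compare candidate parameter settings by Pareto dominance over such objective tuples rather than by a single scalar error.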
When MOE Meets LLMs: Parameter Efficient Fine-tuning for Multi-task Medical Applications
Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 2023
The recent surge in Large Language Models (LLMs) has garnered significant attention across numerous fields. Fine-tuning is often required to fit general LLMs for a specific domain, such as a web-based healthcare system.
Qidong Liu +6 more
semanticscholar +1 more source
2021
Regularized estimators consist of two terms, one for comparing model parameters to data and one for including prior information. The tuning parameters define the weighting: small tuning parameters emphasize the data, while large tuning parameters emphasize the prior information.
openaire +1 more source
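The two-term structure described in this abstract can be made concrete with ridge regression, where a single tuning parameter weights a data-fit term against a prior pulling the coefficients toward zero. This is a minimal sketch under that assumption; the names `ridge_fit` and `lam` are illustrative, not from the source.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """argmin_b ||X b - y||^2 + lam * ||b||^2 (prior: b near 0), in closed form."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
b_true = np.array([1.0, -2.0, 0.5])
y = X @ b_true + 0.1 * rng.normal(size=50)

b_small = ridge_fit(X, y, lam=1e-6)  # small tuning parameter: emphasizes the data
b_large = ridge_fit(X, y, lam=1e6)   # large tuning parameter: emphasizes the prior
```

With a tiny `lam` the estimate tracks the data (close to ordinary least squares); with a huge `lam` the prior dominates and the coefficients are shrunk toward zero.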
IEEE Transactions on Power Electronics
Suppressing current harmonics is an essential issue for photovoltaics (PVs). The equivalent impedance or admittance expresses the interaction between the PV converter (PVC) and the background harmonics from the grid, where the control strategies and ...
Pengbo Shan +5 more
semanticscholar +1 more source
Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning
IEEE International Conference on Computer Vision, 2023
Visual Parameter-Efficient Fine-Tuning (PEFT) has become a powerful alternative to full fine-tuning for adapting pre-trained vision models to downstream tasks: it tunes only a small number of parameters while freezing the vast majority to ease ...
Haoyu He +4 more
semanticscholar +1 more source
Parameters and Parameter Tuning
2015
Chapter 3 presented an algorithmic framework that forms the common basis for all evolutionary algorithms. A decision to use an evolutionary algorithm implies that the user adopts the main design decisions behind this framework. Thus, the main algorithm setup follows automatically: the algorithm is based on a population of candidate solutions that is ...
A. E. Eiben, J. E. Smith
openaire +1 more source
NEOCOGNITRON'S PARAMETER TUNING BY GENETIC ALGORITHMS
International Journal of Neural Systems, 1999
A further study of the sensitivity analysis of the Neocognitron is presented in this paper. Fukushima's Neocognitron is capable of recognizing distorted patterns as well as tolerating positional shift. Supervised learning of the Neocognitron is carried out by training patterns layer by layer.
Shi, D., Dong, C., Yeung, D.S.
openaire +2 more sources
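As a rough illustration of parameter tuning by a genetic algorithm (a generic sketch, not the paper's Neocognitron setup), the loop below evolves a real-valued parameter vector against a toy error function. All names, operators, and settings here are assumptions for illustration.

```python
import random

def toy_error(params):
    # Stand-in objective with its minimum at params == (0.5, 0.5);
    # in the paper's setting this would be the network's recognition error.
    return sum((p - 0.5) ** 2 for p in params)

def genetic_tune(fitness, dim=2, pop_size=20, gens=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # truncation selection; top half survives
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim) if dim > 1 else 0
            child = a[:cut] + b[cut:]             # one-point crossover
            i = rng.randrange(dim)                # Gaussian point mutation, clipped to [0, 1]
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_tune(toy_error)
```

Because the top half of the population is carried over unchanged, the best fitness is non-increasing across generations, a common elitism choice in GA-based tuners.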
Parameter-Efficient Fine-Tuning for Large Models: A Comprehensive Survey
Trans. Mach. Learn. Res.
Large models represent a groundbreaking advancement in multiple application fields, enabling remarkable achievements across various tasks. However, their unprecedented scale comes with significant computational costs.
Zeyu Han +4 more
semanticscholar +1 more source
Parameter Tuning of Stable Fuzzy Controllers
Journal of Intelligent and Robotic Systems, 2002
Dieulot, J.-Y., Borne, P.
openaire +2 more sources

