Results 41 to 50 of about 93,556
Lipschitz Adaptivity with Multiple Learning Rates in Online Learning
We aim to design adaptive online learning algorithms that take advantage of any special structure that might be present in the learning task at hand, with as little manual tuning by the user as possible. A fundamental obstacle that comes up in the design ...
Koolen, Wouter M. +2 more
core +1 more source
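As a rough illustration of the "multiple learning rates" idea in the abstract, one can run online gradient descent once per candidate step size and mix the copies with Hedge, so no single step size must be tuned in advance. This is a generic expert-aggregation sketch, not the paper's algorithm; the quadratic loss and learning-rate grid are placeholders.

```python
import numpy as np

# Toy quadratic losses f_t(x) = 0.5 * ||x - x_star||^2; the point is the
# aggregation mechanics, not the loss.
T, d = 500, 2
x_star = np.array([0.7, -0.3])
etas = [2.0 ** -i for i in range(8)]     # geometric grid of candidate step sizes
experts = [np.zeros(d) for _ in etas]    # one OGD iterate per learning rate
log_w = np.zeros(len(etas))              # Hedge log-weights over the experts
hedge_lr = np.sqrt(8.0 * np.log(len(etas)) / T)

def loss(x):
    return 0.5 * float(np.dot(x - x_star, x - x_star))

for t in range(T):
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    x = sum(wi * xi for wi, xi in zip(w, experts))             # played point
    for i, eta in enumerate(etas):
        log_w[i] -= hedge_lr * loss(experts[i])                # Hedge reweighting
        experts[i] = experts[i] - eta * (experts[i] - x_star)  # OGD step

print("combined iterate after T rounds:", x)
```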
Objective: Peripheral neuropathies contribute to patient disability but may be diagnosed late or missed altogether due to late referral, limitations of current diagnostic methods, and a lack of specialized testing facilities. To address this clinical gap, we developed NeuropathAI, an interpretable deep learning–based multiclass classification ...
Chaima Ben Rabah +7 more
wiley +1 more source
Evaluating XGBoost Performance with Oversampling and Hyperparameter Tuning for Alzheimer's Prediction
Alzheimer's is a neurodegenerative disorder that affects cognitive ability and memory, and early detection is essential for appropriate treatment. However, detecting Alzheimer's is costly, so the use of machine learning could ...
Furqon Nurbaril Yahya +2 more
doaj +1 more source
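A minimal sketch of the pipeline this title describes, assuming scikit-learn, imbalanced-learn, and xgboost; the synthetic data and the parameter grid are illustrative stand-ins, not the paper's setup.

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

# Synthetic imbalanced stand-in for the Alzheimer's dataset.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# The imblearn Pipeline applies SMOTE to training folds only, so oversampled
# points never leak into the validation folds of the grid search.
pipe = Pipeline([
    ("smote", SMOTE(random_state=0)),
    ("clf", XGBClassifier(eval_metric="logloss")),
])
grid = GridSearchCV(
    pipe,
    param_grid={
        "clf__max_depth": [3, 5, 7],
        "clf__learning_rate": [0.05, 0.1, 0.3],
        "clf__n_estimators": [100, 300],
    },
    scoring="f1",
    cv=5,
)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_, " test F1:", grid.score(X_te, y_te))
```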
Inefficiency of K-FAC for Large Batch Size Training
In stochastic optimization, using large batch sizes during training can leverage parallel resources to produce faster wall-clock training times per training epoch.
Gholami, Amir +6 more
core +1 more source
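For context, the Kronecker-factored approximation that gives K-FAC its name is the textbook statement below (standard background, not this paper's result):

```latex
% Standard K-FAC approximation of one layer's Fisher block. Here a denotes
% the layer's input activations and g the loss gradient w.r.t. the layer's
% pre-activations.
F = \mathbb{E}\!\left[(a a^{\top}) \otimes (g g^{\top})\right]
  \approx \mathbb{E}[a a^{\top}] \otimes \mathbb{E}[g g^{\top}]
  = A \otimes G,
\qquad
F^{-1} \approx A^{-1} \otimes G^{-1}.
```

Inverting A and G separately is what makes the preconditioner affordable, since the identity (A ⊗ G)^{-1} = A^{-1} ⊗ G^{-1} replaces one very large inverse with two small ones.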
Objective: Accurate localization of epileptogenic tubers (ETs) in patients with tuberous sclerosis complex (TSC) is essential but challenging, as these tubers lack distinct pathological or genetic markers to differentiate them from other cortical tubers.
Tinghong Liu +11 more
wiley +1 more source
Research and Analysis of IndoBERT Hyperparameter Tuning in Fake News Detection
The rapid advancement of communication technology has transformed how information is shared, but it has also brought concerns about the proliferation of false information.
Anugerah Simanjuntak +6 more
doaj +1 more source
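A hedged sketch of what hyperparameter tuning for IndoBERT fine-tuning can look like with the Hugging Face libraries; the checkpoint id points to a public IndoBERT release, while the toy dataset, search grid, and evaluate-on-train shortcut are purely illustrative (a real run would use a held-out split).

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL = "indobenchmark/indobert-base-p1"   # one public IndoBERT checkpoint
tok = AutoTokenizer.from_pretrained(MODEL)

# Tiny in-memory dataset; a real fake-news corpus would replace this.
data = Dataset.from_dict({
    "text": ["contoh berita asli", "contoh berita palsu"] * 8,
    "label": [0, 1] * 8,
}).map(lambda b: tok(b["text"], truncation=True, padding="max_length",
                     max_length=32), batched=True)

best = None
for lr in (2e-5, 3e-5, 5e-5):              # typical BERT fine-tuning range
    for bs in (8, 16):
        model = AutoModelForSequenceClassification.from_pretrained(
            MODEL, num_labels=2)
        args = TrainingArguments(output_dir="out", learning_rate=lr,
                                 per_device_train_batch_size=bs,
                                 num_train_epochs=1, report_to=[])
        trainer = Trainer(model=model, args=args,
                          train_dataset=data, eval_dataset=data)
        trainer.train()
        score = trainer.evaluate()["eval_loss"]
        if best is None or score < best[0]:
            best = (score, lr, bs)

print("best (eval_loss, learning_rate, batch_size):", best)
```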
OBOE: Collaborative Filtering for AutoML Model Selection
Algorithm selection and hyperparameter tuning remain two of the most challenging tasks in machine learning. Automated machine learning (AutoML) seeks to automate these tasks to enable widespread use of machine learning by non-experts.
Akimoto, Yuji +3 more
core +1 more source
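The collaborative-filtering idea can be sketched as low-rank inference over a models-by-datasets error matrix: factor past performance, then infer a new dataset's latent vector from a few cheap probe evaluations and predict the rest. This toy version follows the spirit of OBOE, not the authors' exact algorithm, and uses synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)
E = rng.random((20, 50))      # synthetic error matrix: 20 models x 50 datasets

# Learn rank-k model embeddings from past performance via truncated SVD.
k = 3
U, s, Vt = np.linalg.svd(E, full_matrices=False)
M = U[:, :k] * s[:k]          # model embeddings, shape (20, k)

# New dataset: run only a few cheap "probe" models and observe their errors.
errs_new = rng.random(20)     # ground truth, of which we observe 5 entries
probes = [0, 4, 9, 14, 19]

# Infer the dataset's latent vector v from the probes (least squares),
# then predict the errors of all remaining models as M @ v.
v, *_ = np.linalg.lstsq(M[probes], errs_new[probes], rcond=None)
pred = M @ v

print("recommended model index:", int(np.argmin(pred)))
```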
Hyperparameter Tuning with Renyi Differential Privacy
For many differentially private algorithms, such as the prominent noisy stochastic gradient descent (DP-SGD), the analysis needed to bound the privacy leakage of a single training run is well understood. However, few studies have reasoned about the privacy leakage resulting from the multiple training runs needed to fine-tune the value of the training ...
Papernot, Nicolas, Steinke, Thomas
openaire +2 more sources
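For a sense of why this matters: naive Rényi-DP composition charges the full per-run budget for every tuning run. The sketch below does only that naive accounting, using the standard RDP-to-DP conversion; the paper's tighter selection-based bound is not implemented here.

```python
import math

# Each training run is assumed to satisfy (alpha, eps)-Renyi DP, e.g. as
# reported by a DP-SGD accountant. Naive RDP composition charges k * eps
# for k tuning runs.

def rdp_to_dp(alpha: float, eps_rdp: float, delta: float) -> float:
    """Standard RDP -> (eps, delta)-DP conversion (Mironov, 2017)."""
    return eps_rdp + math.log(1.0 / delta) / (alpha - 1.0)

alpha, eps_per_run, delta = 8.0, 0.5, 1e-5
for k in (1, 4, 16, 64):           # number of hyperparameter candidates tried
    eps_total = k * eps_per_run    # naive composition across the k runs
    print(f"k={k:3d}  naive eps ~ {rdp_to_dp(alpha, eps_total, delta):.2f}")
```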
This paper proposes two projector‐based Hopfield neural network (HNN) estimators for online, constrained parameter estimation under time‐varying data, additive disturbances, and slowly drifting physical parameters. The first is a constraint‐aware HNN that enforces linear equalities and inequalities (via slack neurons) and continuously tracks the ...
Miguel Pedro Silva
wiley +1 more source
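A generic sketch of constraint-aware estimation in the same spirit: a plain projected gradient flow that keeps a least-squares estimate on a linear equality constraint. This is not the authors' HNN design (no slack neurons, no inequality handling); the system and constraint are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 3))         # regressor matrix for y = A @ theta
theta_true = np.array([1.0, -2.0, 0.5])
y = A @ theta_true

# Linear equality constraint C @ theta = d (here satisfied by theta_true).
C = np.array([[1.0, 1.0, 1.0]])
d = np.array([theta_true.sum()])

# Projector onto the null space of C: P = I - C^T (C C^T)^{-1} C.
P = np.eye(3) - C.T @ np.linalg.solve(C @ C.T, C)

# Start from a feasible point, then integrate the projected gradient flow
# theta' = -P A^T (A @ theta - y) with forward Euler; P keeps every iterate
# on the constraint surface.
theta = np.linalg.lstsq(C, d, rcond=None)[0]
for _ in range(2000):
    theta -= 0.01 * P @ (A.T @ (A @ theta - y))

print("estimate:", theta, " constraint residual:", C @ theta - d)
```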