Results 21 to 30 of about 109,032
FedProc: Prototypical contrastive federated learning on non-IID data
Federated learning allows multiple clients to collaborate in training high-performance deep learning models while keeping the training data local. However, when the clients' local data are not independent and identically distributed (i.e., non-IID), implementing this form of efficient collaborative learning becomes challenging.
Xutong Mu +6 more
openaire +2 more sources
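The non-IID setting highlighted above is often simulated by giving each client a label-skewed shard of a common dataset. Below is a minimal NumPy sketch of one common recipe (Dirichlet label skew); the function name and parameters are illustrative, and this is not FedProc's own data pipeline.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha=0.5, seed=0):
    """Split sample indices into label-skewed client shards.

    Smaller alpha -> more skewed (more non-IID) label distributions per client.
    """
    rng = np.random.default_rng(seed)
    num_classes = int(labels.max()) + 1
    client_indices = [[] for _ in range(num_clients)]
    for c in range(num_classes):
        idx_c = np.where(labels == c)[0]
        rng.shuffle(idx_c)
        # Proportion of class c assigned to each client.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx_c)).astype(int)
        for client_id, shard in enumerate(np.split(idx_c, cuts)):
            client_indices[client_id].extend(shard.tolist())
    return [np.array(ix) for ix in client_indices]

# Example: 10,000 samples, 10 classes, 8 clients.
labels = np.random.default_rng(1).integers(0, 10, size=10_000)
shards = dirichlet_partition(labels, num_clients=8, alpha=0.3)
print([len(s) for s in shards])
```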
Peer-to-Peer Learning+Consensus with Non-IID Data
Peer-to-peer deep learning algorithms are enabling distributed edge devices to collaboratively train deep neural networks without exchanging raw training data or relying on a central server. Peer-to-Peer Learning (P2PL) and other algorithms based on Distributed Local-Update Stochastic/mini-batch Gradient Descent (local DSGD) rely on interleaving epochs ...
Srinivasa Pranav, José M. F. Moura
openaire +2 more sources
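As a rough illustration of the pattern this abstract describes, local update steps interleaved with consensus averaging over a mixing matrix, here is a minimal NumPy sketch. The quadratic local objectives, ring topology, and step size are illustrative assumptions, not the P2PL algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)
num_peers, dim = 4, 5

# Each peer holds its own local least-squares problem (a stand-in for local data).
A = [rng.normal(size=(20, dim)) for _ in range(num_peers)]
b = [rng.normal(size=20) for _ in range(num_peers)]
w = [np.zeros(dim) for _ in range(num_peers)]

# Doubly stochastic mixing matrix for a ring of 4 peers (self + two neighbours).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

lr = 0.01
for t in range(200):
    # Local update step: each peer descends its own empirical loss.
    grads = [A[i].T @ (A[i] @ w[i] - b[i]) / len(b[i]) for i in range(num_peers)]
    w = [w[i] - lr * grads[i] for i in range(num_peers)]
    # Consensus step: average parameters with neighbours via the mixing matrix.
    w = [sum(W[i, j] * w[j] for j in range(num_peers)) for i in range(num_peers)]

# Maximum disagreement between peer models after training.
print(max(np.linalg.norm(w[0] - w[i]) for i in range(num_peers)))
```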
Fast converging Federated Learning with Non-IID Data
With the advancement of device capabilities, Internet of Things (IoT) devices can employ built-in hardware to perform machine learning (ML) tasks, extending their horizons in many promising directions. In traditional ML, data are sent to a server for training. However, this approach raises user privacy concerns.
Stephan Sigg, Si-Ahmed Naas
openaire +4 more sources
Data scientists in the Natural Language Processing (NLP) field confront the challenge of reconciling the necessity for data-centric analyses with the imperative to safeguard sensitive information, all while managing the substantial costs linked to the ...
Pascal Riedel +5 more
doaj +1 more source
The effect of (mis-specified) GARCH filters on the finite sample distribution of the BDS test
This paper considers the effect of using a GARCH filter on the properties of the BDS test statistic as well as a number of other issues relating to the application of the test.
Chris Brooks, Saeed M. Heravi
core +2 more sources
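A sketch of the kind of procedure studied here, running the BDS test on raw returns and on GARCH(1,1)-filtered standardized residuals, is shown below. It assumes the third-party `arch` and `statsmodels` packages and uses simulated returns; it is not code from the paper.

```python
import numpy as np
from arch import arch_model                 # GARCH estimation (third-party package)
from statsmodels.tsa.stattools import bds   # BDS test of the i.i.d. hypothesis

# Simulate a GARCH(1,1) return series as a stand-in for real data.
rng = np.random.default_rng(42)
n, omega, alpha, beta = 2000, 0.05, 0.10, 0.85
eps = np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha - beta))
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# BDS test on the raw series (conditional heteroskedasticity violates i.i.d.).
print("raw:", bds(eps, max_dim=3))

# GARCH(1,1) filter: fit the model, then test the standardized residuals.
res = arch_model(eps, vol="GARCH", p=1, q=1).fit(disp="off")
std_resid = np.asarray(res.std_resid)
std_resid = std_resid[~np.isnan(std_resid)]
print("filtered:", bds(std_resid, max_dim=3))
```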
Federated proximal learning with data augmentation for brain tumor classification under heterogeneous data distributions
The increasing use of electronic health records (EHRs) has transformed healthcare management, yet data sharing across institutions remains limited due to privacy concerns.
Swetha Ghanta +5 more
doaj +2 more sources
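The "federated proximal" ingredient usually refers to a FedProx-style penalty that keeps each client's weights close to the server model during local training on heterogeneous data. A minimal PyTorch sketch of that local objective follows; the function and argument names are illustrative, not the paper's implementation.

```python
import torch

def fedprox_local_loss(model, global_params, task_loss, mu=0.01):
    """Local FedProx-style objective: task loss + (mu / 2) * ||w - w_global||^2.

    `global_params` snapshots the server model at the start of the round;
    the proximal term limits client drift on heterogeneous (non-IID) data.
    """
    prox = torch.zeros((), device=task_loss.device)
    for p, g in zip(model.parameters(), global_params):
        prox = prox + torch.sum((p - g.detach()) ** 2)
    return task_loss + 0.5 * mu * prox

# Illustrative usage inside a client's local training loop:
#   global_params = [p.detach().clone() for p in model.parameters()]
#   for x, y in local_loader:
#       optimizer.zero_grad()
#       loss = fedprox_local_loss(model, global_params, criterion(model(x), y))
#       loss.backward()
#       optimizer.step()
```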
Federated learning (FL) is a branch of distributed optimization in which both the collection of data and the training of neural networks (NNs) are decentralized: these tasks are carried out across multiple clients with limited communication and ...
Tobias Sukianto +4 more
doaj +1 more source
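Framed as distributed optimization, the standard federated objective is a sample-weighted sum of per-client empirical risks, with only model updates, never raw data, exchanged between clients and server. This is the generic formulation, not something specific to this work:

```latex
\min_{w \in \mathbb{R}^d} F(w) = \sum_{k=1}^{K} \frac{n_k}{n} F_k(w),
\qquad
F_k(w) = \frac{1}{n_k} \sum_{(x_i, y_i) \in \mathcal{D}_k} \ell\big(w; x_i, y_i\big),
```

where client \(k\) holds a local dataset \(\mathcal{D}_k\) of size \(n_k\), \(n = \sum_k n_k\), and \(\ell\) is the per-sample loss.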
Conditional fiducial models
The fiducial is not unique in general, but we prove that in a restricted class of models it is uniquely determined by the sampling distribution of the data. In particular, it does not depend on the choice of data-generating model.
Bo H. Lindqvist, Gunnar Taraldsen
core +2 more sources
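To make "the fiducial" concrete, the classic location-model illustration (a textbook example, not taken from this paper) inverts a pivotal quantity to turn the sampling distribution of the data into a distribution for the parameter:

```latex
% Location model with known error distribution: X = \theta + E, \; E \sim N(0,1).
X \mid \theta \sim N(\theta, 1)
\quad\Longrightarrow\quad
\theta \;\overset{\text{fiducial, given } X = x}{\sim}\; N(x, 1).
```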
Federated XGBoost on Sample-Wise Non-IID Data
Federated Learning (FL) is a paradigm for jointly training machine learning models in a decentralized manner: parties communicate with an aggregator to create and train a model without exposing the raw data held by the local parties involved in the training process.
Katelinh Jones +3 more
openaire +3 more sources
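One common way to federate XGBoost-style training on sample-wise partitioned data is for each party to send per-bin gradient/hessian histograms to the aggregator, which sums them and scores candidate splits without ever seeing raw samples. The toy NumPy sketch below illustrates only that aggregation step; it is not the paper's protocol, and production systems typically add secure aggregation or encryption.

```python
import numpy as np

def local_histograms(feature, grad, hess, bin_edges):
    """Each party computes per-bin sums of gradients/hessians for one feature."""
    bins = np.digitize(feature, bin_edges)            # bin index for each local sample
    num_bins = len(bin_edges) + 1
    g_hist = np.bincount(bins, weights=grad, minlength=num_bins)
    h_hist = np.bincount(bins, weights=hess, minlength=num_bins)
    return g_hist, h_hist

def best_split_gain(g_hist, h_hist, lam=1.0):
    """Aggregator: scan cumulative sums of the summed histograms for the best split."""
    G, H = g_hist.sum(), h_hist.sum()
    gl, hl = np.cumsum(g_hist)[:-1], np.cumsum(h_hist)[:-1]
    gr, hr = G - gl, H - hl
    gain = gl**2 / (hl + lam) + gr**2 / (hr + lam) - G**2 / (H + lam)
    return int(np.argmax(gain)), float(np.max(gain))

# Two parties hold disjoint samples of the same feature (sample-wise partition).
rng = np.random.default_rng(0)
edges = np.linspace(-2, 2, 9)
parties = []
for _ in range(2):
    x = rng.normal(size=500)
    g, h = rng.normal(size=500), np.ones(500)         # stand-in gradients/hessians
    parties.append(local_histograms(x, g, h, edges))

# Aggregator sums the histograms (never the raw samples) and picks the split.
g_sum = sum(p[0] for p in parties)
h_sum = sum(p[1] for p in parties)
print(best_split_gain(g_sum, h_sum))
```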
Paddy leaf diseases encompass a range of ailments affecting rice plants’ leaves, arising from factors like bacteria, fungi, viruses, and environmental stress.
Meenakshi Aggarwal +6 more
doaj +1 more source