Results 41 to 50 of about 141,184 (170)
Perbandingan Teknik Klasifikasi Dalam Data Mining Untuk Bank Direct Marketing (Comparison of Classification Techniques in Data Mining for Bank Direct Marketing)
Classification is a data mining technique for grouping data according to how the data relate to sample data. In this study, we compare 9 classification techniques for classifying customer responses on the Bank ...
Irvi Oktanisa, Ahmad Afif Supianto
doaj +1 more source
Adaptive Optical Closed-Loop Control Based on the Single-Dimensional Perturbation Descent Algorithm
Modal-free optimization algorithms do not require specific mathematical models and, along with their other benefits, have great application potential in adaptive optics.
Bo Chen +4 more
doaj +1 more source
A Fixed-Point of View on Gradient Methods for Big Data
Interpreting gradient methods as fixed-point iterations, we provide a detailed analysis of those methods for minimizing convex objective functions. Due to their conceptual and algorithmic simplicity, gradient methods are widely used in machine learning ...
Alexander Jung
doaj +1 more source
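The fixed-point view the abstract describes can be made concrete: the gradient step x_{k+1} = x_k - α∇f(x_k) is one application of the map T(x) = x - α∇f(x), and a fixed point of T is exactly a stationary point of f. A minimal sketch on a convex quadratic (the problem and step size here are illustrative, not from the paper):

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T A x - b^T x, viewed as the
# fixed-point iteration x_{k+1} = T(x_k) with T(x) = x - alpha * grad f(x).
# A fixed point x* of T satisfies grad f(x*) = 0, i.e. x* minimizes f.
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, -1.0])

def grad_f(x):
    return A @ x - b

alpha = 0.2  # below 2 / lambda_max(A), so T is a contraction here
x = np.zeros(2)
for _ in range(500):
    x = x - alpha * grad_f(x)  # one application of the fixed-point map T

x_star = np.linalg.solve(A, b)  # closed-form minimizer for comparison
print(np.allclose(x, x_star, atol=1e-6))
```

Because T is a contraction for this step size, the iterates converge geometrically to the unique fixed point, which coincides with the minimizer.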
An efficient algorithm for data parallelism based on stochastic optimization
Deep neural network models can achieve greater performance in numerous machine learning tasks by increasing the depth of the model and the amount of training data.
Khalid Abdulaziz Alnowibet +3 more
doaj +1 more source
Distributed stochastic gradient descent for link prediction in signed social networks
This paper considers the link prediction problem defined over a signed social network, where the relationship between any two network users can be either positive (friends) or negative (foes).
Han Zhang, Gang Wu, Qing Ling
doaj +1 more source
Accelerated Backpressure Algorithm
We develop an Accelerated Back Pressure (ABP) algorithm using Accelerated Dual Descent (ADD), a distributed approximate Newton-like algorithm that only uses local information.
Jadbabaie, Ali +2 more
core +1 more source
Stochastic Gradient Descent on Riemannian Manifolds [PDF]
Stochastic gradient descent is a simple approach to find the local minima of a cost function whose evaluations are corrupted by noise. In this paper, we develop a procedure extending stochastic gradient descent algorithms to the case where the function is defined on a Riemannian manifold.
openaire +2 more sources
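The manifold extension the abstract describes replaces the Euclidean update with two steps: project the gradient onto the tangent space at the current point, then retract the update back onto the manifold. A minimal sketch on the unit sphere (deterministic for clarity; the cost function and step size are illustrative assumptions, not the paper's setup):

```python
import numpy as np

# Gradient descent on the unit sphere, the simplest Riemannian manifold:
# the Euclidean gradient is projected onto the tangent space at x, and a
# retraction (renormalization) maps the update back onto the sphere.
# We minimize f(x) = -x^T C x, whose minimizer on the sphere is the
# leading eigenvector of C.
rng = np.random.default_rng(0)
C = np.diag([5.0, 2.0, 1.0])  # leading eigenvector is e_1

x = rng.standard_normal(3)
x /= np.linalg.norm(x)
step = 0.1
for _ in range(200):
    egrad = -2.0 * C @ x             # Euclidean gradient of f at x
    rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
    x = x - step * rgrad             # step in the tangent direction
    x /= np.linalg.norm(x)           # retraction back onto the sphere

print(abs(x[0]))  # near 1 when x has aligned with the leading eigenvector
```

In the stochastic setting of the paper, `egrad` would be a noisy gradient estimate, but the projection-plus-retraction structure of each step is the same.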
Stochastic gradient descent algorithm preserving differential privacy in MapReduce framework
Aiming at the tension between the efficiency and the privacy of the stochastic gradient descent algorithm in a distributed computing environment, a stochastic gradient descent algorithm preserving differential privacy based on MapReduce is proposed. Based on ...
Yihan YU, Yu FU, Xiaoping WU
doaj +2 more sources
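The core differentially private SGD step is standard: clip each per-example gradient to a norm bound, then add Gaussian noise calibrated to that bound before applying the averaged update. The sketch below shows only that step; the clipping bound `C`, noise scale `sigma`, and function names are illustrative assumptions, and the paper's MapReduce partitioning and privacy accounting are not modeled:

```python
import numpy as np

rng = np.random.default_rng(2)

def dp_sgd_step(w, per_example_grads, lr=0.1, C=1.0, sigma=0.5):
    """One differentially private SGD step: clip, add noise, average, update."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, C / max(norm, 1e-12)))  # norm at most C
    # Gaussian noise with scale proportional to the clipping bound C
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(0.0, sigma * C, size=w.shape)
    return w - lr * noisy_sum / len(per_example_grads)

w = np.zeros(2)
grads = [np.array([3.0, 0.0]), np.array([0.2, -0.1])]
w = dp_sgd_step(w, grads)
print(w)
```

Clipping bounds each example's influence on the update, which is what makes the added noise sufficient for a differential privacy guarantee.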
Recent Advances in Stochastic Gradient Descent in Deep Learning
In the age of artificial intelligence, finding the best approach to handling huge amounts of data is a tremendously motivating and hard problem. Among machine learning models, stochastic gradient descent (SGD) is not only simple but also very effective.
Yingjie Tian, Yuqi Zhang, Haibin Zhang
doaj +1 more source
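The simplicity the abstract points to is visible in a few lines: each SGD step samples one training example and moves the parameters along the negative gradient of that example's loss. A minimal sketch on noise-free linear regression (the data and learning rate are illustrative):

```python
import numpy as np

# Minimal SGD: fit w in y = X w by sampling one example per step and
# stepping along the negative gradient of that example's squared error.
rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])
X = rng.standard_normal((200, 2))
y = X @ w_true  # noise-free labels, so SGD can recover w_true exactly

w = np.zeros(2)
lr = 0.05
for t in range(5000):
    i = rng.integers(len(X))   # draw one training example at random
    err = X[i] @ w - y[i]      # residual on that example
    w -= lr * err * X[i]       # stochastic gradient step
print(np.round(w, 3))
```

With noisy labels or nonconvex models, the same loop applies; the per-step gradient is just an unbiased estimate of the full gradient, which is what the surveyed variants (momentum, adaptive step sizes, variance reduction) build on.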
Distributed Stochastic Gradient Descent With Compressed and Skipped Communication
This paper introduces CompSkipDSGD, a new algorithm for distributed stochastic gradient descent that aims to improve communication efficiency by compressing and selectively skipping communication.
Tran Thi Phuong +2 more
doaj +1 more source
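The two communication-saving ideas named in the abstract can be sketched independently: compress a gradient by keeping only its top-k entries, and skip sending it entirely when the surviving update is too small to matter. The function names and skipping threshold below are illustrative assumptions, not the paper's actual CompSkipDSGD protocol:

```python
import numpy as np

def top_k_compress(grad, k):
    """Zero out all but the k largest-magnitude entries of the gradient."""
    out = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[-k:]  # indices of the k largest magnitudes
    out[idx] = grad[idx]
    return out

def maybe_send(grad, k, threshold):
    """Return the compressed gradient, or None to skip this round."""
    compressed = top_k_compress(grad, k)
    if np.linalg.norm(compressed) < threshold:
        return None  # skipped: the update is negligible, save the bandwidth
    return compressed

g = np.array([0.05, -3.0, 0.1, 2.0, -0.01])
sent = maybe_send(g, k=2, threshold=0.5)
print(sent)  # only the two largest-magnitude entries survive
```

In practice such schemes usually pair compression with error feedback (accumulating the dropped entries locally) so that skipped or truncated information is not lost permanently.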

