Results 11 to 20 of about 265,325

Revisiting Gradient Clipping: Stochastic bias and tight convergence guarantees [PDF]

open access: yes · International Conference on Machine Learning, 2023
Gradient clipping is a popular modification to standard (stochastic) gradient descent that, at every iteration, limits the gradient norm to a certain value $c > 0$. (A minimal sketch of this operation follows the entry.)
Anastasia Koloskova   +2 more
semanticscholar   +1 more source
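The operation described in this abstract is easy to state concretely. The sketch below is a minimal illustration, not the paper's algorithm: a stochastic gradient is rescaled so that its Euclidean norm never exceeds a threshold c before the SGD step. The learning rate, threshold, and toy values are arbitrary assumptions.

    import numpy as np

    def clip_by_norm(grad: np.ndarray, c: float) -> np.ndarray:
        """Rescale grad so its Euclidean norm is at most c (no-op if already smaller)."""
        norm = np.linalg.norm(grad)
        return grad * (c / norm) if norm > c else grad

    # Toy SGD step with clipping (illustrative values only).
    w = np.zeros(3)
    g = np.array([3.0, 4.0, 0.0])        # stochastic gradient, norm = 5
    w -= 0.1 * clip_by_norm(g, c=1.0)    # the step uses a gradient rescaled to norm 1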

Automatic Clipping: Differentially Private Deep Learning Made Easier and Stronger [PDF]

open access: yes · Neural Information Processing Systems, 2022
Per-example gradient clipping is a key algorithmic step that enables practical differentially private (DP) training for deep learning models. The choice of clipping threshold R, however, is vital for achieving high accuracy under DP. (A hedged sketch of per-example clipping follows this entry.)
Zhiqi Bu   +3 more
semanticscholar   +1 more source
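The abstract names per-example clipping with threshold R as the key step but does not spell out the update, so the following is a hedged, DP-SGD-style sketch rather than the paper's "automatic clipping" rule. The noise multiplier, batch handling, and helper name dp_sgd_update are illustrative assumptions.

    import numpy as np

    def dp_sgd_update(per_example_grads: np.ndarray, R: float,
                      noise_multiplier: float, rng: np.random.Generator) -> np.ndarray:
        """Clip each example's gradient to norm R, average, then add Gaussian noise."""
        clipped = [g * min(1.0, R / (np.linalg.norm(g) + 1e-12))
                   for g in per_example_grads]            # per_example_grads: (batch, dim)
        mean_grad = np.mean(clipped, axis=0)
        noise_std = noise_multiplier * R / len(per_example_grads)
        return mean_grad + rng.normal(0.0, noise_std, size=mean_grad.shape)

    rng = np.random.default_rng(0)
    grads = rng.normal(size=(8, 5))                       # 8 examples, 5 parameters
    update = dp_sgd_update(grads, R=1.0, noise_multiplier=1.1, rng=rng)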

Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping [PDF]

open access: yes · International Conference on Learning Representations, 2022
Differentially private deep learning has recently witnessed advances in computational efficiency and privacy-utility trade-off. We explore whether further improvements along the two axes are possible and provide affirmative answers leveraging two ...
Jiyan He   +8 more
semanticscholar   +1 more source

Optimal Clipping and Magnitude-aware Differentiation for Improved Quantization-aware Training [PDF]

open access: yes · International Conference on Machine Learning, 2022
Data clipping is crucial for reducing noise in quantization operations and improving the achievable accuracy of quantization-aware training (QAT). Current practices rely on heuristics to set clipping threshold scalars and cannot be shown to be optimal. ... (A sketch of the underlying trade-off follows this entry.)
Charbel Sakr   +5 more
semanticscholar   +1 more source
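The abstract's point is that the clipping threshold controls a trade-off: a small threshold clips away large values, while a large one spreads the same number of quantization levels over a wider range and coarsens the grid. Below is a hedged NumPy sketch of that trade-off for symmetric uniform quantization; it is not the paper's optimality derivation, and the thresholds and bit-width are arbitrary assumptions.

    import numpy as np

    def clip_and_quantize(x: np.ndarray, s: float, bits: int = 4) -> np.ndarray:
        """Clamp x to [-s, s], then round onto a uniform grid with 2**bits levels."""
        levels = 2 ** bits - 1
        step = 2 * s / levels
        return np.round((np.clip(x, -s, s) + s) / step) * step - s

    rng = np.random.default_rng(0)
    x = rng.normal(size=100_000)
    for s in (1.0, 3.0, 6.0):   # small s: more clipping error; large s: coarser grid
        mse = np.mean((x - clip_and_quantize(x, s)) ** 2)
        print(f"threshold s={s}: quantization MSE={mse:.5f}")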

A Pragmatic Randomized Trial Comparing Surgical Clipping and Endovascular Treatment of Unruptured Intracranial Aneurysms

open access: yes · American Journal of Neuroradiology, 2023
Surgical clipping is more effective than endovascular treatment of unruptured intracranial aneurysms in terms of the frequency of the primary outcome of treatment failure. Results were mainly driven by angiographic results at 1 year.
M. Chagnon   +49 more
semanticscholar   +1 more source

Endoscopic direct clipping versus indirect clipping for colonic diverticular bleeding: A large multicenter cohort study

open access: yes · United European Gastroenterology Journal, 2022
Background: Direct and indirect clipping treatments are used worldwide to treat colonic diverticular bleeding (CDB), but their effectiveness has not been examined in multicenter studies with more than 100 cases. Objective: We sought to determine the short‐...
T. Kishino   +44 more
semanticscholar   +1 more source

Scaling up Differentially Private Deep Learning with Fast Per-Example Gradient Clipping [PDF]

open access: yes · Proceedings on Privacy Enhancing Technologies, 2020
Recent work on Renyi Differential Privacy has shown the feasibility of applying differential privacy to deep learning tasks. Despite their promise, however, differentially private deep networks often lag far behind their non-private counterparts in ...
Jaewoo Lee, Daniel Kifer
semanticscholar   +1 more source

Surgical Clipping Versus Endovascular Coiling in the Management of Intracranial Aneurysms

open access: yes · Cureus, 2021
Intracranial aneurysms are pathological dilatations of intracranial arteries and are prevalent in around 3.2% of the general population. The worst outcome of an aneurysm is its rupture. Its prevention and management can be accomplished by two broad modalities: ...
Rishab Belavadi   +7 more
semanticscholar   +1 more source

TCL: an ANN-to-SNN Conversion with Trainable Clipping Layers [PDF]

open access: yes · Design Automation Conference, 2020
Spiking neural networks (SNNs) are promising for edge devices since the event-driven operations of SNNs provide significantly lower power compared to analog neural networks (ANNs).
Nguyen-Dong Ho, I. Chang
semanticscholar   +1 more source

Autoclip: Adaptive Gradient Clipping for Source Separation Networks [PDF]

open access: yes · International Workshop on Machine Learning for Signal Processing, 2020
Clipping the gradient is a known approach to improving gradient descent, but it requires hand selection of a clipping threshold hyperparameter. We present AutoClip, a simple method for automatically and adaptively choosing a gradient clipping threshold ... (A sketch of the adaptive-threshold idea follows this entry.)
Prem Seetharaman   +3 more
semanticscholar   +1 more source
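The abstract describes choosing the clipping threshold automatically from training statistics rather than by hand. The sketch below illustrates one such rule, clipping to a percentile of the gradient-norm history observed so far; the specific percentile rule and the class name AdaptiveClipper are assumptions for illustration, not necessarily AutoClip as published.

    import numpy as np

    class AdaptiveClipper:
        """Clip gradients to a percentile of the gradient-norm history seen so far."""
        def __init__(self, percentile: float = 10.0):
            self.percentile = percentile
            self.norm_history: list[float] = []

        def __call__(self, grad: np.ndarray) -> np.ndarray:
            norm = float(np.linalg.norm(grad))
            self.norm_history.append(norm)
            threshold = np.percentile(self.norm_history, self.percentile)
            return grad * min(1.0, threshold / (norm + 1e-12))

    clipper = AdaptiveClipper(percentile=10.0)
    rng = np.random.default_rng(0)
    for _ in range(100):
        g = rng.normal(size=10) * rng.uniform(0.5, 5.0)   # gradients of varying scale
        g_clipped = clipper(g)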
