Results 1 to 10 of about 491
Convergence of Linear Bregman ADMM for Nonconvex and Nonsmooth Problems with Nonseparable Structure
The alternating direction method of multipliers (ADMM) is an effective method for solving two-block separable convex problems, and its convergence is well understood.
Miantao Chao, Zhao Deng, Jinbao Jian
doaj +2 more sources
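For reference, the snippet above refers to the classical two-block ADMM for problems of the form \(\min_{x,y} f(x) + g(y)\) subject to \(Ax + By = b\); a standard statement of its updates (generic, not the linear Bregman variant developed in the paper):

\[
\begin{aligned}
x^{k+1} &\in \arg\min_x \; f(x) + \langle \lambda^k,\, Ax + By^k - b\rangle + \tfrac{\beta}{2}\,\|Ax + By^k - b\|^2,\\
y^{k+1} &\in \arg\min_y \; g(y) + \langle \lambda^k,\, Ax^{k+1} + By - b\rangle + \tfrac{\beta}{2}\,\|Ax^{k+1} + By - b\|^2,\\
\lambda^{k+1} &= \lambda^k + \beta\,(Ax^{k+1} + By^{k+1} - b),
\end{aligned}
\]

where \(\lambda\) is the Lagrange multiplier and \(\beta > 0\) the penalty parameter.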
A Modified Approach to Distributed Bregman ADMM for a Class of Nonconvex Consensus Problems
This article presents a refined variant of the distributed Bregman alternating direction method of multipliers (ADMM) tailored to nonconvex consensus problems, especially those with multiple blocks.
Zhonghui Xue, Qianfeng Ma, Yazheng Dang
doaj +2 more sources
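The multi-block consensus problem targeted by such methods is typically written as (a generic formulation, not necessarily the paper's exact model):

\[
\min_{x_1,\dots,x_N,\,z} \;\sum_{i=1}^N f_i(x_i) \quad \text{s.t.} \quad x_i = z,\; i = 1,\dots,N,
\]

where each agent \(i\) holds a (possibly nonconvex) local objective \(f_i\) and the constraints force agreement on the shared variable \(z\). A Bregman ADMM replaces the quadratic proximal term in the subproblems with a Bregman distance \(D_\phi(x, x^k)\).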
Bregman iterative regularization using model functions for nonconvex nonsmooth optimization
In this paper, we propose a new algorithm called ModelBI by blending the Bregman iterative regularization method and the model function technique for solving a class of nonconvex nonsmooth optimization problems.
Haoxing Yang +3 more
doaj +1 more source
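As background, the classical Bregman iterative regularization for minimizing a data term \(E\) with regularizer \(J\) reads (a generic statement; ModelBI's model-function modification is not reproduced here):

\[
x^{k+1} \in \arg\min_x \; E(x) + D_J^{p^k}(x, x^k), \qquad p^{k+1} = p^k - \nabla E(x^{k+1}) \in \partial J(x^{k+1}),
\]

where \(D_J^{p}(x,y) = J(x) - J(y) - \langle p,\, x - y\rangle\) is the Bregman distance of \(J\) at \(y\) with subgradient \(p \in \partial J(y)\).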
Convergence Analysis of Multiblock Inertial ADMM for Nonconvex Consensus Problem
The alternating direction method of multipliers (ADMM) is one of the most powerful and successful methods for solving various nonconvex consensus problems. The convergence of the conventional (i.e., two-block) ADMM for convex objective functions has long been established.
Yang Liu, Yazheng Dang, Qiang Wu
wiley +1 more source
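The inertial mechanism referenced in the title amounts to extrapolating each block before its ADMM update; a minimal sketch of the extrapolation step (generic, not the paper's exact scheme):

\[
\bar{x}_i^{\,k} = x_i^{\,k} + \theta_k\,\big(x_i^{\,k} - x_i^{\,k-1}\big), \qquad \theta_k \in [0,1),
\]

after which the block subproblem is solved around \(\bar{x}_i^{\,k}\) instead of \(x_i^{\,k}\).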
Non‐convex nonlocal adaptive tight frame image deblurring
Abstract The challenge of image restoration is to recover more detailed information from degraded images. Based on the observations that wavelet frames can efficiently represent image details and that nonconvex regularization in the model may admit unbiased solutions, in this paper, in order to recover more details, a wavelet ...
Zhengwei Shen
wiley +1 more source
Abstract We construct an example of a smooth convex function on the plane with a strict minimum at zero, which is real analytic except at zero, for which Thom's gradient conjecture fails both at zero and infinity. More precisely, the gradient orbits of the function spiral around zero and at infinity.
Aris Daniilidis +2 more
wiley +1 more source
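For context, Thom's gradient conjecture concerns trajectories of the gradient flow

\[
\dot{x}(t) = -\nabla f(x(t))
\]

of a real-analytic function \(f\), and asserts that if an orbit \(x(t)\) converges to a point \(x_\infty\), then the secants \((x(t) - x_\infty)/\|x(t) - x_\infty\|\) converge, i.e. the orbit has a tangent at its limit. The spiraling orbits constructed above are exactly those for which this tangent fails to exist.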
Factorization of completely positive matrices using iterative projected gradient steps
Abstract We aim to factorize a completely positive matrix by using an optimization approach which consists in the minimization of a nonconvex smooth function over a convex and compact set. To solve this problem we propose a projected gradient algorithm with parameters that take into account the effects of relaxation and inertia.
Radu Ioan Boţ, Dang‐Khoa Nguyen
wiley +1 more source
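A generic projected gradient step with inertia and relaxation, of the kind described above (a sketch; the paper's parameter rules are not reproduced):

\[
\begin{aligned}
y^k &= x^k + \alpha_k\,(x^k - x^{k-1}),\\
x^{k+1} &= (1-\rho_k)\,y^k + \rho_k\, P_C\!\big(y^k - \gamma_k \nabla f(y^k)\big),
\end{aligned}
\]

where \(P_C\) is the projection onto the convex compact feasible set \(C\), \(\alpha_k\) the inertial parameter, \(\gamma_k\) the step size, and \(\rho_k\) the relaxation parameter.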
A LogTVSCAD Nonconvex Regularization Model for Image Deblurring in the Presence of Impulse Noise
This paper proposes a nonconvex model (called LogTVSCAD) for deblurring images corrupted by impulse noise, using the log-function penalty as the regularizer and adopting the smoothly clipped absolute deviation (SCAD) function as the data-fitting term. The proposed nonconvex model can effectively overcome the poor performance of the classical TVL1 model for ...
Zhijun Luo +3 more
wiley +1 more source
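Combining the ingredients named in the snippet, a model of this type takes the generic form (an illustrative formulation only; the paper's exact composition and parameters may differ):

\[
\min_u \;\sum_i \phi_{\mathrm{SCAD}}\big((Ku - f)_i\big) + \mu \sum_i \log\!\big(1 + |(\nabla u)_i|/\varepsilon\big),
\]

where \(K\) is the blur operator, \(f\) the observed image, \(\phi_{\mathrm{SCAD}}\) the smoothly clipped absolute deviation penalty applied to the data-fitting residual, and the log term a nonconvex total-variation-type regularizer.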
Nonconvex and nonsmooth optimization problems have been attracting increasing attention in recent years in image processing and machine learning research. Reweighted steps within such algorithms have been widely used in many applications.
Juyeb Yeo, Myeongmin Kang
doaj +1 more source
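A canonical example of such a reweighted step is the iteratively reweighted \(\ell_1\) scheme (a generic instance, not necessarily the authors' algorithm):

\[
x^{k+1} \in \arg\min_x \; \tfrac{1}{2}\|Ax - b\|^2 + \lambda \sum_i w_i^k\,|x_i|, \qquad w_i^k = \frac{1}{|x_i^k| + \varepsilon},
\]

which majorizes a nonconvex log-type penalty by a weighted \(\ell_1\) norm at each iteration.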
Convergence Rate Analysis of the Proximal Difference-of-Convex Algorithm
In this paper, we study the convergence rate of the proximal difference-of-convex algorithm for problems involving a strongly convex function and two convex functions. By making full use of the special structure of the difference-of-convex decomposition, we prove that the convergence rate of the proximal difference-of-convex algorithm is linear ...
Xueyong Wang +4 more
wiley +1 more source
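For reference, the proximal difference-of-convex algorithm (pDCA) for \(\min_x\, f(x) + g(x) - h(x)\), with \(f\) smooth convex, \(g\) and \(h\) convex, performs (a standard statement, assuming the strong convexity resides in the smooth part):

\[
x^{k+1} \in \arg\min_x \; g(x) + \langle \nabla f(x^k) - \xi^k,\, x\rangle + \tfrac{1}{2\gamma}\|x - x^k\|^2, \qquad \xi^k \in \partial h(x^k),
\]

linearizing \(f\) and the concave part \(-h\) at \(x^k\) and adding a proximal term.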

