Improved Denoising Auto-Encoders for Image Denoising
2018 11th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), 2018
Image denoising is an important pre-processing step in image analysis. Various denoising algorithms, such as BM3D, PCD, and K-SVD, achieve remarkable results. Recently, a deep denoising auto-encoder has been proposed and has shown excellent performance compared to conventional image denoising algorithms.
Qian Xiang, Xuliang Pang
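As an illustrative sketch of the denoising auto-encoder idea (not the paper's architecture), a minimal single-hidden-layer version can be trained on toy 1-D signals with NumPy: corrupt the input, then learn to reconstruct the clean target.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_signals(n, d=32):
    # Toy data: random low-frequency sinusoids of length d.
    t = np.linspace(0, 2 * np.pi, d)
    freqs = rng.integers(1, 4, size=n)
    phases = rng.uniform(0, 2 * np.pi, size=n)
    return np.sin(freqs[:, None] * t[None, :] + phases[:, None])

X = make_signals(512)            # clean training signals
noise_std = 0.3

d, h = X.shape[1], 16            # input dim, hidden dim
W1 = rng.normal(0, 0.1, (d, h)); b1 = np.zeros(h)
W2 = rng.normal(0, 0.1, (h, d)); b2 = np.zeros(d)
lr, losses = 0.05, []

for epoch in range(300):
    Xn = X + rng.normal(0, noise_std, X.shape)  # corrupt the input
    H = np.tanh(Xn @ W1 + b1)                   # encoder
    Y = H @ W2 + b2                             # linear decoder
    err = Y - X                                 # target is the CLEAN signal
    losses.append(float((err ** 2).mean()))
    # Backpropagation of the mean-squared reconstruction error.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = Xn.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def denoise(x):
    # Apply the trained auto-encoder to a batch of (noisy) signals.
    return np.tanh(x @ W1 + b1) @ W2 + b2
```

The key detail is the loss target: the network sees the corrupted input `Xn` but is penalized against the clean signal `X`, which is what distinguishes a denoising auto-encoder from a plain one.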
Dynamic Slimmable Denoising Network
IEEE Transactions on Image Processing, 2023
Recently, a great many human-designed and automatically searched neural networks have been applied to image denoising. However, previous works attempt to handle all noisy images with a single pre-defined static network architecture, which inevitably incurs high computational cost to achieve good denoising quality.
Zutao Jiang +5 more
2007
We consider the problem of denoising a noisily sampled submanifold M in R^d, where the submanifold M is a priori unknown and we are only given a noisy point sample. The presented denoising algorithm is based on a graph-based diffusion process of the point sample.
Hein, M., Maier, M.
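As a hedged illustration of the general idea (not the authors' exact algorithm), one graph-based diffusion scheme can be sketched in NumPy: build a Gaussian affinity graph over the point sample, row-normalize it into a transition matrix, and let the points take a few averaging steps.

```python
import numpy as np

def diffusion_denoise(points, sigma=0.3, steps=5):
    """Denoise a point sample by a few steps of graph diffusion:
    each point moves toward the affinity-weighted mean of its
    neighbors, smoothing noise orthogonal to the underlying manifold."""
    X = np.asarray(points, dtype=float).copy()
    for _ in range(steps):
        # Pairwise squared distances and Gaussian affinities.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2 * sigma ** 2))
        P = W / W.sum(axis=1, keepdims=True)  # row-stochastic diffusion operator
        X = P @ X                             # one averaging step
    return X

# Usage: a noisy sample of a line segment in the plane.
rng = np.random.default_rng(0)
t = np.linspace(0, 3, 200)
noisy = np.stack([t, rng.normal(0, 0.1, t.size)], axis=1)
clean = diffusion_denoise(noisy)
```

Too many steps over-smooth the sample (points collapse toward local means), so `sigma` and `steps` act as regularization knobs.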
Multiscale Image Blind Denoising
IEEE Transactions on Image Processing, 2015
Arguably, several thousand papers are dedicated to image denoising. Most assume a fixed noise model, mainly white Gaussian or Poisson. This assumption is only valid for raw images. Yet in most images handled by the public, and even by scientists, the noise model is imperfectly known or unknown.
Marc Lebrun +2 more
IEEE Transactions on Information Theory, 2000
Summary: The so-called denoising problem, relative to normal models for noise, is formalized such that "noise" is defined as the incompressible part of the data, while the compressible part defines the meaningful, information-bearing signal. Such a decomposition is effected by minimizing the ideal code length, as called for by the minimum description length (MDL) principle.
Image denoising review: From classical to state-of-the-art approaches
Information Fusion, 2020
Bhawna Goyal, Ayush Dogra, Sunil Agrawal

