Results 21 to 30 of about 1,562,532 (289)
“They say ev’rything can be replaced / Yet ev’ry distance is not near” Bob ...
Thomas Ballhausen, Lisa Leitenmüller
+5 more sources
A major bottleneck in distributed learning is the communication overhead of exchanging intermediate model update parameters between the worker nodes and the parameter server.
Naifu Zhang, Meixia Tao
doaj +1 more source
Source Coding with a Causal Helper
A multi-terminal network is considered in which an encoder, assisted by a side-information-aided helper, describes a memoryless, identically distributed source to a receiver. The encoder provides a non-causal one-shot description of the source to both
Shraga I. Bross
doaj +1 more source
Hyperspectral Pansharpening Based on Homomorphic Filtering and Weighted Tensor Matrix
Hyperspectral pansharpening is an effective technique to obtain a high spatial resolution hyperspectral (HS) image. In this paper, a new hyperspectral pansharpening algorithm based on homomorphic filtering and weighted tensor matrix (HFWT) is proposed ...
Jiahui Qu +4 more
doaj +1 more source
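The homomorphic filtering step named in the entry above can be illustrated in one dimension: a signal modeled as the product of a slowly varying illumination and a fast-varying reflectance becomes a sum in the log domain, where a simple low-pass filter separates the two components. A minimal sketch, assuming a moving-average low-pass and a synthetic 1-D signal (the paper's weighted tensor matrix step is not reproduced here):

```python
import math

def moving_average(x, k):
    """Centered moving average of width 2k+1 (edges clamped)."""
    n = len(x)
    out = []
    for i in range(n):
        lo, hi = max(0, i - k), min(n, i + k + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def homomorphic_split(signal, k=25):
    """Return (illumination, reflectance) with signal = illum * refl.

    In the log domain the product becomes a sum; the low-pass output is
    taken as the illumination, the residual as the reflectance.
    """
    log_s = [math.log(v) for v in signal]
    log_illum = moving_average(log_s, k)           # low-pass = illumination
    illum = [math.exp(v) for v in log_illum]
    refl = [s / i for s, i in zip(signal, illum)]  # residual = reflectance
    return illum, refl

# Synthetic example: slow ramp (illumination) times fast ripple (reflectance).
n = 500
illum_true = [1.0 + 0.5 * i / n for i in range(n)]
refl_true = [1.0 + 0.1 * math.sin(2 * math.pi * i / 10) for i in range(n)]
signal = [a * b for a, b in zip(illum_true, refl_true)]

illum, refl = homomorphic_split(signal)
err = max(abs(a - b) for a, b in zip(illum, illum_true))
```

The recovered illumination tracks the slow ramp closely because the ripple, averaged over several of its periods, nearly cancels in the log domain.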
Structure Tensor-Based Algorithm for Hyperspectral and Panchromatic Images Fusion
Restricted by technical and budget constraints, hyperspectral (HS) image which contains abundant spectral information generally has low spatial resolution. Fusion of hyperspectral and panchromatic (PAN) images can merge spectral information of the former
Jiahui Qu +5 more
doaj +1 more source
Polar codes for distributed source coding [PDF]
A polar coding method is proposed for constructing a distributed source coding scheme that can achieve any point on the dominant face of the Slepian–Wolf rate region for sources with uniform marginals.
openaire +6 more sources
On Linear Coding over Finite Rings and Applications to Computing
This paper presents a coding theorem for linear coding over finite rings, in the setting of the Slepian–Wolf source coding problem. This theorem covers corresponding achievability theorems of Elias (IRE Conv. Rec. 1955, 3, 37–46) and Csiszár (IEEE Trans.
Sheng Huang, Mikael Skoglund
doaj +1 more source
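The Slepian–Wolf problem underlying the two entries above admits a compact illustration via syndrome (binning) coding over GF(2): the encoder transmits only the syndrome of X under a linear code's parity-check matrix, and the decoder recovers X from that syndrome plus the correlated side information Y. A minimal sketch assuming the Hamming(7,4) code and a correlation model in which X and Y differ in at most one bit (both assumptions are illustrative, not taken from either paper):

```python
H = [  # 3x7 parity-check matrix of the Hamming(7,4) code
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(x):
    """3-bit syndrome s = H x (mod 2) of a 7-bit word x."""
    return [sum(h * xi for h, xi in zip(row, x)) % 2 for row in H]

def sw_decode(s, y):
    """Recover x from its syndrome s and side information y.

    The error pattern e = x XOR y has weight <= 1 by assumption, so its
    syndrome H e = s XOR H y pinpoints the flipped position (the columns
    of H enumerate the positions 1..7 in binary).
    """
    se = [a ^ b for a, b in zip(s, syndrome(y))]
    pos = se[0] + 2 * se[1] + 4 * se[2]      # 0 means e = 0, i.e. x = y
    x = list(y)
    if pos:
        x[pos - 1] ^= 1
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]                    # differs from x in bit 4
s = syndrome(x)                              # only 3 bits are transmitted
assert sw_decode(s, y) == x
```

The encoder thus sends 3 bits instead of 7, which is exactly the conditional-rate saving the Slepian–Wolf bound promises under this one-bit-difference correlation model.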
Robust 2-bit Quantization of Weights in Neural Network Modeled by Laplacian Distribution
Significant efforts are constantly devoted to finding ways to decrease the number of bits required for quantization of neural network parameters. Although in addition to compression, in neural networks, the application of quantizer models that are ...
Z. Peric +3 more
doaj +1 more source
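A 2-bit quantizer of the kind discussed in the entry above maps each weight to one of four reproduction levels. The sketch below uses a plain symmetric 4-level scalar quantizer, with the decision threshold at the mean absolute value and reproduction levels at the conditional means of each region; this design, and the inverse-CDF Laplacian sampler, are illustrative assumptions rather than the paper's optimized quantizer:

```python
import math
import random

def laplacian_sample(b, rng):
    """Draw one Laplacian(0, b) sample via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def quantize_2bit(w, t, r_in, r_out):
    """Map w to one of 4 levels: +/-r_in inside (-t, t), +/-r_out outside."""
    mag = r_in if abs(w) < t else r_out
    return math.copysign(mag, w)

rng = random.Random(0)
weights = [laplacian_sample(1.0, rng) for _ in range(10_000)]

# Simple (non-optimal) design: threshold at the mean absolute value,
# reproduction levels at the centroids (conditional means) of each region.
t = sum(abs(w) for w in weights) / len(weights)
inner = [abs(w) for w in weights if abs(w) < t]
outer = [abs(w) for w in weights if abs(w) >= t]
r_in, r_out = sum(inner) / len(inner), sum(outer) / len(outer)

mse = sum((w - quantize_2bit(w, t, r_in, r_out)) ** 2 for w in weights) / len(weights)
sqnr_db = 10 * math.log10(sum(w * w for w in weights) / len(weights) / mse)
print(f"threshold={t:.3f}  levels=({r_in:.3f}, {r_out:.3f})  SQNR={sqnr_db:.2f} dB")
```

A Lloyd–Max design optimized for the Laplacian density would improve on this centroid heuristic by a dB or two, which is the kind of gap robust quantizer design papers target.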
Efficient and Compact Representations of Deep Neural Networks via Entropy Coding
Matrix operations are nowadays central to many Machine Learning techniques, including in particular Deep Neural Networks (DNNs), where the core of any inference is a sequence of dot-product operations.
Giosue Cataldo Marino +3 more
doaj +1 more source
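A rough estimate of what entropy coding can buy for quantized weights, as in the entry above, is the empirical per-symbol entropy, which an ideal entropy coder approaches under a memoryless model. A minimal sketch, assuming Gaussian weights and a uniform scalar quantizer (illustrative choices, not the paper's scheme):

```python
import math
import random
from collections import Counter

rng = random.Random(42)
weights = [rng.gauss(0.0, 0.02) for _ in range(50_000)]  # typical DNN weight scale

step = 0.01
symbols = [round(w / step) for w in weights]             # uniform scalar quantizer

# Empirical per-symbol entropy = rate of an ideal entropy coder
# under a memoryless source model.
counts = Counter(symbols)
n = len(symbols)
entropy = -sum(c / n * math.log2(c / n) for c in counts.values())

fixed_bits = math.ceil(math.log2(len(counts)))           # fixed-length baseline
print(f"alphabet size: {len(counts)}")
print(f"empirical entropy: {entropy:.2f} bits/weight "
      f"vs {fixed_bits} bits fixed-length vs 32 bits float32")
```

Because quantized weights concentrate heavily around zero, the entropy comes out well below the fixed-length rate, which is the gap an entropy coder exploits.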
Channel Coding and Source Coding With Increased Partial Side Information
Let (S_{1,i}, S_{2,i}) ~ i.i.d. p(s_1, s_2), i = 1, 2, …, be a memoryless, correlated partial side-information sequence. In this work, we study channel coding and source coding problems where the partial side
Avihay Sadeh-Shirazi +2 more
doaj +1 more source

