Results 11 to 20 of about 15,373

L1-Norm Tucker Tensor Decomposition [PDF]

open access: yesIEEE Access, 2019
Tucker decomposition is a standard multi-way generalization of Principal-Component Analysis (PCA), appropriate for processing tensor data. Similar to PCA, Tucker decomposition has been shown to be sensitive to faulty data, due to its L2-norm-based ...
Dimitris G. Chachlakis   +2 more
doaj   +3 more sources
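The standard (L2-norm) Tucker decomposition that this paper makes robust can be computed by truncated higher-order SVD. A minimal NumPy sketch, assuming the classical HOSVD rather than the paper's L1-norm algorithm (function names are mine):

```python
import numpy as np

def mode_dot(X, M, mode):
    """Multiply tensor X by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(X, mode, 0), axes=1), 0, mode)

def hosvd_tucker(X, ranks):
    """Truncated HOSVD: a standard L2-norm Tucker decomposition.

    X     : d-way NumPy array
    ranks : target multilinear rank, one integer per mode
    Returns (core, factors) with X ~= core x_1 U1 x_2 U2 ...
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Mode-n unfolding: put `mode` first, flatten the rest.
        unfolding = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])  # leading left singular vectors
    # Core tensor: project X onto each factor's column space.
    core = X
    for mode, U in enumerate(factors):
        core = mode_dot(core, U.T, mode)
    return core, factors

def tucker_to_tensor(core, factors):
    """Rebuild the (approximate) tensor from core and factors."""
    X = core
    for mode, U in enumerate(factors):
        X = mode_dot(X, U, mode)
    return X
```

The SVD step is exactly where the L2 sensitivity enters: outliers with large magnitude dominate the squared-error objective, which is what motivates the paper's L1-norm reformulation.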

Hermitian Tensor Decompositions [PDF]

open access: yesSIAM Journal on Matrix Analysis and Applications, 2020
Hermitian tensors are generalizations of Hermitian matrices, but they have very different properties. Every complex Hermitian tensor is a sum of complex Hermitian rank-1 tensors. However, this is not true for the real case. We study basic properties for Hermitian tensors such as Hermitian decompositions and Hermitian ranks. For canonical basis tensors,
Nie, Jiawang, Yang, Zi
openaire   +2 more sources

Counting Tensor Rank Decompositions [PDF]

open access: yesUniverse, 2021
Tensor rank decomposition is a useful tool for geometric interpretation of the tensors in the canonical tensor model (CTM) of quantum gravity. In order to understand the stability of this interpretation, it is important to be able to estimate how many tensor rank decompositions can approximate a given tensor.
Dennis Obster, Naoki Sasakura
openaire   +3 more sources

Skew-symmetric tensor decomposition [PDF]

open access: yesCommunications in Contemporary Mathematics, 2019
We introduce the “skew apolarity lemma” and we use it to give algorithms for the skew-symmetric rank and the decompositions of tensors in [Formula: see text] with [Formula: see text] and [Formula: see text]. New algorithms to compute the rank and a minimal decomposition of a tritensor are also presented.
Enrique Esteban Arrondo   +3 more
openaire   +5 more sources

Orthogonal Tensor Decompositions [PDF]

open access: yesSIAM Journal on Matrix Analysis and Applications, 2001
The singular value decomposition of a real \(m\times n\) matrix can be reformulated as an orthogonal decomposition in the tensor product \(\mathbb{R}^m \otimes \mathbb{R}^n\). The present paper is concerned with possible generalizations to multiple tensor products \(\mathbb{R}^{m_1} \otimes\cdots \otimes \mathbb{R}^{m_k}\), a prime consideration being ...
openaire   +2 more sources

Spectral Tensor-Train Decomposition [PDF]

open access: yesSIAM Journal on Scientific Computing, 2016
The accurate approximation of high-dimensional functions is an essential task in uncertainty quantification and many other fields. We propose a new function approximation scheme based on a spectral extension of the tensor-train (TT) decomposition. We first define a functional version of the TT decomposition and analyze its properties. We obtain results
Engsig-Karup, Allan P.   +2 more
openaire   +4 more sources
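The discrete TT decomposition that this paper extends spectrally can be computed by the TT-SVD algorithm: sweep through the modes, reshaping and truncating an SVD at each step. A minimal NumPy sketch (the paper's functional/spectral extension is not reproduced; function names are mine):

```python
import numpy as np

def tt_svd(X, eps=1e-10):
    """TT-SVD: decompose a d-way array into a list of 3-way TT cores.

    Each core G_k has shape (r_{k-1}, n_k, r_k) with r_0 = r_d = 1;
    singular values below `eps` are truncated.
    """
    dims = X.shape
    cores, r_prev = [], 1
    C = X
    for n in dims[:-1]:
        C = C.reshape(r_prev * n, -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > eps)))        # truncation rank
        cores.append(U[:, :r].reshape(r_prev, n, r))
        C = s[:r, None] * Vt[:r]                # carry the remainder forward
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_tensor(cores):
    """Contract the TT cores back into a full tensor."""
    X = cores[0]
    for G in cores[1:]:
        X = np.tensordot(X, G, axes=([X.ndim - 1], [0]))
    return X.reshape([G.shape[1] for G in cores])
```

With `eps` below the smallest singular value the reconstruction is exact; raising `eps` trades accuracy for smaller TT ranks.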

Randomized CP tensor decomposition

open access: yesMachine Learning: Science and Technology, 2020
Abstract The CANDECOMP/PARAFAC (CP) tensor decomposition is a popular dimensionality-reduction method for multiway data. Dimensionality reduction is often sought after since many high-dimensional tensors have low intrinsic rank relative to the dimension of the ambient measurement space.
N Benjamin Erichson   +3 more
openaire   +2 more sources
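The baseline that randomized CP methods accelerate is plain CP alternating least squares (ALS). A minimal NumPy sketch of non-randomized CP-ALS, assuming the textbook formulation rather than this paper's randomized compression step (function names are mine):

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: the given mode becomes the rows."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def cp_als(X, rank, n_iter=100, seed=0):
    """CP (CANDECOMP/PARAFAC) via alternating least squares.

    Returns factor matrices A_k of shape (n_k, rank) such that
    X ~= sum_r A_0[:, r] (outer) A_1[:, r] (outer) ...
    """
    rng = np.random.default_rng(seed)
    d = X.ndim
    factors = [rng.standard_normal((n, rank)) for n in X.shape]
    for _ in range(n_iter):
        for mode in range(d):
            others = [factors[m] for m in range(d) if m != mode]
            # Khatri-Rao product of the other factors (column-wise
            # Kronecker), ordered to match the C-order unfolding.
            kr = others[0]
            for A in others[1:]:
                kr = (kr[:, None, :] * A[None, :, :]).reshape(-1, rank)
            # Gram matrix: Hadamard product of the other factors' Grams.
            gram = np.ones((rank, rank))
            for A in others:
                gram *= A.T @ A
            factors[mode] = unfold(X, mode) @ kr @ np.linalg.pinv(gram)
    return factors

def cp_to_tensor(factors):
    """Rebuild the tensor as a sum of rank-1 outer products."""
    X = 0
    for r in range(factors[0].shape[1]):
        comp = factors[0][:, r]
        for A in factors[1:]:
            comp = np.multiply.outer(comp, A[:, r])
        X = X + comp
    return X
```

The randomized variants described in the paper first project the tensor onto a lower-dimensional subspace and run ALS there, exploiting exactly the low intrinsic rank the abstract mentions.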

Tensor-CUR Decompositions for Tensor-Based Data [PDF]

open access: yesSIAM Journal on Matrix Analysis and Applications, 2006
Motivated by numerous applications in which the data may be modeled by a variable subscripted by three or more indices, we develop a tensor-based extension of the matrix CUR decomposition. The tensor-CUR decomposition is most relevant as a data analysis tool when the data consist of one mode that is qualitatively different from the others. In this case,
Michael W. Mahoney   +2 more
openaire   +1 more source
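The matrix CUR decomposition that the tensor-CUR method extends expresses A approximately as C @ U @ R, where C and R are actual columns and rows of A. A minimal NumPy sketch, assuming norm-squared importance sampling and a pseudoinverse core (the exact sampling scheme and names here are illustrative, not the authors' algorithm):

```python
import numpy as np

def cur(A, c, r, seed=0):
    """Simple matrix CUR: A ~= C @ U @ R.

    c, r : number of columns / rows to sample.
    Columns and rows are drawn (without replacement) with probability
    proportional to their squared Euclidean norm.
    """
    rng = np.random.default_rng(seed)
    col_p = np.sum(A**2, axis=0)
    row_p = np.sum(A**2, axis=1)
    cols = rng.choice(A.shape[1], size=c, replace=False, p=col_p / col_p.sum())
    rows = rng.choice(A.shape[0], size=r, replace=False, p=row_p / row_p.sum())
    C = A[:, cols]                  # actual columns of A
    R = A[rows, :]                  # actual rows of A
    # Core matrix chosen so C @ U @ R projects A onto span(C) and span(R).
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R
```

Because C and R are genuine slices of the data, the factors stay interpretable in the original coordinates; the tensor-CUR decomposition applies the same idea along the one mode that is qualitatively different from the others.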

Symmetric tensor decomposition

open access: yesLinear Algebra and its Applications, 2010
Publication in the conference proceedings of EUSIPCO, Glasgow, Scotland ...
Brachat, Jérôme   +3 more
openaire   +5 more sources

Legendre decomposition for tensors

open access: yesJournal of Statistical Mechanics: Theory and Experiment, 2019
Abstract We present a novel nonnegative tensor decomposition method, called Legendre decomposition, which factorizes an input tensor into a multiplicative combination of parameters. Thanks to the well-developed theory of information geometry, the reconstructed tensor is unique and always minimizes the KL divergence from an input tensor ...
Sugiyama, Mahito   +2 more
openaire   +3 more sources
