Results 1 to 10 of about 1,665,186

Spectral tensor-train decomposition [PDF]

open access: yes · SIAM Journal on Scientific Computing, 2015
The accurate approximation of high-dimensional functions is an essential task in uncertainty quantification and many other fields. We propose a new function approximation scheme based on a spectral extension of the tensor-train (TT) decomposition.
Bigoni, Daniele   +2 more
core   +6 more sources
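The entry above builds on the tensor-train (TT) format. A minimal sketch of the classical TT-SVD construction (sequential truncated SVDs of unfoldings) is shown below; note this is the plain TT decomposition, not the spectral extension the paper proposes, and the function names are illustrative.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a d-way array into TT cores via sequential SVDs (plain TT-SVD sketch)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * shape[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = max(1, int(np.sum(s > eps)))  # drop negligible singular values
        u, s, vt = u[:, :r_new], s[:r_new], vt[:r_new]
        cores.append(u.reshape(rank, shape[k], r_new))
        rank = r_new
        mat = (np.diag(s) @ vt).reshape(rank * shape[k + 1], -1)
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))
```

With `eps` small enough the reconstruction is exact; larger `eps` trades accuracy for lower TT ranks.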

Exploring the feasibility of tensor decomposition for analysis of fNIRS signals: a comparative study with grand averaging method [PDF]

open access: yes · Frontiers in Neuroscience, 2023
The analysis of functional near-infrared spectroscopy (fNIRS) signals has not kept pace with the increased use of fNIRS in the behavioral and brain sciences.
Jasmine Y. Chan   +4 more
doaj   +2 more sources

Time-aware tensor decomposition for sparse tensors

open access: yes · 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA), 2021
Dawon Ahn, Jun-Gi Jang, U. Kang
semanticscholar   +3 more sources

Random Tensor Theory for Tensor Decomposition

open access: yes · Proceedings of the AAAI Conference on Artificial Intelligence, 2022
We propose a new framework for tensor decomposition based on trace invariants, which are particular cases of tensor networks. In general, tensor networks are diagrams/graphs that specify a way to "multiply" a collection of tensors together to produce ...
M. Ouerfelli   +2 more
semanticscholar   +3 more sources

Smoothed Analysis of Tensor Decompositions [PDF]

open access: yes · Proceedings of the forty-sixth annual ACM symposium on Theory of computing, 2014
Low-rank tensor decompositions are a powerful tool for learning generative models, and uniqueness results give them a significant advantage over matrix decomposition methods. However, tensors pose significant algorithmic challenges, and tensor analogs of ...
Anandkumar A.   +12 more
core   +4 more sources

L1-Norm Tucker Tensor Decomposition [PDF]

open access: yes · IEEE Access, 2019
Tucker decomposition is a standard multi-way generalization of Principal Component Analysis (PCA), appropriate for processing tensor data. Similar to PCA, Tucker decomposition has been shown to be sensitive to faulty data, due to its L2-norm-based ...
Dimitris G. Chachlakis   +2 more
doaj   +3 more sources
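For context on the entry above: the standard (L2-based) Tucker decomposition it contrasts with can be computed by the higher-order SVD, i.e. one SVD per mode-unfolding. The sketch below shows that baseline HOSVD, not the L1-norm variant the paper studies; names are illustrative.

```python
import numpy as np

def hosvd(tensor, ranks):
    """Higher-order SVD: classical L2-based Tucker sketch (core + one factor per mode)."""
    factors = []
    for mode, r in enumerate(ranks):
        # mode-k unfolding: bring mode k to the front, flatten the rest
        unfolding = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for u in factors:
        # contract the current leading mode with its factor; modes rotate cyclically
        core = np.tensordot(core, u.conj(), axes=([0], [0]))
    return core, factors

def tucker_reconstruct(core, factors):
    """Apply the factor matrices back to the core tensor."""
    out = core
    for u in factors:
        out = np.tensordot(out, u.T, axes=([0], [0]))
    return out
```

With full multilinear ranks the reconstruction is exact; truncating `ranks` gives the usual low-rank Tucker approximation.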

Tensor Decomposition for Model Reduction in Neural Networks: A Review [Feature] [PDF]

open access: yes · IEEE Circuits and Systems Magazine, 2023
Modern neural networks have revolutionized the fields of computer vision (CV) and natural language processing (NLP). They are widely used for solving complex CV and NLP tasks such as image classification, image generation, and machine translation ...
Xingyi Liu, Keshab K. Parhi
semanticscholar   +1 more source

Hermitian Tensor Decompositions [PDF]

open access: yes · SIAM Journal on Matrix Analysis and Applications, 2020
Hermitian tensors are generalizations of Hermitian matrices, but they have very different properties. Every complex Hermitian tensor is a sum of complex Hermitian rank-1 tensors. However, this is not true for the real case. We study basic properties of Hermitian tensors such as Hermitian decompositions and Hermitian ranks. For canonical basis tensors, ...
Nie, Jiawang, Yang, Zi
openaire   +2 more sources

Towards Efficient Tensor Decomposition-Based DNN Model Compression with Optimization Framework [PDF]

open access: yes · Computer Vision and Pattern Recognition, 2021
Advanced tensor decomposition, such as tensor train (TT) and tensor ring (TR), has been widely studied for deep neural network (DNN) model compression, especially for recurrent neural networks (RNNs).
Miao Yin, Yang Sui, Siyu Liao, Bo Yuan
semanticscholar   +1 more source

Counting Tensor Rank Decompositions [PDF]

open access: yes · Universe, 2021
Tensor rank decomposition is a useful tool for geometric interpretation of the tensors in the canonical tensor model (CTM) of quantum gravity. In order to understand the stability of this interpretation, it is important to be able to estimate how many tensor rank decompositions can approximate a given tensor.
Dennis Obster, Naoki Sasakura
openaire   +3 more sources
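The tensor rank decomposition discussed above (also called CP decomposition) writes a tensor as a sum of rank-1 terms, i.e. outer products of vectors. The snippet below just constructs such a decomposition explicitly to illustrate the definition; it is unrelated to the paper's counting procedure in the canonical tensor model, and all names are illustrative.

```python
import numpy as np

# A rank-R tensor is a sum of R rank-1 terms a_r (x) b_r (x) c_r.
# Build one explicitly from random factor matrices.
rng = np.random.default_rng(1)
R, dims = 2, (3, 4, 5)
A = rng.standard_normal((dims[0], R))
B = rng.standard_normal((dims[1], R))
C = rng.standard_normal((dims[2], R))

# einsum sums the R outer products in one shot: T_ijk = sum_r A_ir B_jr C_kr
T = np.einsum('ir,jr,kr->ijk', A, B, C)
```

Counting how many distinct decompositions of this form approximate a given tensor is exactly the question the paper addresses.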
