
Contractions: Nijenhuis and Saletan tensors for general algebraic structures [PDF]

open access: green; Journal of Physics A: Mathematical and General, 2001
Generalizations in many directions of the contraction procedure for Lie algebras introduced by E. J. Saletan are proposed. Products of arbitrary nature, not necessarily Lie brackets, are considered on sections of finite-dimensional vector bundles.
José F. Cariñena   +2 more
openalex   +5 more sources
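
For orientation (this is not taken from the paper), the classical Saletan-type contraction that the abstract generalizes starts from a bracket and a one-parameter family of invertible linear maps U_ε:

```latex
[x,y]_\varepsilon = U_\varepsilon^{-1}\bigl[\,U_\varepsilon x,\; U_\varepsilon y\,\bigr],
\qquad
[x,y]_0 = \lim_{\varepsilon \to 0} [x,y]_\varepsilon .
```

Whenever the limit exists it inherits the identities (e.g. Jacobi) satisfied at each ε; the paper's generalization replaces the Lie bracket by an arbitrary product on sections of a vector bundle.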

Differentiable Programming Tensor Networks

open access: yes; Physical Review X, 2019
Differentiable programming is a fresh programming paradigm which composes parameterized algorithmic components and optimizes them using gradient search. The concept emerges from deep learning but is not limited to training neural networks. We present the ...
Jin-Guo Liu, Tao Xiang
exaly   +2 more sources
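
A minimal Python sketch of the idea described above, assuming the JAX library is available (an illustration, not the authors' code): contract a tiny tensor network to a scalar and differentiate through the contraction.

```python
import jax
import jax.numpy as jnp

def log_contraction(t):
    """Contract a periodic chain of four copies of a 2x2 transfer matrix
    and return the log of the resulting scalar."""
    m = jnp.einsum('ij,jk->ik', t, t)   # two sites
    m = jnp.einsum('ij,jk->ik', m, m)   # four sites
    return jnp.log(jnp.trace(m))        # trace closes the periodic chain

t0 = jnp.array([[2.0, 0.5],
                [0.5, 1.0]])
value, grad = jax.value_and_grad(log_contraction)(t0)
print(value, grad)  # scalar and its gradient with respect to the tensor entries
```

The gradient of the contracted scalar with respect to every tensor entry comes for free from automatic differentiation, which is the mechanism such differentiable tensor-network programs build on.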

Polyhedral Specification and Code Generation of Sparse Tensor Contraction with Co-iteration [PDF]

open access: yes; ACM Transactions on Architecture and Code Optimization (TACO), 2022
This article presents a code generator for sparse tensor contraction computations. It leverages a mathematical representation of loop nest computations in the sparse polyhedral framework (SPF), which extends the polyhedral model to support non-affine ...
Tuowen Zhao   +4 more
semanticscholar   +1 more source
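
The "co-iteration" in the title refers to iterating only over coordinates where all sparse operands are nonzero. A hand-written Python sketch of that pattern (illustrative only; the article's contribution is generating such loops automatically from an SPF specification):

```python
def sparse_dot(a, b):
    """Co-iterate two sorted (index, value) lists representing sparse
    vectors, accumulating products only where both are nonzero."""
    i = j = 0
    acc = 0.0
    while i < len(a) and j < len(b):
        ia, va = a[i]
        ib, vb = b[j]
        if ia == ib:
            acc += va * vb
            i += 1
            j += 1
        elif ia < ib:
            i += 1
        else:
            j += 1
    return acc

x = [(0, 2.0), (3, 1.5), (7, -1.0)]
y = [(3, 4.0), (5, 2.0), (7, 3.0)]
print(sparse_dot(x, y))  # 1.5*4.0 + (-1.0)*3.0 = 3.0
```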

Open Quantum System Dynamics from Infinite Tensor Network Contraction. [PDF]

open access: yes; Physical Review Letters, 2023
Approaching the long-time dynamics of non-Markovian open quantum systems presents a challenging task if the bath is strongly coupled. Recent proposals address this problem through a representation of the so-called process tensor in terms of a tensor ...
Valentin Link, Hong-Hao Tu, W. Strunz
semanticscholar   +1 more source
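
As a toy analogue (not the process-tensor algorithm of the paper): contracting an infinite, translation-invariant chain reduces to a dominant-eigenvalue problem for the transfer matrix, which a simple power iteration in NumPy illustrates.

```python
import numpy as np

def per_site_value(t, iters=500):
    """Power iteration: the per-site contribution of an infinite chain of
    identical transfer matrices converges to the dominant eigenvalue."""
    v = np.ones(t.shape[0])
    lam = 1.0
    for _ in range(iters):
        w = t @ v
        lam = np.linalg.norm(w)
        v = w / lam
    return lam

t = np.array([[1.0, 0.3],
              [0.3, 0.5]])
print(per_site_value(t))            # approx. the dominant eigenvalue
print(max(np.linalg.eigvalsh(t)))   # cross-check by exact diagonalization
```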

Stack Operation of Tensor Networks

open access: yes; Frontiers in Physics, 2022
The tensor network, as a factorization of tensors, aims to perform the operations that are common for ordinary tensors, such as addition, contraction, and stacking.
Tianning Zhang   +4 more
doaj   +1 more source
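
For the "addition" mentioned in the snippet, one standard construction (shown here as a NumPy sketch, not necessarily the paper's stack operation) is to direct-sum the bond indices of two matrix product tensors:

```python
import numpy as np

def mps_add_site(a, b):
    """Direct-sum two bulk MPS site tensors with shape (left, phys, right);
    the result represents the corresponding site of the summed state."""
    dl, dr = a.shape[0] + b.shape[0], a.shape[2] + b.shape[2]
    c = np.zeros((dl, a.shape[1], dr))
    c[:a.shape[0], :, :a.shape[2]] = a
    c[a.shape[0]:, :, a.shape[2]:] = b
    return c

a = np.random.rand(2, 3, 2)
b = np.random.rand(4, 3, 4)
print(mps_add_site(a, b).shape)  # (6, 3, 6): bond dimensions add
```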

A Practical Guide to the Numerical Implementation of Tensor Networks I: Contractions, Decompositions, and Gauge Freedom

open access: yes; Frontiers in Applied Mathematics and Statistics, 2022
We present an overview of the key ideas and skills necessary to begin implementing tensor network methods numerically, which is intended to facilitate the practical application of tensor network methods for researchers who are already versed in their ...
Glen Evenbly
doaj   +1 more source
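
The two basic skills the guide covers can be illustrated in a few lines of NumPy (a generic sketch, not code from the article): contract two tensors over a shared bond, then split the result back apart with a truncated SVD.

```python
import numpy as np

a = np.random.rand(4, 3, 5)   # (left, phys, bond)
b = np.random.rand(5, 3, 6)   # (bond, phys, right)

# Contraction: sum over the shared bond index.
theta = np.einsum('ipb,bqr->ipqr', a, b)

# Decomposition: group indices into a matrix and truncate its SVD.
m = theta.reshape(4 * 3, 3 * 6)
u, s, vh = np.linalg.svd(m, full_matrices=False)
chi = 5                                             # kept bond dimension
a_new = u[:, :chi].reshape(4, 3, chi)
b_new = (np.diag(s[:chi]) @ vh[:chi]).reshape(chi, 3, 6)
```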

Athena: high-performance sparse tensor contraction sequence on heterogeneous memory

open access: yes; International Conference on Supercomputing, 2021
Sparse tensor contraction sequences have been widely employed in many fields, such as chemistry and physics. However, efficiently implementing such a sequence faces multiple challenges, such as redundant computations and memory operations, massive memory ...
Jiawen Liu   +3 more
semanticscholar   +1 more source
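
One of the redundancy issues the snippet alludes to is recomputing intermediate tensors that appear several times in a contraction sequence. A toy Python sketch of memoizing such intermediates (illustrative only; Athena's actual system also manages heterogeneous memory):

```python
import numpy as np

def contract_sequence(tensors, steps):
    """Evaluate named pairwise einsum contractions in order, caching each
    intermediate so that a repeated step is computed only once."""
    results, cache = dict(tensors), {}
    for name, (spec, lhs, rhs) in steps:
        key = (spec, lhs, rhs)
        if key not in cache:
            cache[key] = np.einsum(spec, results[lhs], results[rhs])
        results[name] = cache[key]
    return results

tensors = {'A': np.random.rand(8, 8), 'B': np.random.rand(8, 8),
           'C': np.random.rand(8, 8)}
steps = [('AB',  ('ij,jk->ik', 'A', 'B')),
         ('ABC', ('ik,kl->il', 'AB', 'C'))]
print(contract_sequence(tensors, steps)['ABC'].shape)  # (8, 8)
```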

Tensor Network Contractions for #SAT [PDF]

open access: yes; Journal of Statistical Physics, 2015
The computational cost of counting the number of solutions satisfying a Boolean formula, which is a problem instance of #SAT, has proven subtle to quantify. Even when finding individual satisfying solutions is computationally easy (e.g. 2-SAT, which is in P), determining the number of solutions is #P-hard.
Jacob Turner   +3 more
openaire   +4 more sources
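
A worked micro-example of the reduction (not from the paper): each clause becomes a 0/1 tensor indexed by its variables, and contracting all clause tensors over the shared variable indices counts the satisfying assignments.

```python
import numpy as np

# Count the models of (x OR y) AND (NOT x OR y).
or_xy     = np.array([[0, 1], [1, 1]])   # clause tensor, entry [x, y]
or_notx_y = np.array([[1, 1], [0, 1]])   # clause tensor for (NOT x OR y)

count = np.einsum('xy,xy->', or_xy, or_notx_y)
print(count)  # 2: satisfied by (x=0, y=1) and (x=1, y=1)
```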

Simple heuristics for efficient parallel tensor contraction and quantum circuit simulation [PDF]

open access: yes; arXiv.org, 2020
Tensor networks are the main building blocks in a wide variety of computational sciences, ranging from many-body theory and quantum computing to probability and machine learning. Here we propose a parallel algorithm for the contraction of tensor networks ...
R. Schutski   +3 more
semanticscholar   +1 more source
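
To see what a contraction-order heuristic does in miniature, NumPy's built-in greedy path optimizer can serve as a stand-in (the paper's heuristics are different and target parallel execution):

```python
import numpy as np

a = np.random.rand(2, 64)
b = np.random.rand(64, 64)
c = np.random.rand(64, 2)

# Choose a pairwise contraction order with a greedy heuristic, then reuse it.
path, info = np.einsum_path('ij,jk,kl->il', a, b, c, optimize='greedy')
print(path)  # e.g. ['einsum_path', (0, 1), (0, 1)] -- the pairwise order
result = np.einsum('ij,jk,kl->il', a, b, c, optimize=path)
print(result.shape)  # (2, 2)
```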
