To achieve high performance in solving the ion flow field problem, an approach based on the tensor-structured finite element method (FEM) is proposed to accelerate the Newton iteration. (A generic toy sketch of a Newton iteration follows this entry.)
Qiwen Cheng, Jun Zou
doaj +1 more source
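A minimal, generic sketch of the kind of Newton iteration being accelerated above, assuming a dense Jacobian and a hypothetical toy residual; nothing here reflects the paper's tensor-structured FEM formulation.

```python
# Minimal, generic Newton iteration for a discretized nonlinear system
# F(u) = 0. Illustrative only: the residual below is a hypothetical toy
# problem, not the ion flow field equations.
import numpy as np

def newton_solve(residual, jacobian, u0, tol=1e-10, max_iter=50):
    """Solve residual(u) = 0 with a dense-Jacobian Newton iteration."""
    u = u0.copy()
    for _ in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            break
        u += np.linalg.solve(jacobian(u), -r)   # one linear solve per step
    return u

# Toy nonlinear system: F(u) = u^3 (elementwise) + A u - b.
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
residual = lambda u: u**3 + A @ u - b
jacobian = lambda u: np.diag(3 * u**2) + A
u = newton_solve(residual, jacobian, np.zeros(n))
print(np.linalg.norm(residual(u)))   # should be ~0 once converged
```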
On the Optimal Linear Contraction Order of Tree Tensor Networks, and Beyond [PDF]
The contraction cost of a tensor network depends on the contraction order. However, the optimal contraction ordering problem is known to be NP-hard. We show that the linear contraction ordering problem for tree tensor networks admits a polynomial-time ... (A toy illustration of order-dependent contraction cost follows this entry.)
Mihail Stoian +2 more
semanticscholar +1 more source
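As a minimal illustration of the first claim above, the sketch below contracts the same three-tensor chain in two orders and counts scalar multiplications; the sizes are hypothetical and chosen only to make the gap obvious.

```python
# Same three-tensor chain, two contraction orders, very different costs.
# A is (n x 2), B is (2 x n), C is (n x 2); sizes are hypothetical.
import numpy as np

n = 2000
A, B, C = np.random.rand(n, 2), np.random.rand(2, n), np.random.rand(n, 2)

def matmul_cost(p, q, r):
    """Scalar multiplications needed for a (p x q) @ (q x r) product."""
    return p * q * r

cost_left  = matmul_cost(n, 2, n) + matmul_cost(n, n, 2)   # (A @ B) @ C
cost_right = matmul_cost(2, n, 2) + matmul_cost(n, 2, 2)   # A @ (B @ C)
print(cost_left, cost_right)                   # 16_000_000 vs 16_000
assert np.allclose((A @ B) @ C, A @ (B @ C))   # identical result either way
```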
Fast counting with tensor networks
We introduce tensor network contraction algorithms for counting satisfying assignments of constraint satisfaction problems (#CSPs). We represent each arbitrary #CSP formula as a tensor network, whose full contraction yields the number of satisfying ... (A toy model-counting example follows this entry.)
Stefanos Kourtis, Claudio Chamon, Eduardo R. Mucciolo, Andrei E. Ruckenstein
doaj +1 more source
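A minimal sketch of the idea above, assuming a tiny hand-picked CNF formula: each clause becomes a 0/1 tensor over its variables, and contracting the whole network (summing the product over all variable indices) returns the model count.

```python
# Counting satisfying assignments of (x1 OR x2) AND (NOT x2 OR x3) by
# contracting one 0/1 tensor per clause over the shared variable indices.
# The formula is a toy choice; any small #CSP instance works the same way.
import numpy as np
from itertools import product

T1 = np.ones((2, 2)); T1[0, 0] = 0   # x1 OR x2 is violated only at (0, 0)
T2 = np.ones((2, 2)); T2[1, 0] = 0   # NOT x2 OR x3 is violated only at (1, 0)

# Index b (= x2) is shared between the clauses; summing the product over
# a, b, c performs the full contraction and yields the model count.
count = np.einsum('ab,bc->', T1, T2)
print(int(count))                    # 4

# Brute-force check over all 2^3 assignments.
print(sum((x1 or x2) and ((not x2) or x3)
          for x1, x2, x3 in product([0, 1], repeat=3)))   # 4
```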
Effect of myofibre architecture on ventricular pump function by using a neonatal porcine heart model: from DT-MRI to rule-based methods [PDF]
Myofibre architecture is one of the essential components when constructing personalized cardiac models. In this study, we develop a neonatal porcine bi-ventricle model with three different myofibre architectures for the left ventricle (LV).
Debao Guan +3 more
doaj +1 more source
Benchmarking treewidth as a practical component of tensor network simulations.
Tensor networks are powerful factorization techniques which reduce resource requirements for numerically simulating principal quantum many-body systems and algorithms. (A small treewidth estimate is sketched after this entry.)
Eugene F Dumitrescu +5 more
doaj +1 more source
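The connection being benchmarked above is that contraction cost is governed by the treewidth of the tensor network's line graph. A minimal sketch, assuming a hypothetical 3x3 grid network, using networkx's min-degree heuristic, which returns an upper bound rather than the exact width.

```python
# Treewidth as a proxy for contraction hardness: contraction cost scales
# exponentially with the treewidth of the network's line graph. The 3x3 grid
# network is a hypothetical example; treewidth_min_degree is a heuristic
# that returns an upper bound, not the exact width.
import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

# Network graph: one node per tensor, one edge per shared (contracted) index.
network = nx.grid_2d_graph(3, 3)

# The line graph's vertices are the network's indices; its treewidth governs
# the exponent of the contraction cost.
line_graph = nx.line_graph(network)
width_bound, _decomposition = treewidth_min_degree(line_graph)
print(width_bound)   # small for a 3x3 grid, grows with lattice size
```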
Distributed-memory multi-GPU block-sparse tensor contraction for electronic structure
Many domains of scientific simulation (chemistry, condensed matter physics, data science) increasingly eschew dense tensors for block-sparse tensors, sometimes with additional structure (recursive hierarchy, rank sparsity, etc.). (A single-node toy sketch of block-sparse contraction follows this entry.)
T. Hérault +6 more
semanticscholar +1 more source
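To make the block-sparse idea concrete, here is a single-node toy sketch in which nonzero tiles live in a dictionary keyed by block coordinates and only compatible tile pairs are multiplied. The helper name, block size, and tile placement are all hypothetical; none of this reflects the paper's distributed multi-GPU design.

```python
# Single-node toy of block-sparse contraction: work tracks the stored blocks
# rather than the dense extent of the operands.
import numpy as np

BS = 4   # edge length of one square block (hypothetical)

def block_sparse_contract(A_blocks, B_blocks):
    """C[i, j] += A[i, k] @ B[k, j], looping over stored blocks only."""
    C_blocks = {}
    for (i, k), a in A_blocks.items():
        for (kk, j), b in B_blocks.items():
            if k == kk:
                C_blocks[(i, j)] = C_blocks.get((i, j), np.zeros((BS, BS))) + a @ b
    return C_blocks

rng = np.random.default_rng(0)
A_blocks = {(0, 0): rng.random((BS, BS)), (1, 2): rng.random((BS, BS))}
B_blocks = {(0, 1): rng.random((BS, BS)), (2, 3): rng.random((BS, BS))}
print(sorted(block_sparse_contract(A_blocks, B_blocks)))   # [(0, 1), (1, 3)]
```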
High-Performance Tensor Contraction without Transposition [PDF]
Tensor computations---in particular tensor contraction (TC)---are important kernels in many scientific computing applications. Due to the fundamental similarity of TC to matrix multiplication and to the availability of optimized implementations such as ... (A toy comparison of the transpose-based route with a direct contraction follows this entry.)
D. Matthews
semanticscholar +1 more source
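For context on what "without transposition" avoids, the sketch below spells out the conventional transpose-transpose-GEMM-transpose (TTGT) route for a small contraction and checks it against a direct einsum call; all dimensions are arbitrary toy values.

```python
# The conventional TTGT (transpose-transpose-GEMM-transpose) route that a
# transposition-free kernel avoids, for C[a,b,i,j] = sum_k A[a,k,i] B[k,b,j].
import numpy as np

da, db, di, dj, dk = 3, 4, 5, 6, 7
A = np.random.rand(da, dk, di)
B = np.random.rand(dk, db, dj)

# TTGT: permute so k is the shared inner dimension, flatten to matrices,
# run one GEMM, then restore the requested output layout.
A_mat = A.transpose(0, 2, 1).reshape(da * di, dk)                  # (a*i, k)
B_mat = B.reshape(dk, db * dj)                                     # (k, b*j)
C = (A_mat @ B_mat).reshape(da, di, db, dj).transpose(0, 2, 1, 3)  # (a, b, i, j)

# Direct contraction for reference.
print(np.allclose(C, np.einsum('aki,kbj->abij', A, B)))   # True
```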
Grassmann higher-order tensor renormalization group approach for two-dimensional strong-coupling QCD
We present a tensor-network approach for two-dimensional strong-coupling QCD with staggered quarks at nonzero chemical potential. After integrating out the gauge fields at infinite coupling, the partition function can be written as a full contraction of ... (A much simpler non-QCD stand-in for this construction follows this entry.)
Jacques Bloch, Robert Lohmayer
doaj +1 more source
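As a much simpler stand-in for "partition function as a full contraction of local tensors", the sketch below treats a periodic 1D Ising chain: each bond contributes a 2x2 Boltzmann-weight tensor, and contracting the closed ring reproduces the brute-force partition sum. It is an illustration only, with none of the Grassmann-valued structure of the strong-coupling QCD tensors.

```python
# Toy example: the partition function of a periodic 1D Ising chain equals the
# full contraction of a ring of local bond tensors.
import numpy as np
from itertools import product

beta, J, N = 0.7, 1.0, 6
spins = np.array([+1, -1])

# Local bond tensor T[s, s'] = exp(beta * J * s * s').
T = np.exp(beta * J * np.outer(spins, spins))

# Full contraction of the ring = trace of the N-fold matrix product.
Z_tensor = np.trace(np.linalg.matrix_power(T, N))

# Brute-force sum over all 2^N spin configurations.
Z_brute = sum(np.exp(beta * J * sum(s[n] * s[(n + 1) % N] for n in range(N)))
              for s in product(spins, repeat=N))
print(np.isclose(Z_tensor, Z_brute))   # True
```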
Tensor Contraction Layers for Parsimonious Deep Nets [PDF]
Tensors offer a natural representation for many kinds of data frequently encountered in machine learning. Images, for example, are naturally represented as third order tensors, where the modes correspond to height, width, and channels. (A toy sketch of such a contraction layer follows this entry.)
Jean Kossaifi +4 more
semanticscholar +1 more source
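A minimal sketch of a tensor-contraction-layer-style operation on such activations: each non-batch mode is contracted with a small factor matrix, so a compact core rather than a flattened vector feeds the next layer. Shapes and factor sizes are arbitrary toy choices, and in the actual layer the factors are learned parameters.

```python
# Toy tensor-contraction-layer-style operation on (batch, H, W, C) activations.
import numpy as np

batch, H, W, C = 8, 32, 32, 16
h, w, c = 6, 6, 4                     # reduced mode sizes (hypothetical)

X = np.random.rand(batch, H, W, C)    # third-order activations plus batch mode
U_h = np.random.rand(h, H)            # factor for the height mode
U_w = np.random.rand(w, W)            # factor for the width mode
U_c = np.random.rand(c, C)            # factor for the channel mode

# Y[b, p, q, r] = sum_{i,j,k} X[b, i, j, k] U_h[p, i] U_w[q, j] U_c[r, k]
Y = np.einsum('bijk,pi,qj,rk->bpqr', X, U_h, U_w, U_c, optimize=True)
print(X.size, '->', Y.size)           # 131072 -> 1152 entries
```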
Efficient computation of the second-Born self-energy using tensor-contraction operations. [PDF]
In the nonequilibrium Green's function approach, the approximation of the correlation self-energy at the second-Born level is of particular interest, since it allows for a maximal speed-up in computational scaling when used together with the generalized ... (A generic loops-versus-contraction sketch follows this entry.)
R. Tuovinen, F. Covito, Michael A Sentef
semanticscholar +1 more source
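The computational point above can be illustrated generically: a high-order sum over orbital indices is evaluated as a single tensor-contraction call instead of nested loops. The index pattern below is illustrative only, not the paper's exact second-Born expression, and the interaction and propagator arrays are random toy data.

```python
# Generic illustration: a six-index sum with two interaction tensors and three
# propagator factors, done as one optimized contraction versus nested loops.
# The index pattern and arrays are illustrative, NOT the second-Born formula.
import numpy as np
from itertools import product

norb = 4                                   # toy number of orbitals
rng = np.random.default_rng(1)
v = rng.random((norb,) * 4)                # two-body interaction tensor (toy)
GA, GB, GC = (rng.random((norb, norb)) for _ in range(3))   # propagators (toy)

# One contraction call, internally mapped to optimized pairwise contractions.
S_fast = np.einsum('iklm,npqj,ln,mp,qk->ij', v, v, GA, GB, GC, optimize=True)

# The same sum as explicit loops, orders of magnitude slower in pure Python.
S_slow = np.zeros((norb, norb))
for i, j, k, l, m, n, p, q in product(range(norb), repeat=8):
    S_slow[i, j] += v[i, k, l, m] * v[n, p, q, j] * GA[l, n] * GB[m, p] * GC[q, k]
print(np.allclose(S_fast, S_slow))         # True
```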

