Results 1 to 10 of about 21,831,698
Fiber Clustering Acceleration With a Modified Kmeans++ Algorithm Using Data Parallelism [PDF]
Fiber clustering methods are typically used in brain research to study the organization of white matter bundles from large diffusion MRI tractography datasets.
Isaac Goicovich +8 more
doaj +3 more sources
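The entry above pairs a modified k-means++ with data parallelism for fiber clustering. Below is a minimal sketch of why the k-means++ seeding step is data-parallel, assuming each fiber is reduced to a fixed-length feature vector (an assumption for illustration; the paper's fiber representation and GPU implementation will differ):

```python
import numpy as np

def kmeanspp_seeds(points, k, rng):
    # first seed: uniform random choice
    centers = [points[rng.integers(len(points))]]
    for _ in range(k - 1):
        # data-parallel step: each point's squared distance to its nearest
        # chosen center is independent of every other point's
        dists = ((points[:, None, :] - np.asarray(centers)[None, :, :]) ** 2).sum(axis=2)
        d2 = dists.min(axis=1)
        # sample the next seed proportionally to D^2 (the k-means++ rule)
        centers.append(points[rng.choice(len(points), p=d2 / d2.sum())])
    return np.asarray(centers)

rng = np.random.default_rng(0)
fibers = rng.normal(size=(10_000, 15))        # 10k "fibers" as 15-D vectors (assumed)
print(kmeanspp_seeds(fibers, 8, rng).shape)   # (8, 15)
```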
Relating data-parallelism and (and-) parallelism in logic programs [PDF]
Much work has been done in the areas of and-parallelism and data parallelism in logic programs. Such work has proceeded, to a certain extent, in an independent fashion. Both types of parallelism offer advantages and disadvantages.
Ali +62 more
core +5 more sources
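As a rough analogy for the two forms of parallelism contrasted above (in Python rather than Prolog): data parallelism applies one operation across many data items, while and-parallelism runs the distinct goals of a conjunction concurrently. A hedged sketch mapping both onto a process pool:

```python
from multiprocessing import Pool

def square(x):          # one operation applied to many data items
    return x * x

def goal_a():           # two independent "goals" of a conjunction
    return sum(range(1_000_000))

def goal_b():
    return max(range(1_000_000))

if __name__ == "__main__":
    with Pool(4) as pool:
        # data parallelism: the same function over a partitioned dataset
        squares = pool.map(square, range(10))
        # and-parallelism analogue: distinct goals executed concurrently
        ra = pool.apply_async(goal_a)
        rb = pool.apply_async(goal_b)
        print(squares, ra.get(), rb.get())
```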
Algebraic data parallelism implementation in HPF
Abstract is not available.
Valdona Judickaitė +2 more
doaj +4 more sources
The Parallelism Motifs of Genomic Data Analysis [PDF]
Genomic data sets are growing dramatically as the cost of sequencing continues to decline and small sequencing devices become available. Enormous community databases store and share this data with the research community, but some of these genomic data ...
Awan, Muaaz +13 more
core +4 more sources
Parallelism Strategies for Big Data Delayed Transfer Entropy Evaluation [PDF]
Generated and collected data have been rising with the popularization of technologies such as the Internet of Things, social media, and smartphones, leading to the coining of the term "big data". One class of information hidden in big data is causality.
Jonas R. Dourado +2 more
doaj +2 more sources
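Delayed transfer entropy, the quantity evaluated above, measures how much a past value of X, beyond Y's own history, predicts Y. A hedged sketch of a binary-alphabet estimator follows, with the delay sweep parallelized across processes; this is illustrative only, not the paper's estimator or its specific parallelism strategies:

```python
import numpy as np
from itertools import product
from multiprocessing import Pool

def transfer_entropy(x, y, delay):
    # TE_{X->Y}(d) = sum p(y_{t+1}, y_t, x_{t-d}) *
    #                log2[ p(y_{t+1} | y_t, x_{t-d}) / p(y_{t+1} | y_t) ]
    n = len(y)
    yt1, yt, xd = y[delay + 1:], y[delay:n - 1], x[:n - 1 - delay]
    joint = np.zeros((2, 2, 2))
    for a, b, c in zip(yt1, yt, xd):     # count triplet occurrences
        joint[a, b, c] += 1
    joint /= joint.sum()
    p_b, p_bc, p_ab = joint.sum(axis=(0, 2)), joint.sum(axis=0), joint.sum(axis=2)
    return sum(joint[a, b, c] * np.log2(joint[a, b, c] * p_b[b]
                                        / (p_bc[b, c] * p_ab[a, b]))
               for a, b, c in product(range(2), repeat=3) if joint[a, b, c] > 0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.integers(0, 2, 20_000)
    y = np.roll(x, 3) ^ (rng.random(20_000) < 0.05)  # y[t] = x[t-3], 5% flips
    with Pool(4) as pool:                            # delays evaluated in parallel
        tes = pool.starmap(transfer_entropy, [(x, y, d) for d in range(1, 7)])
    print(1 + int(np.argmax(tes)))                   # prints 2: y[t+1] = x[t-2]
```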
A note on data-parallelism and (and-parallel) Prolog
Abstract is not available.
Hermenegildo, Manuel V. +1 more
openaire +3 more sources
Accelerating Distributed SGD With Group Hybrid Parallelism
The scale of model parameters and datasets is growing rapidly in pursuit of high accuracy in various areas. To train a large-scale deep neural network (DNN) model, a huge amount of computation and memory is required; therefore, a parallelization technique for ...
Kyung-No Joo, Chan-Hyun Youn
doaj +1 more source
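The entry above targets distributed SGD. A minimal sketch of its data-parallel core, synchronous gradient averaging, is below; the worker-grouping and model-partitioning aspects of group hybrid parallelism are omitted, so this shows the baseline, not the proposed scheme:

```python
import numpy as np

def grad(w, X, y):
    # gradient of mean squared error for a linear model (stand-in for a DNN)
    return 2 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 10))
true_w = rng.normal(size=10)
y = X @ true_w + 0.01 * rng.normal(size=1024)

workers = 4
shards = list(zip(np.array_split(X, workers), np.array_split(y, workers)))
w = np.zeros(10)
for step in range(200):
    grads = [grad(w, Xi, yi) for Xi, yi in shards]  # on separate workers in practice
    w -= 0.05 * np.mean(grads, axis=0)              # all-reduce average, shared update
print(np.linalg.norm(w - true_w))                   # near 0: replicas converge together
```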
The recent unprecedented success of deep learning (DL) in various fields is underpinned by its use of large-scale data and models. Training a large-scale deep neural network (DNN) model with large-scale data, however, is time-consuming.
Yunyong Ko, Sang-Wook Kim
doaj +1 more source
Communication Optimization Schemes for Accelerating Distributed Deep Learning Systems
In a distributed deep learning system, a parameter server and workers must communicate to exchange gradients and parameters, and the communication cost increases as the number of workers increases.
Jaehwan Lee +4 more
doaj +1 more source
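One widely used way to reduce the worker-to-server traffic described above is top-k gradient sparsification with error feedback: each worker transmits only its k largest gradient entries and carries the remainder forward locally. The sketch below shows that generic technique on synthetic gradients; it is not necessarily the optimization scheme this entry proposes:

```python
import numpy as np

def sparsify(g, k):
    # keep only the k largest-magnitude entries; return (sent, residual)
    idx = np.argsort(np.abs(g))[-k:]
    sent = np.zeros_like(g)
    sent[idx] = g[idx]
    return sent, g - sent

rng = np.random.default_rng(0)
dim, workers, k = 1000, 8, 50            # each worker sends 5% of entries
residuals = [np.zeros(dim) for _ in range(workers)]
params = np.zeros(dim)

for step in range(100):
    updates = []
    for i in range(workers):
        g = rng.normal(size=dim) + residuals[i]  # synthetic gradient + error feedback
        sent, residuals[i] = sparsify(g, k)
        updates.append(sent)                     # k values cross the wire, not dim
    params -= 0.01 * np.mean(updates, axis=0)    # server aggregates and broadcasts
print(f"traffic per worker per step: {k}/{dim} entries")
```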
Deep learning (DL) has gained increasing prominence in latency-critical artificial intelligence (AI) applications. Due to the intensive computational requirements of these applications, cloud-centric approaches have been attempted to address this issue ...
Emre Kilcioglu +2 more
doaj +1 more source

