System log anomaly detection based on contrastive learning and retrieval augmented. [PDF]
Li W +6 more
europepmc +1 more source
A model to optimize maintenance through implementing industry 4.0 technologies. [PDF]
Essalih S +4 more
europepmc +1 more source
Cache Affinity-aware In-memory Caching Management for Hadoop
Jaewon Kwak
openalex +1 more source
Proceedings of the 29th ACM on International Conference on Supercomputing, 2015
Despite the widespread adoption of heterogeneous clusters in modern data centers, modeling heterogeneity is still a big challenge, especially for large-scale MapReduce applications. In a CPU/GPU hybrid heterogeneous cluster, allocating more computing resources to a MapReduce application does not always mean better performance, since simultaneously ...
Wenting He +9 more
+4 more sources
Proceedings of the VLDB Endowment, 2010
MapReduce is a computing paradigm that has gained a lot of attention in recent years from industry and research. Unlike parallel DBMSs, MapReduce allows non-expert users to run complex analytical tasks over very large data sets on very large clusters and clouds.
Jens Dittrich +5 more
openaire +1 more source
V-Hadoop: Virtualized Hadoop using containers
2016 IEEE 15th International Symposium on Network Computing and Applications (NCA), 2016
MapReduce is a popular programming model used to process large amounts of data by exploiting parallelism. Open-source implementations of MapReduce such as Hadoop are generally best suited for large, homogeneous clusters of commodity machines. However, many businesses cannot afford to invest in such infrastructure and others are reluctant to use cloud ...
Srihari Radhakrishnan +2 more
openaire +1 more source
Hadoop-MCC: Efficient Multiple Compound Comparison Algorithm Using Hadoop
Combinatorial Chemistry & High Throughput Screening, 2018
Aim and Objective: In the past decade, drug design technologies have improved enormously. Computer-aided drug design (CADD) has played an important role in analysis and prediction in drug development, making the procedure more economical and efficient.
Guan-Jie Hua +2 more
openaire +2 more sources