Results 121 to 130 of about 2,821 (160)
System log anomaly detection based on contrastive learning and retrieval augmented. [PDF]
Li W +6 more
europepmc +1 more source
A model to optimize maintenance through implementing industry 4.0 technologies. [PDF]
Essalih S +4 more
europepmc +1 more source
Some of the following articles may not be open access.
Related searches:
Proceedings of the 29th ACM on International Conference on Supercomputing, 2015
Despite the widespread adoption of heterogeneous clusters in modern data centers, modeling heterogeneity is still a big challenge, especially for large-scale MapReduce applications. In a CPU/GPU hybrid heterogeneous cluster, allocating more computing resources to a MapReduce application does not always mean better performance, since simultaneously ...
Wenting He +9 more
+4 more sources
Proceedings of the VLDB Endowment, 2010
MapReduce is a computing paradigm that has gained a lot of attention in recent years from industry and research. Unlike parallel DBMSs, MapReduce allows non-expert users to run complex analytical tasks over very large data sets on very large clusters and clouds.
Jens Dittrich +5 more
openaire +1 more source
V-Hadoop: Virtualized Hadoop using containers
2016 IEEE 15th International Symposium on Network Computing and Applications (NCA), 2016
MapReduce is a popular programming model used to process large amounts of data by exploiting parallelism. Open-source implementations of MapReduce such as Hadoop are generally best suited for large, homogeneous clusters of commodity machines. However, many businesses cannot afford to invest in such infrastructure and others are reluctant to use cloud ...
Srihari Radhakrishnan +2 more
openaire +1 more source
Hadoop-MCC: Efficient Multiple Compound Comparison Algorithm Using Hadoop
Combinatorial Chemistry & High Throughput Screening, 2018
Aim and Objective: In the past decade, drug design technologies have improved enormously. Computer-aided drug design (CADD) has played an important role in analysis and prediction in drug development, which makes the procedure more economical and efficient.
Guan-Jie Hua +2 more
openaire +2 more sources
International Journal of Cloud Applications and Computing, 2011
As a main subfield of cloud computing applications, internet services require large-scale data computing. Their workloads can be divided into two classes: customer-facing query-processing interactive tasks that serve hundreds of millions of users within a short response time and backend data analysis batch tasks that involve petabytes of data.
Zhiwei Xu, Bo Yan, Yongqiang Zou
openaire +1 more source
Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining, 2013
From its beginnings as a framework for building web crawlers for small-scale search engines to being one of the most promising technologies for building datacenter-scale distributed computing and storage platforms, Apache Hadoop has come far in the last seven years.
openaire +1 more source
Proceedings of the International Conference on Research in Adaptive and Convergent Systems, 2016
Hadoop is a widely used software framework for handling massive data. As heterogeneous computing gains momentum, variants of Hadoop have been developed to offload the computation of Hadoop applications onto heterogeneous processors such as GPUs, DSPs, and FPGAs. Unfortunately, these variants do not support on-demand resource scaling for the ...
Yi-Wei Chen +3 more
openaire +1 more source