Results 161 to 170 of about 1,063

Net2Tab: Tabularizing Neural Networks with Applications to Data Prefetching

open access: green
Pengmiao Zhang   +3 more
openalex   +1 more source

Optimal prefetching via data compression

Journal of the ACM, 1996
Caching and prefetching are important mechanisms for speeding up access time to data on secondary storage. Recent work in competitive online algorithms has uncovered several promising new algorithms for caching. In this paper, we apply a form of the competitive philosophy for the first time to the problem of prefetching to develop an ...
Vitter, Jeffrey Scott, Krishnan, P.
openaire   +2 more sources

Domino Temporal Data Prefetcher

2018 IEEE International Symposium on High Performance Computer Architecture (HPCA), 2018
Big-data server applications frequently encounter data misses and hence lose significant performance potential. One way to reduce the number of data misses, or to mitigate their effect, is data prefetching. Because data accesses exhibit high temporal correlation, temporal prefetching techniques are promising for these workloads.
Mohammad Bakhshalipour   +2 more
openaire   +1 more source
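Temporal prefetchers of the kind the Domino entry describes exploit the observation that miss sequences tend to repeat: if address A was once followed by B and C, a later miss on A is a good cue to prefetch B and C. The sketch below is an illustrative toy, not Domino's actual design (real hardware stores the history in bounded, set-associative metadata tables rather than an unbounded list; the class name and parameters here are invented for illustration):

```python
class TemporalPrefetcher:
    """Toy temporal-correlation prefetcher (illustrative, not Domino's design)."""

    def __init__(self, degree=2):
        self.history = []       # global miss-address sequence (unbounded in this toy)
        self.last_index = {}    # address -> most recent position in history
        self.degree = degree    # how many successor addresses to prefetch

    def on_miss(self, addr):
        # Look up the previous occurrence of this address and replay
        # the addresses that followed it last time.
        predictions = []
        if addr in self.last_index:
            i = self.last_index[addr]
            predictions = self.history[i + 1 : i + 1 + self.degree]
        # Record this miss for future lookups.
        self.last_index[addr] = len(self.history)
        self.history.append(addr)
        return predictions
```

For example, after observing the miss stream 1, 2, 3, 4, a repeated miss on 1 yields the prefetch candidates [2, 3].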

Making data prefetch smarter

Proceedings of the 21st international conference on Parallel architectures and compilation techniques, 2012
Hardware data prefetch engines are integral parts of many general purpose server-class microprocessors in the field today. Some prefetch engines allow the user to change some of their parameters. The prefetcher, however, is usually enabled in a default configuration during system bring-up and dynamic reconfiguration of the prefetch engine is not an ...
Victor JimĂ©nez   +5 more
openaire   +1 more source

Buffer-referred Data Prefetching: An Effective Approach to Coverage-Driven Prefetching

2021 26th International Conference on Automation and Computing (ICAC), 2021
This study of data prefetching focuses on maximizing the performance of modern processors by hiding cache-miss latency. The paper argues that improving prefetch coverage is an effective way to achieve this goal, and proposes employing two simple buffers, a block offset buffer and a block address buffer, that improve prefetch coverage. The offset
Jinhyun So, Mi Lu
openaire   +1 more source

Priority-driven active data prefetching

18th International Parallel and Distributed Processing Symposium, 2004. Proceedings., 2004
Summary form only given. Data cache misses reduce the performance of wide-issue processors by stalling the data supply to the processor. This is especially severe in a DSM environment. Prefetching data for critical data-address misses is one way to tolerate cache-miss latencies.
Ming Zhu   +3 more
openaire   +1 more source

Energy-Efficient Hardware Data Prefetching

IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 2011
Extensive research has been done in prefetching techniques that hide memory latency in microprocessors leading to performance improvements. However, the energy aspect of prefetching is relatively unknown. While aggressive prefetching techniques often help to improve performance, they increase energy consumption by as much as 30% in the memory system ...
Yao Guo   +4 more
openaire   +1 more source

Practical prefetching via data compression

Proceedings of the 1993 ACM SIGMOD international conference on Management of data, 1993
An important issue that affects response-time performance in current OODB and hypertext systems is the I/O involved in moving objects from slow memory to cache. A promising way to tackle this problem is to use prefetching, in which we predict the user's next page requests and get those pages into cache in the ...
Kenneth M. Curewitz   +2 more
openaire   +1 more source
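The compression-based approach in this entry (and in the JACM paper above) rests on a simple idea: a model good at compressing an access sequence is, by the same token, good at predicting its next element. A minimal sketch of that idea, using the order-1 frequency model a PPM-style compressor would maintain (the class name, `degree` parameter, and page identifiers are invented for illustration, and real compression-based prefetchers use higher-order, adaptively blended contexts):

```python
from collections import defaultdict, Counter

class MarkovPrefetcher:
    """Toy order-1 Markov predictor in the spirit of compression-based
    prefetching: reuse a compressor-style frequency model to guess the
    next page. Illustrative sketch only."""

    def __init__(self, degree=1):
        self.counts = defaultdict(Counter)  # page -> Counter of successor pages
        self.prev = None                    # last page accessed
        self.degree = degree                # how many candidate pages to prefetch

    def access(self, page):
        # Update the model: record that `page` followed the previous page.
        if self.prev is not None:
            self.counts[self.prev][page] += 1
        self.prev = page
        # Prefetch the most frequent successors of the current page.
        return [p for p, _ in self.counts[page].most_common(self.degree)]
```

On the alternating page stream a, b, a, b, a the model quickly learns that a is followed by b and vice versa, so accessing `a` yields the prefetch candidate `['b']`.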
