
Cost-sensitive cache replacement algorithms

The Ninth International Symposium on High-Performance Computer Architecture, 2003. HPCA-9 2003. Proceedings., 2003
Cache replacement algorithms originally developed in the context of simple uniprocessor systems aim to reduce the miss count. However, in modern systems, cache misses have different costs. The cost may be latency, penalty, power consumption, bandwidth consumption, or any other ad-hoc numerical property attached to a miss.
J. Jeong, M. Dubois
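The abstract above distinguishes miss count from miss cost: latency, power, bandwidth, or any numeric property attached to a miss. A minimal sketch of that idea, assuming a simple heuristic (evict the cheapest-to-refetch block among the few least-recently-used candidates) rather than the paper's exact algorithm:

```python
from collections import OrderedDict

class CostAwareCache:
    """Toy cost-sensitive replacement: among the k least-recently-used
    blocks, evict the one whose miss cost is lowest. This is an
    illustrative heuristic, not the algorithm from the paper."""

    def __init__(self, capacity, candidates=4):
        self.capacity = capacity
        self.candidates = candidates
        self.blocks = OrderedDict()  # key -> miss cost, in LRU order

    def access(self, key, miss_cost=1):
        if key in self.blocks:
            self.blocks.move_to_end(key)  # hit: refresh recency
            return True
        if len(self.blocks) >= self.capacity:
            # inspect the k oldest entries; evict the cheapest to refetch
            old = list(self.blocks.items())[: self.candidates]
            victim = min(old, key=lambda kv: kv[1])[0]
            del self.blocks[victim]
        self.blocks[key] = miss_cost  # miss: insert the new block
        return False
```

With `candidates=1` this degenerates to plain LRU; widening the candidate window trades recency precision for cost awareness.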

A Fairness Conscious Cache Replacement Policy for Last Level Cache

2021 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2021
Multicore systems with a shared Last Level Cache (LLC) pose a bigger challenge in allocating LLC space among the multiple applications running in the system. Since all applications use the shared LLC, interference among them may prematurely evict important blocks of other applications and may also lead to thrashing ...
Kousik Kumar Dutta   +2 more

A tag-based cache replacement

2010 IEEE International Conference on Computer Design, 2010
Conventional cache replacement policies use access information of each cache block for replacement decisions. We observe that there are many identical tags across different cache sets because programs exhibit spatial locality. The number of different tags in cache memory is significantly less than the total number of cache blocks in a cache. We propose
Chuanjun Zhang, Bing Xue

Cache lifetime enhancement technique using hybrid cache-replacement-policy

Microelectronics Reliability, 2019
Driven by trends in emerging memory technologies, non-volatile memory (NVM) has been considered as an alternative technology to replace SRAM in on-chip caches. Spin-transfer torque random access memory (STT-RAM), a new type of NVM technology, has low leakage power and high cell density.
Bhukya Krishna Priya   +3 more

Fixed Segmented LRU cache replacement scheme with selective caching

2012 IEEE 31st International Performance Computing and Communications Conference (IPCCC), 2012
Cache replacement policies are an essential part of the memory hierarchy used to bridge the gap in speed between CPU and memory. Most of the cache replacement algorithms that can perform significantly better than LRU (Least Recently Used) replacement policy come at the cost of large hardware requirements [1][3].
Kathlene Morales, Byeong Kil Lee
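The Segmented LRU (SLRU) scheme named in the title above is a well-known variant of LRU: blocks enter a probationary segment and are promoted to a protected segment only on a re-reference, so single-use blocks cannot pollute the protected space. A minimal sketch with assumed fixed segment sizes (the paper's fixed-segment variant and selective-caching filter differ in detail):

```python
from collections import OrderedDict

class SegmentedLRU:
    """Minimal Segmented LRU (SLRU) sketch: a probationary segment for
    first-time blocks and a protected segment for re-referenced ones.
    Segment sizes here are illustrative assumptions."""

    def __init__(self, probation_size, protected_size):
        self.probation = OrderedDict()   # blocks referenced once
        self.protected = OrderedDict()   # blocks referenced at least twice
        self.probation_size = probation_size
        self.protected_size = protected_size

    def access(self, key):
        if key in self.protected:
            self.protected.move_to_end(key)      # hit: refresh recency
            return True
        if key in self.probation:
            del self.probation[key]              # promote on re-reference
            self.protected[key] = None
            if len(self.protected) > self.protected_size:
                # demote the protected segment's LRU block to probation
                demoted, _ = self.protected.popitem(last=False)
                self._insert_probation(demoted)
            return True
        self._insert_probation(key)              # miss: enter probation
        return False

    def _insert_probation(self, key):
        self.probation[key] = None
        if len(self.probation) > self.probation_size:
            self.probation.popitem(last=False)   # evict LRU probationary block
```

Evictions always come from the probationary segment, which is what shields frequently reused blocks from scan-like access patterns.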

An Efficient Cache Replacement Strategy in Mobile Cooperative Caching

2011 7th International Conference on Wireless Communications, Networking and Mobile Computing, 2011
Cooperative caching strategies have been proposed to improve access latency and to reduce battery power and bandwidth consumption in mobile environments. However, the biggest challenge for these mechanisms is managing the mobile client's cache effectively, given its limited storage space.
Thu Nguyen Tran Minh, Thuy Dong Thi Bich

Hybrid cache architecture replacing SRAM cache with future memory technology

2012 IEEE International Symposium on Circuits and Systems, 2012
Recently, hybrid cache architecture has attracted attention. By stacking heterogeneous memory dies, it improves microprocessor performance in terms of power consumption and processing speed. This paper analyzes hybrid cache architectures using different programs and memory types.
Suji Lee, Jongpil Jung, Chong-Min Kyung

ML-Powered Cache Replacement Algorithm

2022 IEEE 7th International Conference for Convergence in Technology (I2CT), 2022
Gurjit Kaur   +2 more

An Application-Aware Cache Replacement Policy for Last-Level Caches

2013
Modern multicore processors employ a multi-level cache hierarchy with one or two levels of private caches and a shared last-level cache (LLC). Efficient cache replacement policies at the LLC are essential for reducing off-chip memory traffic as well as contention for memory bandwidth.
Tripti S. Warrier   +2 more

Cache Reuse Aware Replacement Policy for Improving GPU Cache Performance

2017
The performance of computing systems has improved significantly for several decades. However, increasing the throughput of recent CPUs (Central Processing Units) is restricted by power consumption and thermal issues. GPUs (Graphics Processing Units) are recognized as an efficient computing platform with powerful hardware resources to support CPUs in ...
Dong Oh Son   +3 more
