Results 211 to 220 of about 55,092
Some of the following articles may not be open access.

In-Network Caching for Chip Multiprocessors

2009
Effective management of data is critical to the performance of emerging multi-core architectures. Our analysis of applications from SpecOMP reveals that a small fraction of shared addresses corresponds to a large portion of accesses. Utilizing this observation, we propose a technique that augments a router in an on-chip network with a small data store to ...
Aditya Yanamandra   +4 more
openaire   +1 more source

Dynamic in-network caching for energy efficient content delivery

2013 Proceedings IEEE INFOCOM, 2013
Consider a network of prosumers of media content in which users dynamically create and request content objects. The request process is governed by the objects' popularity and varies across network regions and over time. In order to meet user requests, content objects can be stored and transported over the network, characterized by the capacity and ...
Llorca, Jaime   +6 more
openaire   +3 more sources

In-network caching assisted wireless AP storage management

Proceedings of the ACM SIGCOMM 2013 conference on SIGCOMM, 2013
The goal of this paper is to improve wireless AP caching by leveraging in-network caching. We observe that by treating routers as an in-network storage extension, we can relieve the storage limitation of APs. The unique challenge is that APs and routers cannot have a full collaboration, which makes the problem different from traditional cooperative ...
Zhongxing Ming, Mingwei Xu, Dan Wang
openaire   +1 more source

Energy efficient content locations for in-network caching

2012 18th Asia-Pacific Conference on Communications (APCC), 2012
As various multimedia services are being provided on networks, broadband traffic is growing as well. Reducing traffic is important because power consumption in networks has been increasing year by year. Meanwhile, content caching is expected to reduce data traffic by storing content replicas on the network nodes and it is beneficial in view of energy ...
Satoshi Imai   +2 more
openaire   +1 more source

CAVE: Hybrid Approach for In-Network Content Caching

2014 IEEE 11th Intl Conf on Ubiquitous Intelligence and Computing and 2014 IEEE 11th Intl Conf on Autonomic and Trusted Computing and 2014 IEEE 14th Intl Conf on Scalable Computing and Communications and Its Associated Workshops, 2014
We present CAVE, a novel hybrid algorithm for caching in content-oriented networks (CON). It relies on popularity-based caching and content chunk indexing. CAVE is a combination of two existing caching algorithms: WAVE which is a popularity-based scheme, and CONIC which is an index-based scheme.
Khaled Bakhit   +4 more
openaire   +1 more source

Routing prefix caching in network processor design

Proceedings Tenth International Conference on Computer Communications and Networks (Cat. No.01EX495), 2002
Caching has long been proven to be a very effective technique for improving memory access speed. It is based on the assumption that enough locality exists in memory access patterns, i.e., there is a high probability that an entry will be accessed again shortly after. However, it is questionable whether Internet traffic has enough locality, especially in the ...
openaire   +1 more source

Accelerating kubernetes with in-network caching

Proceedings of the SIGCOMM '22 Poster and Demo Sessions, 2022
Stefanos Sagkriotis, Dimitrios Pezaros
openaire   +1 more source

Towards Variable-Length In-Network Caching

We present StarCache, a new in-network caching architecture that can cache variable-length items to balance a wide range of key-value workloads. Unlike existing works, StarCache does not cache hot items in the switch memory. Instead, we make hot items revisit the switch data plane continuously by exploiting packet recirculation.
openaire   +1 more source

Integrative oncology: Addressing the global challenges of cancer prevention and treatment

Ca-A Cancer Journal for Clinicians, 2022
Jun J Mao, MSCE   +2 more
exaly  
