Title: Energy Efficient Buffer Cache Replacement for Data Servers
Abstract
Power consumption is an increasingly pressing concern for data servers, as it directly affects running costs and system reliability. Prior studies have shown that most memory space on data servers is used for buffer caching, so the cache replacement policy becomes critical. Two conflicting factors of buffer caching impact memory energy efficiency: (1) a higher hit rate reduces memory traffic and thus saves energy; (2) temporally concentrating memory accesses on a smaller set of memory chips increases the chances of "free riding" through DMA overlapping and also gives more memory chips opportunities to power down. This paper investigates the tradeoff between these two interacting, sometimes conflicting factors and proposes three energy-aware buffer cache replacement algorithms. On a cache miss for a new block b in a file f, they evict a victim block from (1) the most recently accessed memory chip, (2) the memory chip most recently accessed by file f, or (3) the memory chip most recently accessed by file f and whose last accessed block belongs to the same hot or cold category as block b. Simulation results based on three real-world I/O traces, including TPC-R, MSN-BEFS, and Exchange, show that our algorithms can save up to 24.9% energy with marginal degradation in hit rates, although response time degrades in some experiments. We also propose an off-line, energy sub-optimal replacement algorithm that serves as a theoretical reference.
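The first policy described in the abstract (evict from the most recently accessed memory chip) can be sketched in a few lines. This is a minimal illustration assuming per-chip LRU lists and a static block-to-chip mapping; the class and method names are hypothetical, not the authors' implementation.

```python
from collections import OrderedDict

class ChipAwareCache:
    """Sketch of policy (1): on a miss, evict from the most recently
    accessed memory chip, concentrating traffic so that other chips
    have longer idle periods and can power down."""

    def __init__(self, capacity, num_chips):
        self.capacity = capacity
        # One LRU list per memory chip (block id -> None, in LRU order).
        self.chips = [OrderedDict() for _ in range(num_chips)]
        self.num_chips = num_chips
        self.last_chip = 0  # most recently accessed chip
        self.size = 0

    def _chip_of(self, block):
        # Illustrative assumption: a simple static block-to-chip mapping.
        return block % self.num_chips

    def access(self, block):
        """Return True on a hit, False on a miss (evicting if full)."""
        chip = self._chip_of(block)
        if block in self.chips[chip]:
            self.chips[chip].move_to_end(block)  # refresh LRU position
            self.last_chip = chip
            return True
        if self.size >= self.capacity:
            self._evict()
        self.chips[chip][block] = None
        self.size += 1
        self.last_chip = chip
        return False

    def _evict(self):
        # Prefer a victim on the most recently accessed chip; fall back
        # to any other non-empty chip if that one holds no cached blocks.
        for c in [self.last_chip] + list(range(self.num_chips)):
            if self.chips[c]:
                self.chips[c].popitem(last=False)  # drop chip-local LRU block
                self.size -= 1
                return
```

Policies (2) and (3) would differ only in how the victim chip is chosen: per-file last-accessed chip state, and additionally matching the hot/cold category of the incoming block.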
Year: 2011
DOI: 10.1109/NAS.2011.49
Venue: NAS
Keywords: power aware computing, energy efficient buffer cache replacement, power consumption, memory access, buffer storage, memory chips, memory traffic, memory energy efficiency, file servers, system reliability, data servers, I/O traces, clustering algorithms, greedy algorithms, approximation algorithms, memory management, free riding
Field: Interleaved memory, Uniform memory access, Cache pollution, Cache, CPU cache, Computer science, Parallel computing, Cache-only memory architecture, Computer network, Real-time computing, Non-uniform memory access, Cache coloring
DocType: Conference
ISBN: 978-0-7695-4509-7
Citations: 0
PageRank: 0.34
References: 20
Authors: 4
Authors (Name, Order, Citations, PageRank):
Jianhui Yue11489.53
Yifeng Zhu251335.33
Zhao Cai3407.37
Lin Lin4619.18