Title
Exploiting semantics of virtual memory to improve the efficiency of the on-chip memory system
Abstract
Different virtual memory regions (e.g., stack and heap) have different properties and characteristics. For example, stack data are thread-private by definition, while heap data can be shared among threads. Compared with heap memory, stack memory tends to receive a large number of accesses to a rather small number of pages. These facts have been largely ignored by designers. In this paper, we propose two novel designs that exploit the unique characteristics of stack memory to optimize the on-chip memory system. The first design is Anticipatory Superpaging: a superpage is created for stack memory automatically at the first page fault within a potential superpage, increasing TLB reach and reducing TLB misses. It is transparent to applications and does not require the kernel to employ online analysis algorithms or page copying. The second design is Stack-Aware Cache Placement: stack accesses are routed to their local slices of a distributed shared cache, while non-stack accesses are still routed using cache-line interleaving. The primary benefit of this mechanism is reduced power consumption in the on-chip interconnect. Our simulations show that the first design reduces TLB misses by 10%-20%, and the second reduces interconnect power consumption by over 14%.
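As a rough illustration of the Stack-Aware Cache Placement idea summarized in the abstract, the minimal sketch below shows how a last-level cache slice might be selected per request: stack accesses stay on the requesting core's local slice, while other accesses keep conventional cache-line interleaving. All names, constants, and the stack-detection stub are illustrative assumptions, not details taken from the paper.

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_SLICES     8   /* LLC slices, one per core (illustrative)  */
#define CACHELINE_BITS 6   /* 64-byte cache lines (illustrative)       */

/* Hypothetical per-core stack bounds; a real design would obtain these
 * from the OS or from architectural hints rather than a static table. */
static uint64_t stack_lo[NUM_SLICES];
static uint64_t stack_hi[NUM_SLICES];

/* Return true if the address lies in the requesting core's stack region. */
static bool is_stack_access(uint64_t addr, int core)
{
    return addr >= stack_lo[core] && addr < stack_hi[core];
}

/* Slice selection: stack accesses are kept on the local slice, avoiding
 * interconnect traversal; everything else is cache-line interleaved. */
static int select_llc_slice(uint64_t addr, int core)
{
    if (is_stack_access(addr, core))
        return core;                                      /* local slice */
    return (int)((addr >> CACHELINE_BITS) % NUM_SLICES);  /* interleaved */
}
```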
Year
2012
DOI
10.1007/978-3-642-32820-6_24
Venue
International Conference on Parallel Processing
Keywords
different virtual memory region, exploiting semantics, different property, large number, page copying, power consumption, novel design, on-chip memory system, heap data, tlb reach, heap memory
DocType
Conference
Volume
7484
ISSN
0302-9743
Citations
0
PageRank
0.34
References
17
Authors
8
Name               Order  Citations  PageRank
Bin Li             1      208        8.64
Zhen Fang          2      91         7.62
Li Zhao            3      604        34.84
Xiaowei Jiang      4      0          0.34
Lin Li             5      0          0.34
Andrew Herdrich    6      49         4.10
Ravishankar Iyer   7      720        35.52
Srihari Makineni   8      600        37.89