Title: Priority L2 Cache Design for Time Predictability
Abstract: L2 caches are usually unified, and the possible interference between instructions and data makes it very hard, if not impossible, to perform timing analysis for unified L2 caches. This paper proposes a priority L2 cache to achieve both time predictability and high performance for real-time systems. The priority cache allows the instruction and data streams to share the aggregate L2 cache space while preventing them from replacing each other's lines at runtime. While separate L2 caches can also achieve time predictability, our performance evaluation shows that the instruction priority cache (i.e., giving instructions priority over data) outperforms separate L2 caches. Compared to a unified L2 cache, the instruction priority cache degrades performance by only 1.1% on average. Moreover, we implement a prototype of the priority L2 cache on a Virtex-6 FPGA and find that its hardware overhead is very small.
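The abstract only summarizes the mechanism, so a minimal sketch may help make the instruction priority idea concrete. The C fragment below is not taken from the paper; it assumes an 8-way set in which each line carries a flag recording whether it was filled by the instruction or the data stream, and it sketches one plausible replacement rule: an instruction miss may evict any line, while a data miss may only claim an invalid way or another data line (and otherwise bypasses allocation). The names (line_t, select_victim, WAYS) and the bypass behaviour are illustrative assumptions, not the authors' hardware design.

/*
 * Illustrative sketch (not the paper's RTL): victim selection for one set of
 * a shared L2 cache with instruction priority. Assumption: each line records
 * whether it was filled by the instruction or the data stream, plus an LRU
 * age counter (larger = older). Data misses may never evict instruction
 * lines, so the two streams share capacity without data pollution of
 * instructions, which is the property the abstract describes.
 */
#include <stdbool.h>
#include <stdint.h>

#define WAYS 8

typedef struct {
    bool     valid;
    bool     is_instr;   /* filled by the instruction stream? */
    uint32_t lru_age;    /* larger = least recently used */
    uint64_t tag;
} line_t;

/* Returns the way to replace, or -1 if the miss must bypass allocation
 * (a data miss that finds only instruction lines in the set). */
int select_victim(const line_t set[WAYS], bool miss_is_instr)
{
    int victim = -1;
    uint32_t oldest = 0;

    /* 1. Prefer an invalid (empty) way. */
    for (int w = 0; w < WAYS; w++)
        if (!set[w].valid)
            return w;

    /* 2. LRU among the ways this stream is allowed to evict:
     *    instruction misses may take any way, data misses only data ways. */
    for (int w = 0; w < WAYS; w++) {
        bool eligible = miss_is_instr || !set[w].is_instr;
        if (eligible && set[w].lru_age >= oldest) {
            oldest = set[w].lru_age;
            victim = w;
        }
    }
    return victim;   /* -1 => no eligible victim; fetch without allocating */
}

Because instruction lines can never be displaced by data, the worst-case instruction hit/miss behaviour stays analyzable, which is the time-predictability argument the abstract makes; whether the real design bypasses or stalls ineligible data misses is an assumption here.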
Year: 2016
DOI: 10.1504/IJES.2016.10001313
Venue: International Journal of Embedded Systems
Keywords: priority cache, time predictability, real-time systems, L2 cache, performance
Field: Cache invalidation, Cache pollution, Cache, Computer science, Parallel computing, Page cache, Cache algorithms, Real-time computing, Cache coloring, Bus sniffing, Smart Cache, Operating system
DocType: Journal
Volume: 8
Issue: 5-6
ISSN: 1741-1068
Citations: 0
PageRank: 0.34
References: 0
Authors: 2
  Jun Yan (Order: 1, Citations: 0, PageRank: 0.34)
  Wei Zhang (Order: 2, Citations: 287, PageRank: 35.43)