Title
CacheFlow: Cache Optimizations for Data-Driven Multithreading
Abstract
Data-Driven Multithreading is a non-blocking multithreading execution model that provides effective latency tolerance by allowing the computation processor to do useful work while a long-latency event is in progress. Under the Data-Driven Multithreading model, a thread is scheduled for execution only after all of its inputs have been produced and placed in the processor's local memory. Data-driven sequencing leads to irregular memory access patterns that can negatively affect cache performance. Nevertheless, it enables the implementation of short-term optimal cache management policies. This paper presents the implementation of CacheFlow, an optimized cache management policy that eliminates the side effects of the loss of locality caused by data-driven sequencing and further reduces cache misses. CacheFlow employs thread-based prefetching to preload the data blocks of threads deemed executable. Simulation results for nine scientific applications on a 32-node Data-Driven Multithreading machine show an average speedup improvement from 19.8 to 22.6. Two techniques that further improve the performance of CacheFlow, conflict avoidance and thread reordering, are proposed and tested. Simulation experiments show speedup improvements of 24% and 32%, respectively. The average speedup for all applications on a 32-node machine with both optimizations is 26.1.
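The core idea in the abstract, that a thread becomes schedulable only when all its inputs are present, and that this moment is the natural trigger for prefetching its data blocks, can be sketched as follows. This is a minimal illustrative model, not the paper's actual implementation: the class names, the set-based toy cache, and the token-delivery interface are all assumptions made for the sketch.

```python
from collections import deque

class DDMThread:
    """A data-driven thread: runnable once all of its input tokens arrive."""
    def __init__(self, tid, inputs, blocks):
        self.tid = tid
        self.pending = set(inputs)   # input tokens not yet produced
        self.blocks = list(blocks)   # data blocks the thread will access

class CacheFlowScheduler:
    """Toy scheduler that prefetches a thread's blocks when it becomes ready."""
    def __init__(self):
        self.waiting = {}       # tid -> DDMThread still missing inputs
        self.ready = deque()    # threads whose inputs are all available
        self.cache = set()      # blocks already loaded (toy cache model)

    def add_thread(self, t):
        if t.pending:
            self.waiting[t.tid] = t
        else:
            self._make_ready(t)

    def token_arrived(self, tid, token):
        t = self.waiting.get(tid)
        if t is None:
            return
        t.pending.discard(token)
        if not t.pending:
            del self.waiting[tid]
            self._make_ready(t)

    def _make_ready(self, t):
        # Thread-based prefetching: preload the blocks of a thread that is
        # now deemed executable, so it later runs without cold misses.
        self.cache.update(t.blocks)
        self.ready.append(t)

    def run_next(self):
        t = self.ready.popleft()
        hits = sum(1 for b in t.blocks if b in self.cache)
        return t.tid, hits, len(t.blocks)

sched = CacheFlowScheduler()
sched.add_thread(DDMThread("T1", inputs={"x", "y"}, blocks=["B0", "B1"]))
sched.token_arrived("T1", "x")
sched.token_arrived("T1", "y")   # T1 now ready; its blocks are prefetched
print(sched.run_next())          # -> ('T1', 2, 2): every block access hits
```

The point of the sketch is the ordering: because readiness is known before execution in the data-driven model, the prefetch in `_make_ready` can overlap with the execution of earlier threads in the ready queue, which is what makes the short-term optimal cache management described in the abstract possible.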
Year
2006
DOI
10.1142/S0129626406002599
Venue
PARALLEL PROCESSING LETTERS
Keywords
Cache Management, Data-Driven Multithreading, Prefetching
Field
Multithreading, Cache pollution, CPU cache, Computer science, Cache, Parallel computing, Speculative multithreading, Cache algorithms, Cache coloring, Temporal multithreading, Distributed computing
DocType
Journal
Volume
16
Issue
2
ISSN
0129-6264
Citations
0
PageRank
0.34
References
1
Authors
3
Name                  Order  Citations  PageRank
Costas Kyriacou       1      65         6.57
Paraskevas Evripidou  2      313        34.69
Pedro Trancoso        3      377        43.79