Title
ADAPT: algorithmic differentiation applied to floating-point precision tuning
Abstract
HPC applications make extensive use of floating-point arithmetic to solve computational problems. Mixed-precision computing seeks to use the lowest-precision data type sufficient to achieve a desired accuracy, improving performance and reducing power consumption. Manually optimizing a program to use mixed precision is challenging: it requires not only extensive knowledge of the algorithm's numerical behavior but also estimates of the rounding errors introduced. In this work, we present ADAPT, a scalable approach to mixed-precision analysis of HPC workloads that uses algorithmic differentiation to provide accurate estimates of the final output error. ADAPT produces a floating-point precision sensitivity profile while incurring an overhead that is only a constant multiple of the original computation, irrespective of the number of variables analyzed. The sensitivity profile can be used to make algorithmic choices and to develop mixed-precision configurations of a program. We evaluate ADAPT on six benchmarks and a proxy application (LULESH) and show a speedup of 1.2× on the proxy application.
Year
2018
DOI
10.5555/3291656.3291720
Venue
SC
Keywords
Tools, Sensitivity, Tuning, Benchmark testing, Adaptation models, Space exploration, Approximation algorithms
Field
Floating point, Computer science, Parallel computing, Automatic differentiation, Rounding, Data type, General-purpose computing on graphics processing units, Benchmark (computing), Scalability, Speedup
DocType
Conference
Citations
4
PageRank
0.41
References
0
Authors
7
Name                 Order  Citations  PageRank
Harshitha Menon      1      49         6.89
Michael O. Lam       2      50         5.15
Daniel Osei-Kuffuor  3      9          2.57
Markus Schordan      4      259        23.98
Scott Lloyd          5      4          0.41
Kathryn Mohror       6      553        36.10
Jeffrey Hittinger    7      19         1.51