Title
Analyzing the Impact of Lossy Compressor Variability on Checkpointing Scientific Simulations
Abstract
Lossy compression algorithms are effective tools to reduce the size of high-performance computing data sets. As established lossy compressors such as SZ and ZFP evolve, they seek to improve the compression/decompression bandwidth and the compression ratio. Algorithm improvements may alter the spatial distribution of errors in the compressed data even when using the same error bound and error bound type. If HPC applications are to compute on lossy compressed data, application users require an understanding of how the performance and the spatial distribution of error change. We explore how spatial distributions of error, compression/decompression bandwidth, and compression ratio change for HPC data sets from the applications PlasComCM and Nek5000 across various versions of SZ and ZFP. In addition, we explore how the spatial distribution of error impacts application correctness when restarting from lossy compressed checkpoints. We verify that known approaches to selecting error tolerances for lossy compressed checkpointing are robust to compressor selection and to changes in the distribution of error.
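For context, the error-bounded compression the abstract refers to can be exercised through ZFP's public C API. The sketch below is illustrative only and is not taken from the paper: the 64^3 field size, the synthetic data values, and the 1e-6 absolute error tolerance are assumptions. It shows how a checkpoint array might be compressed under a fixed-accuracy (absolute error bound) setting and then decompressed, as a restart from a lossy compressed checkpoint would do.

  #include <stdio.h>
  #include <stdlib.h>
  #include <zfp.h>

  int main(void) {
    /* Assumed checkpoint field: 64^3 doubles; values are a stand-in. */
    size_t nx = 64, ny = 64, nz = 64, n = nx * ny * nz;
    double *data = malloc(n * sizeof(double));
    for (size_t i = 0; i < n; i++)
      data[i] = (double)i / (double)n;

    /* Describe the array and open a ZFP stream in fixed-accuracy mode
       with an assumed absolute error bound of 1e-6. */
    zfp_field *field = zfp_field_3d(data, zfp_type_double, nx, ny, nz);
    zfp_stream *zfp = zfp_stream_open(NULL);
    zfp_stream_set_accuracy(zfp, 1e-6);

    /* Allocate a worst-case output buffer and attach it to the stream. */
    size_t bufsize = zfp_stream_maximum_size(zfp, field);
    void *buffer = malloc(bufsize);
    bitstream *bs = stream_open(buffer, bufsize);
    zfp_stream_set_bit_stream(zfp, bs);
    zfp_stream_rewind(zfp);

    /* Compress; the ratio is what a lossy checkpoint saves on disk. */
    size_t zfpsize = zfp_compress(zfp, field);
    printf("compression ratio: %.2f\n",
           (double)(n * sizeof(double)) / (double)zfpsize);

    /* Decompress back into the same array, as a restart would. */
    zfp_stream_rewind(zfp);
    zfp_decompress(zfp, field);

    zfp_field_free(field);
    zfp_stream_close(zfp);
    stream_close(bs);
    free(buffer);
    free(data);
    return 0;
  }

SZ exposes an analogous absolute error bound mode, which is what makes the same tolerance-selection question meaningful across both compressors and their versions.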
Year
2019
DOI
10.1109/CLUSTER.2019.8891052
Venue
2019 IEEE International Conference on Cluster Computing (CLUSTER)
Keywords
lossy compressor variability, lossy compression algorithms, high-performance computing data sets, lossy compressed data, HPC data sets, lossy compressed checkpoints, error tolerances, PlasComCM, Nek5000, ZFP, spatial distribution, compression-decompression bandwidth
Field
Data set, Lossy compression, Computer science, Parallel computing, Correctness, Algorithm, Gas compressor, Compression ratio, Bandwidth (signal processing), Graphical model, Data compression
DocType
Conference
ISSN
1552-5244
ISBN
978-1-7281-4735-2
Citations
0
PageRank
0.34
References
16
Authors
3
Name | Order | Citations | PageRank
Pavlo Triantafyllides | 1 | 0 | 0.34
Tasmia Reza | 2 | 0 | 0.34
Jon C. Calhoun | 3 | 3 | 3.41