Abstract |
---|
In the last decades, numerous program analyzers have been developed both in academia and industry. Despite their abundance, however, there is currently no systematic way of comparing the effectiveness of different analyzers on arbitrary code. In this paper, we present the first automated technique for differentially testing the soundness and precision of program analyzers. We used our technique to compare six mature, state-of-the-art analyzers on tens of thousands of automatically generated benchmarks. Our technique detected soundness and precision issues in most analyzers, and we evaluated the implications of these issues for both designers and users of program analyzers. |
Year | DOI | Venue |
---|---|---|
2018 | 10.1145/3293882.3330553 | Proceedings of the 28th ACM SIGSOFT International Symposium on Software Testing and Analysis |

Keywords | Field | DocType |
---|---|---|
differential testing, precision, program analysis, soundness | Automated technique, Systems engineering, Software engineering, Computer science, Soundness | Journal |

Volume | Citations | PageRank |
---|---|---|
abs/1812.05033 | 5 | 0.42 |

References | Authors |
---|---|
0 | 3 |

Name | Order | Citations | PageRank |
---|---|---|---|
Christian Klinger | 1 | 5 | 0.42 |
Maria Christakis | 2 | 200 | 16.69 |
Valentin Wüstholz | 3 | 9 | 2.18 |