Title
On the Timed Analysis of Big-Data Applications.
Abstract
Apache Spark is one of the best-known frameworks for executing big-data batch applications over a cluster of (virtual) machines. Defining the cluster (i.e., the number of machines and CPUs) so as to attain guarantees on the execution times (deadlines) of the application is a trade-off between the cost of the infrastructure and the time needed to execute the application. Sizing the computational resources, in order to prevent cost overruns, can benefit from the use of formal models as a means to capture the execution time of applications. Our model of Spark applications, based on the CLTLoc logic, is defined by considering the directed acyclic graph around which Spark programs are organized, the number of available CPUs, the number of tasks elaborated by the application, and the average execution times of tasks. If the outcome of the analysis is positive, then the execution is feasible, that is, it can be completed within a given time span. The analysis tool has been implemented on top of the Zot formal verification tool. A preliminary evaluation shows that our model is sufficiently accurate: the formal analysis identifies execution times that are close (the error is less than 10%) to those obtained by actually running the applications.
Year
2018
DOI
10.1007/978-3-319-77935-5_22
Venue
Lecture Notes in Computer Science
Keywords
Big-Data Applications, Metric temporal logic, Formal verification, Apache Spark
Field
Spark (mathematics), Computer science, Execution time, Sizing, Big data, Formal verification, Distributed computing
DocType
Conference
Volume
10811
ISSN
0302-9743
Citations
1
PageRank
0.36
References
8
Authors
5
Name | Order | Citations | PageRank
Francesco Marconi | 1 | 6 | 1.96
Giovanni Quattrocchi | 2 | 52 | 7.95
L. Baresi | 3 | 160 | 13.28
Marcello M. Bersani | 4 | 128 | 16.06
Matteo Rossi | 5 | 288 | 32.20