Title |
---|
Model transformation languages under a magnifying glass: a controlled experiment with Xtend, ATL, and QVT |
Abstract |
---|
In Model-Driven Software Development, models are automatically processed to support the creation, build, and execution of systems. A large variety of dedicated model-transformation languages exists, promising to efficiently realize the automated processing of models. To investigate the actual benefit of using such specialized languages, we performed a large-scale controlled experiment in which 78 subjects solved 231 individual tasks using three languages. The experiment sheds light on commonalities and differences between model transformation languages (ATL, QVT-O) and on the benefits of using them for common development tasks (comprehension, change, and creation) compared to a modern general-purpose language (Xtend). Our results show no statistically significant benefit of using a dedicated transformation language over a modern general-purpose language. However, we identified several aspects of transformation programming where domain-specific transformation languages do appear to help, including copying objects, identifying context, and conditioning computation on types. |
Year | DOI | Venue
---|---|---
2019 | 10.1145/3236024.3236046 | ESEC/SIGSOFT FSE

Keywords | DocType | ISBN
---|---|---
Model Transformation Languages, Experiment, Xtend, ATL, QVT | Conference | 978-1-4503-5573-5

Citations | PageRank | References
---|---|---
1 | 0.35 | 20
Authors |
---|
5 |
Name | Order | Citations | PageRank
---|---|---|---
Regina Hebig | 1 | 179 | 24.24
Christoph Seidl | 2 | 207 | 20.15
Thorsten Berger | 3 | 603 | 34.35
John Kook Pedersen | 4 | 1 | 0.35
Andrzej Wasowski | 5 | 1282 | 60.47