Abstract |
---|
Model transformation is a core mechanism for model-driven engineering (MDE). Writing complex model transformations is error-prone, and, as for any complex program, efficient testing techniques are required. Testing a model transformation is typically performed by checking the results of the transformation applied to a set of input models. While it is fairly easy to provide some input models, it is difficult to qualify their relevance for testing. In this paper, we propose a set of rules and a framework to assess the quality of given input models for testing a given transformation. Furthermore, the framework identifies missing model elements in input models and assists the user in improving these models. |

Year | DOI | Venue
---|---|---
2009 | 10.1007/s10270-007-0074-8 | Software and Systems Modeling

Keywords | Field | DocType
---|---|---
software testing, model transformation, test criteria, test qualification, metamodelling, model-based testing, model-driven engineering | Model transformation, Systems engineering, Integration testing, Computer science, Manual testing, Software performance testing, White-box testing, Non-regression testing, Model-based testing, Keyword-driven testing | Journal

Volume | Issue | ISSN
---|---|---
8 | 2 | 1619-1374

Citations | PageRank | References
---|---|---
51 | 1.52 | 24
Authors |
---|
4 |

Name | Order | Citations | PageRank
---|---|---|---
Franck Fleurey | 1 | 1730 | 85.07
Benoit Baudry | 2 | 2000 | 118.08
Pierre-Alain Muller | 3 | 511 | 54.09
Yves Le Traon | 4 | 3922 | 190.39