Title
Understanding Uncertainty in Self-adaptive Systems
Abstract
Ensuring that systems achieve their goals under uncertainty is a key driver for self-adaptation. Nevertheless, the concept of uncertainty in self-adaptive systems (SAS) is still insufficiently understood. Although several taxonomies of uncertainty have been proposed, taxonomies alone cannot convey the SAS research community’s perception of uncertainty. To explore and to learn from this perception, we conducted a survey focused on the SAS ability to deal with unanticipated change and to model uncertainty, and on the major challenges that limit this ability. In this paper, we analyse the responses provided by the 51 participants in our survey. The insights gained from this analysis include the view—held by 71% of our participants—that SAS can be engineered to cope with unanticipated change, e.g., through evolving their actions, synthesising new actions, or using default actions to deal with such changes. To handle uncertainties that affect SAS models, the participants recommended the use of confidence intervals and probabilities for parametric uncertainty, and the use of multiple models with model averaging or selection for structural uncertainty. Notwithstanding this positive outlook, the provision of assurances for safety-critical SAS continues to pose major challenges according to our respondents. We detail these findings in the paper, in the hope that they will inspire valuable future research on self-adaptive systems.
Year
2020
DOI
10.1109/ACSOS49614.2020.00047
Venue
2020 IEEE International Conference on Autonomic Computing and Self-Organizing Systems (ACSOS)
Keywords
Self-adaptation, uncertainty, unanticipated change, models, modeling formalism, survey
DocType
Conference
ISBN
978-1-7281-7278-1
Citations
3
PageRank
0.38
References
0
Authors
4
Name                 Order  Citations  PageRank
Radu Calinescu       1      905        63.01
Raffaela Mirandola   2      2557       133.74
Diego Perez-Palacin  3      120        13.23
Danny Weyns          4      2854       163.81