Title
COCO: A Platform for Comparing Continuous Optimizers in a Black-Box Setting
Abstract
We introduce COCO, an open-source platform for Comparing Continuous Optimizers in a black-box setting. COCO aims at automating, to the greatest possible extent, the tedious and repetitive task of benchmarking numerical optimization algorithms. The platform and the underlying methodology allow benchmarking deterministic and stochastic solvers for both single- and multiobjective optimization within the same framework. We present the rationale behind the (decade-long) development of the platform as a general proposition for guidelines towards better benchmarking. We detail the fundamental concepts of COCO, such as the definition of a problem as a function instance, the idea behind instances, the use of target values, and runtime, defined as the number of function calls, as the central performance measure. Finally, we give a quick overview of the basic code structure and the currently available test suites.
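As an illustration of the benchmarking loop the abstract describes, here is a minimal sketch using COCO's Python experimentation module cocoex. The choice of the "bbob" suite, the result-folder name, the random-search solver, and the evaluation budget are assumptions made for this example, not prescriptions from the paper; they only show how problems are function instances, how evaluations are logged, and how runtime is counted in function calls.

```python
import numpy as np
import cocoex  # COCO experimentation module (pip package: coco-experiment)

# Assumptions for this sketch: the single-objective "bbob" suite and a
# plain random search with a small, arbitrary evaluation budget.
suite = cocoex.Suite("bbob", "", "")     # each element is one problem = function instance
observer = cocoex.Observer("bbob", "result_folder: random_search_demo")

for problem in suite:
    problem.observe_with(observer)       # log evaluations and target hits for post-processing
    budget = 100 * problem.dimension     # runtime is measured in function evaluations
    for _ in range(budget):
        # sample a uniform random point in the problem's box constraints
        x = problem.lower_bounds + (
            problem.upper_bounds - problem.lower_bounds
        ) * np.random.rand(problem.dimension)
        problem(x)                       # one function call = one unit of runtime
        if problem.final_target_hit:     # stop once the most precise target is reached
            break
```

The recorded data can then be post-processed with COCO's cocopp module to produce the runtime distributions mentioned in the keywords.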
Year
2016
DOI
10.1080/10556788.2020.1808977
Venue
Optimization Methods & Software
Keywords
Numerical optimization, black-box optimization, derivative-free optimization, benchmarking, performance assessment, test functions, runtime distributions, software
Field
Black box, Computer science, Optimization algorithm, Artificial intelligence, COCO, Machine learning, Benchmarking
DocType
Journal
Volume
36
Issue
1
ISSN
1055-6788
Citations
30
PageRank
1.69
References
10
Authors
5

Name             Order  Citations  PageRank
Nikolaus Hansen  1      723        51.44
Anne Auger       2      1198       77.81
Olaf Mersmann    3      60         3.52
Tea Tusar        4      181        19.91
Dimo Brockhoff   5      948        53.97