Title
Linearly constrained minimax optimization.
Abstract
We present an algorithm for nonlinear minimax optimization subject to linear equality and inequality constraints; the algorithm requires first-order partial derivatives. It is based on successive linear approximations to the functions defining the problem. The resulting linear subproblems are solved in the minimax sense subject to the linear constraints, which ensures a feasible-point algorithm. Further, we introduce local bounds on the solutions of the linear subproblems; the bounds are adjusted automatically, depending on the quality of the linear approximations. It is proved that the algorithm always converges to the set of stationary points of the problem, a stationary point being defined in terms of the generalized gradients of the minimax objective function. It is further proved that, under mild regularity conditions, the algorithm is identical to a quadratically convergent Newton iteration in its final stages. We demonstrate the performance of the algorithm by solving a number of numerical examples with up to 50 variables, 163 functions, and 25 constraints. We have also implemented a version of the algorithm that is particularly suited to the solution of restricted approximation problems.
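The abstract outlines an iteration that repeatedly linearizes the functions f_j at the current iterate, solves the resulting linear minimax subproblem subject to the linear constraints, and adjusts a local bound on the step according to how well the linear model predicted the actual decrease. The Python fragment below is a minimal illustrative sketch of that scheme, not the authors' implementation: it solves each subproblem as an LP with scipy.optimize.linprog, and the function name minimax_slp, the 0.25/0.75 thresholds, and the halving/doubling of the bound are assumptions made for the sketch, not parameters taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog


def minimax_slp(funcs, grads, x0, A_ub=None, b_ub=None, A_eq=None, b_eq=None,
                delta=1.0, max_iter=100, tol=1e-8):
    """Sketch of a successive-linear-approximation minimax iteration.

    Minimizes F(x) = max_j f_j(x) subject to A_ub @ x <= b_ub and
    A_eq @ x = b_eq, starting from a feasible point x0.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        f = np.array([fj(x) for fj in funcs])        # f_j(x_k)
        G = np.vstack([gj(x) for gj in grads])       # rows: grad f_j(x_k)^T
        F = f.max()

        # Linear minimax subproblem in (h, t):
        #   minimize t
        #   subject to f_j + g_j^T h <= t   for every j,
        #              the linear constraints evaluated at x_k + h,
        #              |h_i| <= delta       (the local bound on the step).
        c = np.zeros(n + 1)
        c[-1] = 1.0
        A_rows = [np.hstack([G, -np.ones((G.shape[0], 1))])]
        b_rows = [-f]
        if A_ub is not None:
            A_rows.append(np.hstack([A_ub, np.zeros((A_ub.shape[0], 1))]))
            b_rows.append(b_ub - A_ub @ x)
        A_eq_sub, b_eq_sub = None, None
        if A_eq is not None:
            A_eq_sub = np.hstack([A_eq, np.zeros((A_eq.shape[0], 1))])
            b_eq_sub = b_eq - A_eq @ x
        bounds = [(-delta, delta)] * n + [(None, None)]
        res = linprog(c, A_ub=np.vstack(A_rows), b_ub=np.concatenate(b_rows),
                      A_eq=A_eq_sub, b_eq=b_eq_sub, bounds=bounds)
        if not res.success:
            break
        h, t = res.x[:n], res.x[-1]

        predicted = F - t                            # decrease promised by the linear model
        if predicted <= tol:                         # no further model decrease: stop
            break
        actual = F - max(fj(x + h) for fj in funcs)  # decrease actually obtained
        rho = actual / predicted
        if rho < 0.25:                               # poor approximation: shrink the bound
            delta *= 0.5
        elif rho > 0.75:                             # good approximation: enlarge the bound
            delta *= 2.0
        if actual > 0.0:                             # accept only improving steps
            x = x + h
    return x
```

Because the subproblem carries the original linear constraints on x_k + h, every accepted iterate remains feasible, matching the feasible-point property described above. A restricted (Chebyshev-type) approximation problem, where F(x) = max_j |r_j(x)|, can be handled in this sketch by listing both +r_j and -r_j, with their gradients, among funcs and grads.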
Year
1978
DOI
10.1007/BF01588966
Venue
Math. Program.
Keywords
minimax, optimization, quadratic convergence, linear constraints, first order, linear approximation, Newton iteration, objective function
Field
Linear-fractional programming, Mathematical optimization, Minimax, Nonlinear system, Minimax approximation algorithm, Stationary point, Rate of convergence, Criss-cross algorithm, Mathematics, Newton's method
DocType
Journal
Volume
14
Issue
1
ISSN
1436-4646
Citations
19
PageRank
4.12
References
1
Authors
2
Name                     Order  Citations  PageRank
Kaj Madsen               1      341        63.86
Hans Schjaer-Jacobsen    2      19         4.12