Title
Approximating the objective function's gradient using perceptrons for constrained minimization with application in drag reduction
Abstract
This paper is concerned with the minimization of a function whose closed-form analytical expression is unknown, subject to well-defined and differentiable constraints. We assume that data are available to train a multi-layer perceptron, which can be used to estimate the gradient of the objective function. We combine this estimate with the gradients of the constraints to approximate the reduced gradient, which is ultimately used to determine a feasible descent direction. We call this variant of the reduced gradient method the Neural Reduced Gradient algorithm. We evaluate its performance on a large set of constrained convex and nonconvex test problems. We also provide an interesting and important application of the new method to the minimization of shear stress for drag reduction in the control of turbulence.
Highlights
- Considers nonlinear constrained minimization with an unknown objective function.
- Introduces a perceptron-based method to estimate the gradient of the objective.
- Proposes a neural reduced gradient algorithm to compute a stationary point.
- Applies the new method to shear stress minimization for reducing turbulence.
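As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below estimates the gradient of a black-box surrogate by central finite differences and keeps the iterate feasible by projecting the negative gradient onto the null space of linear equality constraints. The names f_hat, surrogate_gradient, and projected_descent_direction, as well as the toy constraint matrix A, are hypothetical, and the null-space projection is a simplification of the reduced-gradient variable partition used in the paper.

```python
# Minimal sketch, assuming a trained perceptron surrogate f_hat and linear
# equality constraints A x = b; this is illustrative, not the paper's code.
import numpy as np

def surrogate_gradient(f_hat, x, eps=1e-4):
    """Central finite-difference estimate of the gradient of a black-box surrogate at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f_hat(x + e) - f_hat(x - e)) / (2.0 * eps)
    return g

def projected_descent_direction(g, A):
    """Project -g onto the null space of A so that A d = 0 and A x = b stays satisfied."""
    AAt_inv = np.linalg.inv(A @ A.T)
    P = np.eye(A.shape[1]) - A.T @ AAt_inv @ A
    return -P @ g

if __name__ == "__main__":
    # f_hat stands in for a perceptron trained on sampled (x, f(x)) data.
    f_hat = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 + 0.5 * x[2] ** 2
    A = np.array([[1.0, 1.0, 1.0]])   # single linear constraint: x1 + x2 + x3 = 0
    x = np.array([0.0, 0.0, 0.0])     # feasible starting point
    for _ in range(100):
        g = surrogate_gradient(f_hat, x)
        d = projected_descent_direction(g, A)
        if np.linalg.norm(d) < 1e-6:  # approximate stationarity on the feasible set
            break
        x = x + 0.1 * d               # fixed step; a line search would be used in practice
    print(x, A @ x)                   # iterate remains on the constraint surface
```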
Year
2015
DOI
10.1016/j.cor.2015.05.012
Venue
Computers & Operations Research
Keywords
Constrained optimization, Reduced gradient, Neural networks, Drag reduction, Shear stress, Turbulence
Field
Gradient method, Mathematical optimization, Stochastic gradient descent, Gradient descent, Descent direction, Frank–Wolfe algorithm, Nonlinear conjugate gradient method, Backpropagation, Mathematics, Constrained optimization
DocType
Journal
Volume
64
Issue
C
ISSN
0305-0548
Citations
1
PageRank
0.39
References
12
Authors
3
Name               Order  Citations  PageRank
Burak Kocuk        1      33         3.94
I. Kuban Altinel   2      150        14.60
Necati Aras        3      462        30.62