
Survey of gradient based constrained optimization algorithms

Algorithms were selected based on their popularity.
Additional details and additional algorithms are in Chapter 5 of Haftka and Gurdal's Elements of Structural Optimization.

Optimization with constraints

Standard formulation
Equality constraints are a challenge, but they are fortunately missing from most engineering design problems, so this lecture will deal only with inequality constraints.
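For reference, a minimal statement of the standard formulation, written here (as an assumption of this sketch) with inequality constraints in the g_j >= 0 convention of Haftka and Gurdal:

\begin{aligned}
\underset{\mathbf{x}}{\text{minimize}}\quad & f(\mathbf{x}) \\
\text{such that}\quad & g_j(\mathbf{x}) \ge 0, \qquad j = 1,\dots,n_g, \\
& h_k(\mathbf{x}) = 0, \qquad k = 1,\dots,n_e.
\end{aligned}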

Derivative based optimizers

All are predicated on the assumption that function evaluations are expensive and gradients can be calculated.
The situation is similar to a person placed at night on a hill and directed to find the lowest point in an adjacent valley using a flashlight with a limited battery.

Basic strategy:

Flash the light to get the derivative and select a direction.

Go straight in that direction until you start going up or hit a constraint.

Repeat until converged.

Some methods move mostly along the constraint boundaries, and some mostly in the interior of the feasible domain (interior point algorithms).

Gradient projection and reduced gradient methods

Find a good direction tangent to the active constraints.

Move a distance in that direction, then restore to the constraint boundaries.

A typical active set algorithm, used in Excel's Solver.
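A minimal Matlab sketch of one such step on a toy problem; the problem, variable names, and step length are assumptions for illustration, not taken from the slides:

f     = @(x) x(1)^2 + 10*x(2)^2;            % toy objective
gradf = @(x) [2*x(1); 20*x(2)];             % its gradient
g     = @(x) x(1)^2 + x(2)^2 - 4;           % active constraint g(x) = 0 (circle of radius 2)
gradg = @(x) [2*x(1); 2*x(2)];              % constraint gradient
x     = [sqrt(2); sqrt(2)];                 % feasible point on the constraint boundary
alpha = 0.05;                               % step length (normally chosen by a line search)

N = gradg(x);                               % matrix of active constraint gradients (one column here)
P = eye(2) - N*((N'*N)\N');                 % projector onto the tangent subspace of the active constraint
d = -P*gradf(x);                            % projected steepest-descent direction
x_trial = x + alpha*d;                      % move a distance along the tangent direction
x_new   = x_trial - N*((N'*N)\g(x_trial));  % restore: first-order correction back to the boundary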

Penalty function methods

Quadratic penalty function.
A gradual rise of the penalty parameter leads to a sequence of unconstrained minimizations, the sequential unconstrained minimization technique (SUMT). Why is it important?
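One common way to write the exterior quadratic penalty function for inequality constraints g_j(x) >= 0 (the particular notation here is an assumption) is

\phi(\mathbf{x}, r) = f(\mathbf{x}) + r \sum_{j} \bigl[\min\bigl(0,\, g_j(\mathbf{x})\bigr)\bigr]^2 ,

so that only violated constraints contribute to the penalty. Minimizing \phi for a gradually increasing sequence of r values, each time starting from the previous minimizer, is the SUMT idea.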

Example 5.7.1

Contours for r=1


Contours for r=1000


For non-derivative methods, this difficulty can be avoided by making the penalty proportional to the absolute value of the violation instead of its square!

Problems: penalty

With an extremely robust algorithm, we can find a very accurate solution with a penalty function approach by using a very high r. However, at some high value the algorithm will begin to falter, either taking a very large number of iterations or not reaching the solution. Test fminunc and fminsearch on Example 5.7.1 starting from x0=[2,2]. Start with r=1000 and increase.
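A minimal sketch of how such an r-sweep could be organized; the objective and constraint below are a stand-in toy problem, NOT Example 5.7.1, and all names are assumptions:

f   = @(x) x(1) + x(2);                     % stand-in objective (not Example 5.7.1)
g   = @(x) 2 - x(1)^2 - x(2)^2;             % stand-in constraint, written as g(x) >= 0
phi = @(x,r) f(x) + r*min(0, g(x))^2;       % exterior quadratic penalty function
x0  = [2, 2];
for r = [1e3 1e4 1e5 1e6]                   % gradually increasing penalty multiplier
    [xu, fu] = fminunc(@(x) phi(x,r), x0);      % gradient-based (quasi-Newton, finite-difference gradients)
    [xs, fs] = fminsearch(@(x) phi(x,r), x0);   % derivative-free (Nelder-Mead)
    fprintf('r = %8.0f   fminunc: (%7.4f, %7.4f)   fminsearch: (%7.4f, %7.4f)\n', ...
            r, xu(1), xu(2), xs(1), xs(2));
end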

Solution

5.9: Projected Lagrangian methods

Sequential quadratic programming

Find the direction by solving a quadratic programming subproblem (sketched below).

Find the step length alpha by minimizing a merit function along that direction (sketched below).
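In standard SQP notation (a reconstruction; \mathbf{H}_k denotes an approximation of the Hessian of the Lagrangian), the direction-finding subproblem reads roughly

\begin{aligned}
\underset{\mathbf{d}}{\text{minimize}}\quad & \nabla f(\mathbf{x}_k)^T \mathbf{d} + \tfrac{1}{2}\,\mathbf{d}^T \mathbf{H}_k \mathbf{d} \\
\text{such that}\quad & g_j(\mathbf{x}_k) + \nabla g_j(\mathbf{x}_k)^T \mathbf{d} \ge 0, \qquad j = 1,\dots,n_g,
\end{aligned}

and the step length \alpha is then chosen by minimizing a merit function along the direction, for example a penalty function \phi(\mathbf{x}_k + \alpha\,\mathbf{d}_k) that combines the objective with a measure of constraint violation.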

Matlab function fmincon

FMINCON attempts to solve problems of the form:
    min F(X)  subject to:  A*X <= B, Aeq*X = Beq   (linear constraints)
     X                     C(X) <= 0, Ceq(X) = 0   (nonlinear constraints)
                           LB <= X <= UB           (bounds)

[X,FVAL,EXITFLAG,OUTPUT,LAMBDA] = FMINCON(FUN,X0,A,B,Aeq,Beq,LB,UB,NONLCON)

The function NONLCON accepts X and returns the vectors C and Ceq, representing the nonlinear inequalities and equalities respectively. (Set LB=[] and/or UB=[] if no bounds exist.)

Possible values of EXITFLAG:
  1  First order optimality conditions satisfied.
  0  Too many function evaluations or iterations.
 -1  Stopped by output/plot function.
 -2  No feasible point found.

Quadratic function and constraint example

function f=quad2(x)
% Objective: quadratic with weight a on the second variable
global a
f=x(1)^2+a*x(2)^2;
end

function [c,ceq]=ring(x)
% Nonlinear constraints: keep x in the ring ri <= norm(x) <= ro
global ri ro
c(1)=ri^2-x(1)^2-x(2)^2;   % points inside radius ri are infeasible
c(2)=x(1)^2+x(2)^2-ro^2;   % points outside radius ro are infeasible
ceq=[];                    % no nonlinear equality constraints
end

% Driver: set the global parameters and call fmincon
global a ro ri
x0=[1,10];
a=10; ri=10.; ro=20;
[x,fval]=fmincon(@quad2,x0,[],[],[],[],[],[],@ring)

x =
   10.0000   -0.0000

fval =
  100.0000

With a=10 the objective penalizes x(2) heavily, so the constrained minimum lies on the inner ring at (+-10, 0) with f = 100, which is the solution fmincon returns.

Fuller output

[x,fval,flag,output,lambda]=fmincon(@quad2,x0,[],[],[],[],[],[],@ring)

Optimization completed because the objective function is non-decreasing in feasible directions, to within the default value of the function tolerance, and constraints are satisfied to within the default value of the constraint tolerance.

x =
   10.0000   -0.0000

fval =
  100.0000

flag =
     1

output =
         iterations: 6
          funcCount: 22
       lssteplength: 1
           stepsize: 9.0738e-06
          algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
      firstorderopt: 9.7360e-08
    constrviolation: -8.2680e-11

lambda.ineqnonlin' =
    1.0000         0

Making it harder for fmincon

Changing a from 10 to 1.1 makes the objective nearly circular, so it varies only slowly along the inner ring and fmincon has a harder time converging.

a=1.1;
[x,fval,flag,output,lambda]=fmincon(@quad2,x0,[],[],[],[],[],[],@ring)

Maximum number of function evaluations exceeded; increase OPTIONS.MaxFunEvals.

x =
    4.6355    8.9430

fval =
  109.4628

flag =
     0

         iterations: 14
          funcCount: 202
       lssteplength: 9.7656e-04
           stepsize: 2.2830
          algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
      firstorderopt: 5.7174
    constrviolation: -6.6754
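One way to follow the warning's suggestion is to raise the evaluation limit through an options structure passed as the tenth argument of fmincon (the limit of 1000 below is an arbitrary choice for illustration):

options = optimset('MaxFunEvals', 1000);    % raise the function-evaluation limit
[x,fval,flag,output,lambda] = fmincon(@quad2,x0,[],[],[],[],[],[],@ring,options)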

Restart sometimes helps

x0=x

x0 =
    4.6355    8.9430

[x,fval,flag,output,lambda]=fmincon(@quad2,x0,[],[],[],[],[],[],@ring)

x =
   10.0000    0.0000

fval =
  100.0000

flag =
     1

         iterations: 15
          funcCount: 108
       lssteplength: 1
           stepsize: 4.6293e-04
          algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
      firstorderopt: 2.2765e-07
    constrviolation: -2.1443e-07

Problem: fmincon

For the ring problem with a=10 and ro=20, can you find a starting point within a circle of radius 30 around the origin that will prevent fmincon from finding the optimum? Solution in the notes page.