Name

step_nelder_mead — A step-by-step Nelder & Mead optimization method

Calling Sequence

[x_next,data_next,eval_Func,f_hist,x_hist] = step_nelder_mead(f_current, x_current, data_current, nm_mode, Log, kelley_restart, kelley_alpha)

Parameters

f_current

the objective function value corresponding to x_current. In 'init' mode, f_current must be a vector of n+1 values, one for each vertex of the simplex

x_current

the initial simplex (an n x (n+1) matrix, in 'init' mode) or the current solution (a single column vector), depending on the value of nm_mode

data_current

the state of the step-by-step Nelder & Mead optimization method. For the initial iteration ('init' mode), this parameter can be empty

nm_mode

the mode of operation of the step-by-step Nelder & Mead method. Can be:
- 'init': for the initial iteration
- 'run': to compute an iteration
- 'exit': to retrieve the final solution

Log

a boolean. If true, some information is displayed during the run of the optimization (optional parameter, false by default)

kelley_restart

a boolean. If true, the simplex is allowed to be recomputed (Kelley restart) when a threshold is reached (optional parameter, false by default)

kelley_alpha

a threshold related to the shape of the simplex (optional parameter, 1e-4 by default). A call using the optional parameters is sketched at the end of this parameter list

x_next

the point at which the step-by-step Nelder & Mead optimization method wants the objective function to be evaluated, or the best solution found so far (in 'exit' mode)

data_next

the state of the step-by-step Nelder & Mead optimization method, to be passed back at the next iteration

eval_Func

the number of evaluations of the objective function

f_hist

the best objective function value at each iteration

x_hist

the state of the simplex at each iteration (its n+1 column vectors)
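
The sketch below illustrates how the optional parameters can be passed at the initial call; the values shown are illustrative, and x_init and f_init are assumed to hold an initial simplex and its objective function values:

// Illustrative initial call with logging and Kelley restart enabled
Log            = %T;    // display information during the run
kelley_restart = %T;    // allow the simplex to be recomputed
kelley_alpha   = 1e-4;  // threshold on the shape of the simplex (default value)
[x_next, data_next, eval_Func, f_hist, x_hist] = ...
    step_nelder_mead(f_init, x_init, [], 'init', Log, kelley_restart, kelley_alpha);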

Description

A step-by-step Nelder & Mead optimization method: the objective function is evaluated by the caller, and its value is passed back to step_nelder_mead at each call. The method is first called in 'init' mode with the initial simplex, then repeatedly in 'run' mode, and finally in 'exit' mode to retrieve the best solution found.
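
A minimal sketch of the calling protocol follows, assuming a user-supplied objective function my_func (a hypothetical name) and a stopping budget MaxEvalFunc; the Examples section below gives a complete run on the Rosenbrock function:

// 'init': pass the initial simplex x_init and its objective values f_init
[x_next, data_next, eval_Func, f_hist, x_hist] = step_nelder_mead(f_init, x_init, [], 'init');
f_current = my_func(x_next);

// 'run': the caller evaluates the objective at x_next and passes the value back
while eval_Func < MaxEvalFunc
  [x_next, data_next, eval_Func, f_hist, x_hist] = step_nelder_mead(f_current, x_next, data_next, 'run');
  f_current = my_func(x_next);
end

// 'exit': retrieve the best solution found so far
[x_best, f_best, eval_Func, f_hist, x_hist] = step_nelder_mead(f_current, x_next, data_next, 'exit');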

Examples

 
//
// The Rosenbrock function
//
function Res = min_bd_rosenbrock()
  Res = [-2 -2]';
endfunction
function Res = max_bd_rosenbrock()
  Res = [2 2]';
endfunction
function Res = opti_rosenbrock()
  Res = [1 1]';
endfunction
function y = rosenbrock(x)
  y = 100*(x(2)-x(1)^2)^2+(1-x(1))^2;
endfunction

ItMX = 100;
TOL  = 1e-4;
MaxEvalFunc = 400;
Min = min_bd_rosenbrock();
Max = max_bd_rosenbrock();
    
x_init(:,1) = (Max - Min).*rand(2, 1) + Min;
x_init(:,2) = (Max - Min).*rand(2, 1) + Min;
x_init(:,3) = (Max - Min).*rand(2, 1) + Min;
    
f_init(1) = rosenbrock(x_init(:,1));
f_init(2) = rosenbrock(x_init(:,2));
f_init(3) = rosenbrock(x_init(:,3));
    
disp(x_init)
    
// Initial iteration
   
[x_next, data_next, eval_Func, f_hist, x_hist] = step_nelder_mead(f_init, x_init, [], 'init');
f_current = rosenbrock(x_next);
printf('step_nelder_mead - Initial iteration: f = %f\n', f_current);

// Optimization loop: the caller evaluates the objective function at x_next
// and passes the value back to step_nelder_mead at the next call
while eval_Func<MaxEvalFunc
  [x_next, data_next, eval_Func, f_hist, x_hist] = step_nelder_mead(f_current, x_next, data_next, 'run');
  f_current = rosenbrock(x_next);
  printf('step_nelder_mead - Evaluation %d: f = %f\n', eval_Func, f_current);
end
    
// Final call: retrieve the best solution found
[x_best, f_best, eval_Func, f_hist, x_hist] = step_nelder_mead(f_current, x_next, data_next, 'exit');
printf('step_nelder_mead: best value found: %f\n', f_best);
printf('step_nelder_mead: number of function evaluations: %d\n', eval_Func);
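
// Optional: visualize the convergence history. f_hist is assumed to be a
// vector holding the best objective value recorded at each iteration.
plot(f_hist);
xtitle('step_nelder_mead convergence history', 'iteration', 'best objective value');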
 

See Also

optim_nelder_mead

Authors

Yann COLLETTE (ycollet@freesurf.fr)