step_nelder_mead — A step-by-step Nelder & Mead optimization method
[x_next,data_next,eval_Func,f_hist,x_hist] = step_nelder_mead(f_current, x_current, data_current, nm_mode, Log, kelley_restart, kelley_alpha)
f_current : the objective function value corresponding to x_current. In 'init' mode, f_current must be a vector of n+1 values, one for each vertex of the simplex
x_current : the initial simplex (an n x (n+1) matrix) or the current solution (a column vector), depending on the value of nm_mode
data_current : the state of the step-by-step Nelder & Mead optimization method. For the initial iteration ('init' mode), this parameter can be empty
nm_mode : the mode of operation of the step-by-step Nelder & Mead method. Can be:
- 'init': for the initial iteration
- 'run': during the computation of an iteration
- 'exit': to retrieve the final solution
Log : a boolean. If true, some information is displayed during the run of the optimization (optional parameter: false by default)
kelley_restart : a boolean. If true, the simplex is allowed to be recomputed when a threshold has been reached (optional parameter: false by default)
kelley_alpha : a threshold related to the shape of the simplex (optional parameter: kelley_alpha = 1e-4 by default)
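Given the calling sequence above, a sketch of an initial call that sets the optional arguments explicitly might look as follows (the argument values here are illustrative, not recommended settings):

```scilab
// Hypothetical 'init' call: Log enabled (%t), Kelley restart enabled (%t),
// and the restart threshold loosened from its 1e-4 default to 1e-3.
[x_next, data_next, eval_Func, f_hist, x_hist] = ...
    step_nelder_mead(f_init, x_init, [], 'init', %t, %t, 1e-3);
```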
x_next : the point at which the step-by-step Nelder & Mead method wants the objective function to be evaluated, or the best solution found so far (in 'exit' mode)
data_next : the state of the step-by-step Nelder & Mead optimization method, to be passed back in at the next iteration
eval_Func : the number of evaluations of the objective function
f_hist : the best objective function value at each iteration
x_hist : the state of the simplex at each iteration (an n x (n+1) matrix)
```scilab
// The Rosenbrock function and its bounds
function Res = min_bd_rosenbrock()
  Res = [-2 -2]';
endfunction

function Res = max_bd_rosenbrock()
  Res = [2 2]';
endfunction

function Res = opti_rosenbrock()
  Res = [1 1]';
endfunction

function y = rosenbrock(x)
  y = 100*(x(2)-x(1)^2)^2 + (1-x(1))^2;
endfunction

ItMX = 100;
TOL  = 1e-4;
MaxEvalFunc = 400;

// Build a random initial simplex (3 vertices in 2D) within the bounds
Min = min_bd_rosenbrock();
Max = max_bd_rosenbrock();
x_init(:,1) = (Max - Min).*rand(2, 1) + Min;
x_init(:,2) = (Max - Min).*rand(2, 1) + Min;
x_init(:,3) = (Max - Min).*rand(2, 1) + Min;
f_init(1) = rosenbrock(x_init(:,1));
f_init(2) = rosenbrock(x_init(:,2));
f_init(3) = rosenbrock(x_init(:,3));
disp(x_init)

// Initial iteration
[x_next, data_next, eval_Func, f_hist, x_hist] = step_nelder_mead(f_init, x_init, [], 'init');
f_current = rosenbrock(x_next);
printf('step_nelder_mead - Initial iteration: f = %f\n', f_current);

// Start the optimization
while eval_Func < MaxEvalFunc
  [x_next, data_next, eval_Func, f_hist, x_hist] = step_nelder_mead(f_current, x_next, data_next, 'run');
  f_current = rosenbrock(x_next);
  printf('step_nelder_mead - Iteration %d: f = %f\n', eval_Func, f_current);
end

// Last iteration
[x_best, f_best, eval_Func, f_hist, x_hist] = step_nelder_mead(f_current, x_next, data_next, 'exit');
printf('step_nelder_mead: best value found: %f\n', f_best);
printf('step_nelder_mead: nb of function evaluations: %d\n', eval_Func);
```
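Assuming f_hist is returned as described above (one best objective value per iteration), the convergence history from the example can be inspected after the run, for instance:

```scilab
// Plot the best objective value per iteration to visualize convergence.
// f_hist is assumed to hold one value per iteration of the example above.
plot(f_hist);
xtitle('step_nelder_mead on Rosenbrock', 'iteration', 'best objective value');
```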