optim_nelder_mead — A Nelder & Mead optimization method
[x_opt,x_history] = optim_nelder_mead(f, x0, ItMX, Tol, MaxEvalFunc, Log, kelley_restart, kelley_alpha)
f : the objective function to minimize
x0 : the initial simplex (must be an n x (n+1) matrix, i.e. n+1 column points of dimension n)
ItMX : the maximum number of Nelder & Mead iteration steps (optional parameter: ItMX = 100)
Tol : a tolerance on the change of the objective function value between two consecutive iterations (optional parameter: Tol = 0.0)
MaxEvalFunc : the maximum number of objective function evaluations; note that one Nelder & Mead iteration performs several function evaluations (optional parameter: MaxEvalFunc = 10*ItMX)
Log : a boolean. If true, some information is displayed during the run of the optimization (optional parameter: false by default)
kelley_restart : a boolean. If true, the simplex is allowed to be recomputed (restarted) when a threshold has been reached (optional parameter: false by default)
kelley_alpha : a threshold related to the shape of the simplex, used by the restart test (optional parameter: kelley_alpha = 1e-4)
x_opt : the best solution found so far
x_history : the list of the simplexes tested so far (a list of lists of n+1 points, with n the dimension of the parameter vector)
// The Rosenbrock function
function Res = min_bd_rosenbrock()
  Res = [-2 -2]';
endfunction
function Res = max_bd_rosenbrock()
  Res = [2 2]';
endfunction
// The known optimum of the Rosenbrock function
function Res = opti_rosenbrock()
  Res = [1 1]';
endfunction
function y = rosenbrock(x)
  y = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
endfunction

ItMX        = 100;
TOL         = 1e-4;
MaxEvalFunc = 400;

// Build a random initial simplex (a 2 x 3 matrix) within the bounds
Min = min_bd_rosenbrock();
Max = max_bd_rosenbrock();
x_init(:,1) = (Max - Min).*rand(2, 1) + Min;
x_init(:,2) = (Max - Min).*rand(2, 1) + Min;
x_init(:,3) = (Max - Min).*rand(2, 1) + Min;

// Start the optimization
printf('Initial iteration\n');
printf('x_init = '); disp(x_init)
[x_opt, x_history] = optim_nelder_mead(rosenbrock, x_init, ItMX, TOL, MaxEvalFunc);
printf('x_opt = '); disp(x_opt)
printf('f_opt = %f\n', rosenbrock(x_opt));
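As the parameter description notes, one Nelder & Mead iteration performs several objective function evaluations: a reflection, possibly an expansion or a contraction, and a shrink of the whole simplex as a last resort. The following minimal Python sketch of the classic algorithm on the same Rosenbrock function illustrates those steps; it is an illustration of the general method under standard coefficients (reflection 1, expansion 2, contraction 0.5), not the Scilab implementation behind optim_nelder_mead.

```python
def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def nelder_mead(f, simplex, itmx=2000, tol=1e-15):
    # simplex: a list of n+1 points, each a list of n coordinates
    for _ in range(itmx):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        n = len(best)
        # Centroid of every point except the worst
        c = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        # Reflection of the worst point through the centroid
        refl = [c[i] + (c[i] - worst[i]) for i in range(n)]
        if f(refl) < f(best):
            # Expansion: try going further in the same direction
            exp = [c[i] + 2.0 * (c[i] - worst[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contraction: pull the worst point toward the centroid
            con = [c[i] + 0.5 * (worst[i] - c[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:
                # Shrink every point toward the best one
                simplex = [[best[i] + 0.5 * (p[i] - best[i]) for i in range(n)]
                           for p in simplex]
    simplex.sort(key=f)
    return simplex[0]

x_opt = nelder_mead(rosenbrock, [[-2.0, -2.0], [-1.5, -2.0], [-2.0, -1.5]])
print(x_opt)  # the returned point should be close to the optimum [1, 1]
```

The shrink branch is what the kelley_restart option guards against: repeated contractions can collapse the simplex and stall the search, and the restart rebuilds a well-shaped simplex when the kelley_alpha threshold is crossed.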