Constructor for CMA-ES, a (stochastic) non-convex function minimizer
param = cma_new()            // return default parameters
param = cma_new([])          // return default parameters quietly
es = cma_new(param)          // return initialized CMA-ES object
[es param] = cma_new(param)  // also return the actually used parameter setting
struct (or list) of parameters. param=cma_new() returns a struct with the valid field names, their default values, and a short description in param.readme; param=cma_new([]) does the same quietly.
Mandatory parameters that must be set by the user are (i) param.x0, the initial solution point (a column vector), and (ii) the scalar param.sigma0, the initial search standard deviation acting on param.x0, typically about 1/3 of the typical variable ranges. param.opt_scaling_of_variables must be used if the variable ranges differ substantially.
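A minimal initialization sketch (the five-dimensional problem and the scaling values are illustrative; that opt_scaling_of_variables takes a per-variable scaling vector is an assumption):

p = cma_new([]);         // get the default parameters quietly
p.x0 = zeros(5,1);       // mandatory: initial solution, a column vector
p.sigma0 = 0.3;          // mandatory: initial standard deviation on x0
// if, say, the fifth variable ranges over [0,100] while the others
// stay in [0,1], rescale it (assumed per-variable scaling vector):
p.opt_scaling_of_variables = [1; 1; 1; 1; 100];
es = cma_new(p);         // construct the optimizer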
param.verb controls verbosity. Once the objective function is debugged and the returned solution(s) are satisfactory, verb.logmodulo=0 (no data saving) and verb.displaymodulo=0 (no screen output) make the code completely quiet (and faster).
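For instance, a fully quiet setup (field names as referenced above):

p.verb.logmodulo = 0;     // no data saving
p.verb.displaymodulo = 0; // no screen output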
tlist of type 'cma', an optimizer following the OMD specifications: the defined functions are X=cma_ask(es), es=cma_tell(es,X,y), and [y x]=cma_best(es) (each also available with a leading '%'), together with the boolean field es.stop. Caveat: for the time being X is a list of lambda column vectors, not a matrix; see the example and the short sketch below.
struct of the actually used parameters
output collected in a struct (only meaningful after some iterations on es).
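Since X is a list rather than a matrix, candidates are extracted by list indexing; a minimal sketch (es is assumed to be an initialized object as above):

X = cma_ask(es);      // X is a list, not a matrix
lambda = length(X);   // the population size is the list length
x1 = X(1);            // first candidate solution: a column vector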
CMA-ES is a stochastic optimizer for nonlinear, non-convex objective functions. It can make perfect sense to re-run the optimizer several times. The initial seed is recorded in es.out.seed, and calling rand('seed', es.out.seed) beforehand will reproduce the same run.
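A sketch of reproducing a run from its recorded seed (p denotes the same parameter struct used for the first run):

rand('seed', es.out.seed); // reset the generator to the recorded seed
es2 = cma_new(p);          // re-initialize with the same parameters
// running the ask-tell loop on es2 now reproduces the original run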
The verbosity can be controlled via param.verb (see above). Running the code quietly is recommended only after extensive testing. For longer runs, setting verb.plotmodulo=0 and plotting the data from a second Scilab terminal (using the function cma_plot) is well worth considering.
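For example (the zero-argument call to cma_plot is an assumption about its default behavior, not a documented signature):

p.verb.plotmodulo = 0;  // disable plotting inside the optimization run
// ... run the optimization ...
// meanwhile, in a second Scilab terminal:
cma_plot();             // assumed default call: plot the logged data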
// see demos/runcma.sce

// get acquainted
p = cma_new();      // return (default) parameters
disp(p);            // investigate default parameters
p.readme            // display doc for parameters
p.stop              // investigate the possible and default termination criteria
clear p             // we do not need the defaults for initialization

// initialization
p.x0 = rand(10,1);  // mandatory: initial solution vector
p.sigma0 = 0.3;     // mandatory: initial standard deviation on x0
                    //   (somewhat of a search range)
es = cma_new(p);    // get a cma "object", constructor call
fitfun = ...        // objective function needs to be defined

// iteration loop
while ~cma_stop(es)
  X = cma_ask(es);          // list of solution points (column vectors)
  y = [];                   // just in case
  for i = 1:length(X)
    y(i) = fitfun(X(i));    // evaluate each point
  end
  es = cma_tell(es, X, y'); // update for the next iteration
end

// output
[ybest, xbest] = cma_best(es);
printf("\n x(1) = %f  y = %f\n", xbest(1), ybest);
for str = es.out.stopflags
  disp('  ' + str);         // display termination reasons
end
disp(es.out);               // further investigate some useful output
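The example above leaves fitfun undefined; one possible stand-in is the sphere function (purely illustrative, any mapping from a column vector to a scalar will do):

function y = fitfun(x)
  // sphere function: minimal test objective, minimum at x = 0
  y = sum(x.^2);
endfunction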
Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.), Towards a New Evolutionary Computation: Advances in Estimation of Distribution Algorithms, pp. 75-102, Springer.
Hansen, N. and S. Kern (2004). Evaluating the CMA Evolution Strategy on Multimodal Test Functions. Eighth International Conference on Parallel Problem Solving from Nature PPSN VIII, Proceedings, pp. 282-291, Berlin: Springer. (http://www.bionik.tu-berlin.de/user/niko/ppsn2004hansenkern.pdf)
Hansen, N., S.D. Mueller and P. Koumoutsakos (2003). Reducing the Time Complexity of the Derandomized Evolution Strategy with Covariance Matrix Adaptation (CMA-ES). Evolutionary Computation, 11(1). (http://mitpress.mit.edu/journals/pdf/evco_11_1_1_0.pdf).
Hansen, N. and A. Ostermeier (2001). Completely Derandomized Self-Adaptation in Evolution Strategies. Evolutionary Computation, 9(2), pp. 159-195. (http://www.bionik.tu-berlin.de/user/niko/cmaartic.pdf).