uncprb_getopt

Returns the optimum of the given problem.

Calling Sequence

[fopt,xopt] = uncprb_getopt(nprob,n,m)

Parameters

nprob:

a floating point integer, the problem number

n:

the number of variables, i.e. the size of x

m:

the number of functions, i.e. the size of fvec

fopt:

a 1 x 1 matrix of doubles, the minimum of the function

xopt:

an n x 1 matrix of doubles, the optimum of the function

Description

Returns the optimum, according to the paper "Algorithm 566" (Moré, Garbow and Hillstrom, ACM TOMS, 1981). For each problem, this corresponds to the data in section (d). For some problems, the optimum is known only for particular values of m and n. If fopt or xopt is unknown, an error is generated.

When the paper mentions several local optima, we return the first one given in the paper. When the paper gives only a limited number of significant digits, we do the same.

When the paper writes +/-inf, we set +/-%inf.
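
For example, an infinite component in the result can be detected with isinf, as in the following minimal sketch:

// Detect infinite components in the optimum, if any.
nprob = 1;
[n,m,x0] = uncprb_getinitf(nprob);
[fopt,xopt] = uncprb_getopt(nprob,n,m);
if ~isempty(xopt) & or(isinf(xopt)) then
    mprintf("xopt has infinite components.\n");
end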

When the paper gives the function value but not the optimum, we return xopt=[]. This allows the function to return partial information instead of nothing at all.

There are some cases where the optimum is known only for particular values of n or m; this is why n and m are input arguments. If the optimum is not known for the given value of n or m, we generate an error.
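
This error can be caught with try/catch to probe whether the optimum is known for a given combination, as in the following sketch (the helper name isoptknown is ours, for illustration):

// A sketch: probe whether the optimum is known for a given
// (nprob,n,m) by catching the error generated by uncprb_getopt.
function known = isoptknown(nprob,n,m)
    known = %t;
    try
        [fopt,xopt] = uncprb_getopt(nprob,n,m);
    catch
        known = %f;
    end
endfunction
// For example, with Rosenbrock's test case and its default sizes:
isoptknown(1,2,2) // %t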

We emphasize that, when xopt is unknown, there is no way to check that fopt is correct, that is, that it satisfies f(xopt)=fopt and g(xopt)=0.
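
When xopt is known, this check can be sketched as follows, assuming the calling sequences f=uncprb_getobjfcn(n,m,x,nprob) and g=uncprb_getgrdfcn(n,m,x,nprob) (to be checked against the corresponding help pages):

nprob = 1;
[n,m,x0] = uncprb_getinitf(nprob);
[fopt,xopt] = uncprb_getopt(nprob,n,m);
if ~isempty(xopt) then
    // f(xopt) should match fopt and the gradient should vanish.
    f = uncprb_getobjfcn(n,m,xopt,nprob);
    g = uncprb_getgrdfcn(n,m,xopt,nprob);
    mprintf("|f(xopt)-fopt| = %e\n", abs(f-fopt));
    mprintf("||g(xopt)|| = %e\n", norm(g));
end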

All default settings provided by uncprb_getinitf correspond to the parameters used here. This allows checks to be made against the default settings, when possible.

For problem #20 - Watson - the values of the approximate solutions are taken from Brent, "Algorithms for Minimization with Derivatives", Prentice-Hall, 1973, as reproduced by John Burkardt in "test_opt", March 2000. We improved the precision based on a full-precision optimization with optim: the gradient at the updated point is smaller.

For problem #3 - Powell Badly Scaled - a manual optimization was performed and the results are reported to full precision. We claim that the gradient is exactly zero at the optimum.

Examples

// Get optimum for Rosenbrock's test case
nprob = 1;
[n,m,x0]=uncprb_getinitf(nprob);
[fopt,xopt] = uncprb_getopt(nprob,n,m)
// Check that the optimum is known in this case
isknown = ( xopt <> [] )

// Check what is provided for problem #8
nprob = 8;
[n,m,x0]=uncprb_getinitf(nprob);
[fopt,xopt] = uncprb_getopt(nprob,n,m)
// See that fopt is known, but not xopt
isknown = ( xopt <> [] )

Authors
