
krisp toolbox > LL_driver_optim

LL_driver_optim

Returns the negative log-likelihood of a kriging model as a function of the kriging parameters, for linking with external optimization packages.

Calling Sequence

[y]=LL_driver_optim(x)

Parameters

x:

parameters of the kriging model. For zero noise (model.estimateNoise = 0), x is a (1,n) vector of thetas, where n is the number of dimensions. For non-zero noise (model.estimateNoise = 1), x is a (1,n+1) vector whose first n entries are the thetas and whose last entry, x(n+1), is sigma. For unknown noise and unknown sigma (model.estimateNoise = -1 and model.estimateSigma = -1), x is a (1,n+1) vector whose first n entries are the thetas and whose last entry, x(n+1), is the proportionality coefficient model.alfa.
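For example, in two dimensions (n = 2) the layout of x in each of the three cases looks as follows (a sketch; the numeric values are arbitrary):

```scilab
// n = 2 dimensions; all values are illustrative only

// zero noise (model.estimateNoise = 0): thetas only
x = [1.5 0.8];        // x(1:2) = thetas

// non-zero noise (model.estimateNoise = 1): thetas followed by sigma
x = [1.5 0.8 0.05];   // x(3) = sigma

// unknown noise and sigma (model.estimateNoise = -1, model.estimateSigma = -1)
x = [1.5 0.8 0.9];    // x(3) = proportionality coefficient model.alfa

y = LL_driver_optim(x);  // negative log-likelihood for this parameter set
```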

y:

negative log-likelihood of the kriging model.

Description

Returns the negative log-likelihood of a kriging model as a function of the kriging parameters, for use in optimization packages. The global variable model_kriging must be defined; it describes the kriging model.
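A minimal calling sketch, assuming a kriging model kmodel has already been built with km (as in the examples below):

```scilab
global model_kriging;               // LL_driver_optim reads this global variable
model_kriging = kmodel;             // kmodel built beforehand with km()
y = LL_driver_optim(kmodel.theta);  // negative log-likelihood at the current thetas
```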

Examples

// Examples are based on simplex toolbox v2.1.

function [f]=Costfunction(x)
// wrapper: transposes the column vector passed by the optimizer into the row vector expected by LL_driver_optim
    f = LL_driver_optim ( x' );
endfunction

//2D example of kriging model
// DETERMINISTIC case (no noise, numerically estimates only theta)

function [f]=cmb_2(x)
x1=x(:,1);x2=x(:,2);
f=(4-2.1*x1.*x1+(x1.^4)/3).*x1.*x1+x1.*x2+(-4+4*x2.*x2).*x2.*x2;
endfunction

x_bound=[-1 -1; 1 1];// bounds for the six-hump camel-back function
ftest=cmb_2;
X=RLHS(20,2,x_bound(1,:),x_bound(2,:));
YExp=ftest(X);
noise=0;theta=[1 1];
[kmodel] =km(X,YExp,0,'gaussian',theta,noise);// creates kriging model

global model_kriging; // sets global variable for LL_driver_optim
model_kriging=kmodel;
kmodel.MLL

// defines simplex search
ItMX            = 400;  // maximum number of descent steps
TOL             = 1e-6; // accuracy for convergence test - derivatives.
// The tolerance is tested on the absolute variation of objective function found on the simplex.
// if abs(f_Good - f_Worst)<TOL then stop
Log             = %f;   // print info from line step, %t = on / %f = off
MaxEvalFunc     = 400;
KelleyRestart   = %F;
KelleyAlpha     = 1e-7;
SimplexRelSize  = 0.1;

// We set the initial simplex
x0=kmodel.theta';
Min=ones(x0)*%eps*1000;
Max=2*(x_bound(2,:)-x_bound(1,:))';

x_init(:,1) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);
x_init(:,2) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);
x_init(:,3) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);

[x_opt, x_hist] = optim_nelder_mead(Costfunction, x_init, ItMX, TOL, MaxEvalFunc, Log, KelleyRestart, KelleyAlpha);

[kmodel] =km(X,YExp,0,'gaussian',x_opt',noise);// recreates the kriging model with the parameters found
kmodel.MLL

[kmodel p]=findTheta(kmodel,Min',Max','MLL','RS',400);// compare to random search
kmodel.MLL

///////////////////////////============================================

//2D example of kriging model
// NOISY case (theta, noise and sigma estimated numerically from MLL)

function [f]=cmb_2(x)
x1=x(:,1);x2=x(:,2);
m=size(x,'r');
f=(4-2.1*x1.*x1+(x1.^4)/3).*x1.*x1+x1.*x2+(-4+4*x2.*x2).*x2.*x2 +grand(m,1,'nor',0,0.1);
endfunction

x_bound=[-1 -1; 1 1];// bounds for the six-hump camel-back function
ftest=cmb_2;
X=RLHS(20,2,x_bound(1,:),x_bound(2,:));
YExp=ftest(X);
theta=[1 1];
[kmodel] =km(X,YExp,0,'gaussian',theta);// creates kriging model

global model_kriging; // sets global variable for LL_driver_optim
model_kriging=kmodel;
kmodel.MLL

// defines simplex search
ItMX            = 400;  // maximum number of descent steps
TOL             = 1e-6; // accuracy for convergence test - derivatives.
// The tolerance is tested on the absolute variation of objective function found on the simplex.
// if abs(f_Good - f_Worst)<TOL then stop
Log             = %f;   // print info from line step, %t = on / %f = off
MaxEvalFunc     = 400;
KelleyRestart   = %F;
KelleyAlpha     = 1e-7;
SimplexRelSize  = 0.1;

// We set the initial simplex
x0=[kmodel.theta'; 0.9];
Min=ones(x0)*%eps*1000;
Max=[2*(x_bound(2,:)-x_bound(1,:))'; 1];

x_init=[];x_opt=[];x_hist=[];
x_init(:,1) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);
x_init(:,2) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);
x_init(:,3) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);
x_init(:,4) = x0 + SimplexRelSize*0.5*((Max - Min).*grand(x0,'unf',0,1) + Min);

[x_opt, x_hist] = optim_nelder_mead(Costfunction, x_init, ItMX, TOL, MaxEvalFunc, Log, KelleyRestart, KelleyAlpha);

kmodel.theta=x_opt(1:2)';
kmodel.alfa=x_opt(3);

// rebuild the kriging model with the new parameters
kmodel=CorrMatrix(kmodel);
decompose_estBeta(kmodel);
[kmodel]=EstimateVar(kmodel);
[kmodel]=logLL(kmodel);

kmodel.MLL

// compare to random search
[kmodel p]=findTheta(kmodel,Min(1:$-1)',Max(1:$-1)','MLL','RS',400);
kmodel.MLL

See also

Authors

Bibliography
