builds a neural network with one hidden layer and bias terms
[result]=nns_buildbayes(wh_in,wo_in,x,y, (options))
coefficients of the hidden neurons + bias, obtained from the initialization step
wh_in is a matrix of dimensions ((q+1) x nh) or a Div structure
nh is the number of hidden neurons
coefficients of the output neurons + bias, obtained from the initialization step
wo_in is a matrix of dimensions ((nh+1) x no) or a Div structure
no is the number of output neurons
calibration dataset
x is a matrix of dimensions (n x q) or a Div structure
reference values, to be predicted
y is a matrix of dimensions (n x no) or a Div structure
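The weight layouts above can be illustrated with a minimal Python sketch of the forward pass (the toolbox itself is MATLAB; the position of the bias row and the tanh hidden activation are assumptions, not documented here):

```python
import numpy as np

def forward(wh, wo, x):
    """One-hidden-layer network with bias terms (sketch).

    wh : (q+1, nh) hidden-layer weights, last row assumed to be the bias
    wo : (nh+1, no) output-layer weights, last row assumed to be the bias
    x  : (n, q) calibration inputs
    Returns the (n, no) matrix of predictions.
    """
    n = x.shape[0]
    xb = np.hstack([x, np.ones((n, 1))])   # append bias column to inputs
    h = np.tanh(xb @ wh)                   # hidden activations (tanh assumed)
    hb = np.hstack([h, np.ones((n, 1))])   # append bias column to hidden layer
    return hb @ wo                         # linear output layer
```

With q = 3 inputs, nh = 3 hidden neurons and no = 2 outputs, wh is (4 x 3), wo is (4 x 2), and the result for n = 5 samples is (5 x 2).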
options for the building of the model
options_in.maxtime: max. run time, in seconds (default=300)
options_in.maxiter: max. number of iterations (default=10000)
options_in.displayfreq: number of iterations between two successive points plotted in the progress figure (default=10)
options_in.precresid: accuracy of the residuals; a scalar or a vector of length no (default=1e-6*stdout)
options_in.precparam: accuracy of the parameters; a scalar or a vector of length nc (default=1e-4)
options_in.stdresmin: minimum of the standard errors of the residuals; a scalar or a vector of length no (default=1e-6*stdout)
options_in.stdresmax: maximum of the standard errors of the residuals; a scalar or a vector of length no (default=0.1*stdout)
options_in.regclass: weights regularization
… 0 = no regularization
… 1 = a class for all weights
… 2 = two classes: one for the inputs, the other for the outputs
… 3 = a class for each input, a class for the bias of the hidden layer and a class for each output (by default)
… 4 = a class for each weight
options_in.preproc: preprocessings before learning
… 0 = no preprocessing
… 1 = normalization (sum of squares of the deviations of the inputs and of the outputs = n)
… 2 = standardization (inputs mean = outputs mean = 0, sum of squares of the residuals = n)
options_in.momentparam: momentum tuning of the effective number of parameters in each class; between 0 and 1 (default = 0.8)
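For preproc = 2, a plausible reading is that each column is centered to mean 0 and scaled so that its sum of squares equals n; a Python sketch under that assumption:

```python
import numpy as np

def standardize(x):
    """Sketch of preproc = 2: center each column to mean 0, then scale so
    the sum of squares of the centered values equals n.
    This interpretation is an assumption, not taken from the toolbox."""
    xc = x - x.mean(axis=0)
    # dividing by the population std (ddof=0) makes sum(z**2) per column = n
    return xc / xc.std(axis=0)
```

Applied to both x and y before learning, this puts all inputs and outputs on a comparable scale, which matters for the per-class regularization of the weights.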
coefficients of the hidden neurons + bias, after model calculation
result.wh_out is a matrix of dimensions ((q+1) x nh) or a Div structure
nh is the number of hidden neurons
coefficients of the output neurons + bias, after model calculation
result.wo_out is a matrix of dimensions ((nh + 1) x no) or a Div structure
no is the number of output neurons
estimation of the standard error of the residuals
result.stdres is a Div structure
result.stdres.d is a vector of dimensions (1 x no)
estimation of the variance-covariance matrix for the weights
result.covw is a Div structure
result.covw.d is a matrix of dimensions (nw x nw)
result.options_out: options_in completed with the following fields:
result.options_out.stop: a string message about the learning stop
result.options_out.r2: squared correlation coefficient for each output
result.options_out.wheff: ratio of effective to total number of parameters for each weight of the hidden layer
result.options_out.wheff.d is a matrix of dimensions ((q+1) x nh)
result.options_out.woeff: ratio of effective to total number of parameters for each weight of the output layer
result.options_out.woeff.d is a matrix of dimensions ((nh+1) x no)
result.options_out.histiters: number of iterations
result.options_out.histresid: residual errors for each iteration
result.options_out.histresid.d is a matrix with no rows and as many columns as iterations
result.options_out.histparam: effective number of parameters, for each class
result.options_out.histparam.d has as many columns as classes and as many rows as iterations
result.options_out.classnumbers: class number of each weight, ranked in a vector according to [wh(:);wo(:)]
result.options_out.totalparam: total number of parameters (weights) in each class of weights
result.options_out.totalparam.d is a row vector with as many columns as classes
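The [wh(:);wo(:)] ordering used by classnumbers can be reproduced for the default regclass = 3 grouping (one class per input row of wh, one class for the hidden bias row, one class per output column of wo); a Python sketch, with MATLAB's column-major flattening made explicit (the exact grouping is an assumption based on the option descriptions above):

```python
import numpy as np

def regclass3(q, nh, no):
    """Class index of each weight under regclass = 3 (assumed grouping).

    wh is (q+1, nh): rows 0..q-1 are the inputs (one class each),
    row q is the hidden-layer bias (its own class).
    wo is (nh+1, no): each output column is one class.
    Order follows [wh(:); wo(:)], i.e. column-major flattening.
    """
    # column-major wh(:): the row pattern 0..q repeats once per hidden neuron
    wh_cls = np.tile(np.arange(q + 1), nh)
    # column-major wo(:): each output's class repeats nh+1 times
    wo_cls = np.repeat(q + 1 + np.arange(no), nh + 1)
    return np.concatenate([wh_cls, wo_cls])
```

Counting the occurrences of each class index (e.g. with np.bincount) then gives totalparam, the total number of weights in each class.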