ANN_LMBR — Function to train a feed-forward artificial neural network with one hidden layer.
[W,OUT,RMSE,[SSE,GAMK,SSX]] = ANN_LMBR(IN,TARG,Nhid,Wini,[EPOCH,EpochShow,GraphFin,...])
IN: Input data (matrix [PxN] where P is the number of input neurons and N the number of input patterns)
TARG: Target data (matrix [MxN] where M is the number of output neurons and N the number of input patterns)
Nhid: Number of neurons in the hidden layer
Wini: Initial weight and bias values (3-dimensional matrix [max(Nhid,M) x max(P+1,Nhid+1) x 2]).
Wini(1:Nhid,1,1) are the biases of the hidden neurons
Wini(1:Nhid,2:P+1,1) are the weights of the hidden neurons (P weights for each hidden neuron)
Wini(1:M,1,2) are the biases of the output neurons
Wini(1:M,2:Nhid+1,2) are the weights of the output neurons (Nhid weights for each output neuron)
EPOCH: Number of training epochs (should be > 2). Default = 30
EpochShow: Periodicity of results display during network calibration. Default = 10
GraphFin: Graphical display of calibration progress (%T or %F). Default = %T
W: Final weight and bias values (same matrix structure as Wini).
OUT: Final network outputs (matrix [MxN] where M is the number of output neurons and N the number of input patterns)
RMSE: Root mean square error of the final outputs compared with the targets
SSE: Series of SSE values (one value per epoch)
GAMK: Series of GAMK values (one value per epoch)
SSX: Series of SSX values (one value per epoch)
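The indexing of the Wini/W hypermatrix can be illustrated with a short NumPy sketch (0-based indexing, unlike Scilab's 1-based indexing; the layer sizes P, Nhid and M are arbitrary examples):

```python
import numpy as np

P, Nhid, M = 6, 4, 1                       # illustrative layer sizes
rng = np.random.default_rng(0)
Wini = rng.random((max(Nhid, M), max(P + 1, Nhid + 1), 2))

# Page 1 (hidden layer): column 0 holds the biases, columns 1..P the weights.
b1 = Wini[:Nhid, 0, 0]                     # Scilab: Wini(1:Nhid,1,1)
W1 = Wini[:Nhid, 1:P + 1, 0]               # Scilab: Wini(1:Nhid,2:P+1,1)

# Page 2 (output layer): column 0 holds the biases, columns 1..Nhid the weights.
b2 = Wini[:M, 0, 1]                        # Scilab: Wini(1:M,1,2)
W2 = Wini[:M, 1:Nhid + 1, 1]               # Scilab: Wini(1:M,2:Nhid+1,2)

print(W1.shape, W2.shape)                  # (4, 6) (1, 4)
```

Unused rows and columns of the hypermatrix (when Nhid differs from M, or P+1 from Nhid+1) are simply padding.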
The activation function is the hyperbolic tangent for the hidden layer and the identity for the output layer.
The objective function to be minimized is the Sum of Squared Errors (SSE).
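How the outputs, SSE and RMSE relate can be sketched in NumPy (an illustration of the documented network structure, not the toolbox code; the sizes are arbitrary):

```python
import numpy as np

def forward(W1, b1, W2, b2, IN):
    """Hidden layer: tanh; output layer: identity (as documented above)."""
    H = np.tanh(W1 @ IN + b1[:, None])     # Nhid x N hidden activations
    return W2 @ H + b2[:, None]            # M x N network outputs

rng = np.random.default_rng(0)
P, Nhid, M, N = 6, 4, 1, 100
IN, TARG = rng.random((P, N)), rng.random((M, N))
W1, b1 = rng.random((Nhid, P)), rng.random(Nhid)
W2, b2 = rng.random((M, Nhid)), rng.random(M)

OUT = forward(W1, b1, W2, b2, IN)
E = TARG - OUT
SSE = np.sum(E**2)                         # objective minimized during training
RMSE = np.sqrt(SSE / E.size)               # root mean square error reported on exit
```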
The training algorithm is the Levenberg-Marquardt algorithm with Bayesian regularization.
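A hedged sketch of one update of this type: Bayesian regularization minimizes beta*SSE + alpha*SSW (SSW = sum of squared weights), and Levenberg-Marquardt damps the resulting Gauss-Newton step. The Jacobian J, error vector e and hyperparameters alpha, beta, mu below are stand-ins; the toolbox may compute and adapt them differently. GAMK plausibly corresponds to MacKay's effective number of parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
Nw, Ne = 10, 100                         # number of weights / error terms (illustrative)
w = rng.random(Nw)                       # all weights and biases, flattened
J = rng.random((Ne, Nw))                 # Jacobian of the errors w.r.t. the weights (stand-in)
e = rng.random(Ne)                       # flattened error vector TARG - OUT (stand-in)
alpha, beta, mu = 0.01, 1.0, 0.1         # regularization weight, data weight, LM damping

H = beta * (J.T @ J) + alpha * np.eye(Nw)      # Gauss-Newton approximation of the Hessian
g = beta * (J.T @ e) - alpha * w               # gradient of the regularized objective
dw = np.linalg.solve(H + mu * np.eye(Nw), g)   # damped Levenberg-Marquardt step
w_new = w + dw

# MacKay's effective number of parameters (a guess at what GAMK tracks):
gamk = Nw - alpha * np.trace(np.linalg.inv(H))
```

In a full implementation, mu is increased when a step fails to reduce the objective and decreased when it succeeds, and alpha and beta are re-estimated from gamk at each epoch.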
// Calibration of a network with 6 input nodes, 4 nodes in the hidden layer and 1 output node
IN = rand(6,100);
TARG = rand(1,100);
Wini = rand(4,7,2);
[W,OUT,RMSE] = ANN_LMBR(IN,TARG,4,Wini);