Name

ANN_JACOB — Function to compute the Jacobian performance quantities (JE, JJ, normJE) of a feed-forward artificial neural network

Calling Sequence

[JE,JJ,normJE [,JX]] = ANN_JACOB(IN,W,IN_W,HID_OUT,ERR_OUT)

Parameters

IN

Input data (Matrix [PxN] where P is the number of input neurons and N the number of input patterns)

W

Weight and bias values (3-dimensional hypermatrix [max(Nhid,M) x max(P+1,Nhid+1) x 2], where Nhid is the number of hidden neurons and M the number of output neurons).

W(1:Nhid,1,1) are the biases of the hidden neurons

W(1:Nhid,2:P+1,1) are the weights of the hidden neurons (P weights for each hidden neuron)

W(1:M,1,2) are the biases of the output neurons

W(1:M,2:Nhid+1,2) are the weights of the output neurons (Nhid weights for each output neuron)

IN_W

Weighted inputs of the hidden layer (as returned by ANN_SIM)

HID_OUT

Outputs from the hidden layer (as returned by ANN_SIM)

ERR_OUT

Errors (target minus network output)

JE

Transposed Jacobian times errors (Matrix [NB_PAR x NB_OUT], where NB_PAR is the number of parameters in the network and NB_OUT the number of output neurons)

JJ

Transposed Jacobian times Jacobian (Matrix [NB_PAR x NB_PAR])

normJE

Transposed JE matrix times JE, i.e. the squared norm of JE

JX

Full Jacobian matrix (optional output)

Description

  • This function is valid only for feed-forward networks with one hidden layer.

    The activation function is the hyperbolic tangent for the hidden layer and the identity for the output layer.

    This function is used by 'ANN_LMBR' to train feed-forward neural networks with the Levenberg-Marquardt algorithm under Bayesian regularization.
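The outputs can be related to the full Jacobian. The Scilab sketch below, for a single-output network, assumes that JX is stored with one row per input pattern and one column per network parameter, and that ERR_OUT is a [1xN] row vector; this layout is an assumption, not stated in the parameter list above:

   // Sketch only: assumes JX is [N x NB_PAR] and ERR_OUT is [1 x N]
   JE     = JX' * ERR_OUT';    // transposed Jacobian times errors
   JJ     = JX' * JX;          // Gauss-Newton approximation of the Hessian
   normJE = JE' * JE;          // squared norm of JE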

Examples

   // Calibration of a network with 6 input nodes, 4 nodes in the hidden layer and 1 output node
   IN   = rand(6,100);
   TARG = rand(1,100);
   W    = rand(4,7,2);
   [OUT,IN_W,HID_OUT] = ANN_SIM(IN,4,1,W);
   ERR_OUT            = TARG-OUT;
   [JE,JJ,normJE]     = ANN_JACOB(IN,W,IN_W,HID_OUT,ERR_OUT);
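The outputs are typically consumed by a Levenberg-Marquardt weight update. A hypothetical continuation of the example above (the actual update, including the Bayesian regularization terms, is performed inside ANN_LMBR):

   // Hypothetical Levenberg-Marquardt step using the outputs above
   mu = 0.01;                      // damping parameter (illustrative value)
   dW = (JJ + mu*eye(JJ)) \ JE;    // update for the vectorized parameters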
  

See Also

ANN_CONV_W , ANN_LMBR , ANN_NORM , ANN_SIM

Authors

Julien Lerat

CEMAGREF Antony, HBAN Unit, julien.lerat@cemagref.fr

Bibliography

MacKay, D.J.C., "Bayesian Interpolation", Neural Computation, vol. 4, no. 3, 1992, pp. 415-447.

Foresee, F.D. and Hagan, M.T., "Gauss-Newton Approximation to Bayesian Learning", Proceedings of the International Joint Conference on Neural Networks, June 1997.