<< ann_FFBP_gd Neural_Network_Functions ann_FFBP_gdm >>

NeuralNet >> Neural_Network_Functions > ann_FFBP_gda

ann_FFBP_gda

ANN FeedForward Backpropagation Gradient Descent with Adaptive Learning Rate training function.

Calling Sequence

W = ann_FFBP_gda(P,T,N)
W = ann_FFBP_gda(P,T,N,af,lr,lr_inc,lr_dec,itermax,mse_min,gd_min,mse_diff_max)

Parameters

P :

Training input

T :

Training target

N :

Number of neurons in each layer, including the input and output layers

af :

Activation function for each layer, from the first hidden layer to the output layer

lr :

Learning rate

lr_inc :

Factor by which the learning rate is increased

lr_dec :

Factor by which the learning rate is decreased

itermax :

Maximum epoch for training

mse_min :

Minimum error (performance goal)

gd_min :

Minimum Gradient

mse_diff_max :

Maximum allowed MSE increase between epochs, in percent (default 5%)

W :

Output weights and biases

Description

This function performs feedforward backpropagation training using the gradient descent algorithm with an adaptive learning rate. The learning rate is increased while the error keeps decreasing and reduced when the error grows by more than mse_diff_max percent, which speeds up convergence compared to a fixed learning rate.
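The adaptive-rate heuristic described above can be sketched as follows. This is an illustrative Python sketch, not the toolbox implementation; the function name adapt_lr and the default values for lr_inc, lr_dec and mse_diff_max are assumptions for demonstration only.

```python
def adapt_lr(lr, mse_new, mse_old, lr_inc=1.05, lr_dec=0.7, mse_diff_max=5.0):
    """Return (new_lr, accept_step) for one training epoch.

    Hypothetical sketch of the classic adaptive-learning-rate rule:
    grow the rate while the error keeps falling, shrink it and reject
    the weight update when the error rises by more than mse_diff_max
    percent.
    """
    if mse_new > mse_old * (1 + mse_diff_max / 100):
        # Step overshot: shrink the rate and discard the weight update
        return lr * lr_dec, False
    if mse_new < mse_old:
        # Error improved: cautiously grow the rate
        return lr * lr_inc, True
    # Small error increase within tolerance: keep the rate as-is
    return lr, True
```

Training stops when itermax epochs are reached, the MSE falls below mse_min, or the gradient magnitude drops below gd_min.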

Examples

// Train a 2-3-1 network on a simple mapping
P = [1 2 3 4; 1 2 3 4];    // training inputs, one column per sample
T = [1 2 3 4];             // training targets
W = ann_FFBP_gda(P,T,[2 3 1]);
y = ann_FFBP_run(P,W)      // run the trained network on the inputs

See also

Authors

