ATOMS : ANN Toolbox details

ANN Toolbox

(9334/17201 downloads)
Authors: Ryurick M. Hristev, Allan Cornet
Affiliation: Private Individual
Package maintainer: Allan Cornet
Supported Scilab versions: >= 5.4
Creation date: 24th of November 2011
Available on the ATOMS packaging system
This is a toolbox for artificial neural networks, based on the "Matrix ANN" book.

Current features:
- Only layered feedforward networks are supported *directly* at the moment (for others, use the "hooks" provided)
- Unlimited number of layers
- Unlimited number of neurons per layer
- User-defined activation function (defaults to logistic)
- User-defined error function (defaults to SSE)
- Algorithms implemented so far:
  * standard (vanilla) backpropagation, with or without bias, on-line or batch
  * momentum, with or without bias, on-line or batch
  * SuperSAB, with or without bias, on-line or batch
  * conjugate gradients
  * Jacobian computation
  * computation of the product between a "vector" and the Hessian
- Some helper functions provided

For full descriptions, start with the top-level "ANN" man page.

Functions:
ann_FF — algorithms for feedforward nets.
ann_FF_ConjugGrad — conjugate gradient algorithm.
ann_FF_Hess — computes the Hessian by finite differences.
ann_FF_INT — internal implementation of feedforward nets.
ann_FF_Jacobian — computes the Jacobian by finite differences.
ann_FF_Jacobian_BP — computes the Jacobian through backpropagation.
ann_FF_Mom_batch — batch backpropagation with momentum.
ann_FF_Mom_batch_nb — batch backpropagation with momentum (without bias).
ann_FF_Mom_online — online backpropagation with momentum.
ann_FF_Mom_online_nb — online backpropagation with momentum (without bias).
ann_FF_SSAB_batch — batch SuperSAB algorithm.
ann_FF_SSAB_batch_nb — batch SuperSAB algorithm (without bias).
ann_FF_SSAB_online — online SuperSAB training algorithm.
ann_FF_SSAB_online_nb — online SuperSAB training algorithm (without bias).
ann_FF_Std_batch — standard batch backpropagation.
ann_FF_Std_batch_nb — standard batch backpropagation (without bias).
ann_FF_Std_online — online standard backpropagation.
ann_FF_Std_online_nb — online standard backpropagation (without bias).
ann_FF_VHess — multiplication between a "vector" V and the Hessian.
ann_FF_grad — error gradient through finite differences.
ann_FF_grad_BP — error gradient through backpropagation.
ann_FF_grad_BP_nb — error gradient through backpropagation (without bias).
ann_FF_grad_nb — error gradient through finite differences (without bias).
ann_FF_init — initializes the weight hypermatrix.
ann_FF_init_nb — initializes the weight hypermatrix (without bias).
ann_FF_run — runs patterns through a feedforward net.
ann_FF_run_nb — runs patterns through a feedforward net (without bias).
ann_d_log_activ — derivative of the logistic activation function.
ann_d_sum_of_sqr — derivative of the sum-of-squares error.
ann_log_activ — logistic activation function.
ann_pat_shuffle — randomly shuffles patterns for an ANN.
ann_sum_of_sqr — calculates the sum-of-squares error.
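To make the defaults above concrete, here is a minimal Python sketch of the toolbox's default configuration: a layered feedforward net with logistic activation and sum-of-squares error, trained by standard ("vanilla") online backpropagation with bias. This only mirrors the concepts behind ann_FF_init, ann_FF_run, and ann_FF_Std_online; the helper names below (init_weights, run, std_online_step) and their signatures are illustrative assumptions, not the toolbox's actual Scilab API.

```python
import numpy as np

def log_activ(x):
    """Logistic activation (the toolbox's default)."""
    return 1.0 / (1.0 + np.exp(-x))

def d_log_activ(y):
    """Derivative of the logistic, expressed via its output y."""
    return y * (1.0 - y)

def sum_of_sqr(t, y):
    """Sum-of-squares error (the toolbox's default error function)."""
    return 0.5 * np.sum((t - y) ** 2)

def init_weights(layout, rng):
    """Small random weights per layer; layout e.g. [2, 2, 1] (hypothetical helper)."""
    return [rng.uniform(-0.5, 0.5, (n_out, n_in + 1))  # +1 column for the bias
            for n_in, n_out in zip(layout[:-1], layout[1:])]

def run(x, W):
    """Forward pass; returns the activations of every layer."""
    acts = [x]
    for Wl in W:
        x = log_activ(Wl @ np.append(x, 1.0))          # append constant bias input
        acts.append(x)
    return acts

def std_online_step(x, t, W, lr=0.5):
    """One vanilla online backprop update on pattern (x, t); returns the error."""
    acts = run(x, W)
    delta = (acts[-1] - t) * d_log_activ(acts[-1])     # output-layer delta
    for l in range(len(W) - 1, -1, -1):
        grad = np.outer(delta, np.append(acts[l], 1.0))
        if l > 0:                                      # back-propagate delta (bias column excluded)
            delta = (W[l][:, :-1].T @ delta) * d_log_activ(acts[l])
        W[l] -= lr * grad                              # plain gradient-descent update
    return sum_of_sqr(t, acts[-1])

rng = np.random.default_rng(0)
W = init_weights([2, 2, 1], rng)
patterns = [(np.array([a, b]), np.array([float(a and b)]))
            for a in (0, 1) for b in (0, 1)]           # learn the AND function
first = sum(std_online_step(x, t, W) for x, t in patterns)
for _ in range(2000):
    last = sum(std_online_step(x, t, W) for x, t in patterns)
print(first, last)  # the summed error shrinks as training proceeds
```

In the actual toolbox, the equivalent workflow is to build the weight hypermatrix with ann_FF_init, train with one of the ann_FF_*_online/batch functions, and evaluate with ann_FF_run; consult the "ANN" man page for the real argument lists.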
Files (2)
Binary version [164.07 Ko], automatically generated by the ATOMS compilation chain
News (0)
Comments (2)
Comment from Rajive Ganguli -- 14th of June 2012, 02:25:56 AM    
I have basic questions on the NN toolbox.
Do the inputs and outputs have to be normalized to [0,1] or [-1,1] before use? Are there any example Scilab codes for conjugate gradient?
Comment from Елена Рожина -- 18th of November 2013, 07:31:15 PM
Once installed, how do I run the NN toolbox? With ANN_ToolboxEdit();?