ATOMS : ANN Toolbox details

ANN Toolbox

(4506 downloads for this version - 13311 downloads for all versions)
Details
Version
0.4.2.5
Authors
Ryurick M. Hristev
Allan Cornet
Entity
Private Individual
Maintainer
Allan Cornet
License
Creation Date
November 24, 2011
Source created on
Scilab 5.4.x
Binaries available on
Scilab 5.4.x:
Windows 64-bit Windows 32-bit Linux 64-bit Linux 32-bit MacOSX
Scilab 5.5.x:
Windows 64-bit Windows 32-bit Linux 64-bit Linux 32-bit MacOSX
Install command
--> atomsInstall("ANN_Toolbox")
Description
            This is a toolbox for artificial neural networks, based on
the "Matrix ANN" book.

Current features:
 - Only layered feedforward networks are supported *directly* at the moment
   (for others, use the "hooks" provided)
 - Unlimited number of layers
 - Unlimited number of neurons in each layer
 - User-defined activation function (defaults to logistic)
 - User-defined error function (defaults to SSE)
 - Algorithms implemented so far:
    * standard (vanilla) with or without bias, online or batch
    * momentum with or without bias, online or batch
    * SuperSAB with or without bias, online or batch
    * conjugate gradients
    * Jacobian computation
    * computation of the product between a "vector" and the Hessian
 - Some helper functions provided

For full descriptions, start with the top-level "ANN" man page.
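As a quick orientation, a typical session trains a small net and then runs patterns through it. The sketch below follows the calling conventions described in the toolbox man pages (layer-size vector, weight hypermatrix, patterns as columns); the exact parameter order and learning-parameter vector should be confirmed with "help ann_FF" after installation.

```scilab
// Sketch: train a 2-2-1 net on XOR with standard online backpropagation.
// Signatures are taken from the toolbox man pages; verify with "help ann_FF".
x = [0 0 1 1; 0 1 0 1];      // input patterns, one per column
t = [0 1 1 0];               // target outputs
N = [2 2 1];                 // layer sizes: 2 inputs, 2 hidden, 1 output
W = ann_FF_init(N);          // random initial weight hypermatrix
lp = [2.5, 0];               // learning parameters: rate and threshold
W = ann_FF_Std_online(x, t, N, W, lp, 400);  // train for 400 epochs
y = ann_FF_run(x, N, W);     // compare y against t to check convergence
```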

functions:

ann_FF — Algorithms for feedforward nets.
ann_FF_ConjugGrad — Conjugate Gradient algorithm.
ann_FF_Hess — computes the Hessian by finite differences.
ann_FF_INT — internal implementation of feedforward nets.
ann_FF_Jacobian — computes the Jacobian by finite differences.
ann_FF_Jacobian_BP — computes the Jacobian through backpropagation.
ann_FF_Mom_batch — batch backpropagation with momentum.
ann_FF_Mom_batch_nb — batch backpropagation with momentum (without bias).
ann_FF_Mom_online — online backpropagation with momentum.
ann_FF_Mom_online_nb — online backpropagation with momentum (without bias).
ann_FF_SSAB_batch — batch SuperSAB algorithm.
ann_FF_SSAB_batch_nb — batch SuperSAB algorithm (without bias).
ann_FF_SSAB_online — online SuperSAB training algorithm.
ann_FF_SSAB_online_nb — online SuperSAB training algorithm (without bias).
ann_FF_Std_batch — standard batch backpropagation.
ann_FF_Std_batch_nb — standard batch backpropagation (without bias).
ann_FF_Std_online — standard online backpropagation.
ann_FF_Std_online_nb — standard online backpropagation (without bias).
ann_FF_VHess — multiplication between a "vector" V and the Hessian.
ann_FF_grad — error gradient through finite differences.
ann_FF_grad_BP — error gradient through backpropagation.
ann_FF_grad_BP_nb — error gradient through backpropagation (without bias).
ann_FF_grad_nb — error gradient through finite differences (without bias).
ann_FF_init — initializes the weight hypermatrix.
ann_FF_init_nb — initializes the weight hypermatrix (without bias).
ann_FF_run — runs patterns through a feedforward net.
ann_FF_run_nb — runs patterns through a feedforward net (without bias).
ann_d_log_activ — derivative of the logistic activation function.
ann_d_sum_of_sqr — derivative of the sum-of-squares error.
ann_log_activ — logistic activation function.
ann_pat_shuffle — randomly shuffles patterns for an ANN.
ann_sum_of_sqr — computes the sum-of-squares error.
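The paired gradient routines (ann_FF_grad vs. ann_FF_grad_BP) make it possible to sanity-check backpropagation against finite differences, which is useful when supplying a custom activation or error function. The sketch below assumes the argument order shown in the man pages; confirm with "help ann_FF_grad" and "help ann_FF_grad_BP" before relying on it.

```scilab
// Sketch: cross-check the backpropagation gradient against a
// finite-difference gradient on random patterns.
x = rand(3, 10);                       // 10 random input patterns
t = rand(1, 10);                       // 10 random targets
N = [3 4 1];                           // 3 inputs, 4 hidden, 1 output
W = ann_FF_init(N);                    // random weight hypermatrix
g_bp = ann_FF_grad_BP(x, t, N, W);     // gradient via backpropagation
g_fd = ann_FF_grad(x, t, N, W, 1e-5);  // gradient via finite differences
max(abs(g_bp - g_fd))                  // should be close to zero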

Files (3)
[88.39 kB]
Source code archive

[169.30 kB]
OS-independent binary for Scilab 5.4.x
Description file updated.
[161.41 kB]
OS-independent binary for Scilab 5.5.x
Binary version (all platforms)
Automatically generated by the ATOMS compilation chain

News (0)
Comments (2)
Comment from Rajive Ganguli -- June 14, 2012, 02:25:56 AM    
I have basic questions on the NN toolbox.
Do the inputs and outputs have to be normalized to [0,1] or [-1,1] before use? Are there any example Scilab codes for conjugate gradient?
Answer from Mike Mike -- May 12, 2015, 12:26:28 PM    
> I have basic questions on the NN toolbox.
> Do the inputs and outputs have to be normalized [0,1] or [-1,1] before use? Are there
> any example Scilab codes for conjugate gradient?

It depends on the activation function. Ordinarily the inputs are not limited (no need for normalization), but the output is in the range (0,1). I use my own "line function" and then it is unlimited too.
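Since the default logistic activation bounds each output to (0,1), one common workaround (not from the toolbox itself, just a standard preprocessing step) is to rescale targets into that interval before training and invert the mapping afterwards:

```scilab
// Sketch: map targets into [0.1, 0.9] so the logistic output can reach them.
t_raw = [3.2 7.5 5.1 9.8];                       // original targets
tmin = min(t_raw); tmax = max(t_raw);
t = 0.1 + 0.8 * (t_raw - tmin) / (tmax - tmin);  // scaled targets for training
// after running the net, invert the scaling on its output y:
// y_raw = tmin + (y - 0.1) / 0.8 * (tmax - tmin)
```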
Comment from Елена Рожина -- November 18, 2013, 07:31:15 PM
once installed run NN toolbox? ANN_ToolboxEdit();?
Answer from Mike Mike -- May 12, 2015, 12:28:38 PM    
> once installed run NN toolbox? ANN_ToolboxEdit();?

As far as I am able to judge, it is only a few functions with a lot of limitations (like only one type of activation function for the whole net) :(