Home Page


History:

..2015..

We have started moving the neural network engine into a library interfaced with the IDE of the POOL programming language.

v0.9.5.2

- Fixed a bug in saving error values in the MomentumOptimize algorithm.

v0.9.5.1

- Small example project added to the application zip archive.
- Added alias for Math.Sign() function: sign().

v0.9.5.0

- Scaled Conjugate Gradient training algorithm added.
- Other Conjugate Gradient algorithms cleaned and reorganized.
- Fill plot style added (nice, but quite slow, so some optimization remains to be done).

v0.9.4.1 - v0.9.4.3

- Added softmax output layer and cross-entropy error function. These two are intended to work together in classification tasks. Please use target values 1.0 and 0.0; support for continuous P(class) targets is still to be done. A short illustrative sketch follows this list.
- Added Graph Container block. It allows graphs to be saved in the project file and restored automatically (the next time the project is opened, or on a trigger signal from other blocks in the project).
- Some other fixes and small changes; the code is now more thread-safe.
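
A minimal illustrative sketch of the softmax + cross-entropy combination (this is not the application's code; the class and method names are made up for the example). Softmax turns the raw outputs of the last layer into probabilities that sum to 1, and the cross-entropy error with 1.0/0.0 targets penalizes the network only on the probability assigned to the true class:

    // Sketch only: softmax output combined with cross-entropy error for 1.0/0.0 targets.
    using System;
    using System.Linq;

    static class SoftmaxCrossEntropySketch
    {
        // Softmax: converts raw outputs into class probabilities that sum to 1.
        static double[] Softmax(double[] z)
        {
            double max = z.Max();                       // subtract the maximum for numerical stability
            double[] e = z.Select(v => Math.Exp(v - max)).ToArray();
            double sum = e.Sum();
            return e.Select(v => v / sum).ToArray();
        }

        // Cross-entropy error for targets t[k] in {0.0, 1.0}: E = -sum_k t[k] * ln(p[k]).
        static double CrossEntropy(double[] p, double[] t)
        {
            double error = 0.0;
            for (int k = 0; k < p.Length; k++)
                error -= t[k] * Math.Log(p[k] + 1e-12); // small epsilon guards against ln(0)
            return error;
        }

        static void Main()
        {
            double[] rawOutputs = { 2.0, 0.5, -1.0 };   // hypothetical raw outputs of the last layer
            double[] targets    = { 1.0, 0.0, 0.0 };    // 1.0/0.0 targets, as recommended above
            double[] p = Softmax(rawOutputs);
            foreach (double v in p) Console.Write("{0:F4} ", v);
            Console.WriteLine();
            Console.WriteLine("E = {0:F4}", CrossEntropy(p, targets));
        }
    }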

v0.9.3.1 - v0.9.3.4

- Fixed parameter naming (conjugate gradients algorithm).
- TCP communication updated, ROOT code and example updated (but these parts still need some work to be really robust).
- Fix to the regularization in Cascade-Correlation networks (option with no bias regularization).
- Minor update in the graph layout (ordering of the curves).

v0.9.3.0

- Added plot-data connection so all plots can be updated without being recreated; Reload Data and Live Update functions added to the graph context menu.
- More information on training progress added to Net Error graphs.
- Added option to enable/disable curves on plots.

v0.9.2.1 - v0.9.2.8

- Added index (index of the event) and size (size of the data set) variables for the expression parser.
- Fix in the math and filter expression parser (some function alias names were not processed correctly).
- Fixed histogram creation when all entries have the same value.
- Minor fix to the Transform block (saving the lock option to file).
- A few fixes to the QSVD and ICA transformations (reading a matrix from the project file was causing an application crash in some configurations).
- Fix to the Cascade-Correlation network (reading a network file with run type set to JustRun was crashing the application).
- Fix to the dynamic structure algorithm (bug was occasionally stopping the training with an error message).
- Minor fix to the Classify block.
- Minor fix to graph display code.
- Fix to the OBS algorithm.

- Regularization (weight decay) added for Cascade-Correlation networks.
- Moved to .NET Framework 3.5 (required by the new SVM library).
- LIBSVM updated to 2.89.
- More descriptions for algorithm parameters.
- Minor optimizations in quick-prop training.

- Some fixes in the Hessian calculations using the finite differences algorithm.
- Algorithm parameters display changed (Setup and Go! dialog windows).
- Changes in the project file format (but old files should be parsed with no problems).

v0.9.2.0

- Regularization (weight decay) added for MLP networks.
- Bayesian framework for network output uncertainty estimation and regularization factor optimization added (only for function approximation; support for classification tasks is still to be implemented).
- A few steps towards network committees (multiple network instances can be trained, but the output is not averaged yet).
- More options in graph layout.

v0.9.1.7

- One more fix.

v0.9.1.6

- Correction of the total iteration counter for the Cascade-Correlation networks.

v0.9.1.5

- Network ASCII file format changed; this format is now the default in load/save operations. Support for reading old ASCII and binary files will be removed in version 1.0.
- Added limits on the number of hidden units and on the network coefficients (all growing networks).
- Added a limit on the total number of training iterations for Cascade-Correlation networks.
- Removed the split neuron option from the dynamic MLP training setup (this feature is now always turned on).
- Extended graph layout control (point, line and error style setup).

v0.9.1.2

- RMLP code cleaned up and extended - the recurrent loop may now start at any hidden or output layer.
- Progress bar in AutoStop training mode shows the progress of the stopping criterion instead of iterations.

v0.9.1.1

- Extended graph layout control (x-y grid enable/disable, grid colors, font colors...).
- Formatting added for graph titles, axis descriptions, and labels: HTML markup for subscript (<sub>), superscript (<sup>), italic (<i>), bold (<b>), and underline (<u>).
- Math expression parser added for histograms and scatter plots (in place of simple variable selection).
- Graph opacity and new drawing styles added.
- Scatter plots extended - error expression added.
- Some reorganizations in the user interface.

v0.9.0.5

- User-defined error function.
- Levenberg-Marquardt training (still under development; only feed-forward MLPs are supported for now).
- Further user interface improvements.
- New project files are saved in a refined XML format (*.NetPrj). Just open and save a project to convert it. Support for the old format will be removed in version 1.0.

v0.9.0.1

Small corrections in the user interface behaviour.

v0.9.0.0

Edit your old project files (*.NetPrj) with any text editor: replace all , (comma) symbols with ; (semicolon).

Formatting string syntax changed: entries are separated with ; (semicolon).

- math expression parser and compiler added; C# syntax is used; it is available in all filter-expression text boxes (the expression should evaluate to a boolean value), in the Output Format tab of all Setup dialog windows (the expression should return a numerical value), and in the weight variable of the network training setup; remember that single-precision floating point format is used for data vector elements, so the correct comparison is "t1 == 0.05F" (without "F" the number is treated as double precision and is slightly different from 0.05F) - see the sketch after this list; vector elements are accessed in the standard way: "i1", "o5", ...; all math functions have intuitive aliases: sin(), log(), sqrt(), ...;
- user interface changed: toolbar, output format tab in dialog windows;
- SVM library changed to v2.84;
- background rejection vs efficiency mode added to signal selection plots;
- log-scale added to plots;
- some SSE4.1 instruction set optimizations.
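
To illustrate the precision remark above (a standalone C# snippet with a made-up class name, not the parser itself): data vector elements are stored in single precision, so a literal without the "F" suffix is compared as a double, and the binary representations of 0.05 as float and double do not match.

    // Sketch only: why the "F" suffix matters when comparing single-precision elements.
    using System;

    static class FloatCompareSketch
    {
        static void Main()
        {
            float t1 = 0.05F;               // data vector elements use single precision

            Console.WriteLine(t1 == 0.05F); // True  - both sides are single precision
            Console.WriteLine(t1 == 0.05);  // False - t1 is widened to double, and the binary
                                            //         representations of 0.05 as float and double differ
        }
    }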

v2.09.2007b

- Epoch size may be adjusted for the steepest-descent training algorithm - a specified number of training vectors is randomly chosen at each iteration; for epoch size = 1 this is equivalent to on-line training (local approach). A sketch of this scheme follows this list.
- Training vectors are shuffled partially after each epoch.
- Weight variable may be specified for training vectors; this weight should be a non-negative number, and vectors with lower weight are less important during training.
- Support for SVM classification/regression added (uses SVM.NET C# port of LIBSVM code).
- Single filter condition replaced with an unlimited sequence of conditions encoded in C-like syntax, for example: (i1>2)&(i2<i3)&((o1>=0.5)|(t1==0.95)). This applies to: 1) Forwarding tabs in Transformation, Classification and Network blocks; 2) the filter variable of different plots; 3) the Save Data tab page of the DataSet block. The project file format has changed - the forwarding setup needs to be redone in old files.
- Many user interface changes: training/testing set selection moved to Connection Add dialog window; Transformation and Classification blocks have simplified and more uniform setup dialog windows; multiple training sets may be used with Classification block.
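
As a sketch of the adjustable epoch size described in the first item of this list (the method names and the gradient routine below are hypothetical placeholders, not the application's code): at each iteration a random subset of the training vectors is drawn, the gradient is averaged over that subset, and a steepest-descent step is taken; with an epoch size of 1 this reduces to on-line updates.

    // Sketch only: steepest descent with an adjustable epoch size.
    using System;

    static class EpochSizeSketch
    {
        static readonly Random Rng = new Random();

        // Hypothetical placeholder: a real trainer would back-propagate the error
        // of training vector 'vectorIndex' through the network.
        static double[] GradientForVector(double[] weights, int vectorIndex)
        {
            return new double[weights.Length];
        }

        static void TrainIteration(double[] weights, int trainingSetSize,
                                   int epochSize, double learningRate)
        {
            double[] gradient = new double[weights.Length];

            // Randomly choose 'epochSize' training vectors (with replacement, for simplicity).
            for (int n = 0; n < epochSize; n++)
            {
                int idx = Rng.Next(trainingSetSize);
                double[] g = GradientForVector(weights, idx);
                for (int j = 0; j < weights.Length; j++)
                    gradient[j] += g[j] / epochSize;   // average over the chosen subset
            }

            // Steepest-descent update; epochSize == 1 corresponds to on-line training.
            for (int j = 0; j < weights.Length; j++)
                weights[j] -= learningRate * gradient[j];
        }

        static void Main()
        {
            double[] weights = new double[10];
            TrainIteration(weights, 1000, 1, 0.01);    // epoch size 1: one random vector per step
            Console.WriteLine("first weight after one step: " + weights[0]);
        }
    }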

v29.06.2007e

- Neuron splitting added to the network growth algorithm (manual updated).
- Hold and Save phases training control options added.
- Fixed issue: filter variable of the Scatter Plot (2D) set to "nxx" had no effect.
- Improved detection of the training-stop conditions (for configurations with testing set connected during the training).
- Default values of the training algorithms changed.
- Small changes in the user interface.

v21.10.2006d

- Twin neurons measure modified - previous values should be divided by 2 to get similar results.
- Tuning of the neuron candidates' initial weights.
- Fixed issue with log messages.
- Number of weights removed with OBS in one shot is now limited: total Lq < 5 Lq threshold and max 10 weights; then the network is retrained. This is due to a possible large error jump when many weights have Lq << threshold.
- OBS code optimized (really small improvement).

v21.10.2006

- Changes in the algorithm for adding new neurons: if a new neuron was accepted and then killed, subsequent new neuron candidates are not tried in the same layer. Initial weights (to the following layer) are randomized using a range corresponding to the existing connection weights (this makes the "spikes" on the error(iterations) plot higher; don't worry about that).

v4.10.2006a

- ITanh error function added; use this function if you expect outliers or gross errors in the training data. I recommend trying this function, especially if MSE has been the best choice for your task until now.

v4.10.2006

- Algorithm for joining twin neurons improved (more pairs are detected); default neuron candidate pool size changed to 5 - pre-training of candidates is quite fast now, so this value seems reasonable; some other optimizations in the network structure reduction procedures.
- Optimal Brain Surgeon added (weight elimination technique). It may be applied at the end of the training to reduce the network size (sometimes whole neurons are removed), or it may support the existing size adjustment algorithm when called iteratively before each attempt to remove redundant neurons. Be aware: this is a time- and memory-consuming technique - O(N^3 * M), where N is the number of network coefficients and M is the training set size. The code will be optimized in further versions. Some examples and a description will appear in the manual soon. (A brief note on the saliency measure follows this list.)
- SSE3 instructions (P4 Prescott and above) are now used in a few places; not particularly faster, but why not use them when they are available...
- Log window is now resizable and shows some more events; the order of events is reversed (newest are at the top).
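
For readers unfamiliar with OBS: in the standard Hassibi-Stork formulation (to which the quantity Lq used here and in the v21.10.2006d entry above appears to correspond), the saliency of weight w_q is

    L_q = w_q^2 / (2 * [H^-1]_qq)

where H is the Hessian of the error with respect to the network coefficients. Weights with the smallest saliency are removed first, and the remaining weights are corrected using the inverse Hessian; maintaining and applying that N x N matrix is what makes the technique expensive for large networks.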

 

older entries removed