NeuroSolutions Feature Comparison

Features are compared across four product levels: NeuroSolutions for Excel*, Users, Consultants, and Developers.

Topologies

| Topology | Excel* | Users | Consultants | Developers |
|---|---|---|---|---|
| Linear Regression | X | X | X | X |
| Multilayer Perceptron (MLP) | X | X | X | X |
| Generalized Feedforward Network | X | X | X | X |
| Probabilistic Neural Network (PNN) | X | X | X | X |
| Modular Network |  | X | X | X |
| Jordan / Elman Networks |  | X | X | X |
| Self-Organizing Map (SOM) |  | X | X | X |
| Principal Component Analysis (PCA) |  | X | X | X |
| Radial Basis Function (RBF) |  | X | X | X |
| General Regression Neural Network (GRNN) |  | X | X | X |
| Neuro-Fuzzy Network (CANFIS) |  | X | X | X |
| Support Vector Machine Network |  | X | X | X |
| Support Vector Machine Regression Network |  | X | X | X |
| Hopfield Network |  |  | X | X |
| Time Delay Neural Network (TDNN) |  |  | X | X |
| Time-Lag Recurrent Network (TLRN) |  |  | X | X |
| General Recurrent Network |  |  | X | X |

Network Capacity

| Capacity | Excel* | Users | Consultants | Developers |
|---|---|---|---|---|
| Maximum Number of Inputs / Outputs / Neurons Per Layer | 50 | 500 | Unlimited | Unlimited |
| Maximum Number of Hidden Layers | 2 | 6 | Unlimited | Unlimited |

The per-layer maximum indicates the number of allowed inputs and outputs, including the weights on each hidden layer.

Learning Paradigms

| Learning Paradigm | Excel* | Users | Consultants | Developers |
|---|---|---|---|---|
| Backpropagation | X | X | X | X |
| Unsupervised Learning |  | X | X | X |
| Recurrent Backpropagation |  |  | X | X |
| Backpropagation Through Time |  |  | X | X |

Unsupervised Learning includes Hebbian, Oja's, Sanger's, Competitive, and Kohonen learning.
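All of the supervised paradigms above build on the basic backpropagation rule: run a forward pass, compute the error, and push its gradient back through the layers. As a generic illustration of what that step computes (a minimal NumPy sketch, not NeuroSolutions code), here is a one-hidden-layer perceptron trained on XOR:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: learn XOR with a 2-4-1 MLP.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule through the squared error and the sigmoids.
    d_out = (out - y) * out * (1 - out)   # error at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # error at hidden pre-activation

    # Gradient-descent update (averaged over the batch).
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The recurrent variants in the table (Recurrent Backpropagation, Backpropagation Through Time) apply the same chain rule, but propagate the error backwards through time steps as well as through layers.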

Optimization Techniques

| Optimization Technique | Excel* | Users | Consultants | Developers |
|---|---|---|---|---|
| Genetic Optimization |  | X | X | X |
| Input Optimization: Greedy Search |  | X | X | X |
| Input Optimization: Back-Elimination |  | X | X | X |
| Input Projection Component |  |  | X | X |

Genetic Optimization allows you to optimize virtually any parameter in a neural network to produce the lowest error. For example, the number of hidden units, the learning rates, and the input selection can all be optimized to improve network performance. Individual weights in the neural network can even be updated through Genetic Optimization as an alternative to traditional training methods.

Input Optimization: Greedy Search is a type of input optimization in which the search terminates as soon as adding a single input to the current input set fails to improve the fitness.

Input Optimization: Back-Elimination is a type of input optimization in which the search terminates as soon as removing a single input from the current input set worsens the fitness.

Input Projection Component reduces input dimensionality by automatically mapping multiple pieces of information to single inputs. The available algorithms include Principal Component Analysis, Multidimensional Scaling, K-Means Clustering, Locally Linear Embedding, and Self-Organizing Map.
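Greedy Search as described above is a forward-selection procedure, and Back-Elimination is its mirror image. A minimal sketch of the forward variant follows; the `fitness` function here is a hypothetical stand-in for the trained-network performance NeuroSolutions would actually evaluate:

```python
# Hypothetical fitness: higher is better. A known "useful" subset of
# input indices stands in for inputs that genuinely improve a network.
USEFUL = {0, 2, 5}

def fitness(inputs: frozenset) -> float:
    reward = len(inputs & USEFUL)           # useful inputs help
    penalty = 0.1 * len(inputs - USEFUL)    # noise inputs hurt slightly
    return reward - penalty

def greedy_input_search(n_inputs: int) -> frozenset:
    selected = frozenset()
    best = fitness(selected)
    while True:
        # Try adding each remaining input; keep the best single addition.
        candidates = [(fitness(selected | {i}), i)
                      for i in range(n_inputs) if i not in selected]
        score, pick = max(candidates)
        if score <= best:          # no single addition improves fitness:
            return selected        # terminate immediately (greedy search)
        selected, best = selected | {pick}, score

print(sorted(greedy_input_search(8)))  # selects exactly the useful inputs
```

Back-Elimination runs the same loop in reverse: start from the full input set and remove inputs one at a time, stopping as soon as every single removal makes the fitness worse.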

Gradient Descent Methods

| Gradient Descent Method | Excel* | Users | Consultants | Developers |
|---|---|---|---|---|
| Resilient Backpropagation (RProp) | X | X | X | X |
| Step / Momentum | X | X | X | X |
| Delta Bar Delta | X | X | X | X |
| Quickprop | X | X | X | X |
| Conjugate Gradient | X | X | X | X |
| Levenberg-Marquardt | X | X | X | X |

Levenberg-Marquardt is a second-order learning algorithm that generally trains significantly faster than Momentum learning and usually arrives at a solution with a significantly lower error. It is also supported by NeuroSolutions' CUDA GPU processing.
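Levenberg-Marquardt gets its second-order behavior by blending the Gauss-Newton method with gradient descent through a damping factor on the normal equations. As a rough illustration of the mechanics (a generic least-squares fit, not NeuroSolutions' implementation), here it fits a two-parameter exponential model:

```python
import numpy as np

# Fit y = a * exp(b * x) to noisy data with Levenberg-Marquardt.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 30)
y = 2.0 * np.exp(1.5 * x) + rng.normal(scale=0.01, size=x.shape)

def residuals(p):
    a, b = p
    return a * np.exp(b * x) - y

def jacobian(p):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])   # d r/d a,  d r/d b

p = np.array([1.0, 1.0])   # initial guess
lam = 1e-3                 # damping factor
for _ in range(50):
    r, J = residuals(p), jacobian(p)
    # Damped normal equations: (J^T J + lam * I) dp = -J^T r
    A = J.T @ J + lam * np.eye(2)
    dp = np.linalg.solve(A, -J.T @ r)
    if np.sum(residuals(p + dp) ** 2) < np.sum(r ** 2):
        p, lam = p + dp, lam * 0.5   # accepted: behave more like Gauss-Newton
    else:
        lam *= 10.0                  # rejected: behave more like gradient descent

print(p)  # close to the true parameters [2.0, 1.5]
```

Small damping makes the step nearly Gauss-Newton (fast near a minimum); large damping makes it a short gradient-descent step (robust far from a minimum), which is why the method usually converges faster than plain Momentum learning.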

Advanced Features

| Advanced Feature | Excel* | Users | Consultants | Developers |
|---|---|---|---|---|
| Weight Regularization | X | X | X | X |
| Exemplar Weighting | X | X | X | X |
| Sensitivity Analysis | X | X | X | X |
| Macros / OLE Automation | X | X | X | X |
| Iterative Prediction |  |  | X | X |
| User-Defined Neural Components (using DLLs) |  |  |  | X |
| ANSI C++ Source Code Generation |  |  |  | X |

The purpose of Weight Regularization is to prevent the learning from overfitting the training set by discouraging weights from adapting to relatively large values.

Exemplar Weighting improves training for data with unequal class distributions.

Sensitivity Analysis is a technique for determining the most influential inputs.

Macros / OLE Automation is the API used to automate and control NeuroSolutions.

Iterative Prediction is an advanced method for time-series prediction.

User-Defined Neural Components (using DLLs) allow you to integrate your own algorithms into NeuroSolutions through user-defined dynamic link libraries (DLLs). Every NeuroSolutions component implements a function conforming to a simple protocol in C. To add a new component, you simply modify the template function for the base component and compile the code into a DLL -- all directly from NeuroSolutions!

ANSI C++ Source Code Generation: The source code generation facility of NeuroSolutions is as robust as its object-oriented design environment. No matter how simple or complex a network you create within the graphical user interface, NeuroSolutions will generate the equivalent neural network in ANSI C++ source code -- even for networks that contain your own algorithms implemented with DLLs. The generated network can be trained beforehand within the graphical design environment of NeuroSolutions or from within your C++ application.
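Sensitivity analysis of the kind listed above is commonly done by nudging one input at a time across the dataset and measuring how much the model output moves. A minimal, model-agnostic sketch (the `model` here is a hypothetical stand-in for a trained network):

```python
import numpy as np

def sensitivity(model, X, delta=0.1):
    """Mean absolute output change when each input is nudged by +/- delta."""
    scores = []
    for j in range(X.shape[1]):
        up, down = X.copy(), X.copy()
        up[:, j] += delta
        down[:, j] -= delta
        scores.append(np.mean(np.abs(model(up) - model(down))))
    return np.array(scores)

# Hypothetical trained model: the output depends strongly on input 0,
# weakly on input 1, and not at all on input 2.
model = lambda X: 3.0 * X[:, 0] + 0.5 * X[:, 1]

X = np.random.default_rng(2).normal(size=(100, 3))
print(sensitivity(model, X))  # input 0 ranks highest; input 2 scores zero
```

Inputs with near-zero sensitivity scores are candidates for removal, which is why this analysis pairs naturally with the input-optimization searches described earlier.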

* NeuroSolutions for Excel can also be used as an add-on product to higher levels of NeuroSolutions to take advantage of the higher-level features.