C++ classes for implementing Feedforward, Simple Recurrent and Random-Order Recurrent Neural Nets trained by Backpropagation.
havBpNet++ is focused on implementation of the underlying NN functions/activities rather than the higher-level simulator/UI. It is designed to be fully embeddable; however, it can just as easily be used to implement stand-alone training or consultation applications. The typical application can take advantage of the layer-oriented API to define either simple nets or very large and complicated nets consisting of one or more sub-nets.
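To give a feel for the layer-oriented style of net definition, here is a minimal sketch. The class and function names below are illustrative only - they are NOT the actual havBpNet++ API - but they show the idea of building a net from connected layers:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical layer type: a node count plus the list of layers feeding it.
struct Layer {
    std::size_t nodes;
    std::vector<const Layer*> inputs;  // layers feeding this one
    explicit Layer(std::size_t n) : nodes(n) {}
};

// Register `from` as an input of `to`.
inline void connect(const Layer& from, Layer& to) {
    to.inputs.push_back(&from);
}

// Build a simple in -> hidden -> out net and return its total node count.
inline std::size_t demo_net_nodes() {
    Layer in(4), hidden(8), out(2);
    connect(in, hidden);
    connect(hidden, out);
    return in.nodes + hidden.nodes + out.nodes;
}
```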
Individual and Site Licenses of the havBpNet++(TM) class library can be purchased online using all major credit cards, approved purchase orders or US checks - phone, fax and wire-transfer orders are also available.
Frequently Asked Questions
Is a demo available?
Yes. There are both free and licensed versions of the havBpETT Demo Simulator available. havBpETT uses a DLL version of the havBpNet++ library to implement an Example Trainer/Tester.
How about an online demo?
What sort of restrictions are there in terms of layer size (number of nodes) and number of layers?
There are no practical restrictions imposed by the library itself. Your platform, however, does impose certain limits - for example, the number of nodes in a collection of connected layers cannot exceed the maximum positive long int on your platform. You can still create nets (or composite nets) that total more nodes than this limit by using "Copy" rather than "Direct" connections between layers.
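The long-int ceiling can be checked ahead of time. A minimal sketch, assuming layer sizes are held as long (an illustrative helper, not part of the library):

```cpp
#include <cassert>
#include <climits>
#include <vector>

// Pre-flight check against the platform limit: the total node count of a
// set of directly connected layers must not exceed the maximum positive
// long int.
inline bool fits_direct_connection(const std::vector<long>& layer_sizes) {
    long total = 0;
    for (long n : layer_sizes) {
        if (n > LONG_MAX - total) return false;  // adding n would overflow
        total += n;
    }
    return true;
}
```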
What network parameters are implemented?
havBpNet++ implements all standard feed-forward/backprop parameters, such as
- Beta (learning-rate),
- Mu (momentum) and
- Cascade-coefficient.
Five activation-functions are supported, including
- sin.
Three error functions are supported. These are
- Squared (e^2),
- Cube (e^3) and
- Quad (e^4).
Both pattern-by-pattern and batch training modes are supported. In pattern-by-pattern mode, you may choose either interleaved or non-interleaved weight updates. In batch mode, you have control over the batch size.
Bias is supported flexibly: you can use one bias node for all layers, or a bias node for each layer or some subset of layers.
Typical applications will use the layer API, so parameters are assigned to layers; however, more sophisticated developers can use direct node-access methods to set parameters on a per-node basis.
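As a rough illustration of how these pieces fit together, here are the assumed textbook forms of the error functions and of the Beta/Mu weight update, with e = target - output. This is a sketch of the standard formulas, not the library's internal code:

```cpp
#include <cassert>
#include <cmath>

// Error functions, with e = target - output:
inline double err_squared(double e) { return e * e; }           // e^2
inline double err_cube(double e)    { return e * e * e; }       // e^3
inline double err_quad(double e)    { return e * e * e * e; }   // e^4

// Standard backprop weight update using Beta (learning-rate) and Mu
// (momentum): delta_w = beta * gradient + mu * previous_delta_w.
inline double weight_delta(double beta, double mu,
                           double grad, double prev_delta) {
    return beta * grad + mu * prev_delta;
}
```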
Is Recurrency supported?
In addition to supporting the standard cascade-coefficient,
havBpNet++ supports two forms of recurrency. First, the typical weighted-copy
recurrency (a la Jordan or Elman). Second, a layer may be connected as input
to itself or to "lower" layers in the net. This form of recurrency is supported
with Random-Ordered train and cycle messages that cause the nodes in a layer to
be processed in a random order, thus reducing the skew effect that would otherwise occur.
Full Backprop-through-time is not yet implemented but will be supported in
future versions of havBpNet++.
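The random-order processing can be pictured as shuffling the layer's node indices before each pass. A minimal sketch - illustrative only, not the library's code:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <numeric>
#include <random>
#include <vector>

// Visit a layer's node indices in shuffled order, so that no fixed node
// ordering systematically skews the recurrent updates.
inline std::vector<int> random_node_order(int n_nodes, unsigned seed) {
    std::vector<int> order(static_cast<std::size_t>(n_nodes));
    std::iota(order.begin(), order.end(), 0);  // 0, 1, ..., n_nodes-1
    std::mt19937 rng(seed);
    std::shuffle(order.begin(), order.end(), rng);
    return order;
}
```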
How are layers saved?
havBpNet++ is not tied to a specific database. As delivered, networks are saved to flat files, which avoids the need for additional DB support. If you have a preferred DB, it should be fairly simple to modify the Save and Restore methods to use the DB's API.
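For example, if saving writes weights through a stream, pointing the same bytes at a DB API instead of a file is a small change. A hypothetical sketch - the format below is invented for illustration and is not the actual havBpNet++ file layout:

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Serialize weights to text: the result could go to a flat file via
// fstream, or be handed to a DB API as a blob.
inline std::string save_weights(const std::vector<double>& weights) {
    std::ostringstream out;
    for (double w : weights) out << w << '\n';
    return out.str();
}

// Read the same text back into a weight vector.
inline std::vector<double> restore_weights(const std::string& text) {
    std::istringstream in(text);
    std::vector<double> weights;
    double w;
    while (in >> w) weights.push_back(w);
    return weights;
}
```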
How are sub-nets connected together?
havBpNet++ supports both direct connection and intermediate-Copy connections between sub-nets.
What platforms can I use?
In designing havBpNet++ we chose to avoid using platform-specific foundation classes. By doing this, we have been able to attain a high level of portability. To date, havBpNet++ has been used on
- PC (DOS, Win3.1, NT, UNIX),
- HP-9000 and
... using native C++ compilers and (in most cases) g++ as well.
PC NOTE: Windows QuickWin applications may use the standard form of havBpNet++ as delivered; however, certain restrictions imposed by DLLs required that we develop a special DLL version of the library. The main change involved converting standard file I/O calls (like fprintf and fscanf) to file-streams. The normal distribution of havBpNet++ includes this special DLL version at no additional cost.
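To illustrate the kind of conversion involved, here is a weight record written with C stdio next to the equivalent file-stream (iostream) code. The record format is hypothetical, chosen only to show that the two produce the same bytes:

```cpp
#include <cassert>
#include <cstdio>
#include <sstream>
#include <string>

// C stdio form: format a "node weight" record with snprintf
// (stand-in for fprintf to a FILE*).
inline std::string record_stdio(int node, double weight) {
    char buf[64];
    std::snprintf(buf, sizeof buf, "%d %.6f\n", node, weight);
    return buf;
}

// File-stream form: the same record via iostream formatting
// (stand-in for writing to an std::ofstream).
inline std::string record_stream(int node, double weight) {
    std::ostringstream out;
    out.setf(std::ios::fixed);  // match %f
    out.precision(6);           // match the .6 in %.6f
    out << node << ' ' << weight << '\n';
    return out.str();
}
```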
There is a free version of the Win 3.1/NT Demo Simulator available that uses the DLL version of the havBpNet++ library to implement an Example Trainer/Tester. Several version builds are available.
We would be glad to send you any further information that you might need. Do you by any chance have access to FrameMaker or can you read postscript or .pdf files? If so, we can send some manual excerpts.
Alternatively, you might give us a call at (281) 341-5035 to discuss your application and how the hav.Software NN libraries can help.