havBpNet++ - C++ classes for implementing Feedforward, Simple Recurrent and Random-Order Recurrent Neural Nets trained by Backpropagation.
Neural Net class libs - C++ and Java havBpNet++™

havBpNet++™ focuses on implementing the underlying neural-net functions and activities rather than a higher-level simulator or UI. It is designed to be fully embeddable; however, it can just as easily be used to implement stand-alone training or consultation applications. A typical application can take advantage of the layer-oriented API to define either simple nets or very large and complicated nets consisting of one or more sub-nets.
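
This page does not reproduce the library's actual class or method names, so the following is only a minimal sketch of the layer-oriented idea under assumed names (Layer, forward): layers own their weights and are chained output-to-input to form a net.

```cpp
// Illustrative sketch only -- not the havBpNet++ API. Each layer owns its
// weights; a net is just layers chained together.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

struct Layer {
    std::vector<std::vector<double>> w;   // w[j][i]: weight from input i to node j
    std::vector<double> out;
    Layer(std::size_t nIn, std::size_t nOut)
        : w(nOut, std::vector<double>(nIn, 0.1)), out(nOut) {}
    const std::vector<double>& forward(const std::vector<double>& in) {
        for (std::size_t j = 0; j < w.size(); ++j) {
            double sum = 0.0;
            for (std::size_t i = 0; i < in.size(); ++i) sum += w[j][i] * in[i];
            out[j] = 1.0 / (1.0 + std::exp(-sum));   // logistic-sigmoid activation
        }
        return out;
    }
};

int main() {
    Layer hidden(2, 3), output(3, 1);     // a simple 2-3-1 feedforward net
    std::vector<double> x = {1.0, 0.0};
    std::printf("net output: %f\n", output.forward(hidden.forward(x))[0]);
}
```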

Frequently Asked Questions

Is a demo available?

Yes. There are both free and licensed versions of the havBpETT Demo Simulator available. havBpETT uses a DLL version of the havBpNet++™ library to implement an Example Trainer/Tester.
 
How about an online demo?

Yes. We made a little JavaScript demo (long ago, when JavaScript was first introduced) which performs consultation of a 3-layer feedforward net to solve a simple parity problem, and of a 3-D Feature Map which classifies 5x7 bitmaps of the uppercase Roman alphabet. The FF net was trained using the havBpNet++™ Class Library and the FM net was trained using the havFmNet++ Class Library.

You're welcome to play with the JavaScript FF and FM Consultation online demo.
 
What sort of restrictions are there in terms of layer size (number of nodes) and number of layers?

There are no practical restrictions imposed by the library itself.

Your platform, however, does impose certain restrictions - for example, the number of nodes in a collection of connected layers cannot exceed the maximum positive long int on your platform. You can still create nets (or composite nets) that total more nodes than this limit by using "Copy" rather than "Direct" connections between layers, as sketched below.
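
The connection API itself is not shown on this page, so the following only sketches the idea behind a "Copy" connection: the upstream sub-net's outputs are copied into an intermediate buffer that the downstream sub-net reads as input, so neither sub-net's internal node indexing has to span both nets.

```cpp
// Illustrative sketch only -- not the havBpNet++ API. An intermediate copy
// decouples the index spaces of two sub-nets.
#include <algorithm>
#include <vector>

int main() {
    std::vector<double> subnetA_out = {0.2, 0.8, 0.5};  // upstream outputs
    std::vector<double> subnetB_in(subnetA_out.size()); // downstream inputs
    std::copy(subnetA_out.begin(), subnetA_out.end(), subnetB_in.begin());
    // subnetB would now be cycled using subnetB_in as its input pattern
}
```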
 


What network parameters are implemented?

havBpNet++™ implements all standard feed-forward/backprop parameters such as
  • Beta (learning-rate),
  • Mu (momentum),
  • Cascade-coefficient and
  • weight-decay.
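
Beta, Mu and weight-decay enter the standard backprop weight update in the usual way; the cascade-coefficient's role is library-specific and is omitted from this sketch:

\[
\Delta w_{ij}(t) = \beta\,\delta_j\,x_i + \mu\,\Delta w_{ij}(t-1) - \lambda\,w_{ij}(t)
\]

where \(\beta\) is the learning rate, \(\mu\) the momentum, \(\lambda\) the weight-decay coefficient, \(\delta_j\) the backpropagated error at node j, and \(x_i\) the input from node i.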

Five activation functions are supported. These are
  • Linear,
  • Logistic-Sigmoid,
  • Hyperbolic-tangent,
  • Sine and
  • Hermite.
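
The first four of these have standard definitions; the exact form of the library's Hermite activation is not given on this page, so in the sketch below it is shown as the second-order (physicists') Hermite polynomial purely as a placeholder.

```cpp
// Standard forms for four of the activations; the Hermite form is an
// assumption (H2(x) = 4x^2 - 2), not taken from havBpNet++ documentation.
#include <cmath>
#include <cstdio>

double actLinear(double x)   { return x; }
double actLogistic(double x) { return 1.0 / (1.0 + std::exp(-x)); }
double actTanh(double x)     { return std::tanh(x); }
double actSin(double x)      { return std::sin(x); }
double actHermite(double x)  { return 4.0 * x * x - 2.0; }  // placeholder form

int main() {
    std::printf("logistic(0.5) = %f, tanh(0.5) = %f\n",
                actLogistic(0.5), actTanh(0.5));
}
```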

Three error functions are supported. These are
  • Squared (e^2),
  • Cube (e^3) and
  • Quad (e^4).
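
The page gives only the powers, so the following reading is an assumption rather than a documented formula: an error measure of the form

\[
E_p = \sum_k \lvert e_k \rvert^{p}, \qquad e_k = t_k - o_k, \qquad p \in \{2, 3, 4\},
\]

with gradient

\[
\frac{\partial E_p}{\partial o_k} = -\,p\,\lvert e_k \rvert^{p-1}\,\operatorname{sgn}(e_k),
\]

where the absolute value keeps the gradient's sign correct for the odd (cube) power. The higher powers weight large errors more heavily.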

Both pattern-by-pattern and batch training modes are supported. In pattern-by-pattern mode, you may choose either interleaved or non-interleaved weight updates. In batch mode, you have control over the batch size.

Bias is supported in such a way that you can use one bias node for all layers, a bias node for each layer, or a bias node for some sub-set of layers.

Typical applications will utilize the layer API, so parameters are assigned to layers; however, more sophisticated developers can use direct node-access methods to set parameters on a per-node basis, as sketched below.
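
The node-access methods themselves are not shown on this page; the sketch below only illustrates the layer-default/per-node-override semantics the paragraph describes, under assumed names (Layer, Node, betaFor).

```cpp
// Illustrative sketch only -- not the havBpNet++ API. A layer carries a
// default learning rate; a non-negative per-node value overrides it.
#include <cstddef>
#include <cstdio>
#include <vector>

struct Node { double beta = -1.0; };        // < 0 means "use the layer value"

struct Layer {
    double beta = 0.25;                     // layer-wide learning rate
    std::vector<Node> nodes;
    double betaFor(std::size_t n) const {
        return nodes[n].beta >= 0.0 ? nodes[n].beta : beta;
    }
};

int main() {
    Layer l;
    l.nodes.resize(3);
    l.nodes[1].beta = 0.05;                 // per-node override
    for (std::size_t n = 0; n < l.nodes.size(); ++n)
        std::printf("node %zu beta = %.2f\n", n, l.betaFor(n));
}
```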
 


Is recurrency supported?

In addition to supporting the standard cascade-coefficient, havBpNet++™ supports two forms of recurrency. First, the typical weighted-copy recurrency (a la Jordan or Elman). Second, a layer may be connected as input to itself or to "lower" layers in the net. This second form is supported with Random-Ordered train and cycle messages that cause the nodes in a layer to be processed in a random order, thus reducing the skew effect that would otherwise occur.
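
A minimal sketch of the random-order idea (everything in the code below is illustrative, not the library's implementation): each cycle, the layer's node indices are reshuffled before processing, so no fixed update sequence systematically feeds stale or fresh activations to the same nodes.

```cpp
// Illustrative sketch of random-order node processing.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    std::vector<std::size_t> order(5);
    std::iota(order.begin(), order.end(), 0);       // 0, 1, 2, 3, 4
    std::mt19937 rng(std::random_device{}());
    std::shuffle(order.begin(), order.end(), rng);  // fresh order each cycle
    for (std::size_t n : order)
        std::printf("update node %zu\n", n);        // process in shuffled order
}
```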

Full Backprop-through-time is not yet implemented but will be supported in future versions of havBpNet++™.
 


How are layers saved?

havBpNet++™ is not tied to a specific database. As delivered, networks are saved to flat files and thus avoid the requirement for additional DB support. If you have a preferred DB, it should be a fairly "simple" thing for you to modify the Save and Restore methods to utilize the DB's API.
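
The actual Save and Restore signatures are not reproduced here; under that caveat, a flat-file save/restore pair might reduce to something like the following, with the stream operations being the part you would swap for DB API calls.

```cpp
// Illustrative sketch only -- not the havBpNet++ Save/Restore methods.
#include <fstream>
#include <vector>

void saveWeights(const std::vector<double>& w, const char* path) {
    std::ofstream f(path);
    for (double v : w) f << v << '\n';   // replace with DB writes if preferred
}

void restoreWeights(std::vector<double>& w, const char* path) {
    std::ifstream f(path);
    for (double& v : w) f >> v;          // replace with DB reads if preferred
}

int main() {
    std::vector<double> w = {0.1, -0.3, 0.7};
    saveWeights(w, "net.weights");
    restoreWeights(w, "net.weights");
}
```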
 


How are sub-nets connected together?

havBpNet++™ supports both direct connections and intermediate-Copy connections between sub-nets, as sketched under the layer-size question above.
 


What platforms can I use?

In designing havBpNet++™ we chose to avoid using platform-specific foundation classes. By doing this, we have been able to attain a high level of portability. To date, havBpNet++™ has been used on
  • PC (DOS, Win3.1, NT, UNIX),
  • SUN,
  • IBM-RS/6000,
  • HP-9000 and
  • SGI.
... using native C++ compilers and (in most cases) g++ as well.

PC NOTE: Windows QuickWin applications may use the standard form of havBpNet++™ as delivered; however, certain restrictions imposed by DLLs required that we develop a special DLL version of the library. The main changes required involved converting standard file I/O calls (like fprintf and fscanf) to file-streams, as illustrated below. The normal distribution of havBpNet++™ includes this special DLL version at no additional cost.
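
For concreteness, this is the kind of before/after the note describes (the function names here are illustrative, not the library's):

```cpp
// Standard build uses C stdio calls; the DLL build uses C++ file-streams.
#include <cstdio>
#include <fstream>

void saveStdio(double w) {
    std::FILE* f = std::fopen("net.dat", "w");
    if (!f) return;
    std::fprintf(f, "%f\n", w);   // fprintf/fscanf style
    std::fclose(f);
}

void saveStream(double w) {
    std::ofstream f("net.dat");
    f << w << '\n';               // file-stream style
}

int main() { saveStdio(0.5); saveStream(0.5); }
```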


A free version of the Win 3.1/NT Demo Simulator uses the DLL version of the havBpNet++™ library to implement an Example Trainer/Tester; several builds are available.


Copyright © 1994-2017 by hav.Software and Horace "Kicker" Vallas. All Rights Reserved.

hav.Software, havBpNet:J, havFmNet:J, havBpNet++, havFmNet++, havBpETT, havCNet, WebSnarfer, havIndex and havChat are all trademarks of hav.Software.

Java and all Java-based marks are trademarks or registered trademarks of Sun Microsystems, Inc. in the U.S. and other countries.

There may be other trademarks or tradenames listed in this document to refer to the entities claiming the marks and names or products. hav.Software disclaims any proprietary interest in any trademark, tradename or products other than its own.


Page Modified Wed Sep 14 21:03:20 CDT 2011