Here is a list of things to do:
- Improvements to backpropagation (delta-bar-delta, quickprop, resilient propagation, batch mode, etc.)
- Import training sets from images, databases, URLs, etc.
- Java code samples showing how to use the Neuroph library for typical tasks such as classification, recognition, approximation, prediction, etc.
- Support for test sets in easyNeurons, so that after learning we can quickly test how the network behaves on unseen data
- Automated environment for training and testing networks, with test set/training set generators
- Support for modular neural networks
- Learning rule for the RBF layer in the RBF network
- Use of genetic algorithms in learning
- All sorts of visualizations of learning, weights, datasets, etc.
- Visual editor for neural networks which supports free drawing of neural networks in graph view: add, edit, delete, and inspect network components (neurons, layers, connections), drag and drop, component palette, etc.
- Separate and develop an application logic layer: a set of classes that provide an API for using the Neuroph library in the development of other applications. This includes classes that act as controllers, monitors, wrappers, containers, etc. (these can be extracted from the existing easyNeurons application)
- More neural network architectures and learning rules: ART, Counterpropagation, Elman, Interactive Activation Controller, Optimal Brain Damage (all others are welcome too)
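As a rough illustration of one of the backpropagation improvements listed above, here is a minimal, self-contained sketch of the resilient propagation (Rprop) weight update rule, in the simple variant without weight backtracking. This is generic illustrative code, not Neuroph's implementation; the class name, method names, and constants (the step growth/shrink factors and step limits) are assumptions chosen for the sketch.

```java
/**
 * Minimal sketch of the Rprop (resilient propagation) weight update.
 * Rprop adapts a per-weight step size from the sign of the gradient
 * only, ignoring its magnitude: grow the step while the gradient keeps
 * its sign, shrink it when the sign flips (which signals an overshoot).
 * Hypothetical illustration, not Neuroph code.
 */
public class RpropSketch {
    static final double ETA_PLUS = 1.2, ETA_MINUS = 0.5;   // assumed factors
    static final double STEP_MAX = 50.0, STEP_MIN = 1e-6;  // assumed limits

    /** Updates weight[i] in place; returns the adapted step size. */
    static double rpropUpdate(double[] weight, int i,
                              double grad, double prevGrad, double step) {
        double sign = grad * prevGrad;
        if (sign > 0) {                                 // same direction: accelerate
            step = Math.min(step * ETA_PLUS, STEP_MAX);
        } else if (sign < 0) {                          // sign flipped: slow down
            step = Math.max(step * ETA_MINUS, STEP_MIN);
        }
        weight[i] -= Math.signum(grad) * step;          // move against the gradient
        return step;
    }

    public static void main(String[] args) {
        // Toy problem: minimize f(w) = w^2, whose gradient is 2w.
        double[] w = { 4.0 };
        double step = 0.1, prevGrad = 0.0;
        for (int epoch = 0; epoch < 100; epoch++) {
            double grad = 2 * w[0];
            step = rpropUpdate(w, 0, grad, prevGrad, step);
            prevGrad = grad;
        }
        System.out.println(Math.abs(w[0]) < 1.0);  // w has converged toward 0
    }
}
```

Because the update depends only on the gradient's sign, Rprop is insensitive to the vanishing or exploding gradient magnitudes that make a single global learning rate hard to tune in plain backpropagation.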
If you are interested in joining the development, or contributing in any other way,
feel free to contact us at firstname.lastname@example.org