
NeurophRM: integration of the Neuroph framework into RapidMiner

By Jelena Stojanovic, Faculty of Organization Sciences, University of Belgrade

A project for the Intelligent Systems course

 

Introduction

The study of artificial neural networks (NNs) is omnipresent in the research literature, with applications and interest spanning many fields, including computer science, artificial intelligence, optimization, data mining, statistics, and even bioinformatics and medicine [1].

Despite some drawbacks, such as the lack of interpretability of the built model [2], NNs remain a widely used method and are included in most data analytics frameworks. Because neural network models are hard to understand, software packages, especially commercial ones, usually simplify the NN model, reducing it to a few parameters that users can alter. Only a few software products offer the full range of customizable neural network models, and they require expertise in the neural network paradigm. In the open-source community, there are currently several stable neural network frameworks that give experts the tools for full customization of NN models.

Since RapidMiner [3] is an open-source framework, a connection to one of these NN frameworks would attract more users by offering a more customizable and powerful NN tool for various data mining tasks. This is especially true for NN experts, who would find in RapidMiner a useful tool for overall data analysis and all the logistic support around NN models, including preprocessing, evaluation, comparison with other algorithms, etc.

 

NeurophRM extension

The idea of integrating Neuroph into RapidMiner comes from an analysis of the potential synergy and the mutual benefits for each framework.

The RapidMiner software already offers several operators that rely on neural network models: the Neural Net, AutoMLP and Perceptron operators for classification, Self-Organizing Map for clustering (preprocessing), and several operators included in the Weka extension. All of these operators hide much of the detail of NN models and simplify parameterization for the user. Still, the neural network paradigm offers many more architectures and customization options which, in the hands of an expert, could be well leveraged. Such new operators would be a valuable addition to the rich RapidMiner toolbox.

On the other hand, Neuroph offers a lot of flexibility in building NN models through its API, and also through its GUI, which simplifies the work with various wizards. Although it provides some data manipulation tools (e.g. normalization of features), it generally lacks tools for comprehensive data extraction, handling and preprocessing. It could also benefit from more systematic evaluation tools, such as cross-validation, leave-one-out, bootstrapping, and others. Furthermore, it would greatly benefit from parameter optimization procedures, which are quite useful in application scenarios. Finally, it is very hard to compare the performance of the built model against other available algorithms, especially because of differences in data formats and evaluation procedures. All of these auxiliary tools are readily available in RapidMiner.

The potential of both frameworks could be partially leveraged by allowing the use of the Neuroph framework within RapidMiner. Since the frameworks are open source and developed separately, one of the main requirements for this integration is not to be limited to the options currently available in Neuroph, but to allow future changes in one framework to be directly visible in the other.

The newly developed RapidMiner extension, called NeurophRM, allows users to define customized neural networks in Neuroph, save the definitions in the application-specific .nnet file format, and train, use and test such neural network definitions within RapidMiner. Since the NN definition is created through Neuroph, the plug-in will remain unchanged even if the Neuroph framework introduces new features and possibilities. This does require using both the Neuroph GUI and the RapidMiner GUI, but in this way the flexibility and special features of both frameworks are preserved.

To illustrate how these frameworks fit together and to better understand the synergetic effect, the frameworks are shown stacked in Figure 1; a class diagram is given in Figure 2.

Figure 1: Stack diagram of framework dependency


Figure 2: NeurophRM class diagram


Finally, NeurophRM is freely available, both as a RapidMiner extension and as source code [15].


Using NeurophRM: a use case

To prevent misuse while remaining flexible about the type of NN the user can employ, we created two general RapidMiner operators, Neuroph Classification NN and Neuroph Regression NN, which differ only in their capabilities: they handle a polynominal (nominal) label and a numerical label, respectively.
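
To make this split concrete, it can be expressed through RapidMiner's learner capability mechanism. The following is a minimal sketch assuming the RapidMiner 5 extension API (AbstractLearner and OperatorCapability); the class name is illustrative, and this is not the actual NeurophRM source code.

    import com.rapidminer.operator.OperatorCapability;
    import com.rapidminer.operator.OperatorDescription;
    import com.rapidminer.operator.learner.AbstractLearner;

    // Illustrative sketch: a learner that accepts only nominal labels.
    // A regression variant would return true for NUMERICAL_LABEL instead.
    public abstract class NeurophClassificationSketch extends AbstractLearner {

        public NeurophClassificationSketch(OperatorDescription description) {
            super(description);
        }

        @Override
        public boolean supportsCapability(OperatorCapability capability) {
            switch (capability) {
                case NUMERICAL_ATTRIBUTES: // input features
                case POLYNOMINAL_LABEL:    // multi-class nominal label
                case BINOMINAL_LABEL:      // two-class nominal label
                    return true;
                default:
                    return false;
            }
        }
    }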

In typical usage, the user first defines the NN model, including the architecture (Figure 3), parameters, and learning rule (Figure 4) for the customized NN. This is all done within Neuroph Studio, which provides an easy way to define everything through different wizards and to save the resulting definition. When more customization is needed (e.g. arranging neurons in some custom way, unlike the well-known architectures), one can even use the Neuroph API to program the NN model directly in Java. Either way, the result of this step is a .nnet file which contains the NN definition and is specific to the Neuroph framework.
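
For example, assuming the Neuroph 2.x API (MultiLayerPerceptron, TransferFunctionType and the inherited NeuralNetwork.save method), a simple multi-layer perceptron could be defined and saved programmatically along these lines; the layer sizes and file name are placeholders:

    import org.neuroph.nnet.MultiLayerPerceptron;
    import org.neuroph.util.TransferFunctionType;

    public class DefineNetwork {
        public static void main(String[] args) {
            // A 4-16-3 multi-layer perceptron with sigmoid neurons,
            // e.g. for the IRIS dataset (4 features, 3 classes).
            MultiLayerPerceptron mlp =
                    new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 4, 16, 3);

            // Save the definition in Neuroph's .nnet format; this is the
            // file that the NeurophRM operators point to from RapidMiner.
            mlp.save("iris.nnet");
        }
    }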

Figure 3: Selection of NN architectures in Neuroph Studio


Figure 4: NN Parameters in Neuroph Studio


The second step is to use the NeurophRM operators in RapidMiner and set their parameters to point to the created .nnet file, as shown in Figure 5. NeurophRM operators are Learners in RapidMiner, so given an input example set, running the process outputs an NN model trained on the input data (Figure 6).

Figure 5: Parameters of the NeurophRM Operator in RapidMiner


Figure 6: NeurophRM operator as a Learner in RapidMiner


The resulting model in RapidMiner can be used like any other: for prediction on new data, for validation, or in any other process defined in RapidMiner. In this way, the use of NNs built with Neuroph is made simpler and accessible to the large RapidMiner community. One remaining problem is that the resulting model is not visualized in any way and stays a black box for the user. Since there are ways to visualize NN models and at least help users understand parts of them, we plan to improve this in the next versions of NeurophRM.
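
Conceptually, what the NeurophRM learner does boils down to a few Neuroph API calls. The following is a rough sketch of the idea only, assuming a recent Neuroph 2.x API (NeuralNetwork.createFromFile, DataSet, DataSetRow); it is not the actual NeurophRM implementation, and the file name and data values are placeholders:

    import org.neuroph.core.NeuralNetwork;
    import org.neuroph.core.data.DataSet;
    import org.neuroph.core.data.DataSetRow;

    public class TrainAndApply {
        public static void main(String[] args) {
            // Load the network definition created in Neuroph Studio (or via the API).
            NeuralNetwork net = NeuralNetwork.createFromFile("iris.nnet");

            // Build a training set matching the network's 4 inputs and 3 outputs.
            DataSet trainingSet = new DataSet(4, 3);
            trainingSet.addRow(new DataSetRow(
                    new double[] {5.1, 3.5, 1.4, 0.2},  // features (placeholder values)
                    new double[] {1, 0, 0}));           // one-hot class label

            // Train with the learning rule stored in the .nnet definition.
            net.learn(trainingSet);

            // Apply the trained model to a new example.
            net.setInput(6.0, 2.9, 4.5, 1.5);
            net.calculate();
            System.out.println(java.util.Arrays.toString(net.getOutput()));
        }
    }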


Preliminary experimental results

Since the Neuroph framework offers a broad variety of NN models, we cannot fully test it against the existing RapidMiner operators that use NNs. We therefore selected the multi-layer perceptron architecture for the comparison, leaving other architectures for further study.

The goal of this comparison is to see how Neuroph performs on tasks that existing RapidMiner operators can already handle, comparing performance in terms of expected accuracy and time to build the model. The comparison is done against the NeuralNet operator, a standard multi-layer perceptron model, and AutoMLP, which additionally optimizes the parameters (and structure) of the model. Accuracy is measured using 10-fold cross-validation on 8 publicly available datasets [16]. Since this is a preliminary experimental evaluation, no strong conclusions are drawn; instead, the experiments generate some hypotheses that will require more thorough investigation.

For each trial, we set equal parameters for the NeurophRM and NeuralNet operators, fixing the size and number of hidden layers, the type of neurons, the learning scheme and the learning parameters. Any parameter unique to one of the operators is left at its default value. Since AutoMLP optimizes these parameters itself, it is expected to be more accurate but slower. The data normalization preprocessing step is done by a separate RapidMiner operator, to exclude its influence on the results. The results are shown in Table 1: accuracy is reported with the standard deviation over the 10 cross-validation runs, and speed is given as minutes:seconds. A rough code sketch of a comparable cross-validation loop, outside of RapidMiner, is given after Table 1.

DataSet   | Neuroph              | NeuralNet            | AutoMLP
          | Accuracy %     Speed | Accuracy %     Speed | Accuracy %     Speed
BALANCE   | 96.08±2.50      0:06 | 92.32±2.44      0:13 | 94.74±3.40      4:33
CAR       | 97.40±0.74      0:44 | 97.28±1.30      2:00 | 99.65±0.53     14:12
CLEVELAND | 79.59±4.25      0:21 | 78.55±5.33      0:36 | 79.84±5.12      5:32
CREDIT    | 81.30±5.55      1:40 | 84.06±3.18      2:40 | 86.23±3.51     10:37
IRIS      | 96.67±4.47      0:10 | 93.33±7.89      0:03 | 96.50±5.33      3:34
TIC-TAC   | 96.56±1.48      0:19 | 97.39±2.10      1:28 | 97.81±1.84      7:47
VOTE      | 96.08±2.94      0:05 | 95.62±3.19      1:22 | 95.86±3.53      6:38
TIC-TAC   | 95.00±5.00      0:05 | 96.00±4.90      0:26 | 95.00±6.71      4:46

Table 1: Accuracy and speed of different NN operators in RapidMiner
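
For readers who want to approximate this evaluation outside of RapidMiner, the following is a rough sketch of a 10-fold cross-validation loop written directly against the Neuroph API (again assuming a recent Neuroph 2.x API; the actual experiments were run with RapidMiner's cross-validation, so this is an illustration, not the experimental code):

    import org.neuroph.core.NeuralNetwork;
    import org.neuroph.core.data.DataSet;
    import org.neuroph.core.data.DataSetRow;

    public class CrossValidationSketch {

        // 10-fold cross-validation accuracy (in %) for a saved .nnet definition.
        // data: one row of features per example; labels: one-hot encoded targets.
        public static double crossValidate(String nnetFile,
                                           double[][] data, double[][] labels) {
            final int k = 10;
            int n = data.length;
            int correct = 0;

            for (int fold = 0; fold < k; fold++) {
                // Reload the untrained network definition for every fold.
                NeuralNetwork net = NeuralNetwork.createFromFile(nnetFile);

                // Train on all examples outside the current fold.
                DataSet train = new DataSet(data[0].length, labels[0].length);
                for (int i = 0; i < n; i++) {
                    if (i % k != fold) {
                        train.addRow(new DataSetRow(data[i], labels[i]));
                    }
                }
                net.learn(train);

                // Test on the held-out fold: predicted class = arg-max output.
                for (int i = fold; i < n; i += k) {
                    net.setInput(data[i]);
                    net.calculate();
                    if (argMax(net.getOutput()) == argMax(labels[i])) {
                        correct++;
                    }
                }
            }
            return 100.0 * correct / n;
        }

        private static int argMax(double[] v) {
            int best = 0;
            for (int i = 1; i < v.length; i++) {
                if (v[i] > v[best]) best = i;
            }
            return best;
        }
    }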


When it comes to speed, Neuroph is preferred most of the time, posting the smallest run-time on 7 of the 8 datasets, sometimes by an order of magnitude. The speed comes from the Neuroph and Encog [14] core libraries, which are optimized and quite efficient at the computational level, although pinning down the real cause would require deeper investigation.

Taking all of this into account, Neuroph achieved comparable accuracy, often in a more time-efficient way.


Discussion and further developments

Integration of the Neuroph neural network framework into RapidMiner is a Pareto move, since it benefits both frameworks and both communities of users. Neuroph clearly gains all the logistics for the comprehensive application of different kinds of neural networks. RapidMiner, on the other hand, gains new operators that open up the whole field of different neural networks, a notable addition to its already rich toolbox.

Experimentally, NeurophRM showed good results, especially in time performance. To keep the comparison fair, only multi-layer perceptrons were tested, but the full strength of Neuroph should come from its variety of NNs.

The NeurophRM extension provides this bridge between the two frameworks, but it still needs improvements to enhance the integration, such as:

  • including a visualization of the NN model, which could help users to partially understand it;
  • using the Neuroph GUI (with its different wizards) from within RapidMiner, instead of a completely separate software product;
  • an inter-architecture study of NN performance, along with some recommender system, as an aid for users with less NN expertise;
  • operators for more data mining tasks, such as clustering, attribute reduction and association, all of which can be handled by some form of neural network;
  • operators with exposed NN parameters, to enable optimization of these parameters with RapidMiner's optimization tools. This, however, has the drawback of fixing the choice of parameters, requiring an update whenever the Neuroph core changes the set of available parameters.

An important side effect is that both frameworks will benefit from the partial merging of their user communities, which will hopefully spark further studies of this integration, something this effort needs in order to be evaluated and to mature. Moreover, the continued separate development of both frameworks will bring benefits to both sides, so future versions will only improve the experience and strengthen the case for this integration.



References

  1. Haykin, S. (1998). Neural Networks: A Comprehensive Foundation, 2nd ed. Prentice Hall PTR, Upper Saddle River, NJ, USA. ISBN 0132733501.
  2. Olden, J. D., Jackson, D. A. (2002). Illuminating the "black box": a randomization approach for understanding variable contributions in artificial neural networks. Department of Zoology, University of Toronto.
  3. Mierswa, I., Wurst, M., Klinkenberg, R., Scholz, M., Euler, T. (2006). YALE: Rapid Prototyping for Complex Data Mining Tasks. In Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-06).
  4. Sevarac, Z. Neuroph - Java neural network framework. Retrieved from http://neuroph.sourceforge.net/ (May 2012).
  5. Gupta, M. M., Jin, L., Homma, N. (2003). Static and Dynamic Neural Networks: From Fundamentals to Advanced Theory. Wiley-IEEE Press.
  6. Carter-Greaves, L. E. Time Series Prediction with Feed-Forward Neural Networks - A Beginner's Guide and Tutorial for Neuroph. Retrieved from http://neuroph.sourceforge.net/TimeSeriesPredictionTutorial.html (May 2012).
  7. Steinhauer, V. Stock market prediction using neural networks. Retrieved from http://neuroph.sourceforge.net/tutorials/StockMarketPredictionTutorial.html (May 2012).
  8. Steinhauer, V. Chicken prices prediction using neural networks. Retrieved from http://neuroph.sourceforge.net/tutorials/ChickenPricePredictionTutorial.htm (May 2012).
  9. Micic, D. Creating Android image recognition application using NetBeans and Neuroph. Retrieved from http://neuroph.sourceforge.net/tutorials/android_image_recognition_using_neuroph.htm (May 2012).
  10. Andersen, A. C. (2010). Autonomous Neural Development and Pruning. Department of Bioengineering, University of California San Diego, La Jolla, CA 92122. Retrieved from http://itspiren.no/2010/12/autonomous-neural-development-and-pruning/
  11. Abhishek, C., Kumar, V. P., Vitta, H., Srivastava, P. R. (2010). Test Effort Estimation Using Neural Network. Journal of Software Engineering and Applications, 3:331-340. doi:10.4236/jsea.2010.34038
  12. Maciel, A., Carvalho, E. (2010). FIVE - Framework for an Integrated Voice Environment. 17th International Conference on Systems, Signals and Image Processing.
  13. Zhu, J., Fung, G., Wang, L. (2011). Efficient name disambiguation in digital libraries. Proceedings of the 12th International Conference on Web-Age Information Management, 430-441. Springer-Verlag, Berlin. ISBN 978-3-642-23534-4.
  14. Heaton, J. (2010). Programming Neural Networks with Encog 2 in Java. Heaton Research, Inc. ISBN 1604390077.
  15. Jovanovic, M., Stojanovic, J. NeurophRM: RapidMiner plugin for Neuroph Studio. Retrieved from http://code.google.com/p/neurophrm/ (May 2012).
  16. Asuncion, A., Newman, D. J. (2007). UCI Machine Learning Repository. University of California, School of Information and Computer Science. Retrieved from http://www.ics.uci.edu/~mlearn/MLRepository.html
