Visualization of Artificial Neural Network with WebGL


This WebGL experiment shows an Artificial Neural Network which learns to detect the frequency of the input signal independently of its phase. The implementation is based on Three.js (r56), Guava and the Google Web Toolkit (GWT).

Here GWT translates the Java code of the Artificial Neural Network into JavaScript, which is then executed in the browser. The experiment has been developed with GWT 2.5 and the App Engine SDK 1.7.6. It runs best in the Chrome browser.


Open the WebGL experiment with a WebGL-compatible browser such as Chrome 25 for Win32 or Firefox for Android 17, and you should see something like this:

Figure 1: Untrained neural network (press the Reset button)

Use the buttons in the bottom right corner to reset the network and train it again. You may use the arrow and Page Up/Down keys to move the graphic.

Figure 2: Trained neural network (press the Train button)

If your browser does not support WebGL, the experiment falls back to rendering directly to the canvas. This fallback is significantly slower than WebGL. Figure 3 shows Firefox 11.0 (Windows):

Figure 3: Opened in a browser without WebGL support

Project Structure

To compile and test all the sources you will need a Google App Engine project. Eclipse Juno (JEE SR1) was used to develop the project. At the end of this article you will find a link to download the entire project.

Figure 4: Project structure in Eclipse Juno

Find Code on GitHub

The code implements a simple multi-layer perceptron trained with the backpropagation algorithm. If you are not familiar with artificial neural networks, you may start with Wikipedia - Neural Network and then Neural Networks - A Systematic Introduction.

Implementation - Artificial Neural Network

The Neuron class represents a single neuron of the artificial neural network. Neurons are grouped in layers and connected with links. A neuron can be part of the input, output or an inner layer. The neuron implements a nonlinear transfer function and its first derivative, which are needed to train the network with the backpropagation algorithm. In this ANN implementation the so-called sigmoid function is used.
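The sigmoid and its derivative can be written down in a few lines. The following is a minimal sketch (the class and method names `transfer` and `transferDerivative` are assumptions for illustration, not the project's actual signatures); it exploits the fact that the sigmoid's derivative can be expressed through its own output y as y * (1 - y):

```java
// Hypothetical sketch of the neuron's transfer function (not the project's actual class)
public class Neuron {

    // Sigmoid transfer function: f(x) = 1 / (1 + e^-x), output in (0, 1)
    public static double transfer(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // First derivative, expressed through the output y = f(x): f'(x) = y * (1 - y).
    // This form is convenient for backpropagation because y is already computed.
    public static double transferDerivative(double y) {
        return y * (1.0 - y);
    }
}
```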

The Layer class is very simple: it holds a list of neurons and knows how to recall the neurons of the layer.
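Such a layer is essentially a thin container that forwards recall() to each of its neurons. A minimal sketch (the nested `Neuron` interface here is just a stand-in for the real neuron class):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a layer: a container that forwards recall() to its neurons
public class Layer {

    // Minimal stand-in for the real Neuron class
    public interface Neuron {
        void recall();
    }

    private final List<Neuron> neurons = new ArrayList<Neuron>();

    public void add(Neuron n) {
        neurons.add(n);
    }

    // Recall simply asks every neuron of the layer to compute its output
    public void recall() {
        for (Neuron n : neurons) {
            n.recall();
        }
    }
}
```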

The network class holds all layers and can train and recall the network. Here you will find the function that calculates the relative mean square error, and here the backpropagation training is implemented.
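The error measure driving the training can be sketched as a mean square error over all output neurons and patterns. The following is only an illustration under that assumption (class and method names are hypothetical, not the project's):

```java
// Hypothetical sketch of the training error: mean square error over
// all patterns and output neurons
public class ErrorSketch {

    // target[p][o] is the desired output, actual[p][o] the network's output,
    // for pattern p and output neuron o
    public static double meanSquareError(double[][] target, double[][] actual) {
        double sum = 0.0;
        int count = 0;
        for (int p = 0; p < target.length; p++) {
            for (int o = 0; o < target[p].length; o++) {
                double d = target[p][o] - actual[p][o];
                sum += d * d;
                count++;
            }
        }
        return sum / count;
    }
}
```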

The Link class provides the connection between neurons.
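A link essentially carries the weight between two neurons, which the backpropagation step adjusts. A minimal sketch of that idea (the weight-update signature is an assumption, not the project's actual API):

```java
// Hypothetical sketch of a link: a weighted connection between two neurons
public class Link {

    private double weight;

    public Link(double initialWeight) {
        this.weight = initialWeight;
    }

    public double getWeight() {
        return weight;
    }

    // Backpropagation weight update: deltaW = eta * delta * output, where
    // eta is the learning rate, delta the error term of the target neuron
    // and output the output of the source neuron
    public void updateWeight(double eta, double delta, double output) {
        weight += eta * delta * output;
    }
}
```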

The pattern class manages the input and output patterns used during training and recall.

The demo setup class creates the neural network and the data patterns for the demo.
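Since the network learns to detect the frequency of a signal independently of its phase, the training data presumably consists of sampled sine waves with varying phase shifts. A sketch of how one such input pattern could be generated (the class and method names are hypothetical):

```java
// Hypothetical sketch of generating one training pattern: a sine wave of a
// given frequency, sampled at n points, with a phase shift
public class PatternSketch {

    // frequency: cycles over the n samples; phase: shift in radians
    public static double[] samplePattern(double frequency, double phase, int n) {
        double[] pattern = new double[n];
        for (int i = 0; i < n; i++) {
            pattern[i] = Math.sin(2.0 * Math.PI * frequency * i / n + phase);
        }
        return pattern;
    }
}
```

Feeding the same frequency with different random phases forces the network to become phase-invariant.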

Implementation - WebGL User Interface

In Webgl_ann_sample.js the model is rendered in 3D. This code uses the three.js library, which reduces the boilerplate needed for native WebGL. You can read more about three.js here.

The file Webgl_ann_sample.html is the main entry point for rendering.