
XOR Neural Network with JavaScript

Neural networks, and other machine learning methods, are often demonstrated by training on the XOR operator. This example uses a feedforward neural network. In many ways the XOR operator is the "Hello World" program of neural networks and other machine learning methods. The HTML5 JavaScript application below allows you to train a neural network for the XOR operator, using either resilient propagation or backpropagation. Resilient propagation will train more efficiently.

[Training controls: Max Iterations, Max Error; Backpropagation (BPROP): Learning Rate, Momentum; Resilient Propagation (RPROP)]

The truth table for the XOR operator provides the training data for the neural network. The XOR truth table is shown here.

Op1 Op2 Result
--- --- ------
 0   0     0
 1   0     1
 0   1     1
 1   1     0
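In code, the truth table above can be expressed directly as training data. A minimal sketch (the variable names are illustrative, not taken from the application):

```javascript
// XOR truth table as training data: each row of trainingInputs pairs
// the two operands with the matching row of trainingOutputs.
const trainingInputs = [
  [0, 0],
  [1, 0],
  [0, 1],
  [1, 1],
];

const trainingOutputs = [
  [0],
  [1],
  [1],
  [0],
];
```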

After training, we would like the neural network to produce the given result for the two inputs provided. The results will not be exact, due to the approximating nature of the neural network.
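Because the outputs are approximate, a common final step is to round each output to the nearest integer to recover a clean 0 or 1. A sketch, using hypothetical raw outputs rather than values from an actual training run:

```javascript
// Hypothetical raw outputs from a trained network for the four XOR
// input patterns; they are close to, but not exactly, 0 or 1.
const rawOutputs = [0.03, 0.96, 0.97, 0.05];

// Rounding to the nearest integer recovers the truth-table values.
const classified = rawOutputs.map((y) => Math.round(y));
// classified is [0, 1, 1, 0]
```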

The above example uses a three-layer (input, hidden, output) neural network with the sigmoid activation function. Bias neurons are also used. You can see a diagram of this neural network below.
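The sigmoid activation function mentioned above is straightforward to implement. A minimal sketch:

```javascript
// Logistic sigmoid: squashes any real input into the range (0, 1).
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// Its derivative, expressed in terms of the output s = sigmoid(x),
// is what backpropagation uses when computing gradients.
function sigmoidDerivative(s) {
  return s * (1 - s);
}
```

Expressing the derivative in terms of the already-computed output is a common optimization: the forward pass has already produced `s`, so no extra call to `Math.exp` is needed during the backward pass.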

[Diagram: XOR Neural Network]

This example allows you to use two different training methods. Backpropagation is one of the earliest neural network training methods. It is effective, but slow. Backpropagation requires you to define a learning rate and a momentum. The learning rate specifies how fast the network will train; generally it should not exceed 1.0. Usually, you want the highest learning rate you can use before the neural network becomes unstable. In this example, we are using 0.7. Momentum helps prevent the neural network from becoming caught in local minima; too high a value will also make the neural network unstable.
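A single backpropagation weight update combines both parameters: the learning rate scales the gradient step, and momentum carries forward a fraction of the previous step. A sketch of that update rule, assuming the gradient comes from a backward pass not shown here (the momentum value is illustrative, not from the application):

```javascript
const learningRate = 0.7; // the value used in this example
const momentum = 0.3;     // illustrative; the article does not specify one

// Apply one backpropagation update to a single weight. The previous
// delta is remembered between iterations so momentum can reuse it.
function updateWeight(weight, gradient, previousDelta) {
  // Blend the current gradient step with a fraction of the last step.
  const delta = learningRate * gradient + momentum * previousDelta;
  return { weight: weight - delta, delta };
}
```

With momentum at 0, this reduces to plain gradient descent; raising it lets updates build up speed through shallow regions of the error surface, which is how it helps escape local minima.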
