The truth table for the XOR operator serves as the training data for the neural network. You can see the XOR truth table here.
| Op1 | Op2 | Result |
| --- | --- | ------ |
| 0   | 0   | 0      |
| 1   | 0   | 1      |
| 0   | 1   | 1      |
| 1   | 1   | 0      |
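The truth table above can be expressed directly as a training set. A minimal sketch in Python (the variable names are illustrative, not from the original example):

```python
# XOR training data: each input pair maps to the expected result
# from the truth table above.
xor_inputs = [
    [0.0, 0.0],
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 1.0],
]
xor_expected = [[0.0], [1.0], [1.0], [0.0]]
```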
After training, we would like the neural network to produce the given result for the two inputs provided. The results will not be exact, due to the approximating nature of the neural network.
The above example makes use of a 3-layer (input, hidden, output) neural network. This neural network makes use of the sigmoid activation function. Bias neurons are also used. You can see a diagram of this neural network below.
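A forward pass through such a network can be sketched as follows. This is an assumption-laden illustration, not the example's actual code: the layer sizes (2 inputs, 2 hidden, 1 output), random weight initialization, and the `forward` helper are all hypothetical. Each layer appends a constant 1.0 input to model its bias neuron.

```python
import math
import random

def sigmoid(x):
    # Sigmoid activation function used by both layers.
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# Each hidden neuron has 3 weights: 2 inputs plus a bias input of 1.0.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
# The output neuron has 3 weights: 2 hidden outputs plus a bias input.
w_output = [random.uniform(-1, 1) for _ in range(3)]

def forward(inputs):
    # Hidden layer: weighted sum of inputs + bias, passed through sigmoid.
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs + [1.0])))
              for row in w_hidden]
    # Output layer: weighted sum of hidden activations + bias.
    return sigmoid(sum(w * h for w, h in zip(w_output, hidden + [1.0])))
```

With untrained random weights, `forward` simply produces some value between 0 and 1; training adjusts the weights so the outputs approach the truth table.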
This example allows you to make use of two different training methods. Backpropagation was one of the earliest neural network training methods. It is effective, but slow. Backpropagation requires you to define a learning rate and momentum. The learning rate specifies how fast the network will train. Generally you should not go above 1.0. Usually, you want the highest learning rate you can use before the neural network becomes unstable. In this example, we are using 0.7. The momentum value helps prevent the neural network from becoming caught in local minima. Too high a value will also make the neural network unstable.
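The following is a rough sketch of backpropagation with momentum on the XOR data, using the 0.7 learning rate mentioned above. Everything else is an assumption for illustration: the 0.3 momentum value, the 2-3-1 layer sizes, NumPy, full-batch updates, and the epoch count are not taken from the original example.

```python
import numpy as np

rng = np.random.default_rng(42)
# XOR inputs and expected outputs from the truth table.
X = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2 inputs -> 3 hidden -> 1 output, with bias vectors per layer.
W1 = rng.uniform(-1, 1, (2, 3)); b1 = np.zeros(3)
W2 = rng.uniform(-1, 1, (3, 1)); b2 = np.zeros(1)

lr, momentum = 0.7, 0.3  # 0.7 from the text; 0.3 is an assumed momentum
# Velocity terms carry a fraction of the previous update forward.
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error through the sigmoid derivatives.
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    # Momentum-smoothed weight updates.
    vW2 = momentum * vW2 - lr * (h.T @ d_out)
    vb2 = momentum * vb2 - lr * d_out.sum(axis=0)
    vW1 = momentum * vW1 - lr * (X.T @ d_hid)
    vb1 = momentum * vb1 - lr * d_hid.sum(axis=0)
    W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
mse = float(np.mean((pred - y) ** 2))
```

After training, `pred` should approach the truth-table outputs, though, as noted above, the results will not be exact.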