
Does Encog support online neural network training, or custom batch sizes?

As of version 3.2, Encog supports online training for both flat and freeform networks. Online and batch training apply to propagation-style training; Encog allows a batch size to be specified for its propagation training algorithms, as shown in the examples below.

The batch size specifies how often the weights of a neural network are updated. A batch size of 100 means the weights are updated after every 100 training elements. A batch size of 1 is online training, because the weights are updated after every training element. A batch size of 0, the default in Encog, means the weights are updated only after the entire training set has been processed.
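The batch-size convention above can be sketched in plain Java. Note that `BatchSizeDemo` and `updatesPerEpoch` are illustrative names for this sketch, not part of the Encog API; the method just counts how many weight updates occur in one pass over the training set for a given batch size.

```java
public class BatchSizeDemo {

    /**
     * Number of weight updates in one epoch for a given batch size,
     * following the convention above: 0 means full-batch training,
     * i.e. a single update after the entire training set.
     */
    static int updatesPerEpoch(int trainingElements, int batchSize) {
        if (batchSize == 0) {
            return 1; // full batch: one update per epoch
        }
        // One update per full batch, plus one for any partial final batch.
        return (trainingElements + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        int n = 250; // hypothetical training-set size
        System.out.println(updatesPerEpoch(n, 0));   // full batch -> 1 update
        System.out.println(updatesPerEpoch(n, 1));   // online -> 250 updates
        System.out.println(updatesPerEpoch(n, 100)); // mini-batch -> 3 updates
    }
}
```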

Online training works best with backpropagation; it will likely not converge with resilient propagation. Because of this, batch sizes greater than one should be used with resilient propagation.

Java Flat Example

The code below shows a simple online training example. This example is also contained in the Encog examples.

import org.encog.Encog;
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.training.propagation.back.Backpropagation;
import org.encog.util.simple.EncogUtility;

public class XOROnline {

	/**
	 * The input necessary for XOR.
	 */
	public static double XOR_INPUT[][] = { { 0.0, 0.0 }, { 1.0, 0.0 },
			{ 0.0, 1.0 }, { 1.0, 1.0 } };

	/**
	 * The ideal data necessary for XOR.
	 */
	public static double XOR_IDEAL[][] = { { 0.0 }, { 1.0 }, { 1.0 }, { 0.0 } };
	
	/**
	 * The main method.
	 * @param args No arguments are used.
	 */
	public static void main(final String args[]) {
		
		// Create a neural network, using the utility.
		BasicNetwork network = EncogUtility.simpleFeedForward(2, 2, 0, 1, false);
		network.reset();

		// Create training data.
		MLDataSet trainingSet = new BasicMLDataSet(XOR_INPUT, XOR_IDEAL);
		
		// Train the neural network.
		final Backpropagation train = new Backpropagation(network, trainingSet, 0.07, 0.02);
		train.setBatchSize(1);
		
		// Train to the target error, then evaluate the network.
		EncogUtility.trainToError(train, 0.01);
		EncogUtility.evaluate(network, trainingSet);
		
		// Shut down Encog.
		Encog.getInstance().shutdown();
	}
}

Java Freeform Example

The code below shows the same online training example using a freeform network. This example is also contained in the Encog examples.

import org.encog.Encog;
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;
import org.encog.neural.freeform.FreeformLayer;
import org.encog.neural.freeform.FreeformNetwork;
import org.encog.neural.freeform.training.FreeformBackPropagation;
import org.encog.util.simple.EncogUtility;

public class FreeformOnlineXOR {

	/**
	 * The input necessary for XOR.
	 */
	public static double XOR_INPUT[][] = { { 0.0, 0.0 }, { 1.0, 0.0 },
			{ 0.0, 1.0 }, { 1.0, 1.0 } };

	/**
	 * The ideal data necessary for XOR.
	 */
	public static double XOR_IDEAL[][] = { { 0.0 }, { 1.0 }, { 1.0 }, { 0.0 } };
	
	/**
	 * The main method.
	 * @param args No arguments are used.
	 */
	public static void main(final String args[]) {
		
		// create a neural network, without using a factory
		FreeformNetwork network = new FreeformNetwork();
		FreeformLayer inputLayer = network.createInputLayer(2);
		FreeformLayer hiddenLayer1 = network.createLayer(3);
		FreeformLayer outputLayer = network.createOutputLayer(1);
		
		network.connectLayers(inputLayer, hiddenLayer1, new ActivationSigmoid(), 1.0, false);
		network.connectLayers(hiddenLayer1, outputLayer, new ActivationSigmoid(), 1.0, false);		
		network.reset();

		// create training data
		MLDataSet trainingSet = new BasicMLDataSet(XOR_INPUT, XOR_IDEAL);
		// Train the neural network with online updates (batch size of 1).
		FreeformBackPropagation train = new FreeformBackPropagation(network, trainingSet, 0.7, 0.2);
		train.setBatchSize(1);
		EncogUtility.trainToError(train, 0.01);
		EncogUtility.evaluate(network, trainingSet);

		Encog.getInstance().shutdown();
	}
}
Question Number: 5.03

Comments

evilspeculator:
The above says that online learning is only available for Backpropagation and Resilient Propagation - are there any plans to include the SVM engine as well? Obviously it would be a strong candidate for online/realtime learning.
