In a feed-forward network with backpropagation with:
5 neurons in hidden layer 1
50 neurons in hidden layer 2
In theory you could. The longest part of training is typically evaluating the activation functions and calculating each connection. The time to train is going to be some formula like:
(numWeights * weightTime) + ((numNeurons - inputNeurons) * activationTime)
numWeights = number of weights in the network
numNeurons = total number of neurons in the network
inputNeurons = input neuron count
activationTime & weightTime will depend on your hardware and will need to be determined experimentally.
I subtract out input neurons since they are not processed by an activation function.
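The formula above can be sketched as a small helper. The network shape, weight count, and the `weightTime`/`activationTime` values below are hypothetical placeholders; in practice you would measure the two per-operation times on your own hardware.

```java
// Sketch of the per-iteration time estimate described above.
// All concrete numbers here are illustrative, not measured.
public class TrainTimeEstimate {

    // (numWeights * weightTime) + ((numNeurons - inputNeurons) * activationTime)
    public static double estimate(int numWeights, int numNeurons, int inputNeurons,
                                  double weightTime, double activationTime) {
        return (numWeights * weightTime)
             + ((numNeurons - inputNeurons) * activationTime);
    }

    public static void main(String[] args) {
        // Example shape: 10 inputs, 5 hidden, 1 output, fully connected with bias.
        int inputNeurons = 10;
        int numNeurons = 10 + 5 + 1;
        int numWeights = (10 + 1) * 5 + (5 + 1) * 1; // includes bias weights
        double weightTime = 2e-9;     // seconds per weight (placeholder)
        double activationTime = 5e-9; // seconds per activation (placeholder)
        System.out.println(estimate(numWeights, numNeurons, inputNeurons,
                                    weightTime, activationTime) + " s per iteration");
    }
}
```

Note that only the non-input neurons contribute the activation term, matching the subtraction described above.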
Couldn't you take the total training time and divide it by the number of epochs? That gives you the average training time per epoch. If you want an even finer-grained recording of the time per epoch, you can try wrapping each train.iteration() statement in a timing block.
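A timing block of that kind might look like the sketch below. The `Runnable` stands in for your actual `train.iteration()` call (for example on Encog's `MLTrain`), so the snippet stays self-contained; it is an illustration of the approach, not the library's API.

```java
// Sketch: time a single training epoch by wrapping the iteration call.
public class EpochTimer {

    // Returns the elapsed wall-clock time in nanoseconds for one iteration.
    public static long timeIteration(Runnable trainOneIteration) {
        long start = System.nanoTime();
        trainOneIteration.run(); // stand-in for train.iteration()
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        long elapsed = timeIteration(() -> {
            // your train.iteration() call would go here
        });
        System.out.println("Epoch took " + elapsed + " ns");
    }
}
```

Summing these per-epoch times and dividing by the epoch count should agree with the coarser total-time-divided-by-epochs average.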
Copyright 2013 by Heaton Research, Inc.