Excuse the basic question, folks, but is it OK to train in parallel, as below?
Parallel.For(0, dll.Globals.itsMultiplier, w =>
{
    train.Iteration(); // one training iteration per parallel loop body
});

e.g. dll.Globals.itsMultiplier = 1000
I'm simply trying to run my training iterations in parallel, hoping this will still produce a valid train.Error value (and I'm assuming that value is the MSE).
I have removed any gradient checking (greedy, improvement strategy, etc.) from my network structure, so that should not be an issue, and it runs fine!
Thanks in advance.