Hi! I don't know if I got it right from reading the documentation and examples. My question is: in order to train a neural network in full batch mode (that is, using all the available instances), is it correct to pass "batchSize: number_of_all_available_instances" to the RegressionNeuralNetLearner constructor?
Thanks a lot in advance
The implementation in SharpLearning is not intended for running in "full batch mode" as such. All the optimizers are based on stochastic gradient descent, so they are designed for mini-batch learning. However, if you want to try using it for "full batch" learning, you are correct: the batch size should be set to the number of examples or instances in the training set. Be aware that it might require a lot of memory to run in this mode.
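For reference, a minimal sketch of what that could look like. The layer types, the SquareLoss class, and the constructor parameter names (loss, iterations, batchSize) are assumed from the public SharpLearning.Neural examples, so please verify them against the version you have installed:

```csharp
// Full-batch training sketch: batchSize is set to the number of training
// instances, so every "mini-batch" is the whole training set.
// Class and parameter names are assumed from the SharpLearning.Neural
// examples - check them against your installed version.
using System;
using System.Linq;
using SharpLearning.Containers.Matrices;
using SharpLearning.Neural;
using SharpLearning.Neural.Layers;
using SharpLearning.Neural.Learners;
using SharpLearning.Neural.Loss;

class FullBatchExample
{
    static void Main()
    {
        // Toy data, just for illustration.
        const int numberOfInstances = 200;
        const int numberOfFeatures = 5;

        var random = new Random(42);
        var values = Enumerable.Range(0, numberOfInstances * numberOfFeatures)
            .Select(_ => random.NextDouble()).ToArray();
        var observations = new F64Matrix(values, numberOfInstances, numberOfFeatures);
        var targets = Enumerable.Range(0, numberOfInstances)
            .Select(_ => random.NextDouble()).ToArray();

        // Simple regression net: input -> dense -> squared-error output.
        var net = new NeuralNet();
        net.Add(new InputLayer(numberOfFeatures));
        net.Add(new DenseLayer(32));
        net.Add(new SquaredErrorRegressionLayer());

        // batchSize == number of training instances => full-batch updates.
        var learner = new RegressionNeuralNetLearner(net,
            loss: new SquareLoss(),
            iterations: 100,
            batchSize: numberOfInstances);

        var model = learner.Learn(observations, targets);
        var predictions = model.Predict(observations);
    }
}
```

As noted above, the optimizers are still stochastic-gradient-descent style; this only makes each update use the entire training set, and the whole batch has to fit in memory at once.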