Looking at Daniel Möller's answer to this question, I understand that recompiling a trained model should not affect or change the weights already trained. However, whenever I recompile my model to continue training it with, say, a different learning rate or batch size, the val_mse starts at a higher (worse) value than where it ended after the initial training.
Although it eventually decreases back to the previously reached val_mse, I am not sure whether recompiling the model simply resets it and retrains it from scratch.
Could someone confirm whether recompiling actually restarts the learning process from scratch? Also, is it common practice (or even beneficial) to follow the initial training of a model with secondary training phases using different hyper-parameters?
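For what it's worth, here is a minimal sketch (assuming TensorFlow/Keras with a toy model and random data, just for illustration) of how one can check this directly: recompiling leaves the trained weights untouched, but it replaces the optimizer, so stateful optimizers like Adam lose their accumulated moment estimates. That reset of optimizer state, not a reset of the weights, could explain the temporary jump in val_mse.

```python
# Sketch: verify that model.compile() keeps trained weights.
# Model architecture and data here are arbitrary toy choices.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")

# Train briefly on random data so the weights move away from init.
x = np.random.rand(32, 3).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(x, y, epochs=2, verbose=0)

before = [w.copy() for w in model.get_weights()]

# Recompile with a different learning rate: the weights are unchanged,
# but the new Adam instance starts with fresh (zeroed) optimizer state.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
after = model.get_weights()

weights_unchanged = all(np.allclose(b, a) for b, a in zip(before, after))
print(weights_unchanged)
```

If the printed value is True, recompiling did not reset the weights; any degradation after a recompile would then come from the fresh optimizer state (and, for adaptive optimizers, the first few steps being poorly scaled).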
https://stackoverflow.com/questions/65929050/does-re-compiling-reset-the-models-weights January 28, 2021 at 07:44AM