On Sep 10, 2023, MrCsabaToth changed the title to "Zero patience stops training after first epoch resulting in empty training graphs in keras_for_text_classification.ipynb" and added a commit to MrCsabaToth/training-data-analyst that referenced this issue.
This is part of the NLP module of the Machine Learning Engineer learning path (for certification). The "Keras for Text Classification using Vertex AI" lab (https://www.cloudskillsboost.google/course_sessions/2920308/labs/363228) in the "NLP models" section (https://github.com/GoogleCloudPlatform/training-data-analyst/blob/master/courses/machine_learning/deepdive2/text_classification/labs/keras_for_text_classification.ipynb) sets

PATIENCE = 0

which controls early stopping during training via `EarlyStopping(patience=PATIENCE)`. As the code suggests, zero patience stops training almost immediately, which leaves the example training graphs plotted right after each training essentially empty. The lab goes through three trainings: DNN, RNN, and CNN.

I propose setting PATIENCE to some non-zero value. I picked 20, which resulted in 17 epochs for the DNN and 3 epochs for the CNN. All of the trainings still finished quickly, even on the GPU-less basic Jupyter notebook instance.