
Adds a myriad of tunable training params #60

Merged
merged 1 commit on May 15, 2020

Conversation

TechnikEmpire
Contributor

We can now select and configure the optimizer on a per-optimizer basis, optionally convert to FP16 (where possible), which at the very least improves memory usage during training, perform image augmentation during training, control label smoothing, and more.
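To make the label-smoothing knob concrete, here is a minimal, dependency-free sketch of what label smoothing does to a one-hot target (the `epsilon` default of 0.1 is a common convention, not necessarily the value this PR uses; in the actual training code this would be handled by the loss function's smoothing parameter):

```python
def smooth_labels(one_hot, epsilon=0.1):
    """Mix a one-hot target with a uniform distribution over classes.

    Each hard 0/1 entry is pulled toward 1/num_classes by a factor of
    epsilon, which discourages the model from becoming overconfident.
    """
    num_classes = len(one_hot)
    return [y * (1.0 - epsilon) + epsilon / num_classes for y in one_hot]

# A hard two-class target [0, 1] becomes a softened target close to
# [0.05, 0.95]; the entries still sum to 1.
softened = smooth_labels([0.0, 1.0], epsilon=0.1)
```

The smoothed vector remains a valid probability distribution, which is why it can be dropped into a cross-entropy loss unchanged.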

We also automatically and always do early stopping once validation accuracy stops improving, and since I'm an old man with the old memes, I've set the training target epochs to 9001 in the v2 training scripts, because early stopping will actually control when training ends.
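This is why a target of 9001 epochs is harmless: the epoch count is just an upper bound, and the stopping criterion is the real control. A minimal sketch of the patience-based logic (the `patience` value of 2 here is hypothetical, not taken from the PR):

```python
class EarlyStopper:
    """Stop training after `patience` consecutive epochs with no
    improvement in validation accuracy (illustrative sketch only)."""

    def __init__(self, patience=2):
        self.patience = patience
        self.best = float("-inf")
        self.bad_epochs = 0

    def update(self, val_accuracy):
        """Record one epoch's validation accuracy; return True to stop."""
        if val_accuracy > self.best:
            self.best = val_accuracy
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

# Even with a huge epoch budget, training halts as soon as the
# validation curve plateaus:
stopper = EarlyStopper(patience=2)
for epoch, acc in enumerate([0.70, 0.75, 0.74, 0.73]):
    if stopper.update(acc):
        break  # stops at epoch 3, long before any 9001-epoch budget
```

In Keras-style training loops the same behavior comes from an early-stopping callback watching the validation metric.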

Basically, I brought over a bunch of configuration options from automl and merged them in, fulfilling a number of TODOs in the original training code that was brought in from tfhub.

Unfortunately, all these shiny new tools didn't improve overall accuracy on newly trained models; loss was significantly lower, however. I'll attach a newly trained MobileNet model to this PR later.

@GantMan GantMan merged commit 1e7da36 into GantMan:master May 15, 2020