fix: Fixed PyTorch param loading #425
Conversation
Thanks
Codecov Report
```diff
@@           Coverage Diff           @@
##             main     #425   +/-   ##
=======================================
  Coverage   95.80%   95.81%
=======================================
  Files          96       96
  Lines        3937     3942    +5
=======================================
+ Hits         3772     3777    +5
  Misses        165      165
=======================================
```
Thanks!
Following up on #422, this PR switches back to the common parameter loading mechanism. Previously, some PyTorch models' factory functions raised errors when called with pretrained=True. With this PR, they emit a warning and proceed with default parameter initialization when no checkpoint is available. Any feedback is welcome!
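For context, here is a minimal sketch of the warn-and-proceed pattern described above. The model class, factory function, and checkpoint table are hypothetical stand-ins for illustration, not this repository's actual API:

```python
import warnings

import torch
from torch import nn

# Hypothetical mapping of model names to checkpoint URLs; a None entry
# stands in for "no pretrained checkpoint is available for this model".
_CHECKPOINT_URLS = {
    "toy_net": None,
}


class ToyNet(nn.Module):
    """Hypothetical model used only to illustrate the loading pattern."""

    def __init__(self, num_classes: int = 10) -> None:
        super().__init__()
        self.fc = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(x)


def toy_net(pretrained: bool = False, **kwargs) -> ToyNet:
    """Factory function that warns and falls back to fresh parameters
    instead of raising when no checkpoint exists."""
    model = ToyNet(**kwargs)
    if pretrained:
        url = _CHECKPOINT_URLS.get("toy_net")
        if url is None:
            # Previously this case raised an error; now we warn and keep
            # the default (random) parameter initialization.
            warnings.warn(
                "No pretrained checkpoint is available for 'toy_net'; "
                "proceeding with randomly initialized parameters."
            )
        else:
            state_dict = torch.hub.load_state_dict_from_url(url, progress=True)
            model.load_state_dict(state_dict)
    return model
```

With this pattern, calling toy_net(pretrained=True) warns and returns a freshly initialized model when no checkpoint URL is registered, rather than raising an error.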