
Add learner training report summary #1591

Merged: 7 commits into main on Apr 11, 2024

Conversation


@laggui laggui commented Apr 8, 2024

Checklist

  • Confirmed that the `run-checks all` script has been executed.
  • Made sure the book is up to date with the changes in this PR.

Related Issues/PRs

Closes #1264

Changes

Added LearnerSummary struct to retrieve recorded metrics during training.

  • Added NumericEntry enum to properly aggregate values
    • Previous implementation did not take into account different mini-batch sizes which resulted in incorrect averages
  • Changed NumericMetricState to use NumericEntry with corresponding value and batch size for each metric entry that is serialized
  • Renamed register_metric_train -> register_train_metric in Metrics to match other method names
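The batch-size-aware aggregation described above can be sketched as follows. This is an illustrative standalone example, not the exact `burn-train` API: the `NumericEntry` variants and the `mean` helper are assumptions made for demonstration, showing why weighting each entry by its batch size matters when mini-batches are uneven.

```rust
/// Hypothetical sketch of a numeric metric entry (names are illustrative,
/// not the exact burn-train definitions).
#[derive(Debug, Clone, Copy)]
enum NumericEntry {
    /// A single observed value (counts as one element).
    Value(f64),
    /// An already-averaged value together with the number of elements
    /// (e.g. the mini-batch size) it represents.
    Aggregated(f64, usize),
}

/// Computes the mean of a series of entries, weighting aggregated entries
/// by their batch size so uneven mini-batches do not skew the average.
fn mean(entries: &[NumericEntry]) -> f64 {
    let (sum, count) = entries.iter().fold((0.0, 0usize), |(s, n), e| match e {
        NumericEntry::Value(v) => (s + v, n + 1),
        NumericEntry::Aggregated(avg, size) => (s + avg * (*size as f64), n + size),
    });
    sum / count as f64
}

fn main() {
    // Two batches: 8 samples averaging 0.5 and 2 samples averaging 1.0.
    let entries = [
        NumericEntry::Aggregated(0.5, 8),
        NumericEntry::Aggregated(1.0, 2),
    ];
    // A naive mean of the per-batch averages would give 0.75; the
    // batch-size-weighted mean is (0.5 * 8 + 1.0 * 2) / 10 = 0.6.
    println!("{}", mean(&entries)); // prints 0.6

    // Plain values are each weighted as a single element.
    let single = [NumericEntry::Value(1.0), NumericEntry::Value(3.0)];
    println!("{}", mean(&single)); // prints 2
}
```

With equal batch sizes the naive and weighted means coincide, which is why the previous implementation only produced incorrect averages when the last (or any) mini-batch was smaller than the rest.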

Testing

Added an aggregate test for NumericEntry usage with a logger, and ran the examples with the newly added summary.

Example Usage

Since the training artifacts are persistent (still on disk after a training is completed), the LearnerSummary doesn't have to be used only at the end of your training script. It can be used at any time by pointing to the artifact directory, and you can select which metrics to look for.

let summary = LearnerSummary::new(
    artifact_dir,
    &[AccuracyMetric::<B>::NAME, LossMetric::<B>::NAME],
);
println!("{}", summary);

or

let summary = LearnerSummary::new(
    artifact_dir,
    &["Accuracy", "Loss"],
);
println!("{}", summary);

Sample output:

======================== Learner Summary ========================
Total Epochs: 6


| Split | Metric   | Min.     | Epoch    | Max.     | Epoch    |
|-------|----------|----------|----------|----------|----------|
| Train | Accuracy | 93.492   | 1        | 99.133   | 5        |
| Train | Loss     | 0.029    | 5        | 0.243    | 1        |
| Valid | Accuracy | 97.630   | 1        | 98.910   | 5        |
| Valid | Loss     | 0.031    | 5        | 0.076    | 1        |

Feel free to suggest improvements on the output formatting!


codecov bot commented Apr 8, 2024

Codecov Report

Attention: Patch coverage is 71.97232%, with 81 lines in your changes missing coverage. Please review.

Project coverage is 86.35%. Comparing base (2f88548) to head (76422c3).
Report is 6 commits behind head on main.

❗ Current head 76422c3 differs from pull request most recent head be6afe9. Consider uploading reports for the commit be6afe9 to get more accurate results

| Files | Patch % | Lines |
|-------|---------|-------|
| crates/burn-train/src/learner/summary.rs | 62.03% | 71 Missing ⚠️ |
| crates/burn-train/src/logger/metric.rs | 87.09% | 4 Missing ⚠️ |
| crates/burn-train/src/metric/base.rs | 82.60% | 4 Missing ⚠️ |
| crates/burn-train/src/learner/builder.rs | 0.00% | 1 Missing ⚠️ |
| crates/burn-train/src/metric/processor/metrics.rs | 0.00% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1591      +/-   ##
==========================================
- Coverage   86.39%   86.35%   -0.05%     
==========================================
  Files         688      689       +1     
  Lines       78675    78950     +275     
==========================================
+ Hits        67974    68176     +202     
- Misses      10701    10774      +73     



@antimora antimora left a comment


LGTM

I had one comment


laggui commented Apr 8, 2024

Fixed your comment! Just gonna add a couple of tests for the expected results.


@nathanielsimard nathanielsimard left a comment


Maybe the learner.fit(train, valid) should return the summary with the trained model as well?

Otherwise LGTM!


laggui commented Apr 11, 2024

> Maybe the learner.fit(train, valid) should return the summary with the trained model as well?
>
> Otherwise LGTM!

I thought about that initially!

Went over a couple of iterations before landing on this to retrieve the summary. Wasn't sure if we always wanted to return the summary, but it can always be added with the current implementation.

- Add LearnerSummaryConfig
- Keep track of summary metrics names
- Add model field when displaying from learner.fit()

laggui commented Apr 11, 2024

Made the requested changes we discussed offline to automatically display the summary at the end of learner.fit().


@nathanielsimard nathanielsimard left a comment


LGTM

@laggui laggui merged commit 0cbe9a9 into main Apr 11, 2024
13 checks passed
@laggui laggui deleted the feat/train/report branch April 11, 2024 16:32
Development

Successfully merging this pull request may close these issues.

Generate a training report in burn-train
3 participants