
Commit

Fixed analytics report: working time rounding to minimal 1 hour is not applied to annotation speed anymore (#7898)

<!-- Raise an issue to propose your change
(https://github.com/cvat-ai/cvat/issues).
It helps to avoid duplication of efforts from multiple independent
contributors.
Discuss your ideas with maintainers to be sure that changes will be
approved and merged.
Read the [Contribution guide](https://docs.cvat.ai/docs/contributing/).
-->

<!-- Provide a general summary of your changes in the Title above -->

### Motivation and context
Depends on #7883

### How has this been tested?
<!-- Please describe in detail how you tested your changes.
Include details of your testing environment, and the tests you ran to
see how your change affects other areas of the code, etc. -->

### Checklist
<!-- Go over all the following points, and put an `x` in all the boxes
that apply.
If an item isn't applicable for some reason, then ~~explicitly
strikethrough~~ the whole
line. If you don't do that, GitHub will show incorrect progress for the
pull request.
If you're unsure about any of these, don't hesitate to ask. We're here
to help! -->
- [x] I submit my changes into the `develop` branch
- [x] I have created a changelog fragment <!-- see top comment in
CHANGELOG.md -->
- [ ] I have updated the documentation accordingly
- [ ] I have added tests to cover my changes
- [ ] I have linked related issues (see [GitHub docs](
  https://help.github.com/en/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword))
- [ ] I have increased versions of npm packages if it is necessary
  ([cvat-canvas](https://github.com/cvat-ai/cvat/tree/develop/cvat-canvas#versioning),
  [cvat-core](https://github.com/cvat-ai/cvat/tree/develop/cvat-core#versioning),
  [cvat-data](https://github.com/cvat-ai/cvat/tree/develop/cvat-data#versioning) and
  [cvat-ui](https://github.com/cvat-ai/cvat/tree/develop/cvat-ui#versioning))

### License

- [x] I submit _my code changes_ under the same [MIT License](
https://github.com/cvat-ai/cvat/blob/develop/LICENSE) that covers the
project.
  Feel free to contact the maintainers if that's a concern.


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

- **Bug Fixes**
  - Corrected an issue where analytic reports showed an incorrect count of objects for skeleton tracks and shapes.

- **Improvements**
  - Renamed metrics related to annotation speed from total to average for jobs, tasks, and projects.
  - Updated descriptions for annotation speed metrics to specify the number of objects per hour.
  - Removed unnecessary clamping function for working time statistics.

These changes enhance the accuracy and clarity of analytic reports, providing more meaningful insights into annotation speeds and object counts.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
bsekachev and coderabbitai[bot] authored May 20, 2024
1 parent 59e31b3 commit 1a86ccd
Showing 9 changed files with 39 additions and 46 deletions.
4 changes: 4 additions & 0 deletions changelog.d/20240516_093233_boris_fixed_annotation_speed.md
```diff
@@ -0,0 +1,4 @@
+### Changed
+
+- Working time rounding to a minimal value of 1 hour is not applied to the annotation speed metric any more
+  (<https://github.com/cvat-ai/cvat/pull/7898>)
```
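To illustrate this changelog entry, here is a hypothetical sketch (illustrative function names, not the actual CVAT code): clamping working time to a minimum of 1 hour understated the speed of anyone who annotated for less than an hour.

```python
# Hypothetical sketch of the old vs. new objects-per-hour computation.
# Function names are illustrative; the real logic lives in CVAT's
# analytics report metric classes.

def average_speed_clamped(total_count, total_wt):
    # Old behaviour: working time was rounded up to a minimum of 1 hour.
    return total_count / max(total_wt, 1) if total_wt != 0 else 0

def average_speed(total_count, total_wt):
    # New behaviour: divide by the actual working time.
    return total_count / total_wt if total_wt != 0 else 0

# 30 objects annotated in half an hour:
print(average_speed_clamped(30, 0.5))  # 30.0 (understated)
print(average_speed(30, 0.5))          # 60.0 (true speed)
```

The `if total_wt != 0 else 0` guard mirrors the patch and avoids division by zero when no working time has been recorded.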
4 changes: 4 additions & 0 deletions changelog.d/20240516_093537_boris_fixed_annotation_speed.md
```diff
@@ -0,0 +1,4 @@
+### Changed
+
+- Total annotation speed metric renamed to Average annotation speed
+  (<https://github.com/cvat-ai/cvat/pull/7898>)
```
14 changes: 7 additions & 7 deletions cvat/apps/analytics_report/report/create.py
```diff
@@ -14,17 +14,17 @@
 from cvat.apps.analytics_report.models import AnalyticsReport
 from cvat.apps.analytics_report.report.derived_metrics import (
     DerivedMetricBase,
-    JobTotalAnnotationSpeed,
+    JobAverageAnnotationSpeed,
     JobTotalObjectCount,
     ProjectAnnotationSpeed,
     ProjectAnnotationTime,
+    ProjectAverageAnnotationSpeed,
     ProjectObjects,
-    ProjectTotalAnnotationSpeed,
     ProjectTotalObjectCount,
     TaskAnnotationSpeed,
     TaskAnnotationTime,
+    TaskAverageAnnotationSpeed,
     TaskObjects,
-    TaskTotalAnnotationSpeed,
     TaskTotalObjectCount,
 )
 from cvat.apps.analytics_report.report.primary_metrics import (
@@ -45,7 +45,7 @@ def get_empty_report():
         JobAnnotationSpeed(None),
         JobAnnotationTime(None),
         JobTotalObjectCount(None),
-        JobTotalAnnotationSpeed(None),
+        JobAverageAnnotationSpeed(None),
     ]

     statistics = [AnalyticsReportUpdateManager._get_empty_statistics_entry(dm) for dm in metrics]
@@ -369,7 +369,7 @@ def _compute_report_for_job(
                 data_extractor=None,
                 primary_statistics=primary_statistics[JobAnnotationSpeed.key()],
             ),
-            JobTotalAnnotationSpeed(
+            JobAverageAnnotationSpeed(
                 db_job,
                 data_extractor=None,
                 primary_statistics=primary_statistics[JobAnnotationSpeed.key()],
@@ -433,7 +433,7 @@ def _compute_report_for_task(
                     for jr in job_reports
                 ],
             ),
-            TaskTotalAnnotationSpeed(
+            TaskAverageAnnotationSpeed(
                 db_task,
                 data_extractor=None,
                 primary_statistics=[
@@ -496,7 +496,7 @@ def _compute_report_for_project(
                     for jr in job_reports
                 ],
             ),
-            ProjectTotalAnnotationSpeed(
+            ProjectAverageAnnotationSpeed(
                 db_project,
                 data_extractor=None,
                 primary_statistics=[
```
10 changes: 5 additions & 5 deletions cvat/apps/analytics_report/report/derived_metrics/__init__.py
```diff
@@ -4,11 +4,11 @@

 from .annotation_speed import ProjectAnnotationSpeed, TaskAnnotationSpeed
 from .annotation_time import ProjectAnnotationTime, TaskAnnotationTime
+from .average_annotation_speed import (
+    JobAverageAnnotationSpeed,
+    ProjectAverageAnnotationSpeed,
+    TaskAverageAnnotationSpeed,
+)
 from .base import DerivedMetricBase
 from .objects import ProjectObjects, TaskObjects
-from .total_annotation_speed import (
-    JobTotalAnnotationSpeed,
-    ProjectTotalAnnotationSpeed,
-    TaskTotalAnnotationSpeed,
-)
 from .total_object_count import JobTotalObjectCount, ProjectTotalObjectCount, TaskTotalObjectCount
```
```diff
@@ -12,7 +12,7 @@


 class TaskAnnotationSpeed(DerivedMetricBase, JobAnnotationSpeed):
-    _description = "Metric shows the annotation speed in objects per hour for the Task."
+    _description = "Metric shows annotation speed in the task as number of objects per hour."
     _query = None

     def calculate(self):
@@ -52,4 +52,4 @@ def calculate(self):


 class ProjectAnnotationSpeed(TaskAnnotationSpeed):
-    _description = "Metric shows the annotation speed in objects per hour for the Project."
+    _description = "Metric shows annotation speed in the project as number of objects per hour."
```
```diff
@@ -1,4 +1,4 @@
-# Copyright (C) 2023 CVAT.ai Corporation
+# Copyright (C) 2023-2024 CVAT.ai Corporation
 #
 # SPDX-License-Identifier: MIT

@@ -7,11 +7,11 @@
 from .base import DerivedMetricBase


-class JobTotalAnnotationSpeed(DerivedMetricBase):
-    _title = "Total Annotation Speed (objects per hour)"
-    _description = "Metric shows total annotation speed in the Job."
+class JobAverageAnnotationSpeed(DerivedMetricBase):
+    _title = "Average Annotation Speed (objects per hour)"
+    _description = "Metric shows average annotation speed in the Job."
     _default_view = ViewChoice.NUMERIC
-    _key = "total_annotation_speed"
+    _key = "average_annotation_speed"
     _is_filterable_by_date = False

     def calculate(self):
@@ -23,14 +23,12 @@ def calculate(self):
             total_wt += ds[1]["value"]

         metric = self.get_empty()
-        metric["total_annotation_speed"][0]["value"] = (
-            total_count / max(total_wt, 1) if total_wt != 0 else 0
-        )
+        metric[self._key][0]["value"] = total_count / total_wt if total_wt != 0 else 0
         return metric

     def get_empty(self):
         return {
-            "total_annotation_speed": [
+            self._key: [
                 {
                     "value": 0,
                     "datetime": self._get_utc_now().strftime("%Y-%m-%dT%H:%M:%SZ"),
@@ -39,8 +37,8 @@ def get_empty(self):
         }


-class TaskTotalAnnotationSpeed(JobTotalAnnotationSpeed):
-    _description = "Metric shows total annotation speed in the Task."
+class TaskAverageAnnotationSpeed(JobAverageAnnotationSpeed):
+    _description = "Metric shows average annotation speed in the Task."

     def calculate(self):
         total_count = 0
@@ -53,14 +51,14 @@ def calculate(self):
             total_wt += wt_entry["value"]

         return {
-            "total_annotation_speed": [
+            self._key: [
                 {
-                    "value": total_count / max(total_wt, 1) if total_wt != 0 else 0,
+                    "value": total_count / total_wt if total_wt != 0 else 0,
                     "datetime": self._get_utc_now().strftime("%Y-%m-%dT%H:%M:%SZ"),
                 },
             ]
         }


-class ProjectTotalAnnotationSpeed(TaskTotalAnnotationSpeed):
-    _description = "Metric shows total annotation speed in the Project."
+class ProjectAverageAnnotationSpeed(TaskAverageAnnotationSpeed):
+    _description = "Metric shows average annotation speed in the Project."
```
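The renamed classes also deduplicate the metric key through `self._key`. Here is a minimal, self-contained sketch of that pattern (stand-in base class and a simplified data series, not the real CVAT classes): storing the key once on the class means `calculate()` and `get_empty()` cannot drift out of sync with the dictionary key.

```python
# Sketch of the self._key pattern from the diff. The base class and the
# data-series shape are simplified stand-ins for illustration only.

class DerivedMetricStub:
    _key = None

    @classmethod
    def key(cls):
        return cls._key

class JobAverageAnnotationSpeedSketch(DerivedMetricStub):
    _key = "average_annotation_speed"

    def __init__(self, data_series):
        # data_series: list of (object_count, working_time_hours) pairs
        self._data_series = data_series

    def get_empty(self):
        # the key appears only via self._key, never as a repeated literal
        return {self._key: [{"value": 0}]}

    def calculate(self):
        total_count = sum(count for count, _ in self._data_series)
        total_wt = sum(wt for _, wt in self._data_series)
        metric = self.get_empty()
        metric[self._key][0]["value"] = total_count / total_wt if total_wt != 0 else 0
        return metric

m = JobAverageAnnotationSpeedSketch([(10, 0.25), (20, 0.25)])
print(m.calculate())  # {'average_annotation_speed': [{'value': 60.0}]}
```

Renaming the metric then only requires changing `_key` in one place, as the diff does with `"average_annotation_speed"`.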
15 changes: 1 addition & 14 deletions cvat/apps/analytics_report/report/get.py
```diff
@@ -39,19 +39,6 @@ def _convert_datetime_to_date(statistics):
     return statistics


-def _clamp_working_time(statistics):
-    affected_metrics = "annotation_speed"
-    for metric in statistics:
-        if metric["name"] not in affected_metrics:
-            continue
-        data_series = metric.get("data_series", {})
-        if data_series:
-            for df in data_series["working_time"]:
-                df["value"] = max(df["value"], 1)
-
-    return statistics
-
-
 def _get_object_report(obj_model, pk, start_date, end_date):
     data = {}
     try:
@@ -65,7 +52,7 @@ def _get_object_report(obj_model, pk, start_date, end_date):

     statistics = _filter_statistics_by_date(db_analytics_report.statistics, start_date, end_date)
     statistics = _convert_datetime_to_date(statistics)
-    data["statistics"] = _clamp_working_time(statistics)
+    data["statistics"] = statistics
     data["created_date"] = db_analytics_report.created_date

     if obj_model is Job:
```
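For reference, the deleted helper behaved roughly like this. This is a reconstruction from the removed lines, with the original's substring-based name check (`not in` on a string) simplified to a plain equality comparison:

```python
# Sketch of the removed _clamp_working_time helper, reconstructed from the
# deleted lines; the name check is simplified from the original's substring
# test to an equality test.
def clamp_working_time(statistics):
    for metric in statistics:
        if metric["name"] != "annotation_speed":
            continue
        data_series = metric.get("data_series", {})
        for point in data_series.get("working_time", []):
            # every working_time data point was floored at 1 hour
            point["value"] = max(point["value"], 1)
    return statistics

stats = [{
    "name": "annotation_speed",
    "data_series": {"working_time": [{"value": 0.2}, {"value": 3}]},
}]
clamp_working_time(stats)
print(stats[0]["data_series"]["working_time"])  # [{'value': 1}, {'value': 3}]
```

With the clamp removed, reports now return the recorded working time unchanged, which is what the corrected average speed metric divides by.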
```diff
@@ -54,7 +54,7 @@ def __init__(
 class JobAnnotationSpeed(PrimaryMetricBase):
     _key = "annotation_speed"
     _title = "Annotation speed (objects per hour)"
-    _description = "Metric shows the annotation speed in objects per hour."
+    _description = "Metric shows annotation speed in the job as number of objects per hour."
     _default_view = ViewChoice.HISTOGRAM
     _granularity = GranularityChoice.DAY
     _is_filterable_by_date = False
```
4 changes: 2 additions & 2 deletions tests/cypress/e2e/features/analytics_pipeline.js
```diff
@@ -52,7 +52,7 @@ context('Analytics pipeline', () => {
         },
     ];

-    const cardEntryNames = ['annotation_time', 'total_object_count', 'total_annotation_speed'];
+    const cardEntryNames = ['annotation_time', 'total_object_count', 'average_annotation_speed'];
     function checkCards() {
         cy.get('.cvat-analytics-card')
             .should('have.length', 3)
@@ -61,7 +61,7 @@ function checkCards() {
                 .invoke('data', 'entry-name')
                 .then((val) => {
                     expect(cardEntryNames.includes(val)).to.eq(true);
-                    if (['total_object_count', 'total_annotation_speed'].includes(val)) {
+                    if (['total_object_count', 'average_annotation_speed'].includes(val)) {
                         cy.wrap(card).within(() => {
                             cy.get('.cvat-analytics-card-value').should('not.have.text', '0.0');
                         });
```
