
fix: invalid metric should raise an exception #20882

Merged: 8 commits merged into apache:master on Jul 28, 2022

Conversation

zhaoyongjie (Member) commented Jul 27, 2022

SUMMARY

In some unknown cases (perhaps after a database migration?) an ad-hoc metric may be missing the expressionType field. When that happens, an exception should be raised so the user can identify and fix the incorrect metric.


The original issue was reported via Shortcut.

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/api/__init__.py", line 85, in wraps
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/superset/views/base_api.py", line 113, in wraps
    raise ex
  File "/usr/local/lib/python3.8/site-packages/superset/views/base_api.py", line 110, in wraps
    duration, response = time_function(f, self, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/superset/utils/core.py", line 1507, in time_function
    response = func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/superset/utils/log.py", line 245, in wrapper
    value = f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/superset/dashboards/api.py", line 381, in get_datasets
    datasets = DashboardDAO.get_datasets_for_dashboard(id_or_slug)
  File "/usr/local/lib/python3.8/site-packages/superset/dashboards/dao.py", line 52, in get_datasets_for_dashboard
    return dashboard.datasets_trimmed_for_slices()
  File "/usr/local/lib/python3.8/site-packages/flask_caching/__init__.py", line 905, in decorated_function
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/superset/models/dashboard.py", line 314, in datasets_trimmed_for_slices
    result.append(datasource.data_for_slices(slices))
  File "/usr/local/lib/python3.8/site-packages/superset/connectors/base/models.py", line 313, in data_for_slices
    metric_names.add(utils.get_metric_name(metric))
  File "/usr/local/lib/python3.8/site-packages/superset/utils/core.py", line 1309, in get_metric_name
    return verbose_map.get(metric, metric)  # type: ignore
TypeError: unhashable type: 'dict'

Judging from this error alone, the metric being looked up is not a hashable type, e.g.:

>>> foo = {'key': 123}
>>> bar = {'a': 1}
>>>
>>> bar.get(foo)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'dict'
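
For context, below is a minimal sketch of the kind of guard this PR describes: accept a saved-metric name (a string) or a well-formed ad-hoc metric dict, and raise a clear error for anything else. This is an illustration only, not the actual Superset implementation; the InvalidMetricError name and the exact validation rules are assumptions.

```python
from typing import Any, Dict, Optional


class InvalidMetricError(ValueError):
    """Raised when a metric is neither a string nor a well-formed ad-hoc metric."""


def get_metric_name(metric: Any, verbose_map: Optional[Dict[str, str]] = None) -> str:
    # Hypothetical sketch; Superset's real get_metric_name may differ in detail.
    verbose_map = verbose_map or {}
    if isinstance(metric, str):
        # Saved metrics are referenced by name; verbose_map maps name -> label.
        return verbose_map.get(metric, metric)
    if isinstance(metric, dict):
        # Ad-hoc metrics are dicts identified by a label or a SQL expression.
        if metric.get("label"):
            return metric["label"]
        if metric.get("expressionType") == "SQL" and metric.get("sqlExpression"):
            return metric["sqlExpression"]
    # Instead of the opaque "TypeError: unhashable type: 'dict'", fail loudly.
    raise InvalidMetricError(f"Invalid metric: {metric!r}")
```

With a guard like this, a malformed ad-hoc metric surfaces as a clear validation error that the API layer can translate into a 400 response, rather than an unhandled TypeError that becomes a 500.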

TESTING INSTRUCTIONS

Before the fix

  1. Open the Explore page with birth_name as the datasource and Table as the viz type.
  2. Click the metrics control and create an ad-hoc metric as in the following screenshot.
    (screenshot)
  3. Save the chart and add it to a new dashboard.
  4. Open an arbitrary SQL client and connect to the Superset metadata database.
  5. Open the slices table and find the newest record.
  6. Check the params column and remove "expressionType":"SQL", from it.

(screenshot)

  7. Go to `/api/v1/dashboard/{$DashboardID}/datasets` and observe the error; the response status code is 500.

After the fix

  1. Go to `/api/v1/dashboard/{$DashboardID}/datasets` again; the status code is 400, and the error message points out the specific invalid metric (see the verification sketch below).
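
As a quick way to check both behaviors, here is a hypothetical verification snippet; the host, DASHBOARD_ID, and TOKEN values are placeholders, and the authentication details depend on your deployment.

```python
# Hypothetical check against a locally running Superset; values below are placeholders.
import requests

DASHBOARD_ID = 1  # the dashboard created in the steps above
TOKEN = "<access token>"

resp = requests.get(
    f"http://localhost:8088/api/v1/dashboard/{DASHBOARD_ID}/datasets",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
# Expected: 500 before the fix, 400 (with a descriptive error body) after the fix.
print(resp.status_code, resp.text)
```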

ADDITIONAL INFORMATION

  • Has associated issue:
  • Required feature flags:
  • Changes UI
  • Includes DB Migration (follow approval process in SIP-59)
    • Migration is atomic, supports rollback & is backwards-compatible
    • Confirm DB migration upgrade and downgrade tested
    • Runtime estimates and downtime expectations provided
  • Introduces new feature or API
  • Removes existing feature or API

codecov bot commented Jul 27, 2022

Codecov Report

Merging #20882 (ad0215c) into master (77db065) will decrease coverage by 11.47%.
The diff coverage is 75.00%.

❗ Current head ad0215c differs from pull request most recent head 5572f3f. Consider uploading reports for the commit 5572f3f to get more accurate results

@@             Coverage Diff             @@
##           master   #20882       +/-   ##
===========================================
- Coverage   66.25%   54.78%   -11.48%     
===========================================
  Files        1758     1758               
  Lines       67048    67049        +1     
  Branches     7118     7118               
===========================================
- Hits        44423    36730     -7693     
- Misses      20808    28502     +7694     
  Partials     1817     1817               
Flag       Coverage Δ
hive       53.24% <50.00%> (+<0.01%) ⬆️
mysql      ?
postgres   ?
presto     53.14% <50.00%> (+<0.01%) ⬆️
python     57.84% <75.00%> (-23.72%) ⬇️
sqlite     ?
unit       50.25% <75.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                                  Coverage Δ
superset/dashboards/api.py                      51.15% <0.00%> (-41.33%) ⬇️
superset/models/sql_lab.py                      74.16% <100.00%> (-4.59%) ⬇️
superset/utils/core.py                          62.88% <100.00%> (-27.35%) ⬇️
superset/utils/dashboard_import_export.py       0.00% <0.00%> (-100.00%) ⬇️
superset/key_value/commands/update.py           0.00% <0.00%> (-88.89%) ⬇️
superset/key_value/commands/delete.py           0.00% <0.00%> (-85.30%) ⬇️
superset/key_value/commands/delete_expired.py   0.00% <0.00%> (-80.77%) ⬇️
superset/dashboards/commands/importers/v0.py    15.62% <0.00%> (-76.25%) ⬇️
superset/datasets/commands/update.py            25.30% <0.00%> (-68.68%) ⬇️
superset/datasets/commands/create.py            29.41% <0.00%> (-68.63%) ⬇️
... and 281 more


pull-request-size bot added the size/M label and removed the size/S label on Jul 27, 2022
@@ -716,7 +716,7 @@ def test_get_data_transforms_dataframe(self):
self.assertEqual(data, expected)

def test_get_data_empty_null_keys(self):
form_data = {"groupby": [], "metrics": ["", None]}
zhaoyongjie (Member, Author) commented Jul 27, 2022:
The metric must be a string or AdhocMetric.
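
For illustration, the two accepted shapes look roughly like the snippet below; the field values are assumptions for the example, not values taken from this diff.

```python
# A metric in form_data is either the name of a saved metric (a plain string) ...
saved_metric = "count"

# ... or an ad-hoc metric dict, which must carry an expressionType.
adhoc_metric = {
    "expressionType": "SQL",
    "sqlExpression": "SUM(num)",
    "label": "total_num",
}
```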

@@ -43,5 +43,5 @@
"granularity_sqla": null,
"autozoom": true,
"url_params": {},
"size": 100
"size": "100"
zhaoyongjie (Member, Author) commented:
The metric must be a string.

@@ -45,5 +45,5 @@
"granularity_sqla": null,
"autozoom": true,
"url_params": {},
"size": 100
"size": "100"
zhaoyongjie (Member, Author) commented:
The metric must be a string.

Comment on lines -64 to +66
- class Query(Model, ExtraJSONMixin, ExploreMixin): # pylint: disable=abstract-method
+ class Query(
+     Model, ExtraJSONMixin, ExploreMixin
+ ): # pylint: disable=abstract-method,too-many-public-methods
zhaoyongjie (Member, Author) commented:
Bycatch: CI was failing due to a pylint error.

superset/dashboards/api.py (outdated review comment, resolved)
}
self.assertEqual(data, expected)

form_data = {"groupby": [], "metrics": [None]}
A reviewer (Member) commented:
Should we add a test case where metric is a dict, too?

zhaoyongjie (Member, Author) replied:
There are more unit tests here; I added some invalid-metric cases as well.
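
For illustration only, here is a self-contained sketch of the kind of invalid-metric cases worth covering; the resolve_metric_name helper below is a toy stand-in, not Superset's real helper, and these are not the exact tests added in this PR.

```python
from typing import Any

import pytest


def resolve_metric_name(metric: Any) -> str:
    """Toy stand-in for a metric-name resolver that rejects malformed metrics."""
    if isinstance(metric, str) and metric:
        return metric
    if isinstance(metric, dict) and metric.get("label") and metric.get("expressionType"):
        return metric["label"]
    raise ValueError(f"Invalid metric: {metric!r}")


@pytest.mark.parametrize(
    "bad_metric",
    [None, "", {}, {"label": "x"}, {"sqlExpression": "SUM(num)"}, ["count"]],
)
def test_invalid_metric_raises(bad_metric: Any) -> None:
    # Anything that is neither a non-empty name nor a complete ad-hoc metric should raise.
    with pytest.raises(ValueError):
        resolve_metric_name(bad_metric)
```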

zhaoyongjie merged commit 718bc30 into apache:master on Jul 28, 2022.
mistercrunch added the 🏷️ bot label (used by `supersetbot` to keep track of which PRs were auto-tagged with release labels) and the 🚢 2.1.0 label, and removed the 🚢 2.1.3 label, on Mar 13, 2024.
Labels
🏷️ bot · size/M · 🚢 2.1.0
3 participants