iercan edited a comment on issue #12766:
URL: https://github.com/apache/superset/issues/12766#issuecomment-768829623
> * Which visualizations are triggering the errors?
On the dashboard I tested we have a filter box, a line chart, a time series chart, a big number, and a table. We also have metrics defined on that dataset.
> * It appears from the stacktraces that you're querying an Apache Druid DB, is that correct? Are you seeing errors with other DB types?
Yes, it comes from Druid, but I got the same error from MySQL too.
```
worker_1 | [2021-01-28 06:17:13,108: ERROR/ForkPoolWorker-8] Task load_explore_json_into_cache[fd7b0575-725e-4b19-925b-4f6530cfc478] raised unexpected: SupersetVizException('[{\'message\': \'"Could not locate column in row for column \\\'sql_metrics.id\\\'"\', \'error_type\': <SupersetErrorType.VIZ_GET_DF_ERROR: \'VIZ_GET_DF_ERROR\'>, \'level\': <ErrorLevel.ERROR: \'error\'>, \'extra\': None}]')
worker_1 | Traceback (most recent call last):
worker_1 |   File "/usr/local/lib/python3.7/site-packages/celery/app/trace.py", line 412, in trace_task
worker_1 |     R = retval = fun(*args, **kwargs)
worker_1 |   File "/app/superset/app.py", line 116, in __call__
worker_1 |     return task_base.__call__(self, *args, **kwargs)
worker_1 |   File "/usr/local/lib/python3.7/site-packages/celery/app/trace.py", line 704, in __protected_call__
worker_1 |     return self.run(*args, **kwargs)
worker_1 |   File "/app/superset/tasks/async_queries.py", line 108, in load_explore_json_into_cache
worker_1 |     raise exc
worker_1 |   File "/app/superset/tasks/async_queries.py", line 86, in load_explore_json_into_cache
worker_1 |     raise SupersetVizException(errors=payload["errors"])
worker_1 | superset.exceptions.SupersetVizException: [{'message': '"Could not locate column in row for column \'sql_metrics.id\'"', 'error_type': <SupersetErrorType.VIZ_GET_DF_ERROR: 'VIZ_GET_DF_ERROR'>, 'level': <ErrorLevel.ERROR: 'error'>, 'extra': None}]
```
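For anyone skimming the log, here is a minimal, self-contained sketch (not Superset's actual implementation) of the pattern the traceback shows: the async Celery task builds the chart payload, and if the viz layer reports errors it re-raises them instead of writing the result to the cache. The broker URL, the `get_viz_payload` helper, and the stand-in exception class below are all hypothetical.

```python
# Minimal sketch of the failure pattern shown in the traceback above --
# NOT Superset's actual code. Everything except the task name
# load_explore_json_into_cache is an illustrative stand-in.
from celery import Celery

app = Celery("sketch", broker="memory://")  # placeholder broker


class SupersetVizException(Exception):
    """Stand-in for superset.exceptions.SupersetVizException."""

    def __init__(self, errors):
        super().__init__(errors)
        self.errors = errors


def get_viz_payload(form_data):
    # Placeholder for the viz payload build step. In the failing case the
    # Druid/MySQL result set is missing the column 'sql_metrics.id', so the
    # payload comes back carrying a VIZ_GET_DF_ERROR entry.
    return {
        "errors": [
            {
                "message": "Could not locate column in row for column "
                "'sql_metrics.id'",
                "error_type": "VIZ_GET_DF_ERROR",
            }
        ]
    }


@app.task(name="load_explore_json_into_cache")
def load_explore_json_into_cache(form_data):
    payload = get_viz_payload(form_data)
    if payload.get("errors"):
        # This re-raise is what Celery logs as "raised unexpected: ...".
        raise SupersetVizException(errors=payload["errors"])
    return payload


# Calling the task body directly reproduces the logged exception:
# load_explore_json_into_cache.run({"slice_id": 1})  -> SupersetVizException
```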
> * Are you able to run asynchronous queries in SQL Lab?
SQL Lab works fine.
> The same code is used to query the analytics DB in sync and async mode, so I'm also very curious why there's a discrepancy between the two. I should note that celery does not support hot reloads, so workers need to be restarted on any config changes.
I'm restarting all containers with `docker-compose restart` whenever I change the config.
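For reference, the kind of worker-side config this applies to is the Celery / feature-flag section of `superset_config.py`. A rough sketch is below; `CELERY_CONFIG`, `FEATURE_FLAGS`, and the `GLOBAL_ASYNC_QUERIES` flag are standard Superset settings, while the Redis URLs are placeholders for whatever the docker-compose stack actually uses.

```python
# superset_config.py -- rough sketch, not my exact config.
# Celery workers only pick these values up on restart, which is why
# `docker-compose restart` is needed after every change.

class CeleryConfig:  # new-style (lowercase) Celery settings
    broker_url = "redis://redis:6379/0"          # placeholder URL
    result_backend = "redis://redis:6379/0"      # placeholder URL
    imports = ("superset.sql_lab", "superset.tasks")


CELERY_CONFIG = CeleryConfig

FEATURE_FLAGS = {
    # Async chart loading -- this is what routes explore requests through
    # the load_explore_json_into_cache task seen in the traceback.
    "GLOBAL_ASYNC_QUERIES": True,
}
```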