GitHub user Nimishbansal-16 added a comment to the discussion: Automatic 
refresh of Dashboards

@dosu I am using Celery to enable async querying as well, but I am getting the 
error "Results backend is not configured."

My Superset config:

from kombu import Queue

class CeleryConfig(object):
    broker_url = "redis://:{pass}@{ip}:{port}/1"
    imports = (
        "superset.sql_lab",
        "superset.tasks.scheduler",
    )
    result_backend = "redis://:{pass}@{ip}:{port}/1"
    worker_prefetch_multiplier = 10
    task_acks_late = True
#    broker_connection_retry_on_startup = True
    task_annotations = {
        "sql_lab.get_sql_results": {
            "rate_limit": "100/s",
        },
    }
    task_default_queue = 'superset_queue'
    task_default_routing_key = 'superset.tasks'
    task_queues = (
        Queue('superset_queue', routing_key='superset.#'),
    )

CELERY_CONFIG = CeleryConfig

FEATURE_FLAGS = {
    "HORIZONTAL_FILTER_BAR": True,
    "DRILL_TO_DETAIL": True,
    "DASHBOARD_CACHE": True,
    "EMBEDDED_SUPERSET": True,
    "TAGGING_SYSTEM": True,
    "ENABLE_ASYNC_QUERY_EXECUTION": True,
    "GLOBAL_ASYNC_QUERIES": True,
}


When I run a query in SQL Lab against a database connection where async 
queries are enabled, these are the Celery debug logs:

[2025-07-29 01:50:17,229: DEBUG/MainProcess] | Worker: Starting Consumer
[2025-07-29 01:50:17,230: DEBUG/MainProcess] | Consumer: Starting Connection
[2025-07-29 01:50:17,240: INFO/MainProcess] Connected to 
redis://:**@10.225.0.31:6379/1
[2025-07-29 01:50:17,240: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:17,240: DEBUG/MainProcess] | Consumer: Starting Events
[2025-07-29 01:50:17,243: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:17,243: DEBUG/MainProcess] | Consumer: Starting Heart
[2025-07-29 01:50:17,246: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:17,246: DEBUG/MainProcess] | Consumer: Starting Mingle
[2025-07-29 01:50:17,246: INFO/MainProcess] mingle: searching for neighbors
[2025-07-29 01:50:18,256: INFO/MainProcess] mingle: all alone
[2025-07-29 01:50:18,257: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:18,257: DEBUG/MainProcess] | Consumer: Starting Gossip
[2025-07-29 01:50:18,260: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:18,261: DEBUG/MainProcess] | Consumer: Starting Tasks
[2025-07-29 01:50:18,264: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:18,264: DEBUG/MainProcess] | Consumer: Starting Control
[2025-07-29 01:50:18,268: DEBUG/MainProcess] ^-- substep ok
[2025-07-29 01:50:18,268: DEBUG/MainProcess] | Consumer: Starting event loop
[2025-07-29 01:50:18,268: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2025-07-29 01:50:18,269: INFO/MainProcess] celery@fcc-pricing-pg-superset 
ready.
[2025-07-29 01:50:18,269: DEBUG/MainProcess] basic.qos: prefetch_count->40
[2025-07-29 01:50:24,071: INFO/MainProcess] Task 
sql_lab.get_sql_results[3856e447-89ce-463f-b1cc-2a2857cb8b71] received
[2025-07-29 01:50:24,071: DEBUG/MainProcess] TaskPool: Apply <function 
fast_trace_task at 0x7f80710fe840> (args:('sql_lab.get_sql_results', 
'3856e447-89ce-463f-b1cc-2a2857cb8b71', {'lang': 'py', 'task': 
'sql_lab.get_sql_results', 'id': '3856e447-89ce-463f-b1cc-2a2857cb8b71', 
'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': 
None, 'retries': 0, 'timelimit': [21660, 21600], 'root_id': 
'3856e447-89ce-463f-b1cc-2a2857cb8b71', 'parent_id': None, 'argsrepr': "(81, 
'select * from ut_product limit 100')", 'kwargsrepr': "{'return_results': 
False, 'store_results': True, 'username': 'admin', 'start_time': 
1753753824037.0059, 'expand_data': False, 'log_params': {'user_agent': 
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, 
like Gecko) Chrome/138.0.0.0 Safari/537.36'}}", 'origin': 
'gen12840@fcc-pricing-pg-superset', 'ignore_result': False, 
'replaced_task_nesting': 0, 'stamped_headers': None, 'stamps': {}, 
'properties': {'correlation_id': '3856e447-89ce-463f-b1cc-2a2857cb8b71', 
'reply_to': '43f0e6a5-70cb-344a-9a29-d3be85ff4bab', 'delivery_mode': 2, 
'delivery_info': {'exchange':... kwargs:{})
[2025-07-29 01:50:24,093: DEBUG/ForkPoolWorker-2] [stats_logger] (timing) 
sqllab.query.time_pending | 56.128173828125 
[2025-07-29 01:50:24,124: DEBUG/ForkPoolWorker-2] Using python library for 
writing JSON byte strings
[2025-07-29 01:50:24,178: DEBUG/ForkPoolWorker-2] Query 81: Results backend is 
not configured.
[2025-07-29 01:50:24,178: DEBUG/ForkPoolWorker-2] [stats_logger] (incr) 
error_sqllab_unhandled
[2025-07-29 01:50:24,193: INFO/ForkPoolWorker-2] Task 
sql_lab.get_sql_results[3856e447-89ce-463f-b1cc-2a2857cb8b71] succeeded in 
0.1202460378408432s: {'status': 'failed', 'error': 'Results backend is not 
configured.', 'errors': [{'message': 'Results backend is not configured.', 
'error_type': 'RESULTS_BACKEND_NOT_CONFIGURED_ERROR', 'level': 'error', 
'extra': {...}}]}
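
For reference, the error_type in the last log line is 
RESULTS_BACKEND_NOT_CONFIGURED_ERROR, which as far as I can tell refers to 
Superset's own RESULTS_BACKEND setting (the cache SQL Lab stores result sets 
in), not Celery's result_backend. I have not set RESULTS_BACKEND in 
superset_config.py. Below is a minimal sketch of what I think that setting 
would look like, assuming a cachelib RedisCache pointed at the same Redis 
instance ({pass}/{ip}/{port} are the same placeholders as in CeleryConfig 
above; the db and key_prefix values are just my guesses):

from cachelib.redis import RedisCache

# Cache used by SQL Lab / async queries to store result sets.
# {pass}, {ip}, {port} are the same placeholders used in CeleryConfig above.
RESULTS_BACKEND = RedisCache(
    host="{ip}",
    port={port},
    password="{pass}",
    db=1,
    key_prefix="superset_results",
)

Is this the missing piece, or is something else required for 
ENABLE_ASYNC_QUERY_EXECUTION / GLOBAL_ASYNC_QUERIES to work?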



GitHub link: 
https://github.com/apache/superset/discussions/34340#discussioncomment-13916437
