aminghadersohi opened a new pull request, #38977:
URL: https://github.com/apache/superset/pull/38977

   ## Summary
   
   Batch fix for three MCP tool bugs:
   
   1. **execute_sql crashes parallel calls on nonexistent database_id** — Instead of raising a hard `SupersetErrorException` that kills all sibling parallel tool calls, return an `ExecuteSqlResponse` with `success=False`. Access-denied errors receive the same treatment.
   
   2. **All list tools return null for changed_on_humanized** — The DAO column 
query skips `changed_on_humanized` because it's a Python `@property`, not a DB 
column. Fix: add `changed_on` to DEFAULT_COLUMNS so the timestamp is actually 
queried, then compute the humanized string in each serializer using 
`humanize.naturaltime()`.
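   The serializer-side computation can be sketched as follows; the real code uses `humanize.naturaltime()`, replaced here by a stdlib-only stand-in so the sketch is self-contained, and `serialize_chart` is a hypothetical example name:

```python
from datetime import datetime, timezone
from typing import Any, Dict, Optional


def naturaltime_sketch(changed_on: datetime, now: Optional[datetime] = None) -> str:
    # Stdlib-only stand-in for humanize.naturaltime(), which the
    # serializers actually call.
    now = now or datetime.now(timezone.utc)
    seconds = int((now - changed_on).total_seconds())
    for unit, size in (("day", 86400), ("hour", 3600), ("minute", 60)):
        count = seconds // size
        if count:
            return f"{count} {unit}{'s' if count != 1 else ''} ago"
    return "just now"


def serialize_chart(chart: Dict[str, Any]) -> Dict[str, Any]:
    # changed_on is populated now that it is part of DEFAULT_COLUMNS;
    # guard against None for rows that have never been modified.
    changed_on = chart.get("changed_on")
    return {
        **chart,
        "changed_on_humanized": naturaltime_sketch(changed_on) if changed_on else None,
    }
```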
   
   3. **get_chart_data fails for legacy deck.gl charts without query_context** 
— Deck.gl charts use spatial data (lat/lon) that can't be reconstructed from 
form_data. Return a clear `ChartError` explaining the chart needs to be 
re-saved, instead of falling through to a cryptic empty-query error.
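   The guard described above might look roughly like this; `ChartError`, the `deck_` viz-type prefix check, and the exact message are assumptions about the MCP code, not its actual API:

```python
# Hypothetical guard: ChartError and the deck_ prefix convention are
# assumptions for illustration, not the exact Superset MCP names.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChartError:
    error: str


def check_legacy_deckgl(viz_type: str, query_context: Optional[str]) -> Optional[ChartError]:
    # Legacy deck.gl charts hold spatial (lat/lon) settings that cannot be
    # rebuilt from form_data, so fail early with an actionable message
    # instead of falling through to a cryptic empty-query error.
    if viz_type.startswith("deck_") and not query_context:
        return ChartError(
            error=(
                "This legacy deck.gl chart has no saved query_context; "
                "open and re-save the chart in Superset to regenerate it."
            )
        )
    return None  # chart is fine; proceed with normal data retrieval
```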
   
   ## Test plan
   
   - [ ] Call `execute_sql` with a nonexistent `database_id` — should return 
`{success: false, error: "Database with ID X not found"}` instead of crashing
   - [ ] Call `execute_sql` with a database the user has no access to — should 
return graceful error
   - [ ] Call `list_charts`, `list_dashboards`, `list_datasets` — 
`changed_on_humanized` should be populated (e.g. "2 hours ago") instead of null
   - [ ] Call `get_chart_data` on a legacy deck.gl chart — should return a 
clear error message about re-saving


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

