iercan opened a new issue, #36765: URL: https://github.com/apache/superset/issues/36765
### Bug description

Creating a native filter on a boolean column causes the Databricks query to fail, likely due to an SQL translation issue.

Here is the filter we add:

<img width="486" height="222" alt="Image" src="https://github.com/user-attachments/assets/911801b7-e745-4929-949d-e9c85b0f6650" />

And this is the error whenever we try to set any value on this filter:

```
Error: [DATATYPE_MISMATCH.DATA_DIFF_TYPES] Cannot resolve "(is_test_user IN (0))" due to data type mismatch: Input to `in` should all be the same type, but it's ["BOOLEAN", "INT"]. SQLSTATE: 42K09; line 6 pos 15
```

This is the query Superset sends to Databricks:

```
SELECT ...
FROM prod.video_coin_operation_extended
WHERE is_test_user IN (0)
...
```

Databricks does not accept integer values for boolean columns; the query should have used `is_test_user = false` instead.

Tested on master and version 6.0.0.

We worked around the issue by creating a string-type calculated column for filtering, but would still appreciate a proper fix.

### Screenshots/recordings

_No response_

### Superset version

master / latest-dev

### Python version

3.9

### Node version

16

### Browser

Chrome

### Additional context

_No response_

### Checklist

- [ ] I have searched Superset docs and Slack and didn't find a solution to my problem.
- [ ] I have searched the GitHub issue tracker and didn't find a similar bug report.
- [ ] I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.
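The error message indicates that the native filter emits its values as the integers `0`/`1` while the target column is `BOOLEAN`, so Databricks rejects the mixed-type `IN` list. As a rough illustration of the kind of coercion a fix might apply before the `IN` clause is rendered (the helper name and its placement are hypothetical, not Superset's actual code):

```python
def coerce_boolean_filter_values(values):
    """Hypothetical sketch: map the 0/1 integers a native filter emits
    onto real Python booleans, leaving other values untouched, so the
    rendered predicate becomes `col IN (false)` instead of `col IN (0)`."""
    return [
        # `bool` is a subclass of `int` in Python, so exclude it explicitly
        bool(v) if isinstance(v, int) and not isinstance(v, bool) else v
        for v in values
    ]

# The failing filter value from the report:
print(coerce_boolean_filter_values([0]))  # [False]
```

With real booleans in the value list, SQLAlchemy's Databricks dialect would bind them as boolean literals and the type mismatch would not arise.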
