martin-raymond opened a new issue, #25749: URL: https://github.com/apache/superset/issues/25749
We are using a Pinot table as a dataset, with a temporal column _startDate_ in _epoch_ms_. Before version 3.0.0.rc4 we were using it to build an area chart. Since the 3.0.0.rc4 release, the chart no longer loads and fails with:

**DB engine Error: could not convert string to Timestamp**

Before 3.0.0.rc4, the query generated by the simple area chart was:

```sql
SELECT DATETIMECONVERT(startDate, '1:MILLISECONDS:EPOCH', '1:MILLISECONDS:EPOCH', '1:DAYS'), count(*)
FROM "dataset"
WHERE startDate >= 1694908800000 AND startDate < 1697500800000
GROUP BY DATETIMECONVERT(startDate, '1:MILLISECONDS:EPOCH', '1:MILLISECONDS:EPOCH', '1:DAYS')
ORDER BY count(*) DESC
LIMIT 10000;
```

(1694908800000 and 1697500800000 are just examples of our time filter.)

Since 3.0.0.rc4, the query that causes the error is:

```sql
SELECT CAST(DATE_TRUNC('day', CAST(DATETIMECONVERT((startDate/1000), '1:SECONDS:EPOCH', '1:SECONDS:EPOCH', '1:SECONDS') AS TIMESTAMP)) AS TIMESTAMP), count(*)
FROM "dataset"
WHERE startDate >= 1694908800000 AND startDate < 1697500800000
GROUP BY CAST(DATE_TRUNC('day', CAST(DATETIMECONVERT((startDate/1000), '1:SECONDS:EPOCH', '1:SECONDS:EPOCH', '1:SECONDS') AS TIMESTAMP)) AS TIMESTAMP)
ORDER BY count(*) DESC
LIMIT 10000;
```

The fragment `DATETIMECONVERT((startDate/1000), '1:SECONDS:EPOCH', '1:SECONDS:EPOCH', '1:SECONDS')` looks odd because it literally says "convert seconds to seconds", and we think it might be causing the crash.

We believe the problem was introduced by: https://github.com/apache/superset/pull/24942

Thanks for your help.

- superset version: `3.0.0.rc4`
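To make the difference explicit, here are just the two time expressions side by side (both copied from the queries above; the comments are our reading of the Pinot `DATETIMECONVERT` arguments, not an official explanation):

```sql
-- Pre-3.0.0.rc4 (works): startDate stays in epoch milliseconds and is only
-- rounded down to day boundaries by the output granularity '1:DAYS'.
DATETIMECONVERT(startDate, '1:MILLISECONDS:EPOCH', '1:MILLISECONDS:EPOCH', '1:DAYS')

-- 3.0.0.rc4 (fails): startDate is first divided down to epoch seconds, then passed
-- through a seconds-to-seconds conversion at 1-second granularity (a no-op),
-- and the result of that is what gets CAST to TIMESTAMP.
CAST(DATETIMECONVERT((startDate/1000), '1:SECONDS:EPOCH', '1:SECONDS:EPOCH', '1:SECONDS') AS TIMESTAMP)
```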
