Hi Everyone,

While reading data into Spark 2.0.0 DataFrames through the Calcite JDBC driver, 
depending on how the Calcite JDBC connection properties are set up (the lexical 
settings), the DataFrame query sometimes returns an empty result set and sometimes 
errors out with an exception:
java.sql.SQLException: Error while preparing statement…
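For context, the setup looks roughly like this (a sketch only: the model path and table name are placeholders, and `lex=JAVA` is just one example value of Calcite's lexical connection property):

```scala
// Spark 2.0.0: read a Calcite-backed table over JDBC.
// "lex" is Calcite's lexical-convention connection property
// (e.g. JAVA, MYSQL, ORACLE). The model path and table name
// below are placeholders, not the actual setup.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("calcite-jdbc").getOrCreate()

val df = spark.read
  .format("jdbc")
  .option("driver", "org.apache.calcite.jdbc.Driver")
  .option("url", "jdbc:calcite:model=/path/to/model.json;lex=JAVA")
  .option("dbtable", "my_table")
  .load()
```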

One scenario I tracked down: with df.filter($"col" === "value"), the SQL that 
Spark generates contains "WHERE (col IS NOT NULL) AND (col = 'value')", which 
fails in the Calcite SQL parser. If (col IS NOT NULL) is removed, the query 
goes through fine.
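One workaround I'm considering (a sketch only, untested against Calcite) is to embed the filter in the dbtable subquery myself, so Spark reads an already-filtered result and never generates the extra IS NOT NULL predicate; the table and column names are placeholders:

```scala
// Possible workaround (untested): push the predicate into the
// JDBC subquery ourselves. Spark then wraps this subquery as the
// source table and does not emit its own "col IS NOT NULL" guard.
val filtered = spark.read
  .format("jdbc")
  .option("driver", "org.apache.calcite.jdbc.Driver")
  .option("url", "jdbc:calcite:model=/path/to/model.json;lex=JAVA")
  .option("dbtable", "(SELECT * FROM my_table WHERE col = 'value') AS t")
  .load()
```

The downside is that the filter is hardcoded in the subquery instead of expressed through the DataFrame API, so it would only help for static predicates.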

So, has anybody encountered similar SQL compatibility issues, especially with 
Calcite? Are there configuration changes, on the Spark and/or Calcite side, 
that would make them work together?

Thanks
Herman.
