I have run into a very strange issue. After loading Parquet tables and running an SQL query with the SQL module, the results are incorrect on Spark 2.0, although over the exact same dataset Spark 1.6 returns correct results. With text files, however, both versions of Spark work as expected. I haven't noticed anything strange in the command log; stages, tasks, and shuffle data are almost the same. How can I debug the query execution to find out why some columns in the result are null, or why the result set is empty? Could this be related to the datatype casting that is required in Spark 2.0 queries?
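To make the comparison concrete, here is a sketch of the kind of side-by-side check I mean (the path /data/events, the view name events, and the columns id and ts are hypothetical placeholders, not my actual schema); running the same snippet on 1.6 and 2.0 and diffing the output would show where the two versions diverge:

```scala
// Debugging sketch -- path, view, and column names are placeholders.
// Run on both Spark versions and diff the output.
val df = spark.read.parquet("/data/events")   // Spark 2.0; on 1.6 use sqlContext.read.parquet

// 1. Schema Spark inferred from the Parquet footers -- do the two versions agree?
df.printSchema()

// 2. Run the query and inspect the plan for unexpected Cast(...) nodes
//    and differences in the pushed-down filters:
df.createOrReplaceTempView("events")          // on 1.6: df.registerTempTable("events")
val result = spark.sql("SELECT id, CAST(ts AS timestamp) AS ts FROM events")
result.explain(true)

// 3. Count nulls per column to see where values are being dropped:
import org.apache.spark.sql.functions._
result.select(result.columns.map(c =>
  sum(when(col(c).isNull, 1).otherwise(0)).alias(c)): _*).show()
```

Is diffing the analyzed/physical plans from explain(true) the right way to track this down, or is there a better hook into the query execution?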
Thank you in advance.

--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-2-0-sql-module-empty-columns-in-result-over-parquet-tables-tp18579.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.