Hi all,
I'm running SQL queries (sqlContext.sql()) on Parquet tables and hitting a problem with table caching (sqlContext.cacheTable()), using the spark-shell of Spark 1.5.1.

After I run sqlContext.cacheTable(table), the first sqlContext.sql(query) takes longer (expected, since caching is lazy and the table is materialized on first use), but it completes and returns results. The strange part is that when I run the same query a second time, it fails with "java.lang.StackOverflowError".
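For reference, here is a minimal sketch of the sequence I'm describing, as run in spark-shell (the Parquet path, table name, and query below are placeholders, not my actual ones):

    // spark-shell, Spark 1.5.1; sqlContext is provided by the shell
    // Placeholder path and table name for illustration only
    val df = sqlContext.read.parquet("/path/to/events.parquet")
    df.registerTempTable("events")

    // Mark the table for caching (lazy; nothing is materialized yet)
    sqlContext.cacheTable("events")

    // First run: slower, because the cached table is built here, but it returns results
    sqlContext.sql("SELECT count(*) FROM events").show()

    // Second run of the same query: this is where the StackOverflowError appears
    sqlContext.sql("SELECT count(*) FROM events").show()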

I searched online but couldn't find reports of this error occurring with table caching and querying.
Any hint is appreciated.
