Hi everyone,

I'm using Spark with Hive support enabled; the data is stored in Parquet
format at a fixed location.
I just downloaded Spark 2.1.0 and it broke my Spark SQL queries: count(*)
still returns the correct count, but every column shows up as NULL.
The same queries worked fine on 1.6 and 2.0.x.
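Here is a minimal sketch of the setup, to make the symptom concrete (table name and path are placeholders, not my real ones):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("parquet-null-repro")
      .enableHiveSupport()
      .getOrCreate()

    // External table backed by Parquet files at a fixed location.
    spark.sql(
      """CREATE TABLE IF NOT EXISTS my_table (id BIGINT, name STRING)
        |STORED AS PARQUET
        |LOCATION '/data/warehouse/my_table'""".stripMargin)

    // Row count comes back correct...
    spark.sql("SELECT count(*) FROM my_table").show()

    // ...but on 2.1.0 every column value shows up as NULL.
    spark.sql("SELECT id, name FROM my_table LIMIT 10").show()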

I'm guessing it has to do with SPARK-18360: "The default table path of
tables in the default database will be under the location of the default
database instead of always depending on the warehouse location setting."
I *want* the table paths to depend on the warehouse location setting, but I
couldn't find a configuration to change the behavior back to what it was
before.
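For reference, this is roughly what I expected to control the table paths (the warehouse path below is a placeholder); on 2.1.0 it no longer seems to determine where tables in the default database end up:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("warehouse-location")
      // The warehouse location setting the tables depended on in 1.6 / 2.0.x.
      .config("spark.sql.warehouse.dir", "/data/warehouse")
      .enableHiveSupport()
      .getOrCreate()

Is there a setting that restores the old path resolution, or do I need to recreate the tables with an explicit LOCATION?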


Best regards,
Babak Alipour
University of Florida
