I haven’t had time to really look into this problem, but I want to mention it. 
I downloaded 
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-preview-bin/spark-2.0.0-preview-bin-hadoop2.7.tgz
and tried to run it against our Secure Hadoop cluster and access a Hive table.

1. "val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)" doesn't
work; it fails with "HiveContext is not a member of org.apache.spark.sql.hive"
(see the first sketch below). I checked the documentation, and it looks like
it should still work in spark-2.0.0-preview-bin-hadoop2.7.tgz.

2. I also tried the new SparkSession, spark.table("db.table"); it fails with
an HDFS "permission denied" error because it can't write to
"/user/hive/warehouse".
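For reference, here's what I ran for (1), plus what I think the 2.0
replacement is -- a sketch based on my reading of the 2.0 API docs, not yet
verified against our secure cluster:

    // 1.x style, which no longer compiles against the 2.0.0 preview:
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

    // 2.0 style: a SparkSession with Hive support. In spark-shell a session
    // named "spark" is already created for you, so building one explicitly
    // should only be needed in standalone code.
    import org.apache.spark.sql.SparkSession
    val spark = SparkSession.builder()
      .enableHiveSupport()
      .getOrCreate()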

Is there a new config option that I missed?
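For (2), the only candidate I've turned up so far is "spark.sql.warehouse.dir"
(that name is my guess from skimming the 2.0 source, and the path below is
just an example), roughly:

    import org.apache.spark.sql.SparkSession

    // Sketch: point the warehouse at a directory this user can write to,
    // instead of the default /user/hive/warehouse. Config name and path are
    // my assumptions, not confirmed.
    val spark = SparkSession.builder()
      .config("spark.sql.warehouse.dir", "hdfs:///user/doug/spark-warehouse")
      .enableHiveSupport()
      .getOrCreate()

    spark.table("db.table").show()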

I tried a SNAPSHOT build, downloaded from Patrick's apache.org dir on Apr
26th, and that worked the way I expected.
I'm going to go through the commits and see which one changed this behavior,
but my builds are currently failing (NoSuchMethodError on
ConcurrentHashMap.keySet()), so I have to fix that problem first.

Thanks for any hints. 

Doug


