Re: using existing hive with spark sql

2014-10-27 Thread Michael Armbrust
Passing -Phive is required to build support for accessing Hive data from Spark. You do not need a Hive installation for this to work (all queries are executed by Spark), but you can connect it to your existing metastore by placing hive-site.xml in Spark's conf/ directory. More here: http://s
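
As a rough sketch of the steps described above (the hive-site.xml path is an assumption; adjust it to wherever your Hive configuration actually lives):

```shell
# Build Spark with Hive support, from the Spark source root
# (this adds the Hive classes to the assembly; no Hive install needed).
sbt/sbt -Phive assembly/assembly

# Point Spark at the existing Hive metastore by copying its config
# into Spark's conf/ directory. /etc/hive/conf is an assumed location.
cp /etc/hive/conf/hive-site.xml conf/

# Start the shell; tables registered in the existing metastore are now
# queryable, with queries executed by Spark rather than Hive.
bin/spark-shell
```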

using existing hive with spark sql

2014-10-27 Thread Pagliari, Roberto
If I already have Hive running on Hadoop, do I need to build Spark using the sbt/sbt -Phive assembly/assembly command? If the answer is no, how do I tell Spark where the Hive home is? Thanks,