Re: Which Hive version should be used for Spark 1.3

2015-04-09 Thread Denny Lee
By default, Spark 1.3 is built with bindings to Hive 0.13.1, though you can
bind it to Hive 0.12 by specifying the corresponding profile when building
Spark, as described at
https://spark.apache.org/docs/1.3.0/building-spark.html.
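
For reference, a minimal sketch of the build invocations (the -Pyarn and
-Phadoop-2.4 profiles are just example choices; the Hive profile names are
the ones described on that page):

  # Build Spark 1.3 with the default Hive 0.13.1 bindings
  mvn -Pyarn -Phadoop-2.4 -Phive -Phive-thriftserver -DskipTests clean package

  # Or bind to Hive 0.12.0 instead by adding the hive-0.12.0 profile
  mvn -Pyarn -Phadoop-2.4 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package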

If you are downloading a pre-built version of Spark 1.3, then by default it
is bound to Hive 0.13.1.

HTH!

On Thu, Apr 9, 2015 at 10:03 AM ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:

 Most likely you have an existing Hive installation with data in it. In
 that case I was not able to get Spark 1.3 to communicate with the existing
 Hive metastore; whenever I read any table created in Hive, Spark SQL would
 complain that the table was not found.

 If you get it working, please share the steps.

 On Thu, Apr 9, 2015 at 9:25 PM, Arthur Chan arthur.hk.c...@gmail.com
 wrote:

 Hi,

 I use Hive 0.12 for Spark 1.2 at the moment and plan to upgrade to Spark
 1.3.x

 Could anyone advise which Hive version should be used to match Spark
 1.3.x? Can I use Hive 1.1.0 for Spark 1.3, or can I use Hive 0.14 for
 Spark 1.3?

 Regards
 Arthur




 --
 Deepak




Re: Which Hive version should be used for Spark 1.3

2015-04-09 Thread ๏̯͡๏
Most likely you have an existing Hive installation with data in it. In that
case I was not able to get Spark 1.3 to communicate with the existing Hive
metastore; whenever I read any table created in Hive, Spark SQL would
complain that the table was not found.

If you get it working, please share the steps.
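
For reference, the standard way to point Spark SQL at an existing metastore
is to copy hive-site.xml into Spark's conf/ directory before starting the
shell; a rough sketch of that setup (the /etc/hive/conf path is just an
assumed location):

  # Copy the existing Hive config so Spark SQL talks to the same metastore
  cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/

  # Then, from spark-shell, read the existing tables through a HiveContext:
  #   val hc = new org.apache.spark.sql.hive.HiveContext(sc)
  #   hc.sql("SHOW TABLES").collect().foreach(println)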

On Thu, Apr 9, 2015 at 9:25 PM, Arthur Chan arthur.hk.c...@gmail.com
wrote:

 Hi,

 I use Hive 0.12 for Spark 1.2 at the moment and plan to upgrade to Spark
 1.3.x

 Could anyone advise which Hive version should be used to match Spark
 1.3.x? Can I use Hive 1.1.0 for Spark 1.3, or can I use Hive 0.14 for
 Spark 1.3?

 Regards
 Arthur




-- 
Deepak