Thanks for the prompt response. So is HADOOP_HOME only needed for its libs? I don't need to run Hadoop itself?
On Tue, Oct 20, 2015 at 11:46 AM, Xuefu Zhang <xzh...@cloudera.com> wrote:

> Yes. You need HADOOP_HOME, which tells Hive how to connect to HDFS and get
> its dependent libraries there.
>
> On Tue, Oct 20, 2015 at 7:36 AM, Andrés Ivaldi <iaiva...@gmail.com> wrote:
>
>> I've already installed cygwin and configured SPARK_HOME, but when I tried
>> to run ./hive, Hive expected HADOOP_HOME.
>> Does Hive always need Hadoop, or is some configuration missing?
>>
>> Thanks
>>
>> On Mon, Oct 19, 2015 at 11:31 PM, Xuefu Zhang <xzh...@cloudera.com> wrote:
>>
>>> Hi Andres,
>>>
>>> We haven't tested Hive on Spark on Windows. However, if you can get Hive
>>> and Spark to work on Windows, I'd assume that the configuration is no
>>> different from on Linux. Let us know if you encounter any specific
>>> problems.
>>>
>>> Thanks,
>>> Xuefu
>>>
>>> On Mon, Oct 19, 2015 at 5:13 PM, Andrés Ivaldi <iaiva...@gmail.com> wrote:
>>>
>>>> Hello, I would like to install Hive with Spark on Windows. I've already
>>>> installed Spark, but I can't find clear documentation on how to
>>>> configure Hive on Windows with Spark.
>>>>
>>>> Regards
>>>>
>>>> --
>>>> Ing. Ivaldi Andres
>>
>> --
>> Ing. Ivaldi Andres

--
Ing. Ivaldi Andres
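
For anyone following along, a minimal sketch of the environment setup under cygwin, assuming Hadoop and Spark are already unpacked locally (the paths and version numbers below are hypothetical; substitute your own installs):

    # Hypothetical install locations -- adjust to your actual directories.
    # HADOOP_HOME tells Hive where to find the Hadoop client libraries and
    # the HDFS connection configuration it loads at startup.
    export HADOOP_HOME=/cygdrive/c/hadoop-2.6.0
    export SPARK_HOME=/cygdrive/c/spark-1.5.1
    export PATH="$PATH:$HADOOP_HOME/bin:$SPARK_HOME/bin"

    # Start Hive and switch the execution engine to Spark for this session:
    ./hive
    hive> set hive.execution.engine=spark;

The same engine setting can also be made permanent in hive-site.xml instead of being set per session.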