I've already installed Cygwin and configured SPARK_HOME, but when I tried
to run ./hive, Hive expected HADOOP_HOME.
Does Hive always need Hadoop, or is there some configuration missing?
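
For reference, this is roughly what my Cygwin session looks like before the
failure (the install paths below are placeholders for illustration, not my
actual layout):

    # Spark is installed and SPARK_HOME is configured (path is a placeholder)
    export SPARK_HOME=/cygdrive/c/spark
    export PATH="$PATH:$SPARK_HOME/bin"

    # HADOOP_HOME is not set anywhere
    cd /cygdrive/c/hive/bin
    ./hive    # this is the point where Hive asks for HADOOP_HOME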

Thanks

On Mon, Oct 19, 2015 at 11:31 PM, Xuefu Zhang <xzh...@cloudera.com> wrote:

> Hi Andres,
>
> We haven't tested Hive on Spark on Windows. However, if you can get Hive
> and Spark to work on Windows, I'd assume that the configuration is no
> different from Linux. Let us know if you encounter any specific problems.
>
> Thanks,
> Xuefu
>
> On Mon, Oct 19, 2015 at 5:13 PM, Andrés Ivaldi <iaiva...@gmail.com> wrote:
>
>> Hello, I would like to install Hive with Spark on Windows. I've already
>> installed Spark, but I can't find clear documentation on how to configure
>> Hive on Windows with Spark.
>>
>> Regards
>>
>>
>>
>>
>> --
>> Ing. Ivaldi Andres
>>
>
>


-- 
Ing. Ivaldi Andres
