How can I set HADOOP_HOME if I am running Spark on my local machine without 
anything else? Do I have to install some other pre-built file? I am on Windows 
7, and Spark's official site says that it is available on Windows. I have 
already added the Java path to the PATH variable.
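[A common workaround for this on Windows, sketched below; it is not from this thread, and the paths are illustrative. It assumes a winutils.exe matching your Hadoop build has been obtained separately, since Spark's pre-built packages do not ship it:]

```shell
:: Sketch of the usual Windows fix (illustrative paths, winutils.exe
:: obtained separately for the matching Hadoop version):
mkdir C:\hadoop\bin
:: copy winutils.exe into C:\hadoop\bin, then point HADOOP_HOME
:: at the *parent* directory, not at bin itself:
set HADOOP_HOME=C:\hadoop
:: or persist it for future sessions:
setx HADOOP_HOME C:\hadoop
```

Spark then resolves the binary as %HADOOP_HOME%\bin\winutils.exe, which is why the error below shows `null\bin\winutils.exe` when the variable is unset.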

Vineet

From: Sean Owen [mailto:so...@cloudera.com]
Sent: Donnerstag, 28. August 2014 13:49
To: Hingorani, Vineet
Cc: user@spark.apache.org
Subject: Re: Spark-submit not running

You need to set HADOOP_HOME. Is Spark officially supposed to work on Windows or 
not at this stage? I know the build doesn't quite work there yet.

On Thu, Aug 28, 2014 at 11:37 AM, Hingorani, Vineet 
<vineet.hingor...@sap.com<mailto:vineet.hingor...@sap.com>> wrote:
The file compiles properly, but when I try to run the jar file using 
spark-submit, it gives some errors. I am running Spark locally and have 
downloaded a pre-built version of Spark named "For Hadoop 2 (HDP2, CDH5)". I 
don't know if it is a dependency problem, but I don't want to have Hadoop on my 
system. The error says:

14/08/28 12:34:36 ERROR util.Shell: Failed to locate the winutils binary in the 
hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the 
Hadoop binaries.

Vineet
