Here are some resources that should help you with that.

- http://stackoverflow.com/questions/19620642/failed-to-locate-the-winutils-binary-in-the-hadoop-binary-path

- https://issues.apache.org/jira/browse/SPARK-2356
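In case it saves a step: the usual workaround described in those links is to point `hadoop.home.dir` at a folder that contains `bin\winutils.exe` before the SparkContext is created. A minimal sketch below; the `C:\hadoop` path is just an assumption for illustration, use wherever you actually place winutils.exe:

```java
public class WindowsHadoopSetup {
    public static void main(String[] args) {
        // Assumed layout: winutils.exe downloaded to C:\hadoop\bin\winutils.exe.
        // Setting hadoop.home.dir lets Hadoop's Shell utilities locate it,
        // which avoids the "Failed to locate the winutils binary" error.
        // This must run before the SparkContext is created; it is the
        // in-process equivalent of exporting HADOOP_HOME in the environment.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Alternatively, set the `HADOOP_HOME` environment variable system-wide (pointing at the same folder) so every JVM picks it up without code changes.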

Thanks
Best Regards

On Tue, Jul 21, 2015 at 1:57 PM, Nitin Kalra <nitinkalra2...@gmail.com>
wrote:

> Hi Akhil,
>
> I don't have HADOOP_HOME or HADOOP_CONF_DIR set, or even winutils.exe. What's
> the configuration required for this? Where can I get winutils.exe?
>
> Thanks and Regards,
> Nitin Kalra
>
>
> On Tue, Jul 21, 2015 at 1:30 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Do you have HADOOP_HOME, HADOOP_CONF_DIR and hadoop's winutils.exe in the
>> environment?
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Jul 20, 2015 at 5:45 PM, nitinkalra2000 <nitinkalra2...@gmail.com
>> > wrote:
>>
>>> Hi All,
>>>
>>> I am working with Spark 1.4 in a Windows environment. I have to set the
>>> event log directory so that I can reopen the Spark UI after the
>>> application has finished.
>>>
>>> But I am not able to set spark.eventLog.dir; it gives an error in a
>>> Windows environment.
>>>
>>> The configuration is:
>>>
>>> <entry key="spark.eventLog.enabled" value="true" />
>>> <entry key="spark.eventLog.dir" value="file:///c:/sparklogs" />
>>>
>>> The exception I get:
>>>
>>> java.io.IOException: Cannot run program "cygpath": CreateProcess error=2,
>>> The system cannot find the file specified
>>>     at java.lang.ProcessBuilder.start(Unknown Source)
>>>     at org.apache.hadoop.util.Shell.runCommand(Shell.java:206)
>>>
>>> I have also tried installing Cygwin, but the error still doesn't go away.
>>>
>>> Can anybody offer any advice on this?
>>>
>>> I have posted the same question on Stack Overflow as well:
>>>
>>> http://stackoverflow.com/questions/31468716/apache-spark-spark-eventlog-dir-on-windows-environment
>>>
>>> Thanks
>>> Nitin
>>>
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Apache-Spark-spark-eventLog-dir-on-Windows-Environment-tp23913.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
