Thanks for the replies, guys.
Is this a permanent change as of 1.3, or will it go away at some point? Also,
does it require an entire Hadoop installation, or just WinUtils.exe?
Thanks,
Ashic.
Date: Fri, 26 Jun 2015 18:22:03 +1000
Subject: Re: Recent spark sc.textFile needs hadoop for folders
You just need to set your HADOOP_HOME, which appears to be null in the
stack trace. If you don't have winutils.exe, you can download
https://github.com/srccodes/hadoop-common-2.2.0-bin/archive/master.zip
and point HADOOP_HOME at the extracted folder.
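A minimal sketch of that setup (C:\hadoop is a hypothetical location for the extracted folder; HadoopHomeSetup is a name invented here). Hadoop's Shell class checks the hadoop.home.dir system property before falling back to the HADOOP_HOME environment variable, so this can also be done from application code before the SparkContext is created:

```scala
// Sketch only: "C:\\hadoop" stands in for wherever the downloaded
// bin folder (containing winutils.exe) was unpacked.
object HadoopHomeSetup {
  // Setting hadoop.home.dir has the same effect as HADOOP_HOME
  // for the current JVM, and avoids touching machine-wide settings.
  def configureHadoopHome(home: String): Unit =
    System.setProperty("hadoop.home.dir", home)

  def main(args: Array[String]): Unit = {
    configureHadoopHome("C:\\hadoop")
    println(System.getProperty("hadoop.home.dir")) // C:\hadoop
  }
}
```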
Thanks
Best Regards
On Thu, Jun 25, 2015 at 11:30 PM, Ashic Mahtab wrote:
Yes, Spark Core depends on Hadoop libs, and there is this unfortunate
twist on Windows. You'll still need HADOOP_HOME set appropriately
since Hadoop needs some special binaries to work on Windows.
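On the "entire Hadoop installation or just winutils.exe" question: the Windows-specific binary Hadoop looks for lives at %HADOOP_HOME%\bin\winutils.exe, so a quick existence check is a handy sanity test before starting Spark. A sketch (the directory layout is the one Hadoop expects; WinutilsCheck is a name invented here):

```scala
import java.nio.file.{Files, Paths}

// Hadoop's Windows shim resolves its helper binary at
// %HADOOP_HOME%\bin\winutils.exe; this checks that layout is present.
object WinutilsCheck {
  def winutilsPresent(hadoopHome: String): Boolean =
    Files.isRegularFile(Paths.get(hadoopHome, "bin", "winutils.exe"))
}
```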
On Fri, Jun 26, 2015 at 11:06 AM, Akhil Das ak...@sigmoidanalytics.com wrote:
It's been a problem since 1.3, I think.
On 26 Jun 2015 04:00, Ashic Mahtab as...@live.com wrote:
Hello,
Just trying out spark 1.4 (we're using 1.1 at present). On Windows, I've
noticed the following:
* On 1.4, sc.textFile("D:\\folder\\").collect() fails from both
spark-shell.cmd and when running a scala application referencing the
spark-core package from maven.
On 26 Jun 2015, at 09:29, Ashic Mahtab as...@live.com
wrote:
Thanks for the replies, guys.
Is this a permanent change as of 1.3, or will it go away at some point?
Don't blame the spark team, complain to the hadoop team for being slow to
embrace the java 1.7 APIs for
Thanks for the awesome response, Steve.
As you say, it's not ideal, but the clarification greatly helps.
Cheers, everyone :)
-Ashic.
Subject: Re: Recent spark sc.textFile needs hadoop for folders?!?
From: ste...@hortonworks.com
To: as...@live.com
CC: guha.a...@gmail.com; user@spark.apache.org