> On 11 Dec 2015, at 05:14, michael_han wrote:
>
> Hi Sarala,
> I found the reason: when Spark runs it still needs Hadoop
> support. I think it's a bug in Spark and still not fixed ;)
>
It's related to how the Hadoop filesystem APIs are used to access pretty much
every filesystem.
Hi Sarala,
I found the reason: when Spark runs it still needs Hadoop
support. I think it's a bug in Spark and still not fixed ;)
After I downloaded winutils.exe and followed the steps in the
workaround below, it works fine:
http://qnalist.com/questions/4994960/run-spark-unit-test-on-
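The workaround in question boils down to giving Spark a Hadoop home directory that contains winutils.exe under bin. A minimal sketch for the Windows command prompt, assuming you unpacked winutils.exe into c:\hadoop\bin (that path is illustrative, not from this thread):

```shell
rem Point Spark's Hadoop layer at a directory containing bin\winutils.exe.
rem c:\hadoop is an assumed location; use wherever you placed the file.
set HADOOP_HOME=c:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%
```

Without HADOOP_HOME set, Spark's Hadoop filesystem layer fails on Windows even for purely local jobs.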
Hi Sarala,
Thanks for your reply, but it doesn't work.
I tried the following two commands:
*<1>*
spark-submit --master local --name "SparkTest App" --class
com.qad.SparkTest1
target/Spark-Test-1.0.jar;c:/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar
with error: c:\spark-1.5.2-
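For what it's worth, spark-submit does not take a ";"-separated jar list after the application jar: extra jars go in the --jars option as a comma-separated list, and everything positioned after the application jar is treated as program arguments. A hedged sketch of how the command above would usually be written (paths taken from the thread; "^" is the Windows cmd line-continuation character):

```shell
rem --jars takes a comma-separated list of extra jars; the last
rem positional argument is the application jar itself.
spark-submit --master local --name "SparkTest App" ^
  --class com.qad.SparkTest1 ^
  --jars c:/spark-1.5.2-bin-hadoop2.6/lib/spark-assembly-1.5.2-hadoop2.6.0.jar ^
  target/Spark-Test-1.0.jar
```

(The spark-assembly jar is normally already on spark-submit's own classpath, so listing it explicitly may be unnecessary once HADOOP_HOME is set.)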