Apparently Spark does require Hadoop even if you do not intend to use Hadoop. 
Is there a workaround for the error below, which I get when creating a SparkContext
in Scala?
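
For reference, the context is being created more or less like this (the app name
and the local master are just placeholders for this post):

    import org.apache.spark.{SparkConf, SparkContext}

    object WordCountTest {
      def main(args: Array[String]): Unit = {
        // Plain local-mode context; nothing Hadoop-specific is configured here
        val conf = new SparkConf().setAppName("WordCountTest").setMaster("local[*]")
        val sc = new SparkContext(conf)   // the winutils.exe error is thrown here
        sc.stop()
      }
    }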
I will note that I didn't have this problem yesterday when creating the Spark
context in Java as part of the getting-started app. That could be because I was
using a Maven project to manage dependencies and it pulled something in for me,
or because JavaSparkContext takes a different code path.
I would say this is a pretty big bug if Spark is meant to be general purpose,
since it now appears that Spark depends on Hadoop:
"Could not locate executable null\bin\winutils.exe in the Hadoop binaries"