Hello, *(Before everything: I use IntelliJ IDEA 14.0.1, SBT, and Scala 2.11.6.)*
This morning I was trying to resolve the "Failed to locate the winutils binary in the hadoop binary path" error. I noticed that I can solve it by configuring my build.sbt like this:

```scala
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" excludeAll(
  ExclusionRule(organization = "org.apache.hadoop")
)

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1" excludeAll(
  ExclusionRule(organization = "org.apache.hadoop")
)
```

But if I change the line

```scala
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"
```

to

```scala
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0"
```

the error comes back. What does that mean? Is Spark built against an old version of Hadoop? I would really like to understand.

*Also, a bonus question:* as you can see, I am using Spark 1.3.1 and the spark-mllib APIs. I am on the latest version, but my APIs do not match the latest official API docs (https://spark.apache.org/docs/latest/api/scala/#package). For example, to run a KMeans algorithm I have to use KMeans.train(), whereas it does not exist in the latest API.

This is the first time I ask something on the mailing list; I hope I am using it correctly. Sorry for my bad English.

Thank you and have a good day,

JC
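For reference, this is roughly how I am calling KMeans.train() with the spark-mllib 1.3.1 API (the app name, sample vectors, and parameter values below are just illustrative placeholders, not my real job):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

object KMeansExample {
  def main(args: Array[String]): Unit = {
    // Local-mode context just for this sketch
    val sc = new SparkContext(new SparkConf().setAppName("kmeans-sketch").setMaster("local[*]"))

    // Tiny made-up dataset: two obvious clusters
    val data = sc.parallelize(Seq(
      Vectors.dense(0.0, 0.0), Vectors.dense(1.0, 1.0),
      Vectors.dense(9.0, 8.0), Vectors.dense(8.0, 9.0)
    ))

    // The static helper present in the 1.3.x API:
    // KMeans.train(data, k, maxIterations)
    val model = KMeans.train(data, k = 2, maxIterations = 20)

    println(model.clusterCenters.mkString(", "))
    sc.stop()
  }
}
```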