You need to set HADOOP_HOME in your environment.

Look at the error message:

    Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

That "null" is where your HADOOP_HOME should appear: Hadoop builds the
path to the binary as %HADOOP_HOME%\bin\winutils.exe, and because the
variable is unset it ends up looking for null\bin\winutils.exe.
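A minimal sketch of the fix on Windows, assuming winutils.exe lives under
C:\hadoop\bin (C:\hadoop is only a placeholder; point it at whatever
directory actually contains bin\winutils.exe):

    :: for the current cmd session only; launch spark-shell from this
    :: same prompt so it inherits the variable
    set HADOOP_HOME=C:\hadoop

    :: or persist it for future sessions (takes effect in new prompts)
    setx HADOOP_HOME C:\hadoop

After that, relaunch spark-shell; the "null" in the error should be
replaced by your path and the lookup should succeed.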
On 29 Jan 2015 15:57, "Naveen Kumar Pokala" <npok...@spcapitaliq.com> wrote:

> Hi,
>
> I am facing the following issue when connecting from spark-shell.
> Please tell me how to avoid it.
>
> 15/01/29 17:21:27 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
>
> java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
>         at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
>         at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
>         at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
>         at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
>         at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
>         at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
>         at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
>         at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:283)
>         at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:36)
>         at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:109)
>         at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
>         at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:972)
>         at $line3.$read$$iwC$$iwC.<init>(<console>:8)
>         at $line3.$read$$iwC.<init>(<console>:14)
>         at $line3.$read.<init>(<console>:16)
>         at $line3.$read$.<init>(<console>:20)
>         at $line3.$read$.<clinit>(<console>)
>         at $line3.$eval$.<init>(<console>:7)
>         at $line3.$eval$.<clinit>(<console>)
>         at $line3.$eval.$print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:483)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:610)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:814)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:859)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:771)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:264)
>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:931)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
>         at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:948)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:997)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:483)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> 15/01/29 17:21:28 INFO Executor: Using REPL class URI: http://172.22.5.79:60436
> 15/01/29 17:21:28 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkdri...@ii01-hdhlg32.ciqhyd.com:60464/user/HeartbeatReceiver
> 15/01/29 17:21:28 INFO SparkILoop: Created spark context..
>
> Spark context available as sc.
>
> -Naveen
>
