Hi,

I'm running Spark 1.4.0 on Windows without a Hadoop installation, using the
prebuilt binary spark-1.4.0-bin-hadoop2.6.

I start spark-shell as:

spark-shell --master local[2] --packages com.databricks:spark-csv_2.11:1.1.0 --executor-memory 2G --conf spark.local.dir="C:/Users/Sourav"
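To rule out quoting problems with the --conf value, the setting can be
checked from inside the shell (a quick sanity check via the standard
SparkConf API):

  // should return C:/Users/Sourav if the --conf flag was applied
  sc.getConf.get("spark.local.dir")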

Then I run:

val df = sqlContext.read.format("com.databricks.spark.csv").load("file:///C:/Users/Sourav/Work/SparkDataScience/test.csv")
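To rule out a bad path, the file itself can be checked with plain Scala IO,
outside Spark and Hadoop entirely (just a diagnostic sketch):

  // prints the first line of test.csv if the path is readable by the JVM
  scala.io.Source.fromFile("C:/Users/Sourav/Work/SparkDataScience/test.csv").getLines().take(1).foreach(println)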

It throws a NullPointerException. From the trace, the executor fails while
fetching the spark-csv jar pulled in by --packages, before the CSV is even
read:

15/06/30 12:03:44 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/06/30 12:03:44 INFO Executor: Fetching http://9.49.140.239:64868/jars/com.databricks_spark-csv_2.11-1.1.0.jar with timestamp 1435690997767
15/06/30 12:03:44 INFO Utils: Fetching http://9.49.140.239:64868/jars/com.databricks_spark-csv_2.11-1.1.0.jar to C:\Users\Sourav\spark-18eb9880-4a19-46be-8c23-c7f7e000c454\userFiles-d4df579c-4672-46ee-836c-d4dd9ea9be23\fetchFileTemp4072866775534302313.tmp
15/06/30 12:03:44 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(Unknown Source)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:873)
        at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:853)
        at org.apache.spark.util.Utils$.fetchFile(Utils.scala:465)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:398)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:390)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:390)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)
15/06/30 12:03:44 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(Unknown Source)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)

Any idea what is going wrong?
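
One guess from the trace: org.apache.hadoop.fs.FileUtil.chmod shells out
through org.apache.hadoop.util.Shell, which on Windows needs winutils.exe
under %HADOOP_HOME%\bin. Could this be that known issue? A workaround I have
seen suggested (only a sketch; C:\hadoop is a placeholder and must actually
contain bin\winutils.exe) is to point hadoop.home.dir at such a directory
before anything touches the Hadoop classes:

  // placeholder path; bin\winutils.exe must exist under it
  System.setProperty("hadoop.home.dir", "C:\\hadoop")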

Regards,
Sourav
