Was your Spark assembly jar built with Java 7? There's a known issue with jar 
files created by that version: Python's zipimport cannot read the zip format 
Java 7's jar tool emits for large archives, so the jar can't be used on 
PYTHONPATH. Rebuilding the assembly with Java 6 should give better results. 
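
If you want to confirm this before rebuilding, a quick check along these 
lines (just a sketch, using the assembly jar path from the PYTHONPATH in 
your log) will tell you whether Python can open the jar at all. PySpark 
loads pyspark.daemon through zipimport when the module lives in a jar on 
PYTHONPATH, so if zipimport rejects the archive, the "module pyspark.daemon 
not found" error follows:

    import zipimport

    # Assembly jar path taken from the PYTHONPATH in the log below.
    jar = ("/home/npokala/data/spark-install/spark-master/assembly/target/"
           "scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar")

    try:
        # zipimport is what Python uses to import from zip/jar archives;
        # constructing a zipimporter fails if it can't read the archive.
        zipimport.zipimporter(jar)
        print("zipimport can open the jar; the Java version is likely not the problem")
    except zipimport.ZipImportError as e:
        print("zipimport cannot open the jar: %s" % e)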

----
Eric Friedman

> On Dec 29, 2014, at 8:01 AM, Naveen Kumar Pokala <npok...@spcapitaliq.com> 
> wrote:
> 
>  
> 14/12/29 18:10:56 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 
> nj09mhf0730.mhf.mhc, PROCESS_LOCAL, 1246 bytes)
> 14/12/29 18:10:56 INFO TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1) on 
> executor nj09mhf0730.mhf.mhc: org.apache.spark.SparkException (
> Error from python worker:
>   python: module pyspark.daemon not found
> PYTHONPATH was:
>  
> /home/npokala/data/spark-install/spark-master/python:/home/npokala/data/spark-install/spark-master/python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar:/home/npokala/data/spark-install/spark-master/sbin/../python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/sbin/../python:
> java.io.EOFException) [duplicate 1]
> 14/12/29 18:10:56 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 
> nj09mhf0731.mhf.mhc, PROCESS_LOCAL, 1246 bytes)
> 14/12/29 18:10:56 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on 
> executor nj09mhf0730.mhf.mhc: org.apache.spark.SparkException (
> Error from python worker:
>   python: module pyspark.daemon not found
> PYTHONPATH was:
>   
> /home/npokala/data/spark-install/spark-master/python:/home/npokala/data/spark-install/spark-master/python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar:/home/npokala/data/spark-install/spark-master/sbin/../python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/sbin/../python:
> java.io.EOFException) [duplicate 2]
> 14/12/29 18:10:56 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, 
> nj09mhf0731.mhf.mhc, PROCESS_LOCAL, 1246 bytes)
> 14/12/29 18:10:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory 
> on nj09mhf0731.mhf.mhc:48802 (size: 3.4 KB, free: 265.1 MB)
> 14/12/29 18:10:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory 
> on nj09mhf0731.mhf.mhc:41243 (size: 3.4 KB, free: 265.1 MB)
> 14/12/29 18:10:59 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on 
> executor nj09mhf0731.mhf.mhc: org.apache.spark.SparkException (
> Error from python worker:
>   python: module pyspark.daemon not found
> PYTHONPATH was:
>   
> /home/npokala/data/spark-install/spark-master/python:/home/npokala/data/spark-install/spark-master/python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar:/home/npokala/data/spark-install/spark-master/sbin/../python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/sbin/../python:
> java.io.EOFException) [duplicate 3]
> 14/12/29 18:10:59 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, 
> nj09mhf0730.mhf.mhc, PROCESS_LOCAL, 1246 bytes)
> 14/12/29 18:10:59 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on 
> executor nj09mhf0731.mhf.mhc: org.apache.spark.SparkException (
> Error from python worker:
>   python: module pyspark.daemon not found
> PYTHONPATH was:
>   
> /home/npokala/data/spark-install/spark-master/python:/home/npokala/data/spark-install/spark-master/python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar:/home/npokala/data/spark-install/spark-master/sbin/../python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/sbin/../python:
> java.io.EOFException) [duplicate 4]
> 14/12/29 18:10:59 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, 
> nj09mhf0730.mhf.mhc, PROCESS_LOCAL, 1246 bytes)
> 14/12/29 18:11:00 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory 
> on nj09mhf0730.mhf.mhc:60005 (size: 3.4 KB, free: 265.1 MB)
> 14/12/29 18:11:00 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory 
> on nj09mhf0730.mhf.mhc:40227 (size: 3.4 KB, free: 265.1 MB)
> 14/12/29 18:11:01 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on 
> executor nj09mhf0730.mhf.mhc: org.apache.spark.SparkException (
> Error from python worker:
>   python: module pyspark.daemon not found
> PYTHONPATH was:
>   
> /home/npokala/data/spark-install/spark-master/python:/home/npokala/data/spark-install/spark-master/python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar:/home/npokala/data/spark-install/spark-master/sbin/../python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/sbin/../python:
> java.io.EOFException) [duplicate 5]
> 14/12/29 18:11:01 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; 
> aborting job
> 14/12/29 18:11:01 INFO TaskSchedulerImpl: Cancelling stage 0
> 14/12/29 18:11:01 INFO TaskSchedulerImpl: Stage 0 was cancelled
> 14/12/29 18:11:01 INFO DAGScheduler: Job 0 failed: reduce at 
> D:\WorkSpace\python\spark\src\test\__init__.py:21, took 15.491196 s
> Traceback (most recent call last):
>   File "D:\WorkSpace\python\spark\src\test\__init__.py", line 21, in <module>
>     count = sc.parallelize(xrange(1, n + 1), partitions).map(f).reduce(add)
>   File "C:\Users\npokala\Downloads\spark-master\python\pyspark\rdd.py", line 715, in reduce
>     vals = self.mapPartitions(func).collect()
>   File "C:\Users\npokala\Downloads\spark-master\python\pyspark\rdd.py", line 676, in collect
>     bytesInJava = self._jrdd.collect().iterator()
>   File "C:\Users\npokala\Downloads\spark-master\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 538, in __call__
>   File "C:\Users\npokala\Downloads\spark-master\python\lib\py4j-0.8.2.1-src.zip\py4j\protocol.py", line 300, in get_return_value
> py4j.protocol.Py4JJavaError: An error occurred while calling o23.collect.
> : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 
> in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 
> (TID 6, nj09mhf0730.mhf.mhc): org.apache.spark.SparkException:
> Error from python worker:
>   python: module pyspark.daemon not found
> PYTHONPATH was:
>   
> /home/npokala/data/spark-install/spark-master/python:/home/npokala/data/spark-install/spark-master/python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar:/home/npokala/data/spark-install/spark-master/sbin/../python/lib/py4j-0.8.2.1-src.zip:/home/npokala/data/spark-install/spark-master/sbin/../python:
> java.io.EOFException
>        at java.io.DataInputStream.readInt(DataInputStream.java:392)
>        at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:163)
>        at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:86)
>        at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:62)
>        at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:102)
>        at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
>        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:265)
>        at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
>        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>        at org.apache.spark.scheduler.Task.run(Task.scala:56)
>        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
>        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>        at java.lang.Thread.run(Thread.java:745)
>  
> Driver stacktrace:
>        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
>        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
>        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
>        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
>        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
>        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
>        at scala.Option.foreach(Option.scala:236)
>        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
>        at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
>        at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>        at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
>        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Can anyone please suggest how to resolve this issue?
>  
> -Naveen
