So I have the pyspark shell open, and after some idle time I sometimes get
this:

>>> PySpark worker failed with exception:
> Traceback (most recent call last):
>   File "/root/spark/python/pyspark/worker.py", line 77, in main
>     serializer.dump_stream(func(split_index, iterator), outfile)
>   File "/root/spark/python/pyspark/serializers.py", line 182, in dump_stream
>     self.serializer.dump_stream(self._batched(iterator), stream)
>   File "/root/spark/python/pyspark/serializers.py", line 118, in dump_stream
>     self._write_with_length(obj, stream)
>   File "/root/spark/python/pyspark/serializers.py", line 130, in _write_with_length
>     stream.write(serialized)
> IOError: [Errno 32] Broken pipe
> Traceback (most recent call last):
>   File "/root/spark/python/pyspark/daemon.py", line 117, in launch_worker
>     worker(listen_sock)
>   File "/root/spark/python/pyspark/daemon.py", line 107, in worker
>     outfile.flush()
> IOError: [Errno 32] Broken pipe


The shell is still alive and I can continue to do work.
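For example, a trivial job along these lines (just using the sc context the
shell creates) still runs and returns the right result after the error
appears:

>>> sc.parallelize(range(10)).map(lambda x: x * 2).sum()
90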

Is this anything to worry about or fix?

Nick




