Hi,

I am trying to broadcast large objects (on the order of a few hundred MB),
but the call keeps failing with the following error:

Traceback (most recent call last):
  File "/LORM_experiment.py", line 510, in <module>
    broadcast_gradient_function = sc.broadcast(gradient_function)
  File "/scratch/users/213444/spark/python/pyspark/context.py", line 643, in broadcast
    return Broadcast(self, value, self._pickled_broadcast_vars)
  File "/scratch/users/213444/spark/python/pyspark/broadcast.py", line 65, in __init__
    self._path = self.dump(value, f)
  File "/scratch/users/213444/spark/python/pyspark/broadcast.py", line 82, in dump
    cPickle.dump(value, f, 2)
SystemError: error return without exception set
15/02/22 04:52:14 ERROR Utils: Uncaught exception in thread delete Spark local dirs
java.lang.IllegalStateException: Shutdown in progress

Any idea how to prevent this? I have plenty of RAM, so memory itself
shouldn't be the problem.
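For what it's worth, the failure happens inside the pickling step that
sc.broadcast performs, so it can be mimicked without Spark. A minimal
standalone sketch of that dump step (the 10 MB bytes payload is just a
stand-in for my real object, which is hundreds of MB):

```python
import io
import pickle  # cPickle in Python 2, which is what pyspark/broadcast.py uses

# Mimic the dump step from pyspark/broadcast.py: serialize the broadcast
# value to a file-like object with pickle protocol 2.
value = b"x" * (10 * 1024 * 1024)  # 10 MB stand-in for the large object

buf = io.BytesIO()
pickle.dump(value, buf, 2)

# The pickled form is the payload plus a small protocol header.
print(len(buf.getvalue()) >= 10 * 1024 * 1024)  # → True
```

If this standalone version also fails for a sufficiently large payload,
the problem would be in the pickler itself rather than in Spark.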

Thanks,
 Tassilo



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Broadcasting-Large-Objects-Fails-tp21752.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
