Yes. In my case, my StateSpec had a small partition size. I increased numPartitions and the problem went away. (The details of why the problem was happening in the first place are elided.)
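For reference, a minimal sketch of what raising the partition count looks like. The mapping function and the value 100 are illustrative assumptions, not the poster's actual code; only the `StateSpec.function(...).numPartitions(...)` call reflects the fix described here.

```scala
import org.apache.spark.streaming.{State, StateSpec}

// Hypothetical stateful mapping function: keeps a running sum per key.
def mappingFunc(key: String, value: Option[Int], state: State[Int]): Option[(String, Int)] = {
  val sum = value.getOrElse(0) + state.getOption.getOrElse(0)
  state.update(sum)
  Some((key, sum))
}

// The fix: set numPartitions on the StateSpec explicitly.
// 100 is an illustrative value; pick one sized to your state volume.
val spec = StateSpec
  .function(mappingFunc _)
  .numPartitions(100)

// Then use it as usual: stream.mapWithState(spec)
```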
TL;DR: StateSpec takes a numPartitions, which can be set to a sufficiently high number.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/mapwithstate-Hangs-with-Error-cleaning-broadcast-tp26500p27994.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.