[ https://issues.apache.org/jira/browse/SPARK-23860?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shixiong Zhu resolved SPARK-23860.
----------------------------------
    Resolution: Duplicate

Already fixed in SPARK-20652.

> SQLAppStatusListener should handle the case where an accumulator may be GCed 
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-23860
>                 URL: https://issues.apache.org/jira/browse/SPARK-23860
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Shixiong Zhu
>            Priority: Major
>
> "SQLAppStatusListener.onTaskEnd" is running in the event thread. When it's 
> called, the Spark job may already finished, and accumulators may be GCed. 
> "SQLAppStatusListener.onTaskEnd" should handle this case.
> Here is an example of this failure (SQLAppStatusListener was called 
> SQLListener in 2.2):
> {code}
> 18/03/30 06:49:58 ERROR LiveListenerBus: Listener SQLListener threw an exception
> java.lang.IllegalStateException: Attempted to access garbage collected accumulator 78705157
>       at org.apache.spark.util.AccumulatorContext$$anonfun$get$1.apply(AccumulatorV2.scala:268)
>       at org.apache.spark.util.AccumulatorContext$$anonfun$get$1.apply(AccumulatorV2.scala:264)
>       at scala.Option.map(Option.scala:146)
>       at org.apache.spark.util.AccumulatorContext$.get(AccumulatorV2.scala:264)
>       at org.apache.spark.util.AccumulatorV2$$anonfun$name$1.apply(AccumulatorV2.scala:90)
>       at org.apache.spark.util.AccumulatorV2$$anonfun$name$1.apply(AccumulatorV2.scala:90)
>       at scala.Option.orElse(Option.scala:289)
>       at org.apache.spark.util.AccumulatorV2.name(AccumulatorV2.scala:90)
>       at org.apache.spark.util.AccumulatorV2.toInfo(AccumulatorV2.scala:111)
>       at org.apache.spark.sql.execution.ui.SQLListener$$anonfun$onTaskEnd$1.apply(SQLListener.scala:227)
>       at org.apache.spark.sql.execution.ui.SQLListener$$anonfun$onTaskEnd$1.apply(SQLListener.scala:227)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>       at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>       at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>       at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>       at org.apache.spark.sql.execution.ui.SQLListener.onTaskEnd(SQLListener.scala:227)
>       at org.apache.spark.scheduler.SparkListenerBus$class.doPostEvent(SparkListenerBus.scala:45)
>       at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:42)
>       at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:42)
>       at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:84)
>       at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:42)
>       at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:100)
>       at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:81)
>       at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:81)
>       at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
>       at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:81)
>       at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1304)
>       at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:80)
> {code}
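>
> A minimal sketch of the kind of defensive handling needed (a hypothetical 
> helper for illustration, not the actual SPARK-20652 patch): skip any 
> accumulator whose weak reference has already been collected instead of 
> letting the IllegalStateException propagate out of the listener.
> {code}
> import org.apache.spark.scheduler.AccumulableInfo
> import org.apache.spark.util.AccumulatorV2
>
> // AccumulatorV2.name (called via toInfo) looks the accumulator up in
> // AccumulatorContext, which throws IllegalStateException once the weak
> // reference has been GCed; returning None lets the caller drop it.
> private def toInfoSafe(a: AccumulatorV2[_, _]): Option[AccumulableInfo] = {
>   try {
>     Some(a.toInfo(Some(a.value), None))
>   } catch {
>     case _: IllegalStateException => None // accumulator already GCed
>   }
> }
>
> // onTaskEnd would then flatMap instead of map, e.g.:
> //   taskEnd.taskMetrics.accumulators().flatMap(toInfoSafe)
> {code}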


