[ https://issues.apache.org/jira/browse/SPARK-28967?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-28967.
-------------------------------
    Fix Version/s: 3.0.0
         Assignee: Jungtaek Lim
       Resolution: Fixed

Resolved by https://github.com/apache/spark/pull/25672

> ConcurrentModificationException is thrown from EventLoggingListener
> -------------------------------------------------------------------
>
>                 Key: SPARK-28967
>                 URL: https://issues.apache.org/jira/browse/SPARK-28967
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>            Priority: Minor
>             Fix For: 3.0.0
>
> While manually testing SPARK-28869, I found that a simple Structured Streaming
> query continuously throws ConcurrentModificationException from
> EventLoggingListener.
>
> Stack trace follows:
> {code:java}
> 19/09/04 09:48:49 ERROR AsyncEventQueue: Listener EventLoggingListener threw an exception
> java.util.ConcurrentModificationException
> 	at java.util.Hashtable$Enumerator.next(Hashtable.java:1387)
> 	at scala.collection.convert.Wrappers$JPropertiesWrapper$$anon$6.next(Wrappers.scala:424)
> 	at scala.collection.convert.Wrappers$JPropertiesWrapper$$anon$6.next(Wrappers.scala:420)
> 	at scala.collection.Iterator.foreach(Iterator.scala:941)
> 	at scala.collection.Iterator.foreach$(Iterator.scala:941)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
> 	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
> 	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
> 	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
> 	at scala.collection.TraversableLike.map(TraversableLike.scala:237)
> 	at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:108)
> 	at org.apache.spark.util.JsonProtocol$.mapToJson(JsonProtocol.scala:514)
> 	at org.apache.spark.util.JsonProtocol$.$anonfun$propertiesToJson$1(JsonProtocol.scala:520)
> 	at scala.Option.map(Option.scala:163)
> 	at org.apache.spark.util.JsonProtocol$.propertiesToJson(JsonProtocol.scala:519)
> 	at org.apache.spark.util.JsonProtocol$.jobStartToJson(JsonProtocol.scala:155)
> 	at org.apache.spark.util.JsonProtocol$.sparkEventToJson(JsonProtocol.scala:79)
> 	at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:149)
> 	at org.apache.spark.scheduler.EventLoggingListener.onJobStart(EventLoggingListener.scala:217)
> 	at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:37)
> 	at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
> 	at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
> 	at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
> 	at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:99)
> 	at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:84)
> 	at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:102)
> 	at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:102)
> 	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
> 	at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:97)
> 	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:93)
> 	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
> 	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:93)
> {code}
>
> It also occurs with the current master branch.
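The stack trace shows the root cause: EventLoggingListener serializes a job's java.util.Properties via JsonProtocol on the listener thread while another thread can still mutate it, and Properties extends Hashtable, whose enumerator fails fast with ConcurrentModificationException. A minimal sketch of the defensive-copy idea behind the fix (the PropertiesSnapshot helper name is hypothetical, for illustration only, not the actual code from the linked PR):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class PropertiesSnapshot {
    // Hypothetical helper: copy a Properties into a plain Map before a
    // listener thread iterates it. stringPropertyNames() returns a fresh
    // Set and getProperty() is a synchronized single read, so nothing here
    // iterates the live Hashtable while another thread may mutate it.
    static Map<String, String> snapshot(Properties props) {
        Map<String, String> copy = new HashMap<>();
        for (String key : props.stringPropertyNames()) {
            copy.put(key, props.getProperty(key));
        }
        return copy;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("spark.job.description", "demo query");
        Map<String, String> snap = snapshot(props);
        // Mutating the original afterwards cannot disturb the snapshot,
        // which is what the serializing listener would iterate instead.
        props.setProperty("spark.job.description", "mutated");
        System.out.println(snap.get("spark.job.description"));
    }
}
```

The key point is that the copy is taken once, on the thread that owns the Properties, so the listener thread only ever sees an object no one else writes to.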
--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org