[ https://issues.apache.org/jira/browse/SPARK-2527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen resolved SPARK-2527.
-------------------------------
    Resolution: Cannot Reproduce
      Assignee: Josh Rosen

I think that this was fixed in either 1.1 or 1.2, since I was unable to reproduce it when writing a Selenium test to run your example script: https://github.com/apache/spark/commit/bf589fc717c842d1998e3c3a523bc8775cb30269#diff-f346ada4cd59416756b6dd36b6c2605aR53

Therefore, I'm going to mark this as "Cannot Reproduce" since it was probably fixed. Please re-open this ticket if you observe it in the wild with a newer version of Spark.

> incorrect persistence level shown in Spark UI after repersisting
> ----------------------------------------------------------------
>
>                 Key: SPARK-2527
>                 URL: https://issues.apache.org/jira/browse/SPARK-2527
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 1.0.0
>            Reporter: Diana Carroll
>            Assignee: Josh Rosen
>       Attachments: persistbug1.png, persistbug2.png
>
> If I persist an RDD at one level, unpersist it, then repersist it at another level, the UI will continue to show the RDD at the first level, but will correctly show the individual partitions at the second level.
> {code}
> import org.apache.spark.api.java.StorageLevels
> import org.apache.spark.api.java.StorageLevels._
> val test1 = sc.parallelize(Array(1, 2, 3))
> test1.persist(StorageLevels.DISK_ONLY)
> test1.count()
> test1.unpersist()
> test1.persist(StorageLevels.MEMORY_ONLY)
> test1.count()
> {code}
> After the first call to persist and count, the Spark App web UI shows:
> RDD Storage Info for 14
>   Storage Level: Disk Serialized 1x Replicated
>   rdd_14_0: Disk Serialized 1x Replicated
> After the second call, it shows:
> RDD Storage Info for 14
>   Storage Level: Disk Serialized 1x Replicated
>   rdd_14_0: Memory Deserialized 1x Replicated
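As a side note for anyone re-testing this on a newer build: the snippet below is a rough sketch (mine, not part of the original report) of how to check the recorded persistence level from the driver rather than by eyeballing the web UI. It assumes a spark-shell session where {{sc}} is already defined, and it assumes that {{sc.getRDDStorageInfo}} reflects the same driver-side bookkeeping the Storage tab renders; {{RDD.getStorageLevel}} and {{StorageLevel.description}} are standard API.

{code}
import org.apache.spark.storage.StorageLevel

val test1 = sc.parallelize(Array(1, 2, 3))

// Persist to disk and materialize the RDD.
test1.persist(StorageLevel.DISK_ONLY)
test1.count()
println(test1.getStorageLevel.description)  // expected: Disk Serialized 1x Replicated

// Unpersist, then repersist at a different level.
test1.unpersist()
test1.persist(StorageLevel.MEMORY_ONLY)
test1.count()
println(test1.getStorageLevel.description)  // expected: Memory Deserialized 1x Replicated

// Driver-side storage info; if the bug were still present, the level reported
// here (assumption: this is what backs the Storage tab) could still show DISK_ONLY.
sc.getRDDStorageInfo
  .filter(_.id == test1.id)
  .foreach(info => println(s"rdd_${info.id}: ${info.storageLevel.description}"))
{code}

If the two reported levels disagree after the repersist, that would reproduce the stale-level behavior described above.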