[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-23394:
----------------------------
    Fix Version/s:     (was: 2.3.1)
                       2.3.0

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Assignee: Attila Zsolt Piros
>            Priority: Major
>             Fix For: 2.3.0, 2.4.0
>
>         Attachments: Spark_2.2.1.png, Spark_2.4.0-SNAPSHOT.png, Storage_Tab.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> h2. Cached Partitions
> On the UI, the Storage tab shows Cached Partitions as 10:
> !Storage_Tab.png!
> h2. Full tab
> Moreover, the replicated partitions were also listed in the old 2.2.1 UI:
> !Spark_2.2.1.png!
> But now it looks like this:
> !Spark_2.4.0-SNAPSHOT.png!
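For readers reproducing the issue, here is a minimal self-contained sketch (same local-cluster setup as in the description) that makes the mismatch explicit: sc.getRDDStorageInfo counts every replica, so it reports 20 cached partitions for a 10-partition RDD persisted with replication factor 2, while the Storage tab shows 10.

{code:scala}
// Run inside: bin/spark-shell --master local-cluster[2,1,1024]
import org.apache.spark.storage.StorageLevel._

val rdd = sc.parallelize(1 to 100, 10).persist(MEMORY_AND_DISK_2)
rdd.count()  // materialize the cache on the executors

val info = sc.getRDDStorageInfo(0)
println(rdd.partitions.length)            // 10 partitions
println(rdd.getStorageLevel.replication)  // 2 replicas per block
println(info.numCachedPartitions)         // 20 = 10 * 2, unlike the UI's 10
{code}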
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Description: 
Start spark as:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}
{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}
h2. Cached Partitions
On the UI, the Storage tab shows Cached Partitions as 10:
!Storage_Tab.png!
h2. Full tab
Moreover, the replicated partitions were also listed in the old 2.2.1 UI:
!Spark_2.2.1.png!
But now it looks like this:
!Spark_2.4.0-SNAPSHOT.png!

  was:
Start spark as:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}
{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}
But on the UI, the Storage tab shows Cached Partitions as 10. See the attached screenshot !Storage_Tab.png!

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Spark_2.2.1.png, Spark_2.4.0-SNAPSHOT.png, Storage_Tab.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> h2. Cached Partitions
> On the UI, the Storage tab shows Cached Partitions as 10:
> !Storage_Tab.png!
> h2. Full tab
> Moreover, the replicated partitions were also listed in the old 2.2.1 UI:
> !Spark_2.2.1.png!
> But now it looks like this:
> !Spark_2.4.0-SNAPSHOT.png!
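As a side note on the storage level used in the repro: MEMORY_AND_DISK_2 is MEMORY_AND_DISK with a replication factor of 2, which is why each of the 10 partitions is cached twice. A quick check in the same shell:

{code:scala}
import org.apache.spark.storage.StorageLevel

// MEMORY_AND_DISK_2 differs from MEMORY_AND_DISK only in its replication factor:
println(StorageLevel.MEMORY_AND_DISK.replication)    // 1
println(StorageLevel.MEMORY_AND_DISK_2.replication)  // 2
{code}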
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Description: 
Start spark as:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}
{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}
But on the UI, the Storage tab shows Cached Partitions as 10. See the attached screenshot !Storage_Tab.png!

  was:
Start spark as:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}
{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}
But on the UI, the Storage tab shows Cached Partitions as 10. See the attached screenshot.

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Spark_2.2.1.png, Spark_2.4.0-SNAPSHOT.png, Storage_Tab.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> But on the UI, the Storage tab shows Cached Partitions as 10. See the attached
> screenshot !Storage_Tab.png!
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Attachment:     (was: Screen Shot 2018-02-12 at 11.24.22.png)

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Spark_2.2.1.png, Spark_2.4.0-SNAPSHOT.png, Storage_Tab.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> But on the UI, the Storage tab shows Cached Partitions as 10. See the attached
> screenshot !Storage_Tab.png!
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Attachment: Storage_Tab.png

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Screen Shot 2018-02-12 at 11.24.22.png, Spark_2.2.1.png,
>                      Spark_2.4.0-SNAPSHOT.png, Storage_Tab.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> But on the UI, the Storage tab shows Cached Partitions as 10. See the attached
> screenshot.
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Attachment: Spark_2.4.0-SNAPSHOT.png
                Spark_2.2.1.png

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Screen Shot 2018-02-12 at 11.24.22.png, Spark_2.2.1.png,
>                      Spark_2.4.0-SNAPSHOT.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> But on the UI, the Storage tab shows Cached Partitions as 10. See the attached
> screenshot.
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Description: 
Start spark as:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}
{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}
But on the UI, the Storage tab shows Cached Partitions as 10. See the attached screenshot.

  was:
The SparkContext.getRDDStorageInfo considers replications.

Reproduce:
Start spark as:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}
{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Screen Shot 2018-02-12 at 11.24.22.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> But on the UI, the Storage tab shows Cached Partitions as 10. See the attached
> screenshot.
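For completeness, a sketch of how to see exactly what the Storage tab itself reports (not part of the ticket; it assumes the driver UI is on its default port 4040): the tab is backed by the status REST API, so the UI's view of RDD 0 can be fetched and compared with sc.getRDDStorageInfo.

{code:scala}
import scala.io.Source

// Fetch the Storage tab's backing data for RDD 0 from the driver's REST API.
// Assumes the default UI port 4040; adjust if spark.ui.port is set.
val url = s"http://localhost:4040/api/v1/applications/${sc.applicationId}/storage/rdd/0"
println(Source.fromURL(url).mkString)  // JSON includes numCachedPartitions as the UI sees it
{code}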
[jira] [Updated] (SPARK-23394) Storage info's Cached Partitions doesn't consider the replications (but sc.getRDDStorageInfo does)
[ https://issues.apache.org/jira/browse/SPARK-23394?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Zsolt Piros updated SPARK-23394:
---------------------------------------
    Attachment: Screen Shot 2018-02-12 at 11.24.22.png

> Storage info's Cached Partitions doesn't consider the replications (but
> sc.getRDDStorageInfo does)
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23394
>                 URL: https://issues.apache.org/jira/browse/SPARK-23394
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Attila Zsolt Piros
>            Priority: Major
>         Attachments: Screen Shot 2018-02-12 at 11.24.22.png
>
>
> Start spark as:
> {code:bash}
> $ bin/spark-shell --master local-cluster[2,1,1024]
> {code}
> {code:scala}
> scala> import org.apache.spark.storage.StorageLevel._
> import org.apache.spark.storage.StorageLevel._
>
> scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
> res0: Long = 100
>
> scala> sc.getRDDStorageInfo(0).numCachedPartitions
> res1: Int = 20
> {code}
> But on the UI, the Storage tab shows Cached Partitions as 10. See the attached
> screenshot.