Attila Zsolt Piros created SPARK-23394:
------------------------------------------

             Summary: Storage info's Cached Partitions doesn't consider 
replication (but sc.getRDDStorageInfo does)
                 Key: SPARK-23394
                 URL: https://issues.apache.org/jira/browse/SPARK-23394
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.0
            Reporter: Attila Zsolt Piros


The SparkContext.getRDDStorageInfo considers replication, while the Storage tab's Cached Partitions does not: with a replicated storage level, getRDDStorageInfo counts every replica (20 cached partitions for a 10-partition RDD persisted with replication 2), so the two values disagree.

To reproduce, start Spark with:
{code:bash}
$ bin/spark-shell --master local-cluster[2,1,1024]
{code}


{code:scala}
scala> import org.apache.spark.storage.StorageLevel._
import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100                                                                

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
{code}
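
For comparison, the same RDD can be looked up through the status REST API that backs the Storage tab, which per this report does not count the replicas. A minimal sketch, assuming the driver UI is enabled (sc.uiWebUrl is defined) and the RDD id is 0:

{code:scala}
// Sketch: query the status REST endpoint for the cached RDD.
// Assumes the UI is reachable at sc.uiWebUrl and the RDD id is 0.
scala> val url = s"${sc.uiWebUrl.get}/api/v1/applications/${sc.applicationId}/storage/rdd/0"
scala> val json = scala.io.Source.fromURL(url).mkString
// The "numCachedPartitions" field in this JSON is expected (per this report) to be 10,
// i.e. replicas are not counted, while sc.getRDDStorageInfo(0).numCachedPartitions
// above returned 20.
{code}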


