Attila Zsolt Piros created SPARK-23394:

             Summary: Storage info's Cached Partitions doesn't consider the 
replications (but sc.getRDDStorageInfo does)
                 Key: SPARK-23394
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.0
            Reporter: Attila Zsolt Piros

The SparkContext.getRDDStorageInfo considers replications when counting cached partitions, but the Storage tab's "Cached Partitions" does not, so for an RDD persisted with a replicated storage level the two report different values.

Start spark as:
$ bin/spark-shell --master local-cluster[2,1,1024]

scala> import org.apache.spark.storage.StorageLevel._

scala> sc.parallelize((1 to 100), 10).persist(MEMORY_AND_DISK_2).count
res0: Long = 100

scala> sc.getRDDStorageInfo(0).numCachedPartitions
res1: Int = 20
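The mismatch boils down to counting per replica versus per partition. Here 10 partitions cached with MEMORY_AND_DISK_2 (2 replicas each) give getRDDStorageInfo a count of 20, while counting distinct partitions gives 10, which is what "Cached Partitions" in the Storage tab shows. A plain Scala sketch of that arithmetic (hypothetical names, not Spark's internal code):

```scala
// Hypothetical standalone sketch, not Spark's internal bookkeeping:
// 10 partitions cached at MEMORY_AND_DISK_2, i.e. 2 replicas per partition.
object ReplicaCounting {
  def main(args: Array[String]): Unit = {
    // One (partitionId, replicaId) pair per cached block replica.
    val cachedBlocks = for (p <- 0 until 10; r <- 0 until 2) yield (p, r)

    // Counting every replica, as sc.getRDDStorageInfo does: 10 * 2 = 20.
    val perReplica = cachedBlocks.size

    // Counting distinct partitions, as the Storage tab does: 10.
    val perPartition = cachedBlocks.map(_._1).distinct.size

    println(s"per-replica: $perReplica, per-partition: $perPartition")
  }
}
```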
