This appears to be missing from PySpark.

Reported in SPARK-2141 <https://issues.apache.org/jira/browse/SPARK-2141>.
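
Until that's fixed, a rough workaround is to reach through PySpark's py4j
gateway into the underlying JVM SparkContext. Note this leans on sc._jsc, a
private handle that may change between releases, so treat the sketch below as
an assumption rather than a supported API:

    # Hypothetical workaround via PySpark's private py4j handle (sc._jsc).
    jvm_sc = sc._jsc.sc()                    # unwrap org.apache.spark.SparkContext
    persisted = jvm_sc.getPersistentRDDs()   # scala.collection.Map[Int, RDD[_]]
    print(persisted.size())                  # number of currently persisted RDDs

The result is a py4j proxy for a Scala Map, not a Python dict, so anything
beyond counting (e.g. unpersisting by RDD id) means more calls through the
proxy.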


On Fri, Jun 13, 2014 at 10:43 AM, Mayur Rustagi <mayur.rust...@gmail.com>
wrote:

>
>
>     // Returns Map[Int, RDD[_]] of currently persisted RDDs, keyed by RDD id
>     val myRdds = sc.getPersistentRDDs
>
>     assert(myRdds.size == 1)
>
>
>
> It'll return a map. It's pretty old; it has been available since 0.8.0.
>
>
> Regards
> Mayur
>
>
> Mayur Rustagi
> Ph: +1 (760) 203 3257
> http://www.sigmoidanalytics.com
> @mayur_rustagi <https://twitter.com/mayur_rustagi>
>
>
>
> On Fri, Jun 13, 2014 at 9:42 AM, mrm <ma...@skimlinks.com> wrote:
>
>> Hi Daniel,
>>
>> Thank you for your help! This is the sort of thing I was looking for.
>> However, when I type "sc.getPersistentRDDs", i get the error
>> "AttributeError: 'SparkContext' object has no attribute
>> 'getPersistentRDDs'".
>>
>> I don't get any error when I type "sc.defaultParallelism" for example.
>>
>> I would appreciate it if you could help me with this; I have tried
>> different approaches and Googled it! I suspect it might be a silly
>> error, but I can't figure it out.
>>
>> Maria
>>
>>
>>
>
>
