Thanks Josh. This information will help me a lot.

- Sasmita


On Mon, Nov 11, 2013 at 11:24 AM, Josh Rosen <[email protected]> wrote:

> We need to add Java-friendly wrappers for these methods to
> JavaSparkContext (hopefully we'll include this in 0.8.1).
>
> In the meantime, the Java APIs allow you to access the underlying Scala
> RDD and SparkContext classes.  For example, you can call
> getRDDStorageInfo() on the underlying SparkContext:
>
> myJavaSparkContext.sc().getRDDStorageInfo()
>
> Similarly, you could call getPersistentRDDs(), but this won't work as
> nicely because it returns a Scala Map instead of a java.util.Map, and the
> map's values will be RDDs instead of JavaRDDs.  You can convert an RDD to
> a JavaRDD via its toJavaRDD method, so you could write something like:
>
> myJavaSparkContext.sc().getPersistentRDDs().values().head().toJavaRDD()
>
>
> to get the first persisted RDD.
>
>
> On Sat, Nov 9, 2013 at 11:52 PM, Sasmita Patra <[email protected]> wrote:
>
>> Hi,
>> I have a question: how do I find the cached JavaRDDs when using a
>> JavaSparkContext?
>>
>> In the API documentation, I see there is an API to find the cached RDDs
>> on SparkContext, but the same API is not available on JavaSparkContext.
>>
>> - Sasmita
>>
>
>
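Josh's point about the return type can be illustrated without Spark itself: scala.collection.JavaConverters turns a Scala Map (like the one getPersistentRDDs() returns) into a java.util.Map that Java code can use directly. This is a minimal sketch; the String values below are stand-ins for RDDs.

```scala
import scala.collection.JavaConverters._

// A Scala immutable Map, standing in for the Map[Int, RDD[_]] that
// sc.getPersistentRDDs() returns (RDD id -> RDD). Strings are used
// here as placeholders for the RDD values.
val persistentRdds: Map[Int, String] = Map(1 -> "wordsRDD", 2 -> "pairsRDD")

// Convert it to a java.util.Map so Java callers get the familiar API.
val javaMap: java.util.Map[Int, String] = persistentRdds.asJava

println(javaMap.get(1))
println(javaMap.size())
```

The same asJava conversion applies to the map's values() collection, after which each underlying RDD can be wrapped into a JavaRDD as Josh shows above.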
