Hi David,

Can you use persist instead, perhaps with a different StorageLevel? It
works with the Spark 2.2.0-SNAPSHOT build I use; I don't remember how
it behaved back in 1.6.2.

You could also check the Executors tab and see how many blocks you
have in their BlockManagers.
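For example, a minimal sketch you could paste into spark-shell (this assumes a live SparkContext `sc`; the RDD name and StorageLevel are just illustrative):

```scala
// Requires a running spark-shell / SparkContext `sc`
import org.apache.spark.storage.StorageLevel

val myrdd = sc.parallelize(1 to 100)
myrdd.setName("my_rdd")

// persist() with an explicit StorageLevel instead of cache()
// (cache() is simply persist(StorageLevel.MEMORY_ONLY))
myrdd.persist(StorageLevel.MEMORY_AND_DISK)

// An action materializes the cached blocks; afterwards they should
// show up in the Storage tab and in the executors' BlockManagers
myrdd.count()
```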

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Mon, Dec 26, 2016 at 7:08 PM, David Hodeffi
<david.hode...@niceactimize.com> wrote:
> I have tried the following code but didn't see anything on the storage tab.
>
>
>
>
>
> val myrdd = sc.parallelize(1 to 100)
>
> myrdd.setName("my_rdd")
>
> myrdd.cache()
>
> myrdd.collect()
>
>
>
> The Storage tab is empty, though I can see the stage for collect().
>
> I am using Spark 1.6.2, HDP 2.5, Spark on YARN.
>
>
>
>
>
> Thanks David
>
>
>
>
