>>
>> Now some time later while the query is running we do
>>
>> val dfRefreshedBlackList = spark.read.csv(….)
>> dfRefreshedBlackList.createOrReplaceTempView("blacklist")
>>
>> Now, will dfBlackList refer to the newly created blacklist? Or
>> will it continue to hold the reference to the old dataframe? What if we had
>> done RDD operations instead of using Spark SQL to join the dataframes?
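The question above can be sketched outside Spark. Assuming a temp view behaves like a name-to-table registry (a toy model for illustration, not Spark's actual catalog implementation), a variable that resolved the view name before the replacement keeps its old reference, while a fresh lookup of the name sees the new data:

```python
# Toy model of temp-view semantics (NOT Spark's implementation): the
# name->data registry is rebound in place, but anything that resolved the
# name earlier still holds a direct reference to the old object.

catalog = {}  # hypothetical registry mapping view names to data

def create_or_replace_temp_view(name, data):
    """Rebind the view name to new data in the registry."""
    catalog[name] = data

def resolve(name):
    """Look up the data currently bound to a view name."""
    return catalog[name]

# Register the initial blacklist and resolve it once, as a query plan
# would at analysis time.
create_or_replace_temp_view("blacklist", ["alice"])
df_black_list = resolve("blacklist")  # captures a reference to the OLD data

# Later, the view is replaced with refreshed data.
create_or_replace_temp_view("blacklist", ["alice", "bob"])

print(df_black_list)         # still the old data: ['alice']
print(resolve("blacklist"))  # a fresh lookup sees: ['alice', 'bob']
```

Under this model, whether the running query sees the refresh depends on whether it re-resolves the view name per batch or keeps the reference it captured at analysis time.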
>
From: Tathagata Das <tathagata.das1...@gmail.com>
Date: Wednesday, May 3, 2017 at 6:32 PM
To: "Lalwani, Jayesh" <jayesh.lalw...@capitalone.com>
Cc: user <user@spark.apache.org>
Subject: Re: Refreshing a persisted RDD
behavior? Is there a better way to refresh cached data
without restarting the Spark application?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Refreshing-a-persisted-RDD-tp28642.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.