cacheTable() goes through CacheManager#cacheQuery(), which is keyed on the
analyzed logical plan rather than on the table name:

  /**
   * Caches the data produced by the logical representation of the given
   * [[Queryable]].
   */
  ...
    val planToCache = query.queryExecution.analyzed
    if (lookupCachedData(planToCache).nonEmpty) {
      ...

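So whether the second cacheTable() call hits the existing entry depends on
whether the two analyzed plans match, not on the name they are registered
under. A quick way to eyeball that (just a sketch; dfOld and dfNew are the
DataFrames from your mail):

    // Plans the cache lookup is keyed on
    println(dfOld.queryExecution.analyzed)
    println(dfNew.queryExecution.analyzed)

    // Schemas alone are a coarser check
    dfOld.printSchema()
    dfNew.printSchema()
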
Is the schema for dfNew different from that of dfOld?
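
If they do differ, lookupCachedData() won't match dfOld's entry, so the second
cacheTable() adds a new entry and does not remove dfOld's cached data by
itself. In that case, uncaching while "oldTableName" still resolves to dfOld
seems the safer ordering (again only a sketch, reusing your names):

    sqlContext.uncacheTable("oldTableName")   // drop dfOld's cached data first
    dfNew.registerTempTable("oldTableName")
    sqlContext.cacheTable("oldTableName")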

Cheers

On Fri, Dec 18, 2015 at 3:33 AM, Sahil Sareen <sareen...@gmail.com> wrote:

> Spark 1.5.2
>
> dfOld.registerTempTable("oldTableName")
> sqlContext.cacheTable("oldTableName")
> // ....
> // do something
> // ....
> dfNew.registerTempTable("oldTableName")
> sqlContext.cacheTable("oldTableName")
>
>
> Now when I use the "oldTableName" table I do get the latest contents
> from dfNew but do the contents of dfOld get removed from the memory?
>
> Or is the right usage to do this:
> dfOld.registerTempTable("oldTableName")
> sqlContext.cacheTable("oldTableName")
> // ....
> // do something
> // ....
> dfNew.registerTempTable("oldTableName")
> sqlContext.uncacheTable("oldTableName") <========== uncache the old
> contents first
> sqlContext.cacheTable("oldTableName")
>
> -Sahil
>