I am using Spark 1.5.2 and caching a temp table like this:
dfOld.registerTempTable("oldTableName")
sqlContext.cacheTable("oldTableName")
// ....
// do something
// ....
dfNew.registerTempTable("oldTableName")
sqlContext.cacheTable("oldTableName")
Now when I query "oldTableName" I do get the latest contents from dfNew,
but do the contents of dfOld get removed from memory?
Or is the right usage to do this:
dfOld.registerTempTable("oldTableName")
sqlContext.cacheTable("oldTableName")
// ....
// do something
// ....
dfNew.registerTempTable("oldTableName")
sqlContext.uncacheTable("oldTableName") // <== uncache the old contents first
sqlContext.cacheTable("oldTableName")
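In case it helps to reproduce, here is the second pattern as a fuller, self-contained sketch of what I am doing. The table name, DataFrames, and sample data are placeholders; the calls are the Spark 1.5 SQLContext API:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object CacheSwapExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("cache-swap").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val dfOld = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")
    dfOld.registerTempTable("oldTableName")
    sqlContext.cacheTable("oldTableName")   // cache the original contents

    // ... do something ...

    val dfNew = sc.parallelize(Seq((3, "c"))).toDF("id", "value")
    dfNew.registerTempTable("oldTableName") // rebind the name to dfNew
    sqlContext.uncacheTable("oldTableName") // proposed: uncache before re-caching
    sqlContext.cacheTable("oldTableName")   // cache the new contents

    sqlContext.sql("SELECT * FROM oldTableName").show()
    sc.stop()
  }
}
```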
-Sahil