A few questions about caching a table in Spark SQL.

1) Is there any difference between caching the DataFrame and caching the table?

df.cache() vs sqlContext.cacheTable("tableName")
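
For concreteness, this is roughly what I'm comparing (run in spark-shell, where sqlContext is already defined; the table name "events" is just a placeholder):

val df = sqlContext.table("events")   // DataFrame backed by the registered table
df.cache()                            // caches this DataFrame

// vs. caching by table name
sqlContext.cacheTable("events")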

2) Do you need to "warm up" the cache before seeing the performance
benefits? Is the cache LRU? Do you need to run some queries on the table
before it is cached in memory?
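
In case it matters, here is what I'm doing to try to warm it up (assuming the cache is lazy and needs an action to actually fill it; table name is again a placeholder):

sqlContext.cacheTable("events")                            // marks the table for caching (lazy?)
sqlContext.sql("SELECT COUNT(*) FROM events").collect()    // first action -- does this materialize the cache?
sqlContext.sql("SELECT COUNT(*) FROM events").collect()    // second run -- should this one be fast?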

3) Is querying a cached table much faster than querying one persisted with .saveAsTable? I am only seeing a 10%-20% performance increase.
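
Here is roughly how I measured it (the timing helper and table names are mine, so nothing rigorous):

def time[A](label: String)(body: => A): A = {
  val start = System.nanoTime()
  val result = body
  println(s"$label took ${(System.nanoTime() - start) / 1e9} s")
  result
}

// table persisted earlier with df.saveAsTable("events_saved")
time("saved table") { sqlContext.table("events_saved").count() }

sqlContext.cacheTable("events")
time("cached, 1st") { sqlContext.table("events").count() }   // fills the cache
time("cached, 2nd") { sqlContext.table("events").count() }   // should hit memory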
