It's already fixed in the master branch. Sorry that we forgot to update this before releasing 1.2.0 and caused you trouble...

Cheng

On 2/2/15 2:03 PM, ankits wrote:
Great, thank you very much. I was confused because this is in the docs:

https://spark.apache.org/docs/1.2.0/sql-programming-guide.html, and on the
branch-1.2 branch:
https://github.com/apache/spark/blob/branch-1.2/docs/sql-programming-guide.md

"Note that if you call schemaRDD.cache() rather than
sqlContext.cacheTable(...), tables will not be cached using the in-memory
columnar format, and therefore sqlContext.cacheTable(...) is strongly
recommended for this use case.".

If this is no longer accurate, I could make a PR to remove it.
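
For anyone following along, here is a minimal sketch of the two calls being
compared. This assumes the Spark 1.2.x Scala API, an existing SparkContext
named sc, and a hypothetical table "people" made up for illustration:

    import org.apache.spark.sql.SQLContext

    val sqlContext = new SQLContext(sc)
    import sqlContext._  // brings in the implicit RDD -> SchemaRDD conversion

    // Hypothetical data, just for illustration
    case class Person(name: String, age: Int)
    val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 25)))
    people.registerTempTable("people")

    // Caches the table using the in-memory columnar format
    sqlContext.cacheTable("people")

    // Per the 1.2.0 docs quoted above, this caches plain row objects instead;
    // per Cheng's reply, master has been fixed so this also uses the
    // columnar format
    val schemaRDD = sqlContext.sql("SELECT * FROM people")
    schemaRDD.cache()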




---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org