The relevant code lives under /scala/org/apache/spark/sql/columnar/compression,
although the compression ratio is not as good as Parquet's.
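For illustration, run-length encoding is one of the lightweight schemes used for in-memory columnar compression; this is a standalone Scala sketch of the idea only, not Spark's actual classes from that package:

```scala
// Illustrative run-length encoding, in the spirit of the schemes under
// org.apache.spark.sql.columnar.compression (names here are our own).
object RunLengthEncoding {
  // Collapse consecutive equal values into (value, runLength) pairs.
  def encode[A](values: Seq[A]): Seq[(A, Int)] =
    values.foldLeft(List.empty[(A, Int)]) {
      case ((v, n) :: rest, x) if v == x => (v, n + 1) :: rest // extend current run
      case (acc, x)                      => (x, 1) :: acc      // start a new run
    }.reverse

  // Expand (value, runLength) pairs back to the original sequence.
  def decode[A](runs: Seq[(A, Int)]): Seq[A] =
    runs.flatMap { case (v, n) => Seq.fill(n)(v) }
}
```

Columns with long runs of repeated values (common in sorted or low-cardinality data) shrink a lot under this scheme, which is why it pays off for cached tables even when the ratio trails a disk format like Parquet.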
Thanks
-Nitin
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/compress-in-memory-column-storage-used-in-sparksql-cache-table-tp13932p13937.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
Hi, I have an idea; can someone give me some advice?
I want to compress the data in the in-memory column storage used by cache
table in Spark SQL, so that cached tables use less memory.
I will guard this feature behind a configuration option, so anyone who wants
to use it can enable it by setting that conf.
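A minimal sketch of such a conf gate (SqlConf and the names below are stand-ins for illustration, not Spark's API; for what it's worth, Spark SQL does expose a boolean setting called spark.sql.inMemoryColumnarStorage.compressed for exactly this purpose):

```scala
// Hypothetical conf-gated compression switch; SqlConf is a toy stand-in.
case class SqlConf(settings: Map[String, String] = Map.empty) {
  def getBoolean(key: String, default: Boolean): Boolean =
    settings.get(key).fold(default)(_.toBoolean)
}

object CachedColumnBuilder {
  // Key name modeled on existing spark.sql.* settings.
  val CompressCachedColumns = "spark.sql.inMemoryColumnarStorage.compressed"

  // The builder consults the conf to decide whether to compress cached columns.
  def shouldCompress(conf: SqlConf): Boolean =
    conf.getBoolean(CompressCachedColumns, default = false)
}
```

With this shape, users who want smaller cached tables opt in by setting the key to "true", and everyone else keeps the uncompressed behavior by default.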