How did you load the data into the new table?
You can get the data compressed by doing an INSERT OVERWRITE into the
destination table with "hive.exec.compress.output" set to true.
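Something along these lines should do it (the table names below are just
placeholders):

  SET hive.exec.compress.output=true;
  -- optionally pick a codec, e.g. gzip
  SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
  INSERT OVERWRITE TABLE dest_rcfile_table
  SELECT * FROM source_table;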

Thanks
Yongqiang
On Mon, Jan 24, 2011 at 12:30 PM, Edward Capriolo <edlinuxg...@gmail.com> wrote:
> I am trying to explore some use cases that I believe are perfect for
> the ColumnarSerDe: tables with 100+ columns where only one or two are
> selected in a particular query.
>
> CREATE TABLE (....)
> ROW FORMAT SERDE "org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"
>   STORED AS RCFile ;
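> For example, a typical query here only touches one or two of those
> columns (the table and column names below are just placeholders):
>
>   SELECT col_a, count(1)
>   FROM wide_rcfile_table
>   WHERE col_b = 'x'
>   GROUP BY col_a;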
>
> My issue is that the data in our source table, stored as gzip-compressed
> sequence files, is much smaller than the ColumnarSerDe table, and as a
> result any performance gains are lost.
>
> Any ideas?
>
> Thank you,
> Edward
>
