Charlie Gao created ARROW-17008:
-----------------------------------
Summary: Parquet Snappy Compression Fails for Integer Type Data
Key: ARROW-17008
URL: https://issues.apache.org/jira/browse/ARROW-17008
Project: Apache Arrow
Issue Type: Bug
Components: R
Affects Versions: 8.0.0
Environment: R4.2.1 Ubuntu 22.04 x86_64
R4.1.2 Ubuntu 22.04 Aarch64
Reporter: Charlie Gao
Snappy compression does not work when writing Parquet files for integer-type data.
For example, compare the resulting file sizes for:
{code:r}
write_parquet(data.frame(x = 1:1e6), "snappy.parquet", compression = "snappy")
write_parquet(data.frame(x = 1:1e6), "uncomp.parquet", compression =
"uncompressed")
{code}
whereas for doubles, compression works as expected:
{code:r}
write_parquet(data.frame(x = as.double(1:1e6)), "snappyd.parquet", compression
= "snappy")
write_parquet(data.frame(x = as.double(1:1e6)), "uncompd.parquet", compression
= "uncompressed")
{code}
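After running the two snippets above, the sizes can be compared directly from R (a minimal sketch; the expected pattern is simply what is described in this report):
{code:r}
# sizes in bytes, for the files written above
file.size(c("snappy.parquet", "uncomp.parquet"))    # integer column: roughly equal, i.e. no compression
file.size(c("snappyd.parquet", "uncompd.parquet"))  # double column: snappy file is noticeably smaller
{code}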
I have inspected the integer files using parquet-tools, and the compression ratio is
reported as 0%. Needless to say, I can achieve compression on the same data using
Spark (sparklyr) etc.
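In case it helps triage, one quick check (my suggestion only, not something I have verified beyond the behaviour above) is whether this is specific to the Snappy codec for integer columns or affects other codecs as well, e.g.:
{code:r}
# same integer column written with a different codec (assumes the arrow
# build includes gzip support); compare against the uncompressed file
write_parquet(data.frame(x = 1:1e6), "gzip.parquet", compression = "gzip")
file.size(c("gzip.parquet", "uncomp.parquet"))
{code}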
Thanks.