Hi Wei-Chiu,
Which Hadoop version is being used?
Please check whether HADOOP-15822 is included in that build; it addressed a similar error.
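If it is hard to tell from the deployment itself, the version and build revision on the classpath can be printed directly; a minimal sketch using Hadoop's VersionInfo utility (names from Hadoop 3.x, adjust as needed):

import org.apache.hadoop.util.VersionInfo;

public class PrintHadoopVersion {
  public static void main(String[] args) {
    // Version string (e.g. "3.1.1") plus the exact source revision the jars were built from.
    System.out.println("Hadoop version:  " + VersionInfo.getVersion());
    System.out.println("Source revision: " + VersionInfo.getRevision());
    System.out.println("Built by/on:     " + VersionInfo.getUser() + ", " + VersionInfo.getDate());
  }
}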

-Ayush

> On 11-May-2020, at 10:11 PM, Wei-Chiu Chuang <weic...@apache.org> wrote:
> 
> Hadoop devs,
> 
> A colleague of mine recently hit a strange issue where the zstd compression
> codec crashes.
> 
> Caused by: java.lang.InternalError: Error (generic)
>     at org.apache.hadoop.io.compress.zstd.ZStandardCompressor.deflateBytesDirect(Native Method)
>     at org.apache.hadoop.io.compress.zstd.ZStandardCompressor.compress(ZStandardCompressor.java:216)
>     at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
>     at org.apache.hadoop.io.compress.CompressorStream.write(CompressorStream.java:76)
>     at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:57)
>     at java.io.DataOutputStream.write(DataOutputStream.java:107)
>     at org.apache.tez.runtime.library.common.sort.impl.IFile$Writer.writeKVPair(IFile.java:617)
>     at org.apache.tez.runtime.library.common.sort.impl.IFile$Writer.append(IFile.java:480)
> 
> Is anyone out there hitting a similar problem?
> 
> A temporary workaround is to reduce the codec buffer size: "set
> io.compression.codec.zstd.buffersize=8192;"
> 
> We suspect it's a bug in the zstd library, but couldn't verify. I just want to
> send this out and see if anyone has more insight.
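
For what it's worth, the buffer-size workaround quoted above can also be applied programmatically when the codec is constructed in client code. A rough sketch, assuming Hadoop 3.x class names and a native libhadoop built with zstd support on the machine:

import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.io.compress.ZStandardCodec;

public class ZstdBufferSizeWorkaround {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Same setting as the "set io.compression.codec.zstd.buffersize=8192;" workaround above.
    conf.setInt("io.compression.codec.zstd.buffersize", 8192);

    ZStandardCodec codec = new ZStandardCodec();
    codec.setConf(conf);

    // Compress a small payload through the codec; this path fails with
    // "Error (generic)" in the report above when the default buffer size is used.
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    CompressionOutputStream out = codec.createOutputStream(bytes);
    out.write("hello zstd".getBytes(StandardCharsets.UTF_8));
    out.close();
    System.out.println("Compressed to " + bytes.size() + " bytes");
  }
}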
