[ https://issues.apache.org/jira/browse/SPARK-34479?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17287463#comment-17287463 ]

Yuming Wang edited comment on SPARK-34479 at 2/20/21, 3:02 AM:
---------------------------------------------------------------

But zstd-jni 1.4.5-12 is not binary compatible with 1.4.8-4:
https://github.com/apache/avro/blob/release-1.10.1/lang/java/pom.xml#L64
https://github.com/apache/spark/blob/331c6fd4efcb337d903b7179b05997dca2dae2a8/pom.xml#L703
{noformat}
Caused by: java.lang.NoSuchMethodError: 
com.github.luben.zstd.ZstdOutputStream.setCloseFrameOnFlush(Z)Lcom/github/luben/zstd/ZstdOutputStream;
        at org.apache.avro.file.ZstandardLoader.output(ZstandardLoader.java:40)
        at org.apache.avro.file.ZstandardCodec.compress(ZstandardCodec.java:67)
        at 
org.apache.avro.file.DataFileStream$DataBlock.compressUsing(DataFileStream.java:386)
        at 
org.apache.avro.file.DataFileWriter.writeBlock(DataFileWriter.java:407)
        at org.apache.avro.file.DataFileWriter.sync(DataFileWriter.java:428)
        at org.apache.avro.file.DataFileWriter.flush(DataFileWriter.java:437)
        at org.apache.avro.file.DataFileWriter.close(DataFileWriter.java:460)
        at 
org.apache.spark.sql.avro.SparkAvroKeyRecordWriter.close(SparkAvroKeyOutputFormat.java:88)
        at 
org.apache.spark.sql.avro.AvroOutputWriter.close(AvroOutputWriter.scala:86)
        at 
org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:58)
        at 
org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:75)
        at 
org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:281)
{noformat}
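Until a new Avro release, one possible workaround is to force a single zstd-jni version in the consuming build, so Avro and Spark link against the same binary. This is only a sketch: the coordinates are the real com.github.luben:zstd-jni artifact, but the pinned version here is illustrative, not a tested recommendation.
{noformat}
<!-- Sketch: pin one zstd-jni version for the whole dependency graph.
     The version below is illustrative, not a tested recommendation. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.github.luben</groupId>
      <artifactId>zstd-jni</artifactId>
      <version>1.4.8-4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{noformat}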

[~iemejia] Maybe we need to release Avro 1.10.2 or 1.11.0.




> Add zstandard codec to spark.sql.avro.compression.codec
> -------------------------------------------------------
>
>                 Key: SPARK-34479
>                 URL: https://issues.apache.org/jira/browse/SPARK-34479
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> Avro added a zstandard codec in AVRO-2195.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
