[ https://issues.apache.org/jira/browse/SPARK-24924?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16573221#comment-16573221 ]
Thomas Graves commented on SPARK-24924:
---------------------------------------

| There was a discussion about why we shouldn't support it: [https://github.com/apache/spark/pull/21841]

There is no discussion on that PR? I assume you are referring to the comment it points to? It looks like we aren't supporting it because Python and R aren't going to be supported, correct?

That may be a fine reason for us not to support it internally, and I'm not against that. My point is that it is not a very good compatibility or upgrade story for users who want to switch from databricks-avro to the internal Avro source. We are adding this mapping so users can easily upgrade, and claiming it's functionally the same, but it isn't really that easy: they potentially have to change their code to stop using spark.read/write.avro.

If we don't support spark.read/write.avro, I know that at least for my users I will create something so it works for the 2.4 feature release, because I view this as an API incompatibility, and they don't expect that in a feature release. I realize this is a third-party library, though, so we may be able to get away with it, but that doesn't mean it's nice for our users.

> Add mapping for built-in Avro data source
> -----------------------------------------
>
>                 Key: SPARK-24924
>                 URL: https://issues.apache.org/jira/browse/SPARK-24924
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>             Fix For: 2.4.0
>
>
> This issue aims at the following:
> # Like the `com.databricks.spark.csv` mapping, we had better map `com.databricks.spark.avro` to the built-in Avro data source.
> # Remove the incorrect error message, `Please find an Avro package at ...`.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
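For readers following the thread: the mapping being discussed is essentially a name-alias lookup from the legacy third-party package name to the built-in source name. A minimal sketch of the idea in Python (illustrative only; the dictionary and function names here are hypothetical, not Spark's actual implementation, which lives in Scala):

```python
# Sketch of a data-source alias map like the one SPARK-24924 proposes:
# legacy third-party package names resolve to built-in source names,
# so existing spark.read.format("com.databricks.spark.avro") calls keep working.
BACKWARD_COMPATIBILITY_MAP = {
    "com.databricks.spark.csv": "csv",    # mapping that already exists
    "com.databricks.spark.avro": "avro",  # mapping this issue adds
}

def resolve_format(name: str) -> str:
    """Return the built-in source name for a legacy alias, else the name unchanged."""
    return BACKWARD_COMPATIBILITY_MAP.get(name, name)
```

Note that this kind of mapping only covers `format("com.databricks.spark.avro")`; it does not cover the `spark.read.avro(...)` / `spark.write.avro(...)` implicit methods the databricks library added, which is exactly the compatibility gap the comment above raises.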