[ 
https://issues.apache.org/jira/browse/BEAM-7829?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16893885#comment-16893885
 ] 

Ismaël Mejía commented on BEAM-7829:
------------------------------------

It looks like Spark has the same issue, and they deal with it by auto-assigning 
a generic name and namespace (thanks [~kanterov] for the pointer). That could 
be a possible first approach here, since Beam schemas do not store this 
metadata at the moment.
https://github.com/databricks/spark-avro/blob/55d4f08ad8f0cbc9d303acd30e948d0002309ebc/src/main/scala/com/databricks/spark/avro/DefaultSource.scala#L113
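For context on why a generated field like EXPR$1 is rejected: Avro names must match [A-Za-z_][A-Za-z0-9_]*, so the '$' fails validateName. A minimal stand-alone sketch of that rule (mirroring Avro's check rather than calling the Avro library; the class and method names here are illustrative, not Beam or Avro API):

```java
import java.util.regex.Pattern;

public class AvroNameCheck {
    // Mirrors the rule enforced by Avro's Schema.validateName:
    // first character a letter or '_', remaining characters letters, digits or '_'.
    private static final Pattern AVRO_NAME = Pattern.compile("[A-Za-z_][A-Za-z0-9_]*");

    static boolean isValidAvroName(String name) {
        return name != null && AVRO_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        // Spark's auto-assigned default record name passes validation.
        System.out.println(isValidAvroName("topLevelRecord")); // true
        // A SQL-generated alias like EXPR$1 contains '$' and is rejected,
        // which is what surfaces as the SchemaParseException below.
        System.out.println(isValidAvroName("EXPR$1"));         // false
    }
}
```

Auto-assigning a known-valid default name at conversion time, as Spark does, sidesteps this validation failure without requiring Beam schemas to carry Avro metadata.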

> AvroUtils.toAvroSchema should put a Schema name to pass Avro Schema validation
> ------------------------------------------------------------------------------
>
>                 Key: BEAM-7829
>                 URL: https://issues.apache.org/jira/browse/BEAM-7829
>             Project: Beam
>          Issue Type: Test
>          Components: io-java-avro, sdk-java-core
>            Reporter: Ismaël Mejía
>            Assignee: Ismaël Mejía
>            Priority: Minor
>
> While trying to use an Avro PCollection with the SQL transform I noticed you 
> could not correctly do a bijective transform: PCollection<GenericRecord> -> 
> SQL -> PCollection<Row> -> ParDo -> PCollection<GenericRecord>. Some of the 
> Avro metadata gets lost, in particular the name of the Avro Schema. This is 
> important because Avro validates that the schema has a name, and if it does 
> not, it breaks with a SchemaParseException.
> {quote}
> org.apache.avro.SchemaParseException: Illegal character in: EXPR$1
>     at org.apache.avro.Schema.validateName (Schema.java:1151)
>     at org.apache.avro.Schema.access$200 (Schema.java:81)
>     at org.apache.avro.Schema$Field.<init> (Schema.java:403)
>     at org.apache.avro.Schema$Field.<init> (Schema.java:423)
>     at org.apache.avro.Schema$Field.<init> (Schema.java:415){quote}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
