[jira] [Commented] (BEAM-8177) BigQueryAvroUtils unable to convert field with record

2020-06-01 Thread Beam JIRA Bot (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17122651#comment-17122651
 ] 

Beam JIRA Bot commented on BEAM-8177:
-

This issue is P2 but has been unassigned without any comment for 60 days so it 
has been labeled "stale-P2". If this issue is still affecting you, we care! 
Please comment and remove the label. Otherwise, in 14 days the issue will be 
moved to P3.

Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed 
explanation of what these priorities mean.


> BigQueryAvroUtils unable to convert field with record 
> --
>
> Key: BEAM-8177
> URL: https://issues.apache.org/jira/browse/BEAM-8177
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.15.0
>Reporter: Zaka Zaidan Azminur
>Priority: P2
>  Labels: stale-P2
>
> I'm trying to create a simple test pipeline that exports BigQuery data as Parquet 
> using BigQueryAvroUtils.java from Beam's code.
> When reading the BigQuery data as an Avro GenericRecord, the code fails with 
> this exception:
> {code:java}
> org.apache.avro.UnresolvedUnionException: Not in union ["null",{"type":"record","name":"record","namespace":"Translated Avro Schema for record","doc":"org.apache.beam.sdk.io.gcp.bigquery","fields":[{"name":"key_2","type":["null","string"]},{"name":"key_1","type":["null","double"]}]}]: {"key_2": "asdasd", "key_1": 123123.123}
> {code}
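> For context, here is a minimal sketch of the kind of pipeline described above. This is not the 
> reporter's actual code: the table name, output path, and inline Avro schema are placeholders, and 
> in practice the schema would come from the copied BigQueryAvroUtils translation the reporter mentions.
> {code:java}
> import org.apache.avro.Schema;
> import org.apache.avro.generic.GenericRecord;
> import org.apache.beam.sdk.Pipeline;
> import org.apache.beam.sdk.coders.AvroCoder;
> import org.apache.beam.sdk.io.FileIO;
> import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
> import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;
> import org.apache.beam.sdk.io.parquet.ParquetIO;
> import org.apache.beam.sdk.options.PipelineOptionsFactory;
> import org.apache.beam.sdk.values.PCollection;
>
> public class BqToParquetSketch {
>   public static void main(String[] args) {
>     // BigQuery reads also need GCP options (project, tempLocation) supplied at run time.
>     Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
>
>     // Placeholder schema; in the scenario above it would be the BigQuery table
>     // schema translated to Avro (e.g. via the copied BigQueryAvroUtils code).
>     Schema avroSchema = new Schema.Parser().parse(
>         "{\"type\":\"record\",\"name\":\"row\",\"fields\":["
>             + "{\"name\":\"key_1\",\"type\":[\"null\",\"double\"]},"
>             + "{\"name\":\"key_2\",\"type\":[\"null\",\"string\"]}]}");
>
>     // Read each BigQuery row as the GenericRecord handed to the parse function.
>     PCollection<GenericRecord> records = p.apply(
>         BigQueryIO.read(SchemaAndRecord::getRecord)
>             .from("my-project:my_dataset.my_table")   // placeholder table
>             .withCoder(AvroCoder.of(avroSchema)));
>
>     // Write the records out as Parquet files.
>     records.apply(
>         FileIO.<GenericRecord>write()
>             .via(ParquetIO.sink(avroSchema))
>             .to("gs://my-bucket/output/"));           // placeholder path
>
>     p.run().waitUntilFinish();
>   }
> }
> {code}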
> I have checked the Avro schema and it matches its BigQuery schema 
> counterpart.
> Then I exported the BigQuery table as Avro using the BigQuery console and 
> compared its schema with the one generated by BigQueryAvroUtils.java. It turns 
> out the Avro namespace produced by BigQueryAvroUtils.java differs from the one 
> in the BigQuery export.
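> Judging from the schema embedded in the exception above, the generated record's namespace holds 
> the descriptive sentence ("Translated Avro Schema for record") while its doc holds the package 
> name, so the two values look transposed relative to a package-style namespace. For reference, 
> Avro's factory takes these in the order createRecord(name, doc, namespace, isError, fields); the 
> standalone sketch below (placeholder strings, not Beam's actual values) only shows which slot each 
> argument lands in:
> {code:java}
> import java.util.Collections;
> import org.apache.avro.Schema;
>
> public class CreateRecordArgOrder {
>   public static void main(String[] args) {
>     // Signature: Schema.createRecord(name, doc, namespace, isError, fields).
>     // Swapping the 2nd and 3rd arguments at a call site yields a schema whose
>     // namespace is the human-readable text and whose doc is the package name,
>     // the same shape as the schema printed in the UnresolvedUnionException.
>     Schema s = Schema.createRecord(
>         "record",
>         "some descriptive doc text",    // 2nd argument becomes the doc
>         "com.example.translated",       // 3rd argument becomes the namespace
>         false,
>         Collections.<Schema.Field>emptyList());
>     System.out.println("namespace = " + s.getNamespace()); // com.example.translated
>     System.out.println("doc       = " + s.getDoc());       // some descriptive doc text
>   }
> }
> {code}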
> After I patched BigQueryAvroUtils.java so that the resulting schema matches the 
> schema from the BigQuery export, the exception went away.
> So I want to confirm whether there is a problem in my implementation or whether 
> BigQuery creates a slightly different Avro schema.
> I've created a small code sample, along with the patch and a data sample, here: 
> [https://github.com/zakazai/bq-to-parquet]
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (BEAM-8177) BigQueryAvroUtils unable to convert field with record

2019-10-15 Thread Kenneth Knowles (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16952303#comment-16952303
 ] 

Kenneth Knowles commented on BEAM-8177:
---

[~chamikara] any ideas about this?

> BigQueryAvroUtils unable to convert field with record 
> --
>
> Key: BEAM-8177
> URL: https://issues.apache.org/jira/browse/BEAM-8177
> Project: Beam
>  Issue Type: Bug
>  Components: io-java-gcp
>Affects Versions: 2.15.0
>Reporter: Zaka Zaidan Azminur
>Priority: Major
>
> I'm trying to create a simple test pipeline that exports BigQuery data as Parquet 
> using BigQueryAvroUtils.java from Beam's code.
> When reading the BigQuery data as an Avro GenericRecord, the code fails with 
> this exception:
> {code:java}
> org.apache.avro.UnresolvedUnionException: Not in union ["null",{"type":"record","name":"record","namespace":"Translated Avro Schema for record","doc":"org.apache.beam.sdk.io.gcp.bigquery","fields":[{"name":"key_2","type":["null","string"]},{"name":"key_1","type":["null","double"]}]}]: {"key_2": "asdasd", "key_1": 123123.123}
> {code}
> I have checked the Avro schema and it matches its BigQuery schema 
> counterpart.
> Then I exported the BigQuery table as Avro using the BigQuery console and 
> compared its schema with the one generated by BigQueryAvroUtils.java. It turns 
> out the Avro namespace produced by BigQueryAvroUtils.java differs from the one 
> in the BigQuery export.
> After I patched BigQueryAvroUtils.java so that the resulting schema matches the 
> schema from the BigQuery export, the exception went away.
> So I want to confirm whether there is a problem in my implementation or whether 
> BigQuery creates a slightly different Avro schema.
> I've created a small code sample, along with the patch and a data sample, here: 
> [https://github.com/zakazai/bq-to-parquet]
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)