[
https://issues.apache.org/jira/browse/BEAM-13391?focusedWorklogId=696034&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-696034
]
ASF GitHub Bot logged work on BEAM-13391:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 14/Dec/21 18:31
Start Date: 14/Dec/21 18:31
Worklog Time Spent: 10m
Work Description: satybald commented on pull request #16156:
URL: https://github.com/apache/beam/pull/16156#issuecomment-993864279
> Are you able to root cause why zero-row writes using Avro format fail?
Not really; the issue happens on the BQ side. When the job gets submitted to BQ,
BQ returns this error. Since the failure happens in BQ's proprietary code path and
not on the Beam side, it's pretty hard to deduce the proper root cause here.
> Does it only fail for schema update only or for `WriteToBigQuery` with no
rows in general?
I'm not an expert with `WTBQ`, but the issue happens when the load format is AVRO
with an empty byte stream; an empty byte stream works fine with the JSON format
(see the sketch below). I've also experimented with the Beam BQ submission API in
the past: when the rows are empty, the Beam API throws an exception saying that
the argument should be non-empty.
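For illustration only, a minimal sketch of the zero-row scenario described above
(not taken from the ticket); the project, table, and schema names are hypothetical
placeholders, and both writes use file loads so that the temp file format is the
only thing that differs:
{code:python}
# Minimal sketch of the zero-row scenario (table/schema names are made up).
# Assumes the pipeline runs with a GCS --temp_location for the file loads.
import apache_beam as beam
from apache_beam.io.gcp.bigquery_tools import FileFormat

with beam.Pipeline() as p:
    # A PCollection that turns out to contain zero rows at runtime.
    empty_rows = (p
                  | beam.Create([{'id': 1, 'name': 'a'}])
                  | beam.Filter(lambda row: False))

    # JSON temp files: an empty newline-delimited JSON file is a valid
    # zero-record load input, so this path is reported to work.
    _ = empty_rows | 'WriteJson' >> beam.io.WriteToBigQuery(
        table='my-project:my_dataset.my_table',  # hypothetical
        schema='id:INTEGER,name:STRING',         # hypothetical
        method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
        temp_file_format=FileFormat.JSON)

    # AVRO temp files: an empty byte stream has no Avro container header,
    # which matches the "EOF reached" error BigQuery returns.
    _ = empty_rows | 'WriteAvro' >> beam.io.WriteToBigQuery(
        table='my-project:my_dataset.my_table',  # hypothetical
        schema='id:INTEGER,name:STRING',         # hypothetical
        method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
        temp_file_format=FileFormat.AVRO)
{code}
This contrast is consistent with the file formats themselves: an Avro Object
Container File must begin with a header (magic bytes plus the embedded schema),
so a zero-byte stream cannot be parsed, whereas an empty newline-delimited JSON
file is simply a load with zero records.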
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 696034)
Remaining Estimate: 46h (was: 46h 10m)
Time Spent: 2h (was: 1h 50m)
> The Apache Avro library failed to parse the header with the following error:
> EOF reached
> ----------------------------------------------------------------------------------------
>
> Key: BEAM-13391
> URL: https://issues.apache.org/jira/browse/BEAM-13391
> Project: Beam
> Issue Type: Bug
> Components: io-py-gcp
> Affects Versions: 2.34.0
> Reporter: Sayat Satybaldiyev
> Priority: P2
> Original Estimate: 48h
> Time Spent: 2h
> Remaining Estimate: 46h
>
> When WriteToBigQuery is done with AVRO file loads, and the load triggers
> a schema [modification
> job|https://github.com/apache/beam/blob/ad56edbf1fa2e6c19fea23555ecfad015f67174d/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py#L323-L340],
> then the following error is triggered:
> {code:java}
> E RuntimeError: BigQuery job
> beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_SCHEMA_MOD_STEP_171_736013176f6ea340cfd22b9f5b575c81_643a0ee608c04300bfbe67c8db85b2b0
> failed. Error Result: <ErrorProto
> E message: 'Error while reading data, error message: The Apache Avro
> library failed to parse the header with the following error: EOF reached'
> E reason: 'invalid'> [while running
> 'WriteToBigQuery/BigQueryBatchFileLoads/WaitForSchemaModJobs']
> {code}
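For context on the schema-modification path described in the report above, a
hedged sketch of a WriteToBigQuery configuration that forwards schemaUpdateOptions
to the load job and can therefore go through the WaitForSchemaModJobs step; all
table, schema, and field names are hypothetical, and whether the schema-mod job is
actually issued may also depend on the load going through temporary tables:
{code:python}
# Hedged sketch (made-up names) of a configuration that forwards
# schemaUpdateOptions to the BigQuery load job. Per the linked
# bigquery_file_loads.py code, such options appear to be applied through a
# separate schema-modification load job, the step named in the error above.
import apache_beam as beam
from apache_beam.io.gcp.bigquery_tools import FileFormat

with beam.Pipeline() as p:
    _ = (p
         | beam.Create([{'id': 1, 'name': 'a', 'new_field': 'x'}])  # hypothetical rows
         | beam.io.WriteToBigQuery(
             table='my-project:my_dataset.my_table',            # hypothetical
             schema='id:INTEGER,name:STRING,new_field:STRING',  # hypothetical
             method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
             temp_file_format=FileFormat.AVRO,
             create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             # Forwarded into the load job configuration; adding new_field to
             # an existing table is the kind of change these options allow.
             additional_bq_parameters={
                 'schemaUpdateOptions': ['ALLOW_FIELD_ADDITION']}))
{code}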
--
This message was sent by Atlassian Jira
(v8.20.1#820001)