[
https://issues.apache.org/jira/browse/BEAM-8933?focusedWorklogId=361176&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-361176
]
ASF GitHub Bot logged work on BEAM-8933:
----------------------------------------
Author: ASF GitHub Bot
Created on: 17/Dec/19 22:45
Start Date: 17/Dec/19 22:45
Worklog Time Spent: 10m
Work Description: iemejia commented on issue #10384: [WIP] [BEAM-8933]
Utilities for converting Arrow schemas and reading Arrow batches as Rows
URL: https://github.com/apache/beam/pull/10384#issuecomment-566783361
Yes, exactly as in your last proposal; we already did that for ZetaSQL in the
past too (making it an extension that is a dep of GCP IO). Thanks for
understanding my point. As an extra point, a separate module for Arrow-related
code may end up benefiting other IOs without the tradeoff of leaking too many
things into core.
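For illustration only (not the PR's actual code): a minimal sketch of the kind
of Arrow-to-Beam schema utility that could live in such an extension module and
be shared by several IOs. The package and class names are hypothetical
placeholders, and only a few primitive types are handled.
{code:java}
// Hypothetical sketch of a shared Arrow utility; not the PR's implementation.
package org.apache.beam.sdk.extensions.arrow; // hypothetical package

import org.apache.arrow.vector.types.FloatingPointPrecision;
import org.apache.arrow.vector.types.pojo.ArrowType;
import org.apache.arrow.vector.types.pojo.Field;
import org.apache.beam.sdk.schemas.Schema;

public final class ArrowSchemas {

  /** Converts an Arrow schema to a Beam schema (primitive types only). */
  public static Schema toBeamSchema(org.apache.arrow.vector.types.pojo.Schema arrowSchema) {
    Schema.Builder builder = Schema.builder();
    for (Field field : arrowSchema.getFields()) {
      builder.addField(field.getName(), toBeamFieldType(field.getType()));
    }
    return builder.build();
  }

  private static Schema.FieldType toBeamFieldType(ArrowType type) {
    if (type instanceof ArrowType.Int) {
      return ((ArrowType.Int) type).getBitWidth() == 64
          ? Schema.FieldType.INT64
          : Schema.FieldType.INT32;
    } else if (type instanceof ArrowType.FloatingPoint) {
      return ((ArrowType.FloatingPoint) type).getPrecision() == FloatingPointPrecision.DOUBLE
          ? Schema.FieldType.DOUBLE
          : Schema.FieldType.FLOAT;
    } else if (type instanceof ArrowType.Bool) {
      return Schema.FieldType.BOOLEAN;
    } else if (type instanceof ArrowType.Utf8) {
      return Schema.FieldType.STRING;
    }
    throw new UnsupportedOperationException("Unsupported Arrow type: " + type);
  }

  private ArrowSchemas() {}
}
{code}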
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 361176)
Time Spent: 3.5h (was: 3h 20m)
> BigQuery IO should support read/write in Arrow format
> -----------------------------------------------------
>
> Key: BEAM-8933
> URL: https://issues.apache.org/jira/browse/BEAM-8933
> Project: Beam
> Issue Type: Improvement
> Components: io-java-gcp
> Reporter: Kirill Kozlov
> Assignee: Kirill Kozlov
> Priority: Major
> Time Spent: 3.5h
> Remaining Estimate: 0h
>
> As of right now, BigQueryIO uses the Avro format for reading and writing.
> We should add a config option to BigQueryIO to specify which format to use:
> Arrow or Avro (with Avro as the default).
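For illustration, a hedged sketch of how such a format option might be exposed
on the read path. The ReadFormat enum and withReadFormat method are hypothetical
placeholders for whatever option eventually lands, not the actual BigQueryIO API.
{code:java}
// Hypothetical sketch of the proposed "format" knob on BigQueryIO reads,
// with Avro remaining the default. Names are placeholders.
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.PCollection;

public class ArrowReadSketch {
  // Placeholder for the proposed setting; Avro would stay the default.
  enum ReadFormat { AVRO, ARROW }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    PCollection<TableRow> rows =
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.my_table"));
                // Proposed addition (hypothetical): .withReadFormat(ReadFormat.ARROW)

    p.run().waitUntilFinish();
  }
}
{code}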
--
This message was sent by Atlassian Jira
(v8.3.4#803005)