[ 
https://issues.apache.org/jira/browse/BEAM-13990?focusedWorklogId=733669&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-733669
 ]

ASF GitHub Bot logged work on BEAM-13990:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 28/Feb/22 00:08
            Start Date: 28/Feb/22 00:08
    Worklog Time Spent: 10m 
      Work Description: liu-du commented on pull request #16926:
URL: https://github.com/apache/beam/pull/16926#issuecomment-1053733060


   @reuvenlax Thanks for the review. 
   
   I've made changes to address the issues flagged; please have a look at the code changes. I still think the cache is not needed:
   1. `TwoLevelMessageConverterCache` already caches `MessageConverter` at two levels.
   2. The code would look quite different even if we added caching back, because it would need to cache the `TableSchema` in addition to the `Descriptor` (see the sketch below).
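   
   For illustration only (a hypothetical sketch, not the actual Beam classes or the code in the PR), caching both pieces per destination would mean pairing the schema with the descriptor derived from it, roughly:
   ```java
   import java.util.Map;
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.function.Function;
   
   // Hypothetical per-destination cache entry holding both the BigQuery TableSchema
   // and the protobuf Descriptor generated from it (types elided as Object).
   final class SchemaAndDescriptorCache<DestinationT> {
     static final class Entry {
       final Object tableSchema;
       final Object descriptor;
   
       Entry(Object tableSchema, Object descriptor) {
         this.tableSchema = tableSchema;
         this.descriptor = descriptor;
       }
     }
   
     private final Map<DestinationT, Entry> cache = new ConcurrentHashMap<>();
   
     // Build the schema/descriptor pair once per destination and reuse it afterwards.
     Entry getOrCreate(DestinationT destination, Function<DestinationT, Entry> loader) {
       return cache.computeIfAbsent(destination, loader);
     }
   }
   ```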
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

            Worklog Id:     (was: 733669)
    Remaining Estimate: 109h 10m  (was: 109h 20m)
            Time Spent: 10h 50m  (was: 10h 40m)

> BigQueryIO cannot write to DATE and TIMESTAMP columns when using Storage 
> Write API 
> -----------------------------------------------------------------------------------
>
>                 Key: BEAM-13990
>                 URL: https://issues.apache.org/jira/browse/BEAM-13990
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-gcp
>    Affects Versions: 2.36.0
>            Reporter: Du Liu
>            Assignee: Du Liu
>            Priority: P2
>   Original Estimate: 120h
>          Time Spent: 10h 50m
>  Remaining Estimate: 109h 10m
>
> When using the Storage Write API with BigQueryIO, DATE and TIMESTAMP values are 
> currently converted to String type in the protobuf message. This is incorrect: 
> according to the Storage Write API [documentation|#data_type_conversions], DATE 
> should be converted to int32 and TIMESTAMP should be converted to int64.
> Here's the error message: 
> INFO: Stream finished with error 
> com.google.api.gax.rpc.InvalidArgumentException: 
> io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The proto field mismatched 
> with BigQuery field at D6cbe536b_4dab_4292_8fda_ff2932dded49.datevalue, the 
> proto field type string, BigQuery field type DATE Entity
> I have included an integration test here: 
> [https://github.com/liu-du/beam/commit/b56823d1d213adf6ca5564ce1d244cc4ae8f0816]
>  
> The problem is that DATE and TIMESTAMP are converted to String in the protobuf 
> message here: 
> [https://github.com/apache/beam/blob/a78fec72d0d9198eef75144a7bdaf93ada5abf9b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/TableRowToStorageApiProto.java#L69]
>  
> The Storage Write API rejects the request because it expects int32/int64 
> values. A minimal sketch of the expected conversion is included at the end of 
> this description.
>  
> I've opened a PR here: https://github.com/apache/beam/pull/16926
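> 
> As a rough, illustrative sketch of the expected conversion (the helper names here are hypothetical, not the code in the PR): DATE maps to an int32 holding days since the Unix epoch, and TIMESTAMP maps to an int64 holding microseconds since the Unix epoch.
> {code:java}
> import java.time.Instant;
> import java.time.LocalDate;
> import java.time.temporal.ChronoUnit;
> 
> public class StorageApiTemporalConversionSketch {
> 
>   // DATE column: protobuf int32 with the number of days since 1970-01-01.
>   static int dateToStorageApi(LocalDate date) {
>     return (int) date.toEpochDay();
>   }
> 
>   // TIMESTAMP column: protobuf int64 with microseconds since the Unix epoch.
>   static long timestampToStorageApi(Instant timestamp) {
>     return ChronoUnit.MICROS.between(Instant.EPOCH, timestamp);
>   }
> 
>   public static void main(String[] args) {
>     System.out.println(dateToStorageApi(LocalDate.of(2022, 2, 28)));                  // 19051
>     System.out.println(timestampToStorageApi(Instant.parse("2022-02-28T00:08:00Z"))); // 1646006880000000
>   }
> }
> {code}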



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
