[ 
https://issues.apache.org/jira/browse/BEAM-13990?focusedWorklogId=733579&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-733579
 ]

ASF GitHub Bot logged work on BEAM-13990:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 27/Feb/22 01:37
            Start Date: 27/Feb/22 01:37
    Worklog Time Spent: 10m 
      Work Description: liu-du commented on a change in pull request #16926:
URL: https://github.com/apache/beam/pull/16926#discussion_r815375516



##########
File path: 
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/TableRowJsonCoder.java
##########
@@ -68,7 +73,17 @@ public long getEncodedElementByteSize(TableRow value) throws Exception {
   // FAIL_ON_EMPTY_BEANS is disabled in order to handle null values in
   // TableRow.
   private static final ObjectMapper MAPPER =
-      new ObjectMapper().disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
+      JsonMapper.builder()
+          .disable(SerializationFeature.FAIL_ON_EMPTY_BEANS)
+          .addModule(new JavaTimeModule())
+          // serialize Date/Time to string instead of floats
+          .configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false)
+          // serialize BigDecimal to string without scientific notation instead of floats
+          .configure(JsonGenerator.Feature.WRITE_BIGDECIMAL_AS_PLAIN, true)
+          .withConfigOverride(
+              BigDecimal.class,
+              it -> it.setFormat(JsonFormat.Value.forShape(JsonFormat.Shape.STRING)))

Review comment:
       I changed this because Jackson serializes BigDecimal as a float by default, and floats are prone to losing precision. This can be avoided by serializing BigDecimal as a string.
   
   The new configuration only changes how Jackson serializes TableRows; deserialization should continue to work, since it's just deserializing plain JSON. Still, in-flight BigDecimals that were already serialized as floats may need to be converted to protobuf String, so I've added some code in `TableRowToStorageApiProto.scalarToProtoValue` to convert doubles/floats to String. I hope this is enough to address backward compatibility?
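The WRITE_BIGDECIMAL_AS_PLAIN behavior discussed above can be sketched in isolation (a minimal illustration of the Jackson generator feature, not Beam's actual mapper configuration; the class name is mine):

```java
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.json.JsonMapper;
import java.math.BigDecimal;

public class BigDecimalPlainDemo {
    public static void main(String[] args) throws Exception {
        BigDecimal value = new BigDecimal("1E+3");

        // Default mapper: BigDecimal.toString() is used, so scientific
        // notation leaks into the JSON output.
        String byDefault = new JsonMapper().writeValueAsString(value);

        // With WRITE_BIGDECIMAL_AS_PLAIN enabled, toPlainString() is used
        // and the number is expanded without an exponent.
        JsonMapper plainMapper = JsonMapper.builder()
            .configure(JsonGenerator.Feature.WRITE_BIGDECIMAL_AS_PLAIN, true)
            .build();
        String asPlain = plainMapper.writeValueAsString(value);

        System.out.println(byDefault); // 1E+3
        System.out.println(asPlain);   // 1000
    }
}
```

The PR additionally applies a per-type config override forcing BigDecimal to JsonFormat.Shape.STRING, so the value is emitted quoted and round-trips without any float intermediary.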
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

            Worklog Id:     (was: 733579)
    Remaining Estimate: 111h 50m  (was: 112h)
            Time Spent: 8h 10m  (was: 8h)

> BigQueryIO cannot write to DATE and TIMESTAMP columns when using Storage 
> Write API 
> -----------------------------------------------------------------------------------
>
>                 Key: BEAM-13990
>                 URL: https://issues.apache.org/jira/browse/BEAM-13990
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-gcp
>    Affects Versions: 2.36.0
>            Reporter: Du Liu
>            Assignee: Du Liu
>            Priority: P2
>   Original Estimate: 120h
>          Time Spent: 8h 10m
>  Remaining Estimate: 111h 50m
>
> When using the Storage Write API with BigQueryIO, DATE and TIMESTAMP values are 
> currently converted to String type in the protobuf message. This is incorrect: 
> according to the Storage Write API [documentation|#data_type_conversions], DATE 
> should be converted to int32 and TIMESTAMP should be converted to int64.
> Here's the error message: 
> INFO: Stream finished with error 
> com.google.api.gax.rpc.InvalidArgumentException: 
> io.grpc.StatusRuntimeException: INVALID_ARGUMENT: The proto field mismatched 
> with BigQuery field at D6cbe536b_4dab_4292_8fda_ff2932dded49.datevalue, the 
> proto field type string, BigQuery field type DATE Entity
> I have included an integration test here: 
> [https://github.com/liu-du/beam/commit/b56823d1d213adf6ca5564ce1d244cc4ae8f0816]
>  
> The problem is that DATE and TIMESTAMP are converted to String in the protobuf 
> message here: 
> [https://github.com/apache/beam/blob/a78fec72d0d9198eef75144a7bdaf93ada5abf9b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/TableRowToStorageApiProto.java#L69]
>  
> The Storage Write API rejects the request because it's expecting int32/int64 
> values. 
>  
> I've opened a PR here: https://github.com/apache/beam/pull/16926
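The int32/int64 encodings the issue calls for can be illustrated as follows (a sketch of the standard BigQuery Storage Write API conventions, not Beam's actual TableRowToStorageApiProto code; class and variable names are mine):

```java
import java.time.Instant;
import java.time.LocalDate;

public class BqEpochDemo {
    public static void main(String[] args) {
        // BigQuery DATE in the Storage Write API is an int32 count of
        // days since the Unix epoch (1970-01-01).
        int dateValue = (int) LocalDate.parse("2022-02-27").toEpochDay();

        // BigQuery TIMESTAMP is an int64 count of microseconds since
        // the Unix epoch.
        Instant ts = Instant.parse("2022-02-27T01:37:00Z");
        long timestampMicros =
            ts.getEpochSecond() * 1_000_000L + ts.getNano() / 1_000L;

        System.out.println(dateValue);       // 19050
        System.out.println(timestampMicros); // 1645925820000000
    }
}
```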



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
