kkdoon commented on code in PR #23620:
URL: https://github.com/apache/beam/pull/23620#discussion_r1017394461
##########
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryAvroUtils.java:
##########
@@ -448,4 +449,32 @@ private static Field convertField(TableFieldSchema bigQueryField) {
bigQueryField.getDescription(),
(Object) null /* Cast to avoid deprecated JsonNode constructor. */);
}
+
+  private static Schema handleAvroLogicalTypes(TableFieldSchema bigQueryField, Type avroType) {
+    String bqType = bigQueryField.getType();
+    switch (bqType) {
+      case "NUMERIC":
+        // Default value based on
+        // https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#decimal_types
+        int precision = Optional.ofNullable(bigQueryField.getPrecision()).orElse(38L).intValue();
+        int scale = Optional.ofNullable(bigQueryField.getScale()).orElse(9L).intValue();
+        return LogicalTypes.decimal(precision, scale).addToSchema(Schema.create(Type.BYTES));
+      case "BIGNUMERIC":
+        // Default value based on
+        // https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#decimal_types
+        int precisionBigNumeric =
+            Optional.ofNullable(bigQueryField.getPrecision()).orElse(77L).intValue();
Review Comment:
Sorry, I was not referring to these unit tests. I tested with a standalone
Beam job, in direct runner mode, against an actual BigQuery table, and verified
the default precision/scale values set in the Avro writer schema (that's why I
chose to set the default to 77 instead of 76). I also tried inserting and
reading the maximum documented value for a BIGNUMERIC field, and it works for
numbers with 77 digits of precision. I could add a BigQuery integration test to
cover this edge case, if that helps.
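
The default-precision behavior discussed above can be sketched in isolation. The snippet below is a minimal, hypothetical helper (`precisionOrDefault` is not part of the PR) that mirrors the `Optional.ofNullable(...).orElse(...)` pattern from the diff: a declared precision from the table schema wins, otherwise the documented default applies (38 for NUMERIC; 77 for BIGNUMERIC, since BigQuery documents BIGNUMERIC precision as 76.76 and the maximum value needs a 77th digit).

```java
import java.util.Optional;

public class DecimalDefaults {
  // Hypothetical helper mirroring the PR's default handling.
  // `declared` is the (possibly null) precision from TableFieldSchema;
  // `defaultPrecision` is the documented BigQuery default for the type.
  static int precisionOrDefault(Long declared, long defaultPrecision) {
    return Optional.ofNullable(declared).orElse(defaultPrecision).intValue();
  }

  public static void main(String[] args) {
    // Parameterized NUMERIC(20, 4): the declared precision wins.
    System.out.println(precisionOrDefault(20L, 38L)); // 20
    // Unparameterized BIGNUMERIC: fall back to 77, not 76.
    System.out.println(precisionOrDefault(null, 77L)); // 77
  }
}
```

The same pattern applies to the scale defaults (9 for NUMERIC), so a declared `NUMERIC(20, 4)` column and a bare `NUMERIC` column both end up with a fully specified Avro `decimal` logical type.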
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]