ahmedabu98 commented on code in PR #24783:
URL: https://github.com/apache/beam/pull/24783#discussion_r1093490259
##########
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryUtils.java:
##########
@@ -691,6 +692,12 @@ public static Row toBeamRow(Schema rowSchema, TableSchema bqSchema, TableRow jso
if (JSON_VALUE_PARSERS.containsKey(fieldType.getTypeName())) {
  return JSON_VALUE_PARSERS.get(fieldType.getTypeName()).apply(jsonBQString);
} else if (fieldType.isLogicalType(SqlTypes.DATETIME.getIdentifier())) {
+ // Handle if datetime value is in micros
Review Comment:
> Does BQ support both these formats?

These changes support converting DateTime values expressed in micros to the Beam value (LocalDateTime --> [DateTime logical type](https://github.com/apache/beam/blob/7a6598f2c96397dd47c69cc21600d778e395a0a5/sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/logicaltypes/DateTime.java#L41)). This was necessary because the Storage Write API returns failed TableRows with DateTime values in micros, so we were running into issues converting them to Beam Rows for output to the DLQ. In either case (String or Long), the value is converted to a LocalDateTime object.
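For illustration, here is a minimal sketch of that two-path conversion. It assumes the numeric value is microseconds since the Unix epoch and that string values are ISO-8601 formatted; the class and method names are hypothetical, and the actual change may use a different micros encoding or formatter.

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class BqDateTimeParseSketch {

  /**
   * Parses a BQ DATETIME value that may arrive either as micros (Long)
   * or as a formatted String. Hypothetical helper, not the PR's code.
   */
  static LocalDateTime parseDateTime(String jsonBQString) {
    try {
      // Numeric case: treat the value as microseconds since the epoch (an assumption).
      long micros = Long.parseLong(jsonBQString);
      long seconds = Math.floorDiv(micros, 1_000_000L);
      int nanos = (int) Math.floorMod(micros, 1_000_000L) * 1_000;
      return LocalDateTime.ofEpochSecond(seconds, nanos, ZoneOffset.UTC);
    } catch (NumberFormatException e) {
      // String case: fall back to ISO-8601 parsing, e.g. "2023-01-30T12:34:56.789".
      return LocalDateTime.parse(jsonBQString);
    }
  }

  public static void main(String[] args) {
    System.out.println(parseDateTime("1675082096789000"));        // micros input
    System.out.println(parseDateTime("2023-01-30T12:34:56.789")); // string input
  }
}
```

Either way, both inputs land on the same LocalDateTime type, which is what the DateTime logical type expects.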
> Also, let's add a log to the catch clause justifying swallowing the exception here.

Wouldn't that be problematic with large writes? It would log for every row with a DateTime in String format.
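If a log is still wanted, one option (a sketch only, not what this PR does) is to emit it at most once per worker JVM rather than once per row. The class and method names below are hypothetical:

```java
import java.util.concurrent.atomic.AtomicBoolean;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class DateTimeParseLogging {
  private static final Logger LOG = LoggerFactory.getLogger(DateTimeParseLogging.class);

  // Flipped on the first fallback so large writes don't log once per row.
  private static final AtomicBoolean LOGGED_FALLBACK = new AtomicBoolean(false);

  static void logFallbackOnce(NumberFormatException e) {
    if (LOGGED_FALLBACK.compareAndSet(false, true)) {
      LOG.info(
          "DATETIME value was not numeric micros; falling back to string parsing. "
              + "Further occurrences will not be logged.",
          e);
    }
  }
}
```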