[ 
https://issues.apache.org/jira/browse/NIFI-14928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pierre Villard updated NIFI-14928:
----------------------------------
    Status: Patch Available  (was: Open)

> PutBigQuery fails to send data into DATETIME columns
> ----------------------------------------------------
>
>                 Key: NIFI-14928
>                 URL: https://issues.apache.org/jira/browse/NIFI-14928
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Extensions
>            Reporter: Pierre Villard
>            Assignee: Pierre Villard
>            Priority: Major
>
> PutBigQuery cannot send data to DATETIME columns right now.
> In the proto message, DATETIME is mapped to INT64, but if the reader supplies 
> a long epoch-based representation of the timestamp, we get an error such as:
> {code:java}
> Cannot return an invalid datetime value of 694224000000000 microseconds 
> expressed as civil time, according to civil_time encoding. The range of valid 
> datetime values is [0001-01-01 00:00:00, 9999-12-31 23:59:59.999999] on field 
> datetime. on field birth. {code}
> The failure comes from how DATETIME values must be encoded for BigQuery 
> Storage Write API. We are currently sending microseconds since epoch (valid 
> for TIMESTAMP), but DATETIME requires a packed “civil” datetime micros value 
> via CivilTimeEncoder.
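As a rough sketch of the conversion described above (hypothetical illustration, not the actual patch): the fix should build a civil `LocalDateTime` from the epoch micros and encode it with `CivilTimeEncoder.encodePacked64DatetimeMicros` from google-cloud-bigquerystorage. The manual bit-packing below only illustrates the ZetaSQL packed civil-time layout that encoder produces; real code should call the library method instead.

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class CivilTimeSketch {

    // Illustrative re-implementation of the packed64 civil DATETIME(micros)
    // layout produced by CivilTimeEncoder.encodePacked64DatetimeMicros:
    // | year:14 | month:4 | day:5 | hour:5 | minute:6 | second:6 | micros:20 |
    static long encodePacked64DatetimeMicros(LocalDateTime dt) {
        long packed = dt.getYear();
        packed = (packed << 4) | dt.getMonthValue();
        packed = (packed << 5) | dt.getDayOfMonth();
        packed = (packed << 5) | dt.getHour();
        packed = (packed << 6) | dt.getMinute();
        packed = (packed << 6) | dt.getSecond();
        packed = (packed << 20) | (dt.getNano() / 1_000);
        return packed;
    }

    public static void main(String[] args) {
        // 694224000000000 micros since epoch (the value from the error
        // message above) is 1992-01-01T00:00:00Z.
        long epochMicros = 694_224_000_000_000L;

        // Step 1: turn epoch micros into a civil (wall-clock) datetime.
        // Assumes the source value is UTC-based; the real fix may need to
        // honor the record's own zone semantics.
        LocalDateTime civil = LocalDateTime.ofEpochSecond(
                epochMicros / 1_000_000,
                (int) (epochMicros % 1_000_000) * 1_000,
                ZoneOffset.UTC);

        // Step 2: pack it into the civil-time encoding BigQuery expects
        // for DATETIME fields, instead of sending raw epoch micros.
        System.out.println(civil);
        System.out.println(encodePacked64DatetimeMicros(civil));
    }
}
```

Sending the packed value keeps the INT64 proto type but stays inside the valid civil range `[0001-01-01 00:00:00, 9999-12-31 23:59:59.999999]`.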



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
