[
https://issues.apache.org/jira/browse/NIFI-8223?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17282761#comment-17282761
]
Joe Witt commented on NIFI-8223:
--------------------------------
I see this, mattyb. Will keep an eye on the CI build. Code looks good. Will pull
into 1.13 if no issues. Starting the RC on the heels of this if so.
> PutDatabaseRecord should use table column datatype instead of field datatype
> ----------------------------------------------------------------------------
>
> Key: NIFI-8223
> URL: https://issues.apache.org/jira/browse/NIFI-8223
> Project: Apache NiFi
> Issue Type: Improvement
> Components: Extensions
> Reporter: Matt Burgess
> Assignee: Matt Burgess
> Priority: Minor
> Fix For: 1.13.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> When PutDatabaseRecord calls setObject() to insert a field value into a
> prepared statement, it passes in the SQL type as determined from the NiFi
> record field's type. Most of the time this matches the table column's data
> type, or else an error would occur when trying to put incompatible values into
> the column.
> However, in the case of the BIGINT and TIMESTAMP types, the field could be
> inferred to be BIGINT when the column is of type TIMESTAMP. There's no way to
> know whether a large integer corresponds to a "plain" number or, for example,
> a number of (milli)seconds. In this case PutDatabaseRecord throws an error
> because it tries to put a BIGINT value into a TIMESTAMP field.
> This Jira proposes to improve this by comparing the field and column
> datatypes. If they match, either can be used. If they don't match, attempt to
> convert the value to the column datatype and use the column datatype in
> setObject(). If conversion is unsuccessful, fall back to the current behavior
> of using the field datatype and value.
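The compare-convert-fall-back logic proposed above could be sketched roughly as follows. This is a minimal illustration, not the actual NiFi code: the class, helper names, and the assumption that a BIGINT feeding a TIMESTAMP column holds epoch milliseconds are all hypothetical.

```java
import java.sql.Timestamp;
import java.sql.Types;

public class DatatypeFallbackSketch {

    /** The value and SQL type chosen to pass to PreparedStatement.setObject(). */
    static final class Binding {
        final Object value;
        final int sqlType;
        Binding(Object value, int sqlType) { this.value = value; this.sqlType = sqlType; }
    }

    static Binding chooseBinding(Object value, int fieldSqlType, int columnSqlType) {
        if (fieldSqlType == columnSqlType) {
            // Field and column types match: either can be used.
            return new Binding(value, columnSqlType);
        }
        try {
            // Types differ: attempt to convert the value to the column's type
            // and use the column type in setObject().
            return new Binding(convert(value, columnSqlType), columnSqlType);
        } catch (IllegalArgumentException e) {
            // Conversion failed: fall back to the field's type and raw value
            // (the current behavior).
            return new Binding(value, fieldSqlType);
        }
    }

    static Object convert(Object value, int targetSqlType) {
        // Illustrative conversion only: treat a BIGINT bound for a TIMESTAMP
        // column as epoch milliseconds (an assumption, not NiFi's rule set).
        if (targetSqlType == Types.TIMESTAMP && value instanceof Long) {
            return new Timestamp((Long) value);
        }
        throw new IllegalArgumentException("no conversion to SQL type " + targetSqlType);
    }

    public static void main(String[] args) {
        Binding b = chooseBinding(1612310400000L, Types.BIGINT, Types.TIMESTAMP);
        // The BIGINT value is converted and bound with the column's TIMESTAMP type.
        System.out.println(b.value.getClass().getSimpleName() + " as SQL type " + b.sqlType);
    }
}
```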
--
This message was sent by Atlassian Jira
(v8.3.4#803005)