[ https://issues.apache.org/jira/browse/NIFI-2531?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15415828#comment-15415828 ]

ASF subversion and git services commented on NIFI-2531:
-------------------------------------------------------

Commit d544274881f738d387ffad51b85c344a8f097238 in nifi's branch 
refs/heads/master from [~mattyb149]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=d544274 ]

NIFI-2531: Fixed JdbcCommon handling of BigInteger objects for Avro

This closes #823.

Signed-off-by: Bryan Bende <[email protected]>


> SQL-to-Avro processors do not convert BIGINT correctly
> ------------------------------------------------------
>
>                 Key: NIFI-2531
>                 URL: https://issues.apache.org/jira/browse/NIFI-2531
>             Project: Apache NiFi
>          Issue Type: Bug
>    Affects Versions: 1.0.0, 0.7.0
>            Reporter: Matt Burgess
>            Assignee: Matt Burgess
>             Fix For: 1.0.0
>
>
> For the SQL-to-Avro processors that use JdbcCommon (such as ExecuteSQL), if a 
> BigInteger object is put into an Avro record, it is stored as a String. 
> However, when the Avro schema is created and the SQL type of the column is 
> BIGINT, the schema contains the expected type "long" (actually a union of 
> null and long, to allow for null values). This causes errors such as:
> UnresolvedUnionException: not in union: ["null", "long"]
> If a BigInteger is retrieved from the result set and the SQL type is BIGINT, 
> its value is expected to fit into 8 bytes and should therefore be converted 
> to a long before being stored in the Avro record.
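A minimal sketch of the conversion the description calls for, assuming a hypothetical helper (`toAvroValue` is not the actual JdbcCommon method name): when the SQL column type is BIGINT and the JDBC driver hands back a BigInteger, narrow it to a long so it matches the Avro union ["null", "long"] instead of falling through to the String case.

```java
import java.math.BigInteger;

public class BigIntToAvroLong {

    // Hypothetical helper illustrating the fix, not the real JdbcCommon API:
    // narrow a BigInteger to long when the SQL type is BIGINT; otherwise
    // fall back to the old String behavior.
    static Object toAvroValue(Object value, boolean sqlTypeIsBigint) {
        if (value instanceof BigInteger && sqlTypeIsBigint) {
            // A BIGINT value fits in 8 bytes, so longValueExact() is safe;
            // it throws ArithmeticException if the value ever overflowed.
            return ((BigInteger) value).longValueExact();
        }
        return value == null ? null : value.toString();
    }

    public static void main(String[] args) {
        // With the fix, the record value is a Long, matching ["null", "long"]
        Object v = toAvroValue(new BigInteger("9007199254740993"), true);
        System.out.println(v.getClass().getSimpleName() + " " + v);
    }
}
```

With the String value the Avro writer raises UnresolvedUnionException against the ["null", "long"] union; with the Long it serializes cleanly.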



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
