[
https://issues.apache.org/jira/browse/NIFI-5788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16675072#comment-16675072
]
ASF GitHub Bot commented on NIFI-5788:
--------------------------------------
GitHub user vadimar opened a pull request:
https://github.com/apache/nifi/pull/3128
NIFI-5788: Introduce batch size limit in PutDatabaseRecord processor
Thank you for submitting a contribution to Apache NiFi.
In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:
### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number
you are trying to resolve? Pay particular attention to the hyphen "-" character.
- [ ] Has your PR been rebased against the latest commit within the target
branch (typically master)?
- [ ] Is your initial contribution a single, squashed commit?
### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies
licensed in a way that is compatible for inclusion under [ASF
2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the LICENSE file, including the main
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to
.name (programmatic access) for each of the new properties?
### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in
which it is rendered?
### Note:
Please ensure that once the PR is submitted, you check travis-ci for build
issues and submit an update to your PR as soon as possible.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/vadimar/nifi-1 nifi-5788
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/nifi/pull/3128.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #3128
----
commit 2f36c8b1a732e249238f5f6f53968e84c05b497c
Author: vadimar <varshavsky@...>
Date: 2018-11-05T11:15:12Z
NIFI-5788: Introduce batch size limit in PutDatabaseRecord processor
----
> Introduce batch size limit in PutDatabaseRecord processor
> ---------------------------------------------------------
>
> Key: NIFI-5788
> URL: https://issues.apache.org/jira/browse/NIFI-5788
> Project: Apache NiFi
> Issue Type: Bug
> Components: Core Framework
> Affects Versions: 1.8.0
> Environment: Teradata DB
> Reporter: Vadim
> Priority: Major
> Labels: pull-request-available
> Fix For: 1.8.0
>
>
> Certain JDBC drivers do not support unlimited batch sizes in INSERT/UPDATE
> prepared SQL statements. Specifically, the Teradata JDBC driver
> ([https://downloads.teradata.com/download/connectivity/jdbc-driver]) fails
> the SQL statement when the batch exceeds the driver's internal limits.
> Dividing the data into smaller chunks before PutDatabaseRecord is applied can
> work around the issue in certain scenarios, but in general this workaround is
> incomplete: the SQL statements would be executed in separate transaction
> contexts, so data integrity would not be preserved.
> The proposed solution:
> * introduce a new optional property on the *PutDatabaseRecord* processor,
> *batch_size*, which defines the maximum number of records per INSERT/UPDATE
> batch; the default value of -1 (unlimited) preserves the old behavior
> * divide the input into batches of the specified size and invoke
> PreparedStatement.executeBatch() for each batch
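The batching loop described above can be sketched as follows. This is an illustrative simulation, not the actual PutDatabaseRecord code: the class, method, and parameter names are hypothetical, and the comments mark where the real processor would call PreparedStatement.addBatch()/executeBatch().

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the proposed batching behavior (hypothetical names, assuming
// batchSize <= 0 means "unlimited", matching the suggested -1 default).
public class BatchedExecutor {

    // Returns the size of each simulated executeBatch() call for a run of
    // recordCount records with the given batch size limit.
    public static List<Integer> flushSizes(int recordCount, int batchSize) {
        List<Integer> flushes = new ArrayList<>(); // one entry per executeBatch()
        int pending = 0;                           // records added since last flush
        for (int i = 0; i < recordCount; i++) {
            pending++;                             // ps.addBatch() would go here
            if (batchSize > 0 && pending == batchSize) {
                flushes.add(pending);              // ps.executeBatch() on full batch
                pending = 0;
            }
        }
        if (pending > 0) {
            flushes.add(pending);                  // flush the final partial batch
        }
        return flushes;
    }
}
```

With batchSize = -1 the loop degenerates to a single flush of all records, which is exactly the old single-executeBatch behavior; with batchSize = 3 and 10 records it flushes batches of 3, 3, 3, and 1, all within the same transaction context.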
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)