[jira] [Updated] (NIFI-7785) CaptureChangeMySQL processor captures enum values as "INDEX of those values" from Mysql DB"
[ https://issues.apache.org/jira/browse/NIFI-7785?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] zeyk updated NIFI-7785: --- Attachment: Screenshot from 2020-09-15 05-29-48.png > CaptureChangeMySQL processor captures enum values as "INDEX of those values" > from Mysql DB" > --- > > Key: NIFI-7785 > URL: https://issues.apache.org/jira/browse/NIFI-7785 > Project: Apache NiFi > Issue Type: Bug > Components: Tools and Build >Affects Versions: 1.11.4 > Environment: Ubuntu EC2 instance with 8 GB ram >Reporter: zeyk >Priority: Major > Labels: features > Attachments: Screenshot from 2020-09-15 05-29-48.png, flow.xml.gz > > > CaptureChangeMySQL processor captures enum values as "INDEX of those values" > rather than the values specified. > for example: > A table has columns (id int, fruit enum ('apple','pears','orange'), price int) > On doing an insert: > insert into sample values (1,'apple',45); > insert into sample values (2,'pears',56); > I have used CaptureChangeMySql processor to capture the CDC changes, the > processor does the capture but captures the enum column alone based on its > index like the sample below: > for 1st insert: > > { > "type":"insert", > "timestamp":1599004442000, > "binlog_filename":"mysql-bin-changelog.39", > "binlog_position":1537835, > "database":"sample", > "table_name":"sample", > "table_id":82, > "columns":[ > { > "id":1, > "name":"id", > "column_type":-5, > "value":139 > }, > { > "id":2, > "name":"fruit", > "column_type":12, > "value":0 > }, > { > "id":3, > "name":"price", > "column_type":12, > "value":45 > } > ] > } > > for 2nd insert: > > { > "type":"insert", > "timestamp":1599004442000, > "binlog_filename":"mysql-bin-changelog.39", > "binlog_position":1537835, > "database":"sample", > "table_name":"sample", > "table_id":82, > "columns":[ > { > "id":1, > "name":"id", > "column_type":-5, > "value":139 > }, > { > "id":2, > "name":"fruit", > "column_type":12, > "value":1 > }, > { > "id":3, > "name":"price", > "column_type":12, > "value":56 > } > ] > } > 
> > So the above has 0 and 1 in place of apple and pears respectively. > > Could any of you help me with this, if there are folks who have faced a similar kind of issue? > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (NIFI-7785) CaptureChangeMySQL processor captures enum values as "INDEX of those values" from Mysql DB"
[ https://issues.apache.org/jira/browse/NIFI-7785?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195876#comment-17195876 ] zeyk commented on NIFI-7785: Hi [~RobertoGarcia], I have tried the above approach and am facing issues with the Groovy script, as shown below: !Screenshot from 2020-09-15 05-29-48.png! > CaptureChangeMySQL processor captures enum values as "INDEX of those values" > from Mysql DB" > --- > > Key: NIFI-7785 > URL: https://issues.apache.org/jira/browse/NIFI-7785 > Project: Apache NiFi > Issue Type: Bug > Components: Tools and Build >Affects Versions: 1.11.4 > Environment: Ubuntu EC2 instance with 8 GB ram >Reporter: zeyk >Priority: Major > Labels: features > Attachments: Screenshot from 2020-09-15 05-29-48.png, flow.xml.gz -- This message was sent by Atlassian Jira (v8.3.4#803005)
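A common workaround for the behavior described in this thread is to post-process the CDC JSON and map the captured index back to its label. Below is a minimal sketch in Python (not the reporter's Groovy script). The database, table, and column names and the 0-based index are taken from the sample events above; the label list itself must be supplied by hand, because the binlog row event does not carry the enum definition:

```python
import json

# Enum labels in DDL order; index 0 -> 'apple', as observed in the captured
# events above. This mapping must be maintained by hand (or refreshed from
# information_schema.columns), since the row event only carries the index.
ENUM_LABELS = {("sample", "sample", "fruit"): ["apple", "pears", "orange"]}

def restore_enum_values(event_json):
    """Replace enum index values with their labels in one CDC event."""
    event = json.loads(event_json)
    key_base = (event["database"], event["table_name"])
    for col in event["columns"]:
        labels = ENUM_LABELS.get(key_base + (col["name"],))
        if labels is not None and isinstance(col["value"], int):
            col["value"] = labels[col["value"]]
    return json.dumps(event)

event = '{"database":"sample","table_name":"sample","columns":[{"id":2,"name":"fruit","column_type":12,"value":0}]}'
print(restore_enum_values(event))
```

In a NiFi flow this logic would typically sit in a scripting processor between CaptureChangeMySQL and the downstream consumers.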
[GitHub] [nifi-minifi-cpp] szaszm commented on a change in pull request #895: MINIFICPP-1352 - Comment out unused parameters (for enabling -Wall)
szaszm commented on a change in pull request #895: URL: https://github.com/apache/nifi-minifi-cpp/pull/895#discussion_r488243927 ## File path: libminifi/include/utils/file/FileUtils.h ## @@ -115,14 +115,15 @@ class FileUtils { * @param force_posix returns the posix path separator ('/'), even when not on posix. Useful when dealing with remote posix paths. * @return the path separator character */ - static char get_separator(bool force_posix = false) { #ifdef WIN32 + static char get_separator(bool force_posix = false) { Review comment: I'd keep this outside the `#ifdef` to avoid confusing simple editors with the multiple braces. Use `(void)force_posix;` inside the function body to suppress the warning. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi-minifi-cpp] szaszm opened a new pull request #908: MINIFICPP-1370 clarify README.md
szaszm opened a new pull request #908: URL: https://github.com/apache/nifi-minifi-cpp/pull/908 ...and fix obvious issues Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [x] Does your PR title start with MINIFICPP-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [x] Has your PR been rebased against the latest commit within the target branch (typically main)? - [x] Is your initial contribution a single, squashed commit? ### For code changes: - [x] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [x] If applicable, have you updated the LICENSE file? - [x] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [x] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi] YolandaMDavis commented on a change in pull request #4522: NIFI-7796: Add Prometheus counters for total bytes sent/received
YolandaMDavis commented on a change in pull request #4522: URL: https://github.com/apache/nifi/pull/4522#discussion_r488230787 ## File path: nifi-nar-bundles/nifi-extension-utils/nifi-prometheus-utils/src/main/java/org/apache/nifi/prometheus/util/PrometheusMetricsUtil.java ## @@ -249,10 +249,8 @@ public static CollectorRegistry createNifiMetrics(NiFiMetricsRegistry nifiMetric nifiMetricsRegistry.setDataPoint(portStatus.getBytesSent(), "AMOUNT_BYTES_SENT", instanceId, portComponentType, portComponentName, portComponentId, parentId); nifiMetricsRegistry.setDataPoint(portStatus.getInputBytes(), "AMOUNT_BYTES_READ", instanceId, portComponentType, portComponentName, portComponentId, parentId); nifiMetricsRegistry.setDataPoint(portStatus.getOutputBytes(), "AMOUNT_BYTES_WRITTEN", instanceId, portComponentType, portComponentName, portComponentId, parentId); -nifiMetricsRegistry.incrementCounter(status.getBytesRead(), "TOTAL_BYTES_READ", instanceId, portComponentType, portComponentName, portComponentId, parentId); -nifiMetricsRegistry.incrementCounter(status.getBytesWritten(), "TOTAL_BYTES_WRITTEN", instanceId, portComponentType, portComponentName, portComponentId, parentId); - nifiMetricsRegistry.incrementCounter(status.getBytesReceived(), "TOTAL_BYTES_RECEIVED", instanceId, portComponentType, portComponentName, portComponentId, parentId); -nifiMetricsRegistry.incrementCounter(status.getBytesSent(), "TOTAL_BYTES_SENT", instanceId, portComponentType, portComponentName, portComponentId, parentId); + nifiMetricsRegistry.incrementCounter(portStatus.getBytesReceived(), "TOTAL_BYTES_RECEIVED", instanceId, portComponentType, portComponentName, portComponentId, parentId); + nifiMetricsRegistry.incrementCounter(portStatus.getBytesSent(), "TOTAL_BYTES_SENT", instanceId, portComponentType, portComponentName, portComponentId, parentId); Review comment: @mattyb149 in this instance we'll need to keep the TOTAL_BYTES_READ and TOTAL_BYTES_WRITTEN counters (previously on lines 252 and 253); however, portStatus.getInputBytes() and portStatus.getOutputBytes() should be used for them respectively, as on lines 250 and 251. ## File path: nifi-nar-bundles/nifi-extension-utils/nifi-prometheus-utils/src/main/java/org/apache/nifi/prometheus/util/PrometheusMetricsUtil.java ## @@ -277,10 +275,8 @@ public static CollectorRegistry createNifiMetrics(NiFiMetricsRegistry nifiMetric nifiMetricsRegistry.setDataPoint(portStatus.getBytesSent(), "AMOUNT_BYTES_SENT", instanceId, portComponentType, portComponentName, portComponentId, parentId); nifiMetricsRegistry.setDataPoint(portStatus.getInputBytes(), "AMOUNT_BYTES_READ", instanceId, portComponentType, portComponentName, portComponentId, parentId); nifiMetricsRegistry.setDataPoint(portStatus.getOutputBytes(), "AMOUNT_BYTES_WRITTEN", instanceId, portComponentType, portComponentName, portComponentId, parentId); -nifiMetricsRegistry.incrementCounter(status.getBytesRead(), "TOTAL_BYTES_READ", instanceId, portComponentType, portComponentName, portComponentId, parentId); -nifiMetricsRegistry.incrementCounter(status.getBytesWritten(), "TOTAL_BYTES_WRITTEN", instanceId, portComponentType, portComponentName, portComponentId, parentId); - nifiMetricsRegistry.incrementCounter(status.getBytesReceived(), "TOTAL_BYTES_RECEIVED", instanceId, portComponentType, portComponentName, portComponentId, parentId); Review comment: same as comment above This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Resolved] (MINIFICPP-1342) Prevent patch reapplication in git-based ExternalProjects
[ https://issues.apache.org/jira/browse/MINIFICPP-1342?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Marton Szasz resolved MINIFICPP-1342. - Resolution: Fixed > Prevent patch reapplication in git-based ExternalProjects > > > Key: MINIFICPP-1342 > URL: https://issues.apache.org/jira/browse/MINIFICPP-1342 > Project: Apache NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Marton Szasz >Assignee: Marton Szasz >Priority: Minor > Fix For: 0.8.0 > > Time Spent: 40m > Remaining Estimate: 0h > > Sometimes during rebuild, CMake tries to apply patches to git-based third > parties again and that fails. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (MINIFICPP-1351) PublishKafka notifyStop accesses the connection without synchronization
[ https://issues.apache.org/jira/browse/MINIFICPP-1351?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Marton Szasz updated MINIFICPP-1351: Resolution: Fixed Status: Resolved (was: Patch Available) > PublishKafka notifyStop accesses the connection without synchronization > --- > > Key: MINIFICPP-1351 > URL: https://issues.apache.org/jira/browse/MINIFICPP-1351 > Project: Apache NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Marton Szasz >Assignee: Marton Szasz >Priority: Major > Time Spent: 1h > Remaining Estimate: 0h > > notifyStop doesn't lock the connection_mutex_ before resetting the conn_ to > nullptr. If onTrigger is still running, this crashes MiNiFi. -- This message was sent by Atlassian Jira (v8.3.4#803005)
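The race described in MINIFICPP-1351 — a stop callback resetting a shared connection while onTrigger is still using it — is the classic unsynchronized-teardown pattern. A language-neutral sketch of the fix (Python here for brevity; the member names mirror the ticket, but this is not the MiNiFi C++ source):

```python
import threading

class PublishKafkaLike:
    """Illustrative sketch of guarding a shared connection with a mutex."""

    def __init__(self):
        self.connection_mutex_ = threading.Lock()
        self.conn_ = object()  # stands in for the Kafka connection

    def on_trigger(self):
        # on_trigger holds the lock for the whole time it uses conn_
        with self.connection_mutex_:
            if self.conn_ is None:
                return "no connection"
            return "published"

    def notify_stop(self):
        # The reported bug: resetting conn_ WITHOUT taking this lock
        # races with on_trigger and can crash; the fix is to lock here too.
        with self.connection_mutex_:
            self.conn_ = None

p = PublishKafkaLike()
print(p.on_trigger())
p.notify_stop()
print(p.on_trigger())
```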
[jira] [Commented] (NIFI-7650) UI is not navigable with a keyboard
[ https://issues.apache.org/jira/browse/NIFI-7650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195733#comment-17195733 ] Matt Gilman commented on NIFI-7650: --- Thanks for filing this JIRA [~sardell]! This will be a great addition to NiFi and I look forward to checking it out. I agree that this could be a very large effort and chipping away may be the best approach. > UI is not navigable with a keyboard > --- > > Key: NIFI-7650 > URL: https://issues.apache.org/jira/browse/NIFI-7650 > Project: Apache NiFi > Issue Type: Sub-task > Components: Core UI >Reporter: Shane Ardell >Assignee: Shane Ardell >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (NIFI-7650) UI is not navigable with a keyboard
[ https://issues.apache.org/jira/browse/NIFI-7650?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Shane Ardell reassigned NIFI-7650: -- Assignee: Shane Ardell > UI is not navigable with a keyboard > --- > > Key: NIFI-7650 > URL: https://issues.apache.org/jira/browse/NIFI-7650 > Project: Apache NiFi > Issue Type: Sub-task > Components: Core UI >Reporter: Shane Ardell >Assignee: Shane Ardell >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi-minifi-cpp] szaszm commented on a change in pull request #837: MINIFICPP-1121 - Upgrade spdlog to version 1.8.0
szaszm commented on a change in pull request #837: URL: https://github.com/apache/nifi-minifi-cpp/pull/837#discussion_r488153714 ## File path: cmake/BundledSpdlog.cmake ## @@ -0,0 +1,74 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +function(use_bundled_spdlog SOURCE_DIR BINARY_DIR) +# Define byproducts +if (WIN32) +if ("${CMAKE_BUILD_TYPE}" STREQUAL "Debug") +set(BYPRODUCT "lib/spdlogd.lib") +else() +set(BYPRODUCT "lib/spdlog.lib") +endif() +else() +include(GNUInstallDirs) +if ("${CMAKE_BUILD_TYPE}" STREQUAL "Debug") +set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlogd.a") +else() +set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlog.a") +endif() Review comment: Could you convert this to use generator expressions? Consistent usage of them would make multi-config build possible (e.g. switching config in visual studio after generation, before build). We're not there yet on the project level, but this could be a small step in the right direction. 
https://cmake.org/cmake/help/v3.17/manual/cmake-generator-expressions.7.html I know I was the one to commit the wrong version originally, but here's the fix: ```suggestion set(BYPRODUCT "lib/spdlog$<$<CONFIG:Debug>:d>.lib") else() include(GNUInstallDirs) set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlog$<$<CONFIG:Debug>:d>.a") ``` This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Commented] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195682#comment-17195682 ] Andy LoPresto commented on NIFI-7804: - Thanks for summarizing Bryan. Aiming to have the PR up for this in the evening. > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Assignee: Andy LoPresto >Priority: Critical > Fix For: 1.13.0, 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Assigned] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Andy LoPresto reassigned NIFI-7804: --- Assignee: Andy LoPresto (was: Bryan Bende) > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Assignee: Andy LoPresto >Priority: Critical > Fix For: 1.13.0, 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195681#comment-17195681 ] Bryan Bende commented on NIFI-7804: --- Discussed offline with [~alopresto] and [~markap14]; we agreed on two changes for this ticket... * Splitting _nifi-security-utils_ into two modules, the first would be _nifi-security-utils-core_ which would have no dependencies and would be safe to use in _nifi-standard-services-api_, and then _nifi-security-utils_ would depend on _nifi-security-utils-core_ and would include all the other utility code that requires bouncy castle, etc. * Restore the ClientAuth enum to SSLContextService and make it extend an enum from nifi-security-utils-core The first change addresses the classpath issue outlined in this ticket, and the second change ensures that SSLContextService from 1.12.0 is backward compatible for any extensions built against an older version. > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Assignee: Bryan Bende >Priority: Critical > Fix For: 1.13.0, 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. 
> We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi-registry] kevdoran commented on pull request #302: NIFIREG-415 Add Support for Unicode in X-ProxiedEntitiesChain
kevdoran commented on pull request #302: URL: https://github.com/apache/nifi-registry/pull/302#issuecomment-692225242 Rebased on `main` to (hopefully) fix GHA CI builds This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [nifi] tpalfy commented on a change in pull request #4510: NIFI-7549 Adding Hazelcast based DistributedMapCacheClient support
tpalfy commented on a change in pull request #4510: URL: https://github.com/apache/nifi/pull/4510#discussion_r488057487 ## File path: nifi-nar-bundles/nifi-hazelcast-bundle/nifi-hazelcast-services/src/main/java/org/apache/nifi/hazelcast/services/util/LongUtil.java ## @@ -0,0 +1,66 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.hazelcast.services.util; + +/** + * Helper methods to work with long values effectively. + */ +public final class LongUtil { Review comment: Any reason why we can't (or shouldn't) use guava's `Longs.toByteArray` and `Longs.fromByteArray` instead of this class? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
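For context on the review question above: Guava's `Longs.toByteArray`/`Longs.fromByteArray` implement a fixed big-endian 8-byte encoding, which is presumably what a hand-rolled `LongUtil` provides. The same round trip can be sketched in Python with the standard `struct` module (illustration only, not the NiFi code):

```python
import struct

# Big-endian, 8-byte, signed encoding of a 64-bit integer — the same layout
# Guava's Longs.toByteArray / Longs.fromByteArray produce in Java.
def long_to_bytes(value):
    return struct.pack(">q", value)

def bytes_to_long(data):
    return struct.unpack(">q", data)[0]

# The encoding round-trips exactly, e.g. for a millisecond timestamp:
assert bytes_to_long(long_to_bytes(1599004442000)) == 1599004442000
```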
[GitHub] [nifi] tpalfy commented on a change in pull request #4510: NIFI-7549 Adding Hazelcast based DistributedMapCacheClient support
tpalfy commented on a change in pull request #4510: URL: https://github.com/apache/nifi/pull/4510#discussion_r487977149 ## File path: nifi-api/src/main/java/org/apache/nifi/controller/NodeTypeProvider.java ## @@ -34,4 +36,12 @@ * @return true if this instance is the primary node in the cluster; false otherwise */ boolean isPrimary(); + +/** + * @return In case of the instance is clustered, returns the collection of the host of the expected members + * in the cluster, regardless of their state. This includes the current host. In case of non-clustered instance + * the result will be an empty set. + */ Review comment: ```suggestion * @return Names/IP addresses of all expected hosts in the cluster (including the current one). For a standalone NiFi this returns an empty set instead. */ ``` ## File path: nifi-nar-bundles/nifi-hazelcast-bundle/nifi-hazelcast-services/src/main/java/org/apache/nifi/hazelcast/services/util/LongUtil.java ## @@ -0,0 +1,66 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.hazelcast.services.util; + +/** + * Helper methods to work with long values effectively. 
+ */ +public final class LongUtil { Review comment: Any reason why we can't (or shouldn't) use guava's `Longs.toByteArray` and `Longs.fromByteArray` here? ## File path: nifi-nar-bundles/nifi-hazelcast-bundle/nifi-hazelcast-services-api/src/main/java/org/apache/nifi/hazelcast/services/cachemanager/HazelcastCacheManager.java ## @@ -0,0 +1,36 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ +package org.apache.nifi.hazelcast.services.cachemanager; + +import org.apache.nifi.controller.ControllerService; +import org.apache.nifi.hazelcast.services.cache.HazelcastCache; + +/** + * Controller service responsible for providing cache instances and managing connection with the Hazelcast server. + */ +public interface HazelcastCacheManager extends ControllerService { + +/** + * Returns a cache instance maintaining a Hazelcast connection. + * + * @param name Name of the cache instance. Cache instances having the same name are depending on the same Hazelcast data structure! + * @param ttlInMillis The guaranteed lifetime of a cache entry in milliseconds. In case of 0, the entry will exists until it's deletion. + * + * @return Cache instance. Depending on the implementation it is not guaranteed if it will be a new instance. 
Review comment: ```suggestion * @return Cache instance. Depending on the implementation it is not guaranteed that it will be a new instance. ``` ## File path: nifi-nar-bundles/nifi-hazelcast-bundle/nifi-hazelcast-services-api/src/main/java/org/apache/nifi/hazelcast/services/cache/HazelcastCache.java ## @@ -0,0 +1,120 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed
[GitHub] [nifi-minifi-cpp] lordgamez opened a new pull request #907: MINIFICPP-1365 Use minimal docker image for integration tests
lordgamez opened a new pull request #907: URL: https://github.com/apache/nifi-minifi-cpp/pull/907 Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [ ] Does your PR title start with MINIFICPP-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [ ] Has your PR been rebased against the latest commit within the target branch (typically main)? - [ ] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file? - [ ] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Updated] (NIFI-7809) ListSFTP File Filter Regex documentation needs to be clearer
[ https://issues.apache.org/jira/browse/NIFI-7809?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Opher Shachar updated NIFI-7809: Description: The documentation for "File Filter Regex" property of ListSFTP processor reads: {quote}Provides a Java Regular Expression for filtering Filenames; if a filter is supplied, only files whose names match that Regular Expression will be fetched {quote} But it should be made clear that the Regex *_needs_* to match the _*whole*_ filename. e.g. to filter files ending in ".txt.gz" the following will not work: {code:java} \.txt\.gz$ {code} You'd need to specify: {code:java} .*\.txt\.gz {code} or equally: {code:java} .*\.txt\.gz$ {code} was: The documentation for "File Filter Regex" property of ListSFTP processor reads: {quote}Provides a Java Regular Expression for filtering Filenames; if a filter is supplied, only files whose names match that Regular Expression will be fetched {quote} But it should be made clear that the Regex *_needs_* to match the whole filename. e.g. to filter files ending in ".txt.gz" the following will not work: {code} \.txt\.gz$ {code} You'd need to specify: {code} .*\.txt\.gz {code} or equally: {code} .*\.txt\.gz$ {code} > ListSFTP File Filter Regex documentation needs to be clearer > > > Key: NIFI-7809 > URL: https://issues.apache.org/jira/browse/NIFI-7809 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Affects Versions: 1.11.4 >Reporter: Opher Shachar >Priority: Major > Labels: documentaion, sftp > > The documentation for "File Filter Regex" property of ListSFTP processor > reads: > {quote}Provides a Java Regular Expression for filtering Filenames; if a > filter is supplied, only files whose names match that Regular Expression will > be fetched > {quote} > But it should be made clear that the Regex *_needs_* to match the _*whole*_ > filename. e.g. 
to filter files ending in ".txt.gz" the following will not > work: > {code:java} > \.txt\.gz$ > {code} > You'd need to specify: > {code:java} > .*\.txt\.gz > {code} > or equally: > {code:java} > .*\.txt\.gz$ > {code} -- This message was sent by Atlassian Jira (v8.3.4#803005)
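The behavior described above follows from full-match semantics: Java filename filters of this kind typically use Matcher.matches(), which requires the regex to consume the entire input string, not just a suffix of it. The following is a minimal, stdlib-only sketch of that behavior (the class and method names FilterRegexDemo/accepted are illustrative, not NiFi code, and full-match semantics are assumed as the ticket describes):

```java
import java.util.regex.Pattern;

public class FilterRegexDemo {
    // Full-match filtering: the regex must cover the entire filename,
    // as with java.util.regex.Matcher.matches().
    public static boolean accepted(String filename, String regex) {
        return Pattern.compile(regex).matcher(filename).matches();
    }

    public static void main(String[] args) {
        String name = "report.txt.gz";
        System.out.println(accepted(name, "\\.txt\\.gz$"));   // false: matches only a suffix
        System.out.println(accepted(name, ".*\\.txt\\.gz"));  // true: spans the whole name
        System.out.println(accepted(name, ".*\\.txt\\.gz$")); // true: the $ is redundant here
    }
}
```

Note that with Matcher.find() instead of matches(), the suffix-only pattern `\.txt\.gz$` would succeed, which is exactly the intuition the current property description fails to correct.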
[jira] [Created] (NIFI-7809) ListSFTP File Filter Regex documentation needs to be clearer
Opher Shachar created NIFI-7809: --- Summary: ListSFTP File Filter Regex documentation needs to be clearer Key: NIFI-7809 URL: https://issues.apache.org/jira/browse/NIFI-7809 Project: Apache NiFi Issue Type: Improvement Components: Extensions Affects Versions: 1.11.4 Reporter: Opher Shachar The documentation for "File Filter Regex" property of ListSFTP processor reads: {quote}Provides a Java Regular Expression for filtering Filenames; if a filter is supplied, only files whose names match that Regular Expression will be fetched {quote} But it should be made clear that the Regex *_needs_* to match the whole filename. e.g. to filter files ending in ".txt.gz" the following will not work: {code} \.txt\.gz$ {code} You'd need to specify: {code} .*\.txt\.gz {code} or equally: {code} .*\.txt\.gz$ {code} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (NIFIREG-188) Login by hitting Enter in the password field in web UI
[ https://issues.apache.org/jira/browse/NIFIREG-188?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard resolved NIFIREG-188. Fix Version/s: 0.8.0 Resolution: Fixed > Login by hitting Enter in the password field in web UI > -- > > Key: NIFIREG-188 > URL: https://issues.apache.org/jira/browse/NIFIREG-188 > Project: NiFi Registry > Issue Type: Improvement >Affects Versions: 0.1.0, 0.2.0 >Reporter: Julian Gimbel >Assignee: Kotaro Terada >Priority: Minor > Fix For: 0.8.0 > > Time Spent: 1h > Remaining Estimate: 0h > > While logging in to Nifi Registry it should be possible to hit enter in the > password field to login instead of using the mouse or tab on to the login > button. > Unfortunately I am not experienced enough with angular to do it myself. > It is not as simple as just adding a tag around the dialog fields and > buttons. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi-registry] pvillard31 commented on pull request #147: NIFIREG-188: Login by hitting Enter in the input fields in web UI
pvillard31 commented on pull request #147: URL: https://github.com/apache/nifi-registry/pull/147#issuecomment-692207690 Merged to main branch, thanks @kotarot
[GitHub] [nifi-registry] pvillard31 closed pull request #147: NIFIREG-188: Login by hitting Enter in the input fields in web UI
pvillard31 closed pull request #147: URL: https://github.com/apache/nifi-registry/pull/147
[GitHub] [nifi] asfgit closed pull request #4526: NIFI-7807 PutKudu documentation clarity & examples
asfgit closed pull request #4526: URL: https://github.com/apache/nifi/pull/4526
[GitHub] [nifi] pvillard31 commented on pull request #4526: NIFI-7807 PutKudu documentation clarity & examples
pvillard31 commented on pull request #4526: URL: https://github.com/apache/nifi/pull/4526#issuecomment-692193891 Merged to main branch, thanks @sdairs
[jira] [Resolved] (NIFI-7807) PutKudu documentation clarity & examples
[ https://issues.apache.org/jira/browse/NIFI-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard resolved NIFI-7807. -- Fix Version/s: 1.13.0 Resolution: Fixed > PutKudu documentation clarity & examples > > > Key: NIFI-7807 > URL: https://issues.apache.org/jira/browse/NIFI-7807 > Project: Apache NiFi > Issue Type: Improvement > Components: Documentation Website >Affects Versions: 1.12.1 >Reporter: Alasdair Brown >Assignee: Alasdair Brown >Priority: Trivial > Fix For: 1.13.0 > > Time Spent: 0.5h > Remaining Estimate: 0h > > The PutKudu processor could use some changes to documentation. > > Issue created to track some changes, namely: > * Updating the processor description for more clarity about how the Record > -> Column schema works (i.e. inferred from the record reader) > * Adding an additional details page for more clarity on the above and some > examples to demonstrate > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (NIFI-7807) PutKudu documentation clarity & examples
[ https://issues.apache.org/jira/browse/NIFI-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard updated NIFI-7807: - Affects Version/s: (was: 1.12.1) > PutKudu documentation clarity & examples > > > Key: NIFI-7807 > URL: https://issues.apache.org/jira/browse/NIFI-7807 > Project: Apache NiFi > Issue Type: Improvement > Components: Documentation Website >Reporter: Alasdair Brown >Assignee: Alasdair Brown >Priority: Trivial > Fix For: 1.13.0 > > Time Spent: 0.5h > Remaining Estimate: 0h > > The PutKudu processor could use some changes to documentation. > > Issue created to track some changes, namely: > * Updating the processor description for more clarity about how the Record > -> Column schema works (i.e. inferred from the record reader) > * Adding an additional details page for more clarity on the above and some > examples to demonstrate > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (NIFI-7807) PutKudu documentation clarity & examples
[ https://issues.apache.org/jira/browse/NIFI-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195625#comment-17195625 ] ASF subversion and git services commented on NIFI-7807: --- Commit b3ae27a4cabbd3b4d9e30a18afac00224cd5fc8b in nifi's branch refs/heads/main from abrown [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=b3ae27a ] NIFI-7807 Updating in-class documentation to be more clear & adding additionalDetails with examples Signed-off-by: Pierre Villard This closes #4526. > PutKudu documentation clarity & examples > > > Key: NIFI-7807 > URL: https://issues.apache.org/jira/browse/NIFI-7807 > Project: Apache NiFi > Issue Type: Improvement > Components: Documentation Website >Affects Versions: 1.12.1 >Reporter: Alasdair Brown >Assignee: Alasdair Brown >Priority: Trivial > Time Spent: 10m > Remaining Estimate: 0h > > The PutKudu processor could use some changes to documentation. > > Issue created to track some changes, namely: > * Updating the processor description for more clarity about how the Record > -> Column schema works (i.e. inferred from the record reader) > * Adding an additional details page for more clarity on the above and some > examples to demonstrate > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (NIFI-7616) PrometheusReportingTask: Wrong number of processor active threads reported ?
[ https://issues.apache.org/jira/browse/NIFI-7616?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Matt Burgess updated NIFI-7616: --- Resolution: Duplicate Status: Resolved (was: Patch Available) Resolving this as a duplicate (actually a subset) of NIFI-7796, I will incorporate this change along with the other copy-paste errors (portStatus for example). Thank you for your contribution [~bcharron]! > PrometheusReportingTask: Wrong number of processor active threads reported ? > > > Key: NIFI-7616 > URL: https://issues.apache.org/jira/browse/NIFI-7616 > Project: Apache NiFi > Issue Type: Bug > Components: Extensions >Reporter: Benjamin Charron >Priority: Trivial > Labels: Metrics, reporting_task > Attachments: nifi-metrics.diff > > > The number of _processor_ active threads reported by prometheus > ("nifi_amount_threads_active") seem off. > I think "{{status.getActiveThreadCount()}}" should be > "{{processorStatus.getActiveThreadCount()}}" at > [PrometheusMetricsUtil.java:193|https://github.com/apache/nifi/blob/aa741cc5967f62c3c38c2a47e712b7faa6fe19ff/nifi-nar-bundles/nifi-extension-utils/nifi-prometheus-utils/src/main/java/org/apache/nifi/prometheus/util/PrometheusMetricsUtil.java#L193] > and :194 > -- This message was sent by Atlassian Jira (v8.3.4#803005)
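The suspected copy-paste error can be shown with a toy sketch. The classes below are hypothetical stand-ins for NiFi's status objects (not the real API); they only illustrate how reading the group-level `status` inside a per-processor loop reports the same aggregate count for every processor, while reading `processorStatus` yields the per-processor value:

```java
public class ActiveThreadsDemo {
    // Hypothetical stand-ins for NiFi's status objects (illustrative only).
    public static class ProcStatus {
        final String name;
        final int activeThreadCount;
        public ProcStatus(String name, int activeThreadCount) {
            this.name = name;
            this.activeThreadCount = activeThreadCount;
        }
        public int getActiveThreadCount() { return activeThreadCount; }
    }

    public static class GroupStatus {
        final int activeThreadCount;
        public GroupStatus(int activeThreadCount) { this.activeThreadCount = activeThreadCount; }
        public int getActiveThreadCount() { return activeThreadCount; }
    }

    // Buggy variant: reads the group-level object, mirroring the suspected
    // copy-paste error at PrometheusMetricsUtil.java:193.
    public static int buggyMetric(GroupStatus status, ProcStatus processorStatus) {
        return status.getActiveThreadCount();           // wrong object: group total
    }

    // Fixed variant: reads the per-processor object.
    public static int fixedMetric(GroupStatus status, ProcStatus processorStatus) {
        return processorStatus.getActiveThreadCount();  // per-processor value
    }

    public static void main(String[] args) {
        GroupStatus status = new GroupStatus(7);
        ProcStatus[] procs = {
            new ProcStatus("GetFile", 2),
            new ProcStatus("PutKudu", 5)
        };
        for (ProcStatus p : procs) {
            // Prints: GetFile buggy=7 fixed=2 / PutKudu buggy=7 fixed=5
            System.out.printf("%s buggy=%d fixed=%d%n",
                p.name, buggyMetric(status, p), fixedMetric(status, p));
        }
    }
}
```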
[jira] [Assigned] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bryan Bende reassigned NIFI-7804: - Assignee: Bryan Bende > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Assignee: Bryan Bende >Priority: Critical > Fix For: 1.13.0, 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (NIFI-7802) Unable to create a PropertiesFileLookupService controller service in version 1.12.0
[ https://issues.apache.org/jira/browse/NIFI-7802?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195575#comment-17195575 ] Joe Witt commented on NIFI-7802: reopened because as of yet i've not applied this to the support/nifi-1.12.1 branch as given NIFI-7804 the go forward plan is unclear > Unable to create a PropertiesFileLookupService controller service in version > 1.12.0 > --- > > Key: NIFI-7802 > URL: https://issues.apache.org/jira/browse/NIFI-7802 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.12.0 > Environment: Windows 10, jdk1.8.0_251 >Reporter: David >Assignee: Bryan Bende >Priority: Blocker > Fix For: 1.13.0, 1.12.1 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > Steps to reproduce: > 1) Fresh nifi install > 2) Click on configuration icon on the root PG. > 3) Create a PropertiesFileLookupService controller service > 4) Configure it to point to a simple properties file. > 5) Enable the processor. > Startup fails with the following stack trace: > {{}} > {code:java} > 2020-09-11 11:11:41,383 INFO [Validate Components Thread-5] > o.apache.nifi.security.xml.XXEValidator Validating > c:\Users\drsnyder\temp\test.properties for XXE attack2020-09-11 11:11:41,420 > ERROR [Timer-Driven Process Thread-4] o.a.n.c.s.StandardControllerServiceNode > StandardControllerServiceNode[service=PropertiesFileLookupService[id=7db74eb3-0174-1000-f309-1635af53ee3a], > versionedComponentId=null, > processGroup=StandardProcessGroup[identifier=7db6d1a3-0174-1000-c5c2-3573fd835310,name=NiFi > Flow], active=true] Failed to invoke @OnEnabled method due to > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean: {} > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean > at java.lang.Class.forName0(Native Method) > at java.lang.Class.forName(Class.java:264) > at com.sun.proxy.$Proxy133.(Unknown Source) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) > at > 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) > at > sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) > at java.lang.reflect.Constructor.newInstance(Constructor.java:423) > at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:739) > at > org.apache.commons.configuration2.builder.fluent.Parameters.createParametersProxy(Parameters.java:306) > at > org.apache.commons.configuration2.builder.fluent.Parameters.fileBased(Parameters.java:185) > at > org.apache.nifi.lookup.configuration2.CommonsConfigurationLookupService.onEnabled(CommonsConfigurationLookupService.java:106) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:142) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:130) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:75) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:52) > at > org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:432) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) > at java.util.concurrent.FutureTask.run(FutureTask.java:266) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) > at > 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) > at java.lang.Thread.run(Thread.java:748) > Caused by: java.lang.ClassNotFoundException: > org.apache.commons.beanutils.DynaBean > at java.net.URLClassLoader.findClass(URLClassLoader.java:382) > at java.lang.ClassLoader.loadClass(ClassLoader.java:418) > at java.lang.ClassLoader.loadClass(ClassLoader.java:351) > ... 28 common frames omitted{code} -- This message was sent by Atlassian Jira (v8.3.4#803005)
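The trace above bottoms out in a ClassNotFoundException for org.apache.commons.beanutils.DynaBean, i.e. commons-beanutils is absent from the service's classpath at runtime even though the code compiled against it. A generic, hedged way to probe whether a given class is loadable (not NiFi-specific; `ClasspathCheck` is an illustrative name):

```java
public class ClasspathCheck {
    // Returns true if the named class can be loaded from the current classpath.
    public static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On a plain JDK without the commons-beanutils jar this prints false,
        // matching the ClassNotFoundException at the root of the trace above.
        System.out.println(isLoadable("org.apache.commons.beanutils.DynaBean"));
        System.out.println(isLoadable("java.util.ArrayList")); // always true on the JDK
    }
}
```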
[jira] [Updated] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joe Witt updated NIFI-7804: --- Fix Version/s: 1.13.0 > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Priority: Critical > Fix For: 1.13.0, 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Reopened] (NIFI-7802) Unable to create a PropertiesFileLookupService controller service in version 1.12.0
[ https://issues.apache.org/jira/browse/NIFI-7802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joe Witt reopened NIFI-7802: > Unable to create a PropertiesFileLookupService controller service in version > 1.12.0 > --- > > Key: NIFI-7802 > URL: https://issues.apache.org/jira/browse/NIFI-7802 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.12.0 > Environment: Windows 10, jdk1.8.0_251 >Reporter: David >Assignee: Bryan Bende >Priority: Blocker > Fix For: 1.13.0, 1.12.1 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > Steps to reproduce: > 1) Fresh nifi install > 2) Click on configuration icon on the root PG. > 3) Create a PropertiesFileLookupService controller service > 4) Configure it to point to a simple properties file. > 5) Enable the processor. > Startup fails with the following stack trace: > {{}} > {code:java} > 2020-09-11 11:11:41,383 INFO [Validate Components Thread-5] > o.apache.nifi.security.xml.XXEValidator Validating > c:\Users\drsnyder\temp\test.properties for XXE attack2020-09-11 11:11:41,420 > ERROR [Timer-Driven Process Thread-4] o.a.n.c.s.StandardControllerServiceNode > StandardControllerServiceNode[service=PropertiesFileLookupService[id=7db74eb3-0174-1000-f309-1635af53ee3a], > versionedComponentId=null, > processGroup=StandardProcessGroup[identifier=7db6d1a3-0174-1000-c5c2-3573fd835310,name=NiFi > Flow], active=true] Failed to invoke @OnEnabled method due to > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean: {} > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean > at java.lang.Class.forName0(Native Method) > at java.lang.Class.forName(Class.java:264) > at com.sun.proxy.$Proxy133.(Unknown Source) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) > at > 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) > at java.lang.reflect.Constructor.newInstance(Constructor.java:423) > at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:739) > at > org.apache.commons.configuration2.builder.fluent.Parameters.createParametersProxy(Parameters.java:306) > at > org.apache.commons.configuration2.builder.fluent.Parameters.fileBased(Parameters.java:185) > at > org.apache.nifi.lookup.configuration2.CommonsConfigurationLookupService.onEnabled(CommonsConfigurationLookupService.java:106) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:142) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:130) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:75) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:52) > at > org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:432) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) > at java.util.concurrent.FutureTask.run(FutureTask.java:266) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) > at > 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) > at java.lang.Thread.run(Thread.java:748) > Caused by: java.lang.ClassNotFoundException: > org.apache.commons.beanutils.DynaBean > at java.net.URLClassLoader.findClass(URLClassLoader.java:382) > at java.lang.ClassLoader.loadClass(ClassLoader.java:418) > at java.lang.ClassLoader.loadClass(ClassLoader.java:351) > ... 28 common frames omitted{code} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi-minifi-cpp] szaszm commented on pull request #905: MINIFICPP-1368 - Increment version number on main
szaszm commented on pull request #905: URL: https://github.com/apache/nifi-minifi-cpp/pull/905#issuecomment-692170069 libminifi/CMakeLists.txt needs to be changed, too
[GitHub] [nifi-minifi-cpp] lordgamez opened a new pull request #906: MINIFICPP-1364 Remove rsync from docker builds
lordgamez opened a new pull request #906: URL: https://github.com/apache/nifi-minifi-cpp/pull/906 - Merge Dockerbuild.sh and Containerbuild.sh - Remove rsync command use; added .dockerignore file instead - Use local sources in context instead of copied sources Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [ ] Does your PR title start with MINIFICPP-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [ ] Has your PR been rebased against the latest commit within the target branch (typically main)? - [ ] Is your initial contribution a single, squashed commit? ### For code changes: - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the LICENSE file? - [ ] If applicable, have you updated the NOTICE file? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible.
[jira] [Updated] (MINIFICPP-1370) Clarify README to not guarantee backwards compatibility in 0.x versions
[ https://issues.apache.org/jira/browse/MINIFICPP-1370?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Marton Szasz updated MINIFICPP-1370: Description: 0.x versions are development versions with no compatibility guarantees according to semver. README incorrectly suggests backwards compatibility with minor versions even in 0.x. > Clarify README to not guarantee backwards compatibility in 0.x versions > -- > > Key: MINIFICPP-1370 > URL: https://issues.apache.org/jira/browse/MINIFICPP-1370 > Project: Apache NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Marton Szasz >Priority: Major > > 0.x versions are development versions with no compatibility guarantees > according to semver. README incorrectly suggests backwards compatibility with > minor versions even in 0.x.
[jira] [Created] (MINIFICPP-1370) Clarify README to not guarantee backwards compatibility in 0.x versions
Marton Szasz created MINIFICPP-1370: --- Summary: Clarify README to not guarantee backwards compatibility in 0.x versions Key: MINIFICPP-1370 URL: https://issues.apache.org/jira/browse/MINIFICPP-1370 Project: Apache NiFi MiNiFi C++ Issue Type: Bug Reporter: Marton Szasz
[jira] [Commented] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195545#comment-17195545 ] Joe Witt commented on NIFI-7804: ...in addition to the dependency implications it sounds like from comments in nifi slack https://apachenifi.slack.com/archives/C0L9UPWJZ/p1600097281032100 that we've harmed compatibility. Generally this is ok and we've updated the migration guide to show this for now. Users can upload their custom nar plus the nar versions it was dependent on and this should be fine. However, for something like the SSL Context this is a bit tougher as that is so common. We probably should consider pulling that into a defended API we honor at the same level as the nifi-api. Anyway...we need to really think and resolve this sufficiently before kicking out 1.12.1 > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Priority: Critical > Fix For: 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs.
[jira] [Updated] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joe Witt updated NIFI-7804: --- Priority: Critical (was: Major) > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Priority: Critical > Fix For: 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi-minifi-cpp] hunyadi-dev commented on a change in pull request #837: MINIFICPP-1121 - Upgrade spdlog to version 1.8.0
hunyadi-dev commented on a change in pull request #837: URL: https://github.com/apache/nifi-minifi-cpp/pull/837#discussion_r487698416

## File path: cmake/BundledSpdlog.cmake ##

@@ -0,0 +1,70 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+function(use_bundled_spdlog SOURCE_DIR BINARY_DIR)
+    # Define byproducts
+    if (WIN32)
+        set(BYPRODUCT "lib/spdlog.lib")
+    else()
+        include(GNUInstallDirs)
+        if ("${CMAKE_BUILD_TYPE}" STREQUAL "Debug")
+            set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlogd.a")
+        else()
+            set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlog.a")
+        endif()
+    endif()
+
+    # Set build options
+    set(SPDLOG_SOURCE_DIR "${BINARY_DIR}/thirdparty/spdlog-src")
+    set(SPDLOG_INSTALL_DIR "${BINARY_DIR}/thirdparty/spdlog-install")
+    set(SPDLOG_LIBRARY "${SPDLOG_INSTALL_DIR}/${BYPRODUCT}")
+    set(SPDLOG_CMAKE_ARGS ${PASSTHROUGH_CMAKE_ARGS}
+        "-DCMAKE_INSTALL_PREFIX=${SPDLOG_INSTALL_DIR}"
+        "-DSPDLOG_BUILD_EXAMPLE=OFF"
+        "-DSPDLOG_BUILD_TESTS=OFF"
+        "-DSPDLOG_BUILD_TESTING=OFF"
+        "-DSPDLOG_BUILD_BENCH=OFF"
+        "-DSPDLOG_BUILD_SHARED=OFF")
+
+    # Build project
+    ExternalProject_Add(
+        spdlog-external
+        URL "https://github.com/gabime/spdlog/archive/v1.7.0.zip"

Review comment: Based on the changelog, I don't see anything that should be broken by the version update:
done.
[GitHub] [nifi] mattyb149 opened a new pull request #4527: NIFI-7800: Fix line endings for changed files
mattyb149 opened a new pull request #4527: URL: https://github.com/apache/nifi/pull/4527 Thank you for submitting a contribution to Apache NiFi. Please provide a short description of the PR here: Description of PR The previous PR for NIFI-7800 was committed with Windows line endings; this PR returns them to Unix line endings. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken: ### For all changes: - [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message? - [ ] Does your PR title start with **NIFI-XXXX** where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. - [ ] Has your PR been rebased against the latest commit within the target branch (typically `main`)? - [ ] Is your initial contribution a single, squashed commit? _Additional commits in response to PR reviewer feedback should be made on this branch and pushed to allow change tracking. Do not `squash` or use `--force` when pushing to allow for clean monitoring of changes._ ### For code changes: - [ ] Have you ensured that the full suite of tests is executed via `mvn -Pcontrib-check clean install` at the root `nifi` folder? - [ ] Have you written or updated unit tests to verify your changes? - [ ] Have you verified that the full build is successful on JDK 8? - [ ] Have you verified that the full build is successful on JDK 11? - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? - [ ] If applicable, have you updated the `LICENSE` file, including the main `LICENSE` file under `nifi-assembly`? - [ ] If applicable, have you updated the `NOTICE` file, including the main `NOTICE` file found under `nifi-assembly`?
- [ ] If adding new Properties, have you added `.displayName` in addition to `.name` (programmatic access) for each of the new properties? ### For documentation related changes: - [ ] Have you ensured that format looks appropriate for the output in which it is rendered? ### Note: Please ensure that once the PR is submitted, you check GitHub Actions CI for build issues and submit an update to your PR as soon as possible.
[jira] [Updated] (NIFI-7804) Dependencies ending up in nifi-standard-services-api
[ https://issues.apache.org/jira/browse/NIFI-7804?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joe Witt updated NIFI-7804: --- Fix Version/s: 1.12.1 > Dependencies ending up in nifi-standard-services-api > > > Key: NIFI-7804 > URL: https://issues.apache.org/jira/browse/NIFI-7804 > Project: Apache NiFi > Issue Type: Bug >Affects Versions: 1.10.0, 1.11.0, 1.12.0, 1.11.1, 1.11.2, 1.11.3, 1.11.4 >Reporter: Bryan Bende >Priority: Major > Fix For: 1.12.1 > > > One of the changes in https://issues.apache.org/jira/browse/NIFI-7407 was a > refactoring to the SSLContextService interface which added classes from > nifi-security-utils. The result is that nifi-standard-services-api-nar now > has transitive dependencies of nifi-security-utils included... > {code:java} > bcpkix-jdk15on-1.66.jar > bcprov-jdk15on-1.66.jar > bcrypt-0.9.0.jar > bytes-1.3.0.jar > commons-codec-1.14.jar > commons-lang3-3.9.jar {code} > This means any NAR that has a parent of nifi-standard-services-api-nar, which > is most of them, now has these on the classpath which may conflict with > versions of the same libraries used in the child NAR. > We should come up with a way to not depend on nifi-security-utils, or split > it up into more isolated modules so that these dependencies don't get brought > in to the service APIs. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (NIFI-7808) PutAzureBlobStorage fails large files with The request body is too large and exceeds the maximum permissible limit
Jeff Dix created NIFI-7808: -- Summary: PutAzureBlobStorage fails large files with The request body is too large and exceeds the maximum permissible limit Key: NIFI-7808 URL: https://issues.apache.org/jira/browse/NIFI-7808 Project: Apache NiFi Issue Type: Bug Affects Versions: 1.12.0 Environment: NiFi for Mac OS using Homebrew Reporter: Jeff Dix The PutAzureBlobStorage processor fails consistently with FlowFiles larger than 50MB with the message _The request body is too large and exceeds the maximum permissible limit_. When the error occurs, in some cases the file has been completely uploaded to ADLS Gen 2, but in others the blob is 0 bytes. I have had a couple of instances where the error did not occur and the large FlowFile was successfully processed, but this is rare. It seems to be related to an Azure SDK, so this ticket might be helpful: [https://github.com/Azure/azure-storage-blob-go/issues/141] {code:java} // 2020-09-14 10:25:33,221 ERROR [Timer-Driven Process Thread-7] o.a.n.p.a.s.PutAzureDataLakeStorage PutAzureDataLakeStorage[id=7ddcff2f-0174-1000-874e-8e77a63b2d08] Failed to create file on Azure Data Lake Storage: com.azure.storage.file.datalake.models.DataLakeStorageException: Status code 413, "{"error":{"code":"RequestBodyTooLarge","message":"The request body is too large and exceeds the maximum permissible limit.\nRequestId:0aec5867-f01f-0027-3fab-8a48b300\nTime:2020-09-14T15:25:06.3113287Z"}}" com.azure.storage.file.datalake.models.DataLakeStorageException: Status code 413, "{"error":{"code":"RequestBodyTooLarge","message":"The request body is too large and exceeds the maximum permissible limit.\nRequestId:0aec5867-f01f-0027-3fab-8a48b300\nTime:2020-09-14T15:25:06.3113287Z"}}" at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at 
java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.azure.core.http.rest.RestProxy.instantiateUnexpectedException(RestProxy.java:320) at com.azure.core.http.rest.RestProxy.lambda$ensureExpectedStatus$3(RestProxy.java:361) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:118) at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1755) at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.signalCached(MonoCacheTime.java:320) at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.onNext(MonoCacheTime.java:337) at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2317) at reactor.core.publisher.MonoCacheTime$CoordinatorSubscriber.onSubscribe(MonoCacheTime.java:276) at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:191) at reactor.core.publisher.MonoFlatMap.subscribeOrReturn(MonoFlatMap.java:53) at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:57) at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) at reactor.core.publisher.MonoCacheTime.subscribeOrReturn(MonoCacheTime.java:132) at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:57) at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:150) at reactor.core.publisher.FluxMap$MapSubscriber.onNext(FluxMap.java:114) at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:123) at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:112) at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:213) at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:123) at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:178) at reactor.core.publisher.FluxContextStart$ContextStartSubscriber.onNext(FluxContextStart.java:96) at 
reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1755) at reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:121) at reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:252) at reactor.core.publisher.FluxMap$MapSubscriber.onComplete(FluxMap.java:136) at reactor.netty.channel.FluxReceive.onInboundComplete(FluxReceive.java:366) at reactor.netty.channel.ChannelOperations.onInboundComplete(ChannelOperations.java:367) at reactor.netty.channel.ChannelOperations.terminate(ChannelOperations.java:423) at
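The 413 in the report above comes from exceeding a per-request body cap on the storage service, which is why large payloads are normally staged as multiple blocks or appends rather than sent in one request. As a service-agnostic illustration of that staging step (the 4 MB block size below is illustrative only, not Azure's actual limit):

```java
import java.util.ArrayList;
import java.util.List;

public class BlockSplitter {
    // Split a payload into blocks of at most maxBlockSize bytes;
    // each block would then be uploaded (staged/appended) separately.
    static List<byte[]> split(byte[] payload, int maxBlockSize) {
        if (maxBlockSize <= 0) {
            throw new IllegalArgumentException("maxBlockSize must be positive");
        }
        List<byte[]> blocks = new ArrayList<>();
        for (int off = 0; off < payload.length; off += maxBlockSize) {
            int len = Math.min(maxBlockSize, payload.length - off);
            byte[] block = new byte[len];
            System.arraycopy(payload, off, block, 0, len);
            blocks.add(block);
        }
        return blocks;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[10 * 1024 * 1024]; // 10 MB stand-in payload
        List<byte[]> blocks = split(payload, 4 * 1024 * 1024); // illustrative 4 MB cap
        System.out.println(blocks.size() + " blocks"); // prints "3 blocks"
    }
}
```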
[GitHub] [nifi] turcsanyip edited a comment on pull request #4464: NIFI-4303 Add routingKey to ConsumeAMQP processor
turcsanyip edited a comment on pull request #4464: URL: https://github.com/apache/nifi/pull/4464#issuecomment-692136255 `ConsumeKafka` and `ConsumeJMS` add the destination as an attribute (`kafka.topic` / `jms_destination`). It is true, though, that the destination property can come from the variable registry in the case of those processors, so it is more dynamic. In general, it is easier to use this info in the downstream flow if it is available on the FlowFile. Anyway, I'm not insisting on it. We can add it later if it is needed by a real use case.
[GitHub] [nifi] turcsanyip commented on pull request #4464: NIFI-4303 Add routingKey to ConsumeAMQP processor
turcsanyip commented on pull request #4464: URL: https://github.com/apache/nifi/pull/4464#issuecomment-692136255 `ConsumeKafka` and `ConsumeJMS` add the destination as an attribute (`kafka.topic` / `jms_destination`). It is true, though, that the destination property can come from the variable registry in the case of those processors, so it is more dynamic. In general, it is easier to use this info in the downstream flow if it is available on the FlowFile.
[jira] [Updated] (NIFI-7742) Controller Services searchable, but selecting from search results causes an error
[ https://issues.apache.org/jira/browse/NIFI-7742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Scott Aslan updated NIFI-7742: -- Fix Version/s: 1.12.1 Resolution: Fixed Status: Resolved (was: Patch Available) > Controller Services searchable, but selecting from search results causes an > error > - > > Key: NIFI-7742 > URL: https://issues.apache.org/jira/browse/NIFI-7742 > Project: Apache NiFi > Issue Type: Bug > Components: Core UI >Affects Versions: 1.12.0 >Reporter: Andrew M. Lim >Assignee: Shane Ardell >Priority: Major > Fix For: 1.12.1 > > Time Spent: 1h 20m > Remaining Estimate: 0h > > Controller service search capability added in NIFI-5925. However, if you > select one in the search results, instead of navigating to the controller > service, an error dialog is displayed with the message: > Unable to find the specified component. > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi] scottyaslan commented on pull request #4524: NIFI-7742: Add case for controller service selections
scottyaslan commented on pull request #4524: URL: https://github.com/apache/nifi/pull/4524#issuecomment-692114329 @sardell thanks for this contribution! This has been merged to master... although I forgot to squash your commits :(
[jira] [Comment Edited] (NIFI-7785) CaptureChangeMySQL processor captures enum values as "INDEX of those values" from Mysql DB"
[ https://issues.apache.org/jira/browse/NIFI-7785?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195173#comment-17195173 ] Roberto Garcia edited comment on NIFI-7785 at 9/14/20, 2:53 PM:

Table Structure
{code:java}
CREATE TABLE `sample` (
  `id` int NOT NULL AUTO_INCREMENT,
  `fruit` enum('apple','pear','orange') NOT NULL,
  `price` int NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
{code}
I use an "ExecuteGroovyScript" processor. I added two properties:
||Property||Value||Comment||
|DBCPConnectionPoolName|SampleMySQLPool|this is your DBCPConnectionPool 1.12.0 Database Connection URL: jdbc:mysql://localhost:3306/sample Database Driver Class Name: com.mysql.cj.jdbc.Driver Database Driver Location(s): /usr/share/java/mysql-connector-java-8.0.21.jar|
|sqlCmd|SELECT REPLACE(REPLACE(REPLACE(REPLACE(column_type,'enum',''),')',''),'(',''),'\'','') enums FROM INFORMATION_SCHEMA.COLUMNS WHERE table_name='sample' AND column_name='fruit'|this query gets the enum values for the column "fruit"|

Finally, here is my Groovy script:
{code:java}
import java.util.regex.Matcher
import org.apache.nifi.controller.ControllerService
import groovy.sql.Sql

// ExecuteGroovyScript attributes
def serviceName = DBCPConnectionPoolName.value
def sqlCmdString = sqlCmd.value

// get controller service lookup from context
def lookup = context.controllerServiceLookup

// search for serviceName in controller services
def dbcpServiceId = lookup.getControllerServiceIdentifiers(ControllerService).find { cs ->
    lookup.getControllerServiceName(cs) == serviceName
}

// get the service from the service id
def service = lookup.getControllerService(dbcpServiceId)

// connect to the service
def conn = service.getConnection()
if (!conn) {
    log.error("Failed to connect to " + serviceName)
    return
}

try {
    flowFile = session.get()
    flowFile = session.write(flowFile, { inputStream, outputStream ->
        def content = org.apache.commons.io.IOUtils.toString(inputStream, java.nio.charset.StandardCharsets.UTF_8)
        content = content.trim()
        def sql = new Sql(conn)
        if (!sql) {
            log.error("Failed to get SQL connection")
            return
        }
        def row = sql.firstRow(sqlCmdString)
        def enums = "${row.enums}".trim()
        def enumArr = enums.split(',')
        String index = ""
        Matcher regexMatcher = content =~ /(?s)(?<="id":2,"value":)(\d+?)(?=})/
        if (regexMatcher.find()) {
            index = regexMatcher.group(1)
        }
        def enumText = enumArr[index.toInteger() - 1]
        String cdcConverted = content.replaceAll(/(?s)(?<="id":2,"value":)(\d+?)(?=})/, "\"" + enumText + "\"")
        outputStream.write(cdcConverted.getBytes("UTF-8"))
    } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (e) {
    log.error('Scripting error ' + sqlCmd, e)
    session.transfer(flowFile, REL_FAILURE)
}
// Release the connection; this is important as it will otherwise block new executions
conn?.close()
{code}
*Input:*
{code:java}
{"type":"insert","timestamp":1600053851000,"binlog_filename":"binlog.19","binlog_position":3402,"database":null,"table_name":null,"table_id":null,"columns":[{"id":1,"value":57},{"id":2,"value":2},{"id":3,"value":300}]}
{code}
*Output:*
{code:java}
{"type":"insert","timestamp":1600053851000,"binlog_filename":"binlog.19","binlog_position":3402,"database":null,"table_name":null,"table_id":null,"columns":[{"id":1,"value":57},{"id":2,"value":"pear"},{"id":3,"value":300}]}
{code}
was (Author: robertogarcia): I use an "ExecuteGroovyScript" Processor I added two properties ||Property||Value||Comment|| |DBCPConnectionPoolName|SampleMySQLPool|this is your DBCPConnectionPool 1.12.0 Database Connection URL : jdbc:mysql://localhost:3306/sample Database Driver Class Name: com.mysql.cj.jdbc.Driver Database Driver Location(s) : /usr/share/java/mysql-connector-java-8.0.21.jar| |sqlCmd|SELECT REPLACE(REPLACE(REPLACE(REPLACE(column_type,'enum',''),')',''),'(',''),'\'','') enums FROM INFORMATION_SCHEMA.COLUMNS WHERE table_name='sample' AND column_name='fruit'|this query gets the enum values for the column "fruit"| finally here my Groovy Script: {code:java} import java.util.regex.Matcher import org.apache.nifi.controller.ControllerService import groovy.sql.Sql // Executescript attributes def serviceName = DBCPConnectionPoolName.value def sqlCmdString = sqlCmd.value // get controller service lookup from context def lookup = context.controllerServiceLookup // search for serviceName in controller services def dbcpServiceId = lookup.getControllerServiceIdentifiers(ControllerService).find { cs -> lookup.getControllerServiceName(cs) == serviceName } //Get the service from serviceid def service = lookup.getControllerService(dbcpServiceId)
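The heart of the workaround above is a plain string transformation: fetch the enum labels once, then rewrite the numeric index that CaptureChangeMySQL emitted into its label. A minimal, NiFi-independent sketch of that step (the `EnumCdcFix` class and `enumLabels` array are hypothetical stand-ins for the values fetched from INFORMATION_SCHEMA; the 1-based indexing matches the Groovy script above):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EnumCdcFix {
    // Replace the numeric enum index for the column with "id":2 by its label,
    // using the same lookbehind/lookahead regex as the Groovy workaround.
    static String mapEnum(String cdcJson, String[] enumLabels) {
        Pattern p = Pattern.compile("(?<=\"id\":2,\"value\":)(\\d+)(?=})");
        Matcher m = p.matcher(cdcJson);
        if (!m.find()) {
            return cdcJson; // no enum column matched; pass through unchanged
        }
        int index = Integer.parseInt(m.group(1)); // 1-based, as in the script above
        String label = enumLabels[index - 1];
        return m.replaceAll("\"" + Matcher.quoteReplacement(label) + "\"");
    }

    public static void main(String[] args) {
        String[] labels = {"apple", "pear", "orange"};
        String in = "{\"columns\":[{\"id\":1,\"value\":57},{\"id\":2,\"value\":2},{\"id\":3,\"value\":300}]}";
        System.out.println(mapEnum(in, labels));
        // prints {"columns":[{"id":1,"value":57},{"id":2,"value":"pear"},{"id":3,"value":300}]}
    }
}
```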
[jira] [Commented] (NIFI-7742) Controller Services searchable, but selecting from search results causes an error
[ https://issues.apache.org/jira/browse/NIFI-7742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195505#comment-17195505 ] ASF subversion and git services commented on NIFI-7742: --- Commit dc4daa29233a52e172b1e644ebda575b5f9e4679 in nifi's branch refs/heads/main from Shane Ardell [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=dc4daa2 ] NIFI-7742: remove defined and null check This closes #4524 Signed-off-by: Scott Aslan > Controller Services searchable, but selecting from search results causes an > error > - > > Key: NIFI-7742 > URL: https://issues.apache.org/jira/browse/NIFI-7742 > Project: Apache NiFi > Issue Type: Bug > Components: Core UI >Affects Versions: 1.12.0 >Reporter: Andrew M. Lim >Assignee: Shane Ardell >Priority: Major > Time Spent: 1h > Remaining Estimate: 0h > > Controller service search capability added in NIFI-5925. However, if you > select one in the search results, instead of navigating to the controller > service, an error dialog is displayed with the message: > Unable to find the specified component. > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (NIFI-7742) Controller Services searchable, but selecting from search results causes an error
[ https://issues.apache.org/jira/browse/NIFI-7742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195504#comment-17195504 ] ASF subversion and git services commented on NIFI-7742: --- Commit 80bc40a9af3994c35d74fdbc5cb6d3c030e81037 in nifi's branch refs/heads/main from Shane Ardell [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=80bc40a ] NIFI-7742: update case clause logic > Controller Services searchable, but selecting from search results causes an > error > - > > Key: NIFI-7742 > URL: https://issues.apache.org/jira/browse/NIFI-7742 > Project: Apache NiFi > Issue Type: Bug > Components: Core UI >Affects Versions: 1.12.0 >Reporter: Andrew M. Lim >Assignee: Shane Ardell >Priority: Major > Time Spent: 1h > Remaining Estimate: 0h > > Controller service search capability added in NIFI-5925. However, if you > select one in the search results, instead of navigating to the controller > service, an error dialog is displayed with the message: > Unable to find the specified component. > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (NIFI-7742) Controller Services searchable, but selecting from search results causes an error
[ https://issues.apache.org/jira/browse/NIFI-7742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195503#comment-17195503 ] ASF subversion and git services commented on NIFI-7742: --- Commit e3d551e87d1de4e350772b3b8ac9fcef80bcdfcd in nifi's branch refs/heads/main from Shane Ardell [ https://gitbox.apache.org/repos/asf?p=nifi.git;h=e3d551e ] NIFI-7742: add case for controller service selections > Controller Services searchable, but selecting from search results causes an > error > - > > Key: NIFI-7742 > URL: https://issues.apache.org/jira/browse/NIFI-7742 > Project: Apache NiFi > Issue Type: Bug > Components: Core UI >Affects Versions: 1.12.0 >Reporter: Andrew M. Lim >Assignee: Shane Ardell >Priority: Major > Time Spent: 1h > Remaining Estimate: 0h > > Controller service search capability added in NIFI-5925. However, if you > select one in the search results, instead of navigating to the controller > service, an error dialog is displayed with the message: > Unable to find the specified component. > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi] asfgit closed pull request #4524: NIFI-7742: Add case for controller service selections
asfgit closed pull request #4524: URL: https://github.com/apache/nifi/pull/4524
[jira] [Updated] (NIFI-7802) Unable to create a PropertiesFileLookupService controller service in version 1.12.0
[ https://issues.apache.org/jira/browse/NIFI-7802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Joe Witt updated NIFI-7802: --- Fix Version/s: 1.13.0 > Unable to create a PropertiesFileLookupService controller service in version > 1.12.0 > --- > > Key: NIFI-7802 > URL: https://issues.apache.org/jira/browse/NIFI-7802 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.12.0 > Environment: Windows 10, jdk1.8.0_251 >Reporter: David >Assignee: Bryan Bende >Priority: Blocker > Fix For: 1.13.0, 1.12.1 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > Steps to reproduce: > 1) Fresh nifi install > 2) Click on configuration icon on the root PG. > 3) Create a PropertiesFileLookupService controller service > 4) Configure it to point to a simple properties file. > 5) Enable the processor. > Startup fails with the following stack trace: > {{}} > {code:java} > 2020-09-11 11:11:41,383 INFO [Validate Components Thread-5] > o.apache.nifi.security.xml.XXEValidator Validating > c:\Users\drsnyder\temp\test.properties for XXE attack2020-09-11 11:11:41,420 > ERROR [Timer-Driven Process Thread-4] o.a.n.c.s.StandardControllerServiceNode > StandardControllerServiceNode[service=PropertiesFileLookupService[id=7db74eb3-0174-1000-f309-1635af53ee3a], > versionedComponentId=null, > processGroup=StandardProcessGroup[identifier=7db6d1a3-0174-1000-c5c2-3573fd835310,name=NiFi > Flow], active=true] Failed to invoke @OnEnabled method due to > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean: {} > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean > at java.lang.Class.forName0(Native Method) > at java.lang.Class.forName(Class.java:264) > at com.sun.proxy.$Proxy133.(Unknown Source) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) > at > 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) > at java.lang.reflect.Constructor.newInstance(Constructor.java:423) > at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:739) > at > org.apache.commons.configuration2.builder.fluent.Parameters.createParametersProxy(Parameters.java:306) > at > org.apache.commons.configuration2.builder.fluent.Parameters.fileBased(Parameters.java:185) > at > org.apache.nifi.lookup.configuration2.CommonsConfigurationLookupService.onEnabled(CommonsConfigurationLookupService.java:106) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:142) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:130) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:75) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:52) > at > org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:432) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) > at java.util.concurrent.FutureTask.run(FutureTask.java:266) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) > at > 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) > at java.lang.Thread.run(Thread.java:748) > Caused by: java.lang.ClassNotFoundException: > org.apache.commons.beanutils.DynaBean > at java.net.URLClassLoader.findClass(URLClassLoader.java:382) > at java.lang.ClassLoader.loadClass(ClassLoader.java:418) > at java.lang.ClassLoader.loadClass(ClassLoader.java:351) > ... 28 common frames omitted{code}
[jira] [Updated] (NIFI-7785) CaptureChangeMySQL processor captures enum values as "INDEX of those values" from Mysql DB"
[ https://issues.apache.org/jira/browse/NIFI-7785?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Roberto Garcia updated NIFI-7785: - Attachment: flow.xml.gz > CaptureChangeMySQL processor captures enum values as "INDEX of those values" > from Mysql DB" > --- > > Key: NIFI-7785 > URL: https://issues.apache.org/jira/browse/NIFI-7785 > Project: Apache NiFi > Issue Type: Bug > Components: Tools and Build >Affects Versions: 1.11.4 > Environment: Ubuntu EC2 instance with 8 GB ram >Reporter: zeyk >Priority: Major > Labels: features > Attachments: flow.xml.gz > > > CaptureChangeMySQL processor captures enum values as "INDEX of those values" > rather than the values specified. > for example: > A table has columns (id int, fruit enum ('apple','pears','orange'), price int) > On doing an insert: > insert into (1,'apple',45) > insert into (2,'pears',56) > I have used CaptureChangeMySql processor to capture the CDC changes, the > process does the capture but captures the enum column alone based on its > index like the sample below: > for 1st insert: > > { > "type":"insert", > "timestamp":1599004442000, > "binlog_filename":"mysql-bin-changelog.39", > "binlog_position":1537835, > "database":"sample", > "table_name":"sample", > "table_id":82, > "columns":[ > { > "id":1, > "name":"id", > "column_type":-5, > "value":139 > }, > { > "id":2, > "name":"fruit", > "column_type":12, > "value":0 > }, > { > "id":3, > "name":"price", > "column_type":12, > "value":45 > } > ] > } > > for 2nd insert: > > { > "type":"insert", > "timestamp":1599004442000, > "binlog_filename":"mysql-bin-changelog.39", > "binlog_position":1537835, > "database":"sample", > "table_name":"sample", > "table_id":82, > "columns":[ > { > "id":1, > "name":"id", > "column_type":-5, > "value":139 > }, > { > "id":2, > "name":"fruit", > "column_type":12, > "value":1 > }, > { > "id":3, > "name":"price", > "column_type":12, > "value":56 > } > ] > } > > > So the above has 0 and 1 in place of apple and pears 
respectively. > > Could one of you help me with this, if there are folks who have faced a similar kind of issue? > >
[GitHub] [nifi] pvillard31 commented on a change in pull request #2997: NIFI-5583: Add cdc processor for MySQL referring to GTID.
pvillard31 commented on a change in pull request #2997: URL: https://github.com/apache/nifi/pull/2997#discussion_r487972597 ## File path: nifi-nar-bundles/nifi-cdc/nifi-cdc-mysql-bundle/nifi-cdc-mysql-processors/src/main/java/org/apache/nifi/cdc/mysql/processors/CaptureChangeMySQL.java ## @@ -681,6 +731,9 @@ protected void connect(List hosts, String username, String pa binlogClient.setBinlogPosition(currentBinlogPosition); } +binlogClient.setGtidSet(currentGtidSet); +binlogClient.setGtidSetFallbackToPurged(true); Review comment: Is this what we want here? https://github.com/shyiko/mysql-binlog-connector-java/blob/master/src/main/java/com/github/shyiko/mysql/binlog/BinaryLogClient.java#L331
[GitHub] [nifi] ravitejatvs commented on pull request #2231: NIFI-4521 MS SQL CDC Processor
ravitejatvs commented on pull request #2231: URL: https://github.com/apache/nifi/pull/2231#issuecomment-692076532 > Looks like there are multiple issues. One was introduced by me quite a while ago, with how we split/trim table names. More recently one was introduced by NIFI-7369 which added some special decimal parsing, but which does not work for NULL decimal values in some cases... Still troubleshooting. Awesome. Please do let me know in case you need some help with the testing.
[jira] [Updated] (NIFI-7802) Unable to create a PropertiesFileLookupService controller service in version 1.12.0
[ https://issues.apache.org/jira/browse/NIFI-7802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bryan Bende updated NIFI-7802: -- Resolution: Fixed Status: Resolved (was: Patch Available) > Unable to create a PropertiesFileLookupService controller service in version > 1.12.0 > --- > > Key: NIFI-7802 > URL: https://issues.apache.org/jira/browse/NIFI-7802 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework >Affects Versions: 1.12.0 > Environment: Windows 10, jdk1.8.0_251 >Reporter: David >Assignee: Bryan Bende >Priority: Blocker > Fix For: 1.12.1 > > Time Spent: 1h 40m > Remaining Estimate: 0h > > Steps to reproduce: > 1) Fresh nifi install > 2) Click on configuration icon on the root PG. > 3) Create a PropertiesFileLookupService controller service > 4) Configure it to point to a simple properties file. > 5) Enable the processor. > Startup fails with the following stack trace: > {{}} > {code:java} > 2020-09-11 11:11:41,383 INFO [Validate Components Thread-5] > o.apache.nifi.security.xml.XXEValidator Validating > c:\Users\drsnyder\temp\test.properties for XXE attack2020-09-11 11:11:41,420 > ERROR [Timer-Driven Process Thread-4] o.a.n.c.s.StandardControllerServiceNode > StandardControllerServiceNode[service=PropertiesFileLookupService[id=7db74eb3-0174-1000-f309-1635af53ee3a], > versionedComponentId=null, > processGroup=StandardProcessGroup[identifier=7db6d1a3-0174-1000-c5c2-3573fd835310,name=NiFi > Flow], active=true] Failed to invoke @OnEnabled method due to > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean: {} > java.lang.NoClassDefFoundError: org/apache/commons/beanutils/DynaBean > at java.lang.Class.forName0(Native Method) > at java.lang.Class.forName(Class.java:264) > at com.sun.proxy.$Proxy133.(Unknown Source) > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) > at > sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) > at > 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) > at java.lang.reflect.Constructor.newInstance(Constructor.java:423) > at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:739) > at > org.apache.commons.configuration2.builder.fluent.Parameters.createParametersProxy(Parameters.java:306) > at > org.apache.commons.configuration2.builder.fluent.Parameters.fileBased(Parameters.java:185) > at > org.apache.nifi.lookup.configuration2.CommonsConfigurationLookupService.onEnabled(CommonsConfigurationLookupService.java:106) > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > at > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) > at > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) > at java.lang.reflect.Method.invoke(Method.java:498) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:142) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:130) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:75) > at > org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:52) > at > org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:432) > at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110) > at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) > at java.util.concurrent.FutureTask.run(FutureTask.java:266) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) > at > java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) > at > java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) > at > 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) > at java.lang.Thread.run(Thread.java:748) > Caused by: java.lang.ClassNotFoundException: > org.apache.commons.beanutils.DynaBean > at java.net.URLClassLoader.findClass(URLClassLoader.java:382) > at java.lang.ClassLoader.loadClass(ClassLoader.java:418) > at java.lang.ClassLoader.loadClass(ClassLoader.java:351) > ... 28 common frames omitted{code}
[GitHub] [nifi] YolandaMDavis commented on pull request #4522: NIFI-7796: Add Prometheus counters for total bytes sent/received
YolandaMDavis commented on pull request #4522:
URL: https://github.com/apache/nifi/pull/4522#issuecomment-692072985

@mattyb149 thanks, will review shortly.

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[jira] [Resolved] (NIFIREG-313) Add OpenId Connect support for authenticating users
[ https://issues.apache.org/jira/browse/NIFIREG-313?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bryan Bende resolved NIFIREG-313. - Fix Version/s: 0.8.0 Resolution: Fixed > Add OpenId Connect support for authenticating users > --- > > Key: NIFIREG-313 > URL: https://issues.apache.org/jira/browse/NIFIREG-313 > Project: NiFi Registry > Issue Type: New Feature >Reporter: Pierre Villard >Assignee: Nathan Gough >Priority: Major > Fix For: 0.8.0 > > Time Spent: 3h 10m > Remaining Estimate: 0h > > Add support for authenticating users with the OIDC (OpenID Connect) specification.
[GitHub] [nifi-registry] bbende merged pull request #296: NIFIREG-313 - Add OpenId Connect support for authenticating users
bbende merged pull request #296: URL: https://github.com/apache/nifi-registry/pull/296
[GitHub] [nifi-registry] bbende commented on pull request #296: NIFIREG-313 - Add OpenId Connect support for authenticating users
bbende commented on pull request #296: URL: https://github.com/apache/nifi-registry/pull/296#issuecomment-692066511 +1 latest updates look good, going to merge
[jira] [Resolved] (NIFIREG-346) Update JGit to use Apache MINA sshd
[ https://issues.apache.org/jira/browse/NIFIREG-346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bryan Bende resolved NIFIREG-346. - Resolution: Fixed > Update JGit to use Apache MINA sshd > --- > > Key: NIFIREG-346 > URL: https://issues.apache.org/jira/browse/NIFIREG-346 > Project: NiFi Registry > Issue Type: Wish >Reporter: Ken Swanson >Assignee: Pierre Villard >Priority: Minor > Fix For: 0.8.0 > > Time Spent: 2h > Remaining Estimate: 0h > > I mentioned this earlier on Slack, but I didn't get around to filing this issue until now. > The NiFi Registry currently uses JGit for its Git functionality, specifically (for my concern) to implement the `GitFlowPersistenceProvider`. In its default setup, JGit uses the JSch library for its SSH session factory. > JSch is an older library and consequently does not interoperate with some newer SSH protocols. In particular, newer SSH keys (like ed25519) do not appear to work with JSch. > There is an alternative, newer SSH library available to JGit: the Apache MINA sshd library. Using it only requires creating a new SshdSessionFactory. Details can be found here: > [https://wiki.eclipse.org/JGit/New_and_Noteworthy/5.2] > I'd like to suggest moving to Apache MINA sshd in the NiFi Registry. I recently had a problem where I wanted to use SSH keys to commit back to a repo, and the registry could not access the repo because the underlying JSch library could not use the ed25519 keys. I worked around this by setting the GIT_SSH environment variable, but I think it would be a good idea to use the newer library.
[GitHub] [nifi-registry] bbende merged pull request #303: NIFIREG-346 - fix build issue after bouncycastle upgrade
bbende merged pull request #303: URL: https://github.com/apache/nifi-registry/pull/303
[jira] [Comment Edited] (NIFI-7785) CaptureChangeMySQL processor captures enum values as "INDEX of those values" from Mysql DB"
[ https://issues.apache.org/jira/browse/NIFI-7785?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195173#comment-17195173 ] Roberto Garcia edited comment on NIFI-7785 at 9/14/20, 1:20 PM:

As a workaround, I use an "ExecuteGroovyScript" processor. I added two properties:

||Property||Value||Comment||
|DBCPConnectionPoolName|SampleMySQLPool|this is your DBCPConnectionPool 1.12.0. Database Connection URL: jdbc:mysql://localhost:3306/sample. Database Driver Class Name: com.mysql.cj.jdbc.Driver. Database Driver Location(s): /usr/share/java/mysql-connector-java-8.0.21.jar|
|sqlCmd|SELECT REPLACE(REPLACE(REPLACE(REPLACE(column_type,'enum',''),')',''),'(',''),'\'','') enums FROM INFORMATION_SCHEMA.COLUMNS WHERE table_name='sample' AND column_name='fruit'|this query gets the enum values of your column "fruit"|

Finally, here is my Groovy script:

{code:java}
import java.util.regex.Matcher
import org.apache.nifi.controller.ControllerService
import groovy.sql.Sql

// ExecuteGroovyScript properties
def serviceName = DBCPConnectionPoolName.value
def sqlCmdString = sqlCmd.value

// Get the controller service lookup from the context
def lookup = context.controllerServiceLookup

// Search for serviceName in the controller services
def dbcpServiceId = lookup.getControllerServiceIdentifiers(ControllerService).find { cs ->
    lookup.getControllerServiceName(cs) == serviceName
}

// Get the service from the service id
def service = lookup.getControllerService(dbcpServiceId)

// Connect to the service
def conn = service.getConnection()
if (!conn) {
    log.error("Failed to connect to " + serviceName)
    return
}

try {
    flowFile = session.get()
    flowFile = session.write(flowFile, { inputStream, outputStream ->
        def content = org.apache.commons.io.IOUtils.toString(inputStream, java.nio.charset.StandardCharsets.UTF_8).trim()
        def sql = new Sql(conn)
        if (!sql) {
            log.error("Failed to get SQL connection")
            return
        }
        def row = sql.firstRow(sqlCmdString)
        def enumArr = "${row.enums}".trim().split(',')
        String index = ""
        Matcher regexMatcher = content =~ /(?s)(?<="id":2,"value":)(\d+?)(?=})/
        if (regexMatcher.find()) {
            index = regexMatcher.group(1)
        }
        def enumText = enumArr[index.toInteger() - 1]
        String cdcConverted = content.replaceAll(/(?s)(?<="id":2,"value":)(\d+?)(?=})/, "\"" + enumText + "\"")
        outputStream.write(cdcConverted.getBytes("UTF-8"))
    } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (e) {
    log.error('Scripting error ' + sqlCmdString, e)
    session.transfer(flowFile, REL_FAILURE)
}

// Release the connection; this is important as it will otherwise block new executions
conn?.close()
{code}

*Input:*
{code:java}
{"type":"insert","timestamp":1600053851000,"binlog_filename":"binlog.19","binlog_position":3402,"database":null,"table_name":null,"table_id":null,"columns":[{"id":1,"value":57},{"id":2,"value":2},{"id":3,"value":300}]}
{code}

*Output:*
{code:java}
{"type":"insert","timestamp":1600053851000,"binlog_filename":"binlog.19","binlog_position":3402,"database":null,"table_name":null,"table_id":null,"columns":[{"id":1,"value":57},{"id":2,"value":"pear"},{"id":3,"value":300}]}
{code}
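The logic of the Groovy workaround above can be sketched as a small self-contained Java class (the class and method names are illustrative, not NiFi APIs): parse the labels out of the MySQL column_type string, then substitute the reported index for column id 2. It assumes the 1-based index convention used in the sample above; note that the JSON in the original bug report shows 0-based values, so verify which convention your binlog client emits before subtracting.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EnumCdcFix {

    // Extract the labels from a MySQL column_type string such as
    // "enum('apple','pears','orange')" (same idea as the nested REPLACE query above):
    // strip the "enum" keyword, parentheses, and quotes, then split on commas.
    static String[] labels(String columnType) {
        return columnType.replaceAll("enum|[()']", "").split(",");
    }

    // Replace the numeric enum value of the column with id 2 by its label,
    // using the 1-based convention from the sample above (value 2 -> second label).
    static String rewrite(String cdcJson, String[] labels) {
        Pattern p = Pattern.compile("(?<=\"id\":2,\"value\":)(\\d+)");
        Matcher m = p.matcher(cdcJson);
        if (!m.find()) {
            return cdcJson; // no numeric value for column 2, pass through unchanged
        }
        int index = Integer.parseInt(m.group(1));
        return m.replaceAll("\"" + labels[index - 1] + "\"");
    }
}
```

A production version would use a real JSON parser instead of regex lookbehinds, but the regex mirrors the Groovy script so the two stay comparable.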
[jira] [Commented] (NIFIREG-147) Add Keycloak authentication method
[ https://issues.apache.org/jira/browse/NIFIREG-147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195449#comment-17195449 ] Christian Englert commented on NIFIREG-147: --- Since the classloader issue is resolved: [https://github.com/ChrisEnglert/nifi-addons/tree/master/nifi-registry-keycloak] If desired, I can contribute this as an extension to the NiFi Registry repo. > Add Keycloak authentication method > -- > > Key: NIFIREG-147 > URL: https://issues.apache.org/jira/browse/NIFIREG-147 > Project: NiFi Registry > Issue Type: Improvement >Reporter: Gregory Reshetniak >Priority: Major > > Keycloak implements a lot of related functionality, including groups, users and such. It would be great to have first-class integration available.
[jira] [Created] (MINIFICPP-1369) Implement ConsumeKafka processor
Arpad Boda created MINIFICPP-1369: - Summary: Implement ConsumeKafka processor Key: MINIFICPP-1369 URL: https://issues.apache.org/jira/browse/MINIFICPP-1369 Project: Apache NiFi MiNiFi C++ Issue Type: New Feature Affects Versions: 0.8.0 Reporter: Arpad Boda Implement a ConsumeKafka processor to support consuming Kafka topic(s). Hint: https://nifi.apache.org/docs/nifi-docs/components/nifi-docs/components/org.apache.nifi/nifi-kafka-2-0-nar/1.9.0/org.apache.nifi.processors.kafka.pubsub.ConsumeKafka_2_0/index.html
[jira] [Commented] (NIFI-7807) PutKudu documentation clarity & examples
[ https://issues.apache.org/jira/browse/NIFI-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17195445#comment-17195445 ] Alasdair Brown commented on NIFI-7807: -- PR Submitted [https://github.com/apache/nifi/pull/4526] > PutKudu documentation clarity & examples > > > Key: NIFI-7807 > URL: https://issues.apache.org/jira/browse/NIFI-7807 > Project: Apache NiFi > Issue Type: Improvement > Components: Documentation Website >Affects Versions: 1.12.1 >Reporter: Alasdair Brown >Assignee: Alasdair Brown >Priority: Trivial > Time Spent: 10m > Remaining Estimate: 0h > > The PutKudu processor could use some changes to documentation. > > Issue created to track some changes, namely: > * Updating the processor description for more clarity about how the Record > -> Column schema works (i.e. inferred from the record reader) > * Adding an additional details page for more clarity on the above and some > examples to demonstrate
[GitHub] [nifi] sdairs opened a new pull request #4526: NIFI-7807 PutKudu documentation clarity & examples
sdairs opened a new pull request #4526: URL: https://github.com/apache/nifi/pull/4526 Updating documentation for the PutKudu processor. In short: - Changes the description in the class to more closely reflect how the processor works with schemas - Adds an additional details page with more details & examples https://issues.apache.org/jira/browse/NIFI-7807
[jira] [Assigned] (NIFI-7807) PutKudu documentation clarity & examples
[ https://issues.apache.org/jira/browse/NIFI-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard reassigned NIFI-7807: Assignee: Alasdair Brown > PutKudu documentation clarity & examples > > > Key: NIFI-7807 > URL: https://issues.apache.org/jira/browse/NIFI-7807 > Project: Apache NiFi > Issue Type: Improvement > Components: Documentation Website >Affects Versions: 1.12.1 >Reporter: Alasdair Brown >Assignee: Alasdair Brown >Priority: Trivial > > The PutKudu processor could use some changes to documentation. > > Issue created to track some changes, namely: > * Updating the processor description for more clarity about how the Record > -> Column schema works (i.e. inferred from the record reader) > * Adding an additional details page for more clarity on the above and some > examples to demonstrate
[jira] [Assigned] (MINIFICPP-1367) Add cmake flag for disabling building nanofi
[ https://issues.apache.org/jira/browse/MINIFICPP-1367?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Adam Hunyadi reassigned MINIFICPP-1367: --- Assignee: Adam Hunyadi > Add cmake flag for disabling building nanofi > > > Key: MINIFICPP-1367 > URL: https://issues.apache.org/jira/browse/MINIFICPP-1367 > Project: Apache NiFi MiNiFi C++ > Issue Type: Task >Affects Versions: 0.7.0 >Reporter: Adam Hunyadi >Assignee: Adam Hunyadi >Priority: Minor > Labels: MiNiFi-CPP-Hygiene > Fix For: 1.0.0 > > > *Background:* > Currently, nanofi automatically builds alongside minifi, contributing to its build time. > *Proposal:* > We should define a cmake flag (and add an option to toggle it to bootstrap.sh) so that we can skip building it when we do not need it. We can have the option default to building nanofi or disable it, but make sure that at least one CI job performs a nanofi build.
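The proposal above maps onto a standard CMake pattern. A minimal sketch, assuming a flag named ENABLE_NANOFI and a top-level nanofi subdirectory (both names are illustrative, not the ticket's final spelling):

```cmake
# Hypothetical flag name; defaulting to ON preserves the current behavior
# of always building nanofi alongside minifi.
option(ENABLE_NANOFI "Build the nanofi library alongside minifi" ON)

if(ENABLE_NANOFI)
    # Only descend into the nanofi sources when the flag is enabled.
    add_subdirectory(nanofi)
endif()
```

bootstrap.sh could then toggle the flag by passing `-DENABLE_NANOFI=OFF` to cmake, while a CI job that keeps the default ON continues to exercise the nanofi build.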
[jira] [Created] (NIFI-7807) PutKudu documentation clarity & examples
Alasdair Brown created NIFI-7807: Summary: PutKudu documentation clarity & examples Key: NIFI-7807 URL: https://issues.apache.org/jira/browse/NIFI-7807 Project: Apache NiFi Issue Type: Improvement Components: Documentation Website Affects Versions: 1.12.1 Reporter: Alasdair Brown The PutKudu processor could use some changes to documentation. Issue created to track some changes, namely: * Updating the processor description for more clarity about how the Record -> Column schema works (i.e. inferred from the record reader) * Adding an additional details page for more clarity on the above and some examples to demonstrate -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (NIFI-7806) ValidateRecord does not handle XML with nested array of records correctly
Vesa Sokka created NIFI-7806: Summary: ValidateRecord does not handle XML with nested array of records correctly Key: NIFI-7806 URL: https://issues.apache.org/jira/browse/NIFI-7806 Project: Apache NiFi Issue Type: Bug Reporter: Vesa Sokka Attachments: nested_xml_failing_validation.xml When attempting to validate XML with a nested array of records, we get the following error: "Record #1 is invalid due to: MapRecord[{ids=MapRecord[{TestRecordID=2103686}]}] is not a valid value for /TestRecord: Value is of type org.apache.nifi.serialization.record.MapRecord but was expected to be of type ARRAY[RECORD]" A template that reproduces the error is attached.
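For context, the failing input can be reconstructed from the error message as something like the following (element names are inferred from the MapRecord dump, so treat them as assumptions):

```xml
<!-- Hypothetical sample, inferred from the error message above -->
<TestRecords>
  <TestRecord>
    <ids>
      <TestRecordID>2103686</TestRecordID>
    </ids>
  </TestRecord>
</TestRecords>
```

The schema apparently expects /TestRecord to be ARRAY[RECORD], but when the element appears only once the value handed to ValidateRecord is a single MapRecord rather than an array wrapping it.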
[jira] [Assigned] (MINIFICPP-1368) Increment version number on main
[ https://issues.apache.org/jira/browse/MINIFICPP-1368?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Arpad Boda reassigned MINIFICPP-1368: - Assignee: Arpad Boda > Increment version number on main > > > Key: MINIFICPP-1368 > URL: https://issues.apache.org/jira/browse/MINIFICPP-1368 > Project: Apache NiFi MiNiFi C++ > Issue Type: Task >Reporter: Arpad Boda >Assignee: Arpad Boda >Priority: Major > Time Spent: 10m > Remaining Estimate: 0h > > Version number still indicates 0.7.0, although main should have 0.9.0 as the > RC branch of 0.8.0 is already created. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [nifi-minifi-cpp] arpadboda opened a new pull request #905: MINIFICPP-1368 - Increment version number on main
arpadboda opened a new pull request #905: URL: https://github.com/apache/nifi-minifi-cpp/pull/905

Thank you for submitting a contribution to Apache NiFi - MiNiFi C++. In order to streamline the review of the contribution we ask you to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message?
- [ ] Does your PR title start with MINIFICPP-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.
- [ ] Has your PR been rebased against the latest commit within the target branch (typically main)?
- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the LICENSE file?
- [ ] If applicable, have you updated the NOTICE file?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check GitHub Actions CI results for build issues and submit an update to your PR as soon as possible.
[jira] [Created] (MINIFICPP-1368) Increment version number on main
Arpad Boda created MINIFICPP-1368: - Summary: Increment version number on main Key: MINIFICPP-1368 URL: https://issues.apache.org/jira/browse/MINIFICPP-1368 Project: Apache NiFi MiNiFi C++ Issue Type: Task Reporter: Arpad Boda The version number still indicates 0.7.0, although main should be at 0.9.0, as the RC branch of 0.8.0 has already been created.
[jira] [Resolved] (MINIFICPP-1353) Fix heap-use-after-free errors
[ https://issues.apache.org/jira/browse/MINIFICPP-1353?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ferenc Gerlits resolved MINIFICPP-1353. --- Resolution: Fixed > Fix heap-use-after-free errors > -- > > Key: MINIFICPP-1353 > URL: https://issues.apache.org/jira/browse/MINIFICPP-1353 > Project: Apache NiFi MiNiFi C++ > Issue Type: Bug >Reporter: Ferenc Gerlits >Assignee: Ferenc Gerlits >Priority: Minor > Labels: MiNiFi-CPP-Hygiene > Fix For: 0.9.0 > > Time Spent: 40m > Remaining Estimate: 0h > > Address sanitizer finds one heap-use-after-free error when run on the unit tests:
> {noformat}
> ==26761==ERROR: AddressSanitizer: heap-use-after-free on address 0x6062c4a8 at pc 0x55d957b02e44 bp 0x7f6e736875d0 sp 0x7f6e736875c0
> WRITE of size 1 at 0x6062c4a8 thread T56
> #0 0x55d957b02e43 in std::__atomic_base::store(bool, std::memory_order) /usr/include/c++/8/bits/atomic_base.h:374
> #1 0x55d957b02e43 in std::__atomic_base::operator=(bool) /usr/include/c++/8/bits/atomic_base.h:267
> #2 0x55d957acb3c8 in std::atomic::operator=(bool) /usr/include/c++/8/atomic:79
> #3 0x55d9581a02b9 in org::apache::nifi::minifi::utils::HTTPClient::forceClose() /home/fgerlits/src/minifi2/extensions/http-curl/client/HTTPClient.cpp:75
> #4 0x55d9581a00f1 in org::apache::nifi::minifi::utils::HTTPClient::~HTTPClient() /home/fgerlits/src/minifi2/extensions/http-curl/client/HTTPClient.cpp:64
> #5 0x55d9581c9f00 in org::apache::nifi::minifi::processors::InvokeHTTP::onTrigger(std::shared_ptr const&, std::shared_ptr const&) /home/fgerlits/src/minifi2/extensions/http-curl/processors/InvokeHTTP.cpp:286
> [...]
> 0x6062c4a8 is located 40 bytes inside of 64-byte region [0x6062c480,0x6062c4c0)
> freed by thread T56 here:
> #0 0x7f6e795c8a50 in operator delete(void*) (/usr/lib/x86_64-linux-gnu/libasan.so.5+0xf0a50)
> #1 0x55d9581970e3 in std::default_delete::operator()(org::apache::nifi::minifi::utils::HTTPUploadCallback*) const /usr/include/c++/8/bits/unique_ptr.h:81
> #2 0x55d958195e2a in std::unique_ptr std::default_delete >::~unique_ptr() /usr/include/c++/8/bits/unique_ptr.h:277
> #3 0x55d9581c9ee2 in org::apache::nifi::minifi::processors::InvokeHTTP::onTrigger(std::shared_ptr const&, std::shared_ptr const&) /home/fgerlits/src/minifi2/extensions/http-curl/processors/InvokeHTTP.cpp:306
> [...]
> previously allocated by thread T56 here:
> #0 0x7f6e795c7ba0 in operator new(unsigned long) (/usr/lib/x86_64-linux-gnu/libasan.so.5+0xefba0)
> #1 0x55d9581c86f7 in org::apache::nifi::minifi::processors::InvokeHTTP::onTrigger(std::shared_ptr const&, std::shared_ptr const&) /home/fgerlits/src/minifi2/extensions/http-curl/processors/InvokeHTTP.cpp:313
> [...]
> {noformat}
> Fix this bug.
[GitHub] [nifi-minifi-cpp] hunyadi-dev commented on a change in pull request #837: MINIFICPP-1121 - Upgrade spdlog to version 1.8.0
hunyadi-dev commented on a change in pull request #837: URL: https://github.com/apache/nifi-minifi-cpp/pull/837#discussion_r487698416

## File path: cmake/BundledSpdlog.cmake

@@ -0,0 +1,70 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+function(use_bundled_spdlog SOURCE_DIR BINARY_DIR)
+    # Define byproducts
+    if (WIN32)
+        set(BYPRODUCT "lib/spdlog.lib")
+    else()
+        include(GNUInstallDirs)
+        if ("${CMAKE_BUILD_TYPE}" STREQUAL "Debug")
+            set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlogd.a")
+        else()
+            set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlog.a")
+        endif()
+    endif()
+
+    # Set build options
+    set(SPDLOG_SOURCE_DIR "${BINARY_DIR}/thirdparty/spdlog-src")
+    set(SPDLOG_INSTALL_DIR "${BINARY_DIR}/thirdparty/spdlog-install")
+    set(SPDLOG_LIBRARY "${SPDLOG_INSTALL_DIR}/${BYPRODUCT}")
+    set(SPDLOG_CMAKE_ARGS ${PASSTHROUGH_CMAKE_ARGS}
+        "-DCMAKE_INSTALL_PREFIX=${SPDLOG_INSTALL_DIR}"
+        "-DSPDLOG_BUILD_EXAMPLE=OFF"
+        "-DSPDLOG_BUILD_TESTS=OFF"
+        "-DSPDLOG_BUILD_TESTING=OFF"
+        "-DSPDLOG_BUILD_BENCH=OFF"
+        "-DSPDLOG_BUILD_SHARED=OFF")
+
+    # Build project
+    ExternalProject_Add(
+        spdlog-external
+        URL "https://github.com/gabime/spdlog/archive/v1.7.0.zip"

Review comment: I don't see anything that should be broken by the version update, done.
[GitHub] [nifi-minifi-cpp] hunyadi-dev commented on a change in pull request #837: MINIFICPP-1121 - Upgrade spdlog to version 1.8.0
hunyadi-dev commented on a change in pull request #837: URL: https://github.com/apache/nifi-minifi-cpp/pull/837#discussion_r487698416

## File path: cmake/BundledSpdlog.cmake

@@ -0,0 +1,70 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+function(use_bundled_spdlog SOURCE_DIR BINARY_DIR)
+    # Define byproducts
+    if (WIN32)
+        set(BYPRODUCT "lib/spdlog.lib")
+    else()
+        include(GNUInstallDirs)
+        if ("${CMAKE_BUILD_TYPE}" STREQUAL "Debug")
+            set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlogd.a")
+        else()
+            set(BYPRODUCT "${CMAKE_INSTALL_LIBDIR}/libspdlog.a")
+        endif()
+    endif()
+
+    # Set build options
+    set(SPDLOG_SOURCE_DIR "${BINARY_DIR}/thirdparty/spdlog-src")
+    set(SPDLOG_INSTALL_DIR "${BINARY_DIR}/thirdparty/spdlog-install")
+    set(SPDLOG_LIBRARY "${SPDLOG_INSTALL_DIR}/${BYPRODUCT}")
+    set(SPDLOG_CMAKE_ARGS ${PASSTHROUGH_CMAKE_ARGS}
+        "-DCMAKE_INSTALL_PREFIX=${SPDLOG_INSTALL_DIR}"
+        "-DSPDLOG_BUILD_EXAMPLE=OFF"
+        "-DSPDLOG_BUILD_TESTS=OFF"
+        "-DSPDLOG_BUILD_TESTING=OFF"
+        "-DSPDLOG_BUILD_BENCH=OFF"
+        "-DSPDLOG_BUILD_SHARED=OFF")
+
+    # Build project
+    ExternalProject_Add(
+        spdlog-external
+        URL "https://github.com/gabime/spdlog/archive/v1.7.0.zip"

Review comment: Updated.