[PR] Bump ip from 1.1.8 to 1.1.9 in /nifi-registry/nifi-registry-core/nifi-registry-web-ui/src/main [nifi]

2024-02-21 Thread via GitHub


dependabot[bot] opened a new pull request, #8444:
URL: https://github.com/apache/nifi/pull/8444

   Bumps [ip](https://github.com/indutny/node-ip) from 1.1.8 to 1.1.9.
   
   Commits
   
   - 1ecbf2f (https://github.com/indutny/node-ip/commit/1ecbf2fd8c0cc85e44c3b587d2de641f50dc0217): 1.1.9
   - 6a3ada9 (https://github.com/indutny/node-ip/commit/6a3ada9b471b09d5f0f5be264911ab564bf67894): lib: fixed CVE-2023-42282 and added unit test
   - See full diff in compare view: https://github.com/indutny/node-ip/compare/v1.1.8...v1.1.9
   
   
   
   
   
   [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ip&package-manager=npm_and_yarn&previous-version=1.1.8&new-version=1.1.9)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't 
alter it yourself. You can also trigger a rebase manually by commenting 
`@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   
   Dependabot commands and options
   
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that 
have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI 
passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and 
block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually
   - `@dependabot show <dependency name> ignore conditions` will show all of 
the ignore conditions of the specified dependency
   - `@dependabot ignore this major version` will close this PR and stop 
Dependabot creating any more for this major version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop 
Dependabot creating any more for this minor version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop 
Dependabot creating any more for this dependency (unless you reopen the PR or 
upgrade to it yourself)
   You can disable automated security fix PRs for this repo from the [Security 
Alerts page](https://github.com/apache/nifi/network/alerts).
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Commented] (NIFI-4385) Adjust the QueryDatabaseTable processor for handling big tables.

2024-02-21 Thread Matt Burgess (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-4385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17819430#comment-17819430
 ] 

Matt Burgess commented on NIFI-4385:


I'm OK with changing the Fetch Size defaults, but 0 leaves it up to the driver, 
and driver defaults are often WAY too low. It would be nice to add more 
documentation around the choice of default as well as what it means to the 
user, as I've sometimes seen the assumption that Fetch Size determines how many 
rows will end up in a FlowFile, when there's a separate property for that.

Another thing I'm looking into is whether I can speed everything up by doing 
the fetch in parallel (on multiple cores): fetching the specified number of 
rows on one thread while another thread writes them to the FlowFile. It always 
depends on the use case, but sometimes the Avro conversion of the ResultSet is 
"the long pole in the tent", so rather than that logic waiting on a fetch, it 
can work continuously while at least one separate thread ensures there's always 
data ready to be converted and written out. I'll write up a separate Jira for 
that once I fully characterize the issue and proposed solution. In the meantime 
I welcome all thoughts, comments, questions, and concerns right here :)
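
For illustration only, here is a minimal, runnable sketch of that producer/consumer hand-off using a bounded BlockingQueue. It substitutes a simulated row source for a real JDBC ResultSet, and none of the class or variable names come from NiFi code; it only shows the threading pattern described above.

{code:java}
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class ParallelFetchSketch {
    // Sentinel signalling that the fetching thread is done.
    private static final List<Object> END = List.of();

    public static void main(String[] args) throws InterruptedException {
        // Bounded queue: the fetcher blocks if the converter falls behind, keeping memory bounded.
        final BlockingQueue<List<Object>> queue = new ArrayBlockingQueue<>(1_000);

        // Producer: stands in for the thread fetching rows from the ResultSet.
        final Thread fetcher = new Thread(() -> {
            try {
                for (int i = 0; i < 10_000; i++) {
                    queue.put(List.of(i, "row-" + i));
                }
                queue.put(END);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Consumer: stands in for the Avro conversion / FlowFile writing logic.
        final Thread converter = new Thread(() -> {
            try {
                long written = 0;
                while (true) {
                    final List<Object> row = queue.poll(10, TimeUnit.SECONDS);
                    if (row == null || row == END) {
                        break;
                    }
                    written++; // convert the row to Avro and write it out here
                }
                System.out.println("Rows written: " + written);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        fetcher.start();
        converter.start();
        fetcher.join();
        converter.join();
    }
}
{code}

The bounded queue is the important design choice here: it provides back-pressure, so the fetching thread can never run arbitrarily far ahead of the conversion thread.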

> Adjust the QueryDatabaseTable processor for handling big tables.
> 
>
> Key: NIFI-4385
> URL: https://issues.apache.org/jira/browse/NIFI-4385
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.3.0
>Reporter: Tim Späth
>Priority: Major
>
> When querying large database tables, the *QueryDatabaseTable* processor does 
> not perform very well.
> The processor always performs the full query and then transfers all 
> flowfiles as a single list, instead of 
> transferring them incrementally as the *ResultSet* fetches the next 
> rows (if a fetch size is given). 
> If you want to query a billion rows from a table, 
> the processor will add all flowfiles to an ArrayList in memory 
> before transferring the whole list after the last row is fetched by the 
> ResultSet. 
> I've checked the code in 
> *org.apache.nifi.processors.standard.QueryDatabaseTable.java* 
> and in my opinion it would be no big deal to move the session.transfer to a 
> proper position in the code (into the while loop where the flowfile is added 
> to the list) to 
> achieve real _stream support_. There was also a bug report for this problem 
> which resulted in adding the new property *Maximum Number of Fragments*, 
> but this property only limits the results. 
> Now you have to multiply *Max Rows Per Flow File* by *Maximum Number of 
> Fragments* to get your limit, 
> which is not really a solution to the original problem, IMHO. 
> Also, the workaround with GenerateTableFetch and/or ExecuteSQL processors is 
> much slower than using a database cursor or a ResultSet
> and streaming the rows into flowfiles directly in the queue.
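
For illustration, a minimal sketch of the streaming approach described in the issue, written against the NiFi ProcessSession API; the class, method names, and row-to-text conversion are simplified stand-ins, not the actual QueryDatabaseTable code.

{code:java}
import java.nio.charset.StandardCharsets;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;

class StreamingTransferSketch {
    static final Relationship REL_SUCCESS = new Relationship.Builder().name("success").build();

    // Transfers each batch of rows as soon as it is written, instead of collecting
    // every FlowFile in a list and transferring the whole list after the last row.
    void streamRows(final ProcessSession session, final ResultSet rs, final int rowsPerFlowFile) throws SQLException {
        final StringBuilder batch = new StringBuilder();
        int rowsInCurrentFile = 0;

        while (rs.next()) {
            batch.append(rs.getString(1)).append('\n'); // real code would convert the row to Avro
            if (++rowsInCurrentFile >= rowsPerFlowFile) {
                transferBatch(session, batch.toString());
                batch.setLength(0);
                rowsInCurrentFile = 0;
            }
        }
        if (rowsInCurrentFile > 0) {
            transferBatch(session, batch.toString());
        }
    }

    private void transferBatch(final ProcessSession session, final String content) {
        FlowFile flowFile = session.create();
        flowFile = session.write(flowFile, out -> out.write(content.getBytes(StandardCharsets.UTF_8)));
        session.transfer(flowFile, REL_SUCCESS); // transferring inside the loop keeps memory usage bounded
    }
}
{code}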



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12832: Eliminated unnecessary dependencies from nifi-mock; moved… [nifi]

2024-02-21 Thread via GitHub


markap14 commented on code in PR #8442:
URL: https://github.com/apache/nifi/pull/8442#discussion_r1498393202


##
nifi-api/src/main/java/org/apache/nifi/processor/util/StandardValidators.java:
##
@@ -540,9 +540,9 @@ public ValidationResult validate(final String subject, 
final String input, final
 }
 
 try {
+// Check that we can parse the value as a URL
 final String evaluatedInput = 
context.newPropertyValue(input).evaluateAttributeExpressions().getValue();
-final URI uri = UriUtils.create(evaluatedInput);
-uri.toURL();
+URI.create(evaluatedInput).toURL();

Review Comment:
   Got it. I saw it was used in InvokeHTTP but didn't think it made sense for 
validation. But I guess it does.
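
   For reference, a minimal standalone sketch of the syntax check being discussed, using only the JDK URI/URL API; the class and method names are illustrative and not part of StandardValidators.

```java
import java.net.MalformedURLException;
import java.net.URI;

public class UrlSyntaxCheckSketch {

    // Returns true when the input parses as a URI and can be converted to a URL.
    static boolean isValidUrl(final String input) {
        try {
            URI.create(input).toURL();
            return true;
        } catch (final IllegalArgumentException | MalformedURLException e) {
            // URI.create throws IllegalArgumentException for invalid syntax;
            // toURL throws MalformedURLException for URIs that are not valid URLs
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidUrl("https://nifi.apache.org")); // true
        System.out.println(isValidUrl("not a url"));               // false
    }
}
```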



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12832: Eliminated unnecessary dependencies from nifi-mock; moved… [nifi]

2024-02-21 Thread via GitHub


markap14 commented on code in PR #8442:
URL: https://github.com/apache/nifi/pull/8442#discussion_r1498391143


##
nifi-api/src/main/java/org/apache/nifi/processor/util/StandardValidators.java:
##
@@ -843,7 +843,7 @@ public ValidationResult validate(final String subject, 
final String input, final
 final boolean validSyntax = pattern.matcher(lowerCase).matches();
 final ValidationResult.Builder builder = new 
ValidationResult.Builder();
 if (validSyntax) {
-final long nanos = FormatUtils.getTimeDuration(lowerCase, 
TimeUnit.NANOSECONDS);
+final long nanos = new TimeFormat().getTimeDuration(lowerCase, 
TimeUnit.NANOSECONDS);

Review Comment:
   Yeah, I intentionally didn't make the method static because I was thinking 
there would be more methods, some of which might benefit from providing a 
timezone, etc. But now that the refactoring is complete, the methods are really 
only about durations. It probably makes sense to use static methods and name it 
`DurationFormat` rather than `TimeFormat`.
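
   For illustration, a minimal standalone sketch of what a static-only `DurationFormat`-style utility could look like, following the suggestion above; the class name, regex, and unit table are simplified and hypothetical, not the code in this PR.

```java
import java.util.Map;
import java.util.concurrent.TimeUnit;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public final class DurationFormatSketch {

    private static final Pattern DURATION = Pattern.compile("([\\d.]+)\\s*(ns|ms|s|sec|m|min|h|hr|d|day)");

    private static final Map<String, TimeUnit> UNITS = Map.of(
            "ns", TimeUnit.NANOSECONDS, "ms", TimeUnit.MILLISECONDS,
            "s", TimeUnit.SECONDS, "sec", TimeUnit.SECONDS,
            "m", TimeUnit.MINUTES, "min", TimeUnit.MINUTES,
            "h", TimeUnit.HOURS, "hr", TimeUnit.HOURS,
            "d", TimeUnit.DAYS, "day", TimeUnit.DAYS);

    private DurationFormatSketch() {
        // utility class: all methods are static, so no instantiation is allowed
    }

    public static long getTimeDuration(final String value, final TimeUnit desiredUnit) {
        final Matcher matcher = DURATION.matcher(value.trim().toLowerCase());
        if (!matcher.matches()) {
            throw new IllegalArgumentException("Not a valid time duration: " + value);
        }
        final double amount = Double.parseDouble(matcher.group(1));
        final TimeUnit unit = UNITS.get(matcher.group(2));

        // scale everything through nanoseconds so fractional inputs like "1.5 s" work
        final double totalNanos = amount * TimeUnit.NANOSECONDS.convert(1, unit);
        return Math.round(totalNanos / TimeUnit.NANOSECONDS.convert(1, desiredUnit));
    }

    public static void main(String[] args) {
        System.out.println(getTimeDuration("28 min", TimeUnit.SECONDS));      // 1680
        System.out.println(getTimeDuration("1.5 s", TimeUnit.MILLISECONDS));  // 1500
    }
}
```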



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12832: Eliminated unnecessary dependencies from nifi-mock; moved… [nifi]

2024-02-21 Thread via GitHub


dan-s1 commented on code in PR #8442:
URL: https://github.com/apache/nifi/pull/8442#discussion_r1498371538


##
nifi-api/src/main/java/org/apache/nifi/processor/util/StandardValidators.java:
##
@@ -540,9 +540,9 @@ public ValidationResult validate(final String subject, 
final String input, final
 }
 
 try {
+// Check that we can parse the value as a URL
 final String evaluatedInput = 
context.newPropertyValue(input).evaluateAttributeExpressions().getValue();
-final URI uri = UriUtils.create(evaluatedInput);
-uri.toURL();
+URI.create(evaluatedInput).toURL();

Review Comment:
   Lines 544-545, which were removed, were meant to fix the bug discovered in 
NIFI-12513.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12832: Eliminated unnecessary dependencies from nifi-mock; moved… [nifi]

2024-02-21 Thread via GitHub


markap14 commented on code in PR #8442:
URL: https://github.com/apache/nifi/pull/8442#discussion_r1498374579


##
nifi-api/src/main/java/org/apache/nifi/time/TimeFormat.java:
##
@@ -0,0 +1,251 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.time;
+
+import java.util.Arrays;
+import java.util.List;
+import java.util.concurrent.TimeUnit;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+public class TimeFormat {
+private static final String UNION = "|";
+
+
+// for Time Durations
+private static final String NANOS = join(UNION, "ns", "nano", "nanos", 
"nanosecond", "nanoseconds");
+private static final String MILLIS = join(UNION, "ms", "milli", "millis", 
"millisecond", "milliseconds");
+private static final String SECS = join(UNION, "s", "sec", "secs", 
"second", "seconds");
+private static final String MINS = join(UNION, "m", "min", "mins", 
"minute", "minutes");
+private static final String HOURS = join(UNION, "h", "hr", "hrs", "hour", 
"hours");
+private static final String DAYS = join(UNION, "d", "day", "days");
+private static final String WEEKS = join(UNION, "w", "wk", "wks", "week", 
"weeks");
+
+private static final String VALID_TIME_UNITS = join(UNION, NANOS, MILLIS, 
SECS, MINS, HOURS, DAYS, WEEKS);
+public static final String TIME_DURATION_REGEX = "([\\d.]+)\\s*(" + 
VALID_TIME_UNITS + ")";
+public static final Pattern TIME_DURATION_PATTERN = 
Pattern.compile(TIME_DURATION_REGEX);
+private static final List<Long> TIME_UNIT_MULTIPLIERS = 
Arrays.asList(1000L, 1000L, 1000L, 60L, 60L, 24L);
+
+
+/**
+ * Returns a time duration in the requested {@link TimeUnit} after parsing 
the {@code String}
+ * input. If the resulting value is a decimal (i.e.
+ * {@code 25 hours -> TimeUnit.DAYS = 1.04}), the value is rounded.
+ * Use {@link #getPreciseTimeDuration(String, TimeUnit)} if fractional 
values are desirable
+ *
+ * @param value the raw String input (i.e. "28 minutes")
+ * @param desiredUnit the requested output {@link TimeUnit}
+ * @return the whole number value of this duration in the requested units
+ * @see #getPreciseTimeDuration(String, TimeUnit)
+ */
+public long getTimeDuration(final String value, final TimeUnit 
desiredUnit) {
+return Math.round(getPreciseTimeDuration(value, desiredUnit));
+}
+
+/**
+ * Returns the parsed and converted input in the requested units.
+ * 
+ * If the value is {@code 0 <= x < 1} in the provided units, the units 
will first be converted to a smaller unit to get a value >= 1 (i.e. 0.5 seconds 
-> 500 milliseconds).
+ * This is because the underlying unit conversion cannot handle decimal 
values.
+ * 
+ * If the value is {@code x >= 1} but x is not a whole number, the units 
will first be converted to a smaller unit to attempt to get a whole number 
value (i.e. 1.5 seconds -> 1500 milliseconds).
+ * 
+ * If the value is {@code x < 1000} and the units are {@code 
TimeUnit.NANOSECONDS}, the result will be a whole number of nanoseconds, 
rounded (i.e. 123.4 ns -> 123 ns).
+ * 
+ * This method handles decimal values over {@code 1 ns}, but {@code < 1 
ns} will return {@code 0} in any other unit.
+ * 
+ * Examples:
+ * 
+ * "10 seconds", {@code TimeUnit.MILLISECONDS} -> 10_000.0
+ * "0.010 s", {@code TimeUnit.MILLISECONDS} -> 10.0
+ * "0.010 s", {@code TimeUnit.SECONDS} -> 0.010
+ * "0.010 ns", {@code TimeUnit.NANOSECONDS} -> 1
+ * "0.010 ns", {@code TimeUnit.MICROSECONDS} -> 0
+ *
+ * @param value   the {@code String} input
+ * @param desiredUnit the desired output {@link TimeUnit}
+ * @return the parsed and converted amount (without a unit)
+ */
+public double getPreciseTimeDuration(final String value, final TimeUnit 
desiredUnit) {
+final Matcher matcher = 
TIME_DURATION_PATTERN.matcher(value.toLowerCase());
+if (!matcher.matches()) {
+throw new IllegalArgumentException("Value '" + value + "' is not a 
valid time duration");
+}
+
+final String duration = matcher.group(1);
+final String units 

Re: [PR] NIFI-12832: Eliminated unnecessary dependencies from nifi-mock; moved… [nifi]

2024-02-21 Thread via GitHub


dan-s1 commented on code in PR #8442:
URL: https://github.com/apache/nifi/pull/8442#discussion_r1498320874


##
nifi-api/src/main/java/org/apache/nifi/processor/util/StandardValidators.java:
##
@@ -843,7 +843,7 @@ public ValidationResult validate(final String subject, 
final String input, final
 final boolean validSyntax = pattern.matcher(lowerCase).matches();
 final ValidationResult.Builder builder = new 
ValidationResult.Builder();
 if (validSyntax) {
-final long nanos = FormatUtils.getTimeDuration(lowerCase, 
TimeUnit.NANOSECONDS);
+final long nanos = new TimeFormat().getTimeDuration(lowerCase, 
TimeUnit.NANOSECONDS);

Review Comment:
   `TimeFormat` does not have any instance variables. Couldn't all the methods 
be static? In that case the constructor should be private so that no 
instantiation is allowed.



##
nifi-api/src/main/java/org/apache/nifi/time/TimeFormat.java:
##
@@ -0,0 +1,251 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.time;
+
+import java.util.Arrays;
+import java.util.List;
+import java.util.concurrent.TimeUnit;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+public class TimeFormat {
+private static final String UNION = "|";
+
+
+// for Time Durations
+private static final String NANOS = join(UNION, "ns", "nano", "nanos", 
"nanosecond", "nanoseconds");
+private static final String MILLIS = join(UNION, "ms", "milli", "millis", 
"millisecond", "milliseconds");
+private static final String SECS = join(UNION, "s", "sec", "secs", 
"second", "seconds");
+private static final String MINS = join(UNION, "m", "min", "mins", 
"minute", "minutes");
+private static final String HOURS = join(UNION, "h", "hr", "hrs", "hour", 
"hours");
+private static final String DAYS = join(UNION, "d", "day", "days");
+private static final String WEEKS = join(UNION, "w", "wk", "wks", "week", 
"weeks");
+
+private static final String VALID_TIME_UNITS = join(UNION, NANOS, MILLIS, 
SECS, MINS, HOURS, DAYS, WEEKS);
+public static final String TIME_DURATION_REGEX = "([\\d.]+)\\s*(" + 
VALID_TIME_UNITS + ")";
+public static final Pattern TIME_DURATION_PATTERN = 
Pattern.compile(TIME_DURATION_REGEX);
+private static final List<Long> TIME_UNIT_MULTIPLIERS = 
Arrays.asList(1000L, 1000L, 1000L, 60L, 60L, 24L);
+
+
+/**
+ * Returns a time duration in the requested {@link TimeUnit} after parsing 
the {@code String}
+ * input. If the resulting value is a decimal (i.e.
+ * {@code 25 hours -> TimeUnit.DAYS = 1.04}), the value is rounded.
+ * Use {@link #getPreciseTimeDuration(String, TimeUnit)} if fractional 
values are desirable
+ *
+ * @param value the raw String input (i.e. "28 minutes")
+ * @param desiredUnit the requested output {@link TimeUnit}
+ * @return the whole number value of this duration in the requested units
+ * @see #getPreciseTimeDuration(String, TimeUnit)
+ */
+public long getTimeDuration(final String value, final TimeUnit 
desiredUnit) {
+return Math.round(getPreciseTimeDuration(value, desiredUnit));
+}
+
+/**
+ * Returns the parsed and converted input in the requested units.
+ * 
+ * If the value is {@code 0 <= x < 1} in the provided units, the units 
will first be converted to a smaller unit to get a value >= 1 (i.e. 0.5 seconds 
-> 500 milliseconds).
+ * This is because the underlying unit conversion cannot handle decimal 
values.
+ * 
+ * If the value is {@code x >= 1} but x is not a whole number, the units 
will first be converted to a smaller unit to attempt to get a whole number 
value (i.e. 1.5 seconds -> 1500 milliseconds).
+ * 
+ * If the value is {@code x < 1000} and the units are {@code 
TimeUnit.NANOSECONDS}, the result will be a whole number of nanoseconds, 
rounded (i.e. 123.4 ns -> 123 ns).
+ * 
+ * This method handles decimal values over {@code 1 ns}, but {@code < 1 
ns} will return {@code 0} in any other unit.
+ * 
+ * Examples:
+ * 
+ * "10 seconds", {@code TimeUnit.MILLISECONDS} -> 10_000.0
+ * "0.010 s", 

[PR] NIFI-12826 adding a similar timing/delay to the test as found in anot… [nifi]

2024-02-21 Thread via GitHub


joewitt opened a new pull request, #8443:
URL: https://github.com/apache/nifi/pull/8443

   …her nearby test
   
   
   
   
   
   
   
   
   
   
   
   
   
   
   # Summary
   
   [NIFI-0](https://issues.apache.org/jira/browse/NIFI-0)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [ ] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [ ] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [ ] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [ ] Pull Request based on current revision of the `main` branch
   - [ ] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [ ] Build completed using `mvn clean install -P contrib-check`
 - [ ] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] NIFI-12832: Eliminated unnecessary dependencies from nifi-mock; moved… [nifi]

2024-02-21 Thread via GitHub


markap14 opened a new pull request, #8442:
URL: https://github.com/apache/nifi/pull/8442

   … StandardValidators to nifi-api; broke apart FormatUtils into FormatUtils 
and TimeFormat classes, with TimeFormat existing in nifi-api
   
   
   
   
   
   
   
   
   
   
   
   
   
   
   # Summary
   
   [NIFI-0](https://issues.apache.org/jira/browse/NIFI-0)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [ ] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [ ] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [ ] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [ ] Pull Request based on current revision of the `main` branch
   - [ ] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [ ] Build completed using `mvn clean install -P contrib-check`
 - [ ] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (NIFI-12832) Cleanup nifi-mock dependencies

2024-02-21 Thread Mark Payne (Jira)
Mark Payne created NIFI-12832:
-

 Summary: Cleanup nifi-mock dependencies
 Key: NIFI-12832
 URL: https://issues.apache.org/jira/browse/NIFI-12832
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core Framework, Extensions
Reporter: Mark Payne
Assignee: Mark Payne
 Fix For: 2.0.0


We have allowed quite a few dependencies to creep into the nifi-mock module. It 
now has dependencies on nifi-utils, nifi-framework-api, and nifi-parameter. These 
are not modules that the mock framework should depend on. We should ensure that 
we keep this module lean and clean.

I suspect removing these dependencies from the mock framework will have a 
trickle-down effect: most modules depend on this module, and removing these 
dependencies will likely require updates to modules that use these things as 
transitive dependencies.

It appears that nifi-parameter is not even used, even though it's a dependency. 
There are two classes in nifi-utils that are in use: CoreAttributes and 
StandardValidators. But I argue these really should move to nifi-api, as they 
are widely used APIs for which we will guarantee backward compatibility.

Additionally, StandardValidators depends on FormatUtils. While we don't want to 
bring FormatUtils into nifi-api, we should introduce a new TimeFormat class in 
nifi-api that is responsible for parsing things like the durations that our 
extensions use ("5 mins", etc.). This makes it simpler to build "framework-level 
extensions" and allows for a cleaner implementation of NiFiProperties in the 
future. FormatUtils should then make use of this class.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-3785 Added feature to move a controller service to it's parent o… [nifi]

2024-02-21 Thread via GitHub


Freedom9339 commented on PR #7734:
URL: https://github.com/apache/nifi/pull/7734#issuecomment-1957639012

   @markap14 I've rebased against main and pushed. Thank You!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-02-21 Thread via GitHub


mark-bathori commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1498037649


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/OpenSearchVectorUtils.py:
##
@@ -0,0 +1,149 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from EmbeddingUtils import OPENAI, HUGGING_FACE, EMBEDDING_MODEL
+
+HUGGING_FACE_API_KEY = PropertyDescriptor(
+name="HuggingFace API Key",
+description="The API Key for interacting with HuggingFace",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+HUGGING_FACE_MODEL = PropertyDescriptor(
+name="HuggingFace Model",
+description="The name of the HuggingFace model to use",
+default_value="sentence-transformers/all-MiniLM-L6-v2",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+OPENAI_API_KEY = PropertyDescriptor(
+name="OpenAI API Key",
+description="The API Key for OpenAI in order to create embeddings",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+OPENAI_API_MODEL = PropertyDescriptor(
+name="OpenAI Model",
+description="The API Key for OpenAI in order to create embeddings",
+default_value="text-embedding-ada-002",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+HTTP_HOST = PropertyDescriptor(
+name="HTTP Host",
+description="URL where OpenSearch is hosted.",
+default_value="http://localhost:9200;,
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]

Review Comment:
   Thanks @dan-s1, good catch, I'll check it.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2298 Set RocksDB keep_log_file_num configurable and default to 5 [nifi-minifi-cpp]

2024-02-21 Thread via GitHub


szaszm commented on PR #1731:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1731#issuecomment-1957380811

   Do you think we could use rocksdb ini files to make these settings 
user-overridable? https://github.com/facebook/rocksdb/wiki/RocksDB-Options-File


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-02-21 Thread via GitHub


dan-s1 commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1497954594


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/OpenSearchVectorUtils.py:
##
@@ -0,0 +1,149 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from EmbeddingUtils import OPENAI, HUGGING_FACE, EMBEDDING_MODEL
+
+HUGGING_FACE_API_KEY = PropertyDescriptor(
+name="HuggingFace API Key",
+description="The API Key for interacting with HuggingFace",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+HUGGING_FACE_MODEL = PropertyDescriptor(
+name="HuggingFace Model",
+description="The name of the HuggingFace model to use",
+default_value="sentence-transformers/all-MiniLM-L6-v2",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+OPENAI_API_KEY = PropertyDescriptor(
+name="OpenAI API Key",
+description="The API Key for OpenAI in order to create embeddings",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+OPENAI_API_MODEL = PropertyDescriptor(
+name="OpenAI Model",
+description="The API Key for OpenAI in order to create embeddings",
+default_value="text-embedding-ada-002",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+HTTP_HOST = PropertyDescriptor(
+name="HTTP Host",
+description="URL where OpenSearch is hosted.",
+default_value="http://localhost:9200;,
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]

Review Comment:
   I am not sure whether this is possible, but the Java side of 
`StandardValidators` has `URL_VALIDATOR` and `URI_VALIDATOR`, which would make 
more sense here.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] MINIFICPP-2303 Fix FindLua in the Windows CI job [nifi-minifi-cpp]

2024-02-21 Thread via GitHub


fgerlits opened a new pull request, #1732:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1732

   After some change in the GitHub Actions (Windows image? CMake version?), our 
Windows CI job can no longer find the Lua library, and the build fails.  
Setting `LUA_DIR` works around this problem.
   
   ---
   
   Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [x] Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
   
   - [x] Does your PR title start with MINIFICPP-XXXX where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [x] Has your PR been rebased against the latest commit within the target 
branch (typically main)?
   
   - [x] Is your initial contribution a single, squashed commit?
   
   ### For code changes:
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the LICENSE file?
   - [ ] If applicable, have you updated the NOTICE file?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check GitHub Actions CI 
results for build issues and submit an update to your PR as soon as possible.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (MINIFICPP-2303) The Windows CI runner cannot find the Lua library

2024-02-21 Thread Ferenc Gerlits (Jira)
Ferenc Gerlits created MINIFICPP-2303:
-

 Summary: The Windows CI runner cannot find the Lua library
 Key: MINIFICPP-2303
 URL: https://issues.apache.org/jira/browse/MINIFICPP-2303
 Project: Apache NiFi MiNiFi C++
  Issue Type: Bug
Reporter: Ferenc Gerlits
Assignee: Ferenc Gerlits






--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12807) Cluster - Provenance, Lineage, and Queue Listing

2024-02-21 Thread Rob Fellows (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rob Fellows updated NIFI-12807:
---
Fix Version/s: 2.0.0
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

> Cluster - Provenance, Lineage, and Queue Listing
> 
>
> Key: NIFI-12807
> URL: https://issues.apache.org/jira/browse/NIFI-12807
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Matt Gilman
>Assignee: Matt Gilman
>Priority: Major
> Fix For: 2.0.0
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> Update the Provenance, Lineage, and Queue Listing pages to handle the case 
> when the NiFi instance is clustered.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12807) Cluster - Provenance, Lineage, and Queue Listing

2024-02-21 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17819317#comment-17819317
 ] 

ASF subversion and git services commented on NIFI-12807:


Commit 6c76ecadd417aa8e0fea2605dae2482990cfea13 in nifi's branch 
refs/heads/main from Matt Gilman
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=6c76ecadd4 ]

NIFI-12807: Handle clustering in Provenance, Lineage, and Queue Listing (#8431)

* NIFI-12807:
- Handling cluster node id in provenance listing, lineage graph, and queue 
listing.

* NIFI-12807:
- Addressing review feedback.

This closes #8431 

> Cluster - Provenance, Lineage, and Queue Listing
> 
>
> Key: NIFI-12807
> URL: https://issues.apache.org/jira/browse/NIFI-12807
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Matt Gilman
>Assignee: Matt Gilman
>Priority: Major
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> Update the Provenance, Lineage, and Queue Listing pages to handle the case 
> when the NiFi instance is clustered.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12807: Handle clustering in Provenance, Lineage, and Queue Listing [nifi]

2024-02-21 Thread via GitHub


rfellows merged PR #8431:
URL: https://github.com/apache/nifi/pull/8431


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-02-21 Thread via GitHub


exceptionfactory commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1497820217


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/OpenSearchVectorUtils.py:
##
@@ -0,0 +1,149 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from EmbeddingUtils import OPENAI, HUGGING_FACE, EMBEDDING_MODEL
+
+HUGGING_FACE_API_KEY = PropertyDescriptor(
+name="HuggingFace API Key",
+description="The API Key for interacting with HuggingFace",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+HUGGING_FACE_MODEL = PropertyDescriptor(
+name="HuggingFace Model",
+description="The name of the HuggingFace model to use",
+default_value="sentence-transformers/all-MiniLM-L6-v2",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+OPENAI_API_KEY = PropertyDescriptor(
+name="OpenAI API Key",
+description="The API Key for OpenAI in order to create embeddings",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+OPENAI_API_MODEL = PropertyDescriptor(
+name="OpenAI Model",
+description="The API Key for OpenAI in order to create embeddings",
+default_value="text-embedding-ada-002",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+HTTP_HOST = PropertyDescriptor(
+name="HTTP Host",
+description="URL where OpenSearch is hosted.",
+default_value="http://localhost:9200;,
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+USERNAME = PropertyDescriptor(
+name="Username",
+description="The username to use for authenticating to OpenSearch server",
+required=False,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+PASSWORD = PropertyDescriptor(
+name="Password",
+description="The password to use for authenticating to OpenSearch server",
+required=False,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+VERIFY_CERTIFICATES = PropertyDescriptor(
+name="Verify Certificates",
+description="The password to use for authenticating to OpenSearch server",
+allowable_values=["true", "false"],
+default_value="false",
+required=False,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)

Review Comment:
   Thanks!



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-02-21 Thread via GitHub


mark-bathori commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1497806696


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/OpenSearchVectorUtils.py:
##
@@ -0,0 +1,149 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from EmbeddingUtils import OPENAI, HUGGING_FACE, EMBEDDING_MODEL
+
+HUGGING_FACE_API_KEY = PropertyDescriptor(
+name="HuggingFace API Key",
+description="The API Key for interacting with HuggingFace",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+HUGGING_FACE_MODEL = PropertyDescriptor(
+name="HuggingFace Model",
+description="The name of the HuggingFace model to use",
+default_value="sentence-transformers/all-MiniLM-L6-v2",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+OPENAI_API_KEY = PropertyDescriptor(
+name="OpenAI API Key",
+description="The API Key for OpenAI in order to create embeddings",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+OPENAI_API_MODEL = PropertyDescriptor(
+name="OpenAI Model",
+description="The API Key for OpenAI in order to create embeddings",
+default_value="text-embedding-ada-002",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+HTTP_HOST = PropertyDescriptor(
+name="HTTP Host",
+description="URL where OpenSearch is hosted.",
+default_value="http://localhost:9200;,
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+USERNAME = PropertyDescriptor(
+name="Username",
+description="The username to use for authenticating to OpenSearch server",
+required=False,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+PASSWORD = PropertyDescriptor(
+name="Password",
+description="The password to use for authenticating to OpenSearch server",
+required=False,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+VERIFY_CERTIFICATES = PropertyDescriptor(
+name="Verify Certificates",
+description="The password to use for authenticating to OpenSearch server",
+allowable_values=["true", "false"],
+default_value="false",
+required=False,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)

Review Comment:
   Thanks for the comment @exceptionfactory, I'll remove this property.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-02-21 Thread via GitHub


exceptionfactory commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1497732871


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/OpenSearchVectorUtils.py:
##
@@ -0,0 +1,149 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from EmbeddingUtils import OPENAI, HUGGING_FACE, EMBEDDING_MODEL
+
+HUGGING_FACE_API_KEY = PropertyDescriptor(
+name="HuggingFace API Key",
+description="The API Key for interacting with HuggingFace",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+HUGGING_FACE_MODEL = PropertyDescriptor(
+name="HuggingFace Model",
+description="The name of the HuggingFace model to use",
+default_value="sentence-transformers/all-MiniLM-L6-v2",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, HUGGING_FACE)]
+)
+OPENAI_API_KEY = PropertyDescriptor(
+name="OpenAI API Key",
+description="The API Key for OpenAI in order to create embeddings",
+required=True,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+OPENAI_API_MODEL = PropertyDescriptor(
+name="OpenAI Model",
+description="The API Key for OpenAI in order to create embeddings",
+default_value="text-embedding-ada-002",
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR],
+dependencies=[PropertyDependency(EMBEDDING_MODEL, OPENAI)]
+)
+HTTP_HOST = PropertyDescriptor(
+name="HTTP Host",
+description="URL where OpenSearch is hosted.",
+default_value="http://localhost:9200;,
+required=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+USERNAME = PropertyDescriptor(
+name="Username",
+description="The username to use for authenticating to OpenSearch server",
+required=False,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+PASSWORD = PropertyDescriptor(
+name="Password",
+description="The password to use for authenticating to OpenSearch server",
+required=False,
+sensitive=True,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)
+VERIFY_CERTIFICATES = PropertyDescriptor(
+name="Verify Certificates",
+description="The password to use for authenticating to OpenSearch server",
+allowable_values=["true", "false"],
+default_value="false",
+required=False,
+validators=[StandardValidators.NON_EMPTY_VALIDATOR]
+)

Review Comment:
   In keeping with practices in other Processors, we should not support 
disabling certificate verification.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-02-21 Thread via GitHub


mark-bathori opened a new pull request, #8441:
URL: https://github.com/apache/nifi/pull/8441

   
   
   
   
   
   
   
   
   
   
   
   
   
   # Summary
   
   [NIFI-12831](https://issues.apache.org/jira/browse/NIFI-12831)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [x] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [x] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [x] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [x] Pull Request based on current revision of the `main` branch
   - [x] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [ ] Build completed using `mvn clean install -P contrib-check`
 - [ ] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (NIFI-12831) Add PutOpenSearchVector and QueryOpenSearchVector processors

2024-02-21 Thread Mark Bathori (Jira)
Mark Bathori created NIFI-12831:
---

 Summary: Add PutOpenSearchVector and QueryOpenSearchVector 
processors
 Key: NIFI-12831
 URL: https://issues.apache.org/jira/browse/NIFI-12831
 Project: Apache NiFi
  Issue Type: New Feature
Reporter: Mark Bathori
Assignee: Mark Bathori


Create vector store specific put and query processors for OpenSearch.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Comment Edited] (NIFI-11859) Nifi in standalone mode is not able to enable EmbeddedHazelcastCacheManager

2024-02-21 Thread Bob Paulin (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-11859?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17819071#comment-17819071
 ] 

Bob Paulin edited comment on NIFI-11859 at 2/21/24 1:53 PM:


Looks like Hazelcast will enable the MulticastJoiner [1] if autodetect is 
enabled, and it is by default [2]. This causes Multicast to be enabled when 
this is run locally:
{code:java}
2024-02-20 19:14:41,547 INFO [Timer-Driven Process Thread-6] 
com.hazelcast.instance.impl.Node [192.168.56.1]:5701 [nifi] [5.3.5] Using 
Multicast discovery {code}
With auto-detection disabled, this does not happen:

[https://github.com/apache/nifi/pull/8440]

 

I get the following in the logs, which I believe is expected:
{code:java}
2024-02-20 19:23:15,870 WARN [Timer-Driven Process Thread-2] 
com.hazelcast.instance.impl.Node [192.168.56.1]:5701 [nifi] [5.3.5] No join 
method is enabled! Starting standalone. {code}
 

This should also address the issue above, since it appears to be hitting a 
use case where the DiscoveryJoiner is enabled; that should likewise be 
suppressed by disabling autodetect [3]. This appears to happen in both 1.x and 2.x.

[1] 
[https://github.com/hazelcast/hazelcast/blob/a3ae01dcbfa32e3b314047506dadf837d54e8e2a/hazelcast/src/main/java/com/hazelcast/instance/impl/Node.java#L946|https://github.com/hazelcast/hazelcast/blob/e3dd651a78e97c6702ce4260e6263d4818fb29b1/hazelcast/src/main/java/com/hazelcast/instance/impl/Node.java#L974]
[2] 
[https://github.com/hazelcast/hazelcast/blob/e3dd651a78e97c6702ce4260e6263d4818fb29b1/hazelcast/src/main/java/com/hazelcast/config/AutoDetectionConfig.java#L28]

[3] 
https://github.com/hazelcast/hazelcast/blob/a3ae01dcbfa32e3b314047506dadf837d54e8e2a/hazelcast/src/main/java/com/hazelcast/instance/impl/Node.java#L938
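
For illustration, a minimal sketch of disabling auto-detection programmatically, assuming the Hazelcast 5.x Config API; it is not the actual change in the linked PR.
{code:java}
import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class StandaloneHazelcastSketch {
    public static void main(String[] args) {
        final Config config = new Config();
        config.setClusterName("nifi");

        // Turn off auto-detection so Hazelcast does not fall back to the MulticastJoiner.
        config.getNetworkConfig().getJoin().getAutoDetectionConfig().setEnabled(false);
        // With multicast (and every other joiner) disabled, the node logs
        // "No join method is enabled! Starting standalone." and starts standalone.
        config.getNetworkConfig().getJoin().getMulticastConfig().setEnabled(false);

        final HazelcastInstance instance = Hazelcast.newHazelcastInstance(config);
        instance.shutdown();
    }
}
{code}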


was (Author: bob):
Looks like Hazelcast will enable the Multicast Joinger[1] if autodetect is 
enabled.  And it is by default  [2].  This causes Multicast to be enabled when 
this is run locally
{code:java}
2024-02-20 19:14:41,547 INFO [Timer-Driven Process Thread-6] 
com.hazelcast.instance.impl.Node [192.168.56.1]:5701 [nifi] [5.3.5] Using 
Multicast discovery {code}
By disabling the auto detect this does not happen

[https://github.com/apache/nifi/pull/8440]

 

I get the following in the logs which I believe is expected.


{code:java}
2024-02-20 19:23:15,870 WARN [Timer-Driven Process Thread-2] 
com.hazelcast.instance.impl.Node [192.168.56.1]:5701 [nifi] [5.3.5] No join 
method is enabled! Starting standalone. {code}
 

This should also address the issue above since it appears to be hitting a 
usecase where the DiscoveryJoiner is enabled.  This should also be suppressed 
by autodetect being disabled [3].  This appears to happen in both 1.x and 2.x


[1] 
[https://github.com/hazelcast/hazelcast/blob/a3ae01dcbfa32e3b314047506dadf837d54e8e2a/hazelcast/src/main/java/com/hazelcast/instance/impl/Node.java#L946|https://github.com/hazelcast/hazelcast/blob/e3dd651a78e97c6702ce4260e6263d4818fb29b1/hazelcast/src/main/java/com/hazelcast/instance/impl/Node.java#L974]
[2] 
[https://github.com/hazelcast/hazelcast/blob/e3dd651a78e97c6702ce4260e6263d4818fb29b1/hazelcast/src/main/java/com/hazelcast/config/AutoDetectionConfig.java#L28]

[3]https://github.com/hazelcast/hazelcast/blob/a3ae01dcbfa32e3b314047506dadf837d54e8e2a/hazelcast/src/main/java/com/hazelcast/instance/impl/Node.java#L938

> Nifi in standalone mode is not able to enable EmbeddedHazelcastCacheManager 
> 
>
> Key: NIFI-11859
> URL: https://issues.apache.org/jira/browse/NIFI-11859
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Configuration Management
>Affects Versions: 1.22.0
>Reporter: Jeetendra G Vasisht
>Priority: Blocker
> Attachments: embeddedHazelcastNifiControllerservice.PNG, nifi--app.log
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The EmbeddedHazelcastCacheManager controller service gets enabled in cluster 
> mode with the "All Nodes" clustering strategy, but there is an issue when it 
> is run in standalone mode with the "None" clustering strategy. This is 
> observed in a Kubernetes environment; it comes as part of the internal NiFi 
> packaging, and no external dependency or code related to Hazelcast is being used.
> The controller service gets stuck in the Enabling state:
> !embeddedHazelcastNifiControllerservice.PNG|width=662,height=131!
> The respective logs are attached.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12202) SAML Infinitely Redirects

2024-02-21 Thread Alex Jackson (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alex Jackson updated NIFI-12202:

Affects Version/s: 1.25.0

> SAML Infinitely Redirects
> -
>
> Key: NIFI-12202
> URL: https://issues.apache.org/jira/browse/NIFI-12202
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.24.0, 1.23.1, 1.23.2, 1.25.0
>Reporter: Alex Jackson
>Priority: Major
> Attachments: image-2024-02-21-14-41-53-054.png
>
>
> We have SAML configured and when I updated from 1.20.0 to 1.23.1 (at the 
> time) and just tried now 1.23.2 I see that SAML authentication takes place 
> but I am infinitely redirected and eventually land on a nifi-api address. I 
> haven't got it deployed in this bad state anymore, but I feel like there is an 
> issue with SAML and it would be great if someone could look into it



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12202) SAML Infinitely Redirects

2024-02-21 Thread Alex Jackson (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17819259#comment-17819259
 ] 

Alex Jackson commented on NIFI-12202:
-

[~exceptionfactory]  sorry again for being late on this - it was somehow 
related to the cookie (that helped us direct our attention to what was going 
on), but we had to remove the single-user-provider value from:
{{nifi.security.user.login.identity.provider}}
Strangely, this always worked before, even though we have the managed authorizer set: 
{{nifi.security.user.authorizer=managed-authorizer}}
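
For anyone hitting the same loop, a minimal sketch of the relevant nifi.properties entries after that change (the property names are the ones quoted above; this is only an illustration, not the full file):
{code}
# leave the login identity provider empty (i.e. remove the single-user-provider
# value, as described above) so the configured SAML login takes effect
nifi.security.user.login.identity.provider=
nifi.security.user.authorizer=managed-authorizer
{code}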

Now we have another problem though - before, we had to add both the user and the 
group in NiFi users in order for them to log in. The user name would let them 
log in, but the groups were where we gave them their policy access etc.

It seems now, though, that unless we physically add the user as a member of the 
group, it will not give them their policies - do I need to create a separate 
ticket for this, or is this somehow expected behavior?
!image-2024-02-21-14-41-53-054.png!

We verified that the username no longer needs to be in NiFi users, but the 
group-based policies no longer work and only apply when we add the user as a 
member of said group. The group is definitely coming through from the SAML 
token/cookie.

> SAML Infinitely Redirects
> -
>
> Key: NIFI-12202
> URL: https://issues.apache.org/jira/browse/NIFI-12202
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.24.0, 1.23.1, 1.23.2
>Reporter: Alex Jackson
>Priority: Major
> Attachments: image-2024-02-21-14-41-53-054.png
>
>
> We have SAML configured and when I updated from 1.20.0 to 1.23.1 (at the 
> time) and just tried now 1.23.2 I see that SAML authentication takes place 
> but I am infinitely redirected and eventually land on a nifi-api address. I 
> haven't got it deployed in this bad state anymore, but I feel like there is an 
> issue with SAML and it would be great if someone could look into it



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12202) SAML Infinitely Redirects

2024-02-21 Thread Alex Jackson (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alex Jackson updated NIFI-12202:

Attachment: image-2024-02-21-14-41-53-054.png

> SAML Infinitely Redirects
> -
>
> Key: NIFI-12202
> URL: https://issues.apache.org/jira/browse/NIFI-12202
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.24.0, 1.23.1, 1.23.2
>Reporter: Alex Jackson
>Priority: Major
> Attachments: image-2024-02-21-14-41-53-054.png
>
>
> We have SAML configured and when I updated from 1.20.0 to 1.23.1 (at the 
> time) and just tried now 1.23.2 I see that SAML authentication takes place 
> but I am infinitely redirected and eventually land on a nifi-api address. I 
> haven't got it deployed in this bad state anymore, but I feel like there is an 
> issue with SAML and it would be great if someone could look into it



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (NIFI-12830) Memory leak outside of heap

2024-02-21 Thread saarbs (Jira)
saarbs created NIFI-12830:
-

 Summary: Memory leak outside of heap
 Key: NIFI-12830
 URL: https://issues.apache.org/jira/browse/NIFI-12830
 Project: Apache NiFi
  Issue Type: Bug
Affects Versions: 1.22.0
 Environment: Openshift
Reporter: saarbs


We run NiFi on OpenShift and are experiencing memory leaks in some of our 
clusters. We set the max heap size to 50% (6g) of the pod request and limit 
(12Gi), and we see frequent OOM kills, almost 4-5 times a day per pod, in our 
5-pod cluster.



Using {{top}}, we see that the process memory usage increases over time until it 
reaches twice the heap size and the process is killed by the OpenShift OOM killer.

Using 
{code:java}
jcmd VM.native_memory{code}
we determined that the leak is not in the heap and not in the off-heap memory 
tracked by native memory tracking.
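
For reference, a sketch of the usual native memory tracking workflow behind the command above (this assumes the JVM was started with -XX:NativeMemoryTracking=summary, e.g. via a java.arg entry in bootstrap.conf, and <pid> is only a placeholder for the NiFi JVM process id):
{code}
# take a baseline, let the process run for a while, then diff against it
jcmd <pid> VM.native_memory baseline
jcmd <pid> VM.native_memory summary.diff
{code}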



Then we used {{pmap}} and took partial memory dumps of the regions we suspect 
are part of the leak.

Inspecting those dumps with {{strings}}, the notable content is JDI type 
signatures, FlowFile attributes, and FlowFile contents.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12829) NiFi 2.0 Python API Interpreter choice per processor

2024-02-21 Thread Denis Jakupovic (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12829?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Denis Jakupovic updated NIFI-12829:
---
Description: 
Hi,

the Python API is amazing, thank you for the work. 

Could you implement a per-processor Python interpreter path, e.g. Python 3.12, 
3.11, or PyPy?
Of course the user needs to guarantee that the interpreter works and is 
available, but having the choice per processor would be great.

E.g. use the PyPy interpreter just for high-performance tasks, and if nothing is 
set, fall back to the interpreter configured in nifi.properties.

This is also important for testing without having to run several NiFi clusters 
with different Python interpreter versions.
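
For context, a sketch of the global setting such a per-processor option would fall back to (this assumes the nifi.python.command property used by NiFi 2.x for its Python support; the path shown is only a placeholder):
{code}
# nifi.properties - global Python interpreter currently used for all Python processors
nifi.python.command=/usr/bin/python3.11
{code}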

  was:
Hi,

the Python API is amazing, thank you for the work. 

Could you implement a python interpreter path e.g. python 3.12 or 3.11 or PyPy 
e.g. 
Of course the user needs to guarantee that the interpreter works/and is 
available but having the choice would be great per processor.

e.g. use pypy interpreter just for high performance tasks and if nothing set 
use interpreter set in nifi.properties

 


> NiFi 2.0 Python API Interpreter choice per processor
> 
>
> Key: NIFI-12829
> URL: https://issues.apache.org/jira/browse/NIFI-12829
> Project: Apache NiFi
>  Issue Type: Wish
>  Components: Core Framework
>Affects Versions: 2.0.0-M2
>Reporter: Denis Jakupovic
>Priority: Blocker
>
> Hi,
> the Python API is amazing, thank you for the work. 
> Could you implement a python interpreter path e.g. python 3.12 or 3.11 or 
> PyPy e.g. 
> Of course the user needs to guarantee that the interpreter works/and is 
> available but having the choice would be great per processor.
> e.g. use pypy interpreter just for high performance tasks and if nothing set 
> use interpreter set in nifi.properties
> This is also important for tests without having several nifi clusters with 
> different python interpreter versions



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (NIFI-12829) NiFi 2.0 Python API Interpreter choice per processor

2024-02-21 Thread Denis Jakupovic (Jira)
Denis Jakupovic created NIFI-12829:
--

 Summary: NiFi 2.0 Python API Interpreter choice per processor
 Key: NIFI-12829
 URL: https://issues.apache.org/jira/browse/NIFI-12829
 Project: Apache NiFi
  Issue Type: Wish
  Components: Core Framework
Affects Versions: 2.0.0-M2
Reporter: Denis Jakupovic


Hi,

the Python API is amazing, thank you for the work. 

Could you implement a per-processor Python interpreter path, e.g. Python 3.12, 
3.11, or PyPy?
Of course the user needs to guarantee that the interpreter works and is 
available, but having the choice per processor would be great.

E.g. use the PyPy interpreter just for high-performance tasks, and if nothing is 
set, fall back to the interpreter configured in nifi.properties.

 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (MINIFICPP-2293) Support installing python dependencies defined inline

2024-02-21 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2293?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2293:
-
Status: Patch Available  (was: Open)

https://github.com/apache/nifi-minifi-cpp/pull/1727

> Support installing python dependencies defined inline
> -
>
> Key: MINIFICPP-2293
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2293
> Project: Apache NiFi MiNiFi C++
>  Issue Type: New Feature
>Reporter: Gábor Gyimesi
>Assignee: Gábor Gyimesi
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> In NiFi Python processors, Python dependencies can be defined inside the class 
> definition, in the ProcessorDetails nested class, using the dependencies 
> attribute. The dependencies attribute is a list of the required Python 
> packages the processor depends on. MiNiFi should also support installing the 
> packages defined there.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (MINIFICPP-2297) Remove iOS support from build system

2024-02-21 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2297:
-
Status: Patch Available  (was: Open)

> Remove iOS support from build system
> 
>
> Key: MINIFICPP-2297
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2297
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Gábor Gyimesi
>Assignee: Gábor Gyimesi
>Priority: Trivial
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> The project was not built for iOS for years and is not maintained at all, 
> should be removed from the build options.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[PR] MINIFICPP-2298 Set RocksDB keep_log_file_num configurable and default to 5 [nifi-minifi-cpp]

2024-02-21 Thread via GitHub


lordgamez opened a new pull request, #1731:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1731

   Note: all of the RocksDB database options are defined in the 
`rocksdb::DBOptions` struct (not counting the ColumnOptions), with member types 
ranging from booleans to shared pointers. There are 90-100 options defined, so it 
is infeasible to implement handling for all of them. I would suggest implementing 
the configurability of these options only when needed. I made `keep_log_file_num` 
configurable, but I'm open to suggestions if anything else should be configurable 
at the moment.
   
   https://issues.apache.org/jira/browse/MINIFICPP-2298
   
   
   
   Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [ ] Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
   
   - [ ] Does your PR title start with MINIFICPP- where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [ ] Has your PR been rebased against the latest commit within the target 
branch (typically main)?
   
   - [ ] Is your initial contribution a single, squashed commit?
   
   ### For code changes:
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the LICENSE file?
   - [ ] If applicable, have you updated the NOTICE file?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check GitHub Actions CI 
results for build issues and submit an update to your PR as soon as possible.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] MINIFICPP-2302 Upgrade github actions to Node.js 20 (or latest available) versions [nifi-minifi-cpp]

2024-02-21 Thread via GitHub


fgerlits opened a new pull request, #1730:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1730

   Upgrade github action versions:
   
   * actions/cache v3 -> v4
   * actions/checkout v3 -> v4
   * actions/download-artifact v3 -> v4
   * actions/setup-python v4 -> v5
   * actions/upload-artifact v3.1.2 -> v4
   * seanmiddleditch/gha-setup-ninja v3 -> v4
   * xpol/setup-lua@v0.3 -> leafo/gh-actions-lua@v10
   * add ilammy/msvc-dev-cmd@v1 because gh-actions-lua depends on it
   * mozilla-actions/sccache-action v0.0.3 -> v0.0.4
   
   https://issues.apache.org/jira/browse/MINIFICPP-2302
   
   ---
   
   Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [x] Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
   
   - [x] Does your PR title start with MINIFICPP- where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [x] Has your PR been rebased against the latest commit within the target 
branch (typically main)?
   
   - [x] Is your initial contribution a single, squashed commit?
   
   ### For code changes:
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the LICENSE file?
   - [ ] If applicable, have you updated the NOTICE file?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check GitHub Actions CI 
results for build issues and submit an update to your PR as soon as possible.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (MINIFICPP-2302) Upgrade to Node.js 20 github actions

2024-02-21 Thread Ferenc Gerlits (Jira)
Ferenc Gerlits created MINIFICPP-2302:
-

 Summary: Upgrade to Node.js 20 github actions
 Key: MINIFICPP-2302
 URL: https://issues.apache.org/jira/browse/MINIFICPP-2302
 Project: Apache NiFi MiNiFi C++
  Issue Type: Improvement
Reporter: Ferenc Gerlits
Assignee: Ferenc Gerlits


Most of the GitHub actions we use in our CI jobs run on Node.js 16, which is 
now deprecated and therefore generates a warning.

Upgrade to the latest major versions, which use Node.js 20.

Unfortunately, some actions don't have Node.js 20 versions:
* seanmiddleditch/gha-setup-ninja@v4 is the latest version, uses Node.js 16
* xpol/setup-lua@v0.3 is the latest version, uses Node.js 12 (!) -- maybe use 
leafo/gh-actions-lua instead?



--
This message was sent by Atlassian Jira
(v8.20.10#820010)