Re: [PR] NIFI-12017 add ability to choose to output to single line for base32 base64 contents [nifi]

2024-03-11 Thread via GitHub


exceptionfactory commented on code in PR #8417:
URL: https://github.com/apache/nifi/pull/8417#discussion_r1520637551


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/encoding/EncodingMode.java:
##
@@ -0,0 +1,47 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard.encoding;
+
+import org.apache.nifi.components.DescribedValue;
+
+public enum EncodingMode implements DescribedValue {
+ ENCODE("Encode", "Transform original input to encoded representation"),
+ DECODE("Decode", "Transform encoded input to original representation");
+
+ EncodingMode(String value, String description) {

Review Comment:
   The spacing in this file appears to be off in several places. Each block 
should be indented with a multiple of four spaces.
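The four-space indentation the review asks for can be sketched against the enum in the diff. This is an illustrative reconstruction, not the PR code: the nested DescribedValue interface stands in for org.apache.nifi.components.DescribedValue so the example compiles on its own, and the fields and accessor names are assumptions inferred from the constructor signature.

```java
// Sketch: EncodingMode indented with multiples of four spaces, as requested.
// The local DescribedValue interface is a stand-in for the NiFi interface;
// field and accessor names are assumed, not taken from the actual PR.
public class EncodingModeExample {

    interface DescribedValue {
        String getValue();
        String getDescription();
    }

    public enum EncodingMode implements DescribedValue {
        ENCODE("Encode", "Transform original input to encoded representation"),
        DECODE("Decode", "Transform encoded input to original representation");

        private final String value;
        private final String description;

        EncodingMode(String value, String description) {
            this.value = value;
            this.description = description;
        }

        @Override
        public String getValue() {
            return value;
        }

        @Override
        public String getDescription() {
            return description;
        }
    }

    public static void main(String[] args) {
        System.out.println(EncodingMode.ENCODE.getValue()); // prints "Encode"
    }
}
```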



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Resolved] (NIFI-12861) Provide documentation on JASN1Reader that it only supports JDK and not just JRE

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12861?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann resolved NIFI-12861.
-
Fix Version/s: 2.0.0
   Resolution: Fixed

> Provide documentation on JASN1Reader that it only supports JDK and not just 
> JRE
> ---
>
> Key: NIFI-12861
> URL: https://issues.apache.org/jira/browse/NIFI-12861
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mathew Kapkiai
>Assignee: Mathew Kapkiai
>Priority: Minor
>  Labels: documentation
> Fix For: 2.0.0
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12861) Provide documentation on JASN1Reader that it only supports JDK and not just JRE

2024-03-11 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12861?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825479#comment-17825479
 ] 

ASF subversion and git services commented on NIFI-12861:


Commit e96201ddd1911b5bbfa45ae3fd56336e170a9a80 in nifi's branch 
refs/heads/main from kapkiai
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=e96201ddd1 ]

NIFI-12861 Documented that JASN1Reader requires the JDK

This closes #8469

Signed-off-by: David Handermann 


> Provide documentation on JASN1Reader that it only supports JDK and not just 
> JRE
> ---
>
> Key: NIFI-12861
> URL: https://issues.apache.org/jira/browse/NIFI-12861
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mathew Kapkiai
>Assignee: Mathew Kapkiai
>Priority: Minor
>  Labels: documentation
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>






Re: [PR] NIFI-12861 Provide documentation on JASN1Reader that it only supports… [nifi]

2024-03-11 Thread via GitHub


exceptionfactory closed pull request #8469: NIFI-12861 Provide documentation on 
JASN1Reader that it only supports…
URL: https://github.com/apache/nifi/pull/8469





[jira] [Updated] (NIFI-12884) Correct documentation for python debugging

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12884?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann updated NIFI-12884:

Summary: Correct documentation for python debugging  (was: Corrected 
documentation for python debugging)

> Correct documentation for python debugging
> --
>
> Key: NIFI-12884
> URL: https://issues.apache.org/jira/browse/NIFI-12884
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mark Bathori
>Assignee: Mark Bathori
>Priority: Minor
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Correct config property name and minor typo in python debug documentation.





[jira] [Resolved] (NIFI-12884) Correct documentation for python debugging

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12884?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann resolved NIFI-12884.
-
Fix Version/s: 2.0.0
   Resolution: Fixed

> Correct documentation for python debugging
> --
>
> Key: NIFI-12884
> URL: https://issues.apache.org/jira/browse/NIFI-12884
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mark Bathori
>Assignee: Mark Bathori
>Priority: Minor
> Fix For: 2.0.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Correct config property name and minor typo in python debug documentation.





[jira] [Commented] (NIFI-12884) Corrected documentation for python debugging

2024-03-11 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12884?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825478#comment-17825478
 ] 

ASF subversion and git services commented on NIFI-12884:


Commit 7e594d58dc9cea9317b1e31b9cc161e2230900c9 in nifi's branch 
refs/heads/main from Mark Bathori
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=7e594d58dc ]

NIFI-12884 Corrected documentation for python debugging

This closes #8490

Signed-off-by: David Handermann 


> Corrected documentation for python debugging
> 
>
> Key: NIFI-12884
> URL: https://issues.apache.org/jira/browse/NIFI-12884
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mark Bathori
>Assignee: Mark Bathori
>Priority: Minor
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Correct config property name and minor typo in python debug documentation.





Re: [PR] NIFI-12884: Corrected documentation for python debugging [nifi]

2024-03-11 Thread via GitHub


exceptionfactory closed pull request #8490: NIFI-12884: Corrected documentation 
for python debugging
URL: https://github.com/apache/nifi/pull/8490





[jira] [Updated] (NIFI-12886) Upgrade Jackson JSON to 2.16.2

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann updated NIFI-12886:

Status: Patch Available  (was: Open)

> Upgrade Jackson JSON to 2.16.2
> --
>
> Key: NIFI-12886
> URL: https://issues.apache.org/jira/browse/NIFI-12886
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework, Extensions
>Reporter: David Handermann
>Assignee: David Handermann
>Priority: Minor
>  Labels: backport-needed
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Jackson JSON libraries should be upgraded to 
> [2.16.2|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.16.2] to 
> incorporate several minor bug fixes.





[PR] NIFI-12886 Upgrade Jackson JSON from 2.16.1 to 2.16.2 [nifi]

2024-03-11 Thread via GitHub


exceptionfactory opened a new pull request, #8492:
URL: https://github.com/apache/nifi/pull/8492

   # Summary
   
   [NIFI-12886](https://issues.apache.org/jira/browse/NIFI-12886) Upgrades 
Jackson JSON libraries from 2.16.1 to 
[2.16.2](https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.16.2), 
incorporating several minor bug fixes.
   
   This upgrade is compatible with both main and support branches.
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [X] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [X] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [X] Pull Request commit message starts with Apache NiFi Jira issue number, 
such as `NIFI-0`
   
   ### Pull Request Formatting
   
   - [X] Pull Request based on current revision of the `main` branch
   - [X] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [X] Build completed using `mvn clean install -P contrib-check`
 - [X] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   





[jira] [Created] (NIFI-12886) Upgrade Jackson JSON to 2.16.2

2024-03-11 Thread David Handermann (Jira)
David Handermann created NIFI-12886:
---

 Summary: Upgrade Jackson JSON to 2.16.2
 Key: NIFI-12886
 URL: https://issues.apache.org/jira/browse/NIFI-12886
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core Framework, Extensions
Reporter: David Handermann
Assignee: David Handermann


Jackson JSON libraries should be upgraded to 
[2.16.2|https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.16.2] to 
incorporate several minor bug fixes.





Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-11 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1520457156


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/app/pages/flow-designer/ui/canvas/graph-controls/navigation-control/_navigation-control.component-theme.scss:
##
@@ -17,46 +17,42 @@
 
 @use 'sass:map';
 @use '@angular/material' as mat;
+@use '../../../../../../../assets/utils.scss' as utils;
 
 @mixin nifi-theme($material-theme, $canvas-theme) {
 // Get the color config from the theme.
 $color-config: mat.get-color-config($material-theme);
 $canvas-color-config: mat.get-color-config($canvas-theme);
 
 // Get the color palette from the color-config.
-$primary-palette: map.get($color-config, 'primary');
-$accent-palette: map.get($color-config, 'accent');
 $canvas-primary-palette: map.get($canvas-color-config, 'primary');
 
 // Get hues from palette
-$primary-palette-100: mat.get-color-from-palette($primary-palette, 100);
-$primary-palette-300: mat.get-color-from-palette($primary-palette, 300);
-$accent-palette-A400: mat.get-color-from-palette($accent-palette, 'A400');
-$canvas-primary-palette-50: 
mat.get-color-from-palette($canvas-primary-palette, 50);
-$canvas-primary-palette-600: 
mat.get-color-from-palette($canvas-primary-palette, 600);
-$canvas-primary-palette-A100: 
mat.get-color-from-palette($canvas-primary-palette, 'A100');
+$on-surface-medium: utils.get-on-surface($canvas-color-config, medium);

Review Comment:
   What is medium? Is that different from default?






[jira] [Comment Edited] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread crissaegrim (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825455#comment-17825455
 ] 

crissaegrim edited comment on NIFI-12885 at 3/11/24 9:28 PM:
-

Hi.  Thanks for the reply.

> Outside of this test class, did you observe runtime behavior changes in any 
> Processors?

Only in a custom processor I'm writing.  `getAsDate` used to return the time 
component too, so my tests there were breaking when we switched versions.

And agreed on adding a method.  I looked through the NiFi code today.  I don't 
think any tests are testing for the time component in `MapRecord`.  I think 
that's the shortcoming here: `MapRecord` stopped providing a time component.


was (Author: JIRAUSER298664):
Hi.  Thanks for the reply.

> Outside of this test class, did you observe runtime behavior changes in any 
> Processors?

Only in a custom processor I'm writing.  `getAsDate` used to return the time 
component too, so my tests there were breaking when we switched versions.

And agreed on adding a method.  I looked through the NiFi code today.  I don't 
think any tests are testing for the time component.  I think that's the 
shortcoming here: `MapRecord` stopped providing a time component, which I think 
could break a lot of folks' data when materialized as `MapRecord` and 
`getAsDate` is called and downstream expects a timestamp.

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Major
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap<String, Object> item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}





[jira] [Commented] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread crissaegrim (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825455#comment-17825455
 ] 

crissaegrim commented on NIFI-12885:


Hi.  Thanks for the reply.

> Outside of this test class, did you observe runtime behavior changes in any 
> Processors?

Only in a custom processor I'm writing.  `getAsDate` used to return the time 
component too, so my tests there were breaking when we switched versions.

And agreed on adding a method.  I looked through the NiFi code today.  I don't 
think any tests are testing for the time component.  I think that's the 
shortcoming here: `MapRecord` stopped providing a time component, which I think 
could break a lot of folks' data when materialized as `MapRecord` and 
`getAsDate` is called and downstream expects a timestamp.

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Major
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap<String, Object> item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}





[jira] [Commented] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread David Handermann (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825451#comment-17825451
 ] 

David Handermann commented on NIFI-12885:
-

Thanks for highlighting this change [~crissaegrim].

Part of the challenge is that {{getAsDate()}} is somewhat ambiguous based on 
previous usage. Test classes including TestQueryRecord and 
ResultSetRecordSetTest expect the results of getAsDate() to return what is 
effectively a LocalDate, containing only year, month, and day.

The changes referenced in NIFI-9458 make that behavior explicit by returning a 
java.sql.Date object containing only the year, month, and day information. This 
differs from java.sql.Timestamp, which provides resolution down to the 
millisecond.

These existing unit tests pass based on this expected behavior, and the 
references are otherwise limited. Outside of this test class, did you observe 
runtime behavior changes in any Processors?

There are several possible ways forward, one of which may be to change the 
method signature to return a java.time.LocalDate, making the behavior clear. 
Another option could include adding a method such as getAsOffsetDateTime(), 
which could return an instance of java.time.OffsetDateTime containing 
additional precision.
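
The date-versus-timestamp distinction discussed above can be sketched as follows. The class and helper are illustrative only (not NiFi API); the epoch values correspond to 2022-01-01 at 10:00 UTC and at midnight UTC, matching the example in the issue.

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

// Sketch of the ambiguity: a date-only value (java.sql.Date semantics) keeps
// only year/month/day, while a timestamp keeps millisecond precision.
// Names here are illustrative, not part of the NiFi record API.
public class DateVsTimestampDemo {

    // Millis for 2022-01-01 10:00:00.000 UTC
    static final long WITH_TIME = LocalDateTime.of(2022, 1, 1, 10, 0)
            .toInstant(ZoneOffset.UTC).toEpochMilli();

    // Truncate to date-only, the behavior getAsDate() now makes explicit
    static long dateOnlyMillis(long epochMillis) {
        LocalDate date = Instant.ofEpochMilli(epochMillis)
                .atOffset(ZoneOffset.UTC).toLocalDate();
        return date.atStartOfDay().toInstant(ZoneOffset.UTC).toEpochMilli();
    }

    public static void main(String[] args) {
        System.out.println("with time: " + WITH_TIME);              // 1641031200000
        System.out.println("date only: " + dateOnlyMillis(WITH_TIME)); // 1640995200000
    }
}
```

The 36,000,000 ms difference between the two values is the 10-hour time component that a date-only result discards.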

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: New Feature
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Critical
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap<String, Object> item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}





[jira] [Updated] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann updated NIFI-12885:

Priority: Major  (was: Critical)

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Major
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap<String, Object> item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}





[jira] [Updated] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann updated NIFI-12885:

Issue Type: Bug  (was: New Feature)

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Critical
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap<String, Object> item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}





Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


EndzeitBegins commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520432760


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)

Review Comment:
   That's an interesting remark. 
   
   The other existing `DeleteX` processors I looked into all required an input 
FlowFile, so I think that's a reasonable default to start with, especially 
since using it with a `GenerateFlowFile` is an easy workaround. 
   
   For now, I would argue that `INPUT_REQUIRED` is a sensible start. 
   Allowing the processor to be used without input should be relatively easy 
to support at a later point in time, should that be deemed useful. 
   However, I assume the opposite is not true.






Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


joewitt commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520429450


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+description = "Delete source file only after its processing completed",
+configuration = """
+Retrieve a file from the filesystem, e.g. using 'ListFile' and 
'FetchFile'.
+Process the file using any combination of processors.
+Store the resulting file to a destination, e.g. using 
'PutSFTP'.
+Using 'DeleteFile', delete the file from the filesystem only 
after the result has been stored.
+"""
+)
+public class DeleteFile extends AbstractProcessor {
+
+public static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("All FlowFiles, for which an existing file has been 
deleted, are routed to this relationship")
+.build();
+public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+.name("not found")
+.description("All FlowFiles, for which the file to delete did not 
exist, are routed to this relationship")
+.build();
+public static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("All FlowFiles, for which an existing file could not 
be deleted, are routed to this relationship")
+.build();
+
+private final static Set&lt;Relationship&gt; relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+public static final PropertyDescriptor DIRECTORY_PATH = new 
PropertyDescriptor.Builder()
+.name("Directory Path")
+.displayName("Directory Path")
+.description("The path to the directory the file to delete is 
located in.")
+.required(true)
+.defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+
.addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.build();
+public static final PropertyDescriptor FILENAME = new 
PropertyDescriptor.Builder()

Review Comment:
   This should be designed to delete a single specified file, not to recurse 
through directories.
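
A minimal sketch of the requested behavior, in plain `java.nio.file` rather than the NiFi API (class and method names here are hypothetical, for illustration only): delete exactly one named file, refuse directories instead of recursing, and classify the result into the three outcomes that map onto the processor's success / not found / failure relationships.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

class SingleFileDelete {

    /** Routing outcomes, mirroring the processor's three relationships. */
    enum Outcome { SUCCESS, NOT_FOUND, FAILURE }

    /**
     * Deletes exactly one file inside the given directory.
     * Never walks into or removes directories.
     */
    static Outcome deleteSingleFile(Path directory, String filename) {
        Path target = directory.resolve(filename);
        try {
            if (Files.isDirectory(target)) {
                // Refuse to delete directories; no recursion of any kind.
                return Outcome.FAILURE;
            }
            Files.delete(target);
            return Outcome.SUCCESS;
        } catch (NoSuchFileException e) {
            return Outcome.NOT_FOUND;
        } catch (IOException e) {
            return Outcome.FAILURE;
        }
    }
}
```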



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


EndzeitBegins commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520428228


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+description = "Delete source file only after its processing completed",
+configuration = """
+Retrieve a file from the filesystem, e.g. using 'ListFile' and 
'FetchFile'.
+Process the file using any combination of processors.
+Store the resulting file to a destination, e.g. using 
'PutSFTP'.
+Using 'DeleteFile', delete the file from the filesystem only 
after the result has been stored.
+"""
+)
+public class DeleteFile extends AbstractProcessor {
+
+public static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("All FlowFiles, for which an existing file has been 
deleted, are routed to this relationship")
+.build();
+public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+.name("not found")
+.description("All FlowFiles, for which the file to delete did not 
exist, are routed to this relationship")
+.build();
+public static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("All FlowFiles, for which an existing file could not 
be deleted, are routed to this relationship")
+.build();
+
+private final static Set&lt;Relationship&gt; relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+public static final PropertyDescriptor DIRECTORY_PATH = new 
PropertyDescriptor.Builder()
+.name("Directory Path")
+.displayName("Directory Path")
+.description("The path to the directory the file to delete is 
located in.")
+.required(true)
+.defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+
.addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.build();
+public static final PropertyDescriptor FILENAME = new 
PropertyDescriptor.Builder()
+.name("Filename")
+.displayName("Filename")
+.description("The name of the file to delete.")
+.required(true)
+.defaultValue("${" + CoreAttributes.FILENAME.key() + "}")
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+

Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


joewitt commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520427709


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+description = "Delete source file only after its processing completed",
+configuration = """
+Retrieve a file from the filesystem, e.g. using 'ListFile' and 
'FetchFile'.
+Process the file using any combination of processors.
+Store the resulting file to a destination, e.g. using 
'PutSFTP'.
+Using 'DeleteFile', delete the file from the filesystem only 
after the result has been stored.
+"""
+)
+public class DeleteFile extends AbstractProcessor {
+
+public static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("All FlowFiles, for which an existing file has been 
deleted, are routed to this relationship")
+.build();
+public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+.name("not found")
+.description("All FlowFiles, for which the file to delete did not 
exist, are routed to this relationship")
+.build();
+public static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("All FlowFiles, for which an existing file could not 
be deleted, are routed to this relationship")
+.build();
+
+private final static Set&lt;Relationship&gt; relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+public static final PropertyDescriptor DIRECTORY_PATH = new 
PropertyDescriptor.Builder()
+.name("Directory Path")
+.displayName("Directory Path")
+.description("The path to the directory the file to delete is 
located in.")
+.required(true)
+.defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+
.addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.build();
+public static final PropertyDescriptor FILENAME = new 
PropertyDescriptor.Builder()

Review Comment:
   DeleteFile is the correct name either way.




Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


joewitt commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520426977


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)

Review Comment:
   Unless a clear intent/use case already exists please keep this input 
required as that satisfies the specific use case intended.






Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


joewitt commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520425629


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+description = "Delete source file only after its processing completed",
+configuration = """
+Retrieve a file from the filesystem, e.g. using 'ListFile' and 
'FetchFile'.
+Process the file using any combination of processors.
+Store the resulting file to a destination, e.g. using 
'PutSFTP'.
+Using 'DeleteFile', delete the file from the filesystem only 
after the result has been stored.
+"""
+)
+public class DeleteFile extends AbstractProcessor {
+
+public static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("All FlowFiles, for which an existing file has been 
deleted, are routed to this relationship")
+.build();
+public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+.name("not found")
+.description("All FlowFiles, for which the file to delete did not 
exist, are routed to this relationship")
+.build();
+public static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("All FlowFiles, for which an existing file could not 
be deleted, are routed to this relationship")
+.build();
+
+private final static Set&lt;Relationship&gt; relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+public static final PropertyDescriptor DIRECTORY_PATH = new 
PropertyDescriptor.Builder()
+.name("Directory Path")
+.displayName("Directory Path")
+.description("The path to the directory the file to delete is 
located in.")
+.required(true)
+.defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+
.addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.build();
+public static final PropertyDescriptor FILENAME = new 
PropertyDescriptor.Builder()
+.name("Filename")
+.displayName("Filename")
+.description("The name of the file to delete.")
+.required(true)
+.defaultValue("${" + CoreAttributes.FILENAME.key() + "}")
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+

Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-11 Thread via GitHub


james-elliott commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1520426195


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -19,11 +19,15 @@
 // For more information: 
https://m2.material.io/design/color/the-color-system.html
 @use '@angular/material' as mat;
 
+// Define some variables that are re-used throughout the theme.
+$on-surface-dark: rgba(black, 0.87);
+$on-surface-light: #ffffff;
+
 // The $material-primary-light-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
 $material-primary-light-palette: (
 // 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-light-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgba(249, 250, 251, 0.97), // .context-menu
-100: rgba(233, 239, 243, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,

Review Comment:
   Color should support rgb, hex, and keywords. The 3-value version of rgb, 
`rgb(0,0,0)`, and 3-digit hex, `#000`, are considered shorthand where the 4th 
value, alpha, is implied at full opacity. We can always declare a color and 
alpha explicitly. For instance, we could use `rgba(0, 0, 0, 0.87)` for 
$on-surface-dark, or we could define it as `#000000de`.
   
   I believe all modern browsers support hsla() as well. However, not many 
folks use it to define color. I'm pretty confident that hex declarations are 
the most used. However, defining the alpha channel in hex is a little awkward 
and unfamiliar. If we are going to standardize, I'd suggest a format like 
`rgba(#000, 0.87)`. That also works in all browsers, and combines the 
familiarity of hex for the color with a percentage for transparency.






Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


mattyb149 commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520422701


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+description = "Delete source file only after its processing completed",
+configuration = """
+Retrieve a file from the filesystem, e.g. using 'ListFile' and 
'FetchFile'.
+Process the file using any combination of processors.
+Store the resulting file to a destination, e.g. using 
'PutSFTP'.
+Using 'DeleteFile', delete the file from the filesystem only 
after the result has been stored.
+"""
+)
+public class DeleteFile extends AbstractProcessor {
+
+public static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("All FlowFiles, for which an existing file has been 
deleted, are routed to this relationship")
+.build();
+public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+.name("not found")
+.description("All FlowFiles, for which the file to delete did not 
exist, are routed to this relationship")
+.build();
+public static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("All FlowFiles, for which an existing file could not 
be deleted, are routed to this relationship")
+.build();
+
+private final static Set&lt;Relationship&gt; relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+public static final PropertyDescriptor DIRECTORY_PATH = new 
PropertyDescriptor.Builder()
+.name("Directory Path")
+.displayName("Directory Path")
+.description("The path to the directory the file to delete is 
located in.")
+.required(true)
+.defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+
.addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.build();
+public static final PropertyDescriptor FILENAME = new 
PropertyDescriptor.Builder()
+.name("Filename")
+.displayName("Filename")
+.description("The name of the file to delete.")
+.required(true)
+.defaultValue("${" + CoreAttributes.FILENAME.key() + "}")
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+

Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


EndzeitBegins commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520419875


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+        description = "Delete source file only after its processing completed",
+        configuration = """
+                Retrieve a file from the filesystem, e.g. using 'ListFile' and 'FetchFile'.
+                Process the file using any combination of processors.
+                Store the resulting file to a destination, e.g. using 'PutSFTP'.
+                Using 'DeleteFile', delete the file from the filesystem only after the result has been stored.
+                """
+)
+public class DeleteFile extends AbstractProcessor {
+
+    public static final Relationship REL_SUCCESS = new Relationship.Builder()
+            .name("success")
+            .description("All FlowFiles, for which an existing file has been deleted, are routed to this relationship")
+            .build();
+    public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+            .name("not found")
+            .description("All FlowFiles, for which the file to delete did not exist, are routed to this relationship")
+            .build();
+    public static final Relationship REL_FAILURE = new Relationship.Builder()
+            .name("failure")
+            .description("All FlowFiles, for which an existing file could not be deleted, are routed to this relationship")
+            .build();
+
+    private final static Set<Relationship> relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+    public static final PropertyDescriptor DIRECTORY_PATH = new PropertyDescriptor.Builder()
+            .name("Directory Path")
+            .displayName("Directory Path")
+            .description("The path to the directory the file to delete is located in.")
+            .required(true)
+            .defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+            .addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+            .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+            .build();
+    public static final PropertyDescriptor FILENAME = new PropertyDescriptor.Builder()
+            .name("Filename")
+            .displayName("Filename")
+            .description("The name of the file to delete.")
+            .required(true)
+            .defaultValue("${" + CoreAttributes.FILENAME.key() + "}")
+            .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+

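The relationship routing in the diff above (success / not found / failure) maps directly onto `java.nio.file` delete semantics: a missing file is a distinct, non-error outcome, while any other I/O problem is a failure. A minimal stand-alone sketch of that decision logic, using only the JDK — the `Route` enum and `routeDelete` helper are illustrative stand-ins, not part of the PR:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class DeleteRoutingSketch {
    // Illustrative stand-ins for the processor's REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE.
    enum Route { SUCCESS, NOT_FOUND, FAILURE }

    // Files.delete throws NoSuchFileException when the file is absent,
    // which the processor can map to a dedicated "not found" relationship.
    static Route routeDelete(Path file) {
        try {
            Files.delete(file);
            return Route.SUCCESS;
        } catch (NoSuchFileException e) {
            return Route.NOT_FOUND;
        } catch (IOException e) {
            return Route.FAILURE;
        }
    }

    public static void main(String[] args) throws IOException {
        final Path file = Files.createTempFile("delete-file-sketch", ".txt");
        System.out.println(routeDelete(file));  // first delete succeeds
        System.out.println(routeDelete(file));  // file is already gone
    }
}
```

Using `Files.delete` rather than `File.delete()` is what makes the three-way split possible, since the legacy API only returns a boolean and cannot distinguish "not found" from "could not delete".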
Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


mattyb149 commented on code in PR #8489:
URL: https://github.com/apache/nifi/pull/8489#discussion_r1520411617


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DeleteFile.java:
##
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.DefaultRunDuration;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.documentation.UseCase;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.NoSuchFileException;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.List;
+import java.util.Set;
+
+@SupportsBatching(defaultDuration = DefaultRunDuration.TWENTY_FIVE_MILLIS)
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"file", "remove", "delete", "local", "files", "filesystem"})
+@CapabilityDescription("Deletes a file from the filesystem.")
+@UseCase(
+        description = "Delete source file only after its processing completed",
+        configuration = """
+                Retrieve a file from the filesystem, e.g. using 'ListFile' and 'FetchFile'.
+                Process the file using any combination of processors.
+                Store the resulting file to a destination, e.g. using 'PutSFTP'.
+                Using 'DeleteFile', delete the file from the filesystem only after the result has been stored.
+                """
+)
+public class DeleteFile extends AbstractProcessor {
+
+    public static final Relationship REL_SUCCESS = new Relationship.Builder()
+            .name("success")
+            .description("All FlowFiles, for which an existing file has been deleted, are routed to this relationship")
+            .build();
+    public static final Relationship REL_NOT_FOUND = new Relationship.Builder()
+            .name("not found")
+            .description("All FlowFiles, for which the file to delete did not exist, are routed to this relationship")
+            .build();
+    public static final Relationship REL_FAILURE = new Relationship.Builder()
+            .name("failure")
+            .description("All FlowFiles, for which an existing file could not be deleted, are routed to this relationship")
+            .build();
+
+    private final static Set<Relationship> relationships = Set.of(REL_SUCCESS, REL_NOT_FOUND, REL_FAILURE);
+
+    public static final PropertyDescriptor DIRECTORY_PATH = new PropertyDescriptor.Builder()
+            .name("Directory Path")
+            .displayName("Directory Path")
+            .description("The path to the directory the file to delete is located in.")
+            .required(true)
+            .defaultValue("${" + CoreAttributes.ABSOLUTE_PATH.key() + "}")
+            .addValidator(StandardValidators.createDirectoryExistsValidator(true, false))
+            .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+            .build();
+    public static final PropertyDescriptor FILENAME = new PropertyDescriptor.Builder()
+            .name("Filename")
+            .displayName("Filename")
+            .description("The name of the file to delete.")
+            .required(true)
+            .defaultValue("${" + CoreAttributes.FILENAME.key() + "}")
+            .addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+

[jira] [Updated] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread crissaegrim (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

crissaegrim updated NIFI-12885:
---
Description: 
I think I found a breaking bug from this commit 
[https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]

The below passes in 1.20.0 but fails in 2.0
{code:java}
@Test
void testBasic() throws Exception {
    // setup
    final String schemaText = "{" +
            "\"type\" : \"record\"," +
            "\"name\" : \"TestRecord\"," +
            "\"namespace\" : \"org.apache.nifi\"," +
            "\"fields\" : [ {" +
            "\"name\" : \"my_datestamp_field\"," +
            "\"type\" : {" +
            "\"type\" : \"long\"," +
            "\"logicalType\" : \"timestamp-millis\"" +
            "}" +
            "} ]" +
            "}";
    final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));

    final HashMap<String, Object> item = new HashMap<>();
    item.put("my_datestamp_field", "2022-01-01 10:00:00.000");

    // act
    final MapRecord record = new MapRecord(schemaParsed, item);
    final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");

    // assert
    // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
    assertEquals(1641031200000L, myDateStampField.getTime());
}
{code}
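As a sanity check on the literals in the report (assuming the archive-mangled values were `1641031200000` expected and `1640995200000` actual), plain `SimpleDateFormat` arithmetic in UTC reproduces both numbers — this is an independent JDK check, not NiFi code:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class TimestampCheck {
    public static void main(String[] args) throws ParseException {
        final SimpleDateFormat full = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        full.setTimeZone(TimeZone.getTimeZone("UTC"));

        // Parsing the complete value keeps the time-of-day: 2022-01-01T10:00:00Z
        final long withTime = full.parse("2022-01-01 10:00:00.000").getTime();
        System.out.println(withTime);

        // If the time portion is dropped (the symptom reported above),
        // the result is midnight UTC of the same day.
        final long midnightOnly = full.parse("2022-01-01 00:00:00.000").getTime();
        System.out.println(midnightOnly);

        // The two values differ by exactly ten hours in milliseconds.
        System.out.println(withTime - midnightOnly);
    }
}
```

The ten-hour gap between expected and actual is consistent with the time-of-day being discarded during conversion, which is what makes this look like a regression rather than a timezone configuration issue.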

  was:
I think I found a breaking bug from this commit 
[https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]

The below passes in 1.20.0 but fails in 2.0
{code:java}
@Test
void testBasic() throws Exception {
    // setup
    final String schemaText = "{" +
            "\"type\" : \"record\"," +
            "\"name\" : \"TestRecord\"," +
            "\"namespace\" : \"org.apache.nifi\"," +
            "\"fields\" : [ {" +
            "\"name\" : \"my_datestamp_field\"," +
            "\"type\" : {" +
            "\"type\" : \"long\"," +
            "\"logicalType\" : \"timestamp-millis\"" +
            "}" +
            "} ]" +
            "}";
    final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));

    final HashMap<String, Object> item = new HashMap<>();
    item.put("my_datestamp_field", "2022-01-01 10:00:00.000");

    // act
    final MapRecord record = new MapRecord(schemaParsed, item);
    final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");

    // assert
    // fails in 2.0; actual is `1640995200000`
    assertEquals(1641031200000L, myDateStampField.getTime());
}
{code}


> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: New Feature
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Critical
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
>     // setup
>     final String schemaText = "{" +
>             "\"type\" : \"record\"," +
>             "\"name\" : \"TestRecord\"," +
>             "\"namespace\" : \"org.apache.nifi\"," +
>             "\"fields\" : [ {" +
>             "\"name\" : \"my_datestamp_field\"," +
>             "\"type\" : {" +
>             "\"type\" : \"long\"," +
>             "\"logicalType\" : \"timestamp-millis\"" +
>             "}" +
>             "} ]" +
>             "}";
>     final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));
>     final HashMap<String, Object> item = new HashMap<>();
>     item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
>     // act
>     final MapRecord record = new MapRecord(schemaParsed, item);
>     final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");
>     // assert
>     // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
>     assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread crissaegrim (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

crissaegrim updated NIFI-12885:
---
Description: 
I think I found a breaking bug from this commit 
[https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]

The below passes in 1.20.0 but fails in 2.0
{code:java}
@Test
void testBasic() throws Exception {
    // setup
    final String schemaText = "{" +
            "\"type\" : \"record\"," +
            "\"name\" : \"TestRecord\"," +
            "\"namespace\" : \"org.apache.nifi\"," +
            "\"fields\" : [ {" +
            "\"name\" : \"my_datestamp_field\"," +
            "\"type\" : {" +
            "\"type\" : \"long\"," +
            "\"logicalType\" : \"timestamp-millis\"" +
            "}" +
            "} ]" +
            "}";
    final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));

    final HashMap<String, Object> item = new HashMap<>();
    item.put("my_datestamp_field", "2022-01-01 10:00:00.000");

    // act
    final MapRecord record = new MapRecord(schemaParsed, item);
    final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");

    // assert
    // fails in 2.0; actual is `1640995200000`
    assertEquals(1641031200000L, myDateStampField.getTime());
}
{code}

  was:
I think I found a breaking bug from this commit 
[https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]

The below passes in 1.20.0 but fails in 2.0
{code:java}
@Test
void testBasic() throws Exception {
    // setup
    final String schemaText = "{" +
            "\"type\" : \"record\"," +
            "\"name\" : \"TestRecord\"," +
            "\"namespace\" : \"org.apache.nifi\"," +
            "\"fields\" : [ {" +
            "\"name\" : \"my_datestamp_field\"," +
            "\"type\" : {" +
            "\"type\" : \"long\"," +
            "\"logicalType\" : \"timestamp-millis\"" +
            "}" +
            "} ]" +
            "}";
    final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));

    final HashMap<String, Object> item = new HashMap<>();
    item.put("my_datestamp_field", "2022-01-01 10:00:00.000");

    // act
    final MapRecord record = new MapRecord(schemaParsed, item);
    final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");

    // assert
    // fails in 2.0; actual is `1640995200000`
    assertEquals(1641031200000L, myDateStampField.getTime());
}
{code}


> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: New Feature
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Critical
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
>     // setup
>     final String schemaText = "{" +
>             "\"type\" : \"record\"," +
>             "\"name\" : \"TestRecord\"," +
>             "\"namespace\" : \"org.apache.nifi\"," +
>             "\"fields\" : [ {" +
>             "\"name\" : \"my_datestamp_field\"," +
>             "\"type\" : {" +
>             "\"type\" : \"long\"," +
>             "\"logicalType\" : \"timestamp-millis\"" +
>             "}" +
>             "} ]" +
>             "}";
>     final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));
>     final HashMap<String, Object> item = new HashMap<>();
>     item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
>     // act
>     final MapRecord record = new MapRecord(schemaParsed, item);
>     final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");
>     // assert
>     // fails in 2.0; actual is `1640995200000`
>     assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}





Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-11 Thread via GitHub


james-elliott commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1520379212


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -49,355 +53,100 @@ $material-primary-light-palette: (
 // NOTE: When creating the Material palette definition 
mat.define-palette($material-primary-light-palette, 300);
 // Since 300, was set as the default the contrast-300 will be used as the 
default text color.
 contrast: (
-50: rgba(black, 0.87),
-100: rgba(black, 0.87),
-200: rgba(black, 0.87),
-300: #ffffff,
-400: #ffffff,
-500: #ffffff,
-600: #ffffff,
-700: #ffffff,
-800: #ffffff,
-900: #ffffff,
-A100: rgba(black, 0.87),
-A200: rgba(black, 0.87),
-A400: #ffffff,
-A700: #ffffff,
-)
-);
-
-// The $material-primary-dark-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
-$material-primary-dark-palette: (
-// 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-dark-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgb(30, 45, 54), // .context-menu
-100: rgba(32, 47, 54, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,
-200: #30444d, // .processor-stats-border, .process-group-stats-border, 
.context-menu-item:hover, .process-group-banner, .remote-process-group-banner, 
.a, button.nifi-button, button.nifi-button:disabled
-300: #3e5762, // .breadcrumb-container, .navigation-control, 
.operation-control, .flow-status, .controller-bulletins, 
.component-button-grip, .search-container, .nifi-navigation, .CodeMirror.blank
-400: #4d6b78, // Default hue for this palette (color="primary").
-500: #587a89, // .disabled, .not-transmitting, .splash, 
.access-policies-header, .operation-context-type, .bulletin-board-header, 
.counter-header, .stats-info, .active-thread-count-icon, .processor-type, 
.port-transmission-icon, .operation-context-type, .flow-status.fa, 
.flow-status.icon, .controller-bulletins, .prioritizers-list, 
.controller-services-header, .login-title, .parameter-context-header, 
.parameter-context-inheritance-list, .provenance-header, .flowfile-header, 
.queue-listing-header, .settings-header, .summary-header, .user-header, table 
th, button.global-menu-item.fa, button.global-menu-item.icon, .event-header, 
.section-header,
-600: #718d9a, // .breadcrumb-container, .birdseye-brush
-700: #8aa2ad, // "darker" hue for this palette. Also 
#status-history-chart-container text, #status-history-chart-control-container 
text
-800: #abbcc5,
-900: #abbcc5,
-
-// A100 -> A700 are the ACCENT color 
(mat.define-palette($material-primary-dark-palette, A400, A100, A700);). These 
color are the ANALOGOUS (or possibly the TRIADIC??) colors as defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-// These colors are also used by some custom canvas components that 
display the ANALOGOUS color for things like buttons, links, borders, info, etc.
-A100: #aabec7, // .zero
-A200: #44a3cf, // .enabled, .transmitting, .load-balance-icon-active
-A400: #009b9d, // a, a:hover, button.nifi-button, 
button.nifi-button:hover, .add-tenant-to-policy-form.fa, .component.selected 
rect.border, .add-connect, .remote-process-group-uri, 
.remote-process-group-transmission-secure, .navigation-control.fa, 
.operation-control.fa, .new-canvas-item.icon, .upload-flow-definition, 
.lineage-controls.fa, .event circle.context, .nifi-navigation.icon, 
.listing-table.fa, .listing-table.icon, .context-menu
-A700: #2cd5d5,//rgba(139, 208, 229, 1),//#aabec7 // .hint-pattern
-
-// These are the $material-primary-dark-palette PRIMARY AND ACCENT 
contrast colors. These color do not really get defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors.
-// Instead if we look to the Angular Material provided palettes we see 
that these fields are typically rgba(black, 0.87) or white. These values are 
particularly important
-// for light mode and dark mode as these values set the colors for the 
text when displayed against the primary background on a button, badge, chip, 
etc.
-//
-// NOTE: Care should be taken here to ensure the values meet accessibility 
standards.
-//
-// NOTE: When creating the Material palette definition 

Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-11 Thread via GitHub


james-elliott commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1520377250


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -49,355 +53,100 @@ $material-primary-light-palette: (
 // NOTE: When creating the Material palette definition 
mat.define-palette($material-primary-light-palette, 300);
 // Since 300, was set as the default the contrast-300 will be used as the 
default text color.
 contrast: (
-50: rgba(black, 0.87),
-100: rgba(black, 0.87),
-200: rgba(black, 0.87),
-300: #ffffff,
-400: #ffffff,
-500: #ffffff,
-600: #ffffff,
-700: #ffffff,
-800: #ffffff,
-900: #ffffff,
-A100: rgba(black, 0.87),
-A200: rgba(black, 0.87),
-A400: #ffffff,
-A700: #ffffff,
-)
-);
-
-// The $material-primary-dark-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
-$material-primary-dark-palette: (
-// 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-dark-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgb(30, 45, 54), // .context-menu
-100: rgba(32, 47, 54, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,
-200: #30444d, // .processor-stats-border, .process-group-stats-border, 
.context-menu-item:hover, .process-group-banner, .remote-process-group-banner, 
.a, button.nifi-button, button.nifi-button:disabled
-300: #3e5762, // .breadcrumb-container, .navigation-control, 
.operation-control, .flow-status, .controller-bulletins, 
.component-button-grip, .search-container, .nifi-navigation, .CodeMirror.blank
-400: #4d6b78, // Default hue for this palette (color="primary").
-500: #587a89, // .disabled, .not-transmitting, .splash, 
.access-policies-header, .operation-context-type, .bulletin-board-header, 
.counter-header, .stats-info, .active-thread-count-icon, .processor-type, 
.port-transmission-icon, .operation-context-type, .flow-status.fa, 
.flow-status.icon, .controller-bulletins, .prioritizers-list, 
.controller-services-header, .login-title, .parameter-context-header, 
.parameter-context-inheritance-list, .provenance-header, .flowfile-header, 
.queue-listing-header, .settings-header, .summary-header, .user-header, table 
th, button.global-menu-item.fa, button.global-menu-item.icon, .event-header, 
.section-header,
-600: #718d9a, // .breadcrumb-container, .birdseye-brush
-700: #8aa2ad, // "darker" hue for this palette. Also 
#status-history-chart-container text, #status-history-chart-control-container 
text
-800: #abbcc5,
-900: #abbcc5,
-
-// A100 -> A700 are the ACCENT color 
(mat.define-palette($material-primary-dark-palette, A400, A100, A700);). These 
color are the ANALOGOUS (or possibly the TRIADIC??) colors as defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-// These colors are also used by some custom canvas components that 
display the ANALOGOUS color for things like buttons, links, borders, info, etc.
-A100: #aabec7, // .zero
-A200: #44a3cf, // .enabled, .transmitting, .load-balance-icon-active
-A400: #009b9d, // a, a:hover, button.nifi-button, 
button.nifi-button:hover, .add-tenant-to-policy-form.fa, .component.selected 
rect.border, .add-connect, .remote-process-group-uri, 
.remote-process-group-transmission-secure, .navigation-control.fa, 
.operation-control.fa, .new-canvas-item.icon, .upload-flow-definition, 
.lineage-controls.fa, .event circle.context, .nifi-navigation.icon, 
.listing-table.fa, .listing-table.icon, .context-menu
-A700: #2cd5d5,//rgba(139, 208, 229, 1),//#aabec7 // .hint-pattern
-
-// These are the $material-primary-dark-palette PRIMARY AND ACCENT 
contrast colors. These color do not really get defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors.
-// Instead if we look to the Angular Material provided palettes we see 
that these fields are typically rgba(black, 0.87) or white. These values are 
particularly important
-// for light mode and dark mode as these values set the colors for the 
text when displayed against the primary background on a button, badge, chip, 
etc.
-//
-// NOTE: Care should be taken here to ensure the values meet accessibility 
standards.
-//
-// NOTE: When creating the Material palette definition 

Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-11 Thread via GitHub


james-elliott commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1520374092


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -49,355 +53,100 @@ $material-primary-light-palette: (
 // NOTE: When creating the Material palette definition 
mat.define-palette($material-primary-light-palette, 300);
 // Since 300, was set as the default the contrast-300 will be used as the 
default text color.
 contrast: (
-50: rgba(black, 0.87),
-100: rgba(black, 0.87),
-200: rgba(black, 0.87),
-300: #ffffff,
-400: #ffffff,
-500: #ffffff,
-600: #ffffff,
-700: #ffffff,
-800: #ffffff,
-900: #ffffff,
-A100: rgba(black, 0.87),
-A200: rgba(black, 0.87),
-A400: #ffffff,
-A700: #ffffff,
-)
-);
-
-// The $material-primary-dark-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
-$material-primary-dark-palette: (
-// 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-dark-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgb(30, 45, 54), // .context-menu
-100: rgba(32, 47, 54, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,
-200: #30444d, // .processor-stats-border, .process-group-stats-border, 
.context-menu-item:hover, .process-group-banner, .remote-process-group-banner, 
.a, button.nifi-button, button.nifi-button:disabled
-300: #3e5762, // .breadcrumb-container, .navigation-control, 
.operation-control, .flow-status, .controller-bulletins, 
.component-button-grip, .search-container, .nifi-navigation, .CodeMirror.blank
-400: #4d6b78, // Default hue for this palette (color="primary").
-500: #587a89, // .disabled, .not-transmitting, .splash, 
.access-policies-header, .operation-context-type, .bulletin-board-header, 
.counter-header, .stats-info, .active-thread-count-icon, .processor-type, 
.port-transmission-icon, .operation-context-type, .flow-status.fa, 
.flow-status.icon, .controller-bulletins, .prioritizers-list, 
.controller-services-header, .login-title, .parameter-context-header, 
.parameter-context-inheritance-list, .provenance-header, .flowfile-header, 
.queue-listing-header, .settings-header, .summary-header, .user-header, table 
th, button.global-menu-item.fa, button.global-menu-item.icon, .event-header, 
.section-header,
-600: #718d9a, // .breadcrumb-container, .birdseye-brush
-700: #8aa2ad, // "darker" hue for this palette. Also 
#status-history-chart-container text, #status-history-chart-control-container 
text
-800: #abbcc5,
-900: #abbcc5,
-
-// A100 -> A700 are the ACCENT color 
(mat.define-palette($material-primary-dark-palette, A400, A100, A700);). These 
color are the ANALOGOUS (or possibly the TRIADIC??) colors as defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-// These colors are also used by some custom canvas components that 
display the ANALOGOUS color for things like buttons, links, borders, info, etc.
-A100: #aabec7, // .zero
-A200: #44a3cf, // .enabled, .transmitting, .load-balance-icon-active
-A400: #009b9d, // a, a:hover, button.nifi-button, 
button.nifi-button:hover, .add-tenant-to-policy-form.fa, .component.selected 
rect.border, .add-connect, .remote-process-group-uri, 
.remote-process-group-transmission-secure, .navigation-control.fa, 
.operation-control.fa, .new-canvas-item.icon, .upload-flow-definition, 
.lineage-controls.fa, .event circle.context, .nifi-navigation.icon, 
.listing-table.fa, .listing-table.icon, .context-menu
-A700: #2cd5d5,//rgba(139, 208, 229, 1),//#aabec7 // .hint-pattern
-
-// These are the $material-primary-dark-palette PRIMARY AND ACCENT 
contrast colors. These color do not really get defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors.
-// Instead if we look to the Angular Material provided palettes we see 
that these fields are typically rgba(black, 0.87) or white. These values are 
particularly important
-// for light mode and dark mode as these values set the colors for the 
text when displayed against the primary background on a button, badge, chip, 
etc.
-//
-// NOTE: Care should be taken here to ensure the values meet accessibility 
standards.
-//
-// NOTE: When creating the Material palette definition 

[jira] [Created] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-11 Thread crissaegrim (Jira)
crissaegrim created NIFI-12885:
--

 Summary: MapRecord.getAsDate timestamp breaking bug
 Key: NIFI-12885
 URL: https://issues.apache.org/jira/browse/NIFI-12885
 Project: Apache NiFi
  Issue Type: New Feature
Affects Versions: 2.0.0-M2
Reporter: crissaegrim


I think I found a breaking bug from this commit 
[https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]

The below passes in 1.20.0 but fails in 2.0
{code:java}
@Test
void testBasic() throws Exception {
    // setup
    final String schemaText = "{" +
            "\"type\" : \"record\"," +
            "\"name\" : \"TestRecord\"," +
            "\"namespace\" : \"org.apache.nifi\"," +
            "\"fields\" : [ {" +
            "\"name\" : \"my_datestamp_field\"," +
            "\"type\" : {" +
            "\"type\" : \"long\"," +
            "\"logicalType\" : \"timestamp-millis\"" +
            "}" +
            "} ]" +
            "}";
    final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new Schema.Parser().parse(schemaText));

    final HashMap<String, Object> item = new HashMap<>();
    item.put("my_datestamp_field", "2022-01-01 10:00:00.000");

    // act
    final MapRecord record = new MapRecord(schemaParsed, item);
    final Date myDateStampField = record.getAsDate("my_datestamp_field", "yyyy-MM-dd HH:mm:ss.SSS");

    // assert
    // fails in 2.0; actual is `1640995200000`
    assertEquals(1641031200000L, myDateStampField.getTime());
}
{code}





[jira] [Commented] (NIFI-11449) add autocommit property to PutDatabaseRecord processor

2024-03-11 Thread Matt Burgess (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-11449?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825426#comment-17825426
 ] 

Matt Burgess commented on NIFI-11449:
-

For AWS, what does Iceberg use for a catalog? DynamoDB?

> add autocommit property to PutDatabaseRecord processor
> --
>
> Key: NIFI-11449
> URL: https://issues.apache.org/jira/browse/NIFI-11449
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.21.0
> Environment: Any Nifi Deployment
>Reporter: Abdelrahim Ahmad
>Priority: Blocker
>  Labels: Trino, autocommit, database, iceberg, putdatabaserecord
>
> The issue is with the {{PutDatabaseRecord}} processor in Apache NiFi. When 
> using the processor with the Trino-JDBC-Driver or Dremio-JDBC-Driver to write 
> to an Iceberg catalog, it disables the autocommit feature. This leads to 
> errors such as "{*}Catalog only supports writes using autocommit: iceberg{*}".
> an autocommit property needs to be added to the processor so the feature can be 
> enabled/disabled.
> enabling auto-commit in the NiFi PutDatabaseRecord processor is important for 
> Delta Lake, Iceberg, and Hudi as it ensures data consistency and integrity by 
> allowing atomic writes to be performed in the underlying database. This will 
> allow the processor to be widely used with a bigger range of databases.
> _Improving this processor will allow Nifi to be the main tool to ingest data 
> into these new Technologies. So we don't have to deal with another tool to do 
> so._
> +*_{color:#de350b}BUT:{color}_*+
> I have reviewed The {{PutDatabaseRecord}} processor in NiFi. It inserts 
> records one by one into the database using a prepared statement, and commits 
> the transaction at the end of the loop that processes each record. This 
> approach can be inefficient and slow when inserting large volumes of data 
> into tables that are optimized for bulk ingestion, such as Delta Lake, 
> Iceberg, and Hudi tables.
> These tables use various techniques to optimize the performance of bulk 
> ingestion, such as partitioning, clustering, and indexing. Inserting records 
> one by one using a prepared statement can bypass these optimizations, leading 
> to poor performance and potentially causing issues such as excessive disk 
> usage, increased memory consumption, and decreased query performance.
> To avoid these issues, it is recommended to have a new processor, or add 
> feature to the current one, to bulk insert method with AutoCommit feature 
> when inserting large volumes of data into Delta Lake, Iceberg, and Hudi 
> tables. 
>  
> P.S.: PutSQL does not offer autoCommit either, and it has the same performance 
> problem described above.
> Thanks and best regards :)
> Abdelrahim Ahmad
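The autocommit setting being requested is a standard JDBC connection flag (`Connection.setAutoCommit`). As a generic illustration of the two modes (not NiFi or Trino code), Python's `sqlite3` module exposes the same distinction:

```python
import sqlite3

# Autocommit mode: isolation_level=None commits every statement immediately,
# analogous to JDBC's connection.setAutoCommit(true) -- the mode Trino's
# Iceberg catalog insists on for writes.
auto = sqlite3.connect(":memory:", isolation_level=None)
auto.execute("CREATE TABLE t (x INTEGER)")
auto.execute("INSERT INTO t VALUES (1)")  # visible at once, no commit() needed

# Default transactional mode, analogous to setAutoCommit(false): DML is
# buffered in an implicit transaction until commit() and can be rolled back.
tx = sqlite3.connect(":memory:")
tx.execute("CREATE TABLE t (x INTEGER)")
tx.execute("INSERT INTO t VALUES (1)")
tx.rollback()  # the uncommitted insert is undone; the table itself remains
print(tx.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 0
```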



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12882) Allow control over unexpected failure behavior in processors

2024-03-11 Thread endzeit (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825424#comment-17825424
 ] 

endzeit commented on NIFI-12882:


There was a discussion on a related topic on the [dev mail 
list|https://lists.apache.org/thread/bps716px66lo47x5xb7tm8znp04dgdk0] just a 
few weeks ago.

> Allow control over unexpected failure behavior in processors
> 
>
> Key: NIFI-12882
> URL: https://issues.apache.org/jira/browse/NIFI-12882
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Reporter: saarbs
>Priority: Minor
>
> Recently we had a problem with a flow using UpdateAttribute that does 
> base64decode, and one of our flow sources began sending invalid base64. This 
> resulted in exceptions in the processor causing rollbacks to the flowfiles 
> and backpressure in the flow.
> Since there is no failure relationship to the processor there was no way to 
> resolve this issue, And we realized there are a lot of processors facing this 
> issue where an unexpected failure could cause an infinite loop where some 
> flows may desire to send them to a failure relationship.
> I suggest a setting or an unexpected failure relationship, which would allow 
> keeping the existing behavior of a rollback but would also allow terminating 
> or processing the failure in different ways.
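The UpdateAttribute scenario above can be sketched generically: instead of letting the decode raise an exception and force a rollback, a hypothetical failure relationship would catch the error and route the bad input onward. A minimal Python sketch (not NiFi code; `decode_or_fail` is an invented helper):

```python
import base64
import binascii

def decode_or_fail(value: str):
    """Route a flowfile-like value to 'success' or 'failure' instead of raising."""
    try:
        # validate=True makes non-alphabet characters raise instead of being ignored
        return "success", base64.b64decode(value, validate=True)
    except (binascii.Error, ValueError):
        # A hypothetical 'failure' relationship: the bad input is routed onward
        # rather than rolled back and retried in an infinite loop.
        return "failure", None

print(decode_or_fail("aGVsbG8="))    # ('success', b'hello')
print(decode_or_fail("not-base64"))  # ('failure', None)
```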



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


EndzeitBegins commented on PR #8489:
URL: https://github.com/apache/nifi/pull/8489#issuecomment-1989101375

   Thank you @ChrisSamo632. 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2313 Fix Grafana Loki issues on Windows [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #1742:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1742#discussion_r1520172072


##
extensions/grafana-loki/CMakeLists.txt:
##
@@ -58,8 +58,8 @@ add_dependencies(minifi-grafana-loki minifi-http-curl)
 
 if (ENABLE_GRPC_FOR_LOKI)
     target_include_directories(minifi-grafana-loki SYSTEM PRIVATE BEFORE "${LOKI_PROTOBUF_GENERATED_DIR}" "${GRPC_INCLUDE_DIR}" "${PROTOBUF_INCLUDE_DIR}")
-    target_link_libraries(minifi-grafana-loki grpc++ protobuf::libprotobuf)
-    add_dependencies(minifi-grafana-loki grpc grafana-loki-protos)
+    target_link_libraries(minifi-grafana-loki grafana-loki-protos grpc++ protobuf::libprotobuf)

Review Comment:
   Good point, not needed anymore, removed in 
9ca6dc49ee42bf24d36d3a229830bacea5970709



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-1797 Python bootstrap (part 1) [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


szaszm commented on code in PR #1681:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1681#discussion_r1520163594


##
bootstrap/py_bootstrap.bat:
##
@@ -0,0 +1,29 @@
+@echo off
+
+REM Check if Python is installed
+where python > nul 2>&1
+if %errorlevel% neq 0 (
+    echo Python is not installed
+    exit /b 1
+)
+
+REM Check if venv module is available
+python -m venv --help > nul 2>&1

Review Comment:
   You could just remove the separate check step and handle the error when 
calling it normally below.



##
Windows.md:
##
@@ -120,9 +120,22 @@ A basic working CMake configuration can be inferred from 
the `win_build_vs.bat`.
 ```
 mkdir build
 cd build
-cmake -G "Visual Studio 17 2022" -A x64 -DINSTALLER_MERGE_MODULES=OFF -DTEST_CUSTOM_WEL_PROVIDER=OFF -DENABLE_SQL=OFF -DUSE_REAL_ODBC_TEST_DRIVER=OFF -DCMAKE_BUILD_TYPE_INIT=Release -DCMAKE_BUILD_TYPE=Release -DWIN32=WIN32 -DENABLE_LIBRDKAFKA=OFF -DENABLE_JNI=OFF -DOPENSSL_OFF=OFF -DENABLE_COAP=OFF -DENABLE_AWS=OFF -DENABLE_PDH= -DENABLE_AZURE=OFF -DENABLE_SFTP=OFF -DENABLE_SPLUNK= -DENABLE_GCP= -DENABLE_NANOFI=OFF -DENABLE_OPENCV=OFF -DENABLE_PROMETHEUS=OFF -DENABLE_ELASTICSEARCH= -DUSE_SHARED_LIBS=OFF -DENABLE_CONTROLLER=ON -DENABLE_BUSTACHE=OFF -DENABLE_COAP=OFF -DENABLE_ENCRYPT_CONFIG=OFF -DENABLE_GPS=OFF -DENABLE_LUA_SCRIPTING=OFF -DENABLE_MQTT=OFF -DENABLE_OPC=OFF -DENABLE_OPENWSMAN=OFF -DENABLE_OPS=OFF -DENABLE_PCAP=OFF -DENABLE_PYTHON_SCRIPTING= -DENABLE_SENSORS=OFF -DENABLE_USB_CAMERA=OFF -DBUILD_ROCKSDB=ON -DFORCE_WINDOWS=ON -DUSE_SYSTEM_UUID=OFF -DDISABLE_LIBARCHIVE=OFF -DENABLE_WEL=ON -DFAIL_ON_WARNINGS=OFF -DSKIP_TESTS=OFF ..
+cmake -G "Visual Studio 17 2022" -A x64 -DINSTALLER_MERGE_MODULES=OFF -DTEST_CUSTOM_WEL_PROVIDER=OFF -DENABLE_SQL=OFF -DMINIFI_USE_REAL_ODBC_TEST_DRIVER=OFF -DCMAKE_BUILD_TYPE_INIT=Release -DCMAKE_BUILD_TYPE=Release -DWIN32=WIN32 -DENABLE_LIBRDKAFKA=OFF -DENABLE_JNI=OFF -DMINIFI_OPENSSL=ON -DENABLE_COAP=OFF -DENABLE_AWS=OFF -DENABLE_PDH= -DENABLE_AZURE=OFF -DENABLE_SFTP=OFF -DENABLE_SPLUNK= -DENABLE_GCP= -DENABLE_NANOFI=OFF -DENABLE_OPENCV=OFF -DENABLE_PROMETHEUS=OFF -DENABLE_ELASTICSEARCH= -DUSE_SHARED_LIBS=OFF -DENABLE_CONTROLLER=ON -DENABLE_BUSTACHE=OFF -DENABLE_COAP=OFF -DENABLE_ENCRYPT_CONFIG=OFF -DENABLE_GPS=OFF -DENABLE_LUA_SCRIPTING=OFF -DENABLE_MQTT=OFF -DENABLE_OPC=OFF -DENABLE_OPENWSMAN=OFF -DENABLE_OPS=OFF -DENABLE_PCAP=OFF -DENABLE_PYTHON_SCRIPTING= -DENABLE_SENSORS=OFF -DENABLE_USB_CAMERA=OFF -DBUILD_ROCKSDB=ON -DUSE_SYSTEM_UUID=OFF -DENABLE_LIBARCHIVE=ON -DENABLE_WEL=ON -DMINIFI_FAIL_ON_WARNINGS=OFF -DSKIP_TESTS=OFF ..
 msbuild /m nifi-minifi-cpp.sln /property:Configuration=Release 
/property:Platform=x64
 copy minifi_main\Release\minifi.exe minifi_main\
 cpack
 ctest -C Release
 ```
+
+## Python based bootstrapping (recommended)

Review Comment:
   This could go on top, and the old instructions could go under another 
heading, like `## Alternative: Manual bootstrapping (advanced)`



##
bootstrap/py_bootstrap.sh:
##
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+# Check if Python is installed
+if ! command -v python3 &>/dev/null; then
+    echo "Python is not installed"
+    exit 1
+fi
+
+# Check if virtualenv is installed
+if ! command -v python3 -m venv --help &>/dev/null; then
+    echo "virtualenv is not installed"
+    exit 1
+fi
+
+SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+VENV_DIR="$SCRIPT_DIR/venv"
+
+if [ -d "$VENV_DIR" ]; then
+    source "$VENV_DIR/bin/activate"
+else
+    echo "Creating virtualenv"
+    python3 -m venv "$VENV_DIR"
+    source "$VENV_DIR/bin/activate"
+    pip install -r "$SCRIPT_DIR/requirements.txt"
+fi

Review Comment:
   ```suggestion
   SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
   VENV_DIR="$SCRIPT_DIR/venv"
   
   set -e
   
   if [ -d "$VENV_DIR" ]; then
   source "$VENV_DIR/bin/activate"
   else
   echo "Creating virtualenv"
   if ! python3 -m venv "$VENV_DIR"; then
   echo "Creating virtualenv failed. Is venv installed?"
   exit 1
   fi
   source "$VENV_DIR/bin/activate"
   pip install -r "$SCRIPT_DIR/requirements.txt"
   fi
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-1797 Python bootstrap (part 1) [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


martinzink commented on code in PR #1681:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1681#discussion_r1520154175


##
.github/workflows/ci.yml:
##
@@ -75,12 +131,72 @@ jobs:
 timeout-minutes: 240
 env:
   LUA_DIR: D:\a\nifi-minifi-cpp\nifi-minifi-cpp\.lua
+  WINDOWS_MINIFI_OPTIONS: >-
+-DCMAKE_BUILD_TYPE=Release
+-DBUILD_ROCKSDB=ON
+-DBUILD_SHARED_LIBS=OFF
+-DCI_BUILD=ON
+-DCUSTOM_MALLOC=OFF
+-DDOCKER_BUILD_ONLY=OFF
+-DDOCKER_PUSH=OFF
+-DDOCKER_SKIP_TESTS=ON
+-DENABLE_ALL=OFF
+-DENABLE_AWS=ON
+-DENABLE_AZURE=ON
+-DENABLE_BUSTACHE=OFF
+-DENABLE_BZIP2=ON
+-DENABLE_CIVET=ON
+-DENABLE_COAP=OFF
+-DENABLE_CONTROLLER=OFF

Review Comment:
   good idea, 
https://github.com/apache/nifi-minifi-cpp/pull/1681/commits/48ebdcec251fc04221d0cbcf750bc261eed9409f



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-1797 Python bootstrap (part 1) [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


martinzink commented on code in PR #1681:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1681#discussion_r1520152166


##
bootstrap/package_manager.py:
##
@@ -0,0 +1,259 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import os
+import platform
+import subprocess
+import sys
+from typing import Dict, Set
+
+from distro import distro
+
+
+def _query_yes_no(question: str, no_confirm: bool) -> bool:
+    valid = {"yes": True, "y": True, "ye": True, "no": False, "n": False}
+
+    if no_confirm:
+        print("Running {} with noconfirm".format(question))
+        return True
+    while True:
+        print("{} [y/n]".format(question), end=' ', flush=True)
+        choice = input().lower()
+        if choice in valid:
+            return valid[choice]
+        else:
+            print("Please respond with 'yes' or 'no' " "(or 'y' or 'n').")
+
+
+def _run_command_with_confirm(command: str, no_confirm: bool) -> bool:
+    if _query_yes_no("Running {}".format(command), no_confirm):
+        return os.system(command) == 0
+
+
+class PackageManager(object):
+    def __init__(self, no_confirm):
+        self.no_confirm = no_confirm
+        pass
+
+    def install(self, dependencies: Dict[str, Set[str]]) -> bool:
+        raise Exception("NotImplementedException")
+
+    def install_compiler(self) -> str:
+        raise Exception("NotImplementedException")
+
+    def _install(self, dependencies: Dict[str, Set[str]], replace_dict: Dict[str, Set[str]], install_cmd: str) -> bool:
+        dependencies.update({k: v for k, v in replace_dict.items() if k in dependencies})
+        dependencies = self._filter_out_installed_packages(dependencies)
+        dependencies_str = " ".join(str(value) for value_set in dependencies.values() for value in value_set)
+        if not dependencies_str or dependencies_str.isspace():
+            return True
+        return _run_command_with_confirm(f"{install_cmd} {dependencies_str}", self.no_confirm)
+
+    def _get_installed_packages(self) -> Set[str]:
+        raise Exception("NotImplementedException")
+
+    def _filter_out_installed_packages(self, dependencies: Dict[str, Set[str]]):
+        installed_packages = self._get_installed_packages()
+        filtered_packages = {k: (v - installed_packages) for k, v in dependencies.items()}
+        for installed_package in installed_packages:
+            filtered_packages.pop(installed_package, None)
+        return filtered_packages
+
+    def run_cmd(self, cmd: str) -> bool:
+        result = subprocess.run(f"{cmd}", shell=True, text=True)
+        return result.returncode == 0
+
+
+class BrewPackageManager(PackageManager):
+    def __init__(self, no_confirm):
+        PackageManager.__init__(self, no_confirm)
+
+    def install(self, dependencies: Dict[str, Set[str]]) -> bool:
+        return self._install(dependencies=dependencies,
+                             install_cmd="brew install",
+                             replace_dict={"patch": set(),
+                                           "jni": {"maven"}})
+
+    def install_compiler(self) -> str:
+        self.install({"compiler": set()})
+        return ""
+
+    def _get_installed_packages(self) -> Set[str]:
+        result = subprocess.run(['brew', 'list'], text=True, capture_output=True, check=True)
+        lines = result.stdout.splitlines()
+        lines = [line.split('@', 1)[0] for line in lines]
+        return set(lines)
+
+
+class AptPackageManager(PackageManager):
+    def __init__(self, no_confirm):
+        PackageManager.__init__(self, no_confirm)
+
+    def install(self, dependencies: Dict[str, Set[str]]) -> bool:
+        return self._install(dependencies=dependencies,
+                             install_cmd="sudo apt install -y",
+                             replace_dict={"libarchive": {"liblzma-dev"},
+                                           "lua": {"liblua5.1-0-dev"},
+                                           "python": {"libpython3-dev"},
+                                           "libusb": {"libusb-1.0-0-dev", "libusb-dev"},
+                                           "libpng": {"libpng-dev"},
+                                           "libpcap": {"libpcap-dev"},
+

[jira] [Commented] (NIFI-11449) add autocommit property to PutDatabaseRecord processor

2024-03-11 Thread Matt Burgess (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-11449?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825392#comment-17825392
 ] 

Matt Burgess commented on NIFI-11449:
-

I believe there are separate processors for AWS and GCP as well. They may not 
be included with the Apache NiFi release binaries but the NARs can be added 
manually from the Apache repository. Do those solve your use case or do we need 
PutIceberg to support Object-backed storage as well?

> add autocommit property to PutDatabaseRecord processor
> --
>
> Key: NIFI-11449
> URL: https://issues.apache.org/jira/browse/NIFI-11449
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.21.0
> Environment: Any Nifi Deployment
>Reporter: Abdelrahim Ahmad
>Priority: Blocker
>  Labels: Trino, autocommit, database, iceberg, putdatabaserecord
>
> The issue is with the {{PutDatabaseRecord}} processor in Apache NiFi. When 
> using the processor with the Trino-JDBC-Driver or Dremio-JDBC-Driver to write 
> to an Iceberg catalog, it disables the autocommit feature. This leads to 
> errors such as "{*}Catalog only supports writes using autocommit: iceberg{*}".
> The autocommit feature needs to be exposed in the processor so that it can be 
> enabled/disabled.
> Enabling auto-commit in the NiFi PutDatabaseRecord processor is important for 
> Delta Lake, Iceberg, and Hudi as it ensures data consistency and integrity by 
> allowing atomic writes to be performed in the underlying database. This will 
> allow the processor to be widely used with a bigger range of databases.
> _Improving this processor will allow Nifi to be the main tool to ingest data 
> into these new Technologies. So we don't have to deal with another tool to do 
> so._
> +*_{color:#de350b}BUT:{color}_*+
> I have reviewed The {{PutDatabaseRecord}} processor in NiFi. It inserts 
> records one by one into the database using a prepared statement, and commits 
> the transaction at the end of the loop that processes each record. This 
> approach can be inefficient and slow when inserting large volumes of data 
> into tables that are optimized for bulk ingestion, such as Delta Lake, 
> Iceberg, and Hudi tables.
> These tables use various techniques to optimize the performance of bulk 
> ingestion, such as partitioning, clustering, and indexing. Inserting records 
> one by one using a prepared statement can bypass these optimizations, leading 
> to poor performance and potentially causing issues such as excessive disk 
> usage, increased memory consumption, and decreased query performance.
> To avoid these issues, it is recommended to have a new processor, or add 
> feature to the current one, to bulk insert method with AutoCommit feature 
> when inserting large volumes of data into Delta Lake, Iceberg, and Hudi 
> tables. 
>  
> P.S.: PutSQL does not offer autoCommit either, and it has the same performance 
> problem described above.
> Thanks and best regards :)
> Abdelrahim Ahmad



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12880 Add processor DeleteFile [nifi]

2024-03-11 Thread via GitHub


EndzeitBegins commented on PR #8489:
URL: https://github.com/apache/nifi/pull/8489#issuecomment-1988946115

   Sadly `nifi-web-frontend` failed. I'm not quite sure why that is, but I'm 
certain it's not due to the changes in this PR. 🤷‍♂️
   
   Would you be so kind as to restart the failed jobs, @ChrisSamo632?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Commented] (NIFI-7085) Consume JMS is very slow

2024-03-11 Thread Michael W Moser (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-7085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825374#comment-17825374
 ] 

Michael W Moser commented on NIFI-7085:
---

When consuming a JMS queue, you can add more than 1 concurrent task to 
ConsumeJMS to get the performance you need.

When consuming a JMS topic, if you add more than 1 concurrent task on a 
non-durable, non-shared topic, then you receive duplicate messages and 
performance does not improve.  In this case, you cannot use more than 1 task.

I think I can improve ConsumeJMS by adding a Message Batch Size property, which 
will allow it to consume more than one message per onTrigger().

As a note to anyone who reads this: I recommend always setting the ConsumeJMS Run 
Schedule to 0 sec. If no message is available, the Timeout property controls how 
long the timer-driven thread waits for a message from the broker.
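The batch-size idea above can be sketched outside of NiFi: drain up to N messages per trigger, blocking only for the first one, roughly like the Timeout property. A hedged Python sketch with an in-memory queue standing in for the JMS broker (`on_trigger` is an invented name, not the NiFi API):

```python
import queue

def on_trigger(jms_queue: queue.Queue, batch_size: int = 10, timeout_s: float = 1.0):
    """Drain up to batch_size messages per trigger instead of exactly one."""
    messages = []
    try:
        # Block up to timeout_s for the first message only, like the Timeout property.
        messages.append(jms_queue.get(timeout=timeout_s))
        while len(messages) < batch_size:
            messages.append(jms_queue.get_nowait())  # take the rest without waiting
    except queue.Empty:
        pass
    return messages

q = queue.Queue()
for i in range(25):
    q.put(i)
print(len(on_trigger(q)))  # 10 -- one trigger drains a full batch
```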

> Consume JMS is very slow
> 
>
> Key: NIFI-7085
> URL: https://issues.apache.org/jira/browse/NIFI-7085
> Project: Apache NiFi
>  Issue Type: Improvement
> Environment: 1.8.0 nifi
>Reporter: naveen kumar saharan
>Priority: Minor
>
> ConsumeJMS as primary node is consuming only 20 messages per second. 
> If I run it on all nodes, will it read duplicates? 
> If I run multiple threads, will it read duplicates? 
>  
> We want to read 2000 messages per sec from Tibco JMS.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (NIFI-7085) Consume JMS is very slow

2024-03-11 Thread Michael W Moser (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-7085?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael W Moser reassigned NIFI-7085:
-

Assignee: Michael W Moser

> Consume JMS is very slow
> 
>
> Key: NIFI-7085
> URL: https://issues.apache.org/jira/browse/NIFI-7085
> Project: Apache NiFi
>  Issue Type: Improvement
> Environment: 1.8.0 nifi
>Reporter: naveen kumar saharan
>Assignee: Michael W Moser
>Priority: Minor
>
> ConsumeJMS as primary node is consuming only 20 messages per second. 
> If I run it on all nodes, will it read duplicates? 
> If I run multiple threads, will it read duplicates? 
>  
> We want to read 2000 messages per sec from Tibco JMS.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-10707) Add Proxy Configuration Service to new PutBigQuery processor

2024-03-11 Thread Peter Turcsanyi (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-10707?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter Turcsanyi updated NIFI-10707:
---
Status: Patch Available  (was: Open)

> Add Proxy Configuration Service to new PutBigQuery processor
> 
>
> Key: NIFI-10707
> URL: https://issues.apache.org/jira/browse/NIFI-10707
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Csaba Bejan
>Assignee: Peter Turcsanyi
>Priority: Major
>  Labels: BigQuery, GCP
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The new PutBigQuery processor targeting the new Write API introduced as part 
> of this ticket: https://issues.apache.org/jira/browse/NIFI-10403 should be 
> extended with Proxy Configuration Service capability. As discussed on an 
> earlier PR Proxy configuration needs to be added on gRPC level as well which 
> could be a more involved effort: https://github.com/apache/nifi/pull/6580



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[PR] NIFI-10707 Added proxy support in PutBigQuery [nifi]

2024-03-11 Thread via GitHub


turcsanyip opened a new pull request, #8491:
URL: https://github.com/apache/nifi/pull/8491

   Bumped GCP client library version
   Added grpc-* jars in service api nar in order to avoid CNFE warning in 
io.grpc.LoadBalancerRegistry
   
   # Summary
   
   [NIFI-10707](https://issues.apache.org/jira/browse/NIFI-10707)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [ ] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [ ] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [ ] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [ ] Pull Request based on current revision of the `main` branch
   - [ ] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [ ] Build completed using `mvn clean install -P contrib-check`
 - [ ] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Assigned] (NIFI-10707) Add Proxy Configuration Service to new PutBigQuery processor

2024-03-11 Thread Peter Turcsanyi (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-10707?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter Turcsanyi reassigned NIFI-10707:
--

Assignee: Peter Turcsanyi

> Add Proxy Configuration Service to new PutBigQuery processor
> 
>
> Key: NIFI-10707
> URL: https://issues.apache.org/jira/browse/NIFI-10707
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Csaba Bejan
>Assignee: Peter Turcsanyi
>Priority: Major
>  Labels: BigQuery, GCP
>
> The new PutBigQuery processor targeting the new Write API introduced as part 
> of this ticket: https://issues.apache.org/jira/browse/NIFI-10403 should be 
> extended with Proxy Configuration Service capability. As discussed on an 
> earlier PR Proxy configuration needs to be added on gRPC level as well which 
> could be a more involved effort: https://github.com/apache/nifi/pull/6580



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread David Handermann (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825360#comment-17825360
 ] 

David Handermann commented on NIFI-12883:
-

[~dstiegli1]  Generating Avro Schemas from POJOs does not seem to be something 
ideal for general usage. It may be helpful in some particular cases, but that 
seems better as an external process, and then NiFi components simply load the 
Avro Schema definitions.
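For context on the `SchemaParseException` in this issue: the Avro specification restricts names to `[A-Za-z_][A-Za-z0-9_]*`, so inferred field names containing spaces (such as "JSON Schema Version") cannot appear in a schema as-is. A small Python sketch of the rule and one possible sanitization (the `sanitize` helper is hypothetical, not NiFi's behavior):

```python
import re

# Avro names must match [A-Za-z_][A-Za-z0-9_]* (Avro spec, "Names").
# A field like "JSON Schema Version" contains spaces, which is what the
# "Illegal character" SchemaParseException in the stack trace complains about.
AVRO_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def sanitize(name: str) -> str:
    """Replace illegal characters with underscores (hypothetical helper)."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name)
    if not cleaned or cleaned[0].isdigit():
        cleaned = "_" + cleaned  # names may not start with a digit
    return cleaned

field = "JSON Schema Version"
print(bool(AVRO_NAME.match(field)))  # False -- parsing this name fails
print(sanitize(field))               # JSON_Schema_Version
```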

> JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity
> --
>
> Key: NIFI-12883
> URL: https://issues.apache.org/jira/browse/NIFI-12883
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Daniel Stieglitz
>Priority: Major
> Attachments: ControllerServiceEntity.json
>
>
>  When trying to use the ConvertRecord processor configured with a 
> JsonTreeReader whose Schema
> Access Strategy is to Infer schema and configured with a JsonRecordSetWriter 
> to write the
> schema out all in order to read and write a ControllerServiceEntity JSON 
> object returned by the NIFI Rest API and produce the Avro schema, I got the 
> following stacktrace
> below. I have attached the ControllerServiceEntity JSON which I tried to
> convert. 
> {code:java}
> 2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
> o.a.n.processors.standard.ConvertRecord
> ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
> StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
> section=1], offset=34287,
> length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
> will route to failure
> org.apache.avro.SchemaParseException: Illegal character in:
> component_descriptors_JSON Schema Version_allowableValues_allowableValueType
> at org.apache.avro.Schema.validateName(Schema.java:1625)
> at org.apache.avro.Schema.access$400(Schema.java:94)
> at org.apache.avro.Schema$Name.(Schema.java:713)
> at org.apache.avro.Schema.createRecord(Schema.java:226)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
> at
> org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
> at
> java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
> at
> com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:53)
> at
> org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:151)
> at
> org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
> at
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
> at
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base/java.lang.reflect.Method.invoke(Method.java:568)
> at
> org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
> at
> 

[jira] [Comment Edited] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825353#comment-17825353
 ] 

Daniel Stieglitz edited comment on NIFI-12883 at 3/11/24 3:23 PM:
--

[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.
This got me thinking if it is possible to generate Avro schemas from POJOs then 
perhaps Apache NIFI should include in the distribution the generated Avro 
schemas of all the objects returned by the Rest API in order to facilitate 
reading the Rest API responses by JsonTreeReader and acting on them. Is that 
something which could be included in the distribution or perhaps a different 
artifact?


was (Author: JIRAUSER294662):
[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.
This got me thinking if it is possible to generate Avro schemas from POJOs then 
perhaps Apache NIFI should include in the distribution the generated Avro 
schemas of all the objects returned by the Rest API in order to facilitate 
reading the Rest API responses by JsonTreeReader and acting on them. Is that 
something which should be included in the distribution or perhaps a different 
artifact?

> JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity
> --
>
> Key: NIFI-12883
> URL: https://issues.apache.org/jira/browse/NIFI-12883
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Daniel Stieglitz
>Priority: Major
> Attachments: ControllerServiceEntity.json
>
>
>  When trying to use the ConvertRecord processor configured with a 
> JsonTreeReader whose Schema
> Access Strategy is to Infer schema and configured with a JsonRecordSetWriter 
> to write the
> schema out all in order to read and write a ControllerServiceEntity JSON 
> object returned by the NIFI Rest API and produce the Avro schema, I got the 
> following stacktrace
> below. I have attached the ControllerServiceEntity JSON which I tried to
> convert. 
> {code:java}
> 2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
> o.a.n.processors.standard.ConvertRecord
> ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
> StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
> section=1], offset=34287,
> length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
> will route to failure
> org.apache.avro.SchemaParseException: Illegal character in:
> component_descriptors_JSON Schema Version_allowableValues_allowableValueType
> at org.apache.avro.Schema.validateName(Schema.java:1625)
> at org.apache.avro.Schema.access$400(Schema.java:94)
> at org.apache.avro.Schema$Name.<init>(Schema.java:713)
> at org.apache.avro.Schema.createRecord(Schema.java:226)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
> at
> org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
> at
> 

Re: [PR] MINIFICPP-2313 Fix Grafana Loki issues on Windows [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


szaszm commented on code in PR #1742:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1742#discussion_r1519905031


##
extensions/grafana-loki/CMakeLists.txt:
##
@@ -58,8 +58,8 @@ add_dependencies(minifi-grafana-loki minifi-http-curl)
 
 if (ENABLE_GRPC_FOR_LOKI)
 target_include_directories(minifi-grafana-loki SYSTEM PRIVATE BEFORE 
"${LOKI_PROTOBUF_GENERATED_DIR}" "${GRPC_INCLUDE_DIR}" 
"${PROTOBUF_INCLUDE_DIR}")
-target_link_libraries(minifi-grafana-loki grpc++ protobuf::libprotobuf)
-add_dependencies(minifi-grafana-loki grpc grafana-loki-protos)
+target_link_libraries(minifi-grafana-loki grafana-loki-protos grpc++ 
protobuf::libprotobuf)

Review Comment:
   Now that `grafana-loki-protos` is linked here, and it already links `grpc++` and 
`protobuf::libprotobuf` itself, do we still need to link them directly as well?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Comment Edited] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825353#comment-17825353
 ] 

Daniel Stieglitz edited comment on NIFI-12883 at 3/11/24 3:18 PM:
--

[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.
This got me thinking if it is possible to generate Avro schemas from POJOs then 
perhaps Apache NIFI should include in the distribution the generated Avro 
schemas of all the objects returned by the Rest API in order to facilitate 
reading the Rest API responses by JsonTreeReader and acting on them. Is that 
something which should be included in the distribution or perhaps a different 
artifact?


was (Author: JIRAUSER294662):
[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.
This got me thinking if it is possible to generate Avro schemas from POJOs then 
perhaps Apache NIFI should include in the distribution the generated Avro 
schemas of all the objects returned by the Rest API in order to facilitate 
reading the Rest API responses by JsonTreeReader and acting on them. 

> JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity
> --
>
> Key: NIFI-12883
> URL: https://issues.apache.org/jira/browse/NIFI-12883
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Daniel Stieglitz
>Priority: Major
> Attachments: ControllerServiceEntity.json
>
>
>  When trying to use the ConvertRecord processor configured with a 
> JsonTreeReader whose Schema
> Access Strategy is to Infer schema and configured with a JsonRecordSetWriter 
> to write the
> schema out all in order to read and write a ControllerServiceEntity JSON 
> object returned by the NIFI Rest API and produce the Avro schema, I got the 
> following stacktrace
> below. I have attached the ControllerServiceEntity JSON which I tried to
> convert. 
> {code:java}
> 2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
> o.a.n.processors.standard.ConvertRecord
> ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
> StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
> section=1], offset=34287,
> length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
> will route to failure
> org.apache.avro.SchemaParseException: Illegal character in:
> component_descriptors_JSON Schema Version_allowableValues_allowableValueType
> at org.apache.avro.Schema.validateName(Schema.java:1625)
> at org.apache.avro.Schema.access$400(Schema.java:94)
> at org.apache.avro.Schema$Name.<init>(Schema.java:713)
> at org.apache.avro.Schema.createRecord(Schema.java:226)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
> at
> org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
> at
> 

[jira] [Comment Edited] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825353#comment-17825353
 ] 

Daniel Stieglitz edited comment on NIFI-12883 at 3/11/24 3:17 PM:
--

[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.
This got me thinking if it is possible to generate Avro schemas from POJOs then 
perhaps Apache NIFI should include in the distribution the generated Avro 
schemas of all the objects returned by the Rest API in order to facilitate 
reading the Rest API responses by JsonTreeReader and acting on them. 


was (Author: JIRAUSER294662):
[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.

> JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity
> --
>
> Key: NIFI-12883
> URL: https://issues.apache.org/jira/browse/NIFI-12883
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Daniel Stieglitz
>Priority: Major
> Attachments: ControllerServiceEntity.json
>
>
>  When trying to use the ConvertRecord processor configured with a 
> JsonTreeReader whose Schema
> Access Strategy is to Infer schema and configured with a JsonRecordSetWriter 
> to write the
> schema out all in order to read and write a ControllerServiceEntity JSON 
> object returned by the NIFI Rest API and produce the Avro schema, I got the 
> following stacktrace
> below. I have attached the ControllerServiceEntity JSON which I tried to
> convert. 
> {code:java}
> 2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
> o.a.n.processors.standard.ConvertRecord
> ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
> StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
> section=1], offset=34287,
> length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
> will route to failure
> org.apache.avro.SchemaParseException: Illegal character in:
> component_descriptors_JSON Schema Version_allowableValues_allowableValueType
> at org.apache.avro.Schema.validateName(Schema.java:1625)
> at org.apache.avro.Schema.access$400(Schema.java:94)
> at org.apache.avro.Schema$Name.<init>(Schema.java:713)
> at org.apache.avro.Schema.createRecord(Schema.java:226)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
> at
> org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
> at
> java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
> at
> 

[jira] [Commented] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825353#comment-17825353
 ] 

Daniel Stieglitz commented on NIFI-12883:
-

[~exceptionfactory] Thanks for the reference. I wasn't aware of that. It is 
interesting though as I was able to overcome this issue by using 
[jackson-dataformats-binary|https://github.com/FasterXML/jackson-dataformats-binary/tree/master/avro#generating-avro-schema-from-pojo-definition]
 to generate an Avro schema from NIFI's POJO 
{code:java}
org.apache.nifi.web.api.entity.ControllerServiceEntity 
{code}

which I then used in JsonTreeReader to read the attached file successfully.
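The POJO-to-schema approach described above could be sketched roughly as follows. This is a hedged sketch, not NiFi code: it assumes the `jackson-dataformats-binary` Avro module and NiFi's `nifi-web-api` entity classes are on the classpath; `AvroMapper.schemaFor` and `AvroSchema.getAvroSchema` are the standard Jackson Avro-module API.

```java
import com.fasterxml.jackson.dataformat.avro.AvroMapper;
import com.fasterxml.jackson.dataformat.avro.AvroSchema;

import org.apache.nifi.web.api.entity.ControllerServiceEntity;

public class PojoToAvroSchema {
    public static void main(String[] args) throws Exception {
        // AvroMapper (jackson-dataformats-binary) derives an Avro schema
        // from the POJO's Jackson-visible properties.
        AvroMapper mapper = new AvroMapper();
        AvroSchema schema = mapper.schemaFor(ControllerServiceEntity.class);

        // The wrapped org.apache.avro.Schema can be printed as JSON and
        // supplied to JsonTreeReader as an explicit schema.
        System.out.println(schema.getAvroSchema().toString(true));
    }
}
```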

> JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity
> --
>
> Key: NIFI-12883
> URL: https://issues.apache.org/jira/browse/NIFI-12883
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Daniel Stieglitz
>Priority: Major
> Attachments: ControllerServiceEntity.json
>
>
>  When trying to use the ConvertRecord processor configured with a 
> JsonTreeReader whose Schema
> Access Strategy is to Infer schema and configured with a JsonRecordSetWriter 
> to write the
> schema out all in order to read and write a ControllerServiceEntity JSON 
> object returned by the NIFI Rest API and produce the Avro schema, I got the 
> following stacktrace
> below. I have attached the ControllerServiceEntity JSON which I tried to
> convert. 
> {code:java}
> 2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
> o.a.n.processors.standard.ConvertRecord
> ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
> StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
> section=1], offset=34287,
> length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
> will route to failure
> org.apache.avro.SchemaParseException: Illegal character in:
> component_descriptors_JSON Schema Version_allowableValues_allowableValueType
> at org.apache.avro.Schema.validateName(Schema.java:1625)
> at org.apache.avro.Schema.access$400(Schema.java:94)
> at org.apache.avro.Schema$Name.<init>(Schema.java:713)
> at org.apache.avro.Schema.createRecord(Schema.java:226)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
> at
> org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
> at
> java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
> at
> com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:53)
> at
> org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:151)
> at
> org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
> at
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
> at
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at 

Re: [PR] MINIFICPP-2282 Support re-encryption of sensitive properties [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #1739:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1739#discussion_r1519863114


##
encrypt-config/EncryptConfigMain.cpp:
##


Review Comment:
   I encrypted a flow config in interactive mode using a `PutS3Object` 
processor which has 2 sensitive properties `Secret Key` and `Proxy Password`. I 
set the `Secret Key` but did not set the `Proxy Password` to any value, so the 
flow config had 1 encrypted property set in the processor. After I set the 
`nifi.bootstrap.sensitive.properties.key` to 
`nifi.bootstrap.sensitive.properties.key.old` in the `bootstrap.conf` and tried 
using re-encrypt on it, it successfully generated a new key and re-encrypted 
the `Secret Key` property, but also the `Proxy Password` appeared in the flow 
config with an encrypted value:
   
   `Proxy Password: 
enc{ilN0h9zhIwVCMGn0GuuJGkupRmmGUR8z||2mqjr67fi2r2sr2Bojg6WQ==}`
   
   If it was not set before, I think it should not be set after the re-encrypt 
call either; also, what value would it be set to in this case?



##
encrypt-config/EncryptConfigMain.cpp:
##
@@ -62,18 +65,19 @@ int main(int argc, char* argv[]) try {
   if (operation == OPERATION_MINIFI_PROPERTIES) {
 encrypt_config.encryptSensitiveValuesInMinifiProperties();
   } else if (operation == OPERATION_FLOW_CONFIG) {
+auto re_encrypt = argument_parser.get("--re-encrypt");
 auto component_id = argument_parser.present("--component-id");
 auto property_name = argument_parser.present("--property-name");
 auto property_value = argument_parser.present("--property-value");

Review Comment:
   Could we return an error when these options are specified but the flow-config 
operation is not set? It could be misleading if the user wants to encrypt flow 
config values and the operation runs successfully, but it actually changed 
minifi.properties instead.






[jira] [Resolved] (NIFI-9269) ExtractEmailHeaders & ExtractEmailAttachments -- NoClassDefFoundError

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-9269?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann resolved NIFI-9269.

Fix Version/s: 2.0.0
   Resolution: Fixed

Upgrades in NIFI-12820 resolve the class resolution issues with these email 
processors. The fixed version will be incorporated in the next milestone 
release of NiFi 2.0.0.

> ExtractEmailHeaders & ExtractEmailAttachments -- NoClassDefFoundError
> -
>
> Key: NIFI-9269
> URL: https://issues.apache.org/jira/browse/NIFI-9269
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.14.0
> Environment: Nifi:
> 1.14.0
> 07/10/2021 12:25:36 SAST
> Tagged nifi-1.14.0-RC2
> From fcbf1d5 on branch UNKNOWN
> Java:
> java --version
> openjdk 11.0.12 2021-07-20
> OpenJDK Runtime Environment 18.9 (build 11.0.12+7)
> OpenJDK 64-Bit Server VM 18.9 (build 11.0.12+7, mixed mode, sharing)
> OS:
> uname -a
> Linux blue.centilliard.io 5.13.16-200.fc34.x86_64 #1 SMP Mon Sep 13
> 12:39:36 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
>Reporter: Stefan
>Assignee: David Handermann
>Priority: Major
> Fix For: 2.0.0
>
>
> The above two processors fail with the below errors:
> /*** *ExtractEmailAttachments* ***/
> ExtractEmailAttachments[id=3738d7b2-017c-1000-c120-29f7760d3084] Failed
> to process session due to com/sun/activation/registries/LogSupport;
> Processor Administratively Yielded for 1 sec:
> java.lang.NoClassDefFoundError:
> com/sun/activation/registries/LogSupport
> /*** *ExtractEmailHeaders* ***/
> ExtractEmailHeaders[id=36a7997a-017c-1000-0405-0197e1cb8792] Failed to
> process session due to com/sun/activation/registries/LogSupport;
> Processor Administratively Yielded for 1 sec:
> java.lang.NoClassDefFoundError:
> com/sun/activation/registries/LogSupport
> *P.S*
> Adding javax.activation-1.2.0.jar and leaving javax.activation-1.2.0.jar in 
> */lib/java11* also seems to resolve the problem.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (NIFI-9269) ExtractEmailHeaders & ExtractEmailAttachments -- NoClassDefFoundError

2024-03-11 Thread David Handermann (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-9269?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

David Handermann reassigned NIFI-9269:
--

Assignee: David Handermann  (was: Pierre Villard)

> ExtractEmailHeaders & ExtractEmailAttachments -- NoClassDefFoundError
> -
>
> Key: NIFI-9269
> URL: https://issues.apache.org/jira/browse/NIFI-9269
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.14.0
> Environment: Nifi:
> 1.14.0
> 07/10/2021 12:25:36 SAST
> Tagged nifi-1.14.0-RC2
> From fcbf1d5 on branch UNKNOWN
> Java:
> java --version
> openjdk 11.0.12 2021-07-20
> OpenJDK Runtime Environment 18.9 (build 11.0.12+7)
> OpenJDK 64-Bit Server VM 18.9 (build 11.0.12+7, mixed mode, sharing)
> OS:
> uname -a
> Linux blue.centilliard.io 5.13.16-200.fc34.x86_64 #1 SMP Mon Sep 13
> 12:39:36 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
>Reporter: Stefan
>Assignee: David Handermann
>Priority: Major
>
> The above two processors fail with the below errors:
> /*** *ExtractEmailAttachments* ***/
> ExtractEmailAttachments[id=3738d7b2-017c-1000-c120-29f7760d3084] Failed
> to process session due to com/sun/activation/registries/LogSupport;
> Processor Administratively Yielded for 1 sec:
> java.lang.NoClassDefFoundError:
> com/sun/activation/registries/LogSupport
> /*** *ExtractEmailHeaders* ***/
> ExtractEmailHeaders[id=36a7997a-017c-1000-0405-0197e1cb8792] Failed to
> process session due to com/sun/activation/registries/LogSupport;
> Processor Administratively Yielded for 1 sec:
> java.lang.NoClassDefFoundError:
> com/sun/activation/registries/LogSupport
> *P.S*
> Adding javax.activation-1.2.0.jar and leaving javax.activation-1.2.0.jar in 
> */lib/java11* also seems to resolve the problem.





Re: [PR] MINIFICPP-2277 Add virtualenv support for python processors [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


szaszm commented on code in PR #1721:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1721#discussion_r1519830502


##
extensions/python/PythonConfigState.h:
##
@@ -0,0 +1,50 @@
+/**
+ *
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+#pragma once
+
+#include 
+#include 
+
+namespace org::apache::nifi::minifi::extensions::python {
+
+struct PythonConfigState {
+ public:
+  PythonConfigState(PythonConfigState&&) = delete;
+  PythonConfigState(const PythonConfigState&) = delete;
+  PythonConfigState& operator=(PythonConfigState&&) = delete;
+  PythonConfigState& operator=(const PythonConfigState&) = delete;
+
+  bool isPackageInstallationNeeded() const {
+return install_python_packages_automatically && !virtualenv_path.empty();
+  }
+
+  static PythonConfigState& getInstance() {
+static PythonConfigState config;
+return config;
+  }

Review Comment:
   I think it could be solved by returning or moving the list of dependencies 
from the python processor to the execution environment class, which can install 
them as needed. A callback could probably work.






[jira] [Commented] (NIFI-9269) ExtractEmailHeaders & ExtractEmailAttachments -- NoClassDefFoundError

2024-03-11 Thread Piermarco (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-9269?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825336#comment-17825336
 ] 

Piermarco commented on NIFI-9269:
-

Hello, I have the same issue for both components after updating to version 
2.0.0-M1. 
Is there any kind of workaround for this problem?
At the moment, we are stuck and unable to use them.

Thank you in advance.

> ExtractEmailHeaders & ExtractEmailAttachments -- NoClassDefFoundError
> -
>
> Key: NIFI-9269
> URL: https://issues.apache.org/jira/browse/NIFI-9269
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.14.0
> Environment: Nifi:
> 1.14.0
> 07/10/2021 12:25:36 SAST
> Tagged nifi-1.14.0-RC2
> From fcbf1d5 on branch UNKNOWN
> Java:
> java --version
> openjdk 11.0.12 2021-07-20
> OpenJDK Runtime Environment 18.9 (build 11.0.12+7)
> OpenJDK 64-Bit Server VM 18.9 (build 11.0.12+7, mixed mode, sharing)
> OS:
> uname -a
> Linux blue.centilliard.io 5.13.16-200.fc34.x86_64 #1 SMP Mon Sep 13
> 12:39:36 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
>Reporter: Stefan
>Assignee: Pierre Villard
>Priority: Major
>
> The above two processors fail with the below errors:
> /*** *ExtractEmailAttachments* ***/
> ExtractEmailAttachments[id=3738d7b2-017c-1000-c120-29f7760d3084] Failed
> to process session due to com/sun/activation/registries/LogSupport;
> Processor Administratively Yielded for 1 sec:
> java.lang.NoClassDefFoundError:
> com/sun/activation/registries/LogSupport
> /*** *ExtractEmailHeaders* ***/
> ExtractEmailHeaders[id=36a7997a-017c-1000-0405-0197e1cb8792] Failed to
> process session due to com/sun/activation/registries/LogSupport;
> Processor Administratively Yielded for 1 sec:
> java.lang.NoClassDefFoundError:
> com/sun/activation/registries/LogSupport
> *P.S*
> Adding javax.activation-1.2.0.jar and leaving javax.activation-1.2.0.jar in 
> */lib/java11* also seems to resolve the problem.





Re: [PR] NIFI-11107 In ConsumeIMAP and ConsumePOP3 added support for OAUTH based authorization. [nifi]

2024-03-11 Thread via GitHub


tpalfy commented on PR #6900:
URL: https://github.com/apache/nifi/pull/6900#issuecomment-1988565123

   @AnTu2702 I think I can add, within a couple of days (hopefully today), a 
not-too-complicated change that only recreates the `messageReciever` when an 
authentication exception is encountered.
   
   Since there are connections involved, and given that onTrigger can be called 
very frequently, it's better to have it like this right away.
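
The cache-and-invalidate pattern described here can be sketched in plain Java. Names such as `ReceiverHolder` are illustrative only, not NiFi's actual classes: the idea is that an expensive receiver is created lazily, reused across frequent onTrigger calls, and rebuilt only after an authentication failure invalidates it.

```java
import java.util.function.Supplier;

// Illustrative sketch: cache an expensive receiver and rebuild it only
// after an authentication failure invalidates it, so frequent trigger
// calls do not pay the connection cost every time.
class ReceiverHolder<T> {
    private final Supplier<T> factory;
    private T receiver;

    ReceiverHolder(Supplier<T> factory) {
        this.factory = factory;
    }

    T get() {
        if (receiver == null) {
            receiver = factory.get();  // create lazily on first use
        }
        return receiver;
    }

    void invalidate() {
        receiver = null;  // call when an authentication exception occurs
    }
}
```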





[jira] [Resolved] (MINIFICPP-2199) The windows build script should default to 64-bit

2024-03-11 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi resolved MINIFICPP-2199.
--
Resolution: Fixed

Currently it defaults to 64-bit and there is an override option for the 32-bit 
version; the Ninja generator also detects whether it is running in an x86 or x64 
developer prompt and uses that.

> The windows build script should default to 64-bit
> -
>
> Key: MINIFICPP-2199
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2199
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Ferenc Gerlits
>Priority: Minor
>
> The {{win_build_vs.bat}} script defaults to building a 32-bit binary, and one 
> has to add the {{/64}} option to build a 64-bit binary.
> If possible, it would be good if the script could detect if it's running in 
> an "x86 Native Tools Command Prompt" or an "x64 Native Tools Command Prompt" 
> and set the correct bitness automatically.
> If automatic detection fails, I think we should still default to a 64-bit 
> build, and have a {{/32}} option to override this.
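
The detection described in the resolution above can be illustrated with a small 
sketch: Visual Studio developer prompts export the `VSCMD_ARG_TGT_ARCH` 
environment variable (`x86` or `x64`), which can be consulted before falling 
back to a 64-bit default. This is only an illustration of the idea; the actual 
`win_build_vs.bat` logic may differ:

```python
import os


def detect_build_arch(env=None, override=None):
    """Pick the build bitness: an explicit override wins, then the developer
    prompt's target architecture, then a 64-bit default."""
    env = os.environ if env is None else env
    if override in ("x86", "x64"):
        return override
    prompt_arch = env.get("VSCMD_ARG_TGT_ARCH", "").lower()
    if prompt_arch in ("x86", "x64"):
        return prompt_arch
    return "x64"  # default to 64-bit when detection fails
```

With this precedence, running from an "x86 Native Tools Command Prompt" yields 
a 32-bit build automatically, while a plain shell still defaults to 64-bit.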



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread David Handermann (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825320#comment-17825320
 ] 

David Handermann commented on NIFI-12883:
-

[~dstiegli1] On a cursory review, this appears to be expected behavior for Avro 
Schema Name validation.

The Avro specification has stringent requirements for Schema names:

https://avro.apache.org/docs/1.11.1/specification/#names
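
Per that specification, each name segment must match `[A-Za-z_][A-Za-z0-9_]*`, 
which is why a JSON key containing spaces (such as "JSON Schema Version") fails 
validation during schema inference. As a workaround sketch (not a NiFi 
feature), offending keys could be checked and sanitized before inference:

```python
import re

# Avro name segments must start with a letter or underscore and contain
# only letters, digits, and underscores.
AVRO_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")


def is_valid_avro_name(name):
    """True when the given string is a legal Avro name segment."""
    return bool(AVRO_NAME.match(name))


def sanitize_key(key):
    """Replace characters Avro rejects with underscores, prefixing an
    underscore when the result would start with a digit."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", key)
    if re.match(r"^[0-9]", cleaned):
        cleaned = "_" + cleaned
    return cleaned
```

Applying such a rename to the ControllerServiceEntity JSON before ConvertRecord 
would sidestep the SchemaParseException, at the cost of changing the field 
names in the output.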

> JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity
> --
>
> Key: NIFI-12883
> URL: https://issues.apache.org/jira/browse/NIFI-12883
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Daniel Stieglitz
>Priority: Major
> Attachments: ControllerServiceEntity.json
>
>
>  When trying to use the ConvertRecord processor configured with a 
> JsonTreeReader whose Schema Access Strategy is set to Infer Schema, and 
> configured with a JsonRecordSetWriter to write the schema out, in order to 
> read and write a ControllerServiceEntity JSON object returned by the NiFi 
> REST API and produce the Avro schema, I got the stacktrace below. I have 
> attached the ControllerServiceEntity JSON which I tried to convert.
> {code:java}
> 2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
> o.a.n.processors.standard.ConvertRecord
> ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
> StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
> section=1], offset=34287,
> length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
> will route to failure
> org.apache.avro.SchemaParseException: Illegal character in:
> component_descriptors_JSON Schema Version_allowableValues_allowableValueType
> at org.apache.avro.Schema.validateName(Schema.java:1625)
> at org.apache.avro.Schema.access$400(Schema.java:94)
> at org.apache.avro.Schema$Name.(Schema.java:713)
> at org.apache.avro.Schema.createRecord(Schema.java:226)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
> at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
> at
> org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
> at
> java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
> at
> com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
> at
> com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
> at
> com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)
> at
> org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:53)
> at
> org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:151)
> at
> org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
> at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
> at
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
> at
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base/java.lang.reflect.Method.invoke(Method.java:568)
> at
> org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
> at
> 

Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1519730551


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/PutOpenSearchVector.py:
##
@@ -0,0 +1,245 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from langchain.vectorstores import OpenSearchVectorSearch
+from nifiapi.flowfiletransform import FlowFileTransform, 
FlowFileTransformResult
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from OpenSearchVectorUtils import (L2, L1, LINF, COSINESIMIL, OPENAI_API_KEY, 
OPENAI_API_MODEL, HUGGING_FACE_API_KEY,
+   HUGGING_FACE_MODEL,HTTP_HOST, USERNAME, 
PASSWORD, INDEX_NAME, VECTOR_FIELD,

Review Comment:
   There is a whitespace missing after `HUGGING_FACE_MODEL`



##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/PutOpenSearchVector.py:
##
@@ -0,0 +1,245 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from langchain.vectorstores import OpenSearchVectorSearch
+from nifiapi.flowfiletransform import FlowFileTransform, 
FlowFileTransformResult
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from OpenSearchVectorUtils import (L2, L1, LINF, COSINESIMIL, OPENAI_API_KEY, 
OPENAI_API_MODEL, HUGGING_FACE_API_KEY,
+   HUGGING_FACE_MODEL,HTTP_HOST, USERNAME, 
PASSWORD, INDEX_NAME, VECTOR_FIELD,
+   TEXT_FIELD, create_authentication_params, 
parse_documents)
+from EmbeddingUtils import EMBEDDING_MODEL, create_embedding_service
+from nifiapi.documentation import use_case, multi_processor_use_case, 
ProcessorConfiguration

Review Comment:
   `multi_processor_use_case` and `ProcessorConfiguration` are unused imports 
that can be removed
   



##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/PutOpenSearchVector.py:
##
@@ -0,0 +1,245 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from langchain.vectorstores import OpenSearchVectorSearch
+from nifiapi.flowfiletransform import FlowFileTransform, 
FlowFileTransformResult
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from OpenSearchVectorUtils import (L2, L1, LINF, COSINESIMIL, OPENAI_API_KEY, 
OPENAI_API_MODEL, HUGGING_FACE_API_KEY,
+   HUGGING_FACE_MODEL,HTTP_HOST, USERNAME, 
PASSWORD, INDEX_NAME, VECTOR_FIELD,
+   TEXT_FIELD, create_authentication_params, 
parse_documents)
+from EmbeddingUtils import 

[jira] [Updated] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Stieglitz updated NIFI-12883:

Description: 
 When trying to use the ConvertRecord processor configured with a 
JsonTreeReader whose Schema Access Strategy is set to Infer Schema, and 
configured with a JsonRecordSetWriter to write the schema out, in order to 
read and write a ControllerServiceEntity JSON object returned by the NiFi 
REST API and produce the Avro schema, I got the stacktrace below. I have 
attached the ControllerServiceEntity JSON which I tried to convert. 

{code:java}
2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
o.a.n.processors.standard.ConvertRecord
ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2, ...];
will route to failure
org.apache.avro.SchemaParseException: Illegal character in:
component_descriptors_JSON Schema Version_allowableValues_allowableValueType
...

[jira] [Updated] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Stieglitz updated NIFI-12883:

Description: 
 When trying to use the ConvertRecord processor configured with a 
JsonTreeReader whose Schema
Access Strategy is to Infer schema and configured with a JsonRecordSetWriter to 
write the
schema out in order to read and write a ControllerServiceEntity object returned 
by the NIFI Rest API and produce the Avro schema, I got the following stacktrace
below. I have attached the ControllerServiceEntity JSON which I tried to
convert. 

{code:java}
2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
o.a.n.processors.standard.ConvertRecord
ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2, ...];
will route to failure
org.apache.avro.SchemaParseException: Illegal character in:
component_descriptors_JSON Schema Version_allowableValues_allowableValueType
...

[jira] [Updated] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Stieglitz updated NIFI-12883:

Description: 
 When trying to use a JsonTreeReader with Schema
Access Strategy set to Infer schema and a JsonRecordSetWriter to write the
schema out in an instance of the ConvertRecord processor to read a 
ControllerServiceEntity object returned by the NIFI Rest API and produce the 
Avro schema, I got the following stacktrace
below. I have attached the ControllerServiceEntity JSON which I tried to
convert. 

{code:java}
2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
o.a.n.processors.standard.ConvertRecord
ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2, ...];
will route to failure
org.apache.avro.SchemaParseException: Illegal character in:
component_descriptors_JSON Schema Version_allowableValues_allowableValueType
...

[PR] NIFI-12884: Corrected documentation for python debugging [nifi]

2024-03-11 Thread via GitHub


mark-bathori opened a new pull request, #8490:
URL: https://github.com/apache/nifi/pull/8490

   # Summary
   
   [NIFI-12884](https://issues.apache.org/jira/browse/NIFI-12884)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [x] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [x] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [x] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [x] Pull Request based on current revision of the `main` branch
   - [x] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [x] Build completed using `mvn clean install -P contrib-check`
 - [x] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (NIFI-12884) Corrected documentation for python debugging

2024-03-11 Thread Mark Bathori (Jira)
Mark Bathori created NIFI-12884:
---

 Summary: Corrected documentation for python debugging
 Key: NIFI-12884
 URL: https://issues.apache.org/jira/browse/NIFI-12884
 Project: Apache NiFi
  Issue Type: Bug
Reporter: Mark Bathori
Assignee: Mark Bathori


Correct a config property name and a minor typo in the Python debugging 
documentation.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Stieglitz updated NIFI-12883:

Description: 
I wasn't sure how to write an Avro schema for a ControllerServiceEntity object 
returned by the NIFI Rest API. I tried to use a JsonTreeReader with Schema
Access Strategy set to Infer schema and a JsonRecordSetWriter to write the
schema out in an instance of the ConvertRecord processor, but I got the 
following stacktrace
below. I have attached the ControllerServiceEntity JSON which I tried to
convert. 

{code:java}
2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
o.a.n.processors.standard.ConvertRecord
ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2, ...];
will route to failure
org.apache.avro.SchemaParseException: Illegal character in:
component_descriptors_JSON Schema Version_allowableValues_allowableValueType
...

[jira] [Created] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)
Daniel Stieglitz created NIFI-12883:
---

 Summary: JsonTreeReader not able to infer schema from JSON of a 
ControllerServiceEntity
 Key: NIFI-12883
 URL: https://issues.apache.org/jira/browse/NIFI-12883
 Project: Apache NiFi
  Issue Type: Bug
Reporter: Daniel Stieglitz
 Attachments: ControllerServiceEntity.json

I wasn't sure how to write an Avro schema for a ControllerServiceEntity object 
returned by the NiFi REST API, so I tried to use a JsonTreeReader with the Schema
Access Strategy set to Infer Schema and a JsonRecordSetWriter to write the
schema out in an instance of the ConvertRecord processor. When trying to
convert the ControllerServiceEntity JSON, I got the stacktrace below. I have
attached the ControllerServiceEntity JSON that I tried to convert. 

{code:java}
2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
o.a.n.processors.standard.ConvertRecord
ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
[resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
section=1], offset=34287,
length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
will route to failure
org.apache.avro.SchemaParseException: Illegal character in:
component_descriptors_JSON Schema Version_allowableValues_allowableValueType
at org.apache.avro.Schema.validateName(Schema.java:1625)
at org.apache.avro.Schema.access$400(Schema.java:94)
at org.apache.avro.Schema$Name.(Schema.java:713)
at org.apache.avro.Schema.createRecord(Schema.java:226)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
at org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
at org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)
at org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:53)
at org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:151)
at org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.access$100(StandardControllerServiceInvocationHandler.java:38)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler$ProxiedReturnObjectInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:240)
at jdk.proxy18/jdk.proxy18.$Proxy180.write(Unknown Source)
at org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:153)
at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3432)
at

[jira] [Updated] (NIFI-12883) JsonTreeReader not able to infer schema from JSON of a ControllerServiceEntity

2024-03-11 Thread Daniel Stieglitz (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12883?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daniel Stieglitz updated NIFI-12883:

Description: 
I wasn't sure how to write an Avro schema for a ControllerServiceEntity object 
returned by the NiFi REST API, so I tried to use a JsonTreeReader with the Schema
Access Strategy set to Infer Schema and a JsonRecordSetWriter to write the
schema out in an instance of the ConvertRecord processor. When trying to
convert the ControllerServiceEntity JSON, I got the stacktrace below. I have
attached the ControllerServiceEntity JSON that I tried to convert. 
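The SchemaParseException in the stacktrace comes from Avro's naming rules: record and field names must start with a letter or underscore and may contain only letters, digits, and underscores, so an inferred field name containing spaces (such as "JSON Schema Version") is rejected. The sketch below illustrates that rule; the regex is my paraphrase of Avro's documented naming constraint, and `AvroNameCheck`/`isValidAvroName` are hypothetical names, not NiFi or Avro code.

```java
import java.util.regex.Pattern;

public class AvroNameCheck {
    // Avro names must start with [A-Za-z_] and contain only [A-Za-z0-9_];
    // this regex is a paraphrase of Avro's documented naming rule.
    private static final Pattern AVRO_NAME = Pattern.compile("[A-Za-z_][A-Za-z0-9_]*");

    static boolean isValidAvroName(String name) {
        return AVRO_NAME.matcher(name).matches();
    }

    public static void main(String[] args) {
        // The inferred field name from the stacktrace contains spaces, so it is rejected:
        System.out.println(isValidAvroName(
                "component_descriptors_JSON Schema Version_allowableValues_allowableValueType")); // false
        System.out.println(isValidAvroName("allowableValueType")); // true
    }
}
```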

{code:java}
2024-03-08 17:41:26,198 ERROR [Timer-Driven Process Thread-2]
o.a.n.processors.standard.ConvertRecord
ConvertRecord[id=1f213eb6-018e-1000-e76a-b9ac6041ed48] Failed to process
StandardFlowFileRecord[uuid=45aa31af-0850-42be-9f9e-05001acbf8f2,claim=StandardContentClaim
[resourceClaim=StandardResourceClaim[id=1709914518257-1, container=default,
section=1], offset=34287,
length=14358],offset=0,name=sampleAfterDisablingStandardJsonSchemaRegistry.json,size=14358];
will route to failure
org.apache.avro.SchemaParseException: Illegal character in:
component_descriptors_JSON Schema Version_allowableValues_allowableValueType
at org.apache.avro.Schema.validateName(Schema.java:1625)
at org.apache.avro.Schema.access$400(Schema.java:94)
at org.apache.avro.Schema$Name.(Schema.java:713)
at org.apache.avro.Schema.createRecord(Schema.java:226)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:287)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:211)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:284)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroField(AvroTypeUtil.java:130)
at org.apache.nifi.avro.AvroTypeUtil.buildAvroSchema(AvroTypeUtil.java:122)
at org.apache.nifi.avro.AvroTypeUtil.extractAvroSchema(AvroTypeUtil.java:102)
at org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.lambda$new$0(WriteAvroSchemaAttributeStrategy.java:36)
at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:145)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2404)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2387)
at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:56)
at org.apache.nifi.schema.access.WriteAvroSchemaAttributeStrategy.getAttributes(WriteAvroSchemaAttributeStrategy.java:53)
at org.apache.nifi.json.WriteJsonResult.writeRecord(WriteJsonResult.java:151)
at org.apache.nifi.serialization.AbstractRecordSetWriter.write(AbstractRecordSetWriter.java:59)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:254)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.access$100(StandardControllerServiceInvocationHandler.java:38)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler$ProxiedReturnObjectInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:240)
at jdk.proxy18/jdk.proxy18.$Proxy180.write(Unknown Source)
at org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:153)
at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:3432)
at org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1361)
at

[jira] [Created] (NIFI-12882) Allow control over unexpected failure behavior in processors

2024-03-11 Thread saarbs (Jira)
saarbs created NIFI-12882:
-

 Summary: Allow control over unexpected failure behavior in 
processors
 Key: NIFI-12882
 URL: https://issues.apache.org/jira/browse/NIFI-12882
 Project: Apache NiFi
  Issue Type: Bug
  Components: Core Framework
Reporter: saarbs


Recently we had a problem with a flow using UpdateAttribute to perform 
base64decode, when one of our flow sources began sending invalid base64. This 
resulted in exceptions in the processor, causing rollbacks of the flowfiles and 
backpressure in the flow.



Since the processor has no failure relationship, there was no way to resolve 
the issue. We realized that many processors face the same problem: an 
unexpected failure can cause an infinite loop, while some flows would rather 
route the offending flowfiles to a failure relationship.

I suggest a setting or an "unexpected failure" relationship, which would keep 
the existing rollback behavior by default but would also allow terminating the 
flowfiles or handling the failure in other ways.
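As a sketch of the suggested behavior, the hypothetical helper below catches the decoding exception and reports a failure route instead of letting it propagate (which in NiFi would trigger a rollback). The names (`Base64RouteDemo`, `decodeAndRoute`, `Route`) are illustrative, not NiFi API.

```java
import java.util.Base64;

public class Base64RouteDemo {
    enum Route { SUCCESS, FAILURE }

    // Hypothetical helper (not NiFi API): decode and report a failure route
    // instead of letting the exception propagate and roll back the flowfile.
    static Route decodeAndRoute(String input, byte[][] out) {
        try {
            out[0] = Base64.getDecoder().decode(input);
            return Route.SUCCESS;
        } catch (IllegalArgumentException e) {  // invalid base64 input
            return Route.FAILURE;
        }
    }

    public static void main(String[] args) {
        byte[][] out = new byte[1][];
        System.out.println(decodeAndRoute("aGVsbG8=", out));     // SUCCESS
        System.out.println(decodeAndRoute("not base64!!", out)); // FAILURE
    }
}
```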



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] MINIFICPP-2277 Add virtualenv support for python processors [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #1721:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1721#discussion_r1519685441


##
extensions/python/PythonConfigState.h:
##
@@ -0,0 +1,50 @@
+/**
+ *
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+#pragma once
+
+#include 
+#include 
+
+namespace org::apache::nifi::minifi::extensions::python {
+
+struct PythonConfigState {
+ public:
+  PythonConfigState(PythonConfigState&&) = delete;
+  PythonConfigState(const PythonConfigState&) = delete;
+  PythonConfigState& operator=(PythonConfigState&&) = delete;
+  PythonConfigState& operator=(const PythonConfigState&) = delete;
+
+  bool isPackageInstallationNeeded() const {
+    return install_python_packages_automatically && !virtualenv_path.empty();
+  }
+
+  static PythonConfigState& getInstance() {
+    static PythonConfigState config;
+    return config;
+  }

Review Comment:
   It makes sense in the context of this PR. I'm not sure what the original 
intention of the `PythonScriptEngine::initialize` static function was, as it was 
previously called with an empty body from the `PythonCreator`; my thought 
was that it was supposed to implement the global Python environment specific 
initialization, but it can be moved to the `PythonCreator` or a separate class.
   
   The problem is that the PR depending on this one, 
https://github.com/apache/nifi-minifi-cpp/pull/1727, introduces support 
for inline Python dependency installation, which is done separately for every 
NiFi Python processor initialization called from the 
`ExecutePythonProcessor`. We also only want to initialize the processors that 
are part of the MiNiFi flow config in the future. The inline Python dependency 
installation needs the information stored in the `PythonConfigState`, which is 
why it needs to be reachable from outside the `PythonCreator`.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-11259 - Kafka processor refactor [nifi]

2024-03-11 Thread via GitHub


taz1988 commented on code in PR #8463:
URL: https://github.com/apache/nifi/pull/8463#discussion_r1517541500


##
nifi-nar-bundles/nifi-kafka-bundle/nifi-kafka-processors/src/main/java/org/apache/nifi/kafka/processors/PublishKafka.java:
##
@@ -0,0 +1,527 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.kafka.processors;
+
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.ConfigVerificationResult;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.kafka.processors.producer.common.PublishKafkaUtil;
+import org.apache.nifi.kafka.processors.producer.config.DeliveryGuarantee;
+import org.apache.nifi.kafka.processors.producer.convert.DelimitedStreamKafkaRecordConverter;
+import org.apache.nifi.kafka.processors.producer.convert.FlowFileStreamKafkaRecordConverter;
+import org.apache.nifi.kafka.processors.producer.convert.KafkaRecordConverter;
+import org.apache.nifi.kafka.processors.producer.convert.RecordStreamKafkaRecordConverter;
+import org.apache.nifi.kafka.processors.producer.convert.RecordWrapperStreamKafkaRecordConverter;
+import org.apache.nifi.kafka.processors.producer.header.AttributesHeadersFactory;
+import org.apache.nifi.kafka.processors.producer.header.HeadersFactory;
+import org.apache.nifi.kafka.processors.producer.key.AttributeKeyFactory;
+import org.apache.nifi.kafka.processors.producer.key.KeyFactory;
+import org.apache.nifi.kafka.processors.producer.key.MessageKeyFactory;
+import org.apache.nifi.kafka.processors.producer.wrapper.RecordMetadataStrategy;
+import org.apache.nifi.kafka.service.api.KafkaConnectionService;
+import org.apache.nifi.kafka.service.api.common.PartitionState;
+import org.apache.nifi.kafka.service.api.producer.FlowFileResult;
+import org.apache.nifi.kafka.service.api.producer.KafkaProducerService;
+import org.apache.nifi.kafka.service.api.producer.ProducerConfiguration;
+import org.apache.nifi.kafka.service.api.producer.PublishContext;
+import org.apache.nifi.kafka.service.api.producer.RecordSummary;
+import org.apache.nifi.kafka.service.api.record.KafkaRecord;
+import org.apache.nifi.kafka.shared.attribute.KafkaFlowFileAttribute;
+import org.apache.nifi.kafka.shared.component.KafkaPublishComponent;
+import org.apache.nifi.kafka.shared.property.FailureStrategy;
+import org.apache.nifi.kafka.shared.property.KeyEncoding;
+import org.apache.nifi.kafka.shared.property.PublishStrategy;
+import org.apache.nifi.kafka.shared.transaction.TransactionIdSupplier;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.VerifiableProcessor;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+
+import java.io.BufferedInputStream;
+import java.io.InputStream;
+import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Map;
+import java.util.Objects;
+import java.util.Set;
+import java.util.function.Supplier;
+import java.util.regex.Pattern;
+
+@Tags({"kafka", "producer", "record"})
+public class PublishKafka extends AbstractProcessor implements KafkaPublishComponent, VerifiableProcessor {

Review Comment:
   The InputRequirement, CapabilityDescription, and WritesAttribute annotations 
are missing.



##

[jira] [Resolved] (NIFI-12840) Expose REMOTE_POLL_BATCH_SIZE property for ListSFTP

2024-03-11 Thread Pierre Villard (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pierre Villard resolved NIFI-12840.
---
Fix Version/s: 1.26.0
   Resolution: Fixed

> Expose REMOTE_POLL_BATCH_SIZE property for ListSFTP
> ---
>
> Key: NIFI-12840
> URL: https://issues.apache.org/jira/browse/NIFI-12840
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.25.0
>Reporter: endzeit
>Assignee: endzeit
>Priority: Major
> Fix For: 1.26.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> Backport changes from NIFI-12772 into the _support/nifi-1.x_ branch by 
> exposing the property {{REMOTE_POLL_BATCH_SIZE}} for {{ListSFTP}}.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12840 Expose REMOTE_POLL_BATCH_SIZE property for ListSFTP (backport for 1.x) [nifi]

2024-03-11 Thread via GitHub


pvillard31 closed pull request #8448: NIFI-12840 Expose REMOTE_POLL_BATCH_SIZE 
property for ListSFTP (backport for 1.x)
URL: https://github.com/apache/nifi/pull/8448


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Commented] (NIFI-12840) Expose REMOTE_POLL_BATCH_SIZE property for ListSFTP

2024-03-11 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12840?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17825288#comment-17825288
 ] 

ASF subversion and git services commented on NIFI-12840:


Commit a51641f37c03eb6cf1a50b9b6987d2d2b7edd220 in nifi's branch 
refs/heads/support/nifi-1.x from EndzeitBegins
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=a51641f37c ]

NIFI-12840 Expose REMOTE_POLL_BATCH_SIZE property for ListSFTP

Signed-off-by: Pierre Villard 

This closes #8448.


> Expose REMOTE_POLL_BATCH_SIZE property for ListSFTP
> ---
>
> Key: NIFI-12840
> URL: https://issues.apache.org/jira/browse/NIFI-12840
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.25.0
>Reporter: endzeit
>Assignee: endzeit
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Backport changes from NIFI-12772 into the _support/nifi-1.x_ branch by 
> exposing the property {{REMOTE_POLL_BATCH_SIZE}} for {{ListSFTP}}.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] MINIFICPP-2277 Add virtualenv support for python processors [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #1721:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1721#discussion_r1519636902


##
extensions/python/PythonScriptEngine.cpp:
##
@@ -68,6 +67,73 @@ void initThreads() {
 #pragma warning(pop)
 #endif
 }
+
+std::string encapsulateCommandInQuotesIfNeeded(const std::string& command) {
+#if WIN32
+  return "\"" + command + "\"";
+#else
+  return command;
+#endif
+}
+
+std::vector getRequirementsFilePaths(const std::shared_ptr ) {
+  std::vector paths;
+  if (auto python_processor_path = configuration->get(minifi::Configuration::nifi_python_processor_dir)) {
+    for (const auto& entry : std::filesystem::recursive_directory_iterator(std::filesystem::path{*python_processor_path})) {
+      if (std::filesystem::is_regular_file(entry.path()) && entry.path().filename() == "requirements.txt") {
+        paths.push_back(entry.path());
+      }
+    }
+  }
+  return paths;
+}
+
+std::string getPythonBinary(const std::shared_ptr ) {
+#if WIN32
+  std::string python_binary = "python";
+#else
+  std::string python_binary = "python3";
+#endif
+  if (auto binary = configuration->get(minifi::Configuration::nifi_python_env_setup_binary)) {
+    python_binary = *binary;
+  }
+  return python_binary;
+}
+
+void createVirtualEnvIfSpecified(const std::shared_ptr ) {
+  if (auto path = configuration->get(minifi::Configuration::nifi_python_virtualenv_directory)) {
+    PythonConfigState::getInstance().virtualenv_path = *path;
+    if (!std::filesystem::exists(PythonConfigState::getInstance().virtualenv_path) || !std::filesystem::is_empty(PythonConfigState::getInstance().virtualenv_path)) {
+      auto venv_command = "\"" + PythonConfigState::getInstance().python_binary + "\" -m venv \"" + PythonConfigState::getInstance().virtualenv_path.string() + "\"";
+      auto return_value = std::system(encapsulateCommandInQuotesIfNeeded(venv_command).c_str());
+      if (return_value != 0) {
+        throw PythonScriptException(fmt::format("The following command creating python virtual env failed: '{}'", venv_command));
+      }
+    }
+  }
+}
+
+void installPythonPackagesIfRequested(const std::shared_ptr , const std::shared_ptr& logger) {
+  std::string automatic_install_str;
+  if (!PythonConfigState::getInstance().isPackageInstallationNeeded()) {
+    return;
+  }
+  auto requirement_file_paths = getRequirementsFilePaths(configuration);
+  for (const auto& requirements_file_path : requirement_file_paths) {
+    logger->log_info("Installing python packages from the following requirements.txt file: {}", requirements_file_path.string());
+    std::string pip_command;
+#if WIN32
+    pip_command.append("\"").append((PythonConfigState::getInstance().virtualenv_path / "Scripts" / "activate.bat").string()).append("\" && ");
+#else
+    pip_command.append(". \"").append((PythonConfigState::getInstance().virtualenv_path / "bin" / "activate").string()).append("\" && ");
+#endif
+    pip_command.append("\"").append(PythonConfigState::getInstance().python_binary).append("\" -m pip install --no-cache-dir -r \"").append(requirements_file_path.string()).append("\"");
+    auto return_value = std::system(encapsulateCommandInQuotesIfNeeded(pip_command).c_str());

Review Comment:
   Added comment in c8e483279209a6d6a843e0f343876ff9adb97c93



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2203 Add support for building Windows MSI without any redistributables included [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #1734:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1734#discussion_r1519529421


##
cmake/MiNiFiOptions.cmake:
##
@@ -82,6 +80,8 @@ list(APPEND STRICT_GSL_CHECKS_Values AUDIT ON DEBUG_ONLY OFF)
 set_property(CACHE STRICT_GSL_CHECKS PROPERTY STRINGS 
${STRICT_GSL_CHECKS_Values})
 
 if (WIN32)
+add_minifi_option(INSTALLER_MERGE_MODULES "Creates installer with merge 
modules" OFF)
+add_minifi_option(INSTALLER_WITH_VC_REDISTRIBUTABLES "Creates installer 
with Visual C++ redistributables included" OFF)

Review Comment:
   Updated the flag names and descriptions in 
b2da5ed65397525433093fa654edfe0d7836ca13, also making them mutually exclusive.



##
cmake/MiNiFiOptions.cmake:
##
@@ -82,6 +80,8 @@ list(APPEND STRICT_GSL_CHECKS_Values AUDIT ON DEBUG_ONLY OFF)
 set_property(CACHE STRICT_GSL_CHECKS PROPERTY STRINGS 
${STRICT_GSL_CHECKS_Values})
 
 if (WIN32)
+add_minifi_option(INSTALLER_MERGE_MODULES "Creates installer with merge 
modules" OFF)
+add_minifi_option(INSTALLER_WITH_VC_REDISTRIBUTABLES "Creates installer 
with Visual C++ redistributables included" OFF)

Review Comment:
   Updated in b2da5ed65397525433093fa654edfe0d7836ca13



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2277 Add virtualenv support for python processors [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


szaszm commented on code in PR #1721:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1721#discussion_r1519487010


##
extensions/python/PythonConfigState.h:
##
@@ -0,0 +1,50 @@
+/**
+ *
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+#pragma once
+
+#include 
+#include 
+
+namespace org::apache::nifi::minifi::extensions::python {
+
+struct PythonConfigState {
+ public:
+  PythonConfigState(PythonConfigState&&) = delete;
+  PythonConfigState(const PythonConfigState&) = delete;
+  PythonConfigState& operator=(PythonConfigState&&) = delete;
+  PythonConfigState& operator=(const PythonConfigState&) = delete;
+
+  bool isPackageInstallationNeeded() const {
+    return install_python_packages_automatically && !virtualenv_path.empty();
+  }
+
+  static PythonConfigState& getInstance() {
+    static PythonConfigState config;
+    return config;
+  }

Review Comment:
   Since PythonScriptEngine is created on a per-processor basis, we can't let 
it handle the creation of the shared execution environment. That should be 
extracted to a separate class, which could be owned by e.g. PythonCreator, 
setting up the virtualenv and packages, so that individual PythonScriptEngines 
can be created in that environment. PythonConfigState could be part of this new 
class, and PythonCreator could inject the Configuration object into it. What do 
you think about this?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (MINIFICPP-2316) Expression language does not handle ${literal('\n')}

2024-03-11 Thread Ferenc Gerlits (Jira)
Ferenc Gerlits created MINIFICPP-2316:
-

 Summary: Expression language does not handle ${literal('\n')}
 Key: MINIFICPP-2316
 URL: https://issues.apache.org/jira/browse/MINIFICPP-2316
 Project: Apache NiFi MiNiFi C++
  Issue Type: Bug
Reporter: Ferenc Gerlits


If we put {{${literal('\n')}}} in a property value (which supports 
expression language) in NiFi, it results in a newline (0x0A) character. In MiNiFi, 
we get this error message:
{noformat}
[2024-03-11 10:48:37.164] [org::apache::nifi::minifi::core::Processor] 
[warning] Caught "1.13: syntax error, unexpected identifier, expecting \ or "'" 
or "\""" (St13runtime_error) during Processor::onTrigger of processor: 
932a0430-3655-4db2-9b4a-bb271ba67d53 (Append newline) 
(932a0430-3655-4db2-9b4a-bb271ba67d53)
[2024-03-11 10:48:37.164] [org::apache::nifi::minifi::core::ProcessSession] 
[info] Penalizing 7670c84e-df8c-11ee-9667-0242204dc695 for 3ms at Append 
newline
{noformat}
Note that the actual property value in {{config.yml}} (or in the exported NiFi 
config) is
{noformat}
${literal('\\n')}
{noformat}
because the backslash gets escaped.

If you manually edit the {{config.yml}} file and enter 
{{${literal('\n')}}}, that works, because we'll have a 0x0A character in 
the config, so MiNiFi sees
{noformat}
${literal('
')}
{noformat}
But if you use a tool to generate {{config.yml}}, it most likely will 
escape the backslash, so there is no way to have {{${literal('\n')}}} as the 
property value.

Please check the NiFi code to see the full list of special characters supported 
by NiFi, and make sure MiNiFi supports the same list.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (MINIFICPP-2315) Send asset id hashes in c2 heartbeat

2024-03-11 Thread Adam Debreceni (Jira)
Adam Debreceni created MINIFICPP-2315:
-

 Summary: Send asset id hashes in c2 heartbeat
 Key: MINIFICPP-2315
 URL: https://issues.apache.org/jira/browse/MINIFICPP-2315
 Project: Apache NiFi MiNiFi C++
  Issue Type: New Feature
Reporter: Adam Debreceni


The agent should calculate the hash of all asset ids and send it in each 
heartbeat, so that the c2 server can assess whether a sync operation is needed.





[jira] [Created] (MINIFICPP-2314) Introduce SYNC_ASSETS c2 command

2024-03-11 Thread Adam Debreceni (Jira)
Adam Debreceni created MINIFICPP-2314:
-

 Summary: Introduce SYNC_ASSETS c2 command
 Key: MINIFICPP-2314
 URL: https://issues.apache.org/jira/browse/MINIFICPP-2314
 Project: Apache NiFi MiNiFi C++
  Issue Type: New Feature
Reporter: Adam Debreceni
Assignee: Adam Debreceni


The c2 server sends a list of file paths, ids and download urls. The agent 
should calculate the diff and fetch files as needed.

 
{code:java}
{
  operation: heartbeat,
  requested_operations: [
    {
      operationid: 13,
      operation: sync,
      operand: asset,
      args: [
        {
          file: ,
          url: ,
          id: 
        },
        {
          file: ,
          url: ,
          id: 
        }
      ]
    }
  ]
}
{code}





Re: [PR] MINIFICPP-2313 Fix Grafana Loki issues on Windows [nifi-minifi-cpp]

2024-03-11 Thread via GitHub


lordgamez commented on code in PR #1742:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1742#discussion_r1519324535


##
extensions/grafana-loki/tests/CMakeLists.txt:
##
@@ -16,6 +16,8 @@
 # specific language governing permissions and limitations
 # under the License.
 #
+include(WholeArchive)
+

Review Comment:
   Good point, updated in fc172520e6c3dbab68d5f9d37a76761927b02ab2


