Re: [PR] NIFI-12875: Add unsuccessful HTTP status code handling logic [nifi]

2024-03-13 Thread via GitHub


Lehel44 commented on PR #8484:
URL: https://github.com/apache/nifi/pull/8484#issuecomment-1996559949

   Reviewing...


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12889: Retry Kerberos login on auth failure in HDFS processors [nifi]

2024-03-13 Thread via GitHub


Lehel44 commented on code in PR #8495:
URL: https://github.com/apache/nifi/pull/8495#discussion_r1524238959


##
nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java:
##
@@ -372,54 +370,56 @@ public Object run() {
 
     // Write FlowFile to temp file on HDFS
     final StopWatch stopWatch = new StopWatch(true);
-    session.read(putFlowFile, new InputStreamCallback() {
-
-        @Override
-        public void process(InputStream in) throws IOException {
-            OutputStream fos = null;
-            Path createdFile = null;
-            try {
-                if (conflictResponse.equals(APPEND_RESOLUTION) && destinationExists) {
-                    fos = hdfs.append(copyFile, bufferSize);
-                } else {
-                    final EnumSet<CreateFlag> cflags = EnumSet.of(CreateFlag.CREATE, CreateFlag.OVERWRITE);
-
-                    if (shouldIgnoreLocality(context, session)) {
-                        cflags.add(CreateFlag.IGNORE_CLIENT_LOCALITY);
-                    }
+    session.read(putFlowFile, in -> {
+        OutputStream fos = null;
+        Path createdFile = null;
+        try {
+            if (conflictResponse.equals(APPEND_RESOLUTION) && destinationExists) {
+                fos = hdfs.append(copyFile, bufferSize);
+            } else {
+                final EnumSet<CreateFlag> cflags = EnumSet.of(CreateFlag.CREATE, CreateFlag.OVERWRITE);
 
-                    fos = hdfs.create(actualCopyFile, FsCreateModes.applyUMask(FsPermission.getFileDefault(),
-                            FsPermission.getUMask(hdfs.getConf())), cflags, bufferSize, replication, blockSize,
-                            null, null);
+                if (shouldIgnoreLocality(context, session)) {
+                    cflags.add(CreateFlag.IGNORE_CLIENT_LOCALITY);
                 }
 
-                if (codec != null) {
-                    fos = codec.createOutputStream(fos);
+                fos = hdfs.create(actualCopyFile, FsCreateModes.applyUMask(FsPermission.getFileDefault(),
+                        FsPermission.getUMask(hdfs.getConf())), cflags, bufferSize, replication, blockSize,
+                        null, null);
+            }
+
+            if (codec != null) {
+                fos = codec.createOutputStream(fos);
+            }
+            createdFile = actualCopyFile;
+            BufferedInputStream bis = new BufferedInputStream(in);
+            StreamUtils.copy(bis, fos);
+            bis = null;
+            fos.flush();
+        } catch (IOException e) {
+            // Catch GSSExceptions and reset the resources
+            Optional<GSSException> causeOptional = findCause(e, GSSException.class, gsse -> GSSException.NO_CRED == gsse.getMajor());
+            if (causeOptional.isPresent()) {

Review Comment:
   Here we catch the IOException, but when the underlying cause is not a
GSSException with major code NO_CRED, the exception won't be handled and will
be silently absorbed by the catch block. The IOException should be rethrown in
the else branch.
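
   A minimal sketch of the suggested structure (the variable names and the `findCause` helper come from the quoted diff; `handleAuthFailure` is a hypothetical placeholder for whatever reset/retry logic the PR performs):
   ```java
   } catch (IOException e) {
       // Only treat this as a recoverable Kerberos credential problem when the root
       // cause really is a GSSException with major code NO_CRED.
       final Optional<GSSException> causeOptional =
               findCause(e, GSSException.class, gsse -> GSSException.NO_CRED == gsse.getMajor());
       if (causeOptional.isPresent()) {
           handleAuthFailure(causeOptional.get()); // hypothetical: reset resources and retry the login
       } else {
           throw e; // any other I/O failure must not be silently swallowed
       }
   }
   ```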



[PR] NIFI-12700: refactored PutKudu to optimize memory handling for AUTO_F… [nifi]

2024-03-13 Thread via GitHub


emiliosetiadarma opened a new pull request, #8501:
URL: https://github.com/apache/nifi/pull/8501

   …LUSH_SYNC flush mode (unbatched flush)

   # Summary
   
   [NIFI-12700](https://issues.apache.org/jira/browse/NIFI-12700)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [x] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI-12700) 
issue created
   
   ### Pull Request Tracking
   
   - [x] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [x] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [ ] Pull Request based on current revision of the `main` branch
   - [ ] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [x] Build completed using `mvn clean install -P contrib-check`
 - [x] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Updated] (NIFI-12894) Replace .nifi-button with mat-icon-button

2024-03-13 Thread Scott Aslan (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12894?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Scott Aslan updated NIFI-12894:
---
Status: Patch Available  (was: In Progress)

> Replace .nifi-button with mat-icon-button
> -
>
> Key: NIFI-12894
> URL: https://issues.apache.org/jira/browse/NIFI-12894
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (NIFI-12894) Replace .nifi-button with mat-icon-button

2024-03-13 Thread Scott Aslan (Jira)
Scott Aslan created NIFI-12894:
--

 Summary: Replace .nifi-button with mat-icon-button
 Key: NIFI-12894
 URL: https://issues.apache.org/jira/browse/NIFI-12894
 Project: Apache NiFi
  Issue Type: Sub-task
Reporter: Scott Aslan
Assignee: Scott Aslan






--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[PR] NIFI-12514: Add config python virtual env binary directory for windows. [nifi]

2024-03-13 Thread via GitHub


bobpaulin opened a new pull request, #8500:
URL: https://github.com/apache/nifi/pull/8500

   # Summary
   
   [NIFI-12514](https://issues.apache.org/jira/browse/NIFI-12514)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [ ] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [ ] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [ ] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [ ] Pull Request based on current revision of the `main` branch
   - [ ] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   Started NiFi on Windows with:
   
   nifi.python.command=python
   nifi.python.virtual.env.binary.directory=Scripts
   
   Started fine.  Ran a simple Python extension.
   
   
   ### Build
   
   - [ ] Build completed using `mvn clean install -P contrib-check`
 - [ ] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Updated] (NIFI-12822) BUG - light mode sorting arrows are too dark

2024-03-13 Thread Scott Aslan (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12822?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Scott Aslan updated NIFI-12822:
---
Status: Patch Available  (was: In Progress)

> BUG - light mode sorting arrows are too dark
> 
>
> Key: NIFI-12822
> URL: https://issues.apache.org/jira/browse/NIFI-12822
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Major
> Attachments: Screenshot 2024-02-20 at 9.50.36 AM.png
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (NIFI-12870) Refactor the usage of Material color theming to be semantic

2024-03-13 Thread Scott Aslan (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12870?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Scott Aslan resolved NIFI-12870.

Resolution: Fixed

> Refactor the usage of Material color theming to be semantic
> ---
>
> Key: NIFI-12870
> URL: https://issues.apache.org/jira/browse/NIFI-12870
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: James Elliott
>Assignee: James Elliott
>Priority: Major
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (NIFI-12870) Refactor the usage of Material color theming to be semantic

2024-03-13 Thread Scott Aslan (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12870?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Scott Aslan reassigned NIFI-12870:
--

Assignee: James Elliott

> Refactor the usage of Material color theming to be semantic
> ---
>
> Key: NIFI-12870
> URL: https://issues.apache.org/jira/browse/NIFI-12870
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: James Elliott
>Assignee: James Elliott
>Priority: Major
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[PR] [NIFI-12822] table, border, theming updates [nifi]

2024-03-13 Thread via GitHub


scottyaslan opened a new pull request, #8499:
URL: https://github.com/apache/nifi/pull/8499

   NIFI-12822 
   
   - listing table styles updated to include border and be more extensible
   - table header sort arrow color
   - border colors style override for tailwind
   - cdk drag styles now available application wide to ensure consistency
   - run prettier


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Commented] (NIFI-12870) Refactor the usage of Material color theming to be semantic

2024-03-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12870?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17826877#comment-17826877
 ] 

ASF subversion and git services commented on NIFI-12870:


Commit 06c011306356992ef6fa9fcb3b31c9ad658d7589 in nifi's branch 
refs/heads/main from James Mingardi-Elliott
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=06c0113063 ]

NIFI-12870 Semantic colors (#8480)

Next step to color theming

Update theming to reference colors semantically

Material and Canvas palettes are reordered so that in all cases they go from 50 
= lightest / least amount of color to 900 = darkest / most amount of color 
applied.

Usage of color has been changed so that Material's primary, accent, and warn 
values are used by semantic reference of 'default', 'lighter' and 'darker' 
rather than explicit number values.

The Canvas palettes still have values referenced directly because they are a 
special case.

Added SASS utilities:
- To help ensure color contrast for text and backgrounds by checking for a 
4.5:1 contrast ratio.
- To provide helper functions that somewhat replicate Material Design's approach 
to Surface and On Surface concepts. This is how the same Canvas palettes can be 
used for light and dark modes.

Some minor tweaks to the styling of the flow canvas to bring custom NiFi 
components and the Angular Material components closer together visually.

Moved the Canvas theme declaration to a separate file so the Material themes 
can be more easily swapped out without needing to redeclare the Canvas themes.

This closes #8480
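
As a rough illustration of the 4.5:1 check mentioned above: the utilities added in the PR are SCSS functions, but the underlying WCAG 2.x contrast-ratio calculation they rely on looks roughly like the following Java sketch (not taken from the NiFi code base):
```java
public final class ContrastCheck {

    // Linearize an 8-bit sRGB channel per WCAG 2.x.
    private static double linearize(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of an sRGB color.
    private static double luminance(int r, int g, int b) {
        return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
    }

    // Contrast ratio between two colors; normal text needs at least 4.5:1.
    static double contrastRatio(int[] rgb1, int[] rgb2) {
        double l1 = luminance(rgb1[0], rgb1[1], rgb1[2]);
        double l2 = luminance(rgb2[0], rgb2[1], rgb2[2]);
        return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
    }

    public static void main(String[] args) {
        // White text on the primary color #728e9b mentioned in the palette comments.
        double ratio = contrastRatio(new int[] {255, 255, 255}, new int[] {0x72, 0x8e, 0x9b});
        System.out.printf("contrast ratio: %.2f (needs >= 4.5 for normal text)%n", ratio);
    }
}
```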

> Refactor the usage of Material color theming to be semantic
> ---
>
> Key: NIFI-12870
> URL: https://issues.apache.org/jira/browse/NIFI-12870
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: James Elliott
>Priority: Major
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan merged PR #8480:
URL: https://github.com/apache/nifi/pull/8480


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1523939529


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -19,11 +19,15 @@
 // For more information: https://m2.material.io/design/color/the-color-system.html
 @use '@angular/material' as mat;
 
+// Define some variables that are re-used throughout the theme.
+$on-surface-dark: rgba(black, 0.87);
+$on-surface-light: #ffffff;
+
 // The $material-primary-light-palette define the PRIMARY AND ACCENT palettes for all Angular Material components used throughout Apache NiFi
 $material-primary-light-palette: (
 // 50 -> 900 are the PRIMARY colors (mat.define-palette($material-primary-light-palette, 300);) defined by https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors for primary color #728e9b
-50: rgba(249, 250, 251, 0.97), // .context-menu
-100: rgba(233, 239, 243, 1), // "lighter" hue for this palette. Also .global-menu:hover, .navigation-control-header:hover, .operation-control-header:hover, .new-canvas-item.icon.hovering, table tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, .process-group-details-banner, remote-process-group-details-banner, .remote-process-group-last-refresh-rect,

Review Comment:
   Let's leave the hex values and if we find a case where we need rgba we will 
revisit this conversation.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1523938033


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -49,355 +53,100 @@ $material-primary-light-palette: (
 // NOTE: When creating the Material palette definition 
mat.define-palette($material-primary-light-palette, 300);
 // Since 300, was set as the default the contrast-300 will be used as the 
default text color.
 contrast: (
-50: rgba(black, 0.87),
-100: rgba(black, 0.87),
-200: rgba(black, 0.87),
-300: #ff,
-400: #ff,
-500: #ff,
-600: #ff,
-700: #ff,
-800: #ff,
-900: #ff,
-A100: rgba(black, 0.87),
-A200: rgba(black, 0.87),
-A400: #ff,
-A700: #ff,
-)
-);
-
-// The $material-primary-dark-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
-$material-primary-dark-palette: (
-// 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-dark-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgb(30, 45, 54), // .context-menu
-100: rgba(32, 47, 54, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,
-200: #30444d, // .processor-stats-border, .process-group-stats-border, 
.context-menu-item:hover, .process-group-banner, .remote-process-group-banner, 
.a, button.nifi-button, button.nifi-button:disabled
-300: #3e5762, // .breadcrumb-container, .navigation-control, 
.operation-control, .flow-status, .controller-bulletins, 
.component-button-grip, .search-container, .nifi-navigation, .CodeMirror.blank
-400: #4d6b78, // Default hue for this palette (color="primary").
-500: #587a89, // .disabled, .not-transmitting, .splash, 
.access-policies-header, .operation-context-type, .bulletin-board-header, 
.counter-header, .stats-info, .active-thread-count-icon, .processor-type, 
.port-transmission-icon, .operation-context-type, .flow-status.fa, 
.flow-status.icon, .controller-bulletins, .prioritizers-list, 
.controller-services-header, .login-title, .parameter-context-header, 
.parameter-context-inheritance-list, .provenance-header, .flowfile-header, 
.queue-listing-header, .settings-header, .summary-header, .user-header, table 
th, button.global-menu-item.fa, button.global-menu-item.icon, .event-header, 
.section-header,
-600: #718d9a, // .breadcrumb-container, .birdseye-brush
-700: #8aa2ad, // "darker" hue for this palette. Also 
#status-history-chart-container text, #status-history-chart-control-container 
text
-800: #abbcc5,
-900: #abbcc5,
-
-// A100 -> A700 are the ACCENT color 
(mat.define-palette($material-primary-dark-palette, A400, A100, A700);). These 
color are the ANALOGOUS (or possibly the TRIADIC??) colors as defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-// These colors are also used by some custom canvas components that 
display the ANALOGOUS color for things like buttons, links, borders, info, etc.
-A100: #aabec7, // .zero
-A200: #44a3cf, // .enabled, .transmitting, .load-balance-icon-active
-A400: #009b9d, // a, a:hover, button.nifi-button, 
button.nifi-button:hover, .add-tenant-to-policy-form.fa, .component.selected 
rect.border, .add-connect, .remote-process-group-uri, 
.remote-process-group-transmission-secure, .navigation-control.fa, 
.operation-control.fa, .new-canvas-item.icon, .upload-flow-definition, 
.lineage-controls.fa, .event circle.context, .nifi-navigation.icon, 
.listing-table.fa, .listing-table.icon, .context-menu
-A700: #2cd5d5,//rgba(139, 208, 229, 1),//#aabec7 // .hint-pattern
-
-// These are the $material-primary-dark-palette PRIMARY AND ACCENT 
contrast colors. These color do not really get defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors.
-// Instead if we look to the Angular Material provided palettes we see 
that these fields are typically rgba(black, 0.87) or white. These values are 
particularly important
-// for light mode and dark mode as these values set the colors for the 
text when displayed against the primary background on a button, badge, chip, 
etc.
-//
-// NOTE: Care should be taken here to ensure the values meet accessibility 
standards.
-//
-// NOTE: When creating the Material palette definition 

Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1523936096


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/styles.scss:
##
@@ -187,149 +192,146 @@ $appFontPath: '~roboto-fontface/fonts';
 $canvas-accent-palette: map.get($canvas-color-config, 'accent');
 
 // Get hues from palette
-$primary-palette-50: mat.get-color-from-palette($primary-palette, 50);
-$primary-palette-200: mat.get-color-from-palette($primary-palette, 200);
-$primary-palette-500: mat.get-color-from-palette($primary-palette, 500);
-$accent-palette-A100: mat.get-color-from-palette($accent-palette, 'A100');
-$accent-palette-A200: mat.get-color-from-palette($accent-palette, 'A200');
-$accent-palette-A400: mat.get-color-from-palette($accent-palette, 'A400');
-$canvas-primary-palette-50: 
mat.get-color-from-palette($canvas-primary-palette, 50);
-$canvas-primary-palette-200: 
mat.get-color-from-palette($canvas-primary-palette, 200);
+
+// Start with the canvas theme.
+$canvas-primary-palette-A200: 
mat.get-color-from-palette($canvas-primary-palette, A200);
 $canvas-primary-palette-400: 
mat.get-color-from-palette($canvas-primary-palette, 400);
-$canvas-primary-palette-900: 
mat.get-color-from-palette($canvas-primary-palette, 900);
-$canvas-accent-palette-200: 
mat.get-color-from-palette($canvas-accent-palette, 200);
-$canvas-accent-palette-400: 
mat.get-color-from-palette($canvas-accent-palette, 400);
-$canvas-accent-palette-A200: 
mat.get-color-from-palette($canvas-accent-palette, 'A200');
-$warn-palette-200: mat.get-color-from-palette($warn-palette, 200);
-$warn-palette-300: mat.get-color-from-palette($warn-palette, 300);
-$warn-palette-A100: mat.get-color-from-palette($warn-palette, 'A100');
-$warn-palette-A400: mat.get-color-from-palette($warn-palette, 'A400');
+$canvas-primary-palette-500: 
mat.get-color-from-palette($canvas-primary-palette, 500);
+$canvas-accent-palette-lighter: 
mat.get-color-from-palette($canvas-accent-palette, lighter);
+$canvas-accent-palette-default: 
mat.get-color-from-palette($canvas-accent-palette, default);
+
+$primary-palette-lighter: mat.get-color-from-palette($primary-palette, 
lighter);
+$primary-palette-default: mat.get-color-from-palette($primary-palette, 
'default');
+$primary-palette-A400: mat.get-color-from-palette($primary-palette, 
'A400');
+
+$accent-palette-default: mat.get-color-from-palette($accent-palette, 
'default');
+$accent-palette-lighter: mat.get-color-from-palette($accent-palette, 
'lighter');
+
+$warn-palette-lighter: mat.get-color-from-palette($warn-palette, lighter);
+$warn-palette-default: mat.get-color-from-palette($warn-palette, 
'default');
+
+// Alternative hue for warn colors.
+$warn-palette-A200: mat.get-color-from-palette($warn-palette, 'A200');
+
+$surface: utils.get-surface($canvas-color-config);
+$surface-darker: utils.get-surface($canvas-color-config, darker);
+$surface-highlight: utils.get-on-surface($canvas-color-config, highlight);
+$on-surface: utils.get-on-surface($canvas-color-config);
+$on-surface-lighter: utils.get-on-surface($canvas-color-config, lighter);
+
+* { // Tailwind sets a default that doesn't shift with light and dark 
themes
+border-color: $on-surface-lighter;
+}
 
 a {
-color: $accent-palette-A400;
-text-decoration-color: $primary-palette-200;
+color: utils.get-color-on-surface($color-config, $surface);
+text-decoration-color: $primary-palette-lighter;
 }
 
 a:hover {
-text-decoration-color: $accent-palette-A400;
+text-decoration-color: utils.get-color-on-surface($color-config, 
$surface);
 }
 
 .tooltip {
-background-color: $canvas-primary-palette-900;
-border-color: $canvas-primary-palette-200;
-box-shadow: 0 2px 5px $canvas-primary-palette-50;
-color: $canvas-primary-palette-200;
+background-color: $surface;
+border-color: $on-surface;
+box-shadow: 0 2px 5px $canvas-primary-palette-A200;
+color: $on-surface;
 }
 
 .property-editor {
-background-color: $canvas-primary-palette-900;
-box-shadow: 0 2px 5px $canvas-primary-palette-50;
+background-color: $surface;
+box-shadow: 0 2px 5px $canvas-primary-palette-A200;
 }
 
 .disabled {
-color: $primary-palette-500 !important;
-fill: $primary-palette-500 !important;
-text-shadow: 0 0 4px $canvas-primary-palette-900;
+color: $primary-palette-default !important;
+fill: $primary-palette-default !important;
 }
 
 .enabled {
-color: $accent-palette-A200 !important;
-fill: $accent-palette-A200 !important;
-text-shadow: 0 0 4px $canvas-primary-palette-900;
+color: 

Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1523936670


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/nifi.scss:
##
@@ -49,355 +53,100 @@ $material-primary-light-palette: (
 // NOTE: When creating the Material palette definition 
mat.define-palette($material-primary-light-palette, 300);
 // Since 300, was set as the default the contrast-300 will be used as the 
default text color.
 contrast: (
-50: rgba(black, 0.87),
-100: rgba(black, 0.87),
-200: rgba(black, 0.87),
-300: #ff,
-400: #ff,
-500: #ff,
-600: #ff,
-700: #ff,
-800: #ff,
-900: #ff,
-A100: rgba(black, 0.87),
-A200: rgba(black, 0.87),
-A400: #ff,
-A700: #ff,
-)
-);
-
-// The $material-primary-dark-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
-$material-primary-dark-palette: (
-// 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-dark-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgb(30, 45, 54), // .context-menu
-100: rgba(32, 47, 54, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,
-200: #30444d, // .processor-stats-border, .process-group-stats-border, 
.context-menu-item:hover, .process-group-banner, .remote-process-group-banner, 
.a, button.nifi-button, button.nifi-button:disabled
-300: #3e5762, // .breadcrumb-container, .navigation-control, 
.operation-control, .flow-status, .controller-bulletins, 
.component-button-grip, .search-container, .nifi-navigation, .CodeMirror.blank
-400: #4d6b78, // Default hue for this palette (color="primary").
-500: #587a89, // .disabled, .not-transmitting, .splash, 
.access-policies-header, .operation-context-type, .bulletin-board-header, 
.counter-header, .stats-info, .active-thread-count-icon, .processor-type, 
.port-transmission-icon, .operation-context-type, .flow-status.fa, 
.flow-status.icon, .controller-bulletins, .prioritizers-list, 
.controller-services-header, .login-title, .parameter-context-header, 
.parameter-context-inheritance-list, .provenance-header, .flowfile-header, 
.queue-listing-header, .settings-header, .summary-header, .user-header, table 
th, button.global-menu-item.fa, button.global-menu-item.icon, .event-header, 
.section-header,
-600: #718d9a, // .breadcrumb-container, .birdseye-brush
-700: #8aa2ad, // "darker" hue for this palette. Also 
#status-history-chart-container text, #status-history-chart-control-container 
text
-800: #abbcc5,
-900: #abbcc5,
-
-// A100 -> A700 are the ACCENT color 
(mat.define-palette($material-primary-dark-palette, A400, A100, A700);). These 
color are the ANALOGOUS (or possibly the TRIADIC??) colors as defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-// These colors are also used by some custom canvas components that 
display the ANALOGOUS color for things like buttons, links, borders, info, etc.
-A100: #aabec7, // .zero
-A200: #44a3cf, // .enabled, .transmitting, .load-balance-icon-active
-A400: #009b9d, // a, a:hover, button.nifi-button, 
button.nifi-button:hover, .add-tenant-to-policy-form.fa, .component.selected 
rect.border, .add-connect, .remote-process-group-uri, 
.remote-process-group-transmission-secure, .navigation-control.fa, 
.operation-control.fa, .new-canvas-item.icon, .upload-flow-definition, 
.lineage-controls.fa, .event circle.context, .nifi-navigation.icon, 
.listing-table.fa, .listing-table.icon, .context-menu
-A700: #2cd5d5,//rgba(139, 208, 229, 1),//#aabec7 // .hint-pattern
-
-// These are the $material-primary-dark-palette PRIMARY AND ACCENT 
contrast colors. These color do not really get defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors.
-// Instead if we look to the Angular Material provided palettes we see 
that these fields are typically rgba(black, 0.87) or white. These values are 
particularly important
-// for light mode and dark mode as these values set the colors for the 
text when displayed against the primary background on a button, badge, chip, 
etc.
-//
-// NOTE: Care should be taken here to ensure the values meet accessibility 
standards.
-//
-// NOTE: When creating the Material palette definition 

Re: [PR] NIFI-12700: refactored PutKudu to optimize memory handling for AUTO_F… [nifi]

2024-03-13 Thread via GitHub


mattyb149 commented on PR #8322:
URL: https://github.com/apache/nifi/pull/8322#issuecomment-1995644417

   Reviewing...


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Assigned] (NIFI-11449) add autocommit property to PutDatabaseRecord processor

2024-03-13 Thread Jim Steinebrey (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-11449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jim Steinebrey reassigned NIFI-11449:
-

Assignee: Jim Steinebrey

> add autocommit property to PutDatabaseRecord processor
> --
>
> Key: NIFI-11449
> URL: https://issues.apache.org/jira/browse/NIFI-11449
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.21.0
> Environment: Any Nifi Deployment
>Reporter: Abdelrahim Ahmad
>Assignee: Jim Steinebrey
>Priority: Blocker
>  Labels: Trino, autocommit, database, iceberg, putdatabaserecord
>
> The issue is with the {{PutDatabaseRecord}} processor in Apache NiFi. When 
> using the processor with the Trino JDBC driver or Dremio JDBC driver to write 
> to an Iceberg catalog, it disables the autocommit feature. This leads to 
> errors such as "{*}Catalog only supports writes using autocommit: iceberg{*}".
> An autocommit property needs to be added to the processor so that the feature 
> can be enabled or disabled.
> Enabling auto-commit in the NiFi PutDatabaseRecord processor is important for 
> Delta Lake, Iceberg, and Hudi, as it ensures data consistency and integrity by 
> allowing atomic writes to be performed in the underlying database. This will 
> also allow the processor to be used with a wider range of databases.
> _Improving this processor will allow NiFi to be the main tool for ingesting 
> data into these new technologies, so we don't have to deal with another tool 
> to do so._
> +*_{color:#de350b}BUT:{color}_*+
> I have reviewed the {{PutDatabaseRecord}} processor in NiFi. It inserts 
> records one by one into the database using a prepared statement and commits 
> the transaction at the end of the loop that processes the records. This 
> approach can be inefficient and slow when inserting large volumes of data 
> into tables that are optimized for bulk ingestion, such as Delta Lake, 
> Iceberg, and Hudi tables.
> These tables use various techniques to optimize the performance of bulk 
> ingestion, such as partitioning, clustering, and indexing. Inserting records 
> one by one using a prepared statement can bypass these optimizations, leading 
> to poor performance and potentially causing issues such as excessive disk 
> usage, increased memory consumption, and decreased query performance.
> To avoid these issues, it is recommended to add a new processor, or a feature 
> to the current one, that performs bulk inserts with an autocommit option when 
> inserting large volumes of data into Delta Lake, Iceberg, and Hudi tables.
>
> P.S.: using PutSQL is not an alternative; it has the same performance problem 
> described above.
> Thanks and best regards :)
> Abdelrahim Ahmad
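
For readers unfamiliar with the JDBC behavior being discussed, a minimal sketch of what toggling autocommit means at the driver level (illustrative only; the JDBC URL and table are made up, and the actual PutDatabaseRecord integration is not shown):
```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class AutoCommitSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL; catalogs such as Iceberg accessed via Trino only accept autocommit writes.
        try (Connection conn = DriverManager.getConnection("jdbc:trino://example-host:443/iceberg")) {
            // What the issue asks to allow: keep autocommit on so each statement commits on its own.
            conn.setAutoCommit(true);
            try (PreparedStatement ps = conn.prepareStatement("INSERT INTO events (id, payload) VALUES (?, ?)")) {
                ps.setLong(1, 1L);
                ps.setString(2, "example");
                ps.executeUpdate(); // committed immediately, no explicit conn.commit() needed
            }

            // What PutDatabaseRecord effectively does today: disable autocommit and commit at the end,
            // which some catalogs reject.
            // conn.setAutoCommit(false);
            // ... execute statements ...
            // conn.commit();
        }
    }
}
```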



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (NIFI-12700) PutKudu memory optimization for unbatched flush mode (AUTO_FLUSH_SYNC)

2024-03-13 Thread Matt Burgess (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-12700:

Status: Patch Available  (was: Open)

> PutKudu memory optimization for unbatched flush mode (AUTO_FLUSH_SYNC)
> --
>
> Key: NIFI-12700
> URL: https://issues.apache.org/jira/browse/NIFI-12700
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Emilio Setiadarma
>Assignee: Emilio Setiadarma
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> The PutKudu processor's existing implementation uses a Map of KuduOperation 
> -> FlowFile to keep track of which FlowFile was being processed when the 
> KuduOperation was created. This mapping is eventually used to associate 
> FlowFiles with a RowError (if any occurs), which is necessary for transferring 
> FlowFiles to the success/failure relationships and for logging failures, among 
> other things.
> For very large inputs, Kudu Operation objects can grow very large. There is 
> no memory leak, but this could still cause OutOfMemory issues with very large 
> input data. The KuduOperation -> FlowFile map is not required for unbatched 
> flush modes (e.g. the AUTO_FLUSH_SYNC flush mode, where KuduSession.apply() 
> will already have flushed the buffer before returning, 
> [https://kudu.apache.org/apidocs/org/apache/kudu/client/SessionConfiguration.FlushMode.html)|https://kudu.apache.org/apidocs/org/apache/kudu/client/SessionConfiguration.FlushMode.html]
> This Jira captures the effort to refactor the PutKudu processor to make it 
> more memory efficient.
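
A hypothetical fragment (imports omitted, not the actual PutKudu code) showing why the map can be dropped in this mode: with AUTO_FLUSH_SYNC, KuduSession.apply() flushes synchronously, so a row error can be handled while the current FlowFile is still in scope.
```java
// Assumed to live inside the processor class; REL_SUCCESS/REL_FAILURE are PutKudu's relationships.
private void applyAndRoute(final KuduSession kuduSession, final Operation operation,
                           final FlowFile flowFile, final ProcessSession session) throws KuduException {
    final OperationResponse response = kuduSession.apply(operation); // flushed before returning
    if (response.hasRowError()) {
        getLogger().error("Failed to write {}: {}", flowFile, response.getRowError());
        session.transfer(flowFile, REL_FAILURE);
    } else {
        session.transfer(flowFile, REL_SUCCESS);
    }
}
```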



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[PR] NIFI-12855: Add more information to provenance events to facilitate full graph traversal [nifi]

2024-03-13 Thread via GitHub


mattyb149 opened a new pull request, #8498:
URL: https://github.com/apache/nifi/pull/8498

   # Summary
   
   [NIFI-12855](https://issues.apache.org/jira/browse/NIFI-12855) This PR 
augments the provenance capabilities to include the following features:
   
   - A reference in a provenance event to any parent events ("previousEventIds")
   - Add methods to GraphClientService to generate queries/statements in 
popular graph languages such as Tinkerpop/Gremlin, Cypher, and SQL
   - Add ArcadeDB service as reference implementation for SQL generation
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [x] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [x] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [x] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [x] Pull Request based on current revision of the `main` branch
   - [x] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [x] Build completed using `mvn clean install -P contrib-check`
 - [x] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Updated] (NIFI-12855) Add more information to provenance events to facilitate full graph traversal

2024-03-13 Thread Matt Burgess (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-12855:

Description: 
Although NiFi has a capability in the UI to issue and display lineage queries 
for provenance events, the result is not a complete graph that can be traversed 
if, for example, provenance events are stored in a graph database. The following 
features should be added:

- A reference in a provenance event to any parent events ("previousEventIds")
- Add methods to GraphClientService to generate queries/statements in popular 
graph languages such as Tinkerpop/Gremlin, Cypher, and SQL
- Add ArcadeDB service as reference implementation for SQL generation

  was:
Although NiFi has a capability in the UI to issue and display lineage queries 
for provenance events, the result is not a complete graph that can be traversed 
if, for example, provenance events are stored in a graph database. The following 
features should be added:

- A reference in a provenance event to any parent events ("previousEventIds")
- Add methods to GraphClientService to generate queries/statements in popular 
graph languages such as Tinkerpop/Gremlin, Cypher, and SQL
- Add explicit references to the relationship to which the FlowFile was 
transferred
- Add ArcadeDB service as reference implementation for SQL generation


> Add more information to provenance events to facilitate full graph traversal
> 
>
> Key: NIFI-12855
> URL: https://issues.apache.org/jira/browse/NIFI-12855
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> Although NiFi has a capability in the UI to issue and display lineage queries 
> for provenance events, the result is not a complete graph that can be traversed 
> if, for example, provenance events are stored in a graph database. The following 
> features should be added:
> - A reference in a provenance event to any parent events ("previousEventIds")
> - Add methods to GraphClientService to generate queries/statements in popular 
> graph languages such as Tinkerpop/Gremlin, Cypher, and SQL
> - Add ArcadeDB service as reference implementation for SQL generation



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12889: Retry Kerberos login on auth failure in HDFS processors [nifi]

2024-03-13 Thread via GitHub


Lehel44 commented on PR #8495:
URL: https://github.com/apache/nifi/pull/8495#issuecomment-1995416957

   reviewing...


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1523722687


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/assets/themes/purple.scss:
##
@@ -50,354 +54,69 @@ $material-primary-light-palette: (
 // NOTE: When creating the Material palette definition 
mat.define-palette($material-primary-light-palette, 300);
 // Since 300, was set as the default the contrast-300 will be used as the 
default text color.
 contrast: (
-50: rgba(black, 0.87),
-100: rgba(black, 0.87),
-200: rgba(black, 0.87),
-300: #ff,
-400: #ff,
-500: #ff,
-600: #ff,
-700: #ff,
-800: #ff,
-900: #ff,
-A100: rgba(black, 0.87),
-A200: rgba(black, 0.87),
-A400: #ff,
-A700: #ff,
+50: $on-surface-dark,
+100: $on-surface-dark,
+200: $on-surface-dark,
+300: $on-surface-light,
+400: $on-surface-light,
+500: $on-surface-light,
+600: $on-surface-light,
+700: $on-surface-light,
+800: $on-surface-light,
+900: $on-surface-light,
+A100: $on-surface-dark,
+A200: $on-surface-dark,
+A400: $on-surface-light,
+A700: $on-surface-light,
 )
 );
 
 // The $material-primary-dark-palette define the PRIMARY AND ACCENT palettes 
for all Angular Material components used throughout Apache NiFi
-$material-primary-dark-palette: (
-// 50 -> 900 are the PRIMARY colors 
(mat.define-palette($material-primary-dark-palette, 300);) defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-50: rgba(69, 47, 101, 1), // .context-menu
-100: rgba(93, 57, 123, 1), // "lighter" hue for this palette. Also 
.global-menu:hover, .navigation-control-header:hover, 
.operation-control-header:hover, .new-canvas-item.icon.hovering, table 
tr:hover, .CodeMirror.blank, .remote-banner, .process-group-details-banner, 
.process-group-details-banner, remote-process-group-details-banner, 
.remote-process-group-last-refresh-rect,
-200: #6b3f86, // .processor-stats-border, .process-group-stats-border, 
.context-menu-item:hover, .process-group-banner, .remote-process-group-banner, 
.a, button.nifi-button, button.nifi-button:disabled
-300: #7b4690, // .breadcrumb-container, .navigation-control, 
.operation-control, .flow-status, .controller-bulletins, 
.component-button-grip, .search-container, .nifi-navigation, .CodeMirror.blank
-400: #874b98, // Default hue for this palette (color="primary").
-500: #985fa7, // .disabled, .not-transmitting, .splash, 
.access-policies-header, .operation-context-type, .bulletin-board-header, 
.counter-header, .stats-info, .active-thread-count-icon, .processor-type, 
.port-transmission-icon, .operation-context-type, .flow-status.fa, 
.flow-status.icon, .controller-bulletins, .prioritizers-list, 
.controller-services-header, .login-title, .parameter-context-header, 
.parameter-context-inheritance-list, .provenance-header, .flowfile-header, 
.queue-listing-header, .settings-header, .summary-header, .user-header, table 
th, button.global-menu-item.fa, button.global-menu-item.icon, .event-header, 
.section-header,
-600: #aa79b7, // .breadcrumb-container, .birdseye-brush
-700: #d3bada, // "darker" hue for this palette. Also 
#status-history-chart-container text, #status-history-chart-control-container 
text
-800: #dac3e0,
-900: #f0e7f2,
-
-// A100 -> A700 are the ACCENT color 
(mat.define-palette($material-primary-dark-palette, A400, A100, A700);). These 
color are the ANALOGOUS (or possibly the TRIADIC??) colors as defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors
 for primary color #728e9b
-// These colors are also used by some custom canvas components that 
display the ANALOGOUS color for things like buttons, links, borders, info, etc.
-A100: #2cd5d5, // .zero
-A200: #009b9d, // .enabled, .transmitting, .load-balance-icon-active
-A400: #44a3cf, // a, a:hover, button.nifi-button, 
button.nifi-button:hover, .add-tenant-to-policy-form.fa, .component.selected 
rect.border, .add-connect, .remote-process-group-uri, 
.remote-process-group-transmission-secure, .navigation-control.fa, 
.operation-control.fa, .new-canvas-item.icon, .upload-flow-definition, 
.lineage-controls.fa, .event circle.context, .nifi-navigation.icon, 
.listing-table.fa, .listing-table.icon, .context-menu
-A700: rgba(139, 208, 229, 1),//#aabec7 // .hint-pattern
-
-// These are the $material-primary-dark-palette PRIMARY AND ACCENT 
contrast colors. These color do not really get defined by 
https://m2.material.io/design/color/the-color-system.html#tools-for-picking-colors.
-// Instead if we look to the Angular Material provided palettes we 

Re: [PR] NIFI-12870 Refactor the usage of Material color theming to be semantic [nifi]

2024-03-13 Thread via GitHub


scottyaslan commented on code in PR #8480:
URL: https://github.com/apache/nifi/pull/8480#discussion_r1521739029


##
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-frontend/src/main/nifi/src/styles.scss:
##
@@ -187,149 +192,146 @@ $appFontPath: '~roboto-fontface/fonts';
 $canvas-accent-palette: map.get($canvas-color-config, 'accent');
 
 // Get hues from palette
-$primary-palette-50: mat.get-color-from-palette($primary-palette, 50);
-$primary-palette-200: mat.get-color-from-palette($primary-palette, 200);
-$primary-palette-500: mat.get-color-from-palette($primary-palette, 500);
-$accent-palette-A100: mat.get-color-from-palette($accent-palette, 'A100');
-$accent-palette-A200: mat.get-color-from-palette($accent-palette, 'A200');
-$accent-palette-A400: mat.get-color-from-palette($accent-palette, 'A400');
-$canvas-primary-palette-50: 
mat.get-color-from-palette($canvas-primary-palette, 50);
-$canvas-primary-palette-200: 
mat.get-color-from-palette($canvas-primary-palette, 200);
+
+// Start with the canvas theme.
+$canvas-primary-palette-A200: 
mat.get-color-from-palette($canvas-primary-palette, A200);
 $canvas-primary-palette-400: 
mat.get-color-from-palette($canvas-primary-palette, 400);
-$canvas-primary-palette-900: 
mat.get-color-from-palette($canvas-primary-palette, 900);
-$canvas-accent-palette-200: 
mat.get-color-from-palette($canvas-accent-palette, 200);
-$canvas-accent-palette-400: 
mat.get-color-from-palette($canvas-accent-palette, 400);
-$canvas-accent-palette-A200: 
mat.get-color-from-palette($canvas-accent-palette, 'A200');
-$warn-palette-200: mat.get-color-from-palette($warn-palette, 200);
-$warn-palette-300: mat.get-color-from-palette($warn-palette, 300);
-$warn-palette-A100: mat.get-color-from-palette($warn-palette, 'A100');
-$warn-palette-A400: mat.get-color-from-palette($warn-palette, 'A400');
+$canvas-primary-palette-500: 
mat.get-color-from-palette($canvas-primary-palette, 500);
+$canvas-accent-palette-lighter: 
mat.get-color-from-palette($canvas-accent-palette, lighter);
+$canvas-accent-palette-default: 
mat.get-color-from-palette($canvas-accent-palette, default);
+
+$primary-palette-lighter: mat.get-color-from-palette($primary-palette, 
lighter);
+$primary-palette-default: mat.get-color-from-palette($primary-palette, 
'default');
+$primary-palette-A400: mat.get-color-from-palette($primary-palette, 
'A400');
+
+$accent-palette-default: mat.get-color-from-palette($accent-palette, 
'default');
+$accent-palette-lighter: mat.get-color-from-palette($accent-palette, 
'lighter');
+
+$warn-palette-lighter: mat.get-color-from-palette($warn-palette, lighter);
+$warn-palette-default: mat.get-color-from-palette($warn-palette, 
'default');
+
+// Alternative hue for warn colors.
+$warn-palette-A200: mat.get-color-from-palette($warn-palette, 'A200');
+
+$surface: utils.get-surface($canvas-color-config);
+$surface-darker: utils.get-surface($canvas-color-config, darker);
+$surface-highlight: utils.get-on-surface($canvas-color-config, highlight);
+$on-surface: utils.get-on-surface($canvas-color-config);
+$on-surface-lighter: utils.get-on-surface($canvas-color-config, lighter);
+
+* { // Tailwind sets a default that doesn't shift with light and dark 
themes
+border-color: $on-surface-lighter;
+}
 
 a {
-color: $accent-palette-A400;
-text-decoration-color: $primary-palette-200;
+color: utils.get-color-on-surface($color-config, $surface);
+text-decoration-color: $primary-palette-lighter;
 }
 
 a:hover {
-text-decoration-color: $accent-palette-A400;
+text-decoration-color: utils.get-color-on-surface($color-config, 
$surface);
 }
 
 .tooltip {
-background-color: $canvas-primary-palette-900;
-border-color: $canvas-primary-palette-200;
-box-shadow: 0 2px 5px $canvas-primary-palette-50;
-color: $canvas-primary-palette-200;
+background-color: $surface;
+border-color: $on-surface;
+box-shadow: 0 2px 5px $canvas-primary-palette-A200;
+color: $on-surface;
 }
 
 .property-editor {
-background-color: $canvas-primary-palette-900;
-box-shadow: 0 2px 5px $canvas-primary-palette-50;
+background-color: $surface;
+box-shadow: 0 2px 5px $canvas-primary-palette-A200;
 }
 
 .disabled {
-color: $primary-palette-500 !important;
-fill: $primary-palette-500 !important;
-text-shadow: 0 0 4px $canvas-primary-palette-900;
+color: $primary-palette-default !important;
+fill: $primary-palette-default !important;
 }
 
 .enabled {
-color: $accent-palette-A200 !important;
-fill: $accent-palette-A200 !important;
-text-shadow: 0 0 4px $canvas-primary-palette-900;
+color: 

Re: [PR] MINIFICPP-2306 Filter out corrupt flowfiles during startup [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


fgerlits commented on code in PR #1738:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1738#discussion_r1523604085


##
extensions/rocksdb-repos/tests/RepoTests.cpp:
##
@@ -781,4 +781,69 @@ TEST_CASE("Content repositories are always running", "[TestRepoIsRunning]") {
   REQUIRE(content_repo->isRunning());
 }
 
+std::shared_ptr createFlowFileWithContent(core::ContentRepository& content_repo, std::string_view content) {
+  auto flow_file = std::make_shared();
+  const auto content_session = content_repo.createSession();
+  const auto claim = content_session->create();
+  const auto stream = content_session->write(claim);
+
+  stream->write(utils::as_span(std::span(content));

Review Comment:
   oops, sorry, my fault



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2306 Filter out corrupt flowfiles during startup [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


szaszm commented on code in PR #1738:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1738#discussion_r1523602600


##
extensions/rocksdb-repos/tests/RepoTests.cpp:
##
@@ -781,4 +781,69 @@ TEST_CASE("Content repositories are always running", "[TestRepoIsRunning]") {
   REQUIRE(content_repo->isRunning());
 }
 
+std::shared_ptr createFlowFileWithContent(core::ContentRepository& content_repo, std::string_view content) {
+  auto flow_file = std::make_shared();
+  const auto content_session = content_repo.createSession();
+  const auto claim = content_session->create();
+  const auto stream = content_session->write(claim);
+
+  stream->write(utils::as_span(std::span(content));

Review Comment:
   The CI fails because of a missing parenthesis here.
   ```suggestion
 stream->write(utils::as_span(std::span(content)));
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2283 Create tool to encrypt sensitive properties in config.yml [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


szaszm commented on code in PR #1725:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1725#discussion_r1523587236


##
encrypt-config/FlowConfigEncryptor.cpp:
##
@@ -0,0 +1,174 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+#include "FlowConfigEncryptor.h"
+
+#include "core/extension/ExtensionManager.h"
+#include "core/FlowConfiguration.h"
+#include "core/flow/AdaptiveConfiguration.h"
+#include "core/ProcessGroup.h"
+#include "core/RepositoryFactory.h"
+#include "core/repository/VolatileContentRepository.h"
+#include "Defaults.h"
+#include "Utils.h"
+#include "utils/file/FileSystem.h"
+#include "utils/Id.h"
+
+namespace minifi = org::apache::nifi::minifi;
+
+namespace {
+enum class Type {
+  Processor,
+  ControllerService
+};
+
+struct SensitiveProperty {
+  Type type;
+  minifi::utils::Identifier component_id;
+  std::string component_name;
+  std::string property_name;
+  std::string property_display_name;
+};
+}  // namespace
+
+namespace magic_enum::customize {
+template<>
+constexpr customize_t enum_name(Type type) noexcept {
+  switch (type) {
+case Type::Processor: return "Processor";
+case Type::ControllerService: return "Controller service";
+  }
+  return invalid_tag;
+}
+}  // namespace magic_enum::customize
+
+namespace {
+std::vector<SensitiveProperty> listSensitiveProperties(const minifi::core::ProcessGroup& process_group) {
+  std::vector<SensitiveProperty> sensitive_properties;
+
+  std::vector<minifi::core::Processor*> processors;
+  process_group.getAllProcessors(processors);
+  for (const auto *processor : processors) {
+    gsl_Expects(processor);
+    for (const auto& [_, property] : processor->getProperties()) {
+      if (property.isSensitive()) {
+        sensitive_properties.push_back(SensitiveProperty{
+            .type = Type::Processor,
+            .component_id = processor->getUUID(),
+            .component_name = processor->getName(),
+            .property_name = property.getName(),
+            .property_display_name = property.getDisplayName()});
+      }
+    }
+  }
+
+  for (const auto* controller_service_node : process_group.getAllControllerServices()) {
+    gsl_Expects(controller_service_node);
+    const auto* controller_service = controller_service_node->getControllerServiceImplementation();
+    gsl_Expects(controller_service);
+    for (const auto& [_, property] : controller_service->getProperties()) {
+      if (property.isSensitive()) {
+        sensitive_properties.push_back(SensitiveProperty{
+            .type = Type::ControllerService,
+            .component_id = controller_service->getUUID(),
+            .component_name = controller_service->getName(),
+            .property_name = property.getName(),
+            .property_display_name = property.getDisplayName()});
+      }
+    }
+  }
+
+  return sensitive_properties;
+}
+
+template<typename Func>
+void encryptSensitiveValuesInFlowConfigImpl(
+    const minifi::encrypt_config::EncryptionKeys& keys, const std::filesystem::path& minifi_home,
+    const std::filesystem::path& flow_config_path, Func create_overrides) {
+  const auto configure = std::make_shared<minifi::Configure>();
+  configure->setHome(minifi_home);
+  configure->loadConfigureFile(DEFAULT_NIFI_PROPERTIES_FILE);
+
+  bool encrypt_whole_flow_config_file = (configure->get(minifi::Configure::nifi_flow_configuration_encrypt) | minifi::utils::andThen(minifi::utils::string::toBool)).value_or(false);
+  auto encryptor = encrypt_whole_flow_config_file ? minifi::utils::crypto::EncryptionProvider::create(minifi_home) : std::nullopt;
+  auto filesystem = std::make_shared<minifi::utils::file::FileSystem>(encrypt_whole_flow_config_file, encryptor);
+
+  minifi::core::extension::ExtensionManager::get().initialize(configure);
+
+  minifi::core::flow::AdaptiveConfiguration adaptive_configuration{minifi::core::ConfigurationContext{
+      .flow_file_repo = minifi::core::createRepository("flowfilerepository"),
+      .content_repo = std::make_shared<minifi::core::repository::VolatileContentRepository>(),

Review Comment:
   Would this work with passing null or NoopRepository? I don't think we need 
repositories for modifying the config.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: 

[jira] [Created] (NIFI-12893) Adjust documentation on time based properties in DBCPProperties

2024-03-13 Thread endzeit (Jira)
endzeit created NIFI-12893:
--

 Summary: Adjust documentation on time based properties in 
DBCPProperties
 Key: NIFI-12893
 URL: https://issues.apache.org/jira/browse/NIFI-12893
 Project: Apache NiFi
  Issue Type: Task
Reporter: endzeit
Assignee: endzeit


Some of the properties in {{DBCPProperties}}, like {{MAX_CONN_LIFETIME}}, note 
that the time has to be defined in milliseconds.
However, this is not true, as a time period like "5 mins" is expected.

The documentation should be adjusted accordingly.
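
For illustration, a minimal, self-contained sketch of how such a value is 
interpreted (the class name is made up; FormatUtils is the NiFi utility behind 
time period parsing, and "5 mins" is a valid time period value):

{code:java}
import java.util.concurrent.TimeUnit;

import org.apache.nifi.util.FormatUtils;

public class TimePeriodExample {
    public static void main(String[] args) {
        // A NiFi time period carries its own unit; here "5 mins" is converted to milliseconds
        final long millis = FormatUtils.getTimeDuration("5 mins", TimeUnit.MILLISECONDS);
        System.out.println(millis); // prints 300000
    }
}
{code}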



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12887 Additional ways for handling Strings as binary in PutDatabaseRecord [nifi]

2024-03-13 Thread via GitHub


pvillard31 commented on PR #8493:
URL: https://github.com/apache/nifi/pull/8493#issuecomment-1994840950

   Yeah the option of opening another PR for 1.x is also OK :) As an FYI, from 
the GitHub UI, on the PR, you can change the branch against which the PR is 
submitted; that is why I suggested using this one against 1.x and opening a new 
one with the changes. But all good, the end result is the same ;)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Assigned] (NIFI-12837) Add DFS setting to smb processors

2024-03-13 Thread Peter Turcsanyi (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Peter Turcsanyi reassigned NIFI-12837:
--

Assignee: Peter Turcsanyi

> Add DFS setting to smb processors
> -
>
> Key: NIFI-12837
> URL: https://issues.apache.org/jira/browse/NIFI-12837
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.25.0
>Reporter: Anders
>Assignee: Peter Turcsanyi
>Priority: Major
>
> The hierynomus/smbj library has a setting for enabling DFS which is disabled 
> by default:
> https://github.com/hierynomus/smbj/blob/f25d5c5478a5b73e9ba4202dcfb365974e15367e/src/main/java/com/hierynomus/smbj/SmbConfig.java#L106C17-L106C39
> This appears to cause problems in some SMB configurations.
> Patched 
> https://github.com/apache/nifi/blob/main/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
>  to test in my environment with:
> {code}
> $ git diff 
> nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> diff --git 
> a/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
>  
> b/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> index 0895abfae0..eac765 100644
> --- 
> a/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> +++ 
> b/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> @@ -46,6 +46,8 @@ public final class SmbUtils {
>  }
>  }
> +configBuilder.withDfsEnabled(true);
> +
>  if (context.getProperty(USE_ENCRYPTION).isSet()) {
>  
> configBuilder.withEncryptData(context.getProperty(USE_ENCRYPTION).asBoolean());
>  }
> {code}
> This appeared to resolve the issue.
> It would be very useful if this setting was available to toggle in the UI for 
> all SMB processors.
> Without this setting, I get a *STATUS_PATH_NOT_COVERED* error. 
> Somewhat related hierynomus/smbj github issues:
> https://github.com/hierynomus/smbj/issues/152
> https://github.com/hierynomus/smbj/issues/419
> This setting should be made available in the following processors and 
> services:
> * GetSmbFile
> * PutSmbFile
> * SmbjClientProviderService
> Edit: It might require some more changes to handle the connections and 
> sessions correctly.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12837) Add DFS setting to smb processors

2024-03-13 Thread Peter Turcsanyi (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17826782#comment-17826782
 ] 

Peter Turcsanyi commented on NIFI-12837:


[~andersns] Thanks for filing this feature request!

I did some investigation around DFS support in smbj and how it could be used in 
NiFi a few months ago. You are right that turning it on is quite 
straightforward. However, I ran into two bugs in the library (there may be 
[others|https://github.com/hierynomus/smbj/issues?q=is%3Aissue+is%3Aopen+DFS] 
too):
 * [https://github.com/hierynomus/smbj/issues/796] - Stale connections in 
SMBClient.connectionTable
 * [https://github.com/hierynomus/smbj/issues/717] - Null Pointer when opening 
DFS link

They should be fixed on the smbj side. Suggestions were provided on how to 
handle them, but there have been no answers so far.

Fortunately, there is a workaround to the first one that can be implemented in 
NiFi.

The second one is a bit specific:
 - affects the List processor only
 - occurs only when the DFS root or a parent directory (not a link) is listed; 
when Directory property is set to a link directly, it works
 - managed to replicate on Samba DFS share and standalone Windows instances; it 
seems to work in Windows AD Domain where the DFS namespace is hosted by Domain 
Controller(s)

Having said that, I don't find smbj's DFS support particularly robust, but we 
can go ahead and add it to the SMB processors as an "experimental" feature.

 

> Add DFS setting to smb processors
> -
>
> Key: NIFI-12837
> URL: https://issues.apache.org/jira/browse/NIFI-12837
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.25.0
>Reporter: Anders
>Priority: Major
>
> The hierynomus/smbj library has a setting for enabling DFS which is disabled 
> by default:
> https://github.com/hierynomus/smbj/blob/f25d5c5478a5b73e9ba4202dcfb365974e15367e/src/main/java/com/hierynomus/smbj/SmbConfig.java#L106C17-L106C39
> This appears to cause problems in some SMB configurations.
> Patched 
> https://github.com/apache/nifi/blob/main/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
>  to test in my environment with:
> {code}
> $ git diff 
> nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> diff --git 
> a/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
>  
> b/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> index 0895abfae0..eac765 100644
> --- 
> a/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> +++ 
> b/nifi-nar-bundles/nifi-smb-bundle/nifi-smb-smbj-common/src/main/java/org/apache/nifi/smb/common/SmbUtils.java
> @@ -46,6 +46,8 @@ public final class SmbUtils {
>  }
>  }
> +configBuilder.withDfsEnabled(true);
> +
>  if (context.getProperty(USE_ENCRYPTION).isSet()) {
>  
> configBuilder.withEncryptData(context.getProperty(USE_ENCRYPTION).asBoolean());
>  }
> {code}
> This appeared to resolve the issue.
> It would be very useful if this setting was available to toggle in the UI for 
> all SMB processors.
> Without this setting, I get a *STATUS_PATH_NOT_COVERED* error. 
> Somewhat related hierynomus/smbj github issues:
> https://github.com/hierynomus/smbj/issues/152
> https://github.com/hierynomus/smbj/issues/419
> This setting should be made available in the following processors and 
> services:
> * GetSmbFile
> * PutSmbFile
> * SmbjClientProviderService
> Edit: It might require some more changes to handle the connections and 
> sessions correctly.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[PR] MINIFICPP-2308 Use 0 terminated strings in python binding interface [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


lordgamez opened a new pull request, #1743:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1743

   https://issues.apache.org/jira/browse/MINIFICPP-2308
   
   
   Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [ ] Is there a JIRA ticket associated with this PR? Is it referenced
in the commit message?
   
   - [ ] Does your PR title start with MINIFICPP- where  is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [ ] Has your PR been rebased against the latest commit within the target 
branch (typically main)?
   
   - [ ] Is your initial contribution a single, squashed commit?
   
   ### For code changes:
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the LICENSE file?
   - [ ] If applicable, have you updated the NOTICE file?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check GitHub Actions CI 
results for build issues and submit an update to your PR as soon as possible.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12704 Support record root reference in function arguments [nifi]

2024-03-13 Thread via GitHub


dan-s1 commented on code in PR #8450:
URL: https://github.com/apache/nifi/pull/8450#discussion_r1523534416


##
nifi-commons/nifi-record-path/src/test/java/org/apache/nifi/record/path/TestRecordPath.java:
##
@@ -2115,4 +2068,11 @@ private Record createSimpleRecord() {
 return new MapRecord(schema, values);
 }
 
+private static FieldValue evaluateSingleFieldValue(RecordPath recordPath, 
Record record) {
+return 
recordPath.evaluate(record).getSelectedFields().findFirst().get();
+}
+
+private static FieldValue evaluateSingleFieldValue(String path, Record 
record) {
+return evaluateSingleFieldValue(RecordPath.compile(path), record);
+}

Review Comment:
   @EndzeitBegins This looks good to me, but I would like @markap14's final word 
on this. @markap14, can you please take a look at this PR to ensure the logic 
here is correct? Thanks!



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12887 Additional ways for handling Strings as binary in PutDatabaseRecord [nifi]

2024-03-13 Thread via GitHub


exceptionfactory commented on PR #8493:
URL: https://github.com/apache/nifi/pull/8493#issuecomment-1994733725

   > > @tpalfy - I think the easiest is to keep this PR for support/1.x, and 
open another PR targeted at main that does not use Google library. I think it's 
safer compared to letting the merger do the changes. Happy to merge once we're 
on the same page.
   > 
   > This PR is based on and is open against `main` already. So I'll need to 
change this one and open a new one against 1.x.
   
   Following up here from the thread, yes, I recommend making the changes on 
this pull request to use the native Java components, and then create a separate 
PR for the support branch. Java 8 includes Base64 support, so that should be 
used there, and Apache Commons Codec provides a Hex utility class that can 
handle hexadecimal decoding.
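   
   For reference, a minimal sketch of that approach using only the JDK and 
Apache Commons Codec (the sample values are made up and not taken from the PR):
   
   ```java
   import java.nio.charset.StandardCharsets;
   import java.util.Base64;
   
   import org.apache.commons.codec.DecoderException;
   import org.apache.commons.codec.binary.Hex;
   
   public class BinaryStringDecodingExample {
       public static void main(String[] args) throws DecoderException {
           final String base64Value = "SGVsbG8=";  // "Hello" encoded as Base64
           final String hexValue = "48656c6c6f";   // "Hello" encoded as hexadecimal
   
           final byte[] fromBase64 = Base64.getDecoder().decode(base64Value);
           final byte[] fromHex = Hex.decodeHex(hexValue);  // accepts upper- or lower-case digits
   
           System.out.println(new String(fromBase64, StandardCharsets.UTF_8));
           System.out.println(new String(fromHex, StandardCharsets.UTF_8));
       }
   }
   ```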


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12887 Additional ways for handling Strings as binary in PutDatabaseRecord [nifi]

2024-03-13 Thread via GitHub


tpalfy commented on PR #8493:
URL: https://github.com/apache/nifi/pull/8493#issuecomment-1994675042

   > @tpalfy - I think the easiest is to keep this PR for support/1.x, and open 
another PR targeted at main that does not use Google library. I think it's 
safer compared to letting the merger do the changes. Happy to merge once we're 
on the same page.
   
   This PR is based on and is open against `main` already. So I'll need to 
change this one and open a new one against 1.x.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12887 Additional ways for handling Strings as binary in PutDatabaseRecord [nifi]

2024-03-13 Thread via GitHub


pvillard31 commented on PR #8493:
URL: https://github.com/apache/nifi/pull/8493#issuecomment-1994603126

   @tpalfy - I think the easiest is to keep this PR for support/1.x, and open 
another PR targeted at main that does not use Google library. I think it's 
safer compared to letting the merger do the changes. Happy to merge once we're 
on the same page.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-11107 In ConsumeIMAP and ConsumePOP3 added support for OAUTH based authorization. [nifi]

2024-03-13 Thread via GitHub


tpalfy commented on PR #6900:
URL: https://github.com/apache/nifi/pull/6900#issuecomment-1994553665

   > Hm... Sorry, but I have doubts about this...
   > 
   > @tpalfy Our access tokens e.g. are valid for 60 minutes only. Due to my 
understanding, the StandardOauth2AccessTokenProvider.java is intentionally 
written such that an authentication exception shall never occur at all! It will 
always automatically refresh any token, BEFORE it will expire (using the 
refresh window) by just asking it for "the access token" from outside calling 
`getAccessDetail()`. It will always be valid this way...
   > 
   > So an authentication exception should IMHO never be intended to occur at 
all...
   > 
   > ...
   > 
   > But simply to have a quickfix für NIFI 1.26 still better than nothing. ;-)
   
   Okay, that's fair. I have a PR up with my proposed changes here: 
https://github.com/apache/nifi/pull/8494
   
   In its first state it only recreates the mail receiver if an error is 
encountered. I'm going to update it by keeping the first commit, adding a 
revert commit, and adding a new one that checks whether the token has been 
refreshed. So both approaches can be checked.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12887 Additional ways for handling Strings as binary in PutDatabaseRecord [nifi]

2024-03-13 Thread via GitHub


tpalfy commented on code in PR #8493:
URL: https://github.com/apache/nifi/pull/8493#discussion_r1523369229


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutDatabaseRecord.java:
##
@@ -764,7 +796,15 @@ private void executeDML(final ProcessContext context, 
final ProcessSession sessi
 }
 currentValue = dest;
 } else if (currentValue instanceof 
String) {
-currentValue = ((String) 
currentValue).getBytes(StandardCharsets.UTF_8);
+final String stringValue = 
(String) currentValue;
+
+if 
(BINARY_STRING_FORMAT_BASE64.getValue().equals(binaryStringFormat)) {
+currentValue = 
BaseEncoding.base64().decode(stringValue);
+} else if 
(BINARY_STRING_FORMAT_HEX_STRING.getValue().equals(binaryStringFormat)) {
+currentValue = 
BaseEncoding.base16().decode(stringValue.toUpperCase());

Review Comment:
   Yes, I'm thinking about having this change on the 1.x branch.
   
   So I'm not sure what your suggestion is. If we want it on the 1.x branch, is 
it okay to use the Google library on main and 1.x?
   
   Or do you suggest it's worth the additional work to use vanilla Java on main, 
and that instead of simply doing a cherry-pick, the committer should rewrite 
the code during the backport to 1.x?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12887 Additional ways for handling Strings as binary in PutDatabaseRecord [nifi]

2024-03-13 Thread via GitHub


tpalfy commented on code in PR #8493:
URL: https://github.com/apache/nifi/pull/8493#discussion_r1523369229


##
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutDatabaseRecord.java:
##
@@ -764,7 +796,15 @@ private void executeDML(final ProcessContext context, 
final ProcessSession sessi
 }
 currentValue = dest;
 } else if (currentValue instanceof 
String) {
-currentValue = ((String) 
currentValue).getBytes(StandardCharsets.UTF_8);
+final String stringValue = 
(String) currentValue;
+
+if 
(BINARY_STRING_FORMAT_BASE64.getValue().equals(binaryStringFormat)) {
+currentValue = 
BaseEncoding.base64().decode(stringValue);
+} else if 
(BINARY_STRING_FORMAT_HEX_STRING.getValue().equals(binaryStringFormat)) {
+currentValue = 
BaseEncoding.base16().decode(stringValue.toUpperCase());

Review Comment:
   Yes, I'm thinking about having this change on the 1.x branch.
   
   So I'm not sure what your suggestion is. If we want it on the 1.x branch, is 
it okay to use the Google library?
   
   Or do you suggest it's worth the additional work to use vanilla Java on main, 
and that instead of simply doing a cherry-pick, the committer should rewrite 
the code during the backport to 1.x?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-12892: Use Kerberos user principal as isolation key instead of s… [nifi]

2024-03-13 Thread via GitHub


dan-s1 commented on code in PR #8497:
URL: https://github.com/apache/nifi/pull/8497#discussion_r1523328447


##
nifi-nar-bundles/nifi-iceberg-bundle/nifi-iceberg-common/src/main/java/org/apache/nifi/processors/iceberg/AbstractIcebergProcessor.java:
##
@@ -124,10 +124,16 @@ public void onTrigger(ProcessContext context, 
ProcessSession session) throws Pro
 
 @Override
 public String getClassloaderIsolationKey(PropertyContext context) {
-final KerberosUserService kerberosUserService = 
context.getProperty(KERBEROS_USER_SERVICE).asControllerService(KerberosUserService.class);
-if (kerberosUserService != null) {
-return kerberosUserService.getIdentifier();
+try {
+final KerberosUserService kerberosUserService = 
context.getProperty(KERBEROS_USER_SERVICE).asControllerService(KerberosUserService.class);
+if (kerberosUserService != null) {
+final KerberosUser kerberosUser = 
kerberosUserService.createKerberosUser();
+return kerberosUser.getPrincipal();
+}
+} catch (IllegalStateException e) {
+return null;
 }
+
 return null;

Review Comment:
   No need to add another `return null;` statement.
   ```suggestion
   } catch (IllegalStateException e) {}
   
   return null;
   ```
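   
   Not part of the suggestion itself, just a hedged sketch of how the whole 
method might read with that change applied (all identifiers are taken from the 
diff above; this is not necessarily the final code):
   
   ```java
   @Override
   public String getClassloaderIsolationKey(PropertyContext context) {
       try {
           final KerberosUserService kerberosUserService =
                   context.getProperty(KERBEROS_USER_SERVICE).asControllerService(KerberosUserService.class);
           if (kerberosUserService != null) {
               final KerberosUser kerberosUser = kerberosUserService.createKerberosUser();
               return kerberosUser.getPrincipal();
           }
       } catch (final IllegalStateException e) {
           // controller service not resolvable yet; fall through and report no isolation key
       }
       return null;
   }
   ```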



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Assigned] (MINIFICPP-2304) Clean up Python processor initialization

2024-03-13 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2304?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi reassigned MINIFICPP-2304:


Assignee: Gábor Gyimesi

> Clean up Python processor initialization
> 
>
> Key: MINIFICPP-2304
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2304
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Gábor Gyimesi
>Assignee: Gábor Gyimesi
>Priority: Major
>
> Python processor initialization should be refactored to be cleaner. We 
> instantiate the Python processors twice:
> We instantiate the Python processors that are used in the configured MiNiFi 
> flow. This is straightforward and not problematic.
> The problem is that before that we also instantiate all Python processors in 
> the PythonCreator::registerScriptDescription method, to get the class 
> descriptions of all available Python processors for the agent manifest.
>  * In this scenario we call the Python processors' initialize method twice:
>  ** Once the PythonObjectFactory::create method calls it to initialize the 
> supported properties to set the ScriptFile property to the path of the Python 
> processor
>  ** After this the PythonCreator::registerScriptDescription also calls it 
> explicitly to load the python processor from the set path
>  ** This should be reworked so that double initialization is not needed, and 
> so that ExecutePythonProcessor::initialize() logs a clear warning message if 
> the loadScript method fails
>  * We should also find a way to avoid initializing all the Python processors 
> and retrieve the processor data without it. With NiFi Python processors a way 
> for this could be to use the "ast" python module to retrieve the processor 
> details which does not require loading the python module
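
As an illustration of that last point, a minimal sketch of collecting processor 
metadata with the standard library "ast" module, without importing (and thus 
executing) the processor file; the function name and returned fields are only 
examples:

{code:python}
import ast


def describe_processor(script_path: str) -> dict:
    """Collect class names and docstrings from a Python processor file
    without importing the module, so no initialization code runs."""
    with open(script_path, "r", encoding="utf-8") as source:
        tree = ast.parse(source.read(), filename=script_path)

    descriptions = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            descriptions[node.name] = ast.get_docstring(node) or ""
    return descriptions


# Example usage (the path is illustrative):
# print(describe_processor("ExampleFlowFileTransform.py"))
{code}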



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-13 Thread David Handermann (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17826049#comment-17826049
 ] 

David Handermann commented on NIFI-12885:
-

I agree that defining the schema type and then parsing it again is not ideal.

Thanks for offering to put together a pull request!

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Assignee: crissaegrim
>Priority: Major
> Attachments: .png
>
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-13 Thread crissaegrim (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

crissaegrim reassigned NIFI-12885:
--

Assignee: crissaegrim

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Assignee: crissaegrim
>Priority: Major
> Attachments: .png
>
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (NIFI-12885) MapRecord.getAsDate timestamp breaking bug

2024-03-13 Thread crissaegrim (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-12885?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17826045#comment-17826045
 ] 

crissaegrim commented on NIFI-12885:


I think it's an anti-pattern to define a schema that expects a `DateTime`, only 
to parse it yourself downstream. To unblock my project, yes, I will parse with 
`java.time` as suggested.

I will assign this ticket to myself and add a new method and rename the 
convenience method, as you have suggested above.
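
For anyone hitting the same thing, a small standalone sketch of the java.time 
workaround mentioned above (treating the value as UTC is an assumption; use 
whatever zone the data actually represents):

{code:java}
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class JavaTimeWorkaroundExample {
    public static void main(String[] args) {
        // Parse the timestamp string directly instead of relying on MapRecord.getAsDate()
        final DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS");
        final LocalDateTime parsed = LocalDateTime.parse("2022-01-01 10:00:00.000", formatter);

        // Interpreted as UTC, this is the epoch value the original test expects
        final long epochMillis = parsed.toInstant(ZoneOffset.UTC).toEpochMilli();
        System.out.println(epochMillis); // prints 1641031200000
    }
}
{code}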

> MapRecord.getAsDate timestamp breaking bug
> --
>
> Key: NIFI-12885
> URL: https://issues.apache.org/jira/browse/NIFI-12885
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 2.0.0-M2
>Reporter: crissaegrim
>Priority: Major
> Attachments: .png
>
>
> I think I found a breaking bug from this commit 
> [https://github.com/apache/nifi/commit/250fe90b348fac515ea597c1985ca432ac7c3ac3#diff-ce496d3f0fc5a7e8a3c0431972f7069b4cf1af2e94f3a199f595ef195eb5ebfa]
> The below passes in 1.20.0 but fails in 2.0
> {code:java}
> @Test
> void testBasic() throws Exception {
> // setup
> final String schemaText = "{" +
> "\"type\" : \"record\"," +
> "\"name\" : \"TestRecord\"," +
> "\"namespace\" : \"org.apache.nifi\"," +
> "\"fields\" : [ {" +
> "\"name\" : \"my_datestamp_field\"," +
> "\"type\" : {" +
> "\"type\" : \"long\"," +
> "\"logicalType\" : \"timestamp-millis\"" +
> "}" +
> "} ]" +
>   "}";
> final RecordSchema schemaParsed = AvroTypeUtil.createSchema(new 
> Schema.Parser().parse(schemaText));
> final HashMap item = new HashMap<>();
> item.put("my_datestamp_field", "2022-01-01 10:00:00.000");
> // act
> final MapRecord record = new MapRecord(schemaParsed, item);
> final Date myDateStampField = record.getAsDate("my_datestamp_field", 
> "yyyy-MM-dd HH:mm:ss.SSS");
> // assert
>   // fails in 2.0; actual in 2.0.0-M2 is `1640995200000`
> assertEquals(1641031200000L, myDateStampField.getTime());
> }
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-03-13 Thread via GitHub


lordgamez commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1523112270


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/PutOpenSearchVector.py:
##
@@ -0,0 +1,245 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from langchain.vectorstores import OpenSearchVectorSearch
+from nifiapi.flowfiletransform import FlowFileTransform, 
FlowFileTransformResult
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from OpenSearchVectorUtils import (L2, L1, LINF, COSINESIMIL, OPENAI_API_KEY, 
OPENAI_API_MODEL, HUGGING_FACE_API_KEY,
+   HUGGING_FACE_MODEL,HTTP_HOST, USERNAME, 
PASSWORD, INDEX_NAME, VECTOR_FIELD,
+   TEXT_FIELD, create_authentication_params, 
parse_documents)
+from EmbeddingUtils import EMBEDDING_MODEL, create_embedding_service
+from nifiapi.documentation import use_case, multi_processor_use_case, 
ProcessorConfiguration
+
+
+@use_case(description="Create vectors/embeddings that represent text content 
and send the vectors to OpenSearch",
+  notes="This use case assumes that the data has already been 
formatted in JSONL format with the text to store in OpenSearch provided in the 
'text' field.",
+  keywords=["opensearch", "embedding", "vector", "text", 
"vectorstore", "insert"],
+  configuration="""
+Configure the 'HTTP Host' to an appropriate URL where 
OpenSearch is accessible.
+Configure 'Embedding Model' to indicate whether OpenAI 
embeddings should be used or a HuggingFace embedding model should be used: 
'Hugging Face Model' or 'OpenAI Model'
+Configure the 'OpenAI API Key' or 'HuggingFace API Key', 
depending on the chosen Embedding Model.
+Set 'Index Name' to the name of your OpenSearch Index.
+Set 'Vector Field Name' to the name of the field in the 
document which will store the vector data.
+Set 'Text Field Name' to the name of the field in the document 
which will store the text data.
+
+If the documents to send to OpenSearch contain a unique 
identifier, set the 'Document ID Field Name' property to the name of the field 
that contains the document ID.
+This property can be left blank, in which case a unique ID 
will be generated based on the FlowFile's filename.
+
+If the provided index does not exists in OpenSearch then the 
processor is capable to create it. The 'New Index Strategy' property defines 
+that the index needs to be created from the default template 
or it should be configured with custom values.
+""")
+@use_case(description="Update vectors/embeddings in OpenSearch",
+  notes="This use case assumes that the data has already been 
formatted in JSONL format with the text to store in OpenSearch provided in the 
'text' field.",
+  keywords=["opensearch", "embedding", "vector", "text", 
"vectorstore", "update", "upsert"],
+  configuration="""
+Configure the 'HTTP Host' to an appropriate URL where 
OpenSearch is accessible.
+Configure 'Embedding Model' to indicate whether OpenAI 
embeddings should be used or a HuggingFace embedding model should be used: 
'Hugging Face Model' or 'OpenAI Model'
+Configure the 'OpenAI API Key' or 'HuggingFace API Key', 
depending on the chosen Embedding Model.
+Set 'Index Name' to the name of your OpenSearch Index.
+Set 'Vector Field Name' to the name of the field in the 
document which will store the vector data.
+Set 'Text Field Name' to the name of the field in the document 
which will store the text data.
+Set the 'Document ID Field Name' property to the name of the 
field that contains the identifier of the document in OpenSearch to update.
+""")
+class PutOpenSearchVector(FlowFileTransform):
+class Java:
+implements = ['org.apache.nifi.python.processor.FlowFileTransform']
+
+class ProcessorDetails:
+version = 

[jira] [Commented] (NIFI-8572) Documentation concerning NiFi properties override is inaccurate

2024-03-13 Thread Jim Halfpenny (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-8572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17826025#comment-17826025
 ] 

Jim Halfpenny commented on NIFI-8572:
-

The docs appear to contain correct YAML syntax for setting overrides in 
nifi.properties. This example below comes from one of the MiNiFi tests that 
explicitly tests overriding config properties.

 
{code:java}
NiFi Properties Overrides:
  nifi.flowfile.repository.directory: ./flowfile_repository_override
  nifi.content.repository.directory.default: ./content_repository_override
  nifi.database.directory: ./database_repository_override{code}
I would suggest that this issue can be closed.

 

> Documentation concerning NiFi properties override is inaccurate
> ---
>
> Key: NIFI-8572
> URL: https://issues.apache.org/jira/browse/NIFI-8572
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: MiNiFi
>Affects Versions: 0.6.0
>Reporter: Wynner
>Priority: Critical
>
> Current documentation:
> h3. Example NiFi Properties Overrides
>  
> {{NiFi Properties Overrides:
>   nifi.flowfile.repository.directory: ./flowfile_repository_override
>   nifi.content.repository.directory.default: ./content_repository_override
>   nifi.database.directory: ./database_repository_override}}
>  
> But the format required for it to work is:
> NiFi Properties Overrides: {
>  nifi.content.repository.archive.max.retention.period: 1 hour,
>  nifi.content.repository.archive.max.usage.percentage: 10%,
>  nifi.content.repository.archive.enabled: 'true'}
>  
>  
>  
>  
>  
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] MINIFICPP-2306 Filter out corrupt flowfiles during startup [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


martinzink commented on code in PR #1738:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1738#discussion_r1523086615


##
extensions/rocksdb-repos/tests/RepoTests.cpp:
##
@@ -781,4 +780,83 @@ TEST_CASE("Content repositories are always running", 
"[TestRepoIsRunning]") {
   REQUIRE(content_repo->isRunning());
 }
 
+std::shared_ptr 
createFlowFileWithContent(core::ContentRepository& content_repo, 
std::string_view content) {
+  auto flow_file = std::make_shared();
+  const auto content_session = content_repo.createSession();
+  const auto claim = content_session->create();
+  const auto stream = content_session->write(claim);
+
+  stream->write(gsl::make_span(content).as_span());
+  flow_file->setResourceClaim(claim);
+  flow_file->setSize(stream->size());
+  flow_file->setOffset(0);
+  stream->close();
+  content_session->commit();
+  return flow_file;
+}
+
+void corruptFlowFile(core::FlowFile& ff) {
+  ff.setSize(ff.getSize()*2);
+}
+
+TEST_CASE("FlowFileRepository can filter out too small contents") {
+  LogTestController::getInstance().setDebug();
+  
LogTestController::getInstance().setDebug();
+  
LogTestController::getInstance().setDebug();
+  TestController testController;
+  const auto ff_dir = testController.createTempDirectory();
+  const auto content_dir = testController.createTempDirectory();
+
+  auto config = std::make_shared();
+  config->set(minifi::Configure::nifi_flowfile_repository_directory_default, 
ff_dir.string());
+  config->set(minifi::Configure::nifi_dbcontent_repository_directory_default, 
content_dir.string());
+  size_t expected_flowfiles = std::numeric_limits::max();
+  std::shared_ptr content_repo;
+
+  SECTION("nifi.flowfile.repository.check.health set to false") {
+config->set(minifi::Configure::nifi_flow_file_repository_check_health, 
"false");
+expected_flowfiles = 2;
+SECTION("FileSystemRepository") {
+  content_repo = 
std::make_shared();
+}
+SECTION("RocksDB") {
+  content_repo = 
std::make_shared();
+}
+  }
+  SECTION("nifi.flowfile.repository.check.health set to true") {
+config->set(minifi::Configure::nifi_flow_file_repository_check_health, 
"true");
+expected_flowfiles = 1;
+SECTION("FileSystemRepository") {
+  content_repo = 
std::make_shared();
+}
+SECTION("RocksDB") {
+  content_repo = 
std::make_shared();
+}
+  }

Review Comment:
   Oh, I didn't know about this feature, it looks much cleaner, thanks
   
https://github.com/apache/nifi-minifi-cpp/pull/1738/commits/a804730683686e95180515c36a62d8ba29a81aca



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2282 Support re-encryption of sensitive properties [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


fgerlits commented on code in PR #1739:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1739#discussion_r1522975112


##
encrypt-config/EncryptConfigMain.cpp:
##


Review Comment:
   it is set to an encrypted empty string, but I agree that we should not add 
new properties during re-encryption



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Assigned] (MINIFICPP-2308) python extension Dict: key type is std::string_view, but null-terminated string is required. The interface should be changed to reflect that.

2024-03-13 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi reassigned MINIFICPP-2308:


Assignee: Gábor Gyimesi

> python extension Dict: key type is std::string_view, but null-terminated 
> string is required. The interface should be changed to reflect that.
> -
>
> Key: MINIFICPP-2308
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2308
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Marton Szasz
>Assignee: Gábor Gyimesi
>Priority: Major
>
> We could change it to have two overloads: const char* and maybe std::string. 
> Alternatively, change it to use {{PyDict_GetItem}}
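
As a rough sketch of the first option (the function name is made up; 
PyDict_GetItemString and PyDict_GetItem are the CPython calls in question, and 
both return a borrowed reference):

{code:cpp}
#include <Python.h>

#include <string>

// Hypothetical overloads: the null-termination requirement of the key is
// explicit in the parameter types instead of being an unstated assumption
// behind a std::string_view.
PyObject* getDictItem(PyObject* dict, const char* key) {
  return PyDict_GetItemString(dict, key);  // key must be a 0-terminated C string
}

PyObject* getDictItem(PyObject* dict, const std::string& key) {
  return PyDict_GetItemString(dict, key.c_str());
}
{code}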



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] MINIFICPP-2306 Filter out corrupt flowfiles during startup [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


fgerlits commented on code in PR #1738:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1738#discussion_r1522907902


##
extensions/rocksdb-repos/tests/RepoTests.cpp:
##
@@ -781,4 +780,83 @@ TEST_CASE("Content repositories are always running", 
"[TestRepoIsRunning]") {
   REQUIRE(content_repo->isRunning());
 }
 
+std::shared_ptr 
createFlowFileWithContent(core::ContentRepository& content_repo, 
std::string_view content) {
+  auto flow_file = std::make_shared();
+  const auto content_session = content_repo.createSession();
+  const auto claim = content_session->create();
+  const auto stream = content_session->write(claim);
+
+  stream->write(gsl::make_span(content).as_span());

Review Comment:
   we still use `gsl::span` in a few places, but I think `std::span` is 
preferred in new code:
   ```suggestion
  stream->write(utils::as_span(std::span(content)));
   ```



##
extensions/rocksdb-repos/tests/RepoTests.cpp:
##
@@ -781,4 +780,83 @@ TEST_CASE("Content repositories are always running", 
"[TestRepoIsRunning]") {
   REQUIRE(content_repo->isRunning());
 }
 
+std::shared_ptr 
createFlowFileWithContent(core::ContentRepository& content_repo, 
std::string_view content) {
+  auto flow_file = std::make_shared();
+  const auto content_session = content_repo.createSession();
+  const auto claim = content_session->create();
+  const auto stream = content_session->write(claim);
+
+  stream->write(gsl::make_span(content).as_span());
+  flow_file->setResourceClaim(claim);
+  flow_file->setSize(stream->size());
+  flow_file->setOffset(0);
+  stream->close();
+  content_session->commit();
+  return flow_file;
+}
+
+void corruptFlowFile(core::FlowFile& ff) {
+  ff.setSize(ff.getSize()*2);
+}
+
+TEST_CASE("FlowFileRepository can filter out too small contents") {
+  LogTestController::getInstance().setDebug();
+  
LogTestController::getInstance().setDebug();
+  
LogTestController::getInstance().setDebug();
+  TestController testController;
+  const auto ff_dir = testController.createTempDirectory();
+  const auto content_dir = testController.createTempDirectory();
+
+  auto config = std::make_shared();
+  config->set(minifi::Configure::nifi_flowfile_repository_directory_default, 
ff_dir.string());
+  config->set(minifi::Configure::nifi_dbcontent_repository_directory_default, 
content_dir.string());
+  size_t expected_flowfiles = std::numeric_limits::max();
+  std::shared_ptr content_repo;
+
+  SECTION("nifi.flowfile.repository.check.health set to false") {
+config->set(minifi::Configure::nifi_flow_file_repository_check_health, 
"false");
+expected_flowfiles = 2;
+SECTION("FileSystemRepository") {
+  content_repo = 
std::make_shared();
+}
+SECTION("RocksDB") {
+  content_repo = 
std::make_shared();
+}
+  }
+  SECTION("nifi.flowfile.repository.check.health set to true") {
+config->set(minifi::Configure::nifi_flow_file_repository_check_health, 
"true");
+expected_flowfiles = 1;
+SECTION("FileSystemRepository") {
+  content_repo = 
std::make_shared();
+}
+SECTION("RocksDB") {
+  content_repo = 
std::make_shared();
+}
+  }

Review Comment:
   just a suggestion, but I think `GENERATE` is more readable for Cartesian 
products than `SECTION`:
   ```suggestion
   
  const auto [check_health, expected_flowfiles] = GENERATE(
      std::make_tuple("false", 2),
      std::make_tuple("true", 1));
  config->set(minifi::Configure::nifi_flow_file_repository_check_health, check_health);

  const auto content_repo = GENERATE(
      static_cast<std::shared_ptr<core::ContentRepository>>(std::make_shared<core::repository::FileSystemRepository>()),
      static_cast<std::shared_ptr<core::ContentRepository>>(std::make_shared<core::repository::DatabaseContentRepository>()));
   ```
   
   (this requires `#include "catch2/generators/catch_generators.hpp"`)



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Resolved] (MINIFICPP-2118) fix HTTPClientTests failure on CentOS 7

2024-03-13 Thread Marton Szasz (Jira)


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2118?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marton Szasz resolved MINIFICPP-2118.
-
Resolution: Not A Problem

doesn't fail anymore, probably fixed in a different jira

> fix HTTPClientTests failure on CentOS 7
> ---
>
> Key: MINIFICPP-2118
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2118
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Marton Szasz
>Priority: Major
>
> {noformat}
> [2023-05-10 16:02:44.706] 
> [org::apache::nifi::minifi::extensions::curl::HTTPClient] [error] 
> curl_easy_perform() failed SSL peer certificate or SSH remote key was not OK 
> on https://apache.org, error code 60
> ~~~
> HTTPClientTests is a Catch v2.13.10 host application.
> Run with -? for options
> ---
> SSL without SSLContextService
> ---
> /home/szaszm/nifi-minifi-cpp/extensions/http-curl/tests/unit/HTTPClientTests.cpp:147
> ...
> /home/szaszm/nifi-minifi-cpp/extensions/http-curl/tests/unit/HTTPClientTests.cpp:150:
>  FAILED:
>   REQUIRE( client.submit() )
> with expansion:
>   false
> ===
> test cases:  6 |  5 passed | 1 failed
> assertions: 52 | 51 passed | 1 failed
> {noformat}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] NIFI-12831: Add PutOpenSearchVector and QueryOpenSearchVector processors [nifi]

2024-03-13 Thread via GitHub


mark-bathori commented on code in PR #8441:
URL: https://github.com/apache/nifi/pull/8441#discussion_r1522929589


##
nifi-python-extensions/nifi-text-embeddings-module/src/main/python/vectorstores/PutOpenSearchVector.py:
##
@@ -0,0 +1,245 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from langchain.vectorstores import OpenSearchVectorSearch
+from nifiapi.flowfiletransform import FlowFileTransform, 
FlowFileTransformResult
+from nifiapi.properties import PropertyDescriptor, StandardValidators, 
ExpressionLanguageScope, PropertyDependency
+from OpenSearchVectorUtils import (L2, L1, LINF, COSINESIMIL, OPENAI_API_KEY, 
OPENAI_API_MODEL, HUGGING_FACE_API_KEY,
+   HUGGING_FACE_MODEL,HTTP_HOST, USERNAME, 
PASSWORD, INDEX_NAME, VECTOR_FIELD,
+   TEXT_FIELD, create_authentication_params, 
parse_documents)
+from EmbeddingUtils import EMBEDDING_MODEL, create_embedding_service
+from nifiapi.documentation import use_case, multi_processor_use_case, 
ProcessorConfiguration
+
+
+@use_case(description="Create vectors/embeddings that represent text content 
and send the vectors to OpenSearch",
+  notes="This use case assumes that the data has already been 
formatted in JSONL format with the text to store in OpenSearch provided in the 
'text' field.",
+  keywords=["opensearch", "embedding", "vector", "text", 
"vectorstore", "insert"],
+  configuration="""
+Configure the 'HTTP Host' to an appropriate URL where 
OpenSearch is accessible.
+Configure 'Embedding Model' to indicate whether OpenAI 
embeddings should be used or a HuggingFace embedding model should be used: 
'Hugging Face Model' or 'OpenAI Model'
+Configure the 'OpenAI API Key' or 'HuggingFace API Key', 
depending on the chosen Embedding Model.
+Set 'Index Name' to the name of your OpenSearch Index.
+Set 'Vector Field Name' to the name of the field in the 
document which will store the vector data.
+Set 'Text Field Name' to the name of the field in the document 
which will store the text data.
+
+If the documents to send to OpenSearch contain a unique 
identifier, set the 'Document ID Field Name' property to the name of the field 
that contains the document ID.
+This property can be left blank, in which case a unique ID 
will be generated based on the FlowFile's filename.
+
+If the provided index does not exists in OpenSearch then the 
processor is capable to create it. The 'New Index Strategy' property defines 
+that the index needs to be created from the default template 
or it should be configured with custom values.
+""")
+@use_case(description="Update vectors/embeddings in OpenSearch",
+  notes="This use case assumes that the data has already been 
formatted in JSONL format with the text to store in OpenSearch provided in the 
'text' field.",
+  keywords=["opensearch", "embedding", "vector", "text", 
"vectorstore", "update", "upsert"],
+  configuration="""
+Configure the 'HTTP Host' to an appropriate URL where 
OpenSearch is accessible.
+Configure 'Embedding Model' to indicate whether OpenAI 
embeddings should be used or a HuggingFace embedding model should be used: 
'Hugging Face Model' or 'OpenAI Model'
+Configure the 'OpenAI API Key' or 'HuggingFace API Key', 
depending on the chosen Embedding Model.
+Set 'Index Name' to the name of your OpenSearch Index.
+Set 'Vector Field Name' to the name of the field in the 
document which will store the vector data.
+Set 'Text Field Name' to the name of the field in the document 
which will store the text data.
+Set the 'Document ID Field Name' property to the name of the 
field that contains the identifier of the document in OpenSearch to update.
+""")
+class PutOpenSearchVector(FlowFileTransform):
+class Java:
+implements = ['org.apache.nifi.python.processor.FlowFileTransform']
+
+class ProcessorDetails:
+version = 

[jira] [Updated] (MINIFICPP-2203) Add support for building Windows MSI without any redistributables included

2024-03-13 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2203?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2203:
-
Status: Patch Available  (was: Open)

> Add support for building Windows MSI without any redistributables included
> --
>
> Key: MINIFICPP-2203
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2203
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Marton Szasz
>Assignee: Gábor Gyimesi
>Priority: Major
>  Time Spent: 2h 10m
>  Remaining Estimate: 0h
>
> Currenly MSIs can't be distributed under Apache-2.0, because the required 
> Microsoft redistributables are included, and their license is not compatible 
> with Apache-2.0. We should add the option to compile with the system UCRT 
> only, without including either merge modules or DLLs. The resulting 
> installers should still work on reasonably up to date systems.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (MINIFICPP-2202) The MSI installer should start the ApacheNiFiMiNiFi service

2024-03-13 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2202:
-
Status: Patch Available  (was: Open)

> The MSI installer should start the ApacheNiFiMiNiFi service
> ---
>
> Key: MINIFICPP-2202
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2202
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Ferenc Gerlits
>Assignee: Gábor Gyimesi
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> At the end of the installation process, is it possible for the Windows MSI 
> installer to start the ApacheNiFiMiNiFi service?  That would save an 
> unnecessary step for the user.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (MINIFICPP-2309) Add JSON flow config examples

2024-03-13 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2309:
-
Status: Patch Available  (was: Open)

> Add JSON flow config examples
> -
>
> Key: MINIFICPP-2309
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2309
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Task
>Reporter: Gábor Gyimesi
>Assignee: Gábor Gyimesi
>Priority: Minor
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> Currently only yaml flow config examples are available in the examples 
> directory of the project. Since we are also supporting JSON format for flow 
> configurations, we should add JSON examples too.
> As there are two accepted formats of JSON configs that are accepted, both 
> examples should be added:
>  # The simpler format, which mimics the current yaml configuration
>  # The NiFi format, which mimics NiFi's JSON configuration format



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (MINIFICPP-2313) Grafana Loki build fails and grpc test crashes on Windows

2024-03-13 Thread Jira


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-2313?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2313:
-
Status: Patch Available  (was: Open)

> Grafana Loki build fails and grpc test crashes on Windows
> -
>
> Key: MINIFICPP-2313
> URL: https://issues.apache.org/jira/browse/MINIFICPP-2313
> Project: Apache NiFi MiNiFi C++
>  Issue Type: Bug
>Reporter: Gábor Gyimesi
>Assignee: Gábor Gyimesi
>Priority: Major
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> Grafana Loki build fails on Windows, which we did not realize earlier as the 
> windows build batch file does not turn the Loki build flag on, so it was 
> missing in the CI build as well.
> PushGrafanaLokiGrpcTest crashes with segmentation fault in the grpc library 
> when run on Windows.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] MINIFICPP-2203 Add support for building Windows MSI without any redistributables included [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


lordgamez commented on code in PR #1734:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1734#discussion_r1522884699


##
CMakeLists.txt:
##
@@ -493,20 +504,21 @@ if(WIN32)
 list(GET VCRUNTIME_X64_MERGEMODULES 0 
VCRUNTIME_X64_MERGEMODULE_PATH)
 endif()
 
-if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+if (BUILD_PLATFORM STREQUAL "x64")
 message("Using ${VCRUNTIME_X64_MERGEMODULE_PATH} VC 
Redistributable Merge Module")
 configure_file("msi/x64.wsi" "msi/x64.wsi" @ONLY)
 list(APPEND CPACK_WIX_EXTRA_SOURCES 
"${CMAKE_CURRENT_BINARY_DIR}/msi/x64.wsi")
-elseif(CMAKE_SIZEOF_VOID_P EQUAL 4)
+else()
 message("Using ${VCRUNTIME_X86_MERGEMODULE_PATH} VC 
Redistributable Merge Module")
 configure_file("msi/x86.wsi" "msi/x86.wsi" @ONLY)
 list(APPEND CPACK_WIX_EXTRA_SOURCES 
"${CMAKE_CURRENT_BINARY_DIR}/msi/x86.wsi")
-else()
-message(FATAL_ERROR "Could not determine architecture, 
CMAKE_SIZEOF_VOID_P is unexpected: ${CMAKE_SIZEOF_VOID_P}")
 endif()
-set(CPACK_WIX_TEMPLATE 
"${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWinMergeModules.wsi")
-else()
-message("Creating installer with redistributables")
+
+file(READ "${CMAKE_CURRENT_SOURCE_DIR}/msi/MergeModulesFeature.xml" 
WIX_EXTRA_FEATURES)
+configure_file("${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWin.wsi.in" 
"${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWin.wsi")
+set(CPACK_WIX_TEMPLATE "${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWin.wsi")

Review Comment:
   Good point, updated in b58c443735e7d065d37619945f6fe88e6fc534ef



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2277 Add virtualenv support for python processors [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


lordgamez commented on code in PR #1721:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1721#discussion_r1522863070


##
extensions/python/PythonConfigState.h:
##
@@ -0,0 +1,50 @@
+/**
+ *
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+#pragma once
+
+#include 
+#include 
+
+namespace org::apache::nifi::minifi::extensions::python {
+
+struct PythonConfigState {
+ public:
+  PythonConfigState(PythonConfigState&&) = delete;
+  PythonConfigState(const PythonConfigState&) = delete;
+  PythonConfigState& operator=(PythonConfigState&&) = delete;
+  PythonConfigState& operator=(const PythonConfigState&) = delete;
+
+  bool isPackageInstallationNeeded() const {
+return install_python_packages_automatically && !virtualenv_path.empty();
+  }
+
+  static PythonConfigState& getInstance() {
+static PythonConfigState config;
+return config;
+  }

Review Comment:
   Extracted dependency installation into the `PythonDependencyInstaller` class 
and moved the dependency install to `PythonCreator`, so that dependencies are 
installed while processing all the Python processor script files, in 
5b20de2ffdc4d8f2bfd8907d1a47294440a11fc2



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2277 Add virtualenv support for python processors [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


lordgamez commented on code in PR #1721:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1721#discussion_r1522853483


##
extensions/python/ExecutePythonProcessor.h:
##
@@ -35,10 +35,6 @@
 #include "PythonScriptEngine.h"
 #include "PythonScriptEngine.h"
 
-#if defined(__GNUC__) || defined(__GNUG__)
-#pragma GCC visibility push(hidden)

Review Comment:
   This was added because pybind11 requires -fvisibility=hidden, but since we no 
longer use pybind11, this can be removed.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] MINIFICPP-2203 Add support for building Windows MSI without any redistributables included [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


fgerlits commented on code in PR #1734:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1734#discussion_r1522822212


##
CMakeLists.txt:
##
@@ -493,20 +504,21 @@ if(WIN32)
 list(GET VCRUNTIME_X64_MERGEMODULES 0 
VCRUNTIME_X64_MERGEMODULE_PATH)
 endif()
 
-if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+if (BUILD_PLATFORM STREQUAL "x64")
 message("Using ${VCRUNTIME_X64_MERGEMODULE_PATH} VC 
Redistributable Merge Module")
 configure_file("msi/x64.wsi" "msi/x64.wsi" @ONLY)
 list(APPEND CPACK_WIX_EXTRA_SOURCES 
"${CMAKE_CURRENT_BINARY_DIR}/msi/x64.wsi")
-elseif(CMAKE_SIZEOF_VOID_P EQUAL 4)
+else()
 message("Using ${VCRUNTIME_X86_MERGEMODULE_PATH} VC 
Redistributable Merge Module")
 configure_file("msi/x86.wsi" "msi/x86.wsi" @ONLY)
 list(APPEND CPACK_WIX_EXTRA_SOURCES 
"${CMAKE_CURRENT_BINARY_DIR}/msi/x86.wsi")
-else()
-message(FATAL_ERROR "Could not determine architecture, 
CMAKE_SIZEOF_VOID_P is unexpected: ${CMAKE_SIZEOF_VOID_P}")
 endif()
-set(CPACK_WIX_TEMPLATE 
"${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWinMergeModules.wsi")
-else()
-message("Creating installer with redistributables")
+
+file(READ "${CMAKE_CURRENT_SOURCE_DIR}/msi/MergeModulesFeature.xml" 
WIX_EXTRA_FEATURES)
+configure_file("${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWin.wsi.in" 
"${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWin.wsi")
+set(CPACK_WIX_TEMPLATE "${CMAKE_CURRENT_SOURCE_DIR}/msi/WixWin.wsi")

Review Comment:
   These two lines are the same in every branch, so they could be moved to 
after the `if` block.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-11107 In ConsumeIMAP and ConsumePOP3 added support for OAUTH based authorization. [nifi]

2024-03-13 Thread via GitHub


AnTu2702 commented on PR #6900:
URL: https://github.com/apache/nifi/pull/6900#issuecomment-1993901612

   Hm... Not sure about this...
   
   Our access tokens, for example, are valid for 60 minutes only.
   As I understand it, StandardOauth2AccessTokenProvider.java is intentionally 
written so that an authentication exception should never occur at all: it 
automatically refreshes any token BEFORE it expires (using the refresh window) 
whenever it is simply asked for "the access token". The token is always valid 
this way...


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] NIFI-11107 In ConsumeIMAP and ConsumePOP3 added support for OAUTH based authorization. [nifi]

2024-03-13 Thread via GitHub


AnTu2702 commented on PR #6900:
URL: https://github.com/apache/nifi/pull/6900#issuecomment-1993902221

   > @AnTu2702 I think I can add a not too complicated change that only 
recreates the `messageReciever` when an authentication exception is encountered 
in a couple of days, hopefully today.
   > 
   > Since there are connections involved and given the fact that onTriggers 
can be called very frequently, it's better to have it like this right away.
   
   Hm... Not sure about this...
   
   @tpalfy Our access tokens, for example, are valid for 60 minutes only.
   As I understand it, StandardOauth2AccessTokenProvider.java is intentionally 
written so that an authentication exception should never occur at all: it 
automatically refreshes any token BEFORE it expires (using the refresh window) 
whenever it is simply asked for "the access token". The token is always valid 
this way...
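   
   A minimal sketch of the refresh-window idea described above, for 
illustration only; the class and method names below are made up and are not 
NiFi's actual StandardOauth2AccessTokenProvider API:
   
   import java.time.Duration;
   import java.time.Instant;
   
   class RefreshAheadTokenHolder {
       private final Duration refreshWindow = Duration.ofSeconds(30);
       private String accessToken;
       private Instant expiresAt = Instant.EPOCH;
   
       // Callers always go through this accessor; the token is refreshed
       // shortly BEFORE it expires, so an expired token is never handed out.
       synchronized String getAccessToken() {
           if (Instant.now().isAfter(expiresAt.minus(refreshWindow))) {
               refresh();
           }
           return accessToken;
       }
   
       private void refresh() {
           // placeholder for the real call to the OAuth token endpoint
           accessToken = "token-" + Instant.now().toEpochMilli();
           expiresAt = Instant.now().plus(Duration.ofMinutes(60));
       }
   }
   
   With this pattern, a consumer that asks the provider for the access token 
on every onTrigger should never see an expired token, which is why an 
authentication failure caused by token expiry should not normally happen.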


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] NIFI-12892: Use Kerberos user principal as isolation key instead of s… [nifi]

2024-03-13 Thread via GitHub


mark-bathori opened a new pull request, #8497:
URL: https://github.com/apache/nifi/pull/8497

   …ervice identifier in Iceberg processors
   
   
   
   
   
   
   
   
   
   
   
   
   
   
   # Summary
   
   [NIFI-12892](https://issues.apache.org/jira/browse/NIFI-12892)
   
   # Tracking
   
   Please complete the following tracking steps prior to pull request creation.
   
   ### Issue Tracking
   
   - [x] [Apache NiFi Jira](https://issues.apache.org/jira/browse/NIFI) issue 
created
   
   ### Pull Request Tracking
   
   - [x] Pull Request title starts with Apache NiFi Jira issue number, such as 
`NIFI-0`
   - [x] Pull Request commit message starts with Apache NiFi Jira issue number, 
as such `NIFI-0`
   
   ### Pull Request Formatting
   
   - [x] Pull Request based on current revision of the `main` branch
   - [x] Pull Request refers to a feature branch with one commit containing 
changes
   
   # Verification
   
   Please indicate the verification steps performed prior to pull request 
creation.
   
   ### Build
   
   - [x] Build completed using `mvn clean install -P contrib-check`
 - [x] JDK 21
   
   ### Licensing
   
   - [ ] New dependencies are compatible with the [Apache License 
2.0](https://apache.org/licenses/LICENSE-2.0) according to the [License 
Policy](https://www.apache.org/legal/resolved.html)
   - [ ] New dependencies are documented in applicable `LICENSE` and `NOTICE` 
files
   
   ### Documentation
   
   - [ ] Documentation formatting appears as expected in rendered files
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[jira] [Created] (NIFI-12892) Use Kerberos user principal as isolation key instead of service identifier in Iceberg processors

2024-03-13 Thread Mark Bathori (Jira)
Mark Bathori created NIFI-12892:
---

 Summary: Use Kerberos user principal as isolation key instead of 
service identifier in Iceberg processors
 Key: NIFI-12892
 URL: https://issues.apache.org/jira/browse/NIFI-12892
 Project: Apache NiFi
  Issue Type: Bug
Reporter: Mark Bathori
Assignee: Mark Bathori


Change the class loader isolation to use the Kerberos user principal as the key 
instead of the service identifier, to align with the other HDFS processors.
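
A minimal sketch of keying class loader isolation on the Kerberos principal 
rather than the controller service identifier; the names below are illustrative 
only and do not reflect the actual Iceberg processor code:

import java.net.URL;
import java.net.URLClassLoader;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class PrincipalIsolationSketch {
    private static final Map<String, ClassLoader> LOADERS = new ConcurrentHashMap<>();

    // Keying by principal means components configured with the same Kerberos
    // principal share one isolated class loader, while different principals
    // get separate loaders, matching the behaviour of the other HDFS processors.
    static ClassLoader loaderFor(String kerberosPrincipal) {
        return LOADERS.computeIfAbsent(kerberosPrincipal,
                p -> new URLClassLoader(new URL[0],
                        PrincipalIsolationSketch.class.getClassLoader()));
    }
}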



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


Re: [PR] MINIFICPP-1797 Python bootstrap (part 1) [nifi-minifi-cpp]

2024-03-13 Thread via GitHub


szaszm commented on code in PR #1681:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1681#discussion_r1522706091


##
bootstrap/package_manager.py:
##
@@ -0,0 +1,325 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import glob
+import os
+import platform
+import subprocess
+import sys
+import re
+from typing import Dict, Set
+
+from distro import distro
+
+
+def _query_yes_no(question: str, no_confirm: bool) -> bool:
+valid = {"yes": True, "y": True, "ye": True, "no": False, "n": False}
+
+if no_confirm:
+print("Running {} with noconfirm".format(question))
+return True
+while True:
+print("{} [y/n]".format(question))
+choice = input().lower()
+if choice in valid:
+return valid[choice]
+else:
+print("Please respond with 'yes' or 'no' " "(or 'y' or 'n').")
+
+
+def _run_command_with_confirm(command: str, no_confirm: bool) -> bool:
+if _query_yes_no("Running {}".format(command), no_confirm):
+return os.system(command) == 0
+
+
+class PackageManager(object):
+def __init__(self, no_confirm):
+self.no_confirm = no_confirm
+pass
+
+def install(self, dependencies: Dict[str, Set[str]]) -> bool:
+raise Exception("NotImplementedException")
+
+def install_compiler(self) -> str:
+raise Exception("NotImplementedException")
+
+def _install(self, dependencies: Dict[str, Set[str]], replace_dict: 
Dict[str, Set[str]], install_cmd: str) -> bool:
+dependencies.update({k: v for k, v in replace_dict.items() if k in 
dependencies})
+dependencies = self._filter_out_installed_packages(dependencies)
+dependencies_str = " ".join(str(value) for value_set in 
dependencies.values() for value in value_set)
+if not dependencies_str or dependencies_str.isspace():
+return True
+return _run_command_with_confirm(f"{install_cmd} {dependencies_str}", 
self.no_confirm)
+
+def _get_installed_packages(self) -> Set[str]:
+raise Exception("NotImplementedException")
+
+def _filter_out_installed_packages(self, dependencies: Dict[str, 
Set[str]]):
+installed_packages = self._get_installed_packages()
+filtered_packages = {k: (v - installed_packages) for k, v in 
dependencies.items()}
+for installed_package in installed_packages:
+filtered_packages.pop(installed_package, None)
+return filtered_packages
+
+def run_cmd(self, cmd: str) -> bool:
+result = subprocess.run(f"{cmd}", shell=True, text=True)
+return result.returncode == 0
+
+
+class BrewPackageManager(PackageManager):
+def __init__(self, no_confirm):
+PackageManager.__init__(self, no_confirm)
+
+def install(self, dependencies: Dict[str, Set[str]]) -> bool:
+return self._install(dependencies=dependencies,
+ install_cmd="brew install",
+ replace_dict={"patch": set(),
+   "jni": {"maven"}})
+
+def install_compiler(self) -> str:
+self.install({"compiler": {"llvm"}})
+return ""

Review Comment:
   I checked last week, and it was working as well as it did without the new 
Python bootstrap. (Which still doesn't build on my Mac, but that's not your 
fault.)



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@nifi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org