[jira] [Comment Edited] (NIFI-5018) basic snap-to-grid feature for UI

2018-05-11 Thread Ryan Bower (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5018?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472792#comment-16472792
 ] 

Ryan Bower edited comment on NIFI-5018 at 5/11/18 11:21 PM:


[~joewitt] is this an ok place to inquire about some of the canvas component 
details?

My grid snap works, but the edges aren't consistent with canvas components of 
differing dimensions.  I may be able to normalize the rounding and account for 
the width of the component, but I'm not sure.  Is there documentation of the 
default dimensions for each canvas component? 


was (Author: rbower):
[~joewitt] is this an ok place to inquire about some of the canvas component 
details?

My grid snap works, but the edges aren't consistent with canvas components of 
differing dimensions.  I may be able to normalize the rounding and account for 
the width of the component, but I'm not sure.  Is there a reference for the 
default dimensions for each canvas component? 

> basic snap-to-grid feature for UI
> -
>
> Key: NIFI-5018
> URL: https://issues.apache.org/jira/browse/NIFI-5018
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Affects Versions: 1.5.0, 1.6.0
> Environment: Tested on Windows
>Reporter: Ryan Bower
>Priority: Minor
>  Labels: web-ui
>   Original Estimate: 0.25h
>  Remaining Estimate: 0.25h
>
> NiFi 1.2.0 introduced the flow alignment feature, described as follows:
> *NiFi 1.2.0 has a nice, new little feature that will surely please those who 
> may spend a bit of time – for some, perhaps A LOT of time – getting their 
> flow to line up perfectly. The background grid lines can help with this, but 
> the process can still be quite tedious with many components. Now there is a 
> quick, easy way.*
> I've made a slight modification to the UI (roughly 5 lines) that results in 
> "snap-to-grid" behavior for selected components.  See [this 
> video|https://www.youtube.com/watch?v=S7lnBMMO6KE] for an 
> example of it in action.
> Target file: 
> nifi-1.6.0-src\nifi-nar-bundles\nifi-framework-bundle\nifi-framework\nifi-web\nifi-web-ui\src\main\webapp\js\nf\canvas\nf-draggable.js
> The processor alignment is based on rounding the component's X and Y 
> coordinates during the drag event.  The result is a consistent "snap" 
> alignment.  I modified the following code to achieve this:
> {code:javascript}
> // previous code...
> (this, function ($, d3, nfConnection, nfBirdseye, nfCanvasUtils, nfCommon,
>                  nfDialog, nfClient, nfErrorHandler) {
>     'use strict';
>     var nfCanvas;
>     var drag;
>     // added for snap-to-grid feature
>     var snapTo = 16;
> 
>     // code...
> 
>     var nfDraggable = {
> 
>     // more code...
> 
>     if (dragSelection.empty()) {
> 
>         // more code...
> 
>     } else {
>         // update the position of the drag selection
>         dragSelection.attr('x', function (d) {
>             d.x += d3.event.dx;
>             // rounding the result achieves the "snap" alignment
>             return (Math.round(d.x / snapTo) * snapTo);
>         })
>         .attr('y', function (d) {
>             d.y += d3.event.dy;
>             return (Math.round(d.y / snapTo) * snapTo);
>         });
>     }
> 
>     // more code
> 
>     updateComponentPosition: function (d, delta) {
>         // perform rounding again for the update
>         var newPosition = {
>             'x': (Math.round((d.position.x + delta.x) / snapTo) * snapTo),
>             'y': (Math.round((d.position.y + delta.y) / snapTo) * snapTo)
>         };
>     // more code
> }
> {code}
>  
> The downside of this is that components must start aligned in order to snap 
> to the same alignment on the canvas.  To remedy this, just use the 1.2.0 flow 
> alignment feature.  Note: this is only an issue for old, unaligned flows.  
> New flows and aligned flows don't have this problem.
>  
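The core of the patch is just that rounding step. As a self-contained sketch (written in Java for illustration; only the `snapTo = 16` grid size comes from the patch, the class and method names are mine):

```java
public class SnapToGrid {
    static final int SNAP_TO = 16; // grid spacing used by the patch above

    // Round a coordinate to the nearest multiple of the grid spacing.
    static double snap(double coordinate) {
        return Math.round(coordinate / SNAP_TO) * SNAP_TO;
    }

    public static void main(String[] args) {
        // x = 23 is closest to the grid line at 16...
        System.out.println(snap(23)); // 16.0
        // ...while x = 25 is closest to the grid line at 32.
        System.out.println(snap(25)); // 32.0
    }
}
```

This also illustrates the caveat noted above: a component whose position is not already a multiple of 16 simply jumps to the nearest grid line on the first drag.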



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (NIFI-5024) Deadlock in ExecuteStreamCommand processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5024?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472731#comment-16472731
 ] 

ASF GitHub Bot commented on NIFI-5024:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2594#discussion_r187743134
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExecuteStreamCommand.java
 ---
@@ -382,10 +389,10 @@ public void onTrigger(ProcessContext context, final 
ProcessSession session) thro
 Map attributes = new HashMap<>();
 
 final StringBuilder strBldr = new StringBuilder();
-try {
-String line;
-while ((line = bufferedReader.readLine()) != null) {
-strBldr.append(line).append("\n");
+try (final InputStream is = new FileInputStream(errorOut)) {
--- End diff --

The java-monitor.com link wouldn't load for me, so I couldn't see your 
reference. So is the problem here with the BufferedReader? I'm not entirely 
sure I follow the change because either way you're pulling data from 
stderr. This case just redirects it and you're reading it from a file.


> Deadlock in ExecuteStreamCommand processor
> --
>
> Key: NIFI-5024
> URL: https://issues.apache.org/jira/browse/NIFI-5024
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0
>Reporter: Nicolas Sanglard
>Priority: Minor
> Attachments: Screen Shot 2018-03-28 at 15.34.36.png, Screen Shot 
> 2018-03-28 at 15.36.02.png
>
>
> Whenever a process produces too much output on stderr, the current 
> implementation will run into a deadlock between the JVM and the Unix process 
> started by the ExecuteStreamCommand.
> This is a known issue that is fully described here: 
> [http://java-monitor.com/forum/showthread.php?t=4067]
> In short:
>  * If the process produces more output on stderr than ExecuteStreamCommand 
> consumes, it will block until the data is read.
>  * The current processor implementation reads from stderr only after 
> having called process.waitFor()
>  * Thus, the two processes wait on each other and fall into a deadlock
>  
>  
> The following setup will lead to a deadlock:
>  
> A jar containing the following Main application:
> {code:java}
> object Main extends App {
>   import scala.io.Source
>   // read ~1 MB of text from the classpath and write it all to stderr
>   val str = 
> Source.fromInputStream(this.getClass.getResourceAsStream("/1mb.txt")).mkString
>   System.err.println(str)
> }
> {code}
> The following NiFi Flow:
>  
> !Screen Shot 2018-03-28 at 15.34.36.png!
>  
> Configuration for ExecuteStreamCommand:
>  
> !Screen Shot 2018-03-28 at 15.36.02.png!
>  
> The script simply contains a call to the jar: 
> {code:java}
> java -jar stderr.jar
> {code}
>  
> Once the processor calls the script, it appears as "processing" indefinitely 
> and can only be stopped by restarting NiFi.
>  
> I already have a running solution that I will publish as soon as possible.
>  
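The root cause described above is general to java.lang.Process: if neither stdout nor stderr is drained, waitFor() can deadlock once the OS pipe buffer fills. A minimal sketch of the redirect-to-file remedy (the shell command is only a stand-in for a noisy child process; this is not the processor's actual code):

```java
import java.io.File;
import java.io.IOException;

public class StderrRedirect {
    // Run a child that writes ~1 MB to stderr, redirecting stderr to a temp
    // file so waitFor() cannot deadlock; returns the captured stderr size.
    static long runAndCaptureStderrSize() throws IOException, InterruptedException {
        ProcessBuilder builder = new ProcessBuilder(
                "sh", "-c", "head -c 1000000 /dev/zero | tr '\\0' x >&2");
        File errorOut = File.createTempFile("err", null);
        errorOut.deleteOnExit();
        builder.redirectError(errorOut);   // the key line: no stderr pipe to fill
        Process process = builder.start();
        process.waitFor();                 // safe: stderr drains to the file
        return errorOut.length();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runAndCaptureStderrSize()); // 1000000
    }
}
```

With ProcessBuilder.redirectError(File) there is no stderr pipe at all, so the child can write freely and the parent reads the file after the process exits.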







[jira] [Commented] (NIFI-5024) Deadlock in ExecuteStreamCommand processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5024?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472724#comment-16472724
 ] 

ASF GitHub Bot commented on NIFI-5024:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2594#discussion_r187742376
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExecuteStreamCommand.java
 ---
@@ -351,6 +350,16 @@ public void onTrigger(ProcessContext context, final 
ProcessSession session) thro
 builder.directory(dir);
 builder.redirectInput(Redirect.PIPE);
 builder.redirectOutput(Redirect.PIPE);
+final File errorOut;
+try {
+errorOut = File.createTempFile("out", null);
+errorOut.deleteOnExit();
--- End diff --

I would like to see this deleted after every iteration of the processor 
because it could take a very long time before deleteOnExit gets executed.









[jira] [Commented] (NIFI-950) Perform component validation asynchronously

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-950?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472549#comment-16472549
 ] 

ASF GitHub Bot commented on NIFI-950:
-

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2693#discussion_r187673609
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/authorization/StandardAuthorizableLookup.java
 ---
@@ -493,10 +493,6 @@ public Authorizable getAuthorizableFromResource(String 
resource) {
 }
 }
 
-if (resourceType == null) {
--- End diff --

Why is this being removed?


> Perform component validation asynchronously
> ---
>
> Key: NIFI-950
> URL: https://issues.apache.org/jira/browse/NIFI-950
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Joseph Percivall
>Priority: Major
> Attachments: self_reference_flow_fix.xml
>
>
> I created a flow that is a self-referencing http loop. The flow was working 
> fine, but I wanted to save the template for later testing. I downloaded the 
> flow as a template. Then I tried testing a Thread.sleep in the beginning 
> of the onConfigured, createSSLContext, and validate methods of 
> StandardSSLContextService. I did a mvn clean install in the 
> nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service
>  directory. Then a mvn clean install in the nifi-assembly directory. After I 
> imported the template the UI became very slow when clicking to different 
> windows of the UI such as configuring a processor and the controller services 
> window.
> I then stashed my changes and rebuilt the files. Once again I imported my 
> template, and attempting to configure a processor or accessing the controller 
> services window became very slow.
> The flow xml is attached. 
> ---
> The description and attachment showed an issue where long running validation 
> caused the UI to become unresponsive. This validation should be done 
> asynchronously so that the UI always remains responsive. Initial thoughts...
> - new state to indicate that validation is in progress
> - a mechanism for refreshing validation results
> - time out for waiting for validation to complete? or need to always be 
> validating all components in case their validity is based on something 
> environmental (like a configuration file that is modified outside of the 
> application)?
> - provide better support for components that are running and become invalid
> -- related to this, we need to provide guidance regarding the difference 
> between becoming invalid and when we should use features like bulletins and 
> yielding to relay runtime issues
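The "initial thoughts" above amount to moving validation onto a background thread and exposing an interim state while it runs. A minimal sketch of that pattern using plain java.util.concurrent (not NiFi's actual framework code; names are mine):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncValidationSketch {
    // Returns the state the UI would display before and after a slow
    // validation completes, without ever blocking the "UI" thread on it.
    static String[] run() throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        CountDownLatch slowValidate = new CountDownLatch(1);
        Future<String> result = pool.submit(() -> {
            slowValidate.await();          // stand-in for a slow validate()
            return "VALID";
        });
        // Interim state: validation still in progress, UI stays responsive.
        String before = result.isDone() ? result.get() : "VALIDATING";
        slowValidate.countDown();          // validation finishes
        String after = result.get();       // refresh the validation result
        pool.shutdown();
        return new String[] {before, after};
    }

    public static void main(String[] args) throws Exception {
        String[] states = run();
        System.out.println(states[0] + " -> " + states[1]); // VALIDATING -> VALID
    }
}
```

A real implementation would also re-run validation periodically to catch environmental changes, per the notes above.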





[jira] [Commented] (NIFI-950) Perform component validation asynchronously

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-950?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472550#comment-16472550
 ] 

ASF GitHub Bot commented on NIFI-950:
-

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2693#discussion_r187686929
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestListenHTTP.java
 ---
@@ -168,6 +169,8 @@ public void 
testSecurePOSTRequestsReturnCodeReceivedWithEL() throws Exception {
 }
 
 @Test
+// TODO / NOCOMMIT: Don't check in with this ignored... it's now 
failing because the service is valid. DOn't know why it was invalid before
--- End diff --

Was this meant to be committed :)











[jira] [Resolved] (NIFI-3113) Update and clean up Upgrading NiFi document

2018-05-11 Thread Andrew Lim (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim resolved NIFI-3113.
--
Resolution: Fixed

Reorganized the [Upgrading 
NiFi|https://cwiki.apache.org/confluence/display/NIFI/Upgrading+NiFi] section 
of the wiki.  The outdated document referred to in the Jira description has 
been renamed to [0.x.0 to 0.x.0 
Upgrade|https://cwiki.apache.org/confluence/display/NIFI/0.x.0+to+0.x.0+Upgrade].
  Two new documents have been added:

[0.x.0 to 1.0.0 
Upgrade|https://cwiki.apache.org/confluence/display/NIFI/0.x.0+to+1.0.0+Upgrade]

[1.x.0 to 1.x.0 
Upgrade|https://cwiki.apache.org/confluence/display/NIFI/1.x.0+to+1.x.0+Upgrade]

Feedback welcome and appreciated.

> Update and clean up Upgrading NiFi document
> ---
>
> Key: NIFI-3113
> URL: https://issues.apache.org/jira/browse/NIFI-3113
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation  Website
>Affects Versions: 1.1.0
>Reporter: Andy LoPresto
>Assignee: Andrew Lim
>Priority: Major
>  Labels: documentation, upgrade
>
> The [Upgrading 
> NiFi|https://cwiki.apache.org/confluence/display/NIFI/Upgrading+NiFi] 
> document on the wiki refers to {{0.4.0}} as the "new version" of NiFi. It 
> should be updated with the following considerations:
> * 1.x branch is now current, which means significant changes have occurred, 
> including ZMC, ZK, and multi-tenant authorization
> * The document has numerous spelling and grammatical errors, some of which 
> seriously affect comprehension (e.g. "_static files from the conf directory 
> will *not* be located_" should be "_static files from the conf directory will 
> *now* be located_")
> * The document does not explicitly explain *how* the default location of 
> various resources is modified until the end of the file where an example 
> {{nifi.properties}} is provided
> I think reviewing and updating this document with a new set of eyes will 
> greatly improve it and make it a relevant (and hopefully more long-lived) 
> resource for users. 





[jira] [Commented] (NIFI-5145) MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null inputs

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472413#comment-16472413
 ] 

ASF GitHub Bot commented on NIFI-5145:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2672
  
@markap14 Any feedback?


> MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null 
> inputs
> 
>
> Key: NIFI-5145
> URL: https://issues.apache.org/jira/browse/NIFI-5145
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> The method mentioned in the title line cannot handle null inputs, even though 
> the main NiFi execution classes can handle that scenario. This forces a hack 
> to pass testing with nulls that looks like this:
> String val = flowFile != null ? 
> context.getProperty(PROP).evaluateExpressionLanguage(flowFile).getValue() : 
> context.getProperty(PROP).evaluateExpressionLanguage(new 
> HashMap()).getValue();
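The fix amounts to applying the same null guard inside the mock that the runtime evaluation classes already use; a minimal sketch of the pattern (class and method names here are illustrative, not NiFi's API):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class NullSafeEval {
    // Evaluate against the flow file's attributes when present, otherwise
    // against an empty map -- the guard the mock is currently missing.
    static String evaluate(String fallback, Map<String, String> flowFileAttributes) {
        Map<String, String> attrs = flowFileAttributes != null
                ? flowFileAttributes
                : Collections.emptyMap();
        return attrs.getOrDefault("filename", fallback);
    }

    public static void main(String[] args) {
        Map<String, String> attrs = new HashMap<>();
        attrs.put("filename", "a.txt");
        System.out.println(evaluate("unknown", attrs)); // a.txt
        System.out.println(evaluate("unknown", null));  // unknown
    }
}
```

With the guard inside the evaluation call, test code can pass a null flow file directly instead of the ternary workaround shown above.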







[jira] [Commented] (MINIFICPP-472) Implement date manipulation EL functions

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/MINIFICPP-472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472385#comment-16472385
 ] 

ASF GitHub Bot commented on MINIFICPP-472:
--

Github user achristianson commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/315
  
Taking a look.


> Implement date manipulation EL functions
> 
>
> Key: MINIFICPP-472
> URL: https://issues.apache.org/jira/browse/MINIFICPP-472
> Project: NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Assignee: Andrew Christianson
>Priority: Major
>
> [Date 
> Manipulation|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#dates]
>  * 
> [format|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#format]
>  * 
> [toDate|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#todate]
>  * 
> [now|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#now]
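NiFi's EL date functions follow java.text.SimpleDateFormat pattern semantics, so their behavior can be illustrated in Java even though the MiNiFi implementation is C++ (helper names here are mine, not the EL API):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateElSketch {
    // format(): render a Date with an explicit pattern, as EL format() does.
    static String format(Date d, String pattern) {
        return new SimpleDateFormat(pattern).format(d);
    }

    // toDate(): parse a string into a Date with the same pattern semantics.
    static Date toDate(String s, String pattern) {
        try {
            return new SimpleDateFormat(pattern).parse(s);
        } catch (ParseException e) {
            throw new IllegalArgumentException(e);
        }
    }

    public static void main(String[] args) {
        String pattern = "yyyy/MM/dd";
        // Round trip: toDate then format returns the original string.
        System.out.println(format(toDate("2018/05/11", pattern), pattern)); // 2018/05/11
        // now() is simply the current instant: new Date().
        System.out.println(format(new Date(), pattern).length()); // 10
    }
}
```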







[jira] [Commented] (MINIFICPP-474) Implement getDelimitedField EL function

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/MINIFICPP-474?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472376#comment-16472376
 ] 

ASF GitHub Bot commented on MINIFICPP-474:
--

GitHub user achristianson opened a pull request:

https://github.com/apache/nifi-minifi-cpp/pull/328

MINIFICPP-474 Added getDelimitedField EL function

Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced
 in the commit message?

- [x] Does your PR title start with MINIFI-XXXX where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [x] If applicable, have you updated the LICENSE file?
- [x] If applicable, have you updated the NOTICE file?

### For documentation related changes:
- [x] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/achristianson/nifi-minifi-cpp MINIFICPP-474

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-minifi-cpp/pull/328.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #328


commit b17252fb09f13fefde79d61baf1555b21c0817e9
Author: Andrew I. Christianson 
Date:   2018-05-11T14:46:55Z

MINIFICPP-474 Added getDelimitedField EL function




> Implement getDelimitedField EL function
> ---
>
> Key: MINIFICPP-474
> URL: https://issues.apache.org/jira/browse/MINIFICPP-474
> Project: NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Assignee: Andrew Christianson
>Priority: Major
>
> *Description*: Parses the Subject as a delimited line of text and returns 
> just a single field from that delimited text.
> *Subject Type*: String
> *Arguments*:
>  * _index_ : The index of the field to return. A value of 1 will return the 
> first field, a value of 2 will return the second field, and so on.
>  * _delimiter_ : Optional argument that provides the character to use as a 
> field separator. If not specified, a comma will be used. This value must be 
> exactly 1 character.
>  * _quoteChar_ : Optional argument that provides the character that can be 
> used to quote values so that the delimiter can be used within a single field. 
> If not specified, a double-quote (") will be used. This value must be exactly 
> 1 character.
>  * _escapeChar_ : Optional argument that provides the character that can be 
> used to escape the Quote Character or the Delimiter within a field. If not 
> specified, a backslash (\) is used. This value must be exactly 1 character.
>  * _stripChars_ : Optional argument that specifies whether or not quote 
> characters and escape characters should be stripped. For example, if we have 
> a field value "1, 2, 3" and this value is true, we will get the value {{1, 2, 
> 3}}, but if this value is false, we will get the value {{"1, 2, 3"}} with the 
> quotes. The default value is false. This value must be either {{true}} or 
> {{false}}.
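The field-extraction semantics described above can be modeled in a few lines. This is a sketch of the documented behavior, not MiNiFi's C++ implementation; the function name and keyword arguments are illustrative:

```python
import csv
import io

def get_delimited_field(subject, index, delimiter=",", quote_char='"',
                        escape_char="\\", strip_chars=False):
    """Model of the getDelimitedField semantics: index is 1-based; quote and
    escape characters are kept in the result unless strip_chars is True."""
    if strip_chars:
        # Let csv unquote/unescape the fields for us.
        reader = csv.reader(io.StringIO(subject), delimiter=delimiter,
                            quotechar=quote_char, escapechar=escape_char)
        fields = next(reader)
    else:
        # Split manually so quote/escape characters survive in the output,
        # while escaped or quoted delimiters still do not split the field.
        fields, current = [], []
        in_quotes = escaped = False
        for ch in subject:
            if escaped:
                current.append(ch)
                escaped = False
            elif ch == escape_char:
                current.append(ch)
                escaped = True
            elif ch == quote_char:
                current.append(ch)
                in_quotes = not in_quotes
            elif ch == delimiter and not in_quotes:
                fields.append("".join(current))
                current = []
            else:
                current.append(ch)
        fields.append("".join(current))
    return fields[index - 1]
```

With the subject `a,"1, 2, 3",c` and index 2, this returns `"1, 2, 3"` with quotes by default, and `1, 2, 3` when strip_chars is true, matching the description.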



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-minifi-cpp pull request #328: MINIFICPP-474 Added getDelimitedField EL ...

2018-05-11 Thread achristianson
GitHub user achristianson opened a pull request:

https://github.com/apache/nifi-minifi-cpp/pull/328

MINIFICPP-474 Added getDelimitedField EL function

Thank you for submitting a contribution to Apache NiFi - MiNiFi C++.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced
 in the commit message?

- [x] Does your PR title start with MINIFI-XXXX where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [x] If applicable, have you updated the LICENSE file?
- [x] If applicable, have you updated the NOTICE file?

### For documentation related changes:
- [x] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/achristianson/nifi-minifi-cpp MINIFICPP-474

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-minifi-cpp/pull/328.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #328


commit b17252fb09f13fefde79d61baf1555b21c0817e9
Author: Andrew I. Christianson 
Date:   2018-05-11T14:46:55Z

MINIFICPP-474 Added getDelimitedField EL function




---


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Bryan Bende (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472367#comment-16472367
 ] 

Bryan Bende commented on NIFI-5184:
---

Given the limitations of the JDK native library loading and the native 
libraries themselves, there is always going to be some trade-off... if we did 
what you are suggesting then we would possibly make the native libraries work 
better, but we would be sacrificing some of the isolation by making additional 
native libs/jars visible at a higher level, which makes them visible to other 
components that don't need/want them.

Also, something to keep in mind is that not only would the native libs need to 
be loaded at a higher shared level, but also the Java code that uses them...

One time I did a test where I tried to load the Hadoop native libs into the 
system class loader right at the beginning of NiFi start-up, but the Java code 
was still loaded down in the NAR class loader, and that Java code got 
unsatisfied link errors because it also needed to be loaded into the system 
class loader where the native libs were loaded. That would then mean the 
Hadoop libs would be on the classpath of every single NAR/component, which 
defeats the purpose of NARs.
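The JVM constraint behind this whole thread is that a native library may be bound to at most one class loader at a time; when a controller service is disabled and re-enabled, a fresh NAR class loader tries to load the same library and gets an UnsatisfiedLinkError. A toy Python model of that rule (class and registry names are illustrative, not JDK internals):

```python
class UnsatisfiedLinkError(RuntimeError):
    pass

# JVM-wide registry modeling the rule: each native library name may be
# owned by at most one class loader in the process.
_loaded_native = {}  # lib name -> owning loader

class ClassLoader:
    def __init__(self, name):
        self.name = name

    def load_library(self, lib):
        owner = _loaded_native.get(lib)
        if owner is not None and owner is not self:
            # Mirrors "Native Library ... already loaded in another classloader"
            raise UnsatisfiedLinkError(f"{lib} already loaded by {owner.name}")
        _loaded_native[lib] = self
```

The first NAR class loader loads `mqjbnd` successfully; the replacement loader created on re-enable cannot, which is the failure reported in this issue.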

> Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs
> -
>
> Key: NIFI-5184
> URL: https://issues.apache.org/jira/browse/NIFI-5184
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Greg Senia
>Priority: Major
>
> When attempting to use a local IBM QMGR with MQ Bindings, IBM states that 
> within the JDK only one instance of the native library can be loaded. I've 
> worked around this issue by symlinking the IBM MQ libs into nifi/lib/, which 
> is not a good solution. Wondering if this is a known issue with NiFi and the 
> NAR classloader functions, or if this is something that can be fixed so that 
> NiFi can correctly work with IBM MQ and MQ Bindings.
>  
> This only occurs after you disable and then re-enable the controller:
>  
> 2018-05-10 21:10:34,865 ERROR [Timer-Driven Process Thread-2] 
> o.apache.nifi.jms.processors.ConsumeJMS ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] failed to process session due 
> to org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.: {}
> org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.
>  at 
> org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:316)
>  at 
> org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:169)
>  at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:497)
>  at org.apache.nifi.jms.processors.JMSConsumer.consume(JMSConsumer.java:66)
>  at 
> org.apache.nifi.jms.processors.ConsumeJMS.rendezvousWithJms(ConsumeJMS.java:156)
>  at 
> org.apache.nifi.jms.processors.AbstractJMSProcessor.onTrigger(AbstractJMSProcessor.java:147)
>  at org.apache.nifi.jms.processors.ConsumeJMS.onTrigger(ConsumeJMS.java:58)
>  at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at 
> 

[jira] [Commented] (NIFI-5122) Add record writer to S2S Reporting Tasks

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5122?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472344#comment-16472344
 ] 

ASF GitHub Bot commented on NIFI-5122:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2663#discussion_r187685161
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/resources/docs/org.apache.nifi.reporting.SiteToSiteStatusReportingTask/additionalDetails.html
 ---
@@ -0,0 +1,122 @@
+
+
+
+
+
+SiteToSiteStatusReportingTask
+
+
+
+
+
+   
+   The Site-to-Site Bulletin Reporting Task allows the 
user to publish Status events using the Site To Site protocol. 
+   The component type and name filter regexes form a 
union: only components matching both regexes will be reported. 
+   However, all process groups are recursively searched 
for matching components, regardless of whether the process 
+   group matches the component filters.
+   
+   
+   Record writer
+   
+   
+   The user can define a Record Writer and directly 
specify the output format and data with the assumption that the 
+   input schema is the following:
+   
+
+   
+   
+{
+  "type" : "record",
+  "name" : "status",
+  "namespace" : "status",
+  "fields" : [
+// common fields for all components
+   { "name" : "statusId", "type" : "string"},
+   { "name" : "timestampMillis", "type": { "type": "long", "logicalType": 
"timestamp-millis" } },
+   { "name" : "timestamp", "type" : "string"},
+   { "name" : "actorHostname", "type" : "string"},
+   { "name" : "componentType", "type" : "string"},
+   { "name" : "componentName", "type" : "string"},
+   { "name" : "parentId", "type" : "string"},
--- End diff --

Currently in 
https://github.com/apache/nifi/pull/2663/files#diff-f1f37886c50fa0946558aed14f835e27R147
 it is sending in null for the parentId all the time. I think we should 
determine the parentId if possible, and (at least) make the field in the schema 
nullable.


> Add record writer to S2S Reporting Tasks
> 
>
> Key: NIFI-5122
> URL: https://issues.apache.org/jira/browse/NIFI-5122
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> Just like we have the option to specify a record writer for the new Site To 
> Site Metrics Reporting Task, there should be the possibility to specify an 
> optional record writer for the other S2S reporting tasks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2663: NIFI-5122 - Add Record Writer for S2S RTs

2018-05-11 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2663#discussion_r187685161
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/resources/docs/org.apache.nifi.reporting.SiteToSiteStatusReportingTask/additionalDetails.html
 ---
@@ -0,0 +1,122 @@
+
+
+
+
+
+SiteToSiteStatusReportingTask
+
+
+
+
+
+   
+   The Site-to-Site Bulletin Reporting Task allows the 
user to publish Status events using the Site To Site protocol. 
+   The component type and name filter regexes form a 
union: only components matching both regexes will be reported. 
+   However, all process groups are recursively searched 
for matching components, regardless of whether the process 
+   group matches the component filters.
+   
+   
+   Record writer
+   
+   
+   The user can define a Record Writer and directly 
specify the output format and data with the assumption that the 
+   input schema is the following:
+   
+
+   
+   
+{
+  "type" : "record",
+  "name" : "status",
+  "namespace" : "status",
+  "fields" : [
+// common fields for all components
+   { "name" : "statusId", "type" : "string"},
+   { "name" : "timestampMillis", "type": { "type": "long", "logicalType": 
"timestamp-millis" } },
+   { "name" : "timestamp", "type" : "string"},
+   { "name" : "actorHostname", "type" : "string"},
+   { "name" : "componentType", "type" : "string"},
+   { "name" : "componentName", "type" : "string"},
+   { "name" : "parentId", "type" : "string"},
--- End diff --

Currently in 
https://github.com/apache/nifi/pull/2663/files#diff-f1f37886c50fa0946558aed14f835e27R147
 it is sending in null for the parentId all the time. I think we should 
determine the parentId if possible, and (at least) make the field in the schema 
nullable.


---


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Greg Senia (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472328#comment-16472328
 ] 

Greg Senia commented on NIFI-5184:
--

[~joewitt] and [~bende], shouldn't there really be a shared classloader, just 
like in J2EE and like what they are doing with Spark? I'd even take a custom 
property flag to load these jars at a higher shared classloader vs. risking 
the double loading.

 

[https://github.com/apache/spark/pull/5851]

 

https://issues.apache.org/jira/browse/SPARK-7819

 

> Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs
> -
>
> Key: NIFI-5184
> URL: https://issues.apache.org/jira/browse/NIFI-5184
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Greg Senia
>Priority: Major
>
> When attempting to use a local IBM QMGR with MQ Bindings, IBM states that 
> within the JDK only one instance of the native library can be loaded. I've 
> worked around this issue by symlinking the IBM MQ libs into nifi/lib/, which 
> is not a good solution. Wondering if this is a known issue with NiFi and the 
> NAR classloader functions, or if this is something that can be fixed so that 
> NiFi can correctly work with IBM MQ and MQ Bindings.
>  
> This only occurs after you disable and then re-enable the controller:
>  
> 2018-05-10 21:10:34,865 ERROR [Timer-Driven Process Thread-2] 
> o.apache.nifi.jms.processors.ConsumeJMS ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] failed to process session due 
> to org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.: {}
> org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.
>  at 
> org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:316)
>  at 
> org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:169)
>  at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:497)
>  at org.apache.nifi.jms.processors.JMSConsumer.consume(JMSConsumer.java:66)
>  at 
> org.apache.nifi.jms.processors.ConsumeJMS.rendezvousWithJms(ConsumeJMS.java:156)
>  at 
> org.apache.nifi.jms.processors.AbstractJMSProcessor.onTrigger(AbstractJMSProcessor.java:147)
>  at org.apache.nifi.jms.processors.ConsumeJMS.onTrigger(ConsumeJMS.java:58)
>  at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>  at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
> Caused by: com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An 
> exception occurred in the Java(tm) MQI.
>  at sun.reflect.GeneratedConstructorAccessor192.newInstance(Unknown Source)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>  at 
> 

[jira] [Created] (NIFI-5187) Allow EvaluateXPath to return multiple results if destination is FlowFile content

2018-05-11 Thread Matt Burgess (JIRA)
Matt Burgess created NIFI-5187:
--

 Summary: Allow EvaluateXPath to return multiple results if 
destination is FlowFile content
 Key: NIFI-5187
 URL: https://issues.apache.org/jira/browse/NIFI-5187
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Reporter: Matt Burgess


EvaluateXPath currently expects a NodeSet to be returned from the XPath 
expression if the destination is FlowFile content, but it also requires that 
the NodeSet have only one element. Since each element is output as a string, 
the processor could support NodeSets with multiple elements, using a newline 
as a delimiter (and perhaps escaping any newlines in the elements' values if 
necessary), but only when the destination is FlowFile content.

This improvement would make it easier to prepare XML for use by a record 
processor, for example.
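The proposed behavior, matching every node and emitting one newline-delimited line per result with embedded newlines escaped, can be sketched with the standard library. Note this is an illustration of the idea, not NiFi's implementation, and `xml.etree.ElementTree` only supports a limited XPath subset:

```python
import xml.etree.ElementTree as ET

def evaluate_xpath_multi(xml_text, xpath):
    """Return the text of every node matching xpath, newline-delimited,
    escaping embedded newlines so one output line == one result."""
    root = ET.fromstring(xml_text)
    values = []
    for node in root.findall(xpath):
        text = "".join(node.itertext())
        values.append(text.replace("\n", "\\n"))  # escape embedded newlines
    return "\n".join(values)
```

For `<books><book><title>A</title></book><book><title>B</title></book></books>` and `.//title`, this yields two lines, `A` and `B`, which a line-oriented record reader could then consume.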



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5186) Update UI to account for asynchronous validation

2018-05-11 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-5186:
--
Description: This Jira is a follow up to NIFI-950. The new asynchronous 
validation introduces a new state VALIDATING. This VALIDATING state will be 
entered following any modifications (create, update) to the component 
(Processor, Controller Service, Reporting Task). All component validation is 
done in the background on a recurring interval. When obtaining the current 
state, we will return the last known state unless it is VALIDATING as a result 
of a modification.   (was: This Jira is a follow up to NIFI-950. The new 
asynchronous validation introduces a new state VALIDATING. This VALIDATING 
state will be shown following any modifications (create, update) to the 
component (Processor, Controller Service, Reporting Task). All component 
validation is done in the background on a recurring interval or following a 
modification. When obtaining the current state, we will return the last known 
state unless we are returning it as part of a modification (in which case we 
return VALIDATING). )

> Update UI to account for asynchronous validation
> 
>
> Key: NIFI-5186
> URL: https://issues.apache.org/jira/browse/NIFI-5186
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core UI
>Reporter: Matt Gilman
>Assignee: Matt Gilman
>Priority: Blocker
> Fix For: 1.7.0
>
>
> This Jira is a follow up to NIFI-950. The new asynchronous validation 
> introduces a new state VALIDATING. This VALIDATING state will be entered 
> following any modifications (create, update) to the component (Processor, 
> Controller Service, Reporting Task). All component validation is done in the 
> background on a recurring interval. When obtaining the current state, we will 
> return the last known state unless it is VALIDATING as a result of a 
> modification. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-5186) Update UI to account for asynchronous validation

2018-05-11 Thread Matt Gilman (JIRA)
Matt Gilman created NIFI-5186:
-

 Summary: Update UI to account for asynchronous validation
 Key: NIFI-5186
 URL: https://issues.apache.org/jira/browse/NIFI-5186
 Project: Apache NiFi
  Issue Type: Bug
  Components: Core UI
Reporter: Matt Gilman
Assignee: Matt Gilman
 Fix For: 1.7.0


This Jira is a follow up to NIFI-950. The new asynchronous validation 
introduces a new state VALIDATING. This VALIDATING state will be shown 
following any modifications (create, update) to the component (Processor, 
Controller Service, Reporting Task). All component validation is done in the 
background on a recurring interval or following a modification. When obtaining 
the current state, we will return the last known state unless we are returning 
it as part of a modification (in which case we return VALIDATING). 
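The state-reporting rule described above can be captured as a toy model: background validation publishes a definitive state on its recurring interval, while a modification puts the component into VALIDATING until the next pass. All names here are illustrative, not NiFi's API:

```python
class Component:
    """Toy model of the asynchronous-validation state rule."""

    def __init__(self):
        self._last_known = "VALID"
        self._modified = False

    def modify(self):
        # Any create/update drops the component into VALIDATING.
        self._modified = True

    def background_validation(self, result):
        # Recurring background pass publishes a definitive state.
        self._last_known = result
        self._modified = False

    def current_state(self):
        # Report VALIDATING only when it stems from a modification;
        # otherwise keep returning the last known state.
        return "VALIDATING" if self._modified else self._last_known
```

This makes the trade-off concrete: clients never see a stale "valid" state right after an edit, but an untouched component keeps reporting its last known state between background passes.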



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5041) Add convenient SPNEGO/Kerberos authentication support to LivySessionController

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472198#comment-16472198
 ] 

ASF GitHub Bot commented on NIFI-5041:
--

Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2630
  
The changes LGTM, and I tested on a secure cluster, verifying that I could 
connect, get a session, and execute some simple Scala/Spark code. However, when 
I tested with various unhappy paths including no Kerberos Credentials Service 
and a bad keytab, it seems we could be handling these situations better. 

In the first case (no credentials), the /sessions endpoint returns HTML, 
not JSON. This causes a bulletin to be issued, but the flow file is not 
penalized, the processor is not yielded, and the LivyControllerService 
thread that manages the sessions exits, meaning the flow will never proceed 
until the CS is restarted with the correct credentials. This could be 
considered a Livy bug (I didn't see an existing Jira), but we need to handle 
it for now.

I believe something similar happens with a bad keytab, but I didn't trace it 
back to the manageSessions thread. I think we need to ensure that the 
manageSessions() thread is always running while the CS is enabled, and we can 
pass any exceptions back to the CS so that when the processor makes a call to 
the CS, we can throw the appropriate exception, etc.
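The supervision pattern suggested here, keeping the management thread alive while the service is enabled, capturing its exceptions, and rethrowing them on the next client call instead of dying silently, can be sketched as follows. Names are illustrative, not the LivySessionController API:

```python
import threading

class SupervisedWorker:
    """Background worker that survives failures of its management loop and
    surfaces the last error to callers instead of exiting silently."""

    def __init__(self, manage_once):
        self._manage_once = manage_once
        self._error = None
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def enable(self):
        self._thread.start()

    def disable(self):
        self._stop.set()
        self._thread.join()

    def _run(self):
        while not self._stop.is_set():
            try:
                self._manage_once()
                self._error = None  # a successful pass clears the fault
            except Exception as e:  # keep the thread alive, remember the error
                self._error = e
            self._stop.wait(0.01)

    def get_session(self):
        if self._error is not None:
            raise self._error  # surface the background failure to the caller
        return "session"
```

With this shape, a bad keytab or an HTML-instead-of-JSON response would fail each management pass, but the caller gets a meaningful exception on `get_session()` and the loop recovers automatically once the credentials are fixed.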


> Add convenient SPNEGO/Kerberos authentication support to LivySessionController
> --
>
> Key: NIFI-5041
> URL: https://issues.apache.org/jira/browse/NIFI-5041
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Peter Toth
>Priority: Minor
>
> Livy requires SPNEGO/Kerberos authentication on a secured cluster. Initiating 
> such an authentication from NiFi is viable by providing a 
> java.security.auth.login.config system property 
> (https://docs.oracle.com/javase/8/docs/technotes/guides/security/jgss/lab/part6.html),
>  but this is a bit cumbersome and needs kinit running outside of NiFi.
> An alternative and more sophisticated solution would be to do the SPNEGO 
> negotiation programmatically.
>  * This solution would add some new properties to the LivySessionController 
> to fetch kerberos principal and password/keytab
>  * Add the required HTTP Negotiate header (with an SPNEGO token) to the 
> HttpURLConnection to do the authentication programmatically 
> (https://tools.ietf.org/html/rfc4559)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472197#comment-16472197
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187663098
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.nd4j.linalg.api.ndarray.INDArray;
+import org.nd4j.linalg.factory.Nd4j;
+import com.google.gson.Gson;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"deeplearning4j", "dl4j", "predict", "classification", 
"regression", "deep", "learning"})
+@CapabilityDescription("The DeepLearning4JPredictor predicts one or more 
value(s) based on provided deeplearning4j (https://github.com/deeplearning4j) 
model and the content of a FlowFile. "
++ "The processor supports both classification and regression by 
extracting the record from the FlowFile body and applying the model. "
++ "The processor supports batch by allowing multiple records to be 
passed in the FlowFile body with each record separated by the 'Record 
Separator' property. "
++ "Each record can contain multiple fields with each field separated 
by the 'Field Separator' property."
+)
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractDeepLearning4JProcessor.DEEPLEARNING4J_ERROR_MESSAGE, description = 
"Deeplearning4J error message"),
+@WritesAttribute(attribute = 
AbstractDeepLearning4JProcessor.DEEPLEARNING4J_OUTPUT_SHAPE, description = 
"Deeplearning4J output shape"),
+})
+public class DeepLearning4JPredictor extends 
AbstractDeepLearning4JProcessor {
+
+static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("Successful DeepLearning4j results are routed to 
this relationship").build();
+
+static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("Failed DeepLearning4j results are routed to this 
relationship").build();
+
+protected Gson gson = new Gson();
+
+private static final Set<Relationship> relationships;
+private static final List<PropertyDescriptor> propertyDescriptors;
+static {
+final Set<Relationship> tempRelationships = new HashSet<>();
+tempRelationships.add(REL_SUCCESS);
+

[GitHub] nifi issue #2630: NIFI-5041 Adds SPNEGO authentication to LivySessionControl...

2018-05-11 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2630
  
The changes LGTM, and I tested on a secure cluster, verifying that I could 
connect, get a session, and execute some simple Scala/Spark code. However, when 
I tested with various unhappy paths including no Kerberos Credentials Service 
and a bad keytab, it seems we could be handling these situations better. 

In the first case (no credentials), the /sessions endpoint returns HTML, 
not JSON. This causes a bulletin to be issued, but the flow file is not 
penalized, the processor is not yielded, and the LivyControllerService 
thread that manages the sessions exits, meaning the flow will never proceed 
until the CS is restarted with the correct credentials. This could be 
considered a Livy bug (I didn't see an existing Jira), but we need to handle 
it for now.

I believe something similar happens with a bad keytab, but I didn't trace it 
back to the manageSessions thread. I think we need to ensure that the 
manageSessions() thread is always running while the CS is enabled, and we can 
pass any exceptions back to the CS so that when the processor makes a call to 
the CS, we can throw the appropriate exception, etc.


---


[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187663098
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.nd4j.linalg.api.ndarray.INDArray;
+import org.nd4j.linalg.factory.Nd4j;
+import com.google.gson.Gson;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"deeplearning4j", "dl4j", "predict", "classification", "regression", "deep", "learning"})
+@CapabilityDescription("The DeepLearning4JPredictor predicts one or more value(s) based on provided deeplearning4j (https://github.com/deeplearning4j) model and the content of a FlowFile. "
+    + "The processor supports both classification and regression by extracting the record from the FlowFile body and applying the model. "
+    + "The processor supports batch by allowing multiple records to be passed in the FlowFile body with each record separated by the 'Record Separator' property. "
+    + "Each record can contain multiple fields with each field separated by the 'Field Separator' property."
+)
+@WritesAttributes({
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_ERROR_MESSAGE, description = "Deeplearning4J error message"),
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_OUTPUT_SHAPE, description = "Deeplearning4J output shape"),
+})
+public class DeepLearning4JPredictor extends AbstractDeepLearning4JProcessor {
+
+    static final Relationship REL_SUCCESS = new Relationship.Builder().name("success")
+        .description("Successful DeepLearning4j results are routed to this relationship").build();
+
+    static final Relationship REL_FAILURE = new Relationship.Builder().name("failure")
+        .description("Failed DeepLearning4j results are routed to this relationship").build();
+
+    protected Gson gson = new Gson();
+
+    private static final Set<Relationship> relationships;
+    private static final List<PropertyDescriptor> propertyDescriptors;
+    static {
+        final Set<Relationship> tempRelationships = new HashSet<>();
+        tempRelationships.add(REL_SUCCESS);
+        tempRelationships.add(REL_FAILURE);
+        relationships = Collections.unmodifiableSet(tempRelationships);
+        final List<PropertyDescriptor> tempDescriptors = new ArrayList<>();
+        tempDescriptors.add(MODEL_FILE);
+
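The CapabilityDescription in the quoted diff above explains batch prediction: the FlowFile body is split into records on a 'Record Separator', and each record into fields on a 'Field Separator'. A minimal sketch of that parsing step, assuming newline and comma as the separator values (the processor's actual defaults are not shown in this thread):

```java
import java.util.Arrays;

public class RecordParser {
    // Split a FlowFile body into a 2D feature matrix: one row per record,
    // one column per field. Separator values are illustrative assumptions.
    public static double[][] parse(String body, String recordSep, String fieldSep) {
        String[] records = body.split(recordSep);
        double[][] matrix = new double[records.length][];
        for (int i = 0; i < records.length; i++) {
            matrix[i] = Arrays.stream(records[i].split(fieldSep))
                    .mapToDouble(Double::parseDouble)
                    .toArray();
        }
        return matrix;
    }

    public static void main(String[] args) {
        double[][] m = parse("1.1,0.5,0.5,0.2\n0.3,0.4,0.1,0.9", "\n", ",");
        System.out.println(m.length + "x" + m[0].length); // 2x4
    }
}
```

In the real processor, a matrix like this would be handed to Nd4j (e.g. `Nd4j.create(...)`) before calling the model's output method.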

[GitHub] nifi-minifi-cpp issue #315: MINIFICPP-472 Added date formatting EL functions

2018-05-11 Thread apiri
Github user apiri commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/315
  
@achristianson Started reviewing.  Build failed on my local envs and it 
looks like Travis is reproducing the same.  Also seems to be present for #325.  
Could you fix those up and rebase please?


---


[jira] [Commented] (MINIFICPP-472) Implement date manipulation EL functions

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/MINIFICPP-472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472178#comment-16472178 ]

ASF GitHub Bot commented on MINIFICPP-472:
--

Github user apiri commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/315
  
@achristianson Started reviewing.  Build failed on my local envs and it 
looks like Travis is reproducing the same.  Also seems to be present for #325.  
Could you fix those up and rebase please?


> Implement date manipulation EL functions
> 
>
> Key: MINIFICPP-472
> URL: https://issues.apache.org/jira/browse/MINIFICPP-472
> Project: NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Andrew Christianson
>Assignee: Andrew Christianson
>Priority: Major
>
> [Date 
> Manipulation|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#dates]
>  * 
> [format|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#format]
>  * 
> [toDate|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#todate]
>  * 
> [now|https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#now]
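The NiFi EL functions listed above map naturally onto standard Java date handling. A rough sketch of their semantics, using `java.text.SimpleDateFormat` (the pattern strings are examples, not NiFi's defaults, and the real EL implementation differs in detail):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateElSketch {
    // Rough equivalents of the EL functions above:
    // now()       -> current time
    // toDate(fmt) -> parse a string into a Date using a pattern
    // format(fmt) -> render a Date back into a string
    public static Date now() {
        return new Date();
    }

    public static Date toDate(String value, String pattern) throws ParseException {
        return new SimpleDateFormat(pattern).parse(value);
    }

    public static String format(Date date, String pattern) {
        return new SimpleDateFormat(pattern).format(date);
    }

    public static void main(String[] args) throws ParseException {
        Date d = toDate("2018/05/11", "yyyy/MM/dd");
        System.out.println(format(d, "yyyy-MM-dd")); // 2018-05-11
    }
}
```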



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-3422) Run Once Scheduling

2018-05-11 Thread Joseph Witt (JIRA)

[ https://issues.apache.org/jira/browse/NIFI-3422?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472173#comment-16472173 ]

Joseph Witt commented on NIFI-3422:
---

Another JIRA for this just came in 
https://issues.apache.org/jira/browse/NIFI-3422

It would be cool to support..

> Run Once Scheduling
> ---
>
> Key: NIFI-3422
> URL: https://issues.apache.org/jira/browse/NIFI-3422
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework, Core UI
>Affects Versions: 1.2.0
>Reporter: Nazario
>Priority: Minor
>  Labels: features
>
> A run once scheduling option allows a Processor to run once and then 
> automatically stop.  This is convenient when developing and debugging flows,  
> or when building "visual scripts" for ad-hoc process integration or iterative 
> analytics. Individual processors set to `Run once' can be selected on the 
> canvas with a shift-click.  Then clicking `Start' on the Operate Palette will 
> start those processors which will run once and stop.  Then one can modify 
> processing parameters and repeat.  Interactive analytics in particular 
> benefit from this scheduling mode as they often require human review of 
> results at the end of a flow followed by adjustment of flow parameters before 
> running the next analytic flow.





[jira] [Resolved] (NIFI-5185) Convenient "Run Once" button for scheduled processors

2018-05-11 Thread Joseph Witt (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5185?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joseph Witt resolved NIFI-5185.
---
Resolution: Duplicate

> Convenient "Run Once" button for scheduled processors
> -
>
> Key: NIFI-5185
> URL: https://issues.apache.org/jira/browse/NIFI-5185
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Nicholas Carenza
>Priority: Minor
>
> I have a stateful cron processor that runs once a day. Sometimes though I 
> would like to trigger it to run on-demand. I could change it to use different 
> scheduling temporarily. I would just like a more convenient way to do this.





[jira] [Created] (NIFI-5185) Convenient "Run Once" button for scheduled processors

2018-05-11 Thread Nicholas Carenza (JIRA)
Nicholas Carenza created NIFI-5185:
--

 Summary: Convenient "Run Once" button for scheduled processors
 Key: NIFI-5185
 URL: https://issues.apache.org/jira/browse/NIFI-5185
 Project: Apache NiFi
  Issue Type: Improvement
Reporter: Nicholas Carenza


I have a stateful cron processor that runs once a day. Sometimes though I would 
like to trigger it to run on-demand. I could change it to use different 
scheduling temporarily. I would just like a more convenient way to do this.





[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472154#comment-16472154 ]

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/2686
  
@markap14 - 

Thanks for your prompt review and advice.  

I've updated the code based on your review and am looking forward to your and other members' feedback.

Thanks again.

Mans


> Create deep learning classification and regression processor
> 
>
> Key: NIFI-5166
> URL: https://issues.apache.org/jira/browse/NIFI-5166
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: Learning, classification, deep, regression
> Fix For: 1.7.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> We need a deep learning classification and regression processor.







[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472152#comment-16472152 ]

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187655735
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.nd4j.linalg.api.ndarray.INDArray;
+import org.nd4j.linalg.factory.Nd4j;
+import com.google.gson.Gson;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"deeplearning4j", "dl4j", "predict", "classification", "regression", "deep", "learning"})
+@CapabilityDescription("The DeepLearning4JPredictor predicts one or more value(s) based on provided deeplearning4j (https://github.com/deeplearning4j) model and the content of a FlowFile. "
+    + "The processor supports both classification and regression by extracting the record from the FlowFile body and applying the model. "
+    + "The processor supports batch by allowing multiple records to be passed in the FlowFile body with each record separated by the 'Record Separator' property. "
+    + "Each record can contain multiple fields with each field separated by the 'Field Separator' property."
+)
+@WritesAttributes({
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_ERROR_MESSAGE, description = "Deeplearning4J error message"),
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_OUTPUT_SHAPE, description = "Deeplearning4J output shape"),
+})
+public class DeepLearning4JPredictor extends AbstractDeepLearning4JProcessor {
+
+    static final Relationship REL_SUCCESS = new Relationship.Builder().name("success")
+        .description("Successful DeepLearning4j results are routed to this relationship").build();
+
+    static final Relationship REL_FAILURE = new Relationship.Builder().name("failure")
+        .description("Failed DeepLearning4j results are routed to this relationship").build();
+
+    protected Gson gson = new Gson();
+
+    private static final Set<Relationship> relationships;
+    private static final List<PropertyDescriptor> propertyDescriptors;
+    static {
+        final Set<Relationship> tempRelationships = new HashSet<>();
+        tempRelationships.add(REL_SUCCESS);
+


[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472151#comment-16472151 ]

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187655641
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends AbstractProcessor {
--- End diff --

@markap14 - This is to establish a foundation for future extensions, which will be easier if some common base classes are present. I found this pattern useful and hope it's not overkill.
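The comment above argues for a shared abstract base class so that future deeplearning4j processors can reuse common configuration and model loading. A skeletal illustration of that pattern, with hypothetical names that are not the PR's actual code:

```java
import java.util.Collections;
import java.util.List;

// Hypothetical sketch of the shared-base-class pattern discussed above:
// common configuration and model handling live in the abstract parent;
// concrete processors only add their own behavior.
abstract class AbstractModelProcessor {
    protected String modelPath; // backing field for a shared "Model File" property

    protected void loadModel(String path) {
        this.modelPath = path;  // real code would deserialize a model here
    }

    protected List<String> commonProperties() {
        return Collections.singletonList("Model File");
    }
}

class PredictorProcessor extends AbstractModelProcessor {
    String predict(String record) {
        return "prediction for " + record + " using " + modelPath;
    }
}

public class BaseClassDemo {
    public static void main(String[] args) {
        PredictorProcessor p = new PredictorProcessor();
        p.loadModel("/tmp/model.zip");
        System.out.println(p.predict("1.1,0.5"));
    }
}
```

The trade-off is the usual one: a base class adds indirection now in exchange for less duplication when sibling processors (e.g. a trainer alongside the predictor) arrive later.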






[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187655109
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/test/resources/classification_test.txt
 ---
@@ -0,0 +1,100 @@
+1.1,0.5,0.5,0.2,0
--- End diff --

I've mentioned at the beginning of the tests that these are based on deeplearning4j examples. I generated this very small sample file so that the tests make consistent predictions even with just a few observations.


---
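The sample file discussed above is one labeled record per line: feature values followed by an integer class label, comma-separated (e.g. `1.1,0.5,0.5,0.2,0`). A minimal sketch of generating such a deterministic test file; the file name, feature formula, and record counts are illustrative assumptions, not the PR's actual generator:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

public class TestDataWriter {
    // Build lines like "1.1,0.5,0.5,0.2,0": four feature values plus a
    // trailing class label. Deterministic, so tests are repeatable.
    public static List<String> buildRecords(int perClass) {
        List<String> lines = new ArrayList<>();
        for (int label = 0; label < 2; label++) {
            for (int i = 0; i < perClass; i++) {
                double base = label + i * 0.1;
                lines.add(String.format(Locale.ROOT, "%.1f,%.1f,%.1f,%.1f,%d",
                        base + 1.1, base + 0.5, base + 0.5, base + 0.2, label));
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        Path out = Paths.get("classification_test.txt"); // hypothetical path
        Files.write(out, buildRecords(5));
        System.out.println(Files.readAllLines(out).size()); // 10
    }
}
```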




[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472143#comment-16472143 ]

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654383
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.nd4j.linalg.api.ndarray.INDArray;
+import org.nd4j.linalg.factory.Nd4j;
+import com.google.gson.Gson;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"deeplearning4j", "dl4j", "predict", "classification", "regression", "deep", "learning"})
+@CapabilityDescription("The DeepLearning4JPredictor predicts one or more value(s) based on provided deeplearning4j (https://github.com/deeplearning4j) model and the content of a FlowFile. "
+    + "The processor supports both classification and regression by extracting the record from the FlowFile body and applying the model. "
+    + "The processor supports batch by allowing multiple records to be passed in the FlowFile body with each record separated by the 'Record Separator' property. "
+    + "Each record can contain multiple fields with each field separated by the 'Field Separator' property."
+)
+@WritesAttributes({
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_ERROR_MESSAGE, description = "Deeplearning4J error message"),
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_OUTPUT_SHAPE, description = "Deeplearning4J output shape"),
+})
+public class DeepLearning4JPredictor extends AbstractDeepLearning4JProcessor {
+
+    static final Relationship REL_SUCCESS = new Relationship.Builder().name("success")
+        .description("Successful DeepLearning4j results are routed to this relationship").build();
+
+    static final Relationship REL_FAILURE = new Relationship.Builder().name("failure")
+        .description("Failed DeepLearning4j results are routed to this relationship").build();
+
+    protected Gson gson = new Gson();
+
+    private static final Set<Relationship> relationships;
+    private static final List<PropertyDescriptor> propertyDescriptors;
+    static {
+        final Set<Relationship> tempRelationships = new HashSet<>();
+        tempRelationships.add(REL_SUCCESS);
+

[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472144#comment-16472144 ]

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654464
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.nd4j.linalg.api.ndarray.INDArray;
+import org.nd4j.linalg.factory.Nd4j;
+import com.google.gson.Gson;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"deeplearning4j", "dl4j", "predict", "classification", "regression", "deep", "learning"})
+@CapabilityDescription("The DeepLearning4JPredictor predicts one or more value(s) based on provided deeplearning4j (https://github.com/deeplearning4j) model and the content of a FlowFile. "
+    + "The processor supports both classification and regression by extracting the record from the FlowFile body and applying the model. "
+    + "The processor supports batch by allowing multiple records to be passed in the FlowFile body with each record separated by the 'Record Separator' property. "
+    + "Each record can contain multiple fields with each field separated by the 'Field Separator' property."
+)
+@WritesAttributes({
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_ERROR_MESSAGE, description = "Deeplearning4J error message"),
+    @WritesAttribute(attribute = AbstractDeepLearning4JProcessor.DEEPLEARNING4J_OUTPUT_SHAPE, description = "Deeplearning4J output shape"),
+})
+public class DeepLearning4JPredictor extends AbstractDeepLearning4JProcessor {
+
+    static final Relationship REL_SUCCESS = new Relationship.Builder().name("success")
+        .description("Successful DeepLearning4j results are routed to this relationship").build();
+
+    static final Relationship REL_FAILURE = new Relationship.Builder().name("failure")
+        .description("Failed DeepLearning4j results are routed to this relationship").build();
+
+    protected Gson gson = new Gson();
+
+    private static final Set<Relationship> relationships;
+    private static final List<PropertyDescriptor> propertyDescriptors;
+    static {
+        final Set<Relationship> tempRelationships = new HashSet<>();
+        tempRelationships.add(REL_SUCCESS);

[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654464
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---

[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472142#comment-16472142
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654258
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---

[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654383
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---

[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654258
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---

[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472140#comment-16472140
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654197
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
+public class DeepLearning4JPredictor extends AbstractDeepLearning4JProcessor {
--- End diff --

This is the most intuitive name I could come up with. Please let me know if you have any other recommendations.


> Create deep learning classification and regression processor
> 
>
> Key: NIFI-5166
> URL: https://issues.apache.org/jira/browse/NIFI-5166
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: Learning, classification, deep, regression
> Fix For: 1.7.0
>
>   Original Estimate: 168h
>  

[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187654197
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/DeepLearning4JPredictor.java
 ---
+public class DeepLearning4JPredictor extends AbstractDeepLearning4JProcessor {
--- End diff --

This is the most intuitive name I could come up with. Please let me know if you have any other recommendations.


---


[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472137#comment-16472137
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187653982
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends AbstractProcessor {
+
+    public static final PropertyDescriptor CHARSET = new PropertyDescriptor.Builder()
+        .name("deeplearning4j-charset")
+        .displayName("Character Set")
+        .description("Specifies the character set of the document data.")
+        .required(true)
+        .defaultValue("UTF-8")
+        .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+        .addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+        .build();
+
+    public static final PropertyDescriptor FIELD_SEPARATOR = new PropertyDescriptor.Builder()
+        .name("deeplearning4j-field-separator")
+        .displayName("Field Separator")
+        .description("Specifies the field separator in the records (default is comma).")
+        .required(true)
+        .defaultValue(",")
+        .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+        .build();
+
+    public static final PropertyDescriptor RECORD_SEPARATOR = new PropertyDescriptor.Builder()
+        .name("deeplearning4j-record-separator")
+        .displayName("Record Separator")
+        .description("Specifies the record separator in the message body (defaults to new line).")
+        .required(true)
+        .defaultValue(System.lineSeparator())
+        .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+        .build();
+
+    public static final PropertyDescriptor MODEL_FILE = new PropertyDescriptor.Builder()
+        .name("model-file")
+        .displayName("Model File")
+        .description("Location of the Deeplearning4J model zip file")
+        .required(true)
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+        .addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
+        .build();
+
+    public static final PropertyDescriptor RECORD_DIMENSIONS = new PropertyDescriptor.Builder()
+        .name("deeplearning4j-record-dimension")
+        .displayName("Record dimensions separated by field separator")
+        .description("Dimensions of the array in each record (e.g. 2,4 - a 2x4 array)")
+        .required(true)
+        .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+        .build();
+
+    public static final String

[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472136#comment-16472136
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187653950
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends 
AbstractProcessor {
+
+public static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor FIELD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-field-separator")
+.displayName("Field Separator")
+.description("Specifies the field separator in the records. 
(default is comma)")
+.required(true)
+.defaultValue(",")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-separator")
+.displayName("Record Separator")
+.description("Specifies the records separator in the message 
body. (defaults to new line)")
+.required(true)
+.defaultValue(System.lineSeparator())
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor MODEL_FILE = new 
PropertyDescriptor.Builder()
+.name("model-file")
+.displayName("Model File")
+.description("Location of the Deeplearning4J model zip file")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_DIMENSIONS = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-dimension")
+.displayName("Record dimensions separated by field separator")
+.description("Dimension of array in each a record (eg: 2,4 - a 
2x4 array)")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final String 

[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187653950
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends 
AbstractProcessor {
+
+public static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor FIELD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-field-separator")
+.displayName("Field Separator")
+.description("Specifies the field separator in the records. 
(default is comma)")
+.required(true)
+.defaultValue(",")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-separator")
+.displayName("Record Separator")
+.description("Specifies the records separator in the message 
body. (defaults to new line)")
+.required(true)
+.defaultValue(System.lineSeparator())
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor MODEL_FILE = new 
PropertyDescriptor.Builder()
+.name("model-file")
+.displayName("Model File")
+.description("Location of the Deeplearning4J model zip file")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_DIMENSIONS = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-dimension")
+.displayName("Record dimensions separated by field separator")
+.description("Dimension of array in each a record (eg: 2,4 - a 
2x4 array)")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final String DEEPLEARNING4J_ERROR_MESSAGE = 
"deeplearning4j.error.message";
+
+public static final String DEEPLEARNING4J_OUTPUT_SHAPE = 
"deeplearning4j.output.shape";
+
+protected MultiLayerNetwork model = null;
--- End diff --
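Each property in the quoted diff is declared through NiFi's fluent PropertyDescriptor.Builder pattern. As a rough, self-contained illustration of that builder idiom — a hypothetical stand-in, not NiFi's actual org.apache.nifi.components.PropertyDescriptor, which carries validators, Expression Language scope, and more:

```java
// Hypothetical, minimal mirror of the builder idiom used for CHARSET,
// FIELD_SEPARATOR, etc. in the diff above. Illustration only; the real
// NiFi class has many more options (validators, EL scope, allowable values).
public class PropertySketch {
    final String name;
    final String displayName;
    final String defaultValue;
    final boolean required;

    private PropertySketch(Builder b) {
        this.name = b.name;
        this.displayName = b.displayName;
        this.defaultValue = b.defaultValue;
        this.required = b.required;
    }

    static class Builder {
        private String name;
        private String displayName;
        private String defaultValue;
        private boolean required;

        Builder name(String name) { this.name = name; return this; }
        Builder displayName(String displayName) { this.displayName = displayName; return this; }
        Builder defaultValue(String defaultValue) { this.defaultValue = defaultValue; return this; }
        Builder required(boolean required) { this.required = required; return this; }

        PropertySketch build() {
            // Builders typically validate mandatory fields at build() time.
            if (name == null) {
                throw new IllegalStateException("name is mandatory");
            }
            return new PropertySketch(this);
        }
    }

    public static void main(String[] args) {
        // Mirrors the CHARSET declaration from the diff.
        PropertySketch charset = new Builder()
                .name("deeplearning4j-charset")
                .displayName("Character Set")
                .defaultValue("UTF-8")
                .required(true)
                .build();
        System.out.println(charset.name + "=" + charset.defaultValue);
    }
}
```

The build-time check is why a missing mandatory field fails fast when the descriptor is constructed rather than at first use.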

[GitHub] nifi pull request #2686: NIFI-5166 - Deep learning classification and regres...

2018-05-11 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187653982
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends 
AbstractProcessor {
+
+public static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor FIELD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-field-separator")
+.displayName("Field Separator")
+.description("Specifies the field separator in the records. 
(default is comma)")
+.required(true)
+.defaultValue(",")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-separator")
+.displayName("Record Separator")
+.description("Specifies the records separator in the message 
body. (defaults to new line)")
+.required(true)
+.defaultValue(System.lineSeparator())
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor MODEL_FILE = new 
PropertyDescriptor.Builder()
+.name("model-file")
+.displayName("Model File")
+.description("Location of the Deeplearning4J model zip file")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_DIMENSIONS = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-dimension")
+.displayName("Record dimensions separated by field separator")
+.description("Dimension of array in each a record (eg: 2,4 - a 
2x4 array)")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final String DEEPLEARNING4J_ERROR_MESSAGE = 
"deeplearning4j.error.message";
+
+public static final String DEEPLEARNING4J_OUTPUT_SHAPE = 
"deeplearning4j.output.shape";
+
+protected MultiLayerNetwork model = null;
+
+

[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472131#comment-16472131
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187653485
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends 
AbstractProcessor {
+
+public static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor FIELD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-field-separator")
+.displayName("Field Separator")
+.description("Specifies the field separator in the records. 
(default is comma)")
--- End diff --

Corrected.


> Create deep learning classification and regression processor
> 
>
> Key: NIFI-5166
> URL: https://issues.apache.org/jira/browse/NIFI-5166
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: Learning, classification,, deep, regression,
> Fix For: 1.7.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> We need a deep learning classification and regression processor.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
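The RECORD_DIMENSIONS property quoted in the diff excerpts above takes the array shape as separator-delimited integers (e.g. "2,4" for a 2x4 array). A minimal sketch of how such a value might be parsed against the configured field separator — the method name and approach here are assumptions for illustration, not the PR's actual code:

```java
import java.util.Arrays;
import java.util.regex.Pattern;

// Hypothetical sketch: split a dimensions string like "2,4" on the
// configured field separator and parse each piece as an int. Illustration
// only; the processor in PR #2686 may implement this differently.
public class DimensionSketch {
    static int[] parseDimensions(String value, String fieldSeparator) {
        // Pattern.quote so separators like "|" are treated literally,
        // not as regex metacharacters.
        return Arrays.stream(value.split(Pattern.quote(fieldSeparator)))
                .map(String::trim)
                .mapToInt(Integer::parseInt)
                .toArray();
    }

    public static void main(String[] args) {
        int[] dims = parseDimensions("2,4", ",");  // the "2x4 array" example
        System.out.println(Arrays.toString(dims));
    }
}
```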


[jira] [Commented] (NIFI-5166) Create deep learning classification and regression processor

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472132#comment-16472132
 ] 

ASF GitHub Bot commented on NIFI-5166:
--

Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2686#discussion_r187653539
  
--- Diff: 
nifi-nar-bundles/nifi-deeplearning4j-bundle/nifi-deeplearning4j-processors/src/main/java/org/apache/nifi/processors/deeplearning4j/AbstractDeepLearning4JProcessor.java
 ---
@@ -0,0 +1,106 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.deeplearning4j;
+import java.io.IOException;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
+import org.deeplearning4j.util.ModelSerializer;
+
+/**
+ * Base class for deeplearning4j processors
+ */
+public abstract class AbstractDeepLearning4JProcessor extends 
AbstractProcessor {
+
+public static final PropertyDescriptor CHARSET = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-charset")
+.displayName("Character Set")
+.description("Specifies the character set of the document 
data.")
+.required(true)
+.defaultValue("UTF-8")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.CHARACTER_SET_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor FIELD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-field-separator")
+.displayName("Field Separator")
+.description("Specifies the field separator in the records. 
(default is comma)")
+.required(true)
+.defaultValue(",")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor RECORD_SEPARATOR = new 
PropertyDescriptor.Builder()
+.name("deeplearning4j-record-separator")
+.displayName("Record Separator")
+.description("Specifies the records separator in the message 
body. (defaults to new line)")
--- End diff --

Corrected


> Create deep learning classification and regression processor
> 
>
> Key: NIFI-5166
> URL: https://issues.apache.org/jira/browse/NIFI-5166
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: Learning, classification,, deep, regression,
> Fix For: 1.7.0
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> We need a deep learning classification and regression processor.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)



[jira] [Updated] (NIFI-5041) Add convenient SPNEGO/Kerberos authentication support to LivySessionController

2018-05-11 Thread Matt Burgess (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-5041:
---
Affects Version/s: (was: 1.5.0)
   Status: Patch Available  (was: Open)

> Add convenient SPNEGO/Kerberos authentication support to LivySessionController
> --
>
> Key: NIFI-5041
> URL: https://issues.apache.org/jira/browse/NIFI-5041
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Peter Toth
>Priority: Minor
>
> Livy requires SPNEGO/Kerberos authentication on a secured cluster. Initiating 
> such an authentication from NiFi is viable by providing a 
> java.security.auth.login.config system property 
> (https://docs.oracle.com/javase/8/docs/technotes/guides/security/jgss/lab/part6.html),
>  but this is a bit cumbersome and needs kinit running outside of NiFi.
> An alternative and more sophisticated solution would be to do the SPNEGO 
> negotiation programmatically.
>  * This solution would add some new properties to the LivySessionController 
> to fetch the Kerberos principal and password/keytab
>  * Add the required HTTP Negotiate header (with an SPNEGO token) to the 
> HttpURLConnection to do the authentication programmatically 
> (https://tools.ietf.org/html/rfc4559)
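The programmatic negotiation described above comes down to attaching an RFC 4559 `Authorization: Negotiate <base64 token>` header to the HttpURLConnection. A hedged sketch of the header construction only — the token bytes would come from a JGSS/SPNEGO context (GSSContext.initSecContext), but producing a real token requires a Kerberos environment, so a placeholder byte array stands in here:

```java
import java.util.Base64;

// Sketch of the RFC 4559 header shape for SPNEGO over HTTP.
// Assumption: the caller already obtained a token from JGSS
// (GSSContext.initSecContext); the bytes below are a placeholder,
// NOT a real SPNEGO token.
public class NegotiateHeaderSketch {
    static String negotiateHeader(byte[] spnegoToken) {
        return "Negotiate " + Base64.getEncoder().encodeToString(spnegoToken);
    }

    public static void main(String[] args) {
        byte[] fakeToken = {0x60, 0x06};  // placeholder bytes for illustration
        // In the controller service this would be applied via
        // connection.setRequestProperty("Authorization", negotiateHeader(token));
        System.out.println(negotiateHeader(fakeToken));
    }
}
```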



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Greg Senia (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472080#comment-16472080
 ] 

Greg Senia commented on NIFI-5184:
--

[https://www.oracle.com/search/results?Ntt=native%20library%20classloader=1=1=bugs=S3]

 

There are the rest of the native lib bugs/mostly feature requests to handle 
multiple

 

> Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs
> -
>
> Key: NIFI-5184
> URL: https://issues.apache.org/jira/browse/NIFI-5184
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Greg Senia
>Priority: Major
>
> When attempting to use a local IBM QMGR with MQ Bindings, IBM states that 
> within the JDK only one instance of the native library can be loaded. I've 
> worked around this issue by symlinking the IBM MQ libs into nifi/lib/, which 
> is not a good solution. Wondering if this is a known issue with NiFi and the 
> NAR classloader functions, or if this is something that can be fixed so that 
> NiFi can correctly work with IBM MQ and MQ Bindings.
>  
> This only occurs after you disable and then re-enable the controller:
>  
> 2018-05-10 21:10:34,865 ERROR [Timer-Driven Process Thread-2] 
> o.apache.nifi.jms.processors.ConsumeJMS ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] failed to process session due 
> to org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.: {}
> org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.
>  at 
> org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:316)
>  at 
> org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:169)
>  at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:497)
>  at org.apache.nifi.jms.processors.JMSConsumer.consume(JMSConsumer.java:66)
>  at 
> org.apache.nifi.jms.processors.ConsumeJMS.rendezvousWithJms(ConsumeJMS.java:156)
>  at 
> org.apache.nifi.jms.processors.AbstractJMSProcessor.onTrigger(AbstractJMSProcessor.java:147)
>  at org.apache.nifi.jms.processors.ConsumeJMS.onTrigger(ConsumeJMS.java:58)
>  at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>  at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
> Caused by: com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An 
> exception occurred in the Java(tm) MQI.
>  at sun.reflect.GeneratedConstructorAccessor192.newInstance(Unknown Source)
>  at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>  at 
> com.ibm.msg.client.commonservices.j2se.NLSServices.createException(NLSServices.java:319)
>  at 
> 

[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Greg Senia (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472062#comment-16472062
 ] 

Greg Senia commented on NIFI-5184:
--

[~joewitt] here is the original bug that folks chased way back when, when I was 
just getting involved with WAS 3.5 on Solaris:  
[https://bugs.java.com/view_bug.do?bug_id=4225434]

 

 

fred.oliver@East 2001-08-03 Will not fix. There is no API to force the 
collection of any dead object to occur at any particular time. The ability of 
the VM to force the unloading of a shared library is not something that can be 
guaranteed on every platform. Anand's workaround (above) seems practical.

> Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs
> -
>
> Key: NIFI-5184
> URL: https://issues.apache.org/jira/browse/NIFI-5184
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Greg Senia
>Priority: Major
>
> When attempting to use a local IBM QMGR with MQ Bindings, IBM states that 
> within the JDK only one instance of the native library can be loaded. I've 
> worked around this issue by symlinking the IBM MQ libs into nifi/lib/, which 
> is not a good solution. Wondering if this is a known issue with NiFi and the 
> NAR classloader functions, or if this is something that can be fixed so that 
> NiFi can correctly work with IBM MQ and MQ Bindings.
>  
> This only occurs after you disable and then re-enable the controller:
>  
> 2018-05-10 21:10:34,865 ERROR [Timer-Driven Process Thread-2] 
> o.apache.nifi.jms.processors.ConsumeJMS ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] failed to process session due 
> to org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.: {}
> org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.
>  at 
> org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:316)
>  at 
> org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:169)
>  at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:497)
>  at org.apache.nifi.jms.processors.JMSConsumer.consume(JMSConsumer.java:66)
>  at 
> org.apache.nifi.jms.processors.ConsumeJMS.rendezvousWithJms(ConsumeJMS.java:156)
>  at 
> org.apache.nifi.jms.processors.AbstractJMSProcessor.onTrigger(AbstractJMSProcessor.java:147)
>  at org.apache.nifi.jms.processors.ConsumeJMS.onTrigger(ConsumeJMS.java:58)
>  at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>  at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
> Caused by: com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An 
> exception occurred in the Java(tm) MQI.
>  at sun.reflect.GeneratedConstructorAccessor192.newInstance(Unknown Source)
>  at 
> 

[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472051#comment-16472051
 ] 

Joseph Witt commented on NIFI-5184:
---

[~gss2002] there are some client libs now, written by wonderful people I will 
hug if I ever meet them, that prefix the loading of the native lib so it gets a 
unique name and can be loaded multiple times, playing nicely across class 
loaders.  In fairness, though, this feels like something that should be fixed 
in the JVM itself, so that native libs are bound to the classloader which 
loaded them (but I'm fairly ignorant of the details of what happens under the 
covers there, so maybe that isn't feasible).
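The "prefix the native lib" trick described above can be sketched with plain JDK calls: copy the shared library to a uniquely named file before handing it to `System.load()`, so the JVM never registers the same library path twice. A minimal sketch, assuming nothing about the actual client libs in question; `UniqueNativeLibLoader`, `copyWithUniqueName`, and `loadIsolated` are illustrative names, not code from any real library.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

/**
 * Illustrative sketch: give each ClassLoader its own uniquely named copy of
 * a native library so the JVM does not see the same library name twice.
 */
public class UniqueNativeLibLoader {

    /** Copies the library to a temp file with a unique name and returns the copy. */
    static Path copyWithUniqueName(Path lib) throws IOException {
        String original = lib.getFileName().toString();
        // createTempFile guarantees a unique on-disk name per call, so two
        // loads of the same library get two distinct registered paths.
        Path copy = Files.createTempFile("natlib-", "-" + original);
        Files.copy(lib, copy, StandardCopyOption.REPLACE_EXISTING);
        copy.toFile().deleteOnExit();
        return copy;
    }

    /** Loads the library via its unique copy, sidestepping the
     *  "already loaded in another classloader" UnsatisfiedLinkError. */
    static void loadIsolated(Path lib) throws IOException {
        System.load(copyWithUniqueName(lib).toAbsolutePath().toString());
    }

    public static void main(String[] args) throws IOException {
        // Demo with a dummy file (no real JNI load attempted here).
        Path demo = Files.createTempFile("libdemo", ".so");
        System.out.println("copied to: " + copyWithUniqueName(demo).getFileName());
    }
}
```

This is also roughly why the nifi/lib/ symlink workaround "works": the library is then loaded once, by the root classloader, instead of repeatedly by per-NAR loaders.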


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Greg Senia (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472044#comment-16472044
 ] 

Greg Senia commented on NIFI-5184:
--

[~joewitt] and [~bende] that sounds like a valid solution. I know from my days 
of running WAS/Tomcat/JBoss with MQ and other native libs that there was no way 
around it other than having only one version of MQ on a node in a process at a 
time, or, for a vendor app that used a native lib, only one version of the 
app/native lib per JVM.


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Bryan Bende (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472016#comment-16472016
 ] 

Bryan Bende commented on NIFI-5184:
---

Correct, that is what I was trying to say in my last sentence... basically you 
should be able to have one JMS service or one HDFS processor that uses native 
libs, but more than one is problematic because of what you said.


[jira] [Created] (MINIFICPP-494) Resolve C2 issues with memory access

2018-05-11 Thread marco polo (JIRA)
marco polo created MINIFICPP-494:


 Summary: Resolve C2 issues with memory access
 Key: MINIFICPP-494
 URL: https://issues.apache.org/jira/browse/MINIFICPP-494
 Project: NiFi MiNiFi C++
  Issue Type: Sub-task
Reporter: marco polo
Assignee: marco polo






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472008#comment-16472008
 ] 

Joseph Witt commented on NIFI-5184:
---

specifically, the hadoop/jms example you noted would still not work because the 
client libs (hadoop/jms) in this case are not written with classloader-friendly 
behavior in mind (since native libs are JVM-wide by name), right?


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Bryan Bende (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16472006#comment-16472006
 ] 

Bryan Bende commented on NIFI-5184:
---

In this case I think changing JmsConnectionFactoryProvider to remove all of the 
class-loading code inside the class, and then changing the client lib dir path 
property to use dynamicallyModifiesClasspath(true), might help...

The difference is that by using the framework level capabilities it will only 
modify the classpath when the value of the property changes, instead of on 
every stop/start, and when it does change it will close the previous 
ClassLoader and create a new one, so hopefully the native libs won't be seen as 
already loaded.

The case that still won't work is if you try to have two or more instances of 
JmsConnectionFactoryProvider and they all want to use client libs, this will be 
the same as the Hadoop scenario and won't work.
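For reference, the framework hook described above is a flag on the property descriptor itself. A minimal sketch of what such a property might look like; the property name and description are guesses rather than the actual JmsConnectionFactoryProvider source, and the fragment depends on nifi-api, so it is shown as a configuration sketch rather than runnable code.

```java
// Sketch only, not the real JmsConnectionFactoryProvider: a client-lib path
// property that delegates classpath handling to the NiFi framework.
public static final PropertyDescriptor CLIENT_LIB_DIR_PATH = new PropertyDescriptor.Builder()
        .name("client-lib-dir-path")              // illustrative name
        .displayName("Client Libraries Path")
        .description("Directory containing the JMS provider client JARs")
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        // With this flag the framework rebuilds (and closes) the instance
        // ClassLoader only when the value changes, not on every
        // disable/enable, so a previously loaded native lib is less likely
        // to be seen as "already loaded" on re-enable.
        .dynamicallyModifiesClasspath(true)
        .build();
```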


[jira] [Resolved] (NIFI-5049) Fix handling of Phonenix datetime columns in QueryDatabaseTable and GenerateTableFetch

2018-05-11 Thread Mike Thomsen (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5049?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Thomsen resolved NIFI-5049.

Resolution: Fixed

> Fix handling of Phonenix datetime columns in QueryDatabaseTable and 
> GenerateTableFetch
> --
>
> Key: NIFI-5049
> URL: https://issues.apache.org/jira/browse/NIFI-5049
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Gardella Juan Pablo
>Assignee: Matt Burgess
>Priority: Major
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> QueryDatabaseAdapter does not work against a Phoenix DB when it has to 
> convert a TIMESTAMP. The error is described here:
> [https://stackoverflow.com/questions/45989678/convert-varchar-to-timestamp-in-hbase]
> Basically, it's required to use TO_TIMESTAMP(MAX_COLUMN) to make it work. 
> See 
> [https://lists.apache.org/thread.html/%3cca+kifscje8ay+uxt_d_vst4qgzf4jxwovboynjgztt4dsbs...@mail.gmail.com%3E]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5049) Fix handling of Phonenix datetime columns in QueryDatabaseTable and GenerateTableFetch

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471966#comment-16471966
 ] 

ASF GitHub Bot commented on NIFI-5049:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2625


> Fix handling of Phonenix datetime columns in QueryDatabaseTable and 
> GenerateTableFetch
> --
>
> Key: NIFI-5049
> URL: https://issues.apache.org/jira/browse/NIFI-5049
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Gardella Juan Pablo
>Assignee: Matt Burgess
>Priority: Major
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> QueryDatabaseAdapter does not work against Phoenix DB if it should convert 
> TIMESTAMP. The error is described below:
> [https://stackoverflow.com/questions/45989678/convert-varchar-to-timestamp-in-hbase]
> Basically, it's required to use TO_TIMESTAMP(MAX_COLUMN) to make it work. 
> See 
> [https://lists.apache.org/thread.html/%3cca+kifscje8ay+uxt_d_vst4qgzf4jxwovboynjgztt4dsbs...@mail.gmail.com%3E]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5049) Fix handling of Phonenix datetime columns in QueryDatabaseTable and GenerateTableFetch

2018-05-11 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471964#comment-16471964
 ] 

ASF subversion and git services commented on NIFI-5049:
---

Commit 64356e001432b78d919604d13787ea7b40e80e8e in nifi's branch 
refs/heads/master from [~gardellajuanpablo]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=64356e0 ]

NIFI-5049 Fix handling of Phonenix datetime columns

This closes #2625

Signed-off-by: Mike Thomsen 


> Fix handling of Phonenix datetime columns in QueryDatabaseTable and 
> GenerateTableFetch
> --
>
> Key: NIFI-5049
> URL: https://issues.apache.org/jira/browse/NIFI-5049
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Gardella Juan Pablo
>Assignee: Matt Burgess
>Priority: Major
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> QueryDatabaseAdapter does not work against Phoenix DB if it should convert 
> TIMESTAMP. The error is described below:
> [https://stackoverflow.com/questions/45989678/convert-varchar-to-timestamp-in-hbase]
> Basically, it's required to use TO_TIMESTAMP(MAX_COLUMN) to make it work. 
> See 
> [https://lists.apache.org/thread.html/%3cca+kifscje8ay+uxt_d_vst4qgzf4jxwovboynjgztt4dsbs...@mail.gmail.com%3E]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5121) DBCPService should support passing in an attribute map when obtaining a connection

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471967#comment-16471967
 ] 

ASF GitHub Bot commented on NIFI-5121:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2658


> DBCPService should support passing in an attribute map when obtaining a 
> connection
> --
>
> Key: NIFI-5121
> URL: https://issues.apache.org/jira/browse/NIFI-5121
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Bryan Bende
>Assignee: Matt Burgess
>Priority: Minor
> Fix For: 1.7.0
>
>
> Many users have asked for a way to obtain dynamic database connections. 
> Essentially being able to use the existing SQL processors like PutSQL, etc, 
> and be able to pass in flow file attributes to the DBCPService to obtain a 
> connection based on the attributes.
> The current DBCPService interface has a single method:
> {code:java}
> Connection getConnection(){code}
> Since there is no way for a processor to pass in any information, we can add 
> an additional method to this interface and make the interface like this:
> {code:java}
> Connection getConnection(Map<String, String> attributes)
> default Connection getConnection() {
>   return getConnection(Collections.emptyMap());
> }{code}
> This would leave it up to the implementations of DBCPService interface to 
> decide if they want to use the attributes map for anything.
> The DBCPConnectionPool would not use the attributes map and would continue to 
> provide a fixed connection pool against a single data source.
> A new implementation can then be created that somehow maintains multiple 
> connection pools, or creates connections on the fly.
> The PropertyDescriptors in each implementation should indicate how they use 
> expression language.
> For example, since DBCPConnectionPool will not use the attribute map, its 
> property descriptors will indicate expression language scope as variable 
> registry only:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY){code}
> The dynamic implementations should indicate:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES){code}
>  
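The proposed two-method interface can be sketched as follows. This is a simplified stand-in, not NiFi's actual DBCPService (which declares different exceptions), and the `db.url` attribute name and class names are purely illustrative:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Collections;
import java.util.Map;

// Sketch of the proposed two-method service API: the default method keeps
// existing implementations source-compatible, while new implementations may
// honor the attribute map.
interface AttributeAwareDBCPService {
    Connection getConnection(Map<String, String> attributes) throws SQLException;

    default Connection getConnection() throws SQLException {
        return getConnection(Collections.emptyMap());
    }
}

// Hypothetical dynamic implementation: derives the JDBC URL from a flow file
// attribute ("db.url" is an illustrative attribute name, not from the PR).
class DynamicConnectionService implements AttributeAwareDBCPService {
    private final String defaultUrl;

    DynamicConnectionService(String defaultUrl) {
        this.defaultUrl = defaultUrl;
    }

    // Exposed separately so the URL-selection logic is testable without a driver.
    String resolveUrl(Map<String, String> attributes) {
        return attributes.getOrDefault("db.url", defaultUrl);
    }

    @Override
    public Connection getConnection(Map<String, String> attributes) throws SQLException {
        return DriverManager.getConnection(resolveUrl(attributes));
    }
}
```

The default method is what makes the change backward compatible: DBCPConnectionPool can ignore the map entirely while dynamic implementations key connection pools off attribute values.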



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2658: NIFI-5121: Added DBCPService API method for passing...

2018-05-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2658


---


[GitHub] nifi pull request #2625: NIFI-5049 Fix handling of Phonenix datetime columns

2018-05-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2625


---


[jira] [Commented] (NIFI-5184) Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs

2018-05-11 Thread Greg Senia (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5184?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471914#comment-16471914
 ] 

Greg Senia commented on NIFI-5184:
--

[~joewitt] seems like NIFI-3673 has a similar issue and the user calls it out 
but never responds:

 

Byunghwa Yun added a comment - 13/Apr/17 03:54
Bryan Bende It's working. Thanks for your efforts.
But I got another classloader problem: NiFi doesn't load the Hadoop native 
library.
I attach the log. Thank you.

2017-04-13 16:49:26,430 DEBUG [StandardProcessScheduler Thread-6] 
org.apache.hadoop.util.NativeCodeLoader Trying to load the custom-built 
native-hadoop library...
2017-04-13 16:49:26,430 DEBUG [StandardProcessScheduler Thread-6] 
*org.apache.hadoop.util.NativeCodeLoader Failed to load native-hadoop with 
error: java.lang.UnsatisfiedLinkError: Native Library 
/home/hadoop/hadoop-2.6.0-cdh5.5.1/lib/native/libhadoop.so already loaded in 
another classloader*
*2017-04-13 16:49:26,430 DEBUG [StandardProcessScheduler Thread-6]* 
org.apache.hadoop.util.NativeCodeLoader 
java.library.path=/home/hadoop/hdfs/lib/native
2017-04-13 16:49:26,430 WARN [StandardProcessScheduler Thread-6] 
org.apache.hadoop.util.NativeCodeLoader Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable

> Nifi JMS Controller Disable/Enable causes Classloader Issues with Native Libs
> -
>
> Key: NIFI-5184
> URL: https://issues.apache.org/jira/browse/NIFI-5184
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Greg Senia
>Priority: Major
>
> When attempting to use a local IBM QMGR with MQ bindings, IBM states that 
> within the JDK only one instance of the native library can be loaded. I've 
> worked around this issue by symlinking the IBM MQ libs into nifi/lib/, which 
> is not a good solution. Wondering if this is a known issue with NiFi and the 
> NAR classloader functions, or if this is something that can be fixed so that 
> NiFi can correctly work with IBM MQ and MQ bindings.
>  
> This only occurs after you disable and then re-enable the controller:
>  
> 2018-05-10 21:10:34,865 ERROR [Timer-Driven Process Thread-2] 
> o.apache.nifi.jms.processors.ConsumeJMS ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] ConsumeJMS - 
> JMSConsumer[destination:null; pub-sub:false;] failed to process session due 
> to org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.: {}
> org.springframework.jms.UncategorizedJmsException: Uncategorized exception 
> occured during JMS processing; nested exception is 
> com.ibm.msg.client.jms.DetailedJMSException: JMSFMQ6312: An exception 
> occurred in the Java(tm) MQI.
> The Java(tm) MQI has thrown an exception describing the problem. 
> See the linked exception for further information.; nested exception is 
> com.ibm.mq.jmqi.local.LocalMQ$4: CC=2;RC=2495;AMQ8598: Failed to load the 
> WebSphere MQ native JNI library: 'mqjbnd'.
>  at 
> org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:316)
>  at 
> org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:169)
>  at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:497)
>  at org.apache.nifi.jms.processors.JMSConsumer.consume(JMSConsumer.java:66)
>  at 
> org.apache.nifi.jms.processors.ConsumeJMS.rendezvousWithJms(ConsumeJMS.java:156)
>  at 
> org.apache.nifi.jms.processors.AbstractJMSProcessor.onTrigger(AbstractJMSProcessor.java:147)
>  at org.apache.nifi.jms.processors.ConsumeJMS.onTrigger(ConsumeJMS.java:58)
>  at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> 

[jira] [Updated] (NIFI-5121) DBCPService should support passing in an attribute map when obtaining a connection

2018-05-11 Thread Matt Burgess (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5121?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-5121:
---
Fix Version/s: 1.7.0

> DBCPService should support passing in an attribute map when obtaining a 
> connection
> --
>
> Key: NIFI-5121
> URL: https://issues.apache.org/jira/browse/NIFI-5121
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Bryan Bende
>Assignee: Matt Burgess
>Priority: Minor
> Fix For: 1.7.0
>
>
> Many users have asked for a way to obtain dynamic database connections. 
> Essentially being able to use the existing SQL processors like PutSQL, etc, 
> and be able to pass in flow file attributes to the DBCPService to obtain a 
> connection based on the attributes.
> The current DBCPService interface has a single method:
> {code:java}
> Connection getConnection(){code}
> Since there is no way for a processor to pass in any information, we can add 
> an additional method to this interface and make the interface like this:
> {code:java}
> Connection getConnection(Map<String, String> attributes)
> default Connection getConnection() {
>   return getConnection(Collections.emptyMap());
> }{code}
> This would leave it up to the implementations of DBCPService interface to 
> decide if they want to use the attributes map for anything.
> The DBCPConnectionPool would not use the attributes map and would continue to 
> provide a fixed connection pool against a single data source.
> A new implementation can then be created that somehow maintains multiple 
> connection pools, or creates connections on the fly.
> The PropertyDescriptors in each implementation should indicate how they use 
> expression language.
> For example, since DBCPConnectionPool will not use the attribute map, its 
> property descriptors will indicate expression language scope as variable 
> registry only:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY){code}
> The dynamic implementations should indicate:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES){code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5121) DBCPService should support passing in an attribute map when obtaining a connection

2018-05-11 Thread Mike Thomsen (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5121?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Thomsen updated NIFI-5121:
---
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> DBCPService should support passing in an attribute map when obtaining a 
> connection
> --
>
> Key: NIFI-5121
> URL: https://issues.apache.org/jira/browse/NIFI-5121
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Bryan Bende
>Assignee: Matt Burgess
>Priority: Minor
>
> Many users have asked for a way to obtain dynamic database connections. 
> Essentially being able to use the existing SQL processors like PutSQL, etc, 
> and be able to pass in flow file attributes to the DBCPService to obtain a 
> connection based on the attributes.
> The current DBCPService interface has a single method:
> {code:java}
> Connection getConnection(){code}
> Since there is no way for a processor to pass in any information, we can add 
> an additional method to this interface and make the interface like this:
> {code:java}
> Connection getConnection(Map<String, String> attributes)
> default Connection getConnection() {
>   return getConnection(Collections.emptyMap());
> }{code}
> This would leave it up to the implementations of DBCPService interface to 
> decide if they want to use the attributes map for anything.
> The DBCPConnectionPool would not use the attributes map and would continue to 
> provide a fixed connection pool against a single data source.
> A new implementation can then be created that somehow maintains multiple 
> connection pools, or creates connections on the fly.
> The PropertyDescriptors in each implementation should indicate how they use 
> expression language.
> For example, since DBCPConnectionPool will not use the attribute map, its 
> property descriptors will indicate expression language scope as variable 
> registry only:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY){code}
> The dynamic implementations should indicate:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES){code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5121) DBCPService should support passing in an attribute map when obtaining a connection

2018-05-11 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5121?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471819#comment-16471819
 ] 

ASF subversion and git services commented on NIFI-5121:
---

Commit 099bfcdf3a5873a311312eb7e9e85b7b22ef1b98 in nifi's branch 
refs/heads/master from [~ca9mbu]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=099bfcd ]

NIFI-5121: Added DBCPService API method for passing in flow file attributes 
when available

This closes #2658

Signed-off-by: Mike Thomsen 


> DBCPService should support passing in an attribute map when obtaining a 
> connection
> --
>
> Key: NIFI-5121
> URL: https://issues.apache.org/jira/browse/NIFI-5121
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Bryan Bende
>Assignee: Matt Burgess
>Priority: Minor
>
> Many users have asked for a way to obtain dynamic database connections. 
> Essentially being able to use the existing SQL processors like PutSQL, etc, 
> and be able to pass in flow file attributes to the DBCPService to obtain a 
> connection based on the attributes.
> The current DBCPService interface has a single method:
> {code:java}
> Connection getConnection(){code}
> Since there is no way for a processor to pass in any information, we can add 
> an additional method to this interface and make the interface like this:
> {code:java}
> Connection getConnection(Map<String, String> attributes)
> default Connection getConnection() {
>   return getConnection(Collections.emptyMap());
> }{code}
> This would leave it up to the implementations of DBCPService interface to 
> decide if they want to use the attributes map for anything.
> The DBCPConnectionPool would not use the attributes map and would continue to 
> provide a fixed connection pool against a single data source.
> A new implementation can then be created that somehow maintains multiple 
> connection pools, or creates connections on the fly.
> The PropertyDescriptors in each implementation should indicate how they use 
> expression language.
> For example, since DBCPConnectionPool will not use the attribute map, its 
> property descriptors will indicate expression language scope as variable 
> registry only:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY){code}
> The dynamic implementations should indicate:
> {code:java}
> .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES){code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4637) Add support for HBase visibility labels to HBase processors and controller services

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471803#comment-16471803
 ] 

ASF GitHub Bot commented on NIFI-4637:
--

Github user anoopsjohn commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2518#discussion_r187594182
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-hbase_1_1_2-client-service-bundle/nifi-hbase_1_1_2-client-service/src/main/java/org/apache/nifi/hbase/HBase_1_1_2_ClientService.java
 ---
@@ -336,51 +346,85 @@ public void shutdown() {
 }
 }
 
+private static final byte[] EMPTY_VIS_STRING;
+
+static {
+try {
+EMPTY_VIS_STRING = "".getBytes("UTF-8");
+} catch (UnsupportedEncodingException e) {
+throw new RuntimeException(e);
+}
+}
+
+private List<Put> buildPuts(byte[] rowKey, List<PutColumn> columns) {
+List<Put> retVal = new ArrayList<>();
+
+try {
+Put put = null;
+
+for (final PutColumn column : columns) {
--- End diff --

In the list of PutColumn I can have 1st column having 'x' as visibility and 
next one with 'y' and 3rd one again with 'x'. So here you will create 3 Put 
objects I guess. Would be better if we map 'columns' into a visibility vs 
columns data structure and then for each of visibility create one Put (?)  


> Add support for HBase visibility labels to HBase processors and controller 
> services
> ---
>
> Key: NIFI-4637
> URL: https://issues.apache.org/jira/browse/NIFI-4637
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> HBase supports visibility labels, but you can't use them from NiFi because 
> there is no way to set them. The existing processors and services should be 
> upgraded to handle this capability.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2518: NIFI-4637 Added support for visibility labels to th...

2018-05-11 Thread anoopsjohn
Github user anoopsjohn commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2518#discussion_r187594182
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-hbase_1_1_2-client-service-bundle/nifi-hbase_1_1_2-client-service/src/main/java/org/apache/nifi/hbase/HBase_1_1_2_ClientService.java
 ---
@@ -336,51 +346,85 @@ public void shutdown() {
 }
 }
 
+private static final byte[] EMPTY_VIS_STRING;
+
+static {
+try {
+EMPTY_VIS_STRING = "".getBytes("UTF-8");
+} catch (UnsupportedEncodingException e) {
+throw new RuntimeException(e);
+}
+}
+
+private List<Put> buildPuts(byte[] rowKey, List<PutColumn> columns) {
+List<Put> retVal = new ArrayList<>();
+
+try {
+Put put = null;
+
+for (final PutColumn column : columns) {
--- End diff --

In the list of PutColumn I can have 1st column having 'x' as visibility and 
next one with 'y' and 3rd one again with 'x'. So here you will create 3 Put 
objects I guess. Would be better if we map 'columns' into a visibility vs 
columns data structure and then for each of visibility create one Put (?)  


---


[jira] [Commented] (NIFI-4637) Add support for HBase visibility labels to HBase processors and controller services

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471798#comment-16471798
 ] 

ASF GitHub Bot commented on NIFI-4637:
--

Github user anoopsjohn commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2518#discussion_r187593742
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-hbase_1_1_2-client-service-bundle/nifi-hbase_1_1_2-client-service/src/main/java/org/apache/nifi/hbase/HBase_1_1_2_ClientService.java
 ---
@@ -336,51 +346,85 @@ public void shutdown() {
 }
 }
 
+private static final byte[] EMPTY_VIS_STRING;
+
+static {
+try {
+EMPTY_VIS_STRING = "".getBytes("UTF-8");
+} catch (UnsupportedEncodingException e) {
+throw new RuntimeException(e);
+}
+}
+
+private List<Put> buildPuts(byte[] rowKey, List<PutColumn> columns) {
+List<Put> retVal = new ArrayList<>();
+
+try {
+Put put = null;
+
+for (final PutColumn column : columns) {
--- End diff --

Allowing the different columns in same row to have diff visibility right.  
For each of the unique visibility one Put need to be there along with 
corresponding columns in it.  I believe the below logic is doing that. Just 
saying.


> Add support for HBase visibility labels to HBase processors and controller 
> services
> ---
>
> Key: NIFI-4637
> URL: https://issues.apache.org/jira/browse/NIFI-4637
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> HBase supports visibility labels, but you can't use them from NiFi because 
> there is no way to set them. The existing processors and services should be 
> upgraded to handle this capability.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2518: NIFI-4637 Added support for visibility labels to th...

2018-05-11 Thread anoopsjohn
Github user anoopsjohn commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2518#discussion_r187593742
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-hbase_1_1_2-client-service-bundle/nifi-hbase_1_1_2-client-service/src/main/java/org/apache/nifi/hbase/HBase_1_1_2_ClientService.java
 ---
@@ -336,51 +346,85 @@ public void shutdown() {
 }
 }
 
+private static final byte[] EMPTY_VIS_STRING;
+
+static {
+try {
+EMPTY_VIS_STRING = "".getBytes("UTF-8");
+} catch (UnsupportedEncodingException e) {
+throw new RuntimeException(e);
+}
+}
+
+private List<Put> buildPuts(byte[] rowKey, List<PutColumn> columns) {
+List<Put> retVal = new ArrayList<>();
+
+try {
+Put put = null;
+
+for (final PutColumn column : columns) {
--- End diff --

Allowing the different columns in same row to have diff visibility right.  
For each of the unique visibility one Put need to be there along with 
corresponding columns in it.  I believe the below logic is doing that. Just 
saying.


---


[jira] [Commented] (NIFI-5113) Add XML record writer

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471788#comment-16471788
 ] 

ASF GitHub Bot commented on NIFI-5113:
--

Github user JohannesDaniel commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187592686
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/WriteXMLResult.java
 ---
@@ -0,0 +1,602 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import javanet.staxutils.IndentingXMLStreamWriter;
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.schema.access.SchemaAccessWriter;
+import org.apache.nifi.serialization.AbstractRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.WriteResult;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.RawRecordWriter;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.type.ArrayDataType;
+import org.apache.nifi.serialization.record.type.ChoiceDataType;
+import org.apache.nifi.serialization.record.type.MapDataType;
+import org.apache.nifi.serialization.record.type.RecordDataType;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+
+import javax.xml.stream.XMLOutputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamWriter;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.text.DateFormat;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.Supplier;
+
+
+public class WriteXMLResult extends AbstractRecordSetWriter implements 
RecordSetWriter, RawRecordWriter {
+
+final ComponentLog logger;
+final RecordSchema recordSchema;
+final SchemaAccessWriter schemaAccess;
+final XMLStreamWriter writer;
+final NullSuppression nullSuppression;
+final ArrayWrapping arrayWrapping;
+final String arrayTagName;
+final String recordTagName;
+final String rootTagName;
+
+private final Supplier<DateFormat> LAZY_DATE_FORMAT;
+private final Supplier<DateFormat> LAZY_TIME_FORMAT;
+private final Supplier<DateFormat> LAZY_TIMESTAMP_FORMAT;
+
+public WriteXMLResult(final ComponentLog logger, final RecordSchema 
recordSchema, final SchemaAccessWriter schemaAccess, final OutputStream out, 
final boolean prettyPrint,
+  final NullSuppression nullSuppression, final 
ArrayWrapping arrayWrapping, final String arrayTagName, final String 
rootTagName, final String recordTagName,
+  final String dateFormat, final String 
timeFormat, final String timestampFormat) throws IOException {
+
+super(out);
+
+this.logger = logger;
+this.recordSchema = recordSchema;
+this.schemaAccess = schemaAccess;
+this.nullSuppression = nullSuppression;
+
+this.arrayWrapping = arrayWrapping;
+this.arrayTagName = arrayTagName;
+
+this.rootTagName = rootTagName;
+this.recordTagName = recordTagName;
+
+final DateFormat df = dateFormat == null ? null : 
DataTypeUtils.getDateFormat(dateFormat);
+final DateFormat tf = timeFormat == null ? null : 
DataTypeUtils.getDateFormat(timeFormat);
+final DateFormat tsf = timestampFormat == null ? null : 
DataTypeUtils.getDateFormat(timestampFormat);
+
+LAZY_DATE_FORMAT = () -> 

[GitHub] nifi pull request #2675: NIFI-5113 Add XMLRecordSetWriter

2018-05-11 Thread JohannesDaniel
Github user JohannesDaniel commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187592686
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/WriteXMLResult.java
 ---
@@ -0,0 +1,602 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import javanet.staxutils.IndentingXMLStreamWriter;
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.schema.access.SchemaAccessWriter;
+import org.apache.nifi.serialization.AbstractRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.WriteResult;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.RawRecordWriter;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.type.ArrayDataType;
+import org.apache.nifi.serialization.record.type.ChoiceDataType;
+import org.apache.nifi.serialization.record.type.MapDataType;
+import org.apache.nifi.serialization.record.type.RecordDataType;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+
+import javax.xml.stream.XMLOutputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamWriter;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.text.DateFormat;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.Supplier;
+
+
+public class WriteXMLResult extends AbstractRecordSetWriter implements 
RecordSetWriter, RawRecordWriter {
+
+final ComponentLog logger;
+final RecordSchema recordSchema;
+final SchemaAccessWriter schemaAccess;
+final XMLStreamWriter writer;
+final NullSuppression nullSuppression;
+final ArrayWrapping arrayWrapping;
+final String arrayTagName;
+final String recordTagName;
+final String rootTagName;
+
+private final Supplier<DateFormat> LAZY_DATE_FORMAT;
+private final Supplier<DateFormat> LAZY_TIME_FORMAT;
+private final Supplier<DateFormat> LAZY_TIMESTAMP_FORMAT;
+
+public WriteXMLResult(final ComponentLog logger, final RecordSchema 
recordSchema, final SchemaAccessWriter schemaAccess, final OutputStream out, 
final boolean prettyPrint,
+  final NullSuppression nullSuppression, final 
ArrayWrapping arrayWrapping, final String arrayTagName, final String 
rootTagName, final String recordTagName,
+  final String dateFormat, final String 
timeFormat, final String timestampFormat) throws IOException {
+
+super(out);
+
+this.logger = logger;
+this.recordSchema = recordSchema;
+this.schemaAccess = schemaAccess;
+this.nullSuppression = nullSuppression;
+
+this.arrayWrapping = arrayWrapping;
+this.arrayTagName = arrayTagName;
+
+this.rootTagName = rootTagName;
+this.recordTagName = recordTagName;
+
+final DateFormat df = dateFormat == null ? null : 
DataTypeUtils.getDateFormat(dateFormat);
+final DateFormat tf = timeFormat == null ? null : 
DataTypeUtils.getDateFormat(timeFormat);
+final DateFormat tsf = timestampFormat == null ? null : 
DataTypeUtils.getDateFormat(timestampFormat);
+
+LAZY_DATE_FORMAT = () -> df;
+LAZY_TIME_FORMAT = () -> tf;
+LAZY_TIMESTAMP_FORMAT = () -> tsf;
+
+try {
+XMLOutputFactory factory = XMLOutputFactory.newInstance();
+
+if (prettyPrint) {
  

[jira] [Commented] (NIFI-5113) Add XML record writer

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471784#comment-16471784
 ] 

ASF GitHub Bot commented on NIFI-5113:
--

Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187591727
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/WriteXMLResult.java
 ---
@@ -0,0 +1,602 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import javanet.staxutils.IndentingXMLStreamWriter;
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.schema.access.SchemaAccessWriter;
+import org.apache.nifi.serialization.AbstractRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.WriteResult;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.RawRecordWriter;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.type.ArrayDataType;
+import org.apache.nifi.serialization.record.type.ChoiceDataType;
+import org.apache.nifi.serialization.record.type.MapDataType;
+import org.apache.nifi.serialization.record.type.RecordDataType;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+
+import javax.xml.stream.XMLOutputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamWriter;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.text.DateFormat;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.Supplier;
+
+
+public class WriteXMLResult extends AbstractRecordSetWriter implements RecordSetWriter, RawRecordWriter {
+
+    final ComponentLog logger;
+    final RecordSchema recordSchema;
+    final SchemaAccessWriter schemaAccess;
+    final XMLStreamWriter writer;
+    final NullSuppression nullSuppression;
+    final ArrayWrapping arrayWrapping;
+    final String arrayTagName;
+    final String recordTagName;
+    final String rootTagName;
+
+    private final Supplier<DateFormat> LAZY_DATE_FORMAT;
+    private final Supplier<DateFormat> LAZY_TIME_FORMAT;
+    private final Supplier<DateFormat> LAZY_TIMESTAMP_FORMAT;
+
+    public WriteXMLResult(final ComponentLog logger, final RecordSchema recordSchema, final SchemaAccessWriter schemaAccess, final OutputStream out, final boolean prettyPrint,
+                          final NullSuppression nullSuppression, final ArrayWrapping arrayWrapping, final String arrayTagName, final String rootTagName, final String recordTagName,
+                          final String dateFormat, final String timeFormat, final String timestampFormat) throws IOException {
+
+        super(out);
+
+        this.logger = logger;
+        this.recordSchema = recordSchema;
+        this.schemaAccess = schemaAccess;
+        this.nullSuppression = nullSuppression;
+
+        this.arrayWrapping = arrayWrapping;
+        this.arrayTagName = arrayTagName;
+
+        this.rootTagName = rootTagName;
+        this.recordTagName = recordTagName;
+
+        final DateFormat df = dateFormat == null ? null : DataTypeUtils.getDateFormat(dateFormat);
+        final DateFormat tf = timeFormat == null ? null : DataTypeUtils.getDateFormat(timeFormat);
+        final DateFormat tsf = timestampFormat == null ? null : DataTypeUtils.getDateFormat(timestampFormat);
+
+        LAZY_DATE_FORMAT = () -> df;
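The constructor above builds each DateFormat once up front and exposes it through a Supplier, so downstream formatting code can fetch a format lazily while a missing pattern simply stays null. A minimal standalone sketch of that pattern (the class and method names here are illustrative, not NiFi's API):

```java
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.function.Supplier;

public class LazyFormatDemo {
    // Parse the pattern once; the supplier then hands out the same instance,
    // or null when no pattern was configured.
    static Supplier<DateFormat> lazyFormat(final String pattern) {
        final DateFormat df = pattern == null ? null : new SimpleDateFormat(pattern);
        return () -> df;
    }

    public static void main(String[] args) {
        Supplier<DateFormat> lazyDate = lazyFormat("yyyy-MM-dd");
        Supplier<DateFormat> lazyNone = lazyFormat(null);
        System.out.println(lazyDate.get().format(new Date(0L)).length()); // 10
        System.out.println(lazyNone.get() == null);                       // true
    }
}
```

Capturing the format in a final local and returning `() -> df` keeps the per-record write path free of null checks on the pattern string itself.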


[jira] [Commented] (MINIFICPP-491) Disable logging within C API

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/MINIFICPP-491?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471763#comment-16471763
 ] 

ASF GitHub Bot commented on MINIFICPP-491:
--

Github user phrocker commented on the issue:

https://github.com/apache/nifi-minifi-cpp/pull/327
  
https://travis-ci.org/phrocker/nifi-minifi-cpp/jobs/377542802 is a 
concerning test failure. I will investigate this. 


> Disable logging within C API
> 
>
> Key: MINIFICPP-491
> URL: https://issues.apache.org/jira/browse/MINIFICPP-491
> Project: NiFi MiNiFi C++
>  Issue Type: Sub-task
>Reporter: marco polo
>Assignee: marco polo
>Priority: Major
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Created] (MINIFICPP-493) Create referenced content repository

2018-05-11 Thread marco polo (JIRA)
marco polo created MINIFICPP-493:


 Summary: Create referenced content repository
 Key: MINIFICPP-493
 URL: https://issues.apache.org/jira/browse/MINIFICPP-493
 Project: NiFi MiNiFi C++
  Issue Type: Sub-task
Reporter: marco polo
Assignee: marco polo


The file system content repository moves a file into the content repo; however, for the purposes of the C API this isn't necessary, nor is allocating memory when using the volatile repos. As a result, a referenced content repository would keep imported content in place (such as content brought in via GetFile).



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5113) Add XML record writer

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471676#comment-16471676
 ] 

ASF GitHub Bot commented on NIFI-5113:
--

Github user JohannesDaniel commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187560805
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/XMLRecordSetWriter.java
 ---
@@ -0,0 +1,196 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.DateTimeTextRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+
+@Tags({"xml", "resultset", "writer", "serialize", "record", "recordset", "row"})
+@CapabilityDescription("Writes a RecordSet to XML. The records are wrapped by a root tag.")
+public class XMLRecordSetWriter extends DateTimeTextRecordSetWriter implements RecordSetWriterFactory {
+
+    public static final AllowableValue ALWAYS_SUPPRESS = new AllowableValue("always-suppress", "Always Suppress",
+        "Fields that are missing (present in the schema but not in the record), or that have a value of null, will not be written out");
+    public static final AllowableValue NEVER_SUPPRESS = new AllowableValue("never-suppress", "Never Suppress",
+        "Fields that are missing (present in the schema but not in the record), or that have a value of null, will be written out as a null value");
+    public static final AllowableValue SUPPRESS_MISSING = new AllowableValue("suppress-missing", "Suppress Missing Values",
+        "When a field has a value of null, it will be written out. However, if a field is defined in the schema and not present in the record, the field will not be written out.");
+
+    public static final AllowableValue USE_PROPERTY_AS_WRAPPER = new AllowableValue("use-property-as-wrapper", "Use Property as Wrapper",
+        "The value of the property \"Array Tag Name\" will be used as the tag name to wrap elements of an array. The field name of the array field will be used for the tag name " +
+        "of the elements.");
+    public static final AllowableValue USE_PROPERTY_FOR_ELEMENTS = new AllowableValue("use-property-for-elements", "Use Property for Elements",
+        "The value of the property \"Array Tag Name\" will be used for the tag name of the elements of an array. The field name of the array field will be used as the tag name " +
+        "to wrap elements.");
+    public static final AllowableValue NO_WRAPPING = new AllowableValue("no-wrapping", "No Wrapping",
+        "The elements of an array will not be wrapped");
+
+    public static final PropertyDescriptor SUPPRESS_NULLS = new PropertyDescriptor.Builder()
+        .name("suppress_nulls")
+        .displayName("Suppress Null Values")
+        .description("Specifies how the writer should handle a null field")
+        .allowableValues(NEVER_SUPPRESS, ALWAYS_SUPPRESS, SUPPRESS_MISSING)
+        .defaultValue(NEVER_SUPPRESS.getValue())
+        .required(true)
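The three suppression options differ only in how the writer treats (a) a field that is present with a null value and (b) a field that is defined in the schema but absent from the record. A standalone sketch of that decision table (the Mode enum and shouldWrite helper are illustrative, not NiFi's actual classes):

```java
import java.util.HashMap;
import java.util.Map;

public class SuppressionDemo {
    enum Mode { ALWAYS_SUPPRESS, NEVER_SUPPRESS, SUPPRESS_MISSING }

    // Decide whether a tag should be emitted for a schema field.
    static boolean shouldWrite(Mode mode, Map<String, Object> record, String field) {
        boolean present = record.containsKey(field);        // field exists in the record
        boolean isNull = present && record.get(field) == null;
        if (mode == Mode.ALWAYS_SUPPRESS) {
            return present && !isNull;                      // skip both null and missing
        }
        if (mode == Mode.SUPPRESS_MISSING) {
            return present;                                 // nulls written, missing skipped
        }
        return true;                                        // NEVER_SUPPRESS: always write
    }

    public static void main(String[] args) {
        Map<String, Object> rec = new HashMap<>();
        rec.put("name", "nifi");
        rec.put("age", null);
        // "city" is in the schema but missing from the record
        System.out.println(shouldWrite(Mode.ALWAYS_SUPPRESS, rec, "age"));   // false
        System.out.println(shouldWrite(Mode.SUPPRESS_MISSING, rec, "age"));  // true
        System.out.println(shouldWrite(Mode.SUPPRESS_MISSING, rec, "city")); // false
    }
}
```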


[jira] [Commented] (NIFI-5113) Add XML record writer

2018-05-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471641#comment-16471641
 ] 

ASF GitHub Bot commented on NIFI-5113:
--

Github user JohannesDaniel commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187549020
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/WriteXMLResult.java
 ---
@@ -0,0 +1,602 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import javanet.staxutils.IndentingXMLStreamWriter;
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.schema.access.SchemaAccessWriter;
+import org.apache.nifi.serialization.AbstractRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.WriteResult;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.RawRecordWriter;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.type.ArrayDataType;
+import org.apache.nifi.serialization.record.type.ChoiceDataType;
+import org.apache.nifi.serialization.record.type.MapDataType;
+import org.apache.nifi.serialization.record.type.RecordDataType;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+
+import javax.xml.stream.XMLOutputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamWriter;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.text.DateFormat;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.Supplier;
+
+
+public class WriteXMLResult extends AbstractRecordSetWriter implements RecordSetWriter, RawRecordWriter {
+
+    final ComponentLog logger;
+    final RecordSchema recordSchema;
+    final SchemaAccessWriter schemaAccess;
+    final XMLStreamWriter writer;
+    final NullSuppression nullSuppression;
+    final ArrayWrapping arrayWrapping;
+    final String arrayTagName;
+    final String recordTagName;
+    final String rootTagName;
+
+    private final Supplier<DateFormat> LAZY_DATE_FORMAT;
+    private final Supplier<DateFormat> LAZY_TIME_FORMAT;
+    private final Supplier<DateFormat> LAZY_TIMESTAMP_FORMAT;
+
+    public WriteXMLResult(final ComponentLog logger, final RecordSchema recordSchema, final SchemaAccessWriter schemaAccess, final OutputStream out, final boolean prettyPrint,
+                          final NullSuppression nullSuppression, final ArrayWrapping arrayWrapping, final String arrayTagName, final String rootTagName, final String recordTagName,
+                          final String dateFormat, final String timeFormat, final String timestampFormat) throws IOException {
+
+        super(out);
+
+        this.logger = logger;
+        this.recordSchema = recordSchema;
+        this.schemaAccess = schemaAccess;
+        this.nullSuppression = nullSuppression;
+
+        this.arrayWrapping = arrayWrapping;
+        this.arrayTagName = arrayTagName;
+
+        this.rootTagName = rootTagName;
+        this.recordTagName = recordTagName;
+
+        final DateFormat df = dateFormat == null ? null : DataTypeUtils.getDateFormat(dateFormat);
+        final DateFormat tf = timeFormat == null ? null : DataTypeUtils.getDateFormat(timeFormat);
+        final DateFormat tsf = timestampFormat == null ? null : DataTypeUtils.getDateFormat(timestampFormat);
+
+        LAZY_DATE_FORMAT = () -> df;


[jira] [Commented] (NIFI-4637) Add support for HBase visibility labels to HBase processors and controller services

2018-05-11 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471591#comment-16471591
 ] 

ASF subversion and git services commented on NIFI-4637:
---

Commit 0b851910f3968e32ec51d2b92bf53c6f9453edb0 in nifi's branch 
refs/heads/master from [~mike.thomsen]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=0b85191 ]

NIFI-4637 Added support for visibility labels to the HBase processors.

NIFI-4637 Removed integration test and updated the hbase client from 1.1.2 to 
1.1.13 which is the final version for 1.1.X

NIFI-4637 Fixed EL support issue w/ tests.

NIFI-4637 Added more documentation to DeleteHBaseCells.

NIFI-4637 changed PutHBaseCell/JSON to use dynamic properties instead of a 
'default visibility string.'

NIFI-4637 Added changes requested in a code review.

NIFI-4637 Moved pickVisibilityString to a utility class to make testing easier.

NIFI-4637 Added additionalDetails.html for PutHBaseRecord.

NIFI-4637 Added additional documentation and testing.

NIFI-4637 Added documentation for DeleteHBaseCells.

NIFI-4637 Added pickVisibilityLabel support to PutHBaseRecord and updated 
documentation to reflect that.

NIFI-4637 Reverted version bump to hbase client.

This closes #2518.

Signed-off-by: Koji Kawamura 
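The commit above mentions moving pickVisibilityString into a utility class so it can be tested in isolation, with per-column visibility expressions supplied through dynamic properties. A hypothetical, simplified version of such a lookup (the key format and fallback order are assumptions for illustration, not NiFi's exact behavior): a property keyed "family:qualifier" wins over one keyed just "family".

```java
import java.util.HashMap;
import java.util.Map;

public class VisibilityDemo {
    // Pick the visibility expression for a cell: prefer "family:qualifier",
    // fall back to "family", else null (no label applied).
    static String pickVisibilityString(String family, String qualifier, Map<String, String> props) {
        String specific = props.get(family + ":" + qualifier);
        if (specific != null) {
            return specific;
        }
        return props.get(family);
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("users", "PII");
        props.put("users:ssn", "PII&ADMIN");
        System.out.println(pickVisibilityString("users", "ssn", props));   // PII&ADMIN
        System.out.println(pickVisibilityString("users", "name", props));  // PII
        System.out.println(pickVisibilityString("logs", "msg", props));    // null
    }
}
```

Keeping this resolution logic in a static utility, as the commit describes, lets it be unit tested without standing up an HBase client.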


> Add support for HBase visibility labels to HBase processors and controller 
> services
> ---
>
> Key: NIFI-4637
> URL: https://issues.apache.org/jira/browse/NIFI-4637
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> HBase supports visibility labels, but you can't use them from NiFi because 
> there is no way to set them. The existing processors and services should be 
> upgraded to handle this capability.
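The commit log above mentions moving pickVisibilityString into a utility class so the per-column lookup could be tested in isolation. The actual NiFi implementation is not shown in this thread; the following is a hypothetical sketch, assuming dynamic properties keyed as "family:qualifier" or "family" with an optional default, of how such a resolution might behave (most specific key wins). The class and method names here are illustrative only.

```java
import java.util.HashMap;
import java.util.Map;

public class VisibilityUtil {

    /**
     * Hypothetical sketch: resolve a visibility expression for a cell.
     * Precedence: "family:qualifier" property, then "family" property,
     * then the supplied default (which may be null, meaning no label).
     */
    public static String pickVisibilityString(String family, String qualifier,
                                              Map<String, String> props,
                                              String defaultVisibility) {
        String specific = props.get(family + ":" + qualifier);
        if (specific != null) {
            return specific;
        }
        String familyLevel = props.get(family);
        if (familyLevel != null) {
            return familyLevel;
        }
        return defaultVisibility;
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("users", "PII");
        props.put("users:ssn", "PII&ADMIN");

        // Qualifier-level property beats family-level property.
        System.out.println(pickVisibilityString("users", "ssn", props, null));
        // Falls back to the family-level property.
        System.out.println(pickVisibilityString("users", "name", props, null));
        // No matching property: use the default.
        System.out.println(pickVisibilityString("logs", "msg", props, "OPEN"));
    }
}
```

The resolved string would then be applied to the HBase mutation (e.g. via Mutation.setCellVisibility in the HBase client API); expressions like "PII&ADMIN" are standard HBase visibility boolean expressions.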



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)



[jira] [Resolved] (NIFI-4637) Add support for HBase visibility labels to HBase processors and controller services

2018-05-11 Thread Koji Kawamura (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4637?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura resolved NIFI-4637.
-
   Resolution: Fixed
Fix Version/s: 1.7.0

> Add support for HBase visibility labels to HBase processors and controller 
> services
> ---
>
> Key: NIFI-4637
> URL: https://issues.apache.org/jira/browse/NIFI-4637
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> HBase supports visibility labels, but you can't use them from NiFi because 
> there is no way to set them. The existing processors and services should be 
> upgraded to handle this capability.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




[jira] [Commented] (NIFI-4637) Add support for HBase visibility labels to HBase processors and controller services

2018-05-11 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16471595#comment-16471595
 ] 

ASF subversion and git services commented on NIFI-4637:
---

Commit 500a254e3f5fe709a2cf6bd1268d57710334ea27 in nifi's branch 
refs/heads/master from [~ijokarumawak]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=500a254 ]

NIFI-4637: HBase visibility labels

Fixed additional docs directory structure.


> Add support for HBase visibility labels to HBase processors and controller 
> services
> ---
>
> Key: NIFI-4637
> URL: https://issues.apache.org/jira/browse/NIFI-4637
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> HBase supports visibility labels, but you can't use them from NiFi because 
> there is no way to set them. The existing processors and services should be 
> upgraded to handle this capability.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)




