[GitHub] nifi issue #2746: NIFI-5247 NiFi toolkit signal handling changes, Dockerfile...

2018-06-01 Thread pepov
Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2746
  
added test files to rat exclude, also rebased and squashed commits


---


[jira] [Assigned] (NIFI-5054) Nifi Couchbase Processors does not support User Authentication

2018-06-01 Thread Koji Kawamura (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5054?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura reassigned NIFI-5054:
---

Assignee: Koji Kawamura

> Nifi Couchbase Processors does not support User Authentication
> --
>
> Key: NIFI-5054
> URL: https://issues.apache.org/jira/browse/NIFI-5054
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.5.0, 1.6.0
>Reporter: Shagun Jaju
>Assignee: Koji Kawamura
>Priority: Major
>  Labels: authentication, security
>
> Issue Description: NiFi Couchbase processors don't work with new Couchbase 
> versions 5.0 and 5.1.
> New Couchbase version 5.x has introduced *Role Based Access Control (RBAC)*, 
> a new security feature.
>  # All buckets must now be accessed by a *user*/*password* combination that 
> has a *role with access rights* to the bucket.
>  # Buckets no longer use bucket-level passwords
>  # There is no default bucket and no sample buckets with blank passwords.
>  # You cannot create a user without a password.
> (Ref: 
> https://developer.couchbase.com/documentation/server/5.0/introduction/whats-new.html
> [https://blog.couchbase.com/new-sdk-authentication/] )
>  
> nifi-couchbase-processors: GetCouchbaseKey and PutCouchbaseKey, using the 
> Controller Service, still use the old authentication mechanism.
>  * org.apache.nifi.processors.couchbase.GetCouchbaseKey
>  * org.apache.nifi.processors.couchbase.PutCouchbaseKey
> Ref: 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-couchbase-bundle/nifi-couchbase-processors/src/main/java/org/apache/nifi/couchbase/CouchbaseClusterService.java#L116]
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5054) Nifi Couchbase Processors does not support User Authentication

2018-06-01 Thread Koji Kawamura (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5054?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura updated NIFI-5054:

Status: Patch Available  (was: In Progress)

> Nifi Couchbase Processors does not support User Authentication
> --
>
> Key: NIFI-5054
> URL: https://issues.apache.org/jira/browse/NIFI-5054
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0, 1.5.0
>Reporter: Shagun Jaju
>Assignee: Koji Kawamura
>Priority: Major
>  Labels: authentication, security
>
> Issue Description: NiFi Couchbase processors don't work with new Couchbase 
> versions 5.0 and 5.1.
> New Couchbase version 5.x has introduced *Role Based Access Control (RBAC)*, 
> a new security feature.
>  # All buckets must now be accessed by a *user*/*password* combination that 
> has a *role with access rights* to the bucket.
>  # Buckets no longer use bucket-level passwords
>  # There is no default bucket and no sample buckets with blank passwords.
>  # You cannot create a user without a password.
> (Ref: 
> https://developer.couchbase.com/documentation/server/5.0/introduction/whats-new.html
> [https://blog.couchbase.com/new-sdk-authentication/] )
>  
> nifi-couchbase-processors: GetCouchbaseKey and PutCouchbaseKey, using the 
> Controller Service, still use the old authentication mechanism.
>  * org.apache.nifi.processors.couchbase.GetCouchbaseKey
>  * org.apache.nifi.processors.couchbase.PutCouchbaseKey
> Ref: 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-couchbase-bundle/nifi-couchbase-processors/src/main/java/org/apache/nifi/couchbase/CouchbaseClusterService.java#L116]
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5257) Expand Couchbase Server integration as a cache storage

2018-06-01 Thread Koji Kawamura (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura updated NIFI-5257:

Summary: Expand Couchbase Server integration as a cache storage  (was: 
Expand Couchbase Server integration)

> Expand Couchbase Server integration as a cache storage
> --
>
> Key: NIFI-5257
> URL: https://issues.apache.org/jira/browse/NIFI-5257
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>Priority: Major
>
> Expand Couchbase Server integration to utilize Couchbase Server as reliable 
> cache storage.
> Add the following new Controller Services:
> * CouchbaseMapCacheClient
> * CouchbaseKeyValueLookupService
> * CouchbaseRecordLookupService



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5247) NiFi toolkit signal handling changes, Dockerfile enhancements

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497661#comment-16497661
 ] 

ASF GitHub Bot commented on NIFI-5247:
--

Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2746
  
added test files to rat exclude, also rebased and squashed commits


> NiFi toolkit signal handling changes, Dockerfile enhancements
> -
>
> Key: NIFI-5247
> URL: https://issues.apache.org/jira/browse/NIFI-5247
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker, Tools and Build
>Affects Versions: 1.6.0, 1.7.0
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> 1. Signal handling issues
> In order for processes to handle signals properly in Docker we have to 
> implement explicit signal handling for the first process in the container. In 
> the case of the NiFi toolkit the easiest solution is to replace the bash 
> shell with the Java process and let it handle the signal using the exec 
> system call. More detailed explanation of the issue: 
> [http://veithen.github.io/2014/11/16/sigterm-propagation.html]
> Relevant issues: NIFI-3505 and NIFI-2689, which already added exec to the run 
> invocation of the nifi.sh start script.
> This change makes stopping containers fast and graceful.
> 2. TLS toolkit commands and basic tooling in the container
> In order to be able to request certificates from a running CA server instance 
> some tooling is needed inside the container. These tools are openssl for 
> checking ssl certificates and endpoints, and jq for config.json processing. A 
> complete use case is available in the following NiFi helm chart: 
> [https://github.com/pepov/apache-nifi-helm/blob/master/templates/statefulset.yaml#L75]
>  
>  
>  
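As an aside on the exec-based approach described above: once the wrapper script exec's into the Java process, the JVM is the container's main process and receives SIGTERM directly from "docker stop", which it turns into an orderly shutdown via shutdown hooks. A minimal, hypothetical Java sketch of that behavior (illustrative only, not the toolkit's actual code):

    import java.util.concurrent.CountDownLatch;

    public final class GracefulShutdownSketch {
        public static void main(String[] args) throws InterruptedException {
            final CountDownLatch stopped = new CountDownLatch(1);

            // The JVM runs shutdown hooks when it receives SIGTERM (e.g. from
            // "docker stop"). That only happens if the JVM itself is the signal's
            // target, which is why the wrapper should exec into java rather than
            // leave a bash parent process in front of it.
            Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                System.out.println("SIGTERM received, shutting down cleanly...");
                stopped.countDown();
            }));

            System.out.println("Running; docker stop now terminates this promptly.");
            stopped.await();
        }
    }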



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5054) Nifi Couchbase Processors does not support User Authentication

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5054?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497718#comment-16497718
 ] 

ASF GitHub Bot commented on NIFI-5054:
--

GitHub user ijokarumawak opened a pull request:

https://github.com/apache/nifi/pull/2750

NIFI-5054: Couchbase Authentication, NIFI-5257: Expand Couchbase integration

This PR includes two commits for NIFI-5054 and NIFI-5257.
NIFI-5054 can be merged separately, but NIFI-5257 depends on the dependency 
version bump in NIFI-5054.
These enhancements are all Couchbase Server related, and should be easy to 
review together.

---

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [x] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ijokarumawak/nifi nifi-5054

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2750.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2750


commit 4990269646b9dce529cb2c9adb4d70e7bba1beb0
Author: Koji Kawamura 
Date:   2018-06-01T07:55:26Z

NIFI-5054: Add Couchbase user authentication

commit b977450945f014aa9da23f086a53c2e0ac3e8b79
Author: Koji Kawamura 
Date:   2018-06-01T07:57:43Z

NIFI-5257: Expand Couchbase Server integration

- Added CouchbaseMapCacheClient.
- Added CouchbaseKeyValueLookupService.
- Added CouchbaseRecordLookupService.
- Added 'Put Value to Attribute' to GetCouchbaseKey.
- Fixed Get/PutCouchbaseKey relationship descriptions.




> Nifi Couchbase Processors does not support User Authentication
> --
>
> Key: NIFI-5054
> URL: https://issues.apache.org/jira/browse/NIFI-5054
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.5.0, 1.6.0
>Reporter: Shagun Jaju
>Priority: Major
>  Labels: authentication, security
>
> Issue Description: NiFi Couchbase processors don't work with new Couchbase 
> versions 5.0 and 5.1.
> New Couchbase version 5.x has introduced *Role Based Access Control (RBAC)*, 
> a new security feature.
>  # All buckets must now be accessed by a *user*/*password* combination that 
> has a *role with access rights* to the bucket.
>  # Buckets no longer use bucket-level passwords
>  # There is no default bucket and no sample buckets with blank passwords.
>  # You cannot create a user without a password.
> (Ref: 
> https://developer.couchbase.com/documentation/server/5.0/introduction/whats-new.html
> [https://blog.couchbase.com/new-sdk-authentication/] )
>  
> nifi-couchbase-processors: GetCouchbaseKey and PutCouchbaseKey, using the 
> Controller Service, still use the old authentication mechanism.
>  * org.apache.nifi.processors.couchbase.GetCouchbaseKey
>  * org.apache.nifi.processors.couchbase.PutCouchbaseKey
> Ref: 
> 

[jira] [Updated] (NIFI-5257) Expand Couchbase Server integration as a cache storage

2018-06-01 Thread Koji Kawamura (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura updated NIFI-5257:

Status: Patch Available  (was: In Progress)

> Expand Couchbase Server integration as a cache storage
> --
>
> Key: NIFI-5257
> URL: https://issues.apache.org/jira/browse/NIFI-5257
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>Priority: Major
>
> Expand Couchbase Server integration to utilize Couchbase Server as reliable 
> cache storage.
> Add the following new Controller Services:
> * CouchbaseMapCacheClient
> * CouchbaseKeyValueLookupService
> * CouchbaseRecordLookupService



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2750: NIFI-5054: Couchbase Authentication, NIFI-5257: Exp...

2018-06-01 Thread ijokarumawak
GitHub user ijokarumawak opened a pull request:

https://github.com/apache/nifi/pull/2750

NIFI-5054: Couchbase Authentication, NIFI-5257: Expand Couchbase integration

This PR includes two commits for NIFI-5054 and NIFI-5257.
NIFI-5054 can be merged separately, but NIFI-5257 depends on the dependency 
version bump in NIFI-5054.
These enhancements are all Couchbase Server related, and should be easy to 
review together.

---

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [x] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ijokarumawak/nifi nifi-5054

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2750.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2750


commit 4990269646b9dce529cb2c9adb4d70e7bba1beb0
Author: Koji Kawamura 
Date:   2018-06-01T07:55:26Z

NIFI-5054: Add Couchbase user authentication

commit b977450945f014aa9da23f086a53c2e0ac3e8b79
Author: Koji Kawamura 
Date:   2018-06-01T07:57:43Z

NIFI-5257: Expand Couchbase Server integration

- Added CouchbaseMapCacheClient.
- Added CouchbaseKeyValueLookupService.
- Added CouchbaseRecordLookupService.
- Added 'Put Value to Attribute' to GetCouchbaseKey.
- Fixed Get/PutCouchbaseKey relationship descriptions.




---


[jira] [Created] (NIFI-5257) Expand Couchbase Server integration

2018-06-01 Thread Koji Kawamura (JIRA)
Koji Kawamura created NIFI-5257:
---

 Summary: Expand Couchbase Server integration
 Key: NIFI-5257
 URL: https://issues.apache.org/jira/browse/NIFI-5257
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Reporter: Koji Kawamura
Assignee: Koji Kawamura


Expand Couchbase Server integration to utilize Couchbase Server as reliable 
cache storage.

Add the following new Controller Services:
* CouchbaseMapCacheClient
* CouchbaseKeyValueLookupService
* CouchbaseRecordLookupService



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5054) Nifi Couchbase Processors does not support User Authentication

2018-06-01 Thread Koji Kawamura (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5054?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Koji Kawamura updated NIFI-5054:

Issue Type: Improvement  (was: Bug)

> Nifi Couchbase Processors does not support User Authentication
> --
>
> Key: NIFI-5054
> URL: https://issues.apache.org/jira/browse/NIFI-5054
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.5.0, 1.6.0
>Reporter: Shagun Jaju
>Assignee: Koji Kawamura
>Priority: Major
>  Labels: authentication, security
>
> Issue Description: NiFi Couchbase processors don't work with new Couchbase 
> versions 5.0 and 5.1.
> New Couchbase version 5.x has introduced *Role Based Access Control (RBAC)*, 
> a new security feature.
>  # All buckets must now be accessed by a *user*/*password* combination that 
> has a *role with access rights* to the bucket.
>  # Buckets no longer use bucket-level passwords
>  # There is no default bucket and no sample buckets with blank passwords.
>  # You cannot create a user without a password.
> (Ref: 
> https://developer.couchbase.com/documentation/server/5.0/introduction/whats-new.html
> [https://blog.couchbase.com/new-sdk-authentication/] )
>  
> nifi-couchbase-processors: GetCouchbaseKey and PutCouchbaseKey, using the 
> Controller Service, still use the old authentication mechanism.
>  * org.apache.nifi.processors.couchbase.GetCouchbaseKey
>  * org.apache.nifi.processors.couchbase.PutCouchbaseKey
> Ref: 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-couchbase-bundle/nifi-couchbase-processors/src/main/java/org/apache/nifi/couchbase/CouchbaseClusterService.java#L116]
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5054) Nifi Couchbase Processors does not support User Authentication

2018-06-01 Thread Koji Kawamura (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5054?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497731#comment-16497731
 ] 

Koji Kawamura commented on NIFI-5054:
-

Changed from Bug to Improvement, as the existing NiFi components were written 
when Couchbase did not have user/password authentication; only bucket-level 
passwords were available at that time.
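For illustration, here is a minimal sketch of the two authentication styles against the Couchbase Java SDK 2.x (the host, bucket, and credential names are made up; this is not the actual CouchbaseClusterService code):

    import com.couchbase.client.java.Bucket;
    import com.couchbase.client.java.CouchbaseCluster;

    public class CouchbaseAuthSketch {

        // Pre-5.0 style: each bucket carries its own password, supplied to openBucket().
        static Bucket legacyBucketPassword() {
            CouchbaseCluster cluster = CouchbaseCluster.create("localhost");
            return cluster.openBucket("myBucket", "bucketPassword");
        }

        // 5.x RBAC style: authenticate the cluster with a user whose role grants
        // access to the bucket, then open the bucket without a bucket password.
        static Bucket rbacUserPassword() {
            CouchbaseCluster cluster = CouchbaseCluster.create("localhost");
            cluster.authenticate("nifiUser", "nifiPassword");
            return cluster.openBucket("myBucket");
        }
    }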

> Nifi Couchbase Processors does not support User Authentication
> --
>
> Key: NIFI-5054
> URL: https://issues.apache.org/jira/browse/NIFI-5054
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.5.0, 1.6.0
>Reporter: Shagun Jaju
>Assignee: Koji Kawamura
>Priority: Major
>  Labels: authentication, security
>
> Issue Description: NiFi Couchbase processors don't work with new Couchbase 
> versions 5.0 and 5.1.
> New Couchbase version 5.x has introduced *Role Based Access Control (RBAC)*, 
> a new security feature.
>  # All buckets must now be accessed by a *user*/*password* combination that 
> has a *role with access rights* to the bucket.
>  # Buckets no longer use bucket-level passwords
>  # There is no default bucket and no sample buckets with blank passwords.
>  # You cannot create a user without a password.
> (Ref: 
> https://developer.couchbase.com/documentation/server/5.0/introduction/whats-new.html
> [https://blog.couchbase.com/new-sdk-authentication/] )
>  
> nifi-couchbase-processors: GetCouchbaseKey and PutCouchbaseKey, using the 
> Controller Service, still use the old authentication mechanism.
>  * org.apache.nifi.processors.couchbase.GetCouchbaseKey
>  * org.apache.nifi.processors.couchbase.PutCouchbaseKey
> Ref: 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-couchbase-bundle/nifi-couchbase-processors/src/main/java/org/apache/nifi/couchbase/CouchbaseClusterService.java#L116]
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2703: NIFI-4907: add 'view provenance' component policy

2018-06-01 Thread mcgilman
Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192492247
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/controller/ControllerFacade.java
 ---
@@ -1359,7 +1363,12 @@ public ProvenanceEventDTO getProvenanceEvent(final 
Long eventId) {
 } else {
 dataAuthorizable = 
flowController.createLocalDataAuthorizable(event.getComponentId());
 }
-dataAuthorizable.authorize(authorizer, RequestAction.READ, 
NiFiUserUtils.getNiFiUser(), attributes);
+// If not authorized for 'view the data', create only 
summarized provenance event
--- End diff --

The original JIRA called for making this more granular because using the data 
policies was too blunt. In the PR as-is, for each event it appears that we 
authorize the event and then authorize the data policies twice. We are 
authorizing the data policy to determine if we should summarize and then again 
to determine if replay is authorized. The replay portion is not changed/new in 
this PR but is an area for improvement we could make now.

Since we're taking this more granular approach I agree with your originally 
filed JIRA to add the additional component based check. This shouldn't 
introduce too much additional cost. The component checks do not consider flow 
file attributes and the results should be easily cached. 

Another improvement that I didn't call out specifically above is that we 
really only need to check the data policies if we are not summarizing. Whether 
the user is approved for data of a component would only be relevant if we were 
returning the fully populated event.

In order to return the summary, we only need to check the policies for the 
event and the component. Like the component policies, I don't _think_ the flow 
file attributes would need to be considered for the event policies. I believe 
the attributes would only need to be considered for the data policies where we 
are actually returning the attributes and content. This should help with some 
of the performance concerns regarding frequent authorization.
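To make the suggested ordering concrete, here is a rough self-contained sketch (the class, enum, and parameter names are purely illustrative, not NiFi's actual API):

    // Booleans stand in for the real policy checks on the event, the component,
    // and the data policy (the only attribute-aware one).
    final class ProvenanceVisibilitySketch {

        enum View { DENIED, SUMMARY, FULL }

        static View decide(boolean eventPolicy, boolean componentPolicy, boolean dataPolicy) {
            // Returning the summary only requires the event and component policies;
            // neither considers flow file attributes, so both cache well.
            if (!eventPolicy || !componentPolicy) {
                return View.DENIED;
            }
            // The data policy is consulted only when the fully populated event
            // (attributes, content, replay) would be returned.
            return dataPolicy ? View.FULL : View.SUMMARY;
        }
    }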


---


[jira] [Commented] (NIFI-4907) Provenance authorization refactoring

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498430#comment-16498430
 ] 

ASF GitHub Bot commented on NIFI-4907:
--

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192492247
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/controller/ControllerFacade.java
 ---
@@ -1359,7 +1363,12 @@ public ProvenanceEventDTO getProvenanceEvent(final 
Long eventId) {
 } else {
 dataAuthorizable = 
flowController.createLocalDataAuthorizable(event.getComponentId());
 }
-dataAuthorizable.authorize(authorizer, RequestAction.READ, 
NiFiUserUtils.getNiFiUser(), attributes);
+// If not authorized for 'view the data', create only 
summarized provenance event
--- End diff --

The original JIRA called for making this more granular because using the data 
policies was too blunt. In the PR as-is, for each event it appears that we 
authorize the event and then authorize the data policies twice. We are 
authorizing the data policy to determine if we should summarize and then again 
to determine if replay is authorized. The replay portion is not changed/new in 
this PR but is an area for improvement we could make now.

Since we're taking this more granular approach I agree with your originally 
filed JIRA to add the additional component based check. This shouldn't 
introduce too much additional cost. The component checks do not consider flow 
file attributes and the results should be easily cached. 

Another improvement that I didn't call out specifically above is that we 
really only need to check the data policies if we are not summarizing. Whether 
the user is approved for data of a component would only be relevant if we were 
returning the fully populated event.

In order to return the summary, we only need to check the policies for the 
event and the component. Like the component policies, I don't _think_ the flow 
file attributes would need to be considered for the event policies. I believe 
the attributes would only need to be considered for the data policies where we 
are actually returning the attributes and content. This should help with some 
of the performance concerns regarding frequent authorization.


> Provenance authorization refactoring
> 
>
> Key: NIFI-4907
> URL: https://issues.apache.org/jira/browse/NIFI-4907
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Mark Bean
>Assignee: Mark Bean
>Priority: Major
>
> Currently, the 'view the data' component policy is too tightly coupled with 
> Provenance queries. The 'query provenance' policy should be the only policy 
> required for viewing Provenance query results. Both 'view the component' and 
> 'view the data' policies should be used to refine the appropriate visibility 
> of event details - but not the event itself.
> 1) Component Visibility
> The authorization of Provenance events is inconsistent with the behavior of 
> the graph. For example, if a user does not have 'view the component' policy, 
> the graph shows this component as a "black box" (no details such as name, 
> UUID, etc.) However, when querying Provenance, this component will show up 
> including the Component Type and the Component Name. This is in effect a 
> violation of the policy. These component details should be obscured in the 
> Provenance event displayed if user does not have the appropriate 'view the 
> component' policy.
> 2) Data Visibility
> For a Provenance query, all events should be visible as long as the user 
> performing the query belongs to the 'query provenance' global policy. As 
> mentioned above, some information about the component may be obscured 
> depending on 'view the component' policy, but the event itself should be 
> visible. Additionally, details of the event (clicking the View Details "i" 
> icon) should only be accessible if the user belongs to the 'view the data' 
> policy for the affected component. If the user is not in the appropriate 
> 'view the data' policy, a popup warning should be displayed indicating the 
> reason details are not visible with more specific detail than the current 
> "Contact the system administrator".
> 3) Lineage Graphs
> As with the Provenance table view recommendation above, the lineage graph 
> should display all events. Currently, if the lineage graph includes an event 
> belonging to a component which the user does not have 'view the data', it is 
> shown on the graph as "UNKNOWN". As with Data Visibility mentioned above, the 
> graph should indicate the 

[jira] [Commented] (NIFI-4907) Provenance authorization refactoring

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498436#comment-16498436
 ] 

ASF GitHub Bot commented on NIFI-4907:
--

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493531
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/PersistentProvenanceRepository.java
 ---
@@ -403,12 +399,7 @@ public void authorize(final ProvenanceEventRecord 
event, final NiFiUser user) {
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would only be necessary for authorizing access to attributes/content.


> Provenance authorization refactoring
> 
>
> Key: NIFI-4907
> URL: https://issues.apache.org/jira/browse/NIFI-4907
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Mark Bean
>Assignee: Mark Bean
>Priority: Major
>
> Currently, the 'view the data' component policy is too tightly coupled with 
> Provenance queries. The 'query provenance' policy should be the only policy 
> required for viewing Provenance query results. Both 'view the component' and 
> 'view the data' policies should be used to refine the appropriate visibility 
> of event details - but not the event itself.
> 1) Component Visibility
> The authorization of Provenance events is inconsistent with the behavior of 
> the graph. For example, if a user does not have 'view the component' policy, 
> the graph shows this component as a "black box" (no details such as name, 
> UUID, etc.) However, when querying Provenance, this component will show up 
> including the Component Type and the Component Name. This is in effect a 
> violation of the policy. These component details should be obscured in the 
> Provenance event displayed if user does not have the appropriate 'view the 
> component' policy.
> 2) Data Visibility
> For a Provenance query, all events should be visible as long as the user 
> performing the query belongs to the 'query provenance' global policy. As 
> mentioned above, some information about the component may be obscured 
> depending on 'view the component' policy, but the event itself should be 
> visible. Additionally, details of the event (clicking the View Details "i" 
> icon) should only be accessible if the user belongs to the 'view the data' 
> policy for the affected component. If the user is not in the appropriate 
> 'view the data' policy, a popup warning should be displayed indicating the 
> reason details are not visible with more specific detail than the current 
> "Contact the system administrator".
> 3) Lineage Graphs
> As with the Provenance table view recommendation above, the lineage graph 
> should display all events. Currently, if the lineage graph includes an event 
> belonging to a component which the user does not have 'view the data', it is 
> shown on the graph as "UNKNOWN". As with Data Visibility mentioned above, the 
> graph should indicate the event type as long as the user is in the 'view the 
> component'. Subsequent "View Details" on the event should only be visible if 
> the user is in the 'view the data' policy.
> In summary, for Provenance query results and lineage graphs, all events 
> should be shown. Component Name and Component Type information should be 
> conditionally visible depending on the corresponding component policy 'view 
> the component' policy. Event details including Provenance event type and 
> FlowFile information should be conditionally available depending on the 
> corresponding component policy 'view the data'. Inability to display event 
> details should provide feedback to the user indicating the reason.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4907) Provenance authorization refactoring

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498437#comment-16498437
 ] 

ASF GitHub Bot commented on NIFI-4907:
--

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493603
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/authorization/UserEventAuthorizer.java
 ---
@@ -65,12 +61,7 @@ public void authorize(final ProvenanceEventRecord event) 
{
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would only be necessary for authorizing access to attributes/content.


> Provenance authorization refactoring
> 
>
> Key: NIFI-4907
> URL: https://issues.apache.org/jira/browse/NIFI-4907
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Mark Bean
>Assignee: Mark Bean
>Priority: Major
>
> Currently, the 'view the data' component policy is too tightly coupled with 
> Provenance queries. The 'query provenance' policy should be the only policy 
> required for viewing Provenance query results. Both 'view the component' and 
> 'view the data' policies should be used to refine the appropriate visibility 
> of event details - but not the event itself.
> 1) Component Visibility
> The authorization of Provenance events is inconsistent with the behavior of 
> the graph. For example, if a user does not have 'view the component' policy, 
> the graph shows this component as a "black box" (no details such as name, 
> UUID, etc.) However, when querying Provenance, this component will show up 
> including the Component Type and the Component Name. This is in effect a 
> violation of the policy. These component details should be obscured in the 
> Provenance event displayed if user does not have the appropriate 'view the 
> component' policy.
> 2) Data Visibility
> For a Provenance query, all events should be visible as long as the user 
> performing the query belongs to the 'query provenance' global policy. As 
> mentioned above, some information about the component may be obscured 
> depending on 'view the component' policy, but the event itself should be 
> visible. Additionally, details of the event (clicking the View Details "i" 
> icon) should only be accessible if the user belongs to the 'view the data' 
> policy for the affected component. If the user is not in the appropriate 
> 'view the data' policy, a popup warning should be displayed indicating the 
> reason details are not visible with more specific detail than the current 
> "Contact the system administrator".
> 3) Lineage Graphs
> As with the Provenance table view recommendation above, the lineage graph 
> should display all events. Currently, if the lineage graph includes an event 
> belonging to a component which the user does not have 'view the data', it is 
> shown on the graph as "UNKNOWN". As with Data Visibility mentioned above, the 
> graph should indicate the event type as long as the user is in the 'view the 
> component'. Subsequent "View Details" on the event should only be visible if 
> the user is in the 'view the data' policy.
> In summary, for Provenance query results and lineage graphs, all events 
> should be shown. Component Name and Component Type information should be 
> conditionally visible depending on the corresponding component policy 'view 
> the component' policy. Event details including Provenance event type and 
> FlowFile information should be conditionally available depending on the 
> corresponding component policy 'view the data'. Inability to display event 
> details should provide feedback to the user indicating the reason.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4907) Provenance authorization refactoring

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498439#comment-16498439
 ] 

ASF GitHub Bot commented on NIFI-4907:
--

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493550
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/WriteAheadProvenanceRepository.java
 ---
@@ -226,12 +226,7 @@ private void authorize(final ProvenanceEventRecord 
event, final NiFiUser user) {
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would only be necessary for authorizing access to attributes/content.


> Provenance authorization refactoring
> 
>
> Key: NIFI-4907
> URL: https://issues.apache.org/jira/browse/NIFI-4907
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Mark Bean
>Assignee: Mark Bean
>Priority: Major
>
> Currently, the 'view the data' component policy is too tightly coupled with 
> Provenance queries. The 'query provenance' policy should be the only policy 
> required for viewing Provenance query results. Both 'view the component' and 
> 'view the data' policies should be used to refine the appropriate visibility 
> of event details - but not the event itself.
> 1) Component Visibility
> The authorization of Provenance events is inconsistent with the behavior of 
> the graph. For example, if a user does not have 'view the component' policy, 
> the graph shows this component as a "black box" (no details such as name, 
> UUID, etc.) However, when querying Provenance, this component will show up 
> including the Component Type and the Component Name. This is in effect a 
> violation of the policy. These component details should be obscured in the 
> Provenance event displayed if user does not have the appropriate 'view the 
> component' policy.
> 2) Data Visibility
> For a Provenance query, all events should be visible as long as the user 
> performing the query belongs to the 'query provenance' global policy. As 
> mentioned above, some information about the component may be obscured 
> depending on 'view the component' policy, but the event itself should be 
> visible. Additionally, details of the event (clicking the View Details "i" 
> icon) should only be accessible if the user belongs to the 'view the data' 
> policy for the affected component. If the user is not in the appropriate 
> 'view the data' policy, a popup warning should be displayed indicating the 
> reason details are not visible with more specific detail than the current 
> "Contact the system administrator".
> 3) Lineage Graphs
> As with the Provenance table view recommendation above, the lineage graph 
> should display all events. Currently, if the lineage graph includes an event 
> belonging to a component which the user does not have 'view the data', it is 
> shown on the graph as "UNKNOWN". As with Data Visibility mentioned above, the 
> graph should indicate the event type as long as the user is in the 'view the 
> component'. Subsequent "View Details" on the event should only be visible if 
> the user is in the 'view the data' policy.
> In summary, for Provenance query results and lineage graphs, all events 
> should be shown. Component Name and Component Type information should be 
> conditionally visible depending on the corresponding component policy 'view 
> the component' policy. Event details including Provenance event type and 
> FlowFile information should be conditionally available depending on the 
> corresponding component policy 'view the data'. Inability to display event 
> details should provide feedback to the user indicating the reason.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4907) Provenance authorization refactoring

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498438#comment-16498438
 ] 

ASF GitHub Bot commented on NIFI-4907:
--

Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493635
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-volatile-provenance-repository/src/main/java/org/apache/nifi/provenance/VolatileProvenanceRepository.java
 ---
@@ -280,12 +276,7 @@ protected void authorize(final ProvenanceEventRecord 
event, final NiFiUser user)
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would only be necessary for authorizing access to attributes/content.


> Provenance authorization refactoring
> 
>
> Key: NIFI-4907
> URL: https://issues.apache.org/jira/browse/NIFI-4907
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Mark Bean
>Assignee: Mark Bean
>Priority: Major
>
> Currently, the 'view the data' component policy is too tightly coupled with 
> Provenance queries. The 'query provenance' policy should be the only policy 
> required for viewing Provenance query results. Both 'view the component' and 
> 'view the data' policies should be used to refine the appropriate visibility 
> of event details - but not the event itself.
> 1) Component Visibility
> The authorization of Provenance events is inconsistent with the behavior of 
> the graph. For example, if a user does not have 'view the component' policy, 
> the graph shows this component as a "black box" (no details such as name, 
> UUID, etc.) However, when querying Provenance, this component will show up 
> including the Component Type and the Component Name. This is in effect a 
> violation of the policy. These component details should be obscured in the 
> Provenance event displayed if user does not have the appropriate 'view the 
> component' policy.
> 2) Data Visibility
> For a Provenance query, all events should be visible as long as the user 
> performing the query belongs to the 'query provenance' global policy. As 
> mentioned above, some information about the component may be obscured 
> depending on 'view the component' policy, but the event itself should be 
> visible. Additionally, details of the event (clicking the View Details "i" 
> icon) should only be accessible if the user belongs to the 'view the data' 
> policy for the affected component. If the user is not in the appropriate 
> 'view the data' policy, a popup warning should be displayed indicating the 
> reason details are not visible with more specific detail than the current 
> "Contact the system administrator".
> 3) Lineage Graphs
> As with the Provenance table view recommendation above, the lineage graph 
> should display all events. Currently, if the lineage graph includes an event 
> belonging to a component which the user does not have 'view the data', it is 
> shown on the graph as "UNKNOWN". As with Data Visibility mentioned above, the 
> graph should indicate the event type as long as the user is in the 'view the 
> component'. Subsequent "View Details" on the event should only be visible if 
> the user is in the 'view the data' policy.
> In summary, for Provenance query results and lineage graphs, all events 
> should be shown. Component Name and Component Type information should be 
> conditionally visible depending on the corresponding component policy 'view 
> the component' policy. Event details including Provenance event type and 
> FlowFile information should be conditionally available depending on the 
> corresponding component policy 'view the data'. Inability to display event 
> details should provide feedback to the user indicating the reason.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498479#comment-16498479
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192496067
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
+.required(true)
+.build();
+
+static final PropertyDescriptor RECORD_PATH = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-path")
+.displayName("Record Path")
+.description("An optional record path that can be used to define 
where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498473#comment-16498473
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192493590
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
+.required(true)
+.build();
+
+static final PropertyDescriptor RECORD_PATH = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-path")
+.displayName("Record Path")
+.description("An optional record path that can be used to define 
where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498471#comment-16498471
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192492813
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498413#comment-16498413
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192488650
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,276 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@Tags({ "rest", "lookup", "json", "xml" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+        .addValidator(Validator.VALID)
+        .required(false)
+        .build();
+
+    static final PropertyDescriptor SSL_CONTEXT_SERVICE = new PropertyDescriptor.Builder()
+        .name("rest-lookup-ssl-context-service")
+

[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2723
  
@ijokarumawak Your first feedback points were merged in, let me know what 
you think of the last item and it should be easy to get done.


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192488650
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,276 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@Tags({ "rest", "lookup", "json", "xml" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+        .addValidator(Validator.VALID)
+        .required(false)
+        .build();
+
+    static final PropertyDescriptor SSL_CONTEXT_SERVICE = new PropertyDescriptor.Builder()
+        .name("rest-lookup-ssl-context-service")
+        .displayName("SSL Context Service")
+        .description("The SSL Context Service used to provide client certificate information for TLS/SSL "
+            + "connections.")
+        .required(false)
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498415#comment-16498415
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2723
  
@ijokarumawak Your first feedback points were merged in, let me know what 
you think of the last item and it should be easy to get done.


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: https://issues.apache.org/jira/browse/NIFI-5214
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> * Should have reader API support
>  * Should be able to drill down through complex XML and JSON responses to a 
> nested record.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-5259) Provenance repo "failed to perform background maintenance procedures" due failing to read schema

2018-06-01 Thread Joseph Percivall (JIRA)
Joseph Percivall created NIFI-5259:
--

 Summary: Provenance repo "failed to perform background maintenance 
procedures" due failing to read schema
 Key: NIFI-5259
 URL: https://issues.apache.org/jira/browse/NIFI-5259
 Project: Apache NiFi
  Issue Type: Bug
Affects Versions: 1.6.0
 Environment: Dockerized NiFi v1.6.0, with a link to a Registry 
instance, receiving data from MiNiFi java v0.4.0 and NiFi v1.6.0
Reporter: Joseph Percivall
Assignee: Mark Payne


Seeing an odd error (stack trace below) with the Provenance Repo, both as a background task 
and when attempting to query it. The repo is not getting a lot of data, and the 
issue persists through restarts of the container and also through 
stop/rm/docker-compose up of the container.

Looking at the code, it's attempting to read the first record in the repo:
final List<ProvenanceEventRecord> firstEvents = eventStore.getEvents(0, 1);
Looking through the provenance record itself, it appears the event is simply 
missing that field altogether.

 
{quote}2018-06-01 19:32:55,114 ERROR [Provenance Repository Maintenance-1] 
o.a.n.p.index.lucene.LuceneEventIndex Failed to perform background maintenance 
procedures
java.io.IOException: Invalid Boolean value found when reading 'Repetition' of 
field 'Source System FlowFile Identifier'. Expected 0 or 1 but got 145
  at org.apache.nifi.repository.schema.SchemaRecordReader.readField(SchemaRecordReader.java:107)
  at org.apache.nifi.repository.schema.SchemaRecordReader.readRecord(SchemaRecordReader.java:72)
  at org.apache.nifi.provenance.EventIdFirstSchemaRecordReader.readRecord(EventIdFirstSchemaRecordReader.java:138)
  at org.apache.nifi.provenance.EventIdFirstSchemaRecordReader.nextRecord(EventIdFirstSchemaRecordReader.java:132)
  at org.apache.nifi.provenance.serialization.CompressableRecordReader.nextRecord(CompressableRecordReader.java:287)
  at org.apache.nifi.provenance.store.iterator.SequentialRecordReaderEventIterator.nextEvent(SequentialRecordReaderEventIterator.java:73)
  at org.apache.nifi.provenance.store.iterator.AuthorizingEventIterator.nextEvent(AuthorizingEventIterator.java:47)
  at org.apache.nifi.provenance.store.PartitionedEventStore.getEvents(PartitionedEventStore.java:214)
  at org.apache.nifi.provenance.store.PartitionedEventStore.getEvents(PartitionedEventStore.java:158)
  at org.apache.nifi.provenance.store.PartitionedEventStore.getEvents(PartitionedEventStore.java:148)
  at org.apache.nifi.provenance.index.lucene.LuceneEventIndex.performMaintenance(LuceneEventIndex.java:650)
  at org.apache.nifi.provenance.index.lucene.LuceneEventIndex.lambda$initialize$0(LuceneEventIndex.java:156)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  at java.lang.Thread.run(Thread.java:748)
{quote}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5259) Provenance repo "failed to perform background maintenance procedures" due failing to read schema

2018-06-01 Thread Joseph Percivall (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498600#comment-16498600
 ] 

Joseph Percivall commented on NIFI-5259:


Talking on the Apache NiFi HipChat, [~markap14] said he'd take a look next week.

> Provenance repo "failed to perform background maintenance procedures" due 
> failing to read schema
> 
>
> Key: NIFI-5259
> URL: https://issues.apache.org/jira/browse/NIFI-5259
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
> Environment: Dockerized NiFi v1.6.0, with a link to a Registry 
> instance, receiving data from MiNiFi java v0.4.0 and NiFi v1.6.0
>Reporter: Joseph Percivall
>Assignee: Mark Payne
>Priority: Major
>
> Seeing an odd error (stack trace below) with the Provenance Repo, both as a background task 
> and when attempting to query it. The repo is not getting a lot of data, and the 
> issue persists through restarts of the container and also through 
> stop/rm/docker-compose up of the container.
> Looking at the code, it's attempting to read the first record in the repo:
> final List<ProvenanceEventRecord> firstEvents = eventStore.getEvents(0, 1);
> Looking through the provenance record itself, it appears the event is simply 
> missing that field altogether.
>  
> {quote}2018-06-01 19:32:55,114 ERROR [Provenance Repository Maintenance-1] 
> o.a.n.p.index.lucene.LuceneEventIndex Failed to perform background 
> maintenance procedures
> java.io.IOException: Invalid Boolean value found when reading 'Repetition' of 
> field 'Source System FlowFile Identifier'. Expected 0 or 1 but got 145
>   at org.apache.nifi.repository.schema.SchemaRecordReader.readField(SchemaRecordReader.java:107)
>   at org.apache.nifi.repository.schema.SchemaRecordReader.readRecord(SchemaRecordReader.java:72)
>   at org.apache.nifi.provenance.EventIdFirstSchemaRecordReader.readRecord(EventIdFirstSchemaRecordReader.java:138)
>   at org.apache.nifi.provenance.EventIdFirstSchemaRecordReader.nextRecord(EventIdFirstSchemaRecordReader.java:132)
>   at org.apache.nifi.provenance.serialization.CompressableRecordReader.nextRecord(CompressableRecordReader.java:287)
>   at org.apache.nifi.provenance.store.iterator.SequentialRecordReaderEventIterator.nextEvent(SequentialRecordReaderEventIterator.java:73)
>   at org.apache.nifi.provenance.store.iterator.AuthorizingEventIterator.nextEvent(AuthorizingEventIterator.java:47)
>   at org.apache.nifi.provenance.store.PartitionedEventStore.getEvents(PartitionedEventStore.java:214)
>   at org.apache.nifi.provenance.store.PartitionedEventStore.getEvents(PartitionedEventStore.java:158)
>   at org.apache.nifi.provenance.store.PartitionedEventStore.getEvents(PartitionedEventStore.java:148)
>   at org.apache.nifi.provenance.index.lucene.LuceneEventIndex.performMaintenance(LuceneEventIndex.java:650)
>   at org.apache.nifi.provenance.index.lucene.LuceneEventIndex.lambda$initialize$0(LuceneEventIndex.java:156)
>   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> {quote}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFIREG-173) Allow metadata DB to use other DBs besides H2

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFIREG-173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498634#comment-16498634
 ] 

ASF GitHub Bot commented on NIFIREG-173:


Github user kevdoran commented on the issue:

https://github.com/apache/nifi-registry/pull/121
  
Will review...


> Allow metadata DB to use other DBs besides H2
> -
>
> Key: NIFIREG-173
> URL: https://issues.apache.org/jira/browse/NIFIREG-173
> Project: NiFi Registry
>  Issue Type: Improvement
>Affects Versions: 0.1.0
>Reporter: Bryan Bende
>Assignee: Bryan Bende
>Priority: Major
>
> Now that we have the Git provider for flow storage which can be used to push 
> flows to a remote location, it would be nice to be able to leverage an 
> external DB for the metadata database.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-registry issue #121: NIFIREG-173 Refactor metadata DB to be independent...

2018-06-01 Thread kevdoran
Github user kevdoran commented on the issue:

https://github.com/apache/nifi-registry/pull/121
  
Will review...


---


[jira] [Commented] (NIFIREG-173) Allow metadata DB to use other DBs besides H2

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFIREG-173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498386#comment-16498386
 ] 

ASF GitHub Bot commented on NIFIREG-173:


GitHub user bbende opened a pull request:

https://github.com/apache/nifi-registry/pull/121

NIFIREG-173 Refactor metadata DB to be independent of H2


The approach here is to create a new DB with a schema and DataSource that 
is not specific to H2 and migrate existing data. 

During startup, if the previous DB properties are populated and the new DB 
has not been set up yet, it initiates a migration of the data from the old 
DB to the new DB. Anyone starting up for the first time without the legacy 
DB will simply start with the new schema and DB.

I tested taking an existing 0.1.0 registry and dropping the old H2 DB into 
a build of this PR; it migrated over to the new H2 DB and I was able to 
continue using the app as normal. Future restarts after that don't trigger the 
migration since the new DB already exists at that point.

I also tested the same scenario as above, but using Postgres as the target 
DB by copying the Postgres driver jar into the lib directory and configuring 
the appropriate properties in nifi-registry.properties.
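
A minimal sketch of that startup decision, under my own assumptions (the method names below are placeholders, not the actual code in this PR):

    // Hypothetical illustration of the migrate-once-on-startup behavior described above.
    public void startMetadataDatabase() {
        final boolean legacyConfigured = legacyH2PropertiesPresent();   // old H2 settings still populated?
        final boolean newDbInitialized = newSchemaAlreadyCreated();     // vendor-neutral schema exists?

        if (legacyConfigured && !newDbInitialized) {
            // One-time copy of the existing metadata from the legacy H2 DB
            // into the new DataSource (H2 by default, Postgres if configured).
            migrateLegacyData();
        }

        // Regular startup against the new schema; later restarts skip the migration
        // branch because the new DB already exists.
        openNewDataSource();
    }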

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/bbende/nifi-registry db-refactor

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-registry/pull/121.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #121


commit b8ac082b0c0449a038aa09a38d67789e8b1559db
Author: Bryan Bende 
Date:   2018-05-30T18:31:26Z

NIFIREG-173 Refactor metadata DB to be independent of H2




> Allow metadata DB to use other DBs besides H2
> -
>
> Key: NIFIREG-173
> URL: https://issues.apache.org/jira/browse/NIFIREG-173
> Project: NiFi Registry
>  Issue Type: Improvement
>Affects Versions: 0.1.0
>Reporter: Bryan Bende
>Assignee: Bryan Bende
>Priority: Major
>
> Now that we have the Git provider for flow storage which can be used to push 
> flows to a remote location, it would be nice to be able to leverage an 
> external DB for the metadata database.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498401#comment-16498401
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192486471
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/test/groovy/org/apache/nifi/lookup/RestLookupServiceIT.groovy
 ---
@@ -106,6 +106,37 @@ class RestLookupServiceIT {
 }
 }
 
+@Test
+void testHeaders() {
+runner.disableControllerService(lookupService)
+runner.setProperty(lookupService, "header.X-USER", "jane.doe")
+runner.setProperty(lookupService, "header.X-PASS", "testing7890")
+runner.enableControllerService(lookupService)
+
+TestServer server = new TestServer()
+ServletHandler handler = new ServletHandler()
+handler.addServletWithMapping(SimpleJson.class, "/simple")
+server.addHandler(handler)
+try {
+server.startServer()
+
+def coordinates = [
+"schema.name": "simple",
+"endpoint": server.url + "/simple",
--- End diff --

I like that. How about this breakdown:

1. `endpoint` is a template string set on a property descriptor and we use 
an EL compiler to generate the endpoint on the fly with those lookup 
coordinates.
2. Add `direct_endpoint`, which is treated as a literal value and overrides 
that if present.
3. If both are present, throw an exception.

Thoughts?
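
A rough sketch of that breakdown (PropertyValue is the real NiFi API, but the property names and the resolveEndpoint helper are hypothetical, not part of this PR):

    // Hypothetical helper: resolve the lookup URL either from an EL template evaluated
    // against the coordinates, or from a literal override, but never both.
    private String resolveEndpoint(final PropertyValue endpointTemplate,
                                   final PropertyValue directEndpoint,
                                   final Map<String, Object> coordinates) {
        if (endpointTemplate.isSet() && directEndpoint.isSet()) {
            throw new IllegalStateException("Configure either the endpoint template or the direct endpoint, not both.");
        }
        if (directEndpoint.isSet()) {
            return directEndpoint.getValue();   // taken as a literal URL
        }
        // Expose the lookup coordinates as variables to the expression language.
        final Map<String, String> variables = coordinates.entrySet().stream()
            .collect(Collectors.toMap(Map.Entry::getKey, e -> String.valueOf(e.getValue())));
        return endpointTemplate.evaluateAttributeExpressions(variables).getValue();
    }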


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: https://issues.apache.org/jira/browse/NIFI-5214
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> * Should have reader API support
>  * Should be able to drill down through complex XML and JSON responses to a 
> nested record.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192486471
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/test/groovy/org/apache/nifi/lookup/RestLookupServiceIT.groovy
 ---
@@ -106,6 +106,37 @@ class RestLookupServiceIT {
 }
 }
 
+@Test
+void testHeaders() {
+runner.disableControllerService(lookupService)
+runner.setProperty(lookupService, "header.X-USER", "jane.doe")
+runner.setProperty(lookupService, "header.X-PASS", "testing7890")
+runner.enableControllerService(lookupService)
+
+TestServer server = new TestServer()
+ServletHandler handler = new ServletHandler()
+handler.addServletWithMapping(SimpleJson.class, "/simple")
+server.addHandler(handler)
+try {
+server.startServer()
+
+def coordinates = [
+"schema.name": "simple",
+"endpoint": server.url + "/simple",
--- End diff --

I like that. How about this breakdown:

1. `endpoint` is a template string set on a property descriptor and we use 
an EL compiler to generate the endpoint on the fly with those lookup 
coordinates.
2. Add `direct_endpoint`, which is treated as a literal value and overrides 
that if present.
3. If both are present, throw an exception.

Thoughts?


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192492813
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192492212
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192494587
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192496067
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192495631
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192492300
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
--- End diff --

No need for the explicit validator when identifying controller services.
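
In other words, a sketch of the suggested change to the descriptor quoted above, with the redundant validator dropped:

    // identifiesControllerService() already drives validation, so the explicit
    // addValidator(Validator.VALID) call can simply be removed.
    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
        .name("rest-lookup-record-reader")
        .displayName("Record Reader")
        .description("The record reader to use for loading the payload and handling it as a record set.")
        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
        .identifiesControllerService(RecordReaderFactory.class)
        .required(true)
        .build();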


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192493834
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
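The two properties above drive how a lookup is served. A rough usage sketch, assuming the service is exposed as a LookupService&lt;Record&gt;; the coordinate keys and URL below are illustrative assumptions, not taken from the PR:

    import org.apache.nifi.lookup.LookupFailureException;
    import org.apache.nifi.lookup.LookupService;
    import org.apache.nifi.serialization.record.Record;

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    class RestLookupUsageSketch {
        // Hypothetical coordinate keys; the real keys are whatever the service documents.
        static Optional<Record> fetchEnrichment(final LookupService<Record> service) throws LookupFailureException {
            final Map<String, Object> coordinates = new HashMap<>();
            coordinates.put("endpoint", "http://example.com/api/users/42"); // assumed key and URL
            coordinates.put("mime.type", "application/json");               // assumed key

            // The service issues the REST call, parses the body with the configured Record Reader,
            // and, if a Record Path is set, returns only the portion of the record it selects.
            return service.lookup(coordinates);
        }
    }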

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192493590
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192495332
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498476#comment-16498476
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192496937
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498470#comment-16498470
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192493834
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498478#comment-16498478
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192495631
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498474#comment-16498474
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192495332
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192494134
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192494945
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192496937
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real data to merge " +
+            "into the record set to be enriched. See documentation for examples of when this might be useful.")
+        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498467#comment-16498467
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192492212
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498469#comment-16498469
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192493332
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498477#comment-16498477
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192494587
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService implements LookupService {
+    static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-reader")
+        .displayName("Record Reader")
+        .description("The record reader to use for loading the payload and handling it as a record set.")
+        .expressionLanguageSupported(ExpressionLanguageScope.NONE)
+        .identifiesControllerService(RecordReaderFactory.class)
+        .addValidator(Validator.VALID)
+        .required(true)
+        .build();
+
+    static final PropertyDescriptor RECORD_PATH = new PropertyDescriptor.Builder()
+        .name("rest-lookup-record-path")
+        .displayName("Record Path")
+        .description("An optional record path that can be used to define where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498472#comment-16498472
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192494134
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
+.required(true)
+.build();
+
+static final PropertyDescriptor RECORD_PATH = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-path")
+.displayName("Record Path")
+.description("An optional record path that can be used to define 
where in a record to get the real 

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498468#comment-16498468
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192492300
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
--- End diff --

No need for the explicit validator when identifying controller services.
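A minimal sketch of the property builder with that suggestion applied (illustrative only, reusing the names from the diff above):

```
static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
    .name("rest-lookup-record-reader")
    .displayName("Record Reader")
    .description("The record reader to use for loading the payload and handling it as a record set.")
    .expressionLanguageSupported(ExpressionLanguageScope.NONE)
    // per the review comment, identifying a controller service is enough;
    // the explicit .addValidator(Validator.VALID) is dropped
    .identifiesControllerService(RecordReaderFactory.class)
    .required(true)
    .build();
```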


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: https://issues.apache.org/jira/browse/NIFI-5214
> Project: Apache NiFi
>  Issue Type: New Feature
>

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498475#comment-16498475
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192494945
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
+.required(true)
+.build();
+
+static final PropertyDescriptor RECORD_PATH = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-path")
+.displayName("Record Path")
+.description("An optional record path that can be used to define 
where in a record to get the real 

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread markap14
Github user markap14 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192493332
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,390 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
+.required(true)
+.build();
+
+static final PropertyDescriptor RECORD_PATH = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-path")
+.displayName("Record Path")
+.description("An optional record path that can be used to define 
where in a record to get the real data to merge " +
+"into the record set to be enriched. See documentation for 
examples of when this might be useful.")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+

[GitHub] nifi pull request #2703: NIFI-4907: add 'view provenance' component policy

2018-06-01 Thread mcgilman
Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493635
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-volatile-provenance-repository/src/main/java/org/apache/nifi/provenance/VolatileProvenanceRepository.java
 ---
@@ -280,12 +276,7 @@ protected void authorize(final ProvenanceEventRecord 
event, final NiFiUser user)
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would be necessary for authorizing access to attributes/content.
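A rough sketch of what that suggestion might look like, assuming Authorizable exposes an authorize overload without the attribute map (a fragment for illustration, not the PR's actual change):

```
final Authorizable eventAuthorizable =
        resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
// authorize the provenance event itself; the event attributes would only be
// needed when authorizing access to the event's attributes/content
eventAuthorizable.authorize(authorizer, RequestAction.READ, user);
```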


---


[GitHub] nifi pull request #2703: NIFI-4907: add 'view provenance' component policy

2018-06-01 Thread mcgilman
Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493603
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/authorization/UserEventAuthorizer.java
 ---
@@ -65,12 +61,7 @@ public void authorize(final ProvenanceEventRecord event) 
{
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would be necessary for authorizing access to attributes/content.


---


[GitHub] nifi pull request #2703: NIFI-4907: add 'view provenance' component policy

2018-06-01 Thread mcgilman
Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493531
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/PersistentProvenanceRepository.java
 ---
@@ -403,12 +399,7 @@ public void authorize(final ProvenanceEventRecord 
event, final NiFiUser user) {
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would be necessary for authorizing access to attributes/content.


---


[GitHub] nifi pull request #2703: NIFI-4907: add 'view provenance' component policy

2018-06-01 Thread mcgilman
Github user mcgilman commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192493550
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/WriteAheadProvenanceRepository.java
 ---
@@ -226,12 +226,7 @@ private void authorize(final ProvenanceEventRecord 
event, final NiFiUser user) {
 return;
 }
 
-final Authorizable eventAuthorizable;
-if (event.isRemotePortType()) {
-eventAuthorizable = 
resourceFactory.createRemoteDataAuthorizable(event.getComponentId());
-} else {
-eventAuthorizable = 
resourceFactory.createLocalDataAuthorizable(event.getComponentId());
-}
+final Authorizable eventAuthorizable = 
resourceFactory.createProvenanceDataAuthorizable(event.getComponentId());
 eventAuthorizable.authorize(authorizer, RequestAction.READ, user, 
event.getAttributes());
--- End diff --

I don't think the attributes are necessary here. I'm pretty sure the event 
attributes would be necessary for authorizing access to attributes/content.


---


[GitHub] nifi-registry pull request #121: NIFIREG-173 Refactor metadata DB to be inde...

2018-06-01 Thread bbende
GitHub user bbende opened a pull request:

https://github.com/apache/nifi-registry/pull/121

NIFIREG-173 Refactor metadata DB to be independent of H2


The approach here is to create a new DB with a schema and DataSource that 
is not specific to H2 and migrate existing data. 

During start up, if the previous DB properties are populated and the new DB 
has not been set up yet, it initiates a migration of the data from the old 
DB to the new DB. Anyone starting up for the first time without the legacy DB 
will just start with the new schema and DB.
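
A rough, hypothetical sketch of that startup decision (class and property names are illustrative assumptions, not the actual nifi-registry code):

```
import java.util.Properties;

class LegacyDatabaseMigrationCheck {

    // Migrate from the legacy H2 DB only when the legacy DB is configured
    // and the new DB has not been populated yet, so the migration runs once.
    static boolean shouldMigrate(Properties props, boolean newDbAlreadyPopulated) {
        final String legacyDbDirectory = props.getProperty("nifi.registry.db.directory", "");
        final boolean legacyConfigured = !legacyDbDirectory.trim().isEmpty();
        return legacyConfigured && !newDbAlreadyPopulated;
    }

    public static void main(String[] args) {
        final Properties props = new Properties();
        props.setProperty("nifi.registry.db.directory", "./database");
        System.out.println(shouldMigrate(props, false)); // true  -> run one-time migration
        System.out.println(shouldMigrate(props, true));  // false -> new DB already in use
    }
}
```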

I tested taking an existing 0.1.0 registry and dropping the old H2 DB into 
a build of this PR and it migrated over to the new H2 DB and was able to 
continue using the app as normal. Future restarts after that don't trigger the 
migration since the new DB already exists at that point.

I also tested the same scenario as above, but using Postgres as the target 
DB by copying the Postgres driver jar into the lib directory and configuring 
the appropriate properties in nifi-registry.properties.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/bbende/nifi-registry db-refactor

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-registry/pull/121.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #121


commit b8ac082b0c0449a038aa09a38d67789e8b1559db
Author: Bryan Bende 
Date:   2018-05-30T18:31:26Z

NIFIREG-173 Refactor metadata DB to be independent of H2




---


[GitHub] nifi pull request #2751: NIFI-5221: Added 'Object Tagging' functionalities t...

2018-06-01 Thread zenfenan
GitHub user zenfenan opened a pull request:

https://github.com/apache/nifi/pull/2751

NIFI-5221: Added 'Object Tagging' functionalities to S3 Processors

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [x] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/zenfenan/nifi NIFI-5221

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2751.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2751


commit 91ddbcd85e1de7239ea6d151150e0fa83504a98e
Author: zenfenan 
Date:   2018-06-01T11:35:09Z

NIFI-5221: Added 'Object Tagging' functionalities to S3 Processors




---


[GitHub] nifi issue #2749: NIFI-5145 Fixed evaluateAttributeExpressions in mockproper...

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2749
  
@joewitt can you review this commit?


---


[jira] [Commented] (NIFI-5145) MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null inputs

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497951#comment-16497951
 ] 

ASF GitHub Bot commented on NIFI-5145:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2749
  
@joewitt can you review this commit?


> MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null 
> inputs
> 
>
> Key: NIFI-5145
> URL: https://issues.apache.org/jira/browse/NIFI-5145
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> The method mentioned in the title line cannot handle null inputs, even though 
> the main NiFi execution classes can handle that scenario. This forces a hack to 
> pass testing with nulls that looks like this:
> String val = flowFile != null ? 
> context.getProperty(PROP).evaluateExpressionLanguage(flowfile).getValue() : 
> context.getProperty(PROP).evaluateExpressionLanguage(new 
> HashMap()).getValue();



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2723
  
@ijokarumawak I can't believe I missed your comments. Will try to get to 
those today.


---


[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497954#comment-16497954
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2723
  
@ijokarumawak I can't believe I missed your comments. Will try to get to 
those today.


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: https://issues.apache.org/jira/browse/NIFI-5214
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> * Should have reader API support
>  * Should be able to drill down through complex XML and JSON responses to a 
> nested record.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497884#comment-16497884
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

GitHub user zenfenan opened a pull request:

https://github.com/apache/nifi/pull/2751

NIFI-5221: Added 'Object Tagging' functionalities to S3 Processors

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [x] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/zenfenan/nifi NIFI-5221

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2751.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2751


commit 91ddbcd85e1de7239ea6d151150e0fa83504a98e
Author: zenfenan 
Date:   2018-06-01T11:35:09Z

NIFI-5221: Added 'Object Tagging' functionalities to S3 Processors




> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced a new set of functionalities that enable S3 buckets and 
> objects to be tagged. This can be useful for data classification purposes, and 
> with new data-related regulations such as GDPR being introduced, object 
> tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497887#comment-16497887
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r192371979
  
--- Diff: nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/pom.xml ---
@@ -69,6 +69,11 @@
             <version>2.6.6</version>
             <scope>test</scope>
         </dependency>
+        <dependency>
+            <groupId>com.google.code.gson</groupId>
+            <artifactId>gson</artifactId>
--- End diff --

Does it need to be added in nifi-aws-nar's 
[NOTICE](https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-aws-bundle/nifi-aws-nar/src/main/resources/META-INF/NOTICE)
 ?


> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced a new set of functionalities that enable S3 buckets and 
> objects to be tagged. This can be useful for data classification purposes, and 
> with new data-related regulations such as GDPR being introduced, object 
> tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2751: NIFI-5221: Added 'Object Tagging' functionalities t...

2018-06-01 Thread zenfenan
Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r192371979
  
--- Diff: nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/pom.xml ---
@@ -69,6 +69,11 @@
             <version>2.6.6</version>
             <scope>test</scope>
         </dependency>
+        <dependency>
+            <groupId>com.google.code.gson</groupId>
+            <artifactId>gson</artifactId>
--- End diff --

Does it need to be added in nifi-aws-nar's 
[NOTICE](https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-aws-bundle/nifi-aws-nar/src/main/resources/META-INF/NOTICE)
 ?


---


[jira] [Commented] (NIFI-5249) Dockerfile enhancements

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497963#comment-16497963
 ] 

ASF GitHub Bot commented on NIFI-5249:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
So to test this?
- mvn package
- mvn package -P docker from nifi-docker
- ???
- docker run --rm -ti --entrypoint /bin/bash apache/nifi -c "env | grep 
NIFI" ? from nifi-docker dir?
- docker run --rm -ti --entrypoint /bin/bash apache/nifi -c "find /opt/nifi 
! -user nifi"  from nifi-docker dir?


> Dockerfile enhancements
> ---
>
> Key: NIFI-5249
> URL: https://issues.apache.org/jira/browse/NIFI-5249
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> * make environment variables more explicit
>  * create data and log directories
>  * add procps for process visibility inside the container



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-06-01 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
So to test this?
- mvn package
- mvn package -P docker from nifi-docker
- ???
- docker run --rm -ti --entrypoint /bin/bash apache/nifi -c "env | grep 
NIFI" ? from nifi-docker dir?
- docker run --rm -ti --entrypoint /bin/bash apache/nifi -c "find /opt/nifi 
! -user nifi"  from nifi-docker dir?


---


[jira] [Commented] (NIFI-4907) Provenance authorization refactoring

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498059#comment-16498059
 ] 

ASF GitHub Bot commented on NIFI-4907:
--

Github user markobean commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192413226
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/controller/ControllerFacade.java
 ---
@@ -1359,7 +1363,12 @@ public ProvenanceEventDTO getProvenanceEvent(final 
Long eventId) {
 } else {
 dataAuthorizable = 
flowController.createLocalDataAuthorizable(event.getComponentId());
 }
-dataAuthorizable.authorize(authorizer, RequestAction.READ, 
NiFiUserUtils.getNiFiUser(), attributes);
+// If not authorized for 'view the data', create only 
summarized provenance event
--- End diff --

My only concern with the approach you outlined is the additional 
authorization calls to determine "if the user is allowed". What you suggest 
requires up to 2 additional authorizations per provenance event. On busy 
systems we have already observed that authorizing the user for each provenance 
event is a limiting factor (it can result in provenance becoming unusable).  
Having said that, unless you think of another approach that would require 
fewer authorization calls, I'll proceed as you recommend. I suspect there may 
be a future JIRA ticket to address the provenance query/authorization impact 
anyhow; if so, this can be addressed at that time. We won't know for sure if 
this is a problem until we get the current fix into an appropriately loaded 
test environment.


> Provenance authorization refactoring
> 
>
> Key: NIFI-4907
> URL: https://issues.apache.org/jira/browse/NIFI-4907
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.5.0
>Reporter: Mark Bean
>Assignee: Mark Bean
>Priority: Major
>
> Currently, the 'view the data' component policy is too tightly coupled with 
> Provenance queries. The 'query provenance' policy should be the only policy 
> required for viewing Provenance query results. Both 'view the component' and 
> 'view the data' policies should be used to refine the appropriate visibility 
> of event details - but not the event itself.
> 1) Component Visibility
> The authorization of Provenance events is inconsistent with the behavior of 
> the graph. For example, if a user does not have 'view the component' policy, 
> the graph shows this component as a "black box" (no details such as name, 
> UUID, etc.) However, when querying Provenance, this component will show up 
> including the Component Type and the Component Name. This is in effect a 
> violation of the policy. These component details should be obscured in the 
> Provenance event displayed if user does not have the appropriate 'view the 
> component' policy.
> 2) Data Visibility
> For a Provenance query, all events should be visible as long as the user 
> performing the query belongs to the 'query provenance' global policy. As 
> mentioned above, some information about the component may be obscured 
> depending on 'view the component' policy, but the event itself should be 
> visible. Additionally, details of the event (clicking the View Details "i" 
> icon) should only be accessible if the user belongs to the 'view the data' 
> policy for the affected component. If the user is not in the appropriate 
> 'view the data' policy, a popup warning should be displayed indicating the 
> reason details are not visible with more specific detail than the current 
> "Contact the system administrator".
> 3) Lineage Graphs
> As with the Provenance table view recommendation above, the lineage graph 
> should display all events. Currently, if the lineage graph includes an event 
> belonging to a component which the user does not have 'view the data', it is 
> shown on the graph as "UNKNOWN". As with Data Visibility mentioned above, the 
> graph should indicate the event type as long as the user is in the 'view the 
> component'. Subsequent "View Details" on the event should only be visible if 
> the user is in the 'view the data' policy.
> In summary, for Provenance query results and lineage graphs, all events 
> should be shown. Component Name and Component Type information should be 
> conditionally visible depending on the corresponding component policy 'view 
> the component' policy. Event details including Provenance event type and 
> FlowFile information should be conditionally available depending on the 
> corresponding component policy 'view the data'. Inability to display event 
> details should provide feedback to the user indicating the 

[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498117#comment-16498117
 ] 

ASF GitHub Bot commented on NIFI-5200:
--

GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2753

NIFI-5200: Fixed issue with InputStream being closed when calling Pro…

…cessSession.read() twice against sequential Content Claims

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-5200-Fix

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2753.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2753


commit 5ac73af025bbb34246dabdf39d806a50d49971a6
Author: Mark Payne 
Date:   2018-06-01T15:14:56Z

NIFI-5200: Fixed issue with InputStream being closed when calling 
ProcessSession.read() twice against sequential Content Claims




> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Minor
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional parameter to session.read of 
> allowSessionStreamManagement=true.
> Is this expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-06-01 Thread pepov
Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2747
  
No, sorry! I meant integration-test, not package. I believe this would do 
it; at least it works for me, just tested:
```
mvn package -pl nifi-assembly
cd nifi-docker
mvn integration-test -P docker
```



---


[jira] [Commented] (NIFI-5249) Dockerfile enhancements

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16497992#comment-16497992
 ] 

ASF GitHub Bot commented on NIFI-5249:
--

Github user pepov commented on the issue:

https://github.com/apache/nifi/pull/2747
  
No, sorry! I meant integration-test, not package. I believe this would do 
it; at least it works for me, just tested:
```
mvn package -pl nifi-assembly
cd nifi-docker
mvn integration-test -P docker
```



> Dockerfile enhancements
> ---
>
> Key: NIFI-5249
> URL: https://issues.apache.org/jira/browse/NIFI-5249
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> * make environment variables more explicit
>  * create data and log directories
>  * add procps for process visibility inside the container



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4410) PutElasticsearchHttp needs better error handling and logging

2018-06-01 Thread John Smith (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4410?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498066#comment-16498066
 ] 

John Smith commented on NIFI-4410:
--

We're still seeing this problem in NiFi 1.6.0. We're using Elasticsearch 6.2.4

> PutElasticsearchHttp needs better error handling and logging
> 
>
> Key: NIFI-4410
> URL: https://issues.apache.org/jira/browse/NIFI-4410
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Joseph Witt
>Assignee: Matt Burgess
>Priority: Major
> Fix For: 1.6.0
>
>
> https://github.com/apache/nifi/blob/6b5015e39b4233cf230151fb45bebcb21df03730/nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttp.java#L364-L366
> If it cannot extract the reason text it provides a very generic error and 
> there is nothing else logged.  You get no context as to what went wrong and 
> further the condition doesn't cause yielding or anything so there is just a 
> massive flood of errors in logs that dont' advise the user of the problem.
> We need to make sure the information can be made available to help 
> troubleshoot and we need to cause yielding so that such cases do not cause 
> continuous floods of errors.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-06-01 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
+1 fwiw|lgtm.
Ran the above steps to build and test the image from the maven snapshots ( 
not built locally ).
Everything ran fine.

Your integration test is awesome.  I'm totally going to steal it.

Super stuff.



---


[jira] [Commented] (NIFI-5249) Dockerfile enhancements

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5249?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498077#comment-16498077
 ] 

ASF GitHub Bot commented on NIFI-5249:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
+1 fwiw|lgtm.
Ran the above steps to build and test the image from the maven snapshots ( 
not built locally ).
Everything ran fine.

Your integration test is awesome.  I'm totally going to steal it.

Super stuff.



> Dockerfile enhancements
> ---
>
> Key: NIFI-5249
> URL: https://issues.apache.org/jira/browse/NIFI-5249
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> * make environment variables more explicit
>  * create data and log directories
>  * add procps for process visibility inside the container



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread Mark Payne (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498116#comment-16498116
 ] 

Mark Payne commented on NIFI-5200:
--

Re-Opening Jira as it introduced a new bug. In the case where 
`ProcessSession.read(flowFile1)` is called, followed by 
`ProcessSession.read(flowFile2)` within the same ProcessSession, if flowFile2's 
content is in the same resource claim as flowFile1, but at a later position in 
the claim, then an Exception is thrown when attempting to read from the stream 
for flowFile2:
{code:java}
2018-06-01 10:39:58,661 ERROR [Timer-Driven Process Thread-10] 
o.a.nifi.processors.standard.QueryRecord 
QueryRecord[id=9c2efcee-d6db-3017-c02d-02cc3b76b759] Unable to query 
StandardFlowFileRecord[uuid=06a17875-275f-47b8-b7fa-c90e43dd024f,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=1527275483034-20, container=default, 
section=20], offset=747772, length=186943],offset=0,name=1.log,size=186943] due 
to org.apache.nifi.processor.exception.ProcessException: Failed to read next 
record in stream for 
StandardFlowFileRecord[uuid=06a17875-275f-47b8-b7fa-c90e43dd024f,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=1527275483034-20, container=default, 
section=20], offset=747772, length=186943],offset=0,name=1.log,size=186943]: 
org.apache.nifi.processor.exception.ProcessException: Failed to read next 
record in stream for 
StandardFlowFileRecord[uuid=06a17875-275f-47b8-b7fa-c90e43dd024f,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=1527275483034-20, container=default, 
section=20], offset=747772, length=186943],offset=0,name=1.log,size=186943]
org.apache.nifi.processor.exception.ProcessException: Failed to read next 
record in stream for 
StandardFlowFileRecord[uuid=06a17875-275f-47b8-b7fa-c90e43dd024f,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=1527275483034-20, container=default, 
section=20], offset=747772, length=186943],offset=0,name=1.log,size=186943]
at 
org.apache.nifi.queryrecord.FlowFileEnumerator.moveNext(FlowFileEnumerator.java:65)
at Baz$1$1.moveNext(Unknown Source)
at org.apache.calcite.linq4j.Linq4j$EnumeratorIterator.<init>(Linq4j.java:664)
at org.apache.calcite.linq4j.Linq4j.enumeratorIterator(Linq4j.java:98)
at 
org.apache.calcite.linq4j.AbstractEnumerable.iterator(AbstractEnumerable.java:33)
at org.apache.calcite.avatica.MetaImpl.createCursor(MetaImpl.java:89)
at 
org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:196)
at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
at 
org.apache.calcite.avatica.AvaticaConnection.executeQueryInternal(AvaticaConnection.java:513)
at 
org.apache.calcite.avatica.AvaticaPreparedStatement.executeQuery(AvaticaPreparedStatement.java:132)
at 
org.apache.nifi.processors.standard.QueryRecord.queryWithCache(QueryRecord.java:470)
at 
org.apache.nifi.processors.standard.QueryRecord.onTrigger(QueryRecord.java:301)
at 
org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at 
org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
at 
org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
at 
org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: Could 
not read from 
StandardFlowFileRecord[uuid=06a17875-275f-47b8-b7fa-c90e43dd024f,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=1527275483034-20, container=default, 
section=20], offset=747772, length=186943],offset=0,name=1.log,size=186943]
at 
org.apache.nifi.controller.repository.io.FlowFileAccessInputStream.read(FlowFileAccessInputStream.java:93)
at 
org.apache.nifi.controller.repository.StandardProcessSession$6.read(StandardProcessSession.java:2284)
at 
org.apache.nifi.controller.repository.io.TaskTerminationInputStream.read(TaskTerminationInputStream.java:68)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at 

[GitHub] nifi pull request #2753: NIFI-5200: Fixed issue with InputStream being close...

2018-06-01 Thread markap14
GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2753

NIFI-5200: Fixed issue with InputStream being closed when calling Pro…

…cessSession.read() twice against sequential Content Claims

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-5200-Fix

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2753.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2753


commit 5ac73af025bbb34246dabdf39d806a50d49971a6
Author: Mark Payne 
Date:   2018-06-01T15:14:56Z

NIFI-5200: Fixed issue with InputStream being closed when calling 
ProcessSession.read() twice against sequential Content Claims




---


[jira] [Updated] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread Mark Payne (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-5200:
-
Status: Patch Available  (was: Reopened)

> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Minor
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional allowSessionStreamManagement=true 
> parameter to session.read.
> Is it expected that nested reads used in this way will not work?
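
For illustration, a minimal sketch of the workaround described above, assuming 
the ProcessSession.read overload that accepts the allowSessionStreamManagement 
flag (variable names follow the ticket's example; this is not the fix itself):

{code:java}
// Keep the outer stream open by asking the session not to manage (close) it
// while the nested read on ff2 runs.
session.read(ff1, true, (in1) -> {      // allowSessionStreamManagement = true
    int a = in1.read();                 // expected 'A'
    session.read(ff2, (in2) -> {
        int c = in2.read();             // expected 'C'
    });
    int b = in1.read();                 // expected 'B' once the outer stream survives
});
{code}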



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread Mark Payne (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-5200:
-
Priority: Blocker  (was: Minor)

> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Blocker
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional allowSessionStreamManagement=true 
> parameter to session.read.
> Is it expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread Mark Payne (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498119#comment-16498119
 ] 

Mark Payne commented on NIFI-5200:
--

I've submitted a new PR to address the issue that I noted above. The unit test 
will fail without the fix applied and passes with the fix applied.

> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Minor
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional allowSessionStreamManagement=true 
> parameter to session.read.
> Is it expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5241) When calculating stats for components, use synchronized methods instead of atomic variables

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498054#comment-16498054
 ] 

ASF GitHub Bot commented on NIFI-5241:
--

GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2752

NIFI-5241: Updated EventSumValue to use synchronized methods instead …

…of many atomic values. This is more efficient and uses less heap. Also 
noticed that the Logger instance in ProcessorNode was not used so removed it, 
and in testing this also noticed that the default connection pool size for 
OkHttpReplicationClient was only 5, which can cause a lot of unnecessary HTTP 
connections to be created so adjusted the pool size

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-5241

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2752.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2752


commit 14e5a228650a9995a7b030cc84fe7bf5e639cd93
Author: Mark Payne 
Date:   2018-06-01T14:21:17Z

NIFI-5241: Updated EventSumValue to use synchronized methods instead of 
many atomic values. This is more efficient and uses less heap. Also noticed 
that the Logger instance in ProcessorNode was not used so removed it, and in 
testing this also noticed that the default connection pool size for 
OkHttpReplicationClient was only 5, which can cause a lot of unnecessary HTTP 
connections to be created so adjusted the pool size
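
For context on the connection pool note above, a minimal OkHttp sketch of 
configuring a larger pool; the pool size and keep-alive values below are 
illustrative, not the values chosen in the PR:

{code:java}
import java.util.concurrent.TimeUnit;
import okhttp3.ConnectionPool;
import okhttp3.OkHttpClient;

public class ReplicationClientPoolSketch {
    public static OkHttpClient buildClient() {
        // Allow more idle connections to be kept around for reuse instead of
        // opening a new connection for every replicated request.
        final ConnectionPool pool = new ConnectionPool(50, 5, TimeUnit.MINUTES);
        return new OkHttpClient.Builder()
                .connectionPool(pool)
                .build();
    }
}
{code}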




> When calculating stats for components, use synchronized methods instead of 
> atomic variables
> ---
>
> Key: NIFI-5241
> URL: https://issues.apache.org/jira/browse/NIFI-5241
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.7.0
>
>
> Currently, the EventSumValue that is used to calculate stats for components, 
> such as bytes in, bytes out, etc., uses AtomicLongs, AtomicIntegers, etc. to 
> keep track of values. This made sense at first when there were only a few 
> stats. Now, however, it holds about 17 different values, and the atomic 
> updates / atomic reads are more expensive than a synchronized method would 
> be. This can cause sluggishness in the UI after the instance has been running 
> for a while, especially if there are a lot of processors.
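
For illustration, a minimal sketch of the pattern being proposed, using 
placeholder class and field names rather than the real EventSumValue internals:

{code:java}
// Illustrative only: one synchronized method updates all counters at once,
// instead of a separate AtomicLong/AtomicInteger per statistic.
public class ComponentStatsSketch {
    private long bytesIn;
    private long bytesOut;
    private long flowFilesIn;
    private long flowFilesOut;

    public synchronized void add(final long bytesIn, final long bytesOut,
                                 final long flowFilesIn, final long flowFilesOut) {
        this.bytesIn += bytesIn;
        this.bytesOut += bytesOut;
        this.flowFilesIn += flowFilesIn;
        this.flowFilesOut += flowFilesOut;
    }

    public synchronized long[] snapshot() {
        // A single lock acquisition yields a consistent view of all values.
        return new long[] { bytesIn, bytesOut, flowFilesIn, flowFilesOut };
    }
}
{code}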



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2752: NIFI-5241: Updated EventSumValue to use synchronize...

2018-06-01 Thread markap14
GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2752

NIFI-5241: Updated EventSumValue to use synchronized methods instead …

…of many atomic values. This is more efficient and uses less heap. Also 
noticed that the Logger instance in ProcessorNode was not used so removed it, 
and in testing this also noticed that the default connection pool size for 
OkHttpReplicationClient was only 5, which can cause a lot of unnecessary HTTP 
connections to be created so adjusted the pool size

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-5241

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2752.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2752


commit 14e5a228650a9995a7b030cc84fe7bf5e639cd93
Author: Mark Payne 
Date:   2018-06-01T14:21:17Z

NIFI-5241: Updated EventSumValue to use synchronized methods instead of 
many atomic values. This is more efficient and uses less heap. Also noticed 
that the Logger instance in ProcessorNode was not used so removed it, and in 
testing this also noticed that the default connection pool size for 
OkHttpReplicationClient was only 5, which can cause a lot of unnecessary HTTP 
connections to be created so adjusted the pool size




---


[jira] [Updated] (NIFI-5241) When calculating stats for components, use synchronized methods instead of atomic variables

2018-06-01 Thread Mark Payne (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5241?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-5241:
-
Fix Version/s: 1.7.0
   Status: Patch Available  (was: Open)

> When calculating stats for components, use synchronized methods instead of 
> atomic variables
> ---
>
> Key: NIFI-5241
> URL: https://issues.apache.org/jira/browse/NIFI-5241
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.7.0
>
>
> Currently, the EventSumValue that is used to calculate stats for components, 
> such as bytes in, bytes out, etc., uses AtomicLongs, AtomicIntegers, etc. to 
> keep track of values. This made sense at first when there were only a few 
> stats. Now, however, it holds about 17 different values, and the atomic 
> updates / atomic reads are more expensive than a synchronized method would 
> be. This can cause sluggishness in the UI after the instance has been running 
> for a while, especially if there are a lot of processors.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFIREG-173) Allow metadata DB to use other DBs besides H2

2018-06-01 Thread Bryan Bende (JIRA)
Bryan Bende created NIFIREG-173:
---

 Summary: Allow metadata DB to use other DBs besides H2
 Key: NIFIREG-173
 URL: https://issues.apache.org/jira/browse/NIFIREG-173
 Project: NiFi Registry
  Issue Type: Improvement
Affects Versions: 0.1.0
Reporter: Bryan Bende
Assignee: Bryan Bende


Now that we have the Git provider for flow storage which can be used to push 
flows to a remote location, it would be nice to be able to leverage an external 
DB for the metadata database.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5145) MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null inputs

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498041#comment-16498041
 ] 

ASF GitHub Bot commented on NIFI-5145:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2749#discussion_r192408467
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFileTransfer.java
 ---
@@ -43,7 +43,7 @@
 .description("The fully qualified hostname or IP address of the 
remote system")
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .required(true)
-
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
--- End diff --

According to 
[this](https://github.com/apache/nifi/pull/2749/files#diff-a68a345757d54ad30f79658062d6e794R76),
 the attributes will not be populated with the right information when they come 
in from flow files, since no FlowFile is passed in. 

This is a weird situation because ListFTP declares a HOSTNAME property that has 
EL scope VARIABLE_REGISTRY, but it also creates an FTPTransfer object which 
declares a HOSTNAME property that has EL scope FLOWFILE_ATTRIBUTES. Because they 
have the same name, the context fetches the FTPTransfer property but gets 
ListFTP's property instead. Then it calls evaluateAttributeExpressions(flowFile), 
which violates ListFTP's HOSTNAME EL scope.

This fix prevents the violation, but the attributes would still be incorrect, 
right? I mention that because I thought of applying this same fix but instead 
put in the one from #2717 
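
For illustration, a rough sketch of the name collision being described; the 
declarations below are simplified stand-ins, not the actual ListFileTransfer 
and FTPTransfer code:

{code:java}
// Two descriptors share the name "Hostname" but declare different EL scopes,
// so a lookup by name can return a property whose declared scope does not
// match how it ends up being evaluated.
static final PropertyDescriptor LIST_HOSTNAME = new PropertyDescriptor.Builder()
        .name("Hostname")
        .required(true)
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
        .build();

static final PropertyDescriptor TRANSFER_HOSTNAME = new PropertyDescriptor.Builder()
        .name("Hostname")
        .required(true)
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
        .build();
{code}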


> MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null 
> inputs
> 
>
> Key: NIFI-5145
> URL: https://issues.apache.org/jira/browse/NIFI-5145
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> The method mentioned in the title line cannot handle null inputs, even though 
> the main NiFi execution classes can handle that scenario. This forces a hack to 
> pass testing with nulls that looks like this:
> String val = flowFile != null ? 
> context.getProperty(PROP).evaluateExpressionLanguage(flowfile).getValue() : 
> context.getProperty(PROP).evaluateExpressionLanguage(new 
> HashMap()).getValue();
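
For readability, a reformatted sketch of the hack quoted above; PROP, context, 
and flowFile are the names used in the ticket, and the method is written as 
evaluateAttributeExpressions (as in the PR title) rather than the 
evaluateExpressionLanguage spelling in the ticket text:

{code:java}
// Guard against a null FlowFile when evaluating Expression Language in tests,
// because MockPropertyValue cannot handle a null FlowFile argument.
final String val = flowFile != null
        ? context.getProperty(PROP).evaluateAttributeExpressions(flowFile).getValue()
        : context.getProperty(PROP).evaluateAttributeExpressions(new HashMap<String, String>()).getValue();
{code}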



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2749: NIFI-5145 Fixed evaluateAttributeExpressions in moc...

2018-06-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2749#discussion_r192408467
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFileTransfer.java
 ---
@@ -43,7 +43,7 @@
 .description("The fully qualified hostname or IP address of the 
remote system")
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .required(true)
-
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
--- End diff --

According to 
[this](https://github.com/apache/nifi/pull/2749/files#diff-a68a345757d54ad30f79658062d6e794R76),
 the attributes will not be populated with the right information when they come 
in from flow files, since no FlowFile is passed in. 

This is a weird situation because ListFTP declares a HOSTNAME property that has 
EL scope VARIABLE_REGISTRY, but it also creates an FTPTransfer object which 
declares a HOSTNAME property that has EL scope FLOWFILE_ATTRIBUTES. Because they 
have the same name, the context fetches the FTPTransfer property but gets 
ListFTP's property instead. Then it calls evaluateAttributeExpressions(flowFile), 
which violates ListFTP's HOSTNAME EL scope.

This fix prevents the violation, but the attributes would still be incorrect, 
right? I mention that because I thought of applying this same fix but instead 
put in the one from #2717 


---


[GitHub] nifi pull request #2703: NIFI-4907: add 'view provenance' component policy

2018-06-01 Thread markobean
Github user markobean commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2703#discussion_r192413226
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/controller/ControllerFacade.java
 ---
@@ -1359,7 +1363,12 @@ public ProvenanceEventDTO getProvenanceEvent(final 
Long eventId) {
 } else {
 dataAuthorizable = 
flowController.createLocalDataAuthorizable(event.getComponentId());
 }
-dataAuthorizable.authorize(authorizer, RequestAction.READ, 
NiFiUserUtils.getNiFiUser(), attributes);
+// If not authorized for 'view the data', create only 
summarized provenance event
--- End diff --

My only concern with the approach you outlined is the additional authorization 
calls to determine "if the user is allowed". What you suggest requires up to 2 
additional authorizations per provenance event. On busy systems we have already 
observed that authorizing the user against each provenance event is a limiting 
factor (it can result in provenance becoming unusable).  
Having said that, unless you think of another approach that would require fewer 
authorization calls, I'll proceed as you recommend. I suspect there may be a 
future JIRA ticket to address the provenance query/authorization impact anyhow; 
if so, this can be addressed at that time. We won't know for sure whether this 
is a problem until we get the current fix into an appropriately loaded test 
environment.


---


[jira] [Reopened] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread Mark Payne (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne reopened NIFI-5200:
--

> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Minor
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional allowSessionStreamManagement=true 
> parameter to session.read.
> Is it expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-01 Thread Mark Payne (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498121#comment-16498121
 ] 

Mark Payne commented on NIFI-5200:
--

Also changed Priority to Blocker, since NiFi should not be released without 
this problem being addressed.

> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Blocker
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional allowSessionStreamManagement=true 
> parameter to session.read.
> Is it expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2749: NIFI-5145 Fixed evaluateAttributeExpressions in moc...

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2749#discussion_r192440580
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFileTransfer.java
 ---
@@ -43,7 +43,7 @@
 .description("The fully qualified hostname or IP address of the 
remote system")
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .required(true)
-
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
--- End diff --

Your first link raises a good point about the FTP issue. Do you want to 
separate this into two tickets, with the understanding that the immediate fix 
will break the master build (due to the FTP processors) but address the issue, 
or do it all here?

@joewitt 


---


[jira] [Commented] (NIFI-5145) MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null inputs

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498154#comment-16498154
 ] 

ASF GitHub Bot commented on NIFI-5145:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2749#discussion_r192440580
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFileTransfer.java
 ---
@@ -43,7 +43,7 @@
 .description("The fully qualified hostname or IP address of the 
remote system")
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .required(true)
-
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
--- End diff --

Your first link raises a good point about the FTP issue. Do you want to 
separate this into two tickets, with the understanding that the immediate fix 
will break the master build (due to the FTP processors) but address the issue, 
or do it all here?

@joewitt 


> MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null 
> inputs
> 
>
> Key: NIFI-5145
> URL: https://issues.apache.org/jira/browse/NIFI-5145
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> The method mentioned in the title line cannot handle null inputs, even though 
> the main NiFi execution classes can handle that scenario. This forces a hack to 
> pass testing with nulls that looks like this:
> String val = flowFile != null ? 
> context.getProperty(PROP).evaluateExpressionLanguage(flowfile).getValue() : 
> context.getProperty(PROP).evaluateExpressionLanguage(new 
> HashMap()).getValue();



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5247) NiFi toolkit signal handling changes, Dockerfile enhancements

2018-06-01 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498283#comment-16498283
 ] 

ASF subversion and git services commented on NIFI-5247:
---

Commit caa71fce9260ed717d501bd962a6a26ae61c25ea in nifi's branch 
refs/heads/master from [~pepov]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=caa71fc ]

NIFI-5247 nifi-toolkit bash entry points should leverage exec to replace bash 
with the current java process in order to handle signals properly in docker.
 - Also add bash, openssl, jq to make certificate request operations easier
 - Move project.version to the build config from the Dockerfile, use target/ 
folder for the build dependency
 - Docker integration tests for checking exit codes and tls-toolkit basic 
server-client interaction

This closes #2746.


> NiFi toolkit signal handling changes, Dockerfile enhancements
> -
>
> Key: NIFI-5247
> URL: https://issues.apache.org/jira/browse/NIFI-5247
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker, Tools and Build
>Affects Versions: 1.6.0, 1.7.0
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> 1. Signal handling issues
> In order for processes to handle signals properly in Docker we have to 
> implement explicit signal handling for the first process in the container. In 
> the case of the NiFi toolkit the easiest solution is to replace the bash 
> shell with the Java process and let it handle the signal using the exec 
> system call. More detailed explanation of the issue: 
> [http://veithen.github.io/2014/11/16/sigterm-propagation.html]
> Relevant issues: NIFI-3505 and NIFI-2689, which already added exec to the run 
> invocation of the nifi.sh start script.
> This change makes stopping containers fast and graceful.
> 2. TLS toolkit commands and basic tooling in the container
> In order to be able to request certificates from a running CA server instance 
> some tooling is needed inside the container. These tools are openssl for 
> checking ssl certificates and endpoints, and jq for config.json processing. A 
> complete use case is available in the following NiFi helm chart: 
> [https://github.com/pepov/apache-nifi-helm/blob/master/templates/statefulset.yaml#L75]
>  
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2749: NIFI-5145 Fixed evaluateAttributeExpressions in moc...

2018-06-01 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2749#discussion_r192455843
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFileTransfer.java
 ---
@@ -43,7 +43,7 @@
 .description("The fully qualified hostname or IP address of the 
remote system")
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .required(true)
-
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
--- End diff --

I spent some time trying to untangle it and found that the reuse of some of 
these property descriptors between the FTP processors, combined with the 
processors having different input requirements, wreaked havoc on the test 
framework. So I think we're going to need to commit the bare minimum fix here 
and do a separate ticket to refactor the FTP processors.


---


[jira] [Commented] (NIFI-5145) MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null inputs

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498227#comment-16498227
 ] 

ASF GitHub Bot commented on NIFI-5145:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2749#discussion_r192455843
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListFileTransfer.java
 ---
@@ -43,7 +43,7 @@
 .description("The fully qualified hostname or IP address of the 
remote system")
 .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
 .required(true)
-
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
--- End diff --

I spent some time trying to untangle it and found that the reuse of some of 
these property descriptors between the FTP processors, combined with the 
processors having different input requirements, wreaked havoc on the test 
framework. So I think we're going to need to commit the bare minimum fix here 
and do a separate ticket to refactor the FTP processors.


> MockPropertyValue.evaluateExpressionLanguage(FlowFile) cannot handle null 
> inputs
> 
>
> Key: NIFI-5145
> URL: https://issues.apache.org/jira/browse/NIFI-5145
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
> Fix For: 1.7.0
>
>
> The method mentioned in the title line cannot handle null inputs, even though 
> the main NiFi execution classes can handle that scenario. This forces a hack to 
> pass testing with nulls that looks like this:
> String val = flowFile != null ? 
> context.getProperty(PROP).evaluateExpressionLanguage(flowfile).getValue() : 
> context.getProperty(PROP).evaluateExpressionLanguage(new 
> HashMap()).getValue();



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2746: NIFI-5247 NiFi toolkit signal handling changes, Dockerfile...

2018-06-01 Thread jtstorck
Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2746
  
Contrib build passes, and I was able to reproduce your example usages of 
invoking the toolkit and observing the exit codes.

+1, merging to master.  Thanks for your contribution, @pepov!


---


[jira] [Commented] (NIFI-5247) NiFi toolkit signal handling changes, Dockerfile enhancements

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498245#comment-16498245
 ] 

ASF GitHub Bot commented on NIFI-5247:
--

Github user jtstorck commented on the issue:

https://github.com/apache/nifi/pull/2746
  
Contrib build passes, and I was able to reproduce your example usages of 
invoking the toolkit and observing the exit codes.

+1, merging to master.  Thanks for your contribution, @pepov!


> NiFi toolkit signal handling changes, Dockerfile enhancements
> -
>
> Key: NIFI-5247
> URL: https://issues.apache.org/jira/browse/NIFI-5247
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker, Tools and Build
>Affects Versions: 1.6.0, 1.7.0
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> 1. Signal handling issues
> In order for processes to handle signals properly in Docker we have to 
> implement explicit signal handling for the first process in the container. In 
> the case of the NiFi toolkit the easiest solution is to replace the bash 
> shell with the Java process and let it handle the signal using the exec 
> system call. More detailed explanation of the issue: 
> [http://veithen.github.io/2014/11/16/sigterm-propagation.html]
> Relevant issues: NIFI-3505 and NIFI-2689, which already added exec to the run 
> invocation of the nifi.sh start script.
> This change makes stopping containers fast and graceful.
> 2. TLS toolkit commands and basic tooling in the container
> In order to be able to request certificates from a running CA server instance 
> some tooling is needed inside the container. These tools are openssl for 
> checking ssl certificates and endpoints, and jq for config.json processing. A 
> complete use case is available in the following NiFi helm chart: 
> [https://github.com/pepov/apache-nifi-helm/blob/master/templates/statefulset.yaml#L75]
>  
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2746: NIFI-5247 NiFi toolkit signal handling changes, Doc...

2018-06-01 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2746


---


[jira] [Commented] (NIFI-5247) NiFi toolkit signal handling changes, Dockerfile enhancements

2018-06-01 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16498285#comment-16498285
 ] 

ASF GitHub Bot commented on NIFI-5247:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2746


> NiFi toolkit signal handling changes, Dockerfile enhancements
> -
>
> Key: NIFI-5247
> URL: https://issues.apache.org/jira/browse/NIFI-5247
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker, Tools and Build
>Affects Versions: 1.6.0, 1.7.0
>Reporter: Peter Wilcsinszky
>Priority: Minor
>
> 1. Signal handling issues
> In order for processes to handle signals properly in Docker we have to 
> implement explicit signal handling for the first process in the container. In 
> the case of the NiFi toolkit the easiest solution is to replace the bash 
> shell with the Java process and let it handle the signal using the exec 
> system call. More detailed explanation of the issue: 
> [http://veithen.github.io/2014/11/16/sigterm-propagation.html]
> Relevant issues: NIFI-3505 and NIFI-2689, which already added exec to the run 
> invocation of the nifi.sh start script.
> This change makes stopping containers fast and graceful.
> 2. TLS toolkit commands and basic tooling in the container
> In order to be able to request certificates from a running CA server instance 
> some tooling is needed inside the container. These tools are openssl for 
> checking ssl certificates and endpoints, and jq for config.json processing. A 
> complete use case is available in the following NiFi helm chart: 
> [https://github.com/pepov/apache-nifi-helm/blob/master/templates/statefulset.yaml#L75]
>  
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (NIFI-5247) NiFi toolkit signal handling changes, Dockerfile enhancements

2018-06-01 Thread Jeff Storck (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5247?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jeff Storck resolved NIFI-5247.
---
   Resolution: Fixed
Fix Version/s: 1.7.0

> NiFi toolkit signal handling changes, Dockerfile enhancements
> -
>
> Key: NIFI-5247
> URL: https://issues.apache.org/jira/browse/NIFI-5247
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Docker, Tools and Build
>Affects Versions: 1.6.0, 1.7.0
>Reporter: Peter Wilcsinszky
>Priority: Minor
> Fix For: 1.7.0
>
>
> 1. Signal handling issues
> In order for processes to handle signals properly in Docker we have to 
> implement explicit signal handling for the first process in the container. In 
> the case of the NiFi toolkit the easiest solution is to replace the bash 
> shell with the Java process and let it handle the signal using the exec 
> system call. More detailed explanation of the issue: 
> [http://veithen.github.io/2014/11/16/sigterm-propagation.html]
> Relevant issues: NIFI-3505 and NIFI-2689, which already added exec to the run 
> invocation of the nifi.sh start script.
> This change makes stopping containers fast and graceful.
> 2. TLS toolkit commands and basic tooling in the container
> In order to be able to request certificates from a running CA server instance 
> some tooling is needed inside the container. These tools are openssl for 
> checking ssl certificates and endpoints, and jq for config.json processing. A 
> complete use case is available in the following NiFi helm chart: 
> [https://github.com/pepov/apache-nifi-helm/blob/master/templates/statefulset.yaml#L75]
>  
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

