[jira] [Updated] (NIFI-5044) SelectHiveQL accept only one statement

2018-05-12 Thread Ed Berezitsky (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ed Berezitsky updated NIFI-5044:

   Labels: features patch pull-request-available  (was: )
Affects Version/s: 1.3.0
   1.4.0
   1.5.0
   1.6.0
   Attachment: 
0001-NIFI-5044-SelectHiveQL-accept-only-one-statement.patch
   Status: Patch Available  (was: In Progress)

> SelectHiveQL accept only one statement
> --
>
> Key: NIFI-5044
> URL: https://issues.apache.org/jira/browse/NIFI-5044
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0, 1.5.0, 1.4.0, 1.3.0, 1.2.0
>Reporter: Davide Isoardi
>Assignee: Ed Berezitsky
>Priority: Critical
>  Labels: patch, pull-request-available, features
> Attachments: 
> 0001-NIFI-5044-SelectHiveQL-accept-only-one-statement.patch
>
>
> [This commit|https://github.com/apache/nifi/commit/bbc714e73ba245de7bc32fd9958667c847101f7d]
> claims to add support for running multiple statements in both
> SelectHiveQL and PutHiveQL; instead, it adds that support only to PutHiveQL,
> so SelectHiveQL still lacks this important feature. @Matt Burgess, I saw that
> you worked on that; is there any reason for this? If not, can we support it?
> If I try to execute this query:
> {quote}set hive.vectorized.execution.enabled = false; SELECT * FROM table_name
> {quote}
> I have this error:
>  
> {quote}2018-04-05 13:35:40,572 ERROR [Timer-Driven Process Thread-146] 
> o.a.nifi.processors.hive.SelectHiveQL 
> SelectHiveQL[id=243d4c17-b1fe-14af--ee8ce15e] Unable to execute 
> HiveQL select query set hive.vectorized.execution.enabled = false; SELECT * 
> FROM table_name for 
> StandardFlowFileRecord[uuid=0e035558-07ce-473b-b0d4-ac00b8b1df93,claim=StandardContentClaim
>  [resourceClaim=StandardResourceClaim[id=1522824912161-2753, 
> container=default, section=705], offset=838441, 
> length=25],offset=0,name=cliente_attributi.csv,size=25] due to 
> org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: 
> The query did not generate a result set!; routing to failure: {}
>  org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: 
> The query did not generate a result set!
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL$2.process(SelectHiveQL.java:305)
>  at 
> org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2529)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL.onTrigger(SelectHiveQL.java:275)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL.lambda$onTrigger$0(SelectHiveQL.java:215)
>  at 
> org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
>  at 
> org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:106)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL.onTrigger(SelectHiveQL.java:215)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>  at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)
>  Caused by: java.sql.SQLException: The query did not generate a result set!
>  at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:438)
>  at 
> org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
>  at 
> org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL$2.process(SelectHiveQL.java:293)
> {quote}
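The failure above comes from running a multi-statement script through a single result-set-returning query call. The idea behind the eventual fix can be illustrated with a small sketch (names like `split_statements` are ours, not NiFi's; a naive split also breaks on semicolons inside string literals):

```python
# Hypothetical sketch: split a multi-statement HiveQL script so that
# configuration statements ("set ...") can be run separately from the
# final SELECT, which is the only statement producing a result set.

def split_statements(script: str) -> list[str]:
    """Split a semicolon-separated script into trimmed, non-empty statements."""
    return [s.strip() for s in script.split(";") if s.strip()]

def plan_execution(script: str) -> tuple[list[str], str]:
    """Return (pre_statements, select_statement), assuming the last
    statement is the SELECT that produces the result set."""
    statements = split_statements(script)
    return statements[:-1], statements[-1]

pre, select = plan_execution(
    "set hive.vectorized.execution.enabled = false; SELECT * FROM table_name"
)
print(pre)     # ['set hive.vectorized.execution.enabled = false']
print(select)  # 'SELECT * FROM table_name'
```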



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5044) SelectHiveQL accept only one statement

2018-05-12 Thread Ed Berezitsky (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473374#comment-16473374
 ] 

Ed Berezitsky commented on NIFI-5044:
-

[~disoardi], [~pvillard] and [~mattyb149],

PR created. Please take a look and let me know if everything is OK.






[jira] [Commented] (NIFI-5044) SelectHiveQL accept only one statement

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473373#comment-16473373
 ] 

ASF GitHub Bot commented on NIFI-5044:
--

GitHub user bdesert opened a pull request:

https://github.com/apache/nifi/pull/2695

NIFI-5044 SelectHiveQL accept only one statement

SelectHiveQL supports only a single SELECT statement.
This change adds support for pre- and post-select statements.
It will be useful for configuration queries, e.g. "set 
tez.queue.name=default", and others.
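The pre/post-statement flow the PR describes can be sketched generically (this is an illustration, not the NiFi code; `cursor` stands in for any DB-API-style cursor and `FakeCursor` is a stub so the sketch runs without a database):

```python
# Execute optional pre- and post-select statements around the main query.
def run_with_wrappers(cursor, pre_statements, select_query, post_statements):
    for stmt in pre_statements:        # e.g. "set tez.queue.name=default"
        cursor.execute(stmt)           # no result set expected
    cursor.execute(select_query)       # the only statement returning rows
    rows = cursor.fetchall()
    for stmt in post_statements:       # clean-up / session-reset statements
        cursor.execute(stmt)
    return rows

class FakeCursor:
    """Minimal stand-in so the sketch is runnable without a database."""
    def __init__(self):
        self.executed = []
    def execute(self, stmt):
        self.executed.append(stmt)
    def fetchall(self):
        return [("row1",), ("row2",)]

cur = FakeCursor()
rows = run_with_wrappers(cur, ["set tez.queue.name=default"],
                         "SELECT * FROM t", [])
print(rows)           # [('row1',), ('row2',)]
print(cur.executed)   # ['set tez.queue.name=default', 'SELECT * FROM t']
```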

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [X] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [X] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [X] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [X] Is your initial contribution a single, squashed commit?

### For code changes:
- [X] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [X] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [X] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/bdesert/nifi NIFI-5044_SelectHiveQL_

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2695.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2695


commit 5e4c2b00405418fc4673e851740129e94f59caab
Author: Ed B 
Date:   2018-05-13T05:24:09Z

NIFI-5044 SelectHiveQL accept only one statement

SelectHiveQL support only single SELECT statement.
This change adds support for pre- and post- select statements.
It will be useful for configuration queries, i.e. "set 
tez.queue.name=default", and others.





[GitHub] nifi pull request #2695: NIFI-5044 SelectHiveQL accept only one statement

2018-05-12 Thread bdesert
GitHub user bdesert opened a pull request:

https://github.com/apache/nifi/pull/2695

---


[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473298#comment-16473298
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
At that point, this should work now to clear things up hopefully:

`git push origin --force NIFI-4914`


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 





[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
At that point, this should work now to clear things up hopefully:

`git push origin --force NIFI-4914`


---


[jira] [Commented] (NIFI-4494) Add a FetchOracleRow processor

2018-05-12 Thread Mike Thomsen (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4494?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473295#comment-16473295
 ] 

Mike Thomsen commented on NIFI-4494:


[~fred_liu_2017] can you elaborate on the use case? From the sound of it, this 
is a general problem not limited to Oracle. Also, we can't 
build an Oracle-specific processor AFAIK because the client driver is 
proprietary and thus prohibited by the ASF.

> Add a FetchOracleRow processor
> --
>
> Key: NIFI-4494
> URL: https://issues.apache.org/jira/browse/NIFI-4494
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
> Environment: oracle
>Reporter: Fred Liu
>Priority: Major
>
> We encounter a lot of demanding cases: poor data quality, no primary key, no 
> timestamp, and even a lot of duplicate data, while the customer requires high 
> performance and accuracy.
> Using GenerateTableFetch or QueryDatabaseTable, we cannot meet the 
> functional and performance requirements. So we want to add a new processor 
> specifically for the Oracle database, able to ingest very poor-quality data 
> with better performance.
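One generic way to handle the duplicate-rows-without-a-primary-key problem described above (our assumption for illustration, not a design from this ticket) is to deduplicate on a hash of the full row content:

```python
# Deduplicate rows that lack a primary key by hashing the whole row.
import hashlib

def dedupe(rows):
    seen, unique = set(), []
    for row in rows:
        # No primary key or timestamp, so the full row content is the key.
        key = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [("a", 1), ("b", 2), ("a", 1)]
print(dedupe(rows))  # [('a', 1), ('b', 2)]
```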





[jira] [Resolved] (NIFI-4736) New Processor for Fetch MongoDB

2018-05-12 Thread Mike Thomsen (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4736?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Thomsen resolved NIFI-4736.

   Resolution: Fixed
Fix Version/s: 1.6.0

> New Processor for Fetch MongoDB
> ---
>
> Key: NIFI-4736
> URL: https://issues.apache.org/jira/browse/NIFI-4736
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.4.0
>Reporter: Preetham Uchil
>Priority: Major
> Fix For: 1.6.0
>
>
> Raising a JIRA for the very issue highlighted in the link below. I have the 
> same requirement as Pablo stated:
> "I've just managed to get the PutMongo processor work successfully. 
> However, I just realized that you can't use the GetMongo processor to 
> retrieve data based on the input from another flowfile or attribute. It has 
> no input. That leaves the Mongo database that you sent data to a bit orphaned 
> if you can't retrieve data based on another source. 
> Does anybody have an alternative on how to get a specific record from MongoDB 
> based on an input?"
> http://apache-nifi-users-list.2361937.n4.nabble.com/GetMongo-Processor-Alternative-td702.html





[jira] [Commented] (NIFI-4736) New Processor for Fetch MongoDB

2018-05-12 Thread Mike Thomsen (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4736?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473294#comment-16473294
 ] 

Mike Thomsen commented on NIFI-4736:


[~upreetham] closing this out because your concern here is now addressed in 
1.6.0:
{quote}However, I just realized that you can't use the GetMongo processor to 
retrieve data based on the input from another flowfile or attribute. It has no 
input.
{quote}






[jira] [Commented] (NIFI-4845) Add JanusGraph put processor

2018-05-12 Thread Mike Thomsen (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473293#comment-16473293
 ] 

Mike Thomsen commented on NIFI-4845:


[~liufucai-inspur] a discussion related to this just popped up on the developer 
mailing list. If you're interested, join the list and the discussion about 
graph DBs because there might be some real interest and momentum starting to 
build up.

> Add JanusGraph put processor
> 
>
> Key: NIFI-4845
> URL: https://issues.apache.org/jira/browse/NIFI-4845
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Fred Liu
>Assignee: Fred Liu
>Priority: Major
>
> Create a processor for reading records from an incoming FlowFile using the 
> provided Record Reader and writing those records to JanusGraph. Using a 
> JanusGraphControllerService would be good.





[jira] [Commented] (NIFI-4904) PutElasticSearch5 should support higher than elasticsearch 5.0.0

2018-05-12 Thread Mike Thomsen (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4904?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473292#comment-16473292
 ] 

Mike Thomsen commented on NIFI-4904:


The transport protocol is being deprecated in favor of HTTP/REST by Elastic, so 
new ES functionality is being steadily developed around the official client 
APIs for REST. 
[Source|https://www.elastic.co/guide/en/elasticsearch/client/java-api/master/transport-client.html]

> PutElasticSearch5 should support higher than elasticsearch 5.0.0
> 
>
> Key: NIFI-4904
> URL: https://issues.apache.org/jira/browse/NIFI-4904
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.5.0
> Environment: Ubuntu
>Reporter: Dye357
>Priority: Trivial
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> Currently the PutElasticSearch5 component is using the following transport 
> artifact
> {code:xml}
> <dependency>
>     <groupId>org.elasticsearch.client</groupId>
>     <artifactId>transport</artifactId>
>     <version>${es.version}</version>
> </dependency>
> {code}
> Where es.version is 5.0.1. Upgrading to the highest 5.x dependency would 
> enable this component to be compatible with later 5.x versions of 
> Elasticsearch as well as early versions of Elasticsearch 6.x.
> Here is NiFi 1.5.0 connecting to ES 6.2.1 on port 9300:
> [2018-02-23T01:41:04,162][WARN ][o.e.t.n.Netty4Transport ] [uQSW8O8] 
> exception caught on transport layer 
> [NettyTcpChannel\{localAddress=/127.0.0.1:9300, 
> remoteAddress=/127.0.0.1:57457}], closing connection
> java.lang.IllegalStateException: Received message from unsupported version: 
> [5.0.0] minimal compatible version is: [5.6.0]
>  at 
> org.elasticsearch.transport.TcpTransport.ensureVersionCompatibility(TcpTransport.java:1430)
>  ~[elasticsearch-6.2.1.jar:6.2.1]
>  at 
> org.elasticsearch.transport.TcpTransport.messageReceived(TcpTransport.java:1377)
>  ~[elasticsearch-6.2.1.jar:6.2.1]
>  at 
> org.elasticsearch.transport.netty4.Netty4MessageChannelHandler.channelRead(Netty4MessageChannelHandler.java:64)
>  ~[transport-netty4-6.2.1.jar:6.2.1]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
>  [netty-codec-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:297)
>  [netty-codec-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:413)
>  [netty-codec-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265)
>  [netty-codec-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:241) 
> [netty-handler-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
>  [netty-transport-4.1.16.Final.jar:4.1.16.Final]
>  at 
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
>  

[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473290#comment-16473290
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Davids-MacBook-Pro:nifi david$ git rebase --continue
Applying: Changed artifact versions to 1.7.0-SNAPSHOT
Applying: Fixed issues identified during code review
Applying: Removed invalid characters left over from merge
Using index info to reconstruct a base tree...
M   nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-client-service/pom.xml
M   
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/pubsub/PublishPulsarRecord_1_X.java
Falling back to patching base and 3-way merge...
No changes -- Patch already applied.
Davids-MacBook-Pro:nifi david$ git status
On branch NIFI-4914
Your branch and 'origin/NIFI-4914' have diverged,
and have 210 and 217 different commits each, respectively.
  (use "git pull" to merge the remote branch into yours)


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 





[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread david-streamlio
Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  


---


[jira] [Commented] (NIFI-4964) Add bulk lookup feature in LookupRecord

2018-05-12 Thread Mike Thomsen (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473287#comment-16473287
 ] 

Mike Thomsen commented on NIFI-4964:


The HBase service can issue many Gets at once to do that, but for the other 
services I can't think of a good way to make one bulk request to the 
external system without confusing things badly.

> Add bulk lookup feature in LookupRecord
> ---
>
> Key: NIFI-4964
> URL: https://issues.apache.org/jira/browse/NIFI-4964
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Priority: Major
>
> For a flow file with a large number of records, it would be much more 
> efficient to parse the whole flow file once to collect all the coordinates to 
> look up, call a new method (lookupAll?) on the lookup service to get all the 
> results, and then parse the file one more time to update the records.
> It should be noted in the CS description/annotations that this approach could 
> hold a large number of objects in memory, but could yield better performance 
> for lookup services accessing external systems (Mongo, HBase, 
> etc.).
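The two-pass approach proposed above (collect the coordinates once, resolve them all in a single bulk call, then merge the results back onto the records) can be sketched with plain coreutils. Everything below is illustrative: the file names and key/value data are made up, and `bulk_results.tsv` stands in for the single `lookupAll()` round trip to the external system:

```shell
set -e
dir=$(mktemp -d) && cd "$dir"

# records: record_id <TAB> lookup coordinate (made-up demo data)
printf 'r1\tkeyA\nr2\tkeyB\nr3\tkeyA\n' > records.tsv

# pass 1: parse once and collect the distinct coordinates to look up
cut -f2 records.tsv | sort -u > coords.txt

# one bulk request to the external system (stand-in for the proposed lookupAll)
printf 'keyA\tvalue1\nkeyB\tvalue2\n' > bulk_results.tsv

# pass 2: merge the resolved values back onto the records
sort -t "$(printf '\t')" -k2 records.tsv > records.sorted
join -t "$(printf '\t')" -1 2 -2 1 records.sorted bulk_results.tsv > enriched.tsv
cat enriched.tsv
```

The point of the pattern is that the external system sees one batched request for the distinct coordinates instead of one network call per record.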



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (NIFI-5130) ExecuteInfluxDBQuery processor chunking support

2018-05-12 Thread Mike Thomsen (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-5130?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mike Thomsen resolved NIFI-5130.

Resolution: Fixed

Merged.

> ExecuteInfluxDBQuery processor chunking support
> ---
>
> Key: NIFI-5130
> URL: https://issues.apache.org/jira/browse/NIFI-5130
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Michał Misiewicz
>Priority: Minor
>
> Many production InfluxDB installations limit the number of rows returned by 
> a single query (10k by default). For huge collections, 10k rows can 
> correspond to less than one minute of events, which makes the 
> ExecuteInfluxDBQuery processor inconvenient to use. I suggest adding support 
> for chunked queries. Chunking can be used to return results in a stream of 
> smaller batches (each with partial results up to the chunk size) rather than 
> as a single response. A chunked query can return an unlimited number of rows.
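For reference, this is roughly what a chunked query looks like against the InfluxDB 1.x HTTP API, which accepts `chunked` and `chunk_size` query parameters. The host, database, and measurement names below are invented, and the command is printed as a dry run rather than executed, since it needs a live server:

```shell
host="http://localhost:8086"                       # assumed InfluxDB endpoint
db="mydb"                                          # assumed database name
q='SELECT * FROM events WHERE time > now() - 1h'   # illustrative query

# with chunked=true the server streams a series of partial JSON responses,
# each holding at most chunk_size rows, instead of one capped response
cmd="curl -G $host/query --data-urlencode db=$db --data-urlencode chunked=true --data-urlencode chunk_size=10000 --data-urlencode q=\"$q\""

echo "$cmd"   # dry run: show the request instead of sending it
```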





[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Bingo! That's what you should be seeing now. Fix the merge conflicts, `git 
add` the files and continue (and repeat until it's done)


---
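The fix-the-conflicts / `git add` / `git rebase --continue` loop suggested here can be reproduced end to end in a throwaway repository. Everything below (the temp repo, branch names, file contents) is invented for the demo:

```shell
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q .
git config user.email dev@example.com && git config user.name Dev
base=$(git symbolic-ref --short HEAD)        # master or main, depending on git version

echo base > file.txt && git add file.txt && git commit -qm base
git checkout -qb topic
echo topic > file.txt && git commit -qam "topic change"
git checkout -q "$base"
echo upstream > file.txt && git commit -qam "upstream change"

git checkout -q topic
git rebase "$base" || true                   # stops on the conflict in file.txt
echo resolved > file.txt                     # 1. fix the merge conflict
git add file.txt                             # 2. git add the files
GIT_EDITOR=true git rebase --continue        # 3. continue (repeat until done)
git log --oneline
```

With more than one conflicting commit, the rebase stops again after `--continue` and the same resolve/add/continue cycle repeats until it finishes.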


[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473264#comment-16473264
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Davids-MacBook-Pro:nifi david$ git checkout NIFI-4914
Switched to branch 'NIFI-4914'
Your branch is up to date with 'origin/NIFI-4914'.
Davids-MacBook-Pro:nifi david$ git rebase master
First, rewinding head to replay your work on top of it...
Applying: Added Apache Pulsar Processors and Controller Service
Applying: Changed code to use new ExpressionLanguageScope Enum
Applying: Changed artifact versions to 1.7.0-SNAPSHOT
Applying: Added Apache Pulsar Processors and Controller Service
Using index info to reconstruct a base tree...
M   nifi-nar-bundles/pom.xml
.git/rebase-apply/patch:848: trailing whitespace.
  
.git/rebase-apply/patch:854: trailing whitespace.
  
.git/rebase-apply/patch:857: trailing whitespace.
  
.git/rebase-apply/patch:859: space before tab in indent.

.git/rebase-apply/patch:865: trailing whitespace.
  
warning: squelched 161 whitespace errors
warning: 166 lines add whitespace errors.
Falling back to patching base and 3-way merge...
Auto-merging nifi-nar-bundles/nifi-pulsar-bundle/pom.xml
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/pom.xml
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/TestPublishPulsar_1_X.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/TestPublishPulsar_1_X.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/TestConsumePulsar_1_X.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/TestConsumePulsar_1_X.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/TestConsumePulsarRecord_1_X.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/TestConsumePulsarRecord_1_X.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/AbstractPulsarProcessorTest.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/test/java/org/apache/nifi/processors/pulsar/pubsub/AbstractPulsarProcessorTest.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/pubsub/PublishPulsar_1_X.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/pubsub/PublishPulsar_1_X.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/pubsub/ConsumePulsar_1_X.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/pubsub/ConsumePulsar_1_X.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/AbstractPulsarProducerProcessor.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/AbstractPulsarProducerProcessor.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/AbstractPulsarConsumerProcessor.java
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/src/main/java/org/apache/nifi/processors/pulsar/AbstractPulsarConsumerProcessor.java
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/pom.xml
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-processors/pom.xml
Auto-merging nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-nar/pom.xml
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-nar/pom.xml
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-client-service/pom.xml
CONFLICT (add/add): Merge conflict in 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-client-service/pom.xml
Auto-merging 
nifi-nar-bundles/nifi-pulsar-bundle/nifi-pulsar-client-service-nar/pom.xml
CONFLICT 

[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Pretty sure we found the culprit:

> Davids-MacBook-Pro:nifi david$ git rebase

That should be `git rebase master`. Not sure what git thinks it's doing, 
but it's clearly not doing what you want it to do. Give that a shot and let us 
know what you get.


---


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473262#comment-16473262
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
`Davids-MacBook-Pro:nifi david$ git rebase 
First, rewinding head to replay your work on top of it...
Applying: NIFI-4289 - InfluxDB Put processor
Applying: NIFI-4827 Added support for reading queries from the flowfile 
body to GetMongo.
Applying: NIFI-4910 Fixing slight spelling mistake in error message, needs 
a space
Applying: NIFI-3502:
Applying: NIFI-4876 - Adding Min Object Age to ListS3
Applying: NIFI-4839 Creating nifi-toolkit-cli to provide a CLI for 
interacting with NiFi and NiFi Registry
Applying: NIFI-4839 - Rename the registry group to `registry` for better UX
Applying: NIFI-4839 - Fixing completer unit test
Applying: NIFI-4839 - Implemented auto-layout when importing the PG. Will 
find an available spot on a canvas which doesn't overlap with other components 
and is as close to the canvas center as possible.
Applying: NIFI-4839
Applying: NIFI-4839 - Support both public URLs and local files as inputs 
for import actions.
Applying: NIFI-4839 - Updating README and cleaning up descriptions and 
comments
Applying: NIFI-4839 - Implemented nice dynamic table output for all 
list-XXX commands (in simple mode)
Applying: NIFI-4839 - Added abbreviation in simple output for name, 
description, and comments
Applying: NIFI-4839 - Fixed handling of a connection object position - it 
doesn't have one and just returns null (calculated by the UI dynamically)
Applying: NIFI-4839 - Switching standalone mode to default to simple output
Applying: NIFI-4839 - The "Disabled" column had incorrect size and skewed 
the header
Applying: NIFI-4839 Improving back-ref support so that ReferenceResolver is 
passed the option being resolved
Applying: NIFI-4880: Add the ability to map record based on the aliases. 
This closes #2474
Applying: Fixed failing unit tests: Changed the queues used to unique names 
so that one test won't interfere with another; also changed 
JMSPublisherConsumerTest to JMSPublisherConsumerIT since it is an integration 
test between the publisher and consumer with ActiveMQ as the broker
Applying: NIFI-4916 - ConvertExcelToCSVProcessor inherit parent attributes. 
This closes #2500.
Applying: NIFI-2630 Allow PublishJMS to send TextMessages
Applying: NIFI-2630: Changed name of queue in unit test to be unique in 
order to avoid getting messages from another test if the other tests fails to 
properly shutdown the connection. This closes #2458.
Applying: NIFI-4920 Skipping sensitive properties when updating component 
properties from versioned component. This closes #2505.
Applying: NIFI-4773 - Fixed column type map initialization in 
QueryDatabaseTable
Applying: NIFI-4922 - Add badges to the README file
Applying: NIFI-4872 Added annotation for specifying scenarios in which 
components can cause high usage of system resources.
Applying: NIFI-4925:
Applying: NIFI-4855:
Applying: NIFI-4893 Cannot convert Avro schemas to Record schemas with 
default arrays
Applying: NIFI-4928 Updated BouncyCastle dependencies to version 1.59.
Applying: NIFI-4929 Converted the majority of MongoDB unit tests to 
integration tests so they can be reliably run with 'mvn -Pintegration-tests 
integration-test'
Applying: NIFI-4948 - MongoDB Lookup Service throws an exception if there 
is no match
Applying: NIFI-4859 Corrects Swagger Spec VersionedFlowState allowableValues
Applying: NIFI-4835 Corrects Swagger spec response types in FlowResource
Applying: NIFI-4949 Converted nifi-mongodb-services' unit tests into 
integration tests so that the @Ignore annotation doesn't have to be removed to 
make them run.
Applying: NIFI-4870 Upgraded activemq-client and activemq-broker versions 
to 5.15.3.
Applying: NIFI-4945:
Applying: NIFI-4936 trying to quiet down the mvn output a bit so we dont 
exceed the travis-ci 4MB max
Applying: NIFI-3039 Provenance Repository - Fix PurgeOldEvent and Rollover 
Size Limits
Applying: NIFI-4960: fix object setting. This closes #2531.
Applying: NIFI-4936:
Applying: NIFI-4944: Guard against race condition in Snappy for 
PutHiveStreaming
Applying: NIFI-4885:
Applying: NIFI-4953 - FetchHBaseRow - update log level for "not found" to 
DEBUG instead of ERROR
Applying: NIFI-4958 - This closes #2529. Fix Travis job status + atlas 
profile
Applying: NIFI-4885 fixing checkstyle issue
Applying: NIFI-4800 Expose the flattenMode as property in FlattenJSON 
processor
Applying: NIFI-3402 - Added etag support to InvokeHTTP
Applying: NIFI-4833 Add ScanHBase Processor
Applying: NIFI-4774: 

[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473261#comment-16473261
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
`Davids-MacBook-Pro:nifi david$ git checkout NIFI-4914
Switched to branch 'NIFI-4914'
Your branch is up to date with 'origin/NIFI-4914'.
Davids-MacBook-Pro:nifi david$ git rebase --continue
No rebase in progress?
`


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 






[jira] [Commented] (NIFI-4914) Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, PublishPulsarRecord

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473258#comment-16473258
 ] 

ASF GitHub Bot commented on NIFI-4914:
--

Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
run the rebase against my 'master' branch or my 'NIFI-4914' branch?


> Implement record model processor for Pulsar, i.e. ConsumePulsarRecord, 
> PublishPulsarRecord
> --
>
> Key: NIFI-4914
> URL: https://issues.apache.org/jira/browse/NIFI-4914
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: David Kjerrumgaard
>Priority: Minor
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Create record-based processors for Apache Pulsar 






[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Also, run `git rebase --continue` on that branch and tell us what happens. 
Copy and paste the output as "code" in the comment box (the code Markdown helper 
is the <> button).


---


[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
I pulled your branch and got a lot of merge conflicts when I tried rebasing 
against master. I am pretty sure that @mattyb149 is right or on the right 
track. When you followed those four steps, did you adjust the name `upstream` 
to match whatever you call the remote that points to 
`https://github.com/apache/nifi`? Because if you did all of that, you would 
have had a real fun time with all of the merge conflicts on rebasing :)


---
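The update recipe quoted elsewhere in this thread (`git pull upstream master`) assumes a remote literally named `upstream` pointing at apache/nifi, alongside an `origin` pointing at the fork. A quick way to check what the remotes are actually called, shown here in a throwaway repository with the URLs from this thread:

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q .
# typical fork setup: origin = your fork, upstream = the project repo
git remote add origin https://github.com/david-streamlio/nifi.git
git remote add upstream https://github.com/apache/nifi.git
git remote -v          # lists each remote with its fetch/push URL
```

If `git remote -v` shows a different name for the apache/nifi remote, that name must be substituted into the recipe's `git pull upstream master` step.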


[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread david-streamlio
Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
My latest commit passed all the tests, and is showing no conflicts with the 
base branch. But you are seeing "an explosion of conflicts", so I am 
confused... What steps do you need me to perform in order to get this PR into 
an acceptable state?


---


[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Yeah, I think @mattyb149 is right about that. I just pulled your branch 
again, did a rebase against master and there was an explosion of conflicts. 
Once you resolve those, you have to do `git rebase --continue` to finish the 
rebasing.


---


[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Probably needed a git rebase --continue instead...


---


[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread david-streamlio
Github user david-streamlio commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Yea, I'm not sure what happened

First I did the following 4 steps you recommended:

As a rule of thumb, this is how you want to do this sort of update:

git checkout master
git pull upstream master
git checkout YOUR_BRANCH
git rebase master

Then I checked out the branch, made the changes, ran mvn -Pcontrib-check clean 
install and had a clean install. When I tried to do a git push, I got the following error:

Davids-MacBook-Pro:nifi david$ git push origin
To https://github.com/david-streamlio/nifi.git
 ! [rejected]NIFI-4914 -> NIFI-4914 (non-fast-forward)
error: failed to push some refs to 
'https://github.com/david-streamlio/nifi.git'
hint: Updates were rejected because the tip of your current branch is behind
hint: its remote counterpart. Integrate the remote changes (e.g.
hint: 'git pull ...') before pushing again.


So, I fixed the conflicts, committed them and pushed again. 


---
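The rejected push above is the expected symptom after a rebase: the rewritten branch tip is no longer a fast-forward of its remote counterpart, and following the hint's `git pull` merges the old remote commits back in, which is how a branch ends up "diverged ... 210 and 217 different commits". The usual recovery after a deliberate rebase is a force push instead. A self-contained sketch using a local bare repository as a stand-in remote (all names invented):

```shell
set -e
work=$(mktemp -d) && cd "$work"
git init -q --bare remote.git                  # stand-in for the GitHub fork
git clone -q remote.git clone && cd clone
git config user.email dev@example.com && git config user.name Dev

echo a > f.txt && git add f.txt && git commit -qm original
git push -q origin HEAD

git commit -q --amend -m rewritten             # simulate the history rewrite a rebase does
if git push -q origin HEAD 2>/dev/null; then
    echo "unexpected: push accepted"
else
    echo "push rejected: non-fast-forward"     # same error as in the thread
fi
git push -q --force-with-lease origin HEAD     # publish the rewritten branch
echo "force push accepted"
```

`--force-with-lease` is the safer variant of `--force`: it refuses to overwrite the remote branch if someone else pushed to it since the last fetch.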



[GitHub] nifi issue #2614: Added Apache Pulsar Processors and Controller Service

2018-05-12 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2614
  
Something is a bit wonky with this latest push, perhaps a bad merge?


---


[jira] [Commented] (NIFI-5170) Update Grok to 0.1.9

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473121#comment-16473121
 ] 

ASF GitHub Bot commented on NIFI-5170:
--

Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774759
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/resources/default-grok-patterns.txt
 ---
@@ -0,0 +1,115 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
--- End diff --

It is from the standard serialization services; it existed for the 
GrokReader and I copied it over.


> Update Grok to 0.1.9
> 
>
> Key: NIFI-5170
> URL: https://issues.apache.org/jira/browse/NIFI-5170
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Otto Fowler
>Assignee: Otto Fowler
>Priority: Major
>
> Grok 0.1.9 has been released, including work for empty capture support.
>  
> https://github.com/thekrakken/java-grok#maven-repository






[jira] [Commented] (NIFI-5170) Update Grok to 0.1.9

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473120#comment-16473120
 ] 

ASF GitHub Bot commented on NIFI-5170:
--

Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774740
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -179,17 +187,59 @@ public void onStopped() {
 bufferQueue.clear();
 }
 
+    @Override
+    protected Collection<ValidationResult> customValidate(final ValidationContext validationContext) {
+        Collection<ValidationResult> problems = new ArrayList<>();
+
+        // validate the grok expression against configuration
+        boolean namedCaptures = false;
+        if (validationContext.getProperty(NAMED_CAPTURES_ONLY).isSet()) {
+            namedCaptures = validationContext.getProperty(NAMED_CAPTURES_ONLY).asBoolean();
+        }
+        GrokCompiler grokCompiler = GrokCompiler.newInstance();
+        String subject = GROK_EXPRESSION.getName();
+        String input = validationContext.getProperty(GROK_EXPRESSION).getValue();
+        if (validationContext.getProperty(GROK_PATTERN_FILE).isSet()) {
+            try (final InputStream in = new FileInputStream(new File(validationContext.getProperty(GROK_PATTERN_FILE).getValue()));
+                 final Reader reader = new InputStreamReader(in)) {
+                grokCompiler.register(reader);
+                grok = grokCompiler.compile(input, namedCaptures);
+            } catch (IOException | GrokException | java.util.regex.PatternSyntaxException e) {
+                problems.add(new ValidationResult.Builder()
+                        .subject(subject)
--- End diff --

I needed to refactor this to be correct, sorry.  Please check the new commit


> Update Grok to 0.1.9
> 
>
> Key: NIFI-5170
> URL: https://issues.apache.org/jira/browse/NIFI-5170
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Otto Fowler
>Assignee: Otto Fowler
>Priority: Major
>
> Grok 0.1.9 has been released, including work for empty capture support.
>  
> https://github.com/thekrakken/java-grok#maven-repository





[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774740
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -179,17 +187,59 @@ public void onStopped() {
 bufferQueue.clear();
 }
 
+    @Override
+    protected Collection<ValidationResult> customValidate(final ValidationContext validationContext) {
+        Collection<ValidationResult> problems = new ArrayList<>();
+
+        // validate the grok expression against configuration
+        boolean namedCaptures = false;
+        if (validationContext.getProperty(NAMED_CAPTURES_ONLY).isSet()) {
+            namedCaptures = validationContext.getProperty(NAMED_CAPTURES_ONLY).asBoolean();
+        }
+        GrokCompiler grokCompiler = GrokCompiler.newInstance();
+        String subject = GROK_EXPRESSION.getName();
+        String input = validationContext.getProperty(GROK_EXPRESSION).getValue();
+        if (validationContext.getProperty(GROK_PATTERN_FILE).isSet()) {
+            try (final InputStream in = new FileInputStream(new File(validationContext.getProperty(GROK_PATTERN_FILE).getValue()));
+                 final Reader reader = new InputStreamReader(in)) {
+                grokCompiler.register(reader);
+                grok = grokCompiler.compile(input, namedCaptures);
+            } catch (IOException | GrokException | java.util.regex.PatternSyntaxException e) {
+                problems.add(new ValidationResult.Builder()
+                        .subject(subject)
--- End diff --

I needed to refactor this to be correct, sorry.  Please check the new commit


---


[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774270
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -80,18 +84,21 @@
     public static final String FLOWFILE_ATTRIBUTE = "flowfile-attribute";
     public static final String FLOWFILE_CONTENT = "flowfile-content";
     private static final String APPLICATION_JSON = "application/json";
+    private static final String DEFAULT_PATTERN_NAME = "/default-grok-patterns.txt";
 
     public static final PropertyDescriptor GROK_EXPRESSION = new PropertyDescriptor.Builder()
             .name("Grok Expression")
-            .description("Grok expression")
+            .description("Grok expression. If other Grok expressions are referenced in this expression, they must be provided "
+                    + "in the Grok Pattern File if set or exist in the default Grok patterns")
             .required(true)
-            .addValidator(validateGrokExpression())
+            .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
--- End diff --

I changed to customValidate because the new Grok no longer ignores 
missing named patterns when compiling.

So if I had an expression %{FOO:foo}abc and tried to compile it without 
providing the FOO pattern to the compiler, the old version would silently 
eat the error.

In the current version, it throws an IllegalArgumentException. The 
validation therefore needs access to the provided pattern file, so I didn't 
think it could live in a per-property validator. I thought it needed to be 
in customValidate, since that runs *after* all the regular validators.

Does that make sense?
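
The fail-fast behavior described above can be illustrated with a minimal 
stand-in resolver. This is a hypothetical sketch using only java.util.regex, 
not the java-grok API: `TinyGrok`, its `compile` method, and the registry map 
are all invented here to show why an expression referencing an unregistered 
pattern (e.g. %{FOO:foo} without FOO) must now be treated as a validation 
error rather than silently ignored.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical, simplified stand-in for a Grok compiler: expands
// %{NAME:field} references against a registry of named patterns.
// Like the newer Grok described above, it fails fast with
// IllegalArgumentException on an unresolved pattern reference.
class TinyGrok {
    private static final Pattern REF = Pattern.compile("%\\{(\\w+)(?::(\\w+))?\\}");

    static String compile(String expression, Map<String, String> registry) {
        Matcher m = REF.matcher(expression);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String name = m.group(1);
            String pattern = registry.get(name);
            if (pattern == null) {
                // old behavior: silently ignore; new behavior: throw
                throw new IllegalArgumentException("No definition for pattern: " + name);
            }
            String field = m.group(2);
            String replacement = (field == null)
                    ? "(?:" + pattern + ")"
                    : "(?<" + field + ">" + pattern + ")";  // named capture group
            m.appendReplacement(sb, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(sb);
        return sb.toString();
    }
}
```

Compiling "%{FOO:foo}abc" with FOO registered yields a usable regex; with an 
empty registry the compile throws, which is why the processor's validation 
has to see the pattern file before it can judge the expression.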



---


[jira] [Commented] (NIFI-5170) Update Grok to 0.1.9

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473108#comment-16473108
 ] 

ASF GitHub Bot commented on NIFI-5170:
--

Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774270
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -80,18 +84,21 @@
     public static final String FLOWFILE_ATTRIBUTE = "flowfile-attribute";
     public static final String FLOWFILE_CONTENT = "flowfile-content";
     private static final String APPLICATION_JSON = "application/json";
+    private static final String DEFAULT_PATTERN_NAME = "/default-grok-patterns.txt";
 
     public static final PropertyDescriptor GROK_EXPRESSION = new PropertyDescriptor.Builder()
             .name("Grok Expression")
-            .description("Grok expression")
+            .description("Grok expression. If other Grok expressions are referenced in this expression, they must be provided "
+                    + "in the Grok Pattern File if set or exist in the default Grok patterns")
             .required(true)
-            .addValidator(validateGrokExpression())
+            .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
--- End diff --

I changed to customValidate because the new Grok no longer ignores 
missing named patterns when compiling.

So if I had an expression %{FOO:foo}abc and tried to compile it without 
providing the FOO pattern to the compiler, the old version would silently 
eat the error.

In the current version, it throws an IllegalArgumentException. The 
validation therefore needs access to the provided pattern file, so I didn't 
think it could live in a per-property validator. I thought it needed to be 
in customValidate, since that runs *after* all the regular validators.

Does that make sense?



> Update Grok to 0.1.9
> 
>
> Key: NIFI-5170
> URL: https://issues.apache.org/jira/browse/NIFI-5170
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Otto Fowler
>Assignee: Otto Fowler
>Priority: Major
>
> Grok 0.1.9 has been released, including work for empty capture support.
>  
> https://github.com/thekrakken/java-grok#maven-repository





[GitHub] nifi issue #2675: NIFI-5113 Add XMLRecordSetWriter

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2675
  
@markap14 are you good with this or do you want some additional eyes on it?


---


[jira] [Commented] (NIFI-5113) Add XML record writer

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473046#comment-16473046
 ] 

ASF GitHub Bot commented on NIFI-5113:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2675
  
@markap14 are you good with this or do you want some additional eyes on it?


> Add XML record writer
> -
>
> Key: NIFI-5113
> URL: https://issues.apache.org/jira/browse/NIFI-5113
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Johannes Peter
>Assignee: Johannes Peter
>Priority: Major
>
> Corresponding writer for the XML record reader





[jira] [Commented] (NIFI-5113) Add XML record writer

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473043#comment-16473043
 ] 

ASF GitHub Bot commented on NIFI-5113:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187770744
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/XMLRecordSetWriter.java
 ---
@@ -0,0 +1,196 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.DateTimeTextRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+
+@Tags({"xml", "resultset", "writer", "serialize", "record", "recordset", "row"})
+@CapabilityDescription("Writes a RecordSet to XML. The records are wrapped by a root tag.")
+public class XMLRecordSetWriter extends DateTimeTextRecordSetWriter implements RecordSetWriterFactory {
+
+    public static final AllowableValue ALWAYS_SUPPRESS = new AllowableValue("always-suppress", "Always Suppress",
+            "Fields that are missing (present in the schema but not in the record), or that have a value of null, will not be written out");
+    public static final AllowableValue NEVER_SUPPRESS = new AllowableValue("never-suppress", "Never Suppress",
+            "Fields that are missing (present in the schema but not in the record), or that have a value of null, will be written out as a null value");
+    public static final AllowableValue SUPPRESS_MISSING = new AllowableValue("suppress-missing", "Suppress Missing Values",
+            "When a field has a value of null, it will be written out. However, if a field is defined in the schema and not present in the record, the field will not be written out.");
+
+    public static final AllowableValue USE_PROPERTY_AS_WRAPPER = new AllowableValue("use-property-as-wrapper", "Use Property as Wrapper",
+            "The value of the property \"Array Tag Name\" will be used as the tag name to wrap elements of an array. The field name of the array field will be used for the tag name " +
+                    "of the elements.");
+    public static final AllowableValue USE_PROPERTY_FOR_ELEMENTS = new AllowableValue("use-property-for-elements", "Use Property for Elements",
+            "The value of the property \"Array Tag Name\" will be used for the tag name of the elements of an array. The field name of the array field will be used as the tag name " +
+                    "to wrap elements.");
+    public static final AllowableValue NO_WRAPPING = new AllowableValue("no-wrapping", "No Wrapping",
+            "The elements of an array will not be wrapped");
+
+    public static final PropertyDescriptor SUPPRESS_NULLS = new PropertyDescriptor.Builder()
+            .name("suppress_nulls")
+            .displayName("Suppress Null Values")
+            .description("Specifies how 

[GitHub] nifi pull request #2675: NIFI-5113 Add XMLRecordSetWriter

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187770744
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/XMLRecordSetWriter.java
 ---
@@ -0,0 +1,196 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.DateTimeTextRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+
+@Tags({"xml", "resultset", "writer", "serialize", "record", "recordset", "row"})
+@CapabilityDescription("Writes a RecordSet to XML. The records are wrapped by a root tag.")
+public class XMLRecordSetWriter extends DateTimeTextRecordSetWriter implements RecordSetWriterFactory {
+
+    public static final AllowableValue ALWAYS_SUPPRESS = new AllowableValue("always-suppress", "Always Suppress",
+            "Fields that are missing (present in the schema but not in the record), or that have a value of null, will not be written out");
+    public static final AllowableValue NEVER_SUPPRESS = new AllowableValue("never-suppress", "Never Suppress",
+            "Fields that are missing (present in the schema but not in the record), or that have a value of null, will be written out as a null value");
+    public static final AllowableValue SUPPRESS_MISSING = new AllowableValue("suppress-missing", "Suppress Missing Values",
+            "When a field has a value of null, it will be written out. However, if a field is defined in the schema and not present in the record, the field will not be written out.");
+
+    public static final AllowableValue USE_PROPERTY_AS_WRAPPER = new AllowableValue("use-property-as-wrapper", "Use Property as Wrapper",
+            "The value of the property \"Array Tag Name\" will be used as the tag name to wrap elements of an array. The field name of the array field will be used for the tag name " +
+                    "of the elements.");
+    public static final AllowableValue USE_PROPERTY_FOR_ELEMENTS = new AllowableValue("use-property-for-elements", "Use Property for Elements",
+            "The value of the property \"Array Tag Name\" will be used for the tag name of the elements of an array. The field name of the array field will be used as the tag name " +
+                    "to wrap elements.");
+    public static final AllowableValue NO_WRAPPING = new AllowableValue("no-wrapping", "No Wrapping",
+            "The elements of an array will not be wrapped");
+
+    public static final PropertyDescriptor SUPPRESS_NULLS = new PropertyDescriptor.Builder()
+            .name("suppress_nulls")
+            .displayName("Suppress Null Values")
+            .description("Specifies how the writer should handle a null field")
+            .allowableValues(NEVER_SUPPRESS, ALWAYS_SUPPRESS, SUPPRESS_MISSING)
+            .defaultValue(NEVER_SUPPRESS.getValue())
+            .required(true)
+
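
The three suppression modes in the descriptor above boil down to a simple 
per-field decision. The following is a hypothetical, simplified sketch of 
that decision table, not the actual XMLRecordSetWriter internals; the class 
and method names are invented for illustration.

```java
// Hypothetical decision table for the three null-suppression modes
// described above (a sketch, not XMLRecordSetWriter's real code).
class SuppressionSketch {
    enum Mode { ALWAYS_SUPPRESS, NEVER_SUPPRESS, SUPPRESS_MISSING }

    /**
     * @param present whether the field exists in the record (vs. only in the schema)
     * @param value   the field's value when present (may be null)
     * @return true if the writer should emit a tag for this field
     */
    static boolean shouldWrite(Mode mode, boolean present, Object value) {
        switch (mode) {
            case ALWAYS_SUPPRESS:  return present && value != null; // skip missing and null fields
            case NEVER_SUPPRESS:   return true;                     // emit everything, null as a null value
            case SUPPRESS_MISSING: return present;                  // emit present-but-null, skip missing
            default: throw new AssertionError(mode);
        }
    }
}
```

For example, a field that is defined in the schema but absent from the 
record is emitted only under Never Suppress, while a present-but-null field 
is emitted under both Never Suppress and Suppress Missing Values.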

[jira] [Commented] (NIFI-5170) Update Grok to 0.1.9

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473029#comment-16473029
 ] 

ASF GitHub Bot commented on NIFI-5170:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187770394
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/resources/default-grok-patterns.txt
 ---
@@ -0,0 +1,115 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
--- End diff --

What's the origin of this file?


> Update Grok to 0.1.9
> 
>
> Key: NIFI-5170
> URL: https://issues.apache.org/jira/browse/NIFI-5170
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Otto Fowler
>Assignee: Otto Fowler
>Priority: Major
>
> Grok 0.1.9 has been released, including work for empty capture support.
>  
> https://github.com/thekrakken/java-grok#maven-repository





[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187770374
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -179,17 +187,59 @@ public void onStopped() {
 bufferQueue.clear();
 }
 
+    @Override
+    protected Collection<ValidationResult> customValidate(final ValidationContext validationContext) {
+        Collection<ValidationResult> problems = new ArrayList<>();
+
+        // validate the grok expression against configuration
+        boolean namedCaptures = false;
+        if (validationContext.getProperty(NAMED_CAPTURES_ONLY).isSet()) {
+            namedCaptures = validationContext.getProperty(NAMED_CAPTURES_ONLY).asBoolean();
+        }
+        GrokCompiler grokCompiler = GrokCompiler.newInstance();
+        String subject = GROK_EXPRESSION.getName();
+        String input = validationContext.getProperty(GROK_EXPRESSION).getValue();
+        if (validationContext.getProperty(GROK_PATTERN_FILE).isSet()) {
+            try (final InputStream in = new FileInputStream(new File(validationContext.getProperty(GROK_PATTERN_FILE).getValue()));
+                 final Reader reader = new InputStreamReader(in)) {
+                grokCompiler.register(reader);
+                grok = grokCompiler.compile(input, namedCaptures);
+            } catch (IOException | GrokException | java.util.regex.PatternSyntaxException e) {
+                problems.add(new ValidationResult.Builder()
+                        .subject(subject)
--- End diff --

Why are you reusing the subject and input from the expression here? Is it 
because Grok uses the pattern to validate them?


---


[jira] [Commented] (NIFI-5170) Update Grok to 0.1.9

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473031#comment-16473031
 ] 

ASF GitHub Bot commented on NIFI-5170:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187770374
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -179,17 +187,59 @@ public void onStopped() {
 bufferQueue.clear();
 }
 
+    @Override
+    protected Collection<ValidationResult> customValidate(final ValidationContext validationContext) {
+        Collection<ValidationResult> problems = new ArrayList<>();
+
+        // validate the grok expression against configuration
+        boolean namedCaptures = false;
+        if (validationContext.getProperty(NAMED_CAPTURES_ONLY).isSet()) {
+            namedCaptures = validationContext.getProperty(NAMED_CAPTURES_ONLY).asBoolean();
+        }
+        GrokCompiler grokCompiler = GrokCompiler.newInstance();
+        String subject = GROK_EXPRESSION.getName();
+        String input = validationContext.getProperty(GROK_EXPRESSION).getValue();
+        if (validationContext.getProperty(GROK_PATTERN_FILE).isSet()) {
+            try (final InputStream in = new FileInputStream(new File(validationContext.getProperty(GROK_PATTERN_FILE).getValue()));
+                 final Reader reader = new InputStreamReader(in)) {
+                grokCompiler.register(reader);
+                grok = grokCompiler.compile(input, namedCaptures);
+            } catch (IOException | GrokException | java.util.regex.PatternSyntaxException e) {
+                problems.add(new ValidationResult.Builder()
+                        .subject(subject)
--- End diff --

Why are you reusing the subject and input from the expression here? Is it 
because Grok uses the pattern to validate them?


> Update Grok to 0.1.9
> 
>
> Key: NIFI-5170
> URL: https://issues.apache.org/jira/browse/NIFI-5170
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Otto Fowler
>Assignee: Otto Fowler
>Priority: Major
>
> Grok 0.1.9 has been released, including work for empty capture support.
>  
> https://github.com/thekrakken/java-grok#maven-repository





[jira] [Commented] (NIFI-5170) Update Grok to 0.1.9

2018-05-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-5170?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16473030#comment-16473030
 ] 

ASF GitHub Bot commented on NIFI-5170:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187720132
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -80,18 +84,21 @@
     public static final String FLOWFILE_ATTRIBUTE = "flowfile-attribute";
     public static final String FLOWFILE_CONTENT = "flowfile-content";
     private static final String APPLICATION_JSON = "application/json";
+    private static final String DEFAULT_PATTERN_NAME = "/default-grok-patterns.txt";
 
     public static final PropertyDescriptor GROK_EXPRESSION = new PropertyDescriptor.Builder()
             .name("Grok Expression")
-            .description("Grok expression")
+            .description("Grok expression. If other Grok expressions are referenced in this expression, they must be provided "
+                    + "in the Grok Pattern File if set or exist in the default Grok patterns")
             .required(true)
-            .addValidator(validateGrokExpression())
+            .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
--- End diff --

I think having a custom validator here was a better idea. Just checking 
that it's non-blank doesn't do much to help the user.
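
The distinction being debated here can be sketched abstractly. In a 
hypothetical, simplified model (invented names, not the NiFi API), a 
per-property validator sees only its own value, while a component-level 
check sees all properties at once, which is what a cross-property rule like 
"every pattern referenced by the expression must exist in the pattern file" 
requires.

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch of the two validation styles discussed above
// (not the real NiFi API): per-property vs. component-level validation.
class ValidationSketch {
    interface PropertyValidator { Optional<String> validate(String value); }

    // Per-property: can only look at the expression itself, e.g. non-blank.
    static final PropertyValidator NON_BLANK =
            v -> (v == null || v.trim().isEmpty())
                    ? Optional.of("must not be blank") : Optional.empty();

    // Component-level: can consult other properties, e.g. the pattern file.
    static Optional<String> customValidate(Map<String, String> properties,
                                           Map<String, String> patternFile) {
        String expression = properties.get("Grok Expression");
        Optional<String> blank = NON_BLANK.validate(expression);
        if (blank.isPresent()) {
            return blank;
        }
        // Cross-property check: every %{NAME...} must have a definition.
        java.util.regex.Matcher m =
                java.util.regex.Pattern.compile("%\\{(\\w+)").matcher(expression);
        while (m.find()) {
            if (!patternFile.containsKey(m.group(1))) {
                return Optional.of("references undefined pattern " + m.group(1));
            }
        }
        return Optional.empty();
    }
}
```

A non-blank check alone accepts "%{BAR:b}" even when BAR is undefined; only 
the component-level check can report that before the processor runs.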


> Update Grok to 0.1.9
> 
>
> Key: NIFI-5170
> URL: https://issues.apache.org/jira/browse/NIFI-5170
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Otto Fowler
>Assignee: Otto Fowler
>Priority: Major
>
> Grok 0.1.9 has been released, including work for empty capture support.
>  
> https://github.com/thekrakken/java-grok#maven-repository





[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187770394
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/resources/default-grok-patterns.txt
 ---
@@ -0,0 +1,115 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
--- End diff --

What's the origin of this file?


---


[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187720132
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -80,18 +84,21 @@
     public static final String FLOWFILE_ATTRIBUTE = "flowfile-attribute";
     public static final String FLOWFILE_CONTENT = "flowfile-content";
     private static final String APPLICATION_JSON = "application/json";
+    private static final String DEFAULT_PATTERN_NAME = "/default-grok-patterns.txt";
 
     public static final PropertyDescriptor GROK_EXPRESSION = new PropertyDescriptor.Builder()
             .name("Grok Expression")
-            .description("Grok expression")
+            .description("Grok expression. If other Grok expressions are referenced in this expression, they must be provided "
+                    + "in the Grok Pattern File if set or exist in the default Grok patterns")
             .required(true)
-            .addValidator(validateGrokExpression())
+            .addValidator(StandardValidators.NON_BLANK_VALIDATOR)
--- End diff --

I think having a custom validator here was a better idea. Just checking 
that it's non-blank doesn't do much to help the user.


---


[GitHub] nifi-minifi pull request #126: MINIFI-455: Updating C2 readme to include S3....

2018-05-12 Thread jzonthemtn
GitHub user jzonthemtn opened a pull request:

https://github.com/apache/nifi-minifi/pull/126

MINIFI-455: Updating C2 readme to include S3. Simplifying minifi-c2-c…

MINIFI-455: Updating C2 readme to include S3. Simplifying 
minifi-c2-context.xml.


Thank you for submitting a contribution to Apache NiFi - MiNiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [X] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [X] Does your PR title start with MINIFI-XXXX where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.

- [X] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [X] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi-minifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under minifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under minifi-assembly?

### For documentation related changes:
- [X] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jzonthemtn/nifi-minifi MINIFI-455

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-minifi/pull/126.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #126


commit 96ce61bc2968caf2f48f794066db3bdf5f7605b3
Author: jzonthemtn 
Date:   2018-05-12T10:50:54Z

MINIFI-455: Updating C2 readme to include S3. Simplifying 
minifi-c2-context.xml.




---