[GitHub] nifi issue #1497: NIFI-3454 Tests should consistently use the FileNameFilter...

2017-02-10 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/1497
  
Sorry, will do next time. Your Jira is different from my other Apache
project.

On February 10, 2017 at 19:10:28, Koji Kawamura (notificati...@github.com)
wrote:

> @ottobackwards <https://github.com/ottobackwards> Sorry, I forgot to add
> "This closes #. " in the commit message. Would you please close this
> PR? It's already merged in the master branch. Thanks!
>
> —
> You are receiving this because you were mentioned.
> Reply to this email directly, view it on GitHub
> <https://github.com/apache/nifi/pull/1497#issuecomment-279100701>, or mute
> the thread
> 
<https://github.com/notifications/unsubscribe-auth/ABD1_wgtI6tQQ5Y4XZ5h5UyTS9k_2C3Yks5rbPx0gaJpZM4L9fY2>
> .
>



---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #1497: NIFI-3454 Tests should consistently use the FileNameFilter...

2017-02-10 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/1497
  
Will do later tonight

On February 10, 2017 at 19:10:28, Koji Kawamura (notificati...@github.com)
wrote:

> @ottobackwards <https://github.com/ottobackwards> Sorry, I forgot to add
> "This closes #. " in the commit message. Would you please close this
> PR? It's already merged in the master branch. Thanks!
>
> —
> You are receiving this because you were mentioned.
> Reply to this email directly, view it on GitHub
> <https://github.com/apache/nifi/pull/1497#issuecomment-279100701>, or mute
> the thread
> 
<https://github.com/notifications/unsubscribe-auth/ABD1_wgtI6tQQ5Y4XZ5h5UyTS9k_2C3Yks5rbPx0gaJpZM4L9fY2>
> .
>



---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #1497: NIFI-3454 Tests should consistently use the FileNam...

2017-02-10 Thread ottobackwards
Github user ottobackwards closed the pull request at:

https://github.com/apache/nifi/pull/1497


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi-maven pull request #2: NIFI-3628 allow override additions for extension...

2017-03-31 Thread ottobackwards
Github user ottobackwards closed the pull request at:

https://github.com/apache/nifi-maven/pull/2


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi-maven pull request #2: NIFI-3628 allow override additions for extension...

2017-03-20 Thread ottobackwards
GitHub user ottobackwards opened a pull request:

https://github.com/apache/nifi-maven/pull/2

NIFI-3628  allow override additions for extension and id prefixes

This PR introduces the ability to change the extension name by changing 
the type property, as well as to specify the prefix on the Nar- manifest 
entries.

The type property is used to specify the file extension, and hardcoded 
instances of 'nar' are changed to use this property consistently.

The property is also used in the NarDependencyMojo, so the hardcoded 
NAR='nar' usage now uses the type property.   The configuration of 
this property governs both mojos.

The defaults are all left as [N,n]ar, so there are NO required 
changes to existing POMs or archetypes to continue working after this change.

This was tested by building the archetype, and by building NiFi with 
the pom reference changed to the newly built version ( validating from the logs 
the plugin version used ).

All tests run and pass.

NiFi was run from the assembly dir, a flow was created, and the logs were 
viewed to verify that there were no errors loading / unpacking the nars or 
running the flows.
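The override described above can be sketched as a plugin parameter with a 
backward-compatible default. This is an illustrative sketch only; the class 
and field names here (NarTypeConfig, type, manifestPrefix) are hypothetical 
stand-ins, not the actual nifi-maven mojo fields:

```java
// Illustrative sketch of a configurable artifact-type property with a
// backward-compatible default, as the PR describes. Not the actual
// nifi-maven implementation; names are hypothetical.
public class NarTypeConfig {
    // Defaults preserve existing behavior: no POM changes required.
    private String type = "nar";
    private String manifestPrefix = "Nar";

    public String artifactFileName(String artifactId, String version) {
        // Previously hardcoded as artifactId + "-" + version + ".nar"
        return artifactId + "-" + version + "." + type;
    }

    public String manifestEntry(String suffix) {
        // e.g. "Nar-Id", "Nar-Group" with the default prefix
        return manifestPrefix + "-" + suffix;
    }

    public void setType(String type) { this.type = type; }
    public void setManifestPrefix(String prefix) { this.manifestPrefix = prefix; }
}
```

Because the default stays 'nar', a POM that never sets the property sees no 
behavior change, which is the backward-compatibility guarantee the PR describes.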




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ottobackwards/nifi-maven NIFI-3628

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-maven/pull/2.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2


commit f876fb78d90a3a93363c4d9473ad4315eca0b58a
Author: Otto Fowler <ottobackwa...@gmail.com>
Date:   2017-03-20T14:00:35Z

override additions for extension and id prefixes
override prefix for Nar-* manifest properties




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-14 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2691
  
Thanks for the review @MikeThomsen 


---


[GitHub] nifi pull request #2698: NIFI-5077 ExtractGrok support for `keep empty captu...

2018-05-14 Thread ottobackwards
GitHub user ottobackwards opened a pull request:

https://github.com/apache/nifi/pull/2698

NIFI-5077 ExtractGrok support for `keep empty captures`

Support for the new option to keep empty captures.  I did not add to the 
GrokReader because I am not sure of the effect on
the schema extraction.
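What the new option does can be sketched with plain java.util.regex named 
groups standing in for the java-grok library (the capture method and pattern 
below are illustrative, not the real Grok API):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of what "keep empty captures" means: when the flag is false,
// captures that matched the empty string are dropped from the result map;
// when true, they are kept. java.util.regex stands in for java-grok here.
public class EmptyCaptures {
    public static Map<String, String> capture(String text, boolean keepEmpty) {
        Pattern p = Pattern.compile("(?<level>\\w+):(?<detail>\\w*)");
        Matcher m = p.matcher(text);
        Map<String, String> result = new LinkedHashMap<>();
        if (m.matches()) {
            for (String name : new String[] {"level", "detail"}) {
                String value = m.group(name);
                // When keepEmpty is false, empty capture values are omitted.
                if (keepEmpty || (value != null && !value.isEmpty())) {
                    result.put(name, value);
                }
            }
        }
        return result;
    }
}
```

For an input like "WARN:" the detail capture is empty, so it only appears in 
the returned map when the option is enabled.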

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [-] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [-] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [-] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

I did not, in keeping with the current file.

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ottobackwards/nifi grok-empty-captures

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2698.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2698


commit c5d44d56767effdbddeb66cc5c0ca89db670c6a1
Author: Otto Fowler <ottobackwards@...>
Date:   2018-05-14T13:35:23Z

NIFI-5077 ExtractGrok support for `keep empty captures`

Support for the new option to keep empty captures.  I did not add to the 
GrokReader because I am not sure of the effect on
the schema extraction.




---


[GitHub] nifi issue #2672: NIFI-5145 Made MockPropertyValue.evaluateExpressionLanguag...

2018-05-14 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2672
  
I think making this consistent for testing is a good idea.  The fact that 
the Mock classes in NiFi can replicate the runtime behavior is very important 
to implementors.

I think that the changes to the other processors might be better suited 
outside of this PR, though.

Also, I wouldn't mind seeing a comment or more comments in the Mock class 
referencing the runtime behavior they are enforcing.



---


[GitHub] nifi pull request #2699: [NIFI-5192] allow expression language in Schema Fil...

2018-05-14 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2699#discussion_r188064016
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ValidateXml.java
 ---
@@ -72,6 +72,7 @@
 .name("Schema File")
 .description("The path to the Schema file that is to be used 
for validation")
 .required(true)
+.expressionLanguageSupported(true)
--- End diff --

contrib-check should fail for this, no?


---


[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-09 Thread ottobackwards
GitHub user ottobackwards opened a pull request:

https://github.com/apache/nifi/pull/2691

NIFI-5170 Upgrade Grok to version 0.1.9

Upgrade to the new java-grok release and update for changes in the library.
This includes:

- Changes to the namespace from io.thekraken to io.krakens
- Refactoring to use the new GrokCompiler api
- Refactoring to do customValidation, since Grok will throw an 
IllegalArgumentException if an expression
references a Grok that is not defined, which is a change of behavior

- ExtractGrok now supports the default patterns, so the patterns file 
property is no longer required

Handles both the Record Reader and Legacy Processor

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [x] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ottobackwards/nifi update-grok-019

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2691.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2691


commit d05d72830cfb63458f5f87213be5a64ca12c3270
Author: Otto Fowler <ottobackwards@...>
Date:   2018-05-08T19:53:20Z

NIFI-5170 Upgrade Grok to version 0.1.9

Upgrade to the new java-grok release and update for changes in the library.
This includes:

- Changes to the namespace from io.thekraken to io.krakens
- Refactoring to use the new GrokCompiler api
- Refactoring to do customValidation, since Grok will throw an 
IllegalArgumentException if an expression
references a Grok that is not defined, which is a change of behavior

Handles both the Record Reader and Legacy Processor




---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-04-28 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
@zenfenan  any chance for a review?


---


[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774759
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/resources/default-grok-patterns.txt
 ---
@@ -0,0 +1,115 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
--- End diff --

It is from the standard serialization services; it existed for the 
GrokReader, and I copied it over


---


[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774270
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -80,18 +84,21 @@
 public static final String FLOWFILE_ATTRIBUTE = "flowfile-attribute";
 public static final String FLOWFILE_CONTENT = "flowfile-content";
 private static final String APPLICATION_JSON = "application/json";
+private static final String DEFAULT_PATTERN_NAME = 
"/default-grok-patterns.txt";
 
 public static final PropertyDescriptor GROK_EXPRESSION = new 
PropertyDescriptor.Builder()
 .name("Grok Expression")
-.description("Grok expression")
+.description("Grok expression. If other Grok expressions are 
referenced in this expression, they must be provided "
++ "in the Grok Pattern File if set or exist in the default Grok 
patterns")
 .required(true)
-.addValidator(validateGrokExpression())
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
--- End diff --

I changed to the customValidate because the new grok no longer ignores 
missing named patterns when compiling.

So if I had an expression %{FOO:foo}abc and tried to compile it without 
providing the FOO pattern to the compiler, the old version would silently eat 
the error.

The current version throws an IllegalArgumentException instead.  So the 
validation needs to utilize the provided pattern file, which is why I didn't 
think it could be in the property validate.  I thought it needed to be in the 
custom validate, since it runs *after* all the regular validates.

Does that make sense?
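The approach described here, attempting a real compile during custom 
validation and turning the thrown exception into a validation problem, can be 
sketched with java.util.regex standing in for the grok compiler and a plain 
string list standing in for NiFi's ValidationResult (both substitutions are 
assumptions for illustration, not the actual NiFi or java-grok API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

// Sketch of compile-time validation: the library throws instead of silently
// ignoring bad input, so validation attempts a real compile and converts the
// exception into a human-readable problem. Illustrative stand-in only.
public class CompileTimeValidation {
    public static List<String> validate(String expression) {
        List<String> problems = new ArrayList<>();
        try {
            // A successful compile means the expression is valid.
            Pattern.compile(expression);
        } catch (PatternSyntaxException e) {
            // The exception becomes a validation problem instead of a
            // runtime failure later on.
            problems.add("Grok Expression: " + e.getDescription());
        }
        return problems;
    }
}
```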



---


[GitHub] nifi pull request #2691: NIFI-5170 Upgrade Grok to version 0.1.9

2018-05-12 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2691#discussion_r187774740
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -179,17 +187,59 @@ public void onStopped() {
 bufferQueue.clear();
 }
 
+@Override
+protected Collection customValidate(final 
ValidationContext validationContext) {
+Collection problems = new ArrayList<>();
+
+// validate the grok expression against configuration
+boolean namedCaptures = false;
+if (validationContext.getProperty(NAMED_CAPTURES_ONLY).isSet()) {
+namedCaptures = 
validationContext.getProperty(NAMED_CAPTURES_ONLY).asBoolean();
+}
+GrokCompiler grokCompiler = GrokCompiler.newInstance();
+String subject = GROK_EXPRESSION.getName();
+String input = 
validationContext.getProperty(GROK_EXPRESSION).getValue();
+if (validationContext.getProperty(GROK_PATTERN_FILE).isSet()) {
+try (final InputStream in = new FileInputStream(new 
File(validationContext.getProperty(GROK_PATTERN_FILE).getValue()));
+ final Reader reader = new InputStreamReader(in)) {
+grokCompiler.register(reader);
+grok = grokCompiler.compile(input, namedCaptures);
+} catch (IOException | GrokException | 
java.util.regex.PatternSyntaxException e) {
+problems.add(new ValidationResult.Builder()
+.subject(subject)
--- End diff --

I needed to refactor this to be correct, sorry.  Please check the new commit.


---


[GitHub] nifi pull request #2675: NIFI-5113 Add XMLRecordSetWriter

2018-05-11 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2675#discussion_r187591727
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/xml/WriteXMLResult.java
 ---
@@ -0,0 +1,602 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.xml;
+
+import javanet.staxutils.IndentingXMLStreamWriter;
+import org.apache.nifi.NullSuppression;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.schema.access.SchemaAccessWriter;
+import org.apache.nifi.serialization.AbstractRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.WriteResult;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.RawRecordWriter;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.type.ArrayDataType;
+import org.apache.nifi.serialization.record.type.ChoiceDataType;
+import org.apache.nifi.serialization.record.type.MapDataType;
+import org.apache.nifi.serialization.record.type.RecordDataType;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+
+import javax.xml.stream.XMLOutputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamWriter;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.text.DateFormat;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.Supplier;
+
+
+public class WriteXMLResult extends AbstractRecordSetWriter implements 
RecordSetWriter, RawRecordWriter {
+
+final ComponentLog logger;
+final RecordSchema recordSchema;
+final SchemaAccessWriter schemaAccess;
+final XMLStreamWriter writer;
+final NullSuppression nullSuppression;
+final ArrayWrapping arrayWrapping;
+final String arrayTagName;
+final String recordTagName;
+final String rootTagName;
+
+private final Supplier LAZY_DATE_FORMAT;
+private final Supplier LAZY_TIME_FORMAT;
+private final Supplier LAZY_TIMESTAMP_FORMAT;
+
+public WriteXMLResult(final ComponentLog logger, final RecordSchema 
recordSchema, final SchemaAccessWriter schemaAccess, final OutputStream out, 
final boolean prettyPrint,
+  final NullSuppression nullSuppression, final 
ArrayWrapping arrayWrapping, final String arrayTagName, final String 
rootTagName, final String recordTagName,
+  final String dateFormat, final String 
timeFormat, final String timestampFormat) throws IOException {
+
+super(out);
+
+this.logger = logger;
+this.recordSchema = recordSchema;
+this.schemaAccess = schemaAccess;
+this.nullSuppression = nullSuppression;
+
+this.arrayWrapping = arrayWrapping;
+this.arrayTagName = arrayTagName;
+
+this.rootTagName = rootTagName;
+this.recordTagName = recordTagName;
+
+final DateFormat df = dateFormat == null ? null : 
DataTypeUtils.getDateFormat(dateFormat);
+final DateFormat tf = timeFormat == null ? null : 
DataTypeUtils.getDateFormat(timeFormat);
+final DateFormat tsf = timestampFormat == null ? null : 
DataTypeUtils.getDateFormat(timestampFormat);
+
+LAZY_DATE_FORMAT = () -> df;
+LAZY_TIME_FORMAT = () -> tf;
+LAZY_TIMESTAMP_FORMAT = () -> tsf;
+
+try {
+XMLOutputFactory factory = XMLOutputFactory.newInstance();
+

---
[GitHub] nifi pull request #2705: NIFI-5169 Upgrade to JSONPath 2.4

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2705#discussion_r188281072
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/AbstractJsonPathProcessor.java
 ---
@@ -112,10 +111,10 @@ static String getResultRepresentation(Object 
jsonPathResult, String defaultValue
 public ValidationResult validate(final String subject, final 
String input, final ValidationContext context) {
 String error = null;
 if (isStale(subject, input)) {
-if (JsonPathExpressionValidator.isValidExpression(input)) {
+try {
 JsonPath compiledJsonPath = JsonPath.compile(input);
 cacheComputedValue(subject, input, compiledJsonPath);
--- End diff --

This throws an IllegalArgumentException if the input is null, and an 
InvalidPathException if the compile is wrong.  Maybe we should catch the 
explicit exception?  This is a nit, I know.


---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-18 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
Should the tests for InvokeHTTP be updated to test with the changes?


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-05-18 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
@mattyb149 no word on that pr.  @jvwing any chance you may be able to 
review?



---


[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-05-19 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2723
  
This is cool, I will definitely make an AWS web API version of this after 
it and my PR land.

I think that any REST service needs to support the options that InvokeHttp 
supports ( proxies, etc. ).  This doesn't seem to do that.



---


[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-05-19 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2723
  
Don't I know it.  If you want to do this as a 'series' of PRs, I think you 
should document that in Jira with tasks representing each PR and note it 
clearly in the PR.

That will set the right context for the reviewers.

I chose with my InvokeAwsGatewayApi processor to make sure that I 
replicated the InvokeHttp test suite.  I would suggest that the end goal here 
would be to have an extensive set of tests adapted from that suite for the 
reader.




---


[GitHub] nifi issue #1955: NIFI-4136 Add a failure option to unmatch behavior options...

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/1955
  
@MikeThomsen This change is not related to my PRs.  It is concerned with 
NiFi behavior when the Groks do not match, not with Grok behavior itself.  It 
is still required as much as it ever was.


---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
How will this work with the AWS components?  They have proxy support as well 
( although there is a PR for full support ), but use a different builder, I think.


---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-16 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
@ijokarumawak I'm talking about passing around an HttpClientBuilder when 
not everyone uses that.


---


[GitHub] nifi issue #2698: NIFI-5077 ExtractGrok support for `keep empty captures`

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2698
  
Thanks for the review guys 


---


[GitHub] nifi pull request #2698: NIFI-5077 ExtractGrok support for `keep empty captu...

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2698#discussion_r188355616
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -102,6 +103,15 @@
 .addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor KEEP_EMPTY_CAPTURES = new 
PropertyDescriptor.Builder()
+.name("Keep Empty Captures")
+.description("If true, then empty capture values will be 
included in the returned capture map.")
+.required(false)
--- End diff --

done


---


[GitHub] nifi pull request #2698: NIFI-5077 ExtractGrok support for `keep empty captu...

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2698#discussion_r188355650
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -234,6 +247,11 @@ public void onStopped() {
 
 @OnScheduled
 public void onScheduled(final ProcessContext context) throws 
GrokException, IOException {
+
+if (context.getProperty(KEEP_EMPTY_CAPTURES).isSet()) {
--- End diff --

done


---


[GitHub] nifi issue #2705: NIFI-5169 Upgrade to JSONPath 2.4

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2705
  
LGTM, +1  FWIW


---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
Once I prove out my fix and update my PR, I guess I'll do a PR against 
master with that fix?


---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
I found a bug in this in the AWS implementation.  I am not sure how you 
would see it in the other processors; I found it when bringing this code into 
my Gateway Api PR.

The issue is that customValidate validates that both host and port need to 
be set, but not that both user and password need to be set.

Since I test for this ( from the InvokeHttp testProxy ), I fail.
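The missing check can be sketched as a paired-property rule mirroring the 
existing host/port rule; the method signature and messages below are 
illustrative stand-ins, not the actual NiFi AWS processor code:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the paired-property validation described above: each pair
// (host/port, user/password) must be either fully set or fully unset.
// Illustrative only; the real fix lives in NiFi's abstract AWS processor.
public class ProxyPairValidation {
    public static List<String> validate(String host, Integer port,
                                        String user, String password) {
        List<String> problems = new ArrayList<>();
        // XOR: exactly one of the pair being set is invalid.
        if ((host != null) != (port != null)) {
            problems.add("Proxy Host and Proxy Port must be set together");
        }
        if ((user != null) != (password != null)) {
            problems.add("Proxy User and Proxy Password must be set together");
        }
        return problems;
    }
}
```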






---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
https://issues.apache.org/jira/browse/NIFI-5220


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
bump, any takers?


---


[GitHub] nifi pull request #2727: NIFI-5220 add aws abstract processor validation for...

2018-05-21 Thread ottobackwards
GitHub user ottobackwards opened a pull request:

https://github.com/apache/nifi/pull/2727

NIFI-5220 add aws abstract processor validation for proxy host and 
password, and tests 

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [-] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [-] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [-] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [-] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [-] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ottobackwards/nifi fix-aws-validate-proxy

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2727.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2727


commit a712dfac5f0295703acb46e5decfd8a0315ad205
Author: Otto Fowler <ottobackwards@...>
Date:   2018-05-21T16:16:17Z

add validation for host and password, and tests




---


[GitHub] nifi issue #2727: NIFI-5220 add aws abstract processor validation for proxy ...

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2727
  
@MikeThomsen @ijokarumawak 


---


[GitHub] nifi issue #2704: NIFI-4199: Consistent proxy support across components

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2704
  
https://github.com/apache/nifi/pull/2727


---


[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2723
  
I almost wonder if there should be an HTTP REST connection service.


---


[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2723
  
@MikeThomsen I agree.  I haven't seen any conversation about it, but given 
the way things are going, it seems more and more obvious, right?


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r189554878
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,280 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@Tags({ "rest", "lookup", "json", "xml" })
+@CapabilityDescription("Use a REST service to enrich records.")
+public class RestLookupService extends AbstractControllerService 
implements LookupService<Record> {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-reader")
+.displayName("Record Reader")
+.description("The record reader to use for loading the payload and 
handling it as a record set.")
+.expressionLanguageSupported(ExpressionLanguageScope.NONE)
+.identifiesControllerService(RecordReaderFactory.class)
+.addValidator(Validator.VALID)
+.required(true)
+.build();
+
+static final PropertyDescriptor RECORD_PATH = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-record-path")
+.displayName("Record Path")
+.description("An optional record path that can be used to define 
where in a record to get the real data to merge " +
+"into the record set to be enriched. See documentation for 
examples of when this might be useful.")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(Validator.VALID)
+.required(false)
+.build();
+
+static final PropertyDescriptor SSL_CONTEXT_SERVICE = new 
PropertyDescriptor.Builder()
+.name("rest-lookup-ssl-context-service")
+.displayName("SSL Context Service")
+.description("

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r189554397
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,280 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
--- End diff --

Should this be tagged HTTP?


---


[GitHub] nifi issue #2723: NIFI-5214 Added REST LookupService

2018-05-21 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2723
  
@MikeThomsen 
Here are the properties exposed for configuration of InvokeHttp:

```java
 public static final List<PropertyDescriptor> PROPERTIES = 
Collections.unmodifiableList(Arrays.asList(
PROP_METHOD,
PROP_URL,
PROP_SSL_CONTEXT_SERVICE,
PROP_CONNECT_TIMEOUT,
PROP_READ_TIMEOUT,
PROP_DATE_HEADER,
PROP_FOLLOW_REDIRECTS,
PROP_ATTRIBUTES_TO_SEND,
PROP_BASIC_AUTH_USERNAME,
PROP_BASIC_AUTH_PASSWORD,
PROP_PROXY_HOST,
PROP_PROXY_PORT,
PROP_PROXY_TYPE,
PROP_PROXY_USER,
PROP_PROXY_PASSWORD,
PROP_PUT_OUTPUT_IN_ATTRIBUTE,
PROP_PUT_ATTRIBUTE_MAX_LENGTH,
PROP_DIGEST_AUTH,
PROP_OUTPUT_RESPONSE_REGARDLESS,
PROP_TRUSTED_HOSTNAME,
PROP_ADD_HEADERS_TO_REQUEST,
PROP_CONTENT_TYPE,
PROP_SEND_BODY,
PROP_USE_CHUNKED_ENCODING,
PROP_PENALIZE_NO_RETRY));
```

Of these, I wonder if we should consider the following for the REST lookup:
PROP_CONNECT_TIMEOUT,
PROP_READ_TIMEOUT,
PROP_DATE_HEADER,
PROP_FOLLOW_REDIRECTS,
PROP_ATTRIBUTES_TO_SEND,
PROP_TRUSTED_HOSTNAME,
PROP_ADD_HEADERS_TO_REQUEST
in whatever form makes sense for the lookup service.

For example:  what if my lookup service is someone else's API and I need to 
send custom headers and api keys?
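The custom-header case could be sketched roughly as below. This is a hedged illustration only: it uses the JDK's built-in `java.net.http` client rather than the OkHttp client the service actually uses, and the `dynamicHeaders` map is a hypothetical stand-in for headers configured via dynamic properties.

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.LinkedHashMap;
import java.util.Map;

public class LookupHeaderSketch {
    // Build a lookup request, copying user-configured headers (e.g. values
    // taken from dynamic properties) onto it. All names are illustrative.
    static HttpRequest buildRequest(String url, Map<String, String> dynamicHeaders) {
        HttpRequest.Builder builder = HttpRequest.newBuilder(URI.create(url));
        for (Map.Entry<String, String> header : dynamicHeaders.entrySet()) {
            builder.header(header.getKey(), header.getValue());
        }
        return builder.build();
    }

    public static void main(String[] args) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("X-Api-Key", "example-key"); // hypothetical API-key header
        HttpRequest request = buildRequest("http://localhost:8080/lookup", headers);
        System.out.println(request.headers().firstValue("X-Api-Key").orElse(""));
    }
}
```

A dynamic-property-to-header mapping like this would cover both the custom-header and API-key scenarios without adding a fixed property per header.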




---


[GitHub] nifi pull request #2711: NIFI-1705 - Adding AttributesToCSV processor

2018-05-23 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2711#discussion_r190273915
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/AttributesToCSV.java
 ---
@@ -0,0 +1,272 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.ArrayList;
+import java.util.Set;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Collections;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"csv", "attributes", "flowfile"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Generates a CSV representation of the input 
FlowFile Attributes. The resulting CSV " +
+"can be written to either a newly generated attribute named 
'CSVAttributes' or written to the FlowFile as content.  " +
+"If the attribute value contains a comma, newline or double quote, 
then the attribute value will be " +
+"escaped with double quotes.  Any double quote characters in the 
attribute value are escaped with " +
+"another double quote.  If the attribute value does not contain a 
comma, newline or double quote, then the " +
+"attribute value is returned unchanged.")
+@WritesAttribute(attribute = "CSVAttributes", description = "CSV 
representation of Attributes")
+public class AttributesToCSV extends AbstractProcessor {
+
+private static final String OUTPUT_NEW_ATTRIBUTE = 
"flowfile-attribute";
+private static final String OUTPUT_OVERWRITE_CONTENT = 
"flowfile-content";
+private static final String OUTPUT_ATTRIBUTE_NAME = "CSVAttributes";
+private static final String OUTPUT_SEPARATOR = ",";
+private static final String OUTPUT_MIME_TYPE = "text/csv";
+
+
+public static final PropertyDescriptor ATTRIBUTES_LIST = new 
PropertyDescriptor.Builder()
+.name("attribute-list")
+.displayName("Attribute List")
+.description("Comma separated list of attributes to be 
included in the resulting CSV. If this value " +
+"is left empty then all existing Attributes will be 
included. This list of attributes is " +
+"case sensitive and does not support attribute names 
that contain commas. If an attribute s
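The quoting rule in the capability description above can be sketched as follows. This is a simplified standalone illustration, not the processor's actual implementation (the diff imports commons-lang's StringEscapeUtils, which suggests the real work is delegated there):

```java
public class CsvEscapeSketch {
    // Quote a value only when it contains a comma, newline, or double quote,
    // doubling any embedded double quotes; otherwise return it unchanged.
    static String escape(String value) {
        if (value.contains(",") || value.contains("\n") || value.contains("\"")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(escape("a,b"));        // "a,b"
        System.out.println(escape("say \"hi\"")); // "say ""hi"""
        System.out.println(escape("plain"));      // plain
    }
}
```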

[GitHub] nifi pull request #2711: NIFI-1705 - Adding AttributesToCSV processor

2018-05-23 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2711#discussion_r190271982
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/AttributesToCSV.java
 ---
@@ -0,0 +1,272 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.ArrayList;
+import java.util.Set;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Collections;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"csv", "attributes", "flowfile"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Generates a CSV representation of the input 
FlowFile Attributes. The resulting CSV " +
+"can be written to either a newly generated attribute named 
'CSVAttributes' or written to the FlowFile as content.  " +
+"If the attribute value contains a comma, newline or double quote, 
then the attribute value will be " +
+"escaped with double quotes.  Any double quote characters in the 
attribute value are escaped with " +
+"another double quote.  If the attribute value does not contain a 
comma, newline or double quote, then the " +
+"attribute value is returned unchanged.")
+@WritesAttribute(attribute = "CSVAttributes", description = "CSV 
representation of Attributes")
+public class AttributesToCSV extends AbstractProcessor {
+
+private static final String OUTPUT_NEW_ATTRIBUTE = 
"flowfile-attribute";
+private static final String OUTPUT_OVERWRITE_CONTENT = 
"flowfile-content";
+private static final String OUTPUT_ATTRIBUTE_NAME = "CSVAttributes";
+private static final String OUTPUT_SEPARATOR = ",";
+private static final String OUTPUT_MIME_TYPE = "text/csv";
+
+
+public static final PropertyDescriptor ATTRIBUTES_LIST = new 
PropertyDescriptor.Builder()
+.name("attribute-list")
+.displayName("Attribute List")
+.description("Comma separated list of attributes to be 
included in the resulting CSV. If this value " +
+"is left empty then all existing Attributes will be 
included. This list of attributes is " +
+"case sensitive and does not support attribute names 
that contain commas. If an attribute 

[GitHub] nifi pull request #2734: NIFI-5230: Fixed NPE in InvokeScriptedProcessor on ...

2018-05-23 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2734#discussion_r190263349
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/InvokeScriptedProcessor.java
 ---
@@ -237,7 +237,7 @@ public void setup() {
 @Override
 public void onPropertyModified(final PropertyDescriptor descriptor, 
final String oldValue, final String newValue) {
 
--- End diff --

Why are we using a HashSet here, when we initially create an ArrayList?  
Shouldn't the two be the same, consistent type?


---


[GitHub] nifi pull request #2734: NIFI-5230: Fixed NPE in InvokeScriptedProcessor on ...

2018-05-23 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2734#discussion_r190267071
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/InvokeScriptedProcessor.java
 ---
@@ -237,7 +237,7 @@ public void setup() {
 @Override
 public void onPropertyModified(final PropertyDescriptor descriptor, 
final String oldValue, final String newValue) {
 
--- End diff --

Also, I don't understand customValidate.  The validationResults are used 
to 'seed' currentValidationResults, but when they are empty, any newly 
computed validation results are never set back into validationResults.  Is 
that right?
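As I read it, the pattern being questioned looks roughly like the following. This is a simplified sketch with hypothetical names, not the actual InvokeScriptedProcessor code:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

public class ValidationCacheSketch {
    // Cached results from a previous validation pass; possibly empty.
    final List<String> validationResults = new ArrayList<>();

    // The cached list seeds the current results, but freshly computed
    // results are only returned -- they are never stored back into
    // validationResults, which is the asymmetry asked about above.
    Collection<String> customValidate() {
        List<String> current = new ArrayList<>(validationResults); // seed
        if (current.isEmpty()) {
            current.add("freshly computed result"); // hypothetical result
            // note: 'current' is not copied back into validationResults
        }
        return current;
    }
}
```

If that reading is right, the fresh results are recomputed on every call instead of being cached.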


---


[GitHub] nifi pull request #2737: NIFI-5231 Added RecordStats processor.

2018-05-25 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2737#discussion_r190990529
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RecordStats.java
 ---
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.RecordPathResult;
+import org.apache.nifi.record.path.util.RecordPathCache;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+
+import java.io.InputStream;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class RecordStats extends AbstractProcessor {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("record-stats-reader")
+.displayName("Record Reader")
+.description("A record reader to use for reading the records.")
+.addValidator(Validator.VALID)
+.identifiesControllerService(RecordReaderFactory.class)
+.build();
+
+static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("If a flowfile is successfully processed, it goes 
here.")
+.build();
+static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("If a flowfile fails to be processed, it goes here.")
+.build();
+
+protected PropertyDescriptor 
getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
+return new PropertyDescriptor.Builder()
+.name(propertyDescriptorName)
+.displayName(propertyDescriptorName)
+.dynamic(true)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+}
+
+private RecordPathCache cache;
+
+@OnScheduled
+public void onEnabled(ProcessContext context) {
+cache = new RecordPathCache(25);
+}
+
+@Override
+public Set<Relationship> getRelationships() {
+return new HashSet<Relationship>() {{
+add(REL_SUCCESS);
+add(REL_FAILURE);
+}};
+}
+
+@Override
+public void onTrigger(ProcessContext context, ProcessSession session) 
throws ProcessException {
+FlowFile input = session.get();
+if (input == null) {
+return;
+}
+
+try {
+Map<String, RecordPath> paths = getRecordPaths(context);
+Map<String, String> stats = getStats(input, paths, context, 
session);
+
+input = session.putAllAttributes(input, stats);
+
+session.transfer(input, REL_SUCCESS);
+
+} catch (Exception ex) {

[GitHub] nifi pull request #2737: NIFI-5231 Added RecordStats processor.

2018-05-25 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2737#discussion_r190990605
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RecordStats.java
 ---
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.RecordPathResult;
+import org.apache.nifi.record.path.util.RecordPathCache;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+
+import java.io.InputStream;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class RecordStats extends AbstractProcessor {
+static final PropertyDescriptor RECORD_READER = new 
PropertyDescriptor.Builder()
+.name("record-stats-reader")
+.displayName("Record Reader")
+.description("A record reader to use for reading the records.")
+.addValidator(Validator.VALID)
+.identifiesControllerService(RecordReaderFactory.class)
+.build();
+
+static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("If a flowfile is successfully processed, it goes 
here.")
+.build();
+static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("If a flowfile fails to be processed, it goes here.")
+.build();
+
+protected PropertyDescriptor 
getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
+return new PropertyDescriptor.Builder()
+.name(propertyDescriptorName)
+.displayName(propertyDescriptorName)
+.dynamic(true)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+}
+
+private RecordPathCache cache;
+
+@OnScheduled
+public void onEnabled(ProcessContext context) {
+cache = new RecordPathCache(25);
+}
+
+@Override
+public Set<Relationship> getRelationships() {
+return new HashSet<Relationship>() {{
+add(REL_SUCCESS);
+add(REL_FAILURE);
+}};
+}
+
+@Override
+public void onTrigger(ProcessContext context, ProcessSession session) 
throws ProcessException {
+FlowFile input = session.get();
+if (input == null) {
+return;
+}
+
+try {
+Map<String, RecordPath> paths = getRecordPaths(context);
+Map<String, String> stats = getStats(input, paths, context, 
session);
+
+input = session.putAllAttributes(input, stats);
+
+session.transfer(input, REL_SUCCESS);
+
+} catch (Exception ex) {

[GitHub] nifi issue #2737: NIFI-5231 Added RecordStats processor.

2018-05-24 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2737
  
That makes sense, just thinking it through, and obviously I don't 
understand everything as well ;)
I had never thought of provenance as including performance and statistics 
data, so putting it there feels like co-opting it simply because it is the 
mechanism that happens to be available.


---


[GitHub] nifi issue #2737: NIFI-5231 Added RecordStats processor.

2018-05-24 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2737
  
@MikeThomsen this is cool.

The only thing it makes me wonder is whether this kind of data couldn't be 
generated automatically and sent to a repository, almost like a new (or an 
actual) reporting task.

This seems like it lends itself to time-series analysis, like other metrics.

NiFi doesn't necessarily have to provide that repository.




---


[GitHub] nifi issue #2737: NIFI-5231 Added RecordStats processor.

2018-05-24 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2737
  
Probably the reader and writer would have to be passed some context in which 
they can track state or increment stats, and then be paired with a 
'reporting' task configured to send the stats from a given context onward.
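A minimal sketch of that idea, using purely hypothetical types (this is not NiFi API): a shared stats context that readers/writers increment and a reporting step later reads out:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

public class StatsContextSketch {
    // Hypothetical context a record reader/writer could be handed to
    // track counters; a reporting task would read them out periodically.
    static class StatsContext {
        private final Map<String, LongAdder> counters = new ConcurrentHashMap<>();

        void increment(String name) {
            counters.computeIfAbsent(name, k -> new LongAdder()).increment();
        }

        long get(String name) {
            LongAdder adder = counters.get(name);
            return adder == null ? 0L : adder.sum();
        }
    }

    public static void main(String[] args) {
        StatsContext stats = new StatsContext();
        stats.increment("records.read");
        stats.increment("records.read");
        System.out.println(stats.get("records.read")); // 2
    }
}
```

LongAdder keeps the increments cheap under concurrent readers/writers, which matters if many components share one context.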


---


[GitHub] nifi pull request #2737: NIFI-5231 Added RecordStats processor.

2018-05-24 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2737#discussion_r190721904
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RecordStats.java
 ---
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.RecordPathResult;
+import org.apache.nifi.record.path.util.RecordPathCache;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+
+import java.io.InputStream;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class RecordStats extends AbstractProcessor {
+static final PropertyDescriptor RECORD_READER = new PropertyDescriptor.Builder()
+.name("record-stats-reader")
+.displayName("Record Reader")
+.description("A record reader to use for reading the records.")
+.addValidator(Validator.VALID)
+.identifiesControllerService(RecordReaderFactory.class)
+.build();
+
+static final Relationship REL_SUCCESS = new Relationship.Builder()
+.name("success")
+.description("If a flowfile is successfully processed, it goes here.")
+.build();
+static final Relationship REL_FAILURE = new Relationship.Builder()
+.name("failure")
+.description("If a flowfile fails to be processed, it goes here.")
+.build();
+
+protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
+return new PropertyDescriptor.Builder()
+.name(propertyDescriptorName)
+.displayName(propertyDescriptorName)
+.dynamic(true)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+}
+
+private RecordPathCache cache;
+
+@OnScheduled
+public void onEnabled(ProcessContext context) {
+cache = new RecordPathCache(25);
+}
+
+@Override
+public Set getRelationships() {
+return new HashSet() {{
+add(REL_SUCCESS);
+add(REL_FAILURE);
+}};
+}
+
+@Override
+public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
+FlowFile input = session.get();
+if (input == null) {
+return;
+}
+
+try {
+Map<String, RecordPath> paths = getRecordPaths(context);
+Map<String, String> stats = getStats(input, paths, context, session);
+
+input = session.putAllAttributes(input, stats);
+
+session.transfer(input, REL_SUCCESS);
+
+} catch (Exception ex) {
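The `getStats(...)` helper called above is not shown in the excerpt. As a stand-alone illustration of the counting it implies — one attribute per configured record path plus an overall record count — here is a minimal sketch; the `record.stats.*` attribute names and the method shape are assumptions, not the actual NiFi implementation:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class RecordStatsSketch {
    // Hypothetical stand-in for getStats: given the values each named record
    // path matched, emit one count attribute per path plus a total count.
    static Map<String, String> stats(Map<String, List<String>> matchesPerPath, int recordCount) {
        Map<String, String> stats = new HashMap<>();
        for (Map.Entry<String, List<String>> e : matchesPerPath.entrySet()) {
            stats.put("record.stats." + e.getKey(), String.valueOf(e.getValue().size()));
        }
        stats.put("record.count", String.valueOf(recordCount));
        return stats;
    }
}
```

The resulting map is what `session.putAllAttributes(input, stats)` would copy onto the FlowFile.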

[GitHub] nifi pull request #2737: NIFI-5231 Added RecordStats processor.

2018-05-24 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2737#discussion_r190714515
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RecordStats.java
 ---

[GitHub] nifi pull request #2737: NIFI-5231 Added RecordStats processor.

2018-05-24 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2737#discussion_r190723484
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/RecordStats.java
 ---

[GitHub] nifi pull request #2698: NIFI-5077 ExtractGrok support for `keep empty captu...

2018-05-15 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2698#discussion_r188354997
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ExtractGrok.java
 ---
@@ -102,6 +103,15 @@
 .addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
 .build();
 
+public static final PropertyDescriptor KEEP_EMPTY_CAPTURES = new PropertyDescriptor.Builder()
+.name("Keep Empty Captures")
+.description("If true, then empty capture values will be included in the returned capture map.")
+.required(false)
+.defaultValue("true")
--- End diff --

No, currently the empties are returned, this maintains that
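To make the behavior discussed here concrete: with the property defaulting to `true`, existing flows keep seeing empty captures, and only an explicit `false` filters them. A minimal sketch of that filtering (the method and class names are hypothetical, not the ExtractGrok code):

```java
import java.util.LinkedHashMap;
import java.util.Map;

class CaptureFilter {
    // When keepEmptyCaptures is true (the proposed default), the map is
    // returned unchanged, preserving current behavior; when false, entries
    // whose value is null or the empty string are dropped.
    static Map<String, Object> filter(Map<String, Object> captures, boolean keepEmptyCaptures) {
        if (keepEmptyCaptures) {
            return captures;
        }
        Map<String, Object> filtered = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : captures.entrySet()) {
            Object v = e.getValue();
            if (v != null && !"".equals(v)) {
                filtered.put(e.getKey(), v);
            }
        }
        return filtered;
    }
}
```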


---


[GitHub] nifi pull request #2734: NIFI-5230: Fixed NPE in InvokeScriptedProcessor on ...

2018-05-23 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2734#discussion_r190284675
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/InvokeScriptedProcessor.java
 ---
@@ -237,7 +237,7 @@ public void setup() {
 @Override
 public void onPropertyModified(final PropertyDescriptor descriptor, 
final String oldValue, final String newValue) {
 
--- End diff --

Ok, that makes sense.  Thanks for the explanation.


---


[GitHub] nifi pull request #2711: NIFI-1705 - Adding AttributesToCSV processor

2018-05-23 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2711#discussion_r190291470
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/AttributesToCSV.java
 ---
@@ -0,0 +1,272 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringEscapeUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.ArrayList;
+import java.util.Set;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Collections;
+import java.util.stream.Collectors;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"csv", "attributes", "flowfile"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Generates a CSV representation of the input FlowFile Attributes. The resulting CSV " +
+"can be written to either a newly generated attribute named 'CSVAttributes' or written to the FlowFile as content.  " +
+"If the attribute value contains a comma, newline or double quote, then the attribute value will be " +
+"escaped with double quotes.  Any double quote characters in the attribute value are escaped with " +
+"another double quote.  If the attribute value does not contain a comma, newline or double quote, then the " +
+"attribute value is returned unchanged.")
+@WritesAttribute(attribute = "CSVAttributes", description = "CSV representation of Attributes")
+public class AttributesToCSV extends AbstractProcessor {
+
+private static final String OUTPUT_NEW_ATTRIBUTE = "flowfile-attribute";
+private static final String OUTPUT_OVERWRITE_CONTENT = "flowfile-content";
+private static final String OUTPUT_ATTRIBUTE_NAME = "CSVAttributes";
+private static final String OUTPUT_SEPARATOR = ",";
+private static final String OUTPUT_MIME_TYPE = "text/csv";
+
+
+public static final PropertyDescriptor ATTRIBUTES_LIST = new PropertyDescriptor.Builder()
+.name("attribute-list")
+.displayName("Attribute List")
+.description("Comma separated list of attributes to be included in the resulting CSV. If this value " +
+"is left empty then all existing Attributes will be included. This list of attributes is " +
+"case sensitive and does not support attribute names that contain commas. If an attribute
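The escaping rules stated in the capability description above (quote values containing a comma, newline, or double quote; double any embedded quotes; otherwise pass through) can be sketched as a small stand-alone helper — names here are illustrative, not the processor's actual code:

```java
class CsvEscapeSketch {
    // If the value contains a comma, newline, or double quote, wrap it in
    // double quotes and double any embedded quotes; otherwise return it
    // unchanged, per the description above.
    static String escape(String value) {
        if (value.contains(",") || value.contains("\n") || value.contains("\"")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }
}
```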

[GitHub] nifi pull request #2748: NIFI-4272 ReplaceText support multiple captures whe...

2018-06-06 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2748#discussion_r193368464
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ReplaceText.java
 ---
@@ -79,7 +80,8 @@
 @SystemResourceConsideration(resource = SystemResource.MEMORY)
 public class ReplaceText extends AbstractProcessor {
 
-private static Pattern REPLACEMENT_NORMALIZATION_PATTERN = Pattern.compile("(\\$\\D)");
+private static Pattern QUOTED_GROUP_REF_PATTERN = Pattern.compile("('\\$\\d+')");
+private static Pattern LITERAL_QUOTED_PATTERN = Pattern.compile("literal\\(('.*?')\\)",Pattern.DOTALL);
--- End diff --

done


---


[GitHub] nifi pull request #2778: NIFI-5288 Quietly convert Java arrays to Lists so t...

2018-06-09 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2778#discussion_r194239526
  
--- Diff: 
nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/PutMongoRecord.java
 ---
@@ -131,7 +131,7 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 for (String name : schema.getFieldNames()) {
--- End diff --

Why not do the conversion while inserting?
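The suggestion — convert arrays at the moment each field value is placed into the outgoing document, rather than in a separate pass — could look roughly like this; the method and class names are hypothetical, and the real processor works with NiFi records and Mongo documents rather than plain maps:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class ArrayToListSketch {
    // Copy fields into the outgoing document, converting any Java array to a
    // List on the way in, since many drivers serialize Lists but not arrays.
    static Map<String, Object> toDocument(Map<String, Object> fields) {
        Map<String, Object> doc = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : fields.entrySet()) {
            Object value = e.getValue();
            if (value instanceof Object[]) {
                List<Object> list = new ArrayList<>();
                for (Object o : (Object[]) value) {
                    list.add(o);
                }
                value = list;
            }
            doc.put(e.getKey(), value);
        }
        return doc;
    }
}
```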


---


[GitHub] nifi issue #2777: NIFI-5287 Made LookupRecord able to take in flowfile attri...

2018-06-12 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2777
  
Maybe there should be a `Context` interface, and there can be support for 
implementations that support more than one map or type of backing.  I think the 
limitation here is using a literal Map instead of a logical construct.

This is similar to variable resolution in a custom language.  You may  need 
more than a map.

`LookupContext context = new AttributeAndCoordinateLookup(attrs, coords);`
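A sketch of what such an interface might look like — entirely hypothetical, since no `LookupContext` exists in NiFi; the point is that resolution policy lives in the implementation rather than in a literal Map:

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical context: the lookup service resolves keys through this
// interface, so an implementation can back them with attributes,
// coordinates, or any other source.
interface LookupContext {
    Optional<Object> resolve(String key);
}

class AttributeAndCoordinateLookup implements LookupContext {
    private final Map<String, Object> attributes;
    private final Map<String, Object> coordinates;

    AttributeAndCoordinateLookup(Map<String, Object> attributes, Map<String, Object> coordinates) {
        this.attributes = attributes;
        this.coordinates = coordinates;
    }

    @Override
    public Optional<Object> resolve(String key) {
        // One possible policy: coordinates win over attributes.
        if (coordinates.containsKey(key)) {
            return Optional.of(coordinates.get(key));
        }
        return Optional.ofNullable(attributes.get(key));
    }
}
```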





---


[GitHub] nifi issue #2777: NIFI-5287 Made LookupRecord able to take in flowfile attri...

2018-06-12 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2777
  
It may be overkill, that is true.  But if you have to keep adding new 
functions to account for different scenarios that isn't great either and may 
suggest something like that would be good to have. Having a context or resolver 
for this type of thing isn't that radical. Having a context sets the interface, 
that is true, but the implementation can be any kind of 
policy/strategy/composition you may require.

That being said, just a thought anyways.



---


[GitHub] nifi pull request #2787: NIFI-5252 - support arbitrary headers in PutEmail p...

2018-06-12 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2787#discussion_r194700389
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutEmail.java
 ---
@@ -319,6 +330,15 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 message.setRecipients(RecipientType.CC, toInetAddresses(context, flowFile, CC));
 message.setRecipients(RecipientType.BCC, toInetAddresses(context, flowFile, BCC));
 
+final String attributeNameRegex = context.getProperty(ATTRIBUTE_NAME_REGEX).getValue();
+final Pattern attributeNamePattern = attributeNameRegex == null ? null : Pattern.compile(attributeNameRegex);
+if (attributeNamePattern != null) {
+for (final Map.Entry entry : flowFile.getAttributes().entrySet()) {
+if (attributeNamePattern.matcher(entry.getKey()).matches()) {
--- End diff --

There are rules about how the headers have to be encoded.  We should use 
the MimeUtility to ensure everything is encoded correctly.

```
Note that RFC 822 headers must contain only US-ASCII characters, so a 
header that contains non US-ASCII characters must have been encoded by the 
caller as per the rules of RFC 2047.
```


[MimeUtility](https://docs.oracle.com/javaee/6/api/javax/mail/internet/MimeUtility.html)

We should also have tests for ensuring attributes with content that must be 
encoded are handled.
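For real code the right tool is `javax.mail`'s `MimeUtility.encodeText`, as the comment says. Purely to illustrate the shape of an RFC 2047 "encoded word", here is a stdlib-only sketch of the B (Base64) encoding — it deliberately omits folding, the Q encoding, and length limits that `MimeUtility` handles:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

class Rfc2047Sketch {
    // Encode a header value as a single RFC 2047 B-encoded word when it
    // contains non-US-ASCII characters; pure ASCII needs no encoding.
    static String encodeHeaderValue(String value) {
        if (value.chars().allMatch(c -> c < 128)) {
            return value;
        }
        String b64 = Base64.getEncoder().encodeToString(value.getBytes(StandardCharsets.UTF_8));
        return "=?UTF-8?B?" + b64 + "?=";
    }
}
```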


---


[GitHub] nifi issue #2768: NIFI-5278: fixes JSON escaping of code parameter in Execut...

2018-06-07 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2768
  
Is there a test that should be created or updated for this change?


---


[GitHub] nifi issue #2767: NIFI-5274 avoid rollback on uncaught errors in ReplaceText

2018-06-07 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2767
  
Can you point out the recursive code you are referencing?


---


[GitHub] nifi issue #2767: NIFI-5274 avoid rollback on uncaught errors in ReplaceText

2018-06-07 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2767
  
Thanks for the info @mosermw 


---


[GitHub] nifi pull request #2802: NIFI-5147 Add CalculateAttributeHash processor

2018-06-18 Thread ottobackwards
GitHub user ottobackwards opened a pull request:

https://github.com/apache/nifi/pull/2802

NIFI-5147 Add CalculateAttributeHash processor

Created the new processor per jira requirements.

Might need a better description or more documentation.

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [-] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [-] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [-] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [x] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [-] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ottobackwards/nifi calculate-hash-attribute

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2802.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2802


commit 82d83654dc21b85ced12bbd26d8f70fdb3d70b62
Author: Otto Fowler 
Date:   2018-06-18T15:00:38Z

NIFI-5147 Add CalculateAttributeHash processor




---


[GitHub] nifi issue #2802: NIFI-5147 Add CalculateAttributeHash processor

2018-06-18 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2802
  
@alopresto is this what you had in mind?


---


[GitHub] nifi pull request #2802: NIFI-5147 Add CalculateAttributeHash processor

2018-06-18 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2802#discussion_r196148072
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/CalculateAttributeHash.java
 ---
@@ -0,0 +1,198 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.codec.binary.Hex;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.SortedMap;
+import java.util.TreeMap;
+import java.util.concurrent.atomic.AtomicReference;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"attributes", "hash"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Calculates a hash value for the value specified attributes and write it to an output attribute")
+@WritesAttribute(attribute = "", description = "This Processor adds an attribute whose value is the result of "
++ "Hashing of the found attribute. The name of this attribute is specified by the value of the dynamic property.")
+@DynamicProperty(name = "A flowfile attribute key for attribute inspection", value = "Attribute Name",
+description = "The property name defines the attribute to look for and hash in the incoming flowfile."
++ "The property value defines the name to give the generated attribute."
++ "Attribute names must be unique.")
+public class CalculateAttributeHash extends AbstractProcessor {
+
+public static final Charset UTF8 = Charset.forName("UTF-8");
+
+static final AllowableValue MD2_VALUE = new AllowableValue("MD2", "MD2 Hashing Algorithm", "MD2 Hashing Algorithm");
+static final AllowableValue MD5_VALUE = new AllowableValue("MD5", "MD5 Hashing Algorithm", "MD5 Hashing Algorithm");
--- End diff --

I put in support for all the digest types that were supported.  Although it 
is not recommended to use MD5, it still may be *required* by some target 
systems.
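The core of hashing an attribute value with a caller-chosen algorithm is just `java.security.MessageDigest`; a minimal sketch (names illustrative, not the processor's actual code):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

class AttributeHashSketch {
    // Hash an attribute value with the named JCA algorithm and return the
    // lowercase hex digest. MD5 appears only because some target systems
    // still require it, as noted above; SHA-256 is the safer default.
    static String hash(String value, String algorithm) {
        try {
            MessageDigest digest = MessageDigest.getInstance(algorithm);
            byte[] bytes = digest.digest(value.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : bytes) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalArgumentException("Unsupported algorithm: " + algorithm, e);
        }
    }
}
```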





---


[GitHub] nifi issue #2801: NIFI-5319 Utilize NiFi Registry 0.2.0 client

2018-06-18 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2801
  
Np, I am watching 5319 and got excited for a minute ;)


---


[GitHub] nifi issue #2801: NIFI-5139 Utilize NiFi Registry 0.2.0 client

2018-06-18 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2801
  
I think you have the wrong jira


---


[GitHub] nifi issue #2777: NIFI-5287 Made LookupRecord able to take in flowfile attri...

2018-06-12 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2777
  
@markap14 that certainly makes sense, thanks for taking the time to respond


---


[GitHub] nifi issue #2787: NIFI-5252 - support arbitrary headers in PutEmail processo...

2018-06-15 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2787
  
The changes to the encoding looks good.  
Thanks for the contribution!

+1


---


[GitHub] nifi issue #2800: NIFI-5317 - support non-ASCII X-Mailer header

2018-06-17 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2800
  
Yikes, sorry.


---


[GitHub] nifi issue #2800: NIFI-5317 - support non-ASCII X-Mailer header

2018-06-17 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2800
  
This looks good, just two comments.

1. If we are going to add support for non-ascii, the test should test 
non-ascii, can we add that?
2. We now have the same pattern in two places, a try/catch around setting headers with encoding; maybe we should extract a helper function?


---


[GitHub] nifi pull request #2800: NIFI-5317 - support non-ASCII X-Mailer header

2018-06-17 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2800#discussion_r195934504
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/PutEmail.java
 ---
@@ -320,6 +320,15 @@ public void onScheduled(final ProcessContext context) {
 this.attributeNamePattern = attributeNameRegex == null ? null : Pattern.compile(attributeNameRegex);
 }
 
--- End diff --

We should include the header name in the message


---


[GitHub] nifi pull request #2800: NIFI-5317 - support non-ASCII X-Mailer header

2018-06-17 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2800#discussion_r195934550
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestPutEmail.java
 ---
@@ -137,7 +137,7 @@ public void testOutgoingMessage() throws Exception {
 public void testOutgoingMessageWithOptionalProperties() throws 
Exception {
 // verifies that optional attributes are set on the outgoing 
Message correctly
--- End diff --

I don't see the non-ascii, am I missing something?


---


[GitHub] nifi issue #2800: NIFI-5317 - support non-ASCII X-Mailer header

2018-06-17 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2800
  
+1 LGTM, mvn install and contrib-check pass, use of MimeUtility per best practices.



---


[GitHub] nifi pull request #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2588#discussion_r194368856
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-abstract-processors/src/main/java/org/apache/nifi/processors/aws/wag/client/Validate.java
 ---
@@ -0,0 +1,33 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.wag.client;
+
+import com.amazonaws.util.StringUtils;
--- End diff --

This is part of the code from an external library, and I believe this is 
intentional.


---


[GitHub] nifi pull request #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2588#discussion_r194380548
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-abstract-processors/src/main/java/org/apache/nifi/processors/aws/wag/client/GenericApiGatewayClient.java
 ---
@@ -0,0 +1,134 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.wag.client;
+
+import com.amazonaws.AmazonServiceException;
+import com.amazonaws.AmazonWebServiceClient;
+import com.amazonaws.ClientConfiguration;
+import com.amazonaws.DefaultRequest;
+import com.amazonaws.auth.AWS4Signer;
+import com.amazonaws.auth.AWSCredentialsProvider;
+import com.amazonaws.http.AmazonHttpClient;
+import com.amazonaws.http.ExecutionContext;
+import com.amazonaws.http.HttpMethodName;
+import com.amazonaws.http.HttpResponseHandler;
+import com.amazonaws.http.JsonResponseHandler;
+import com.amazonaws.internal.auth.DefaultSignerProvider;
+import com.amazonaws.protocol.json.JsonOperationMetadata;
+import com.amazonaws.protocol.json.SdkStructuredPlainJsonFactory;
+import com.amazonaws.regions.Region;
+import com.amazonaws.transform.JsonErrorUnmarshaller;
+import com.amazonaws.transform.JsonUnmarshallerContext;
+import com.amazonaws.transform.Unmarshaller;
+import com.fasterxml.jackson.databind.JsonNode;
+import java.io.InputStream;
+import java.net.URI;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+public class GenericApiGatewayClient extends AmazonWebServiceClient {
+private static final String API_GATEWAY_SERVICE_NAME = "execute-api";
+private static final String API_KEY_HEADER = "x-api-key";
+
+private final JsonResponseHandler<GenericApiGatewayResponse> responseHandler;
+private final HttpResponseHandler<AmazonServiceException> errorResponseHandler;
+private final AWSCredentialsProvider credentials;
+private String apiKey;
+private final AWS4Signer signer;
+
+GenericApiGatewayClient(ClientConfiguration clientConfiguration, 
String endpoint, Region region,
+AWSCredentialsProvider credentials, String 
apiKey, AmazonHttpClient httpClient) {
+super(clientConfiguration);
+setRegion(region);
+setEndpoint(endpoint);
+this.credentials = credentials;
+this.apiKey = apiKey;
+this.signer = new AWS4Signer();
+this.signer.setServiceName(API_GATEWAY_SERVICE_NAME);
+this.signer.setRegionName(region.getName());
+
+final JsonOperationMetadata metadata = new 
JsonOperationMetadata().withHasStreamingSuccessResponse(false).withPayloadJson(false);
+final Unmarshaller<GenericApiGatewayResponse, JsonUnmarshallerContext> responseUnmarshaller = in -> new GenericApiGatewayResponse(in.getHttpResponse());
+this.responseHandler = 
SdkStructuredPlainJsonFactory.SDK_JSON_FACTORY.createResponseHandler(metadata, 
responseUnmarshaller);
+JsonErrorUnmarshaller defaultErrorUnmarshaller = new 
JsonErrorUnmarshaller(GenericApiGatewayException.class, null) {
+@Override
+public AmazonServiceException unmarshall(JsonNode jsonContent) 
throws Exception {
+return new 
GenericApiGatewayException(jsonContent.toString());
+}
+};
+this.errorResponseHandler = 
SdkStructuredPlainJsonFactory.SDK_JSON_FACTORY.createErrorResponseHandler(
+Collections.singletonList(defaultErrorUnmarshaller), null);
+
+if (httpClient != null) {
+super.client = httpClient;
+}
+}
+
+public GenericApiGatewayResponse execute(GenericApiGatewayRequest 
request) {
+return execute(request.getHttpMethod(), request.getResourcePath(), 
request.getHeaders(), request.getParameters(), request.getBody());
+}
+
+private GenericApiGatewayResponse execute(HttpMethodName met

[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
@joewitt 
I added the headers because the code was already apache lic. in the 
project, and I thought that would be the proper thing to do.  I ( possibly or 
probably wrongly ) thought that if you put the lic. in the project you *should* 
lic. the files with headers.

As for taking the library vs. including the code.

I took that code because I was not sure if the library would be updated 
quickly enough if there were changes in the AWS library versions in nifi or 
other changes required ( such as my need to add support for query parameters ). 
 Also, there is not a lot of code to work with.  @mattyb149 and I talked it out 
and I *have* done a pr to the project for a version bump that would allow us to 
use the library, but there has been no response to that pr since May 3rd when I 
posted it ( as I had feared at the start ).

Currently I *think* the options are ( in no order )

- The PR goes with the code as it is, with a notice to move to the library 
if and when it is updated to a comparable version of aws
- I do a proper fork into my github organization with the required version 
and publish ( currently I just do bintray and center )
- This PR is shelved until the project takes the PR

Obviously I would prefer not to shelve this.
Thoughts?

Should I remove the headers?


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
Do I need to add rat check exceptions if I remove the headers?


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
Thanks @joewitt for clearing that up.  I'll get right to it.


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
@joewitt requested changes are in


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
Thanks everybody!


---


[GitHub] nifi pull request #2748: NIFI-4272 support multiple captures when el is pres...

2018-05-30 Thread ottobackwards
GitHub user ottobackwards opened a pull request:

https://github.com/apache/nifi/pull/2748

NIFI-4272 support multiple captures when el is present

From the Jira Statement:
```
I am using the ReplaceText processor to take a string input (example:  
{"name":"Smith","middle":"nifi","firstname":"John"}
) and change all the field names to all uppercase.
Using above input as an example, I expect output like
{"NAME":"Smith","MIDDLE":"nifi","FIRSTNAME":"John"}
I expect I should be able to do this with ReplaceText processor; however, I 
see some unexpected behavior:
---
Test 1: (uses EL in the replacement value property)
Search value:  \"([a-z]*?)\":\"(.*?)\"
Replacement Value: \"${'$1':toUpper()}\":\"$2\"
Result:
{"NAME":"Smith","NAME":"nifi","NAME":"John"}
---
Test 2: (Does not use EL in the replacement Value property)
Search value:  \"([a-z]*?)\":\"(.*?)\"
Replacement Value: \"new$1\":\"$2\"
Result:
{"newname":"Smith","newmiddle":"nifi","newfirstname":"John"}
```

The issue is that the processor evaluates the expression language before 
executing the regex and capture replacement.  The expression replaces the 
capture with the first value, and that is why the user was seeing that value 
repeated.

This pr changes the Regex evaluation of the processor to evaluate the regex 
first, and then run the expression on the result.

Some changes were required for escaping values.

I added tests for the reported issue and for escaping newlines etc that 
would break EL even when escaped in a literal.
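The ordering problem can be sketched in plain Java. This is a minimal illustration, not NiFi's actual implementation: `toUpperEl` is a hypothetical stand-in for the `${'$1':toUpper()}` expression language call.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CaptureOrderDemo {
    static final Pattern FIELD = Pattern.compile("\"([a-z]*?)\":\"(.*?)\"");

    // Hypothetical stand-in for the EL call ${'$1':toUpper()}.
    static String toUpperEl(String s) {
        return s.toUpperCase();
    }

    // Old order: resolve $1 from the FIRST match, evaluate EL once, then
    // reuse the resulting literal as the replacement for every match.
    static String buggyReplace(String input) {
        Matcher first = FIELD.matcher(input);
        first.find();
        String resolvedOnce = "\"" + toUpperEl(first.group(1)) + "\":\"$2\"";
        return FIELD.matcher(input).replaceAll(resolvedOnce);
    }

    // New order: substitute the captures per match first, then evaluate EL
    // on each substituted result.
    static String fixedReplace(String input) {
        Matcher m = FIELD.matcher(input);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String perMatch = "\"" + toUpperEl(m.group(1)) + "\":\"" + m.group(2) + "\"";
            m.appendReplacement(sb, Matcher.quoteReplacement(perMatch));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        String input = "{\"name\":\"Smith\",\"middle\":\"nifi\",\"firstname\":\"John\"}";
        System.out.println(buggyReplace(input)); // {"NAME":"Smith","NAME":"nifi","NAME":"John"}
        System.out.println(fixedReplace(input)); // {"NAME":"Smith","MIDDLE":"nifi","FIRSTNAME":"John"}
    }
}
```

The first output reproduces the repeated-value symptom from the Jira report; the second matches the expected result.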

### For all changes:
- [x] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [x] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [x] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [x] Is your initial contribution a single, squashed commit?

### For code changes:
- [x] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [x] Have you written or updated unit tests to verify your changes?
- [-] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [-] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [-] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [-] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ottobackwards/nifi replace-text-el

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2748.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2748


commit 75638bc201d7a492fa09cdb8358201398cebe586
Author: Otto Fowler 
Date:   2018-05-30T20:53:55Z

NIFI-4272 support multiple captures when el is present




---


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-05-31 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
Can you add explicit steps to test and verify, for those of us who don't 
work with docker as much?


---


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-05-31 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
Thanks, I'd like to try this and help with the review


---


[GitHub] nifi issue #2742: NIFI-5244 Fixed a bug in MockSchemaRegistry that prevented...

2018-05-29 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2742
  
+1 looks good to me


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-04 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r192722395
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -277,6 +280,20 @@ private void setProxy(OkHttpClient.Builder builder) {
 }
 }
 
--- End diff --

Should this handle AttributeExpressionLanguageParsingException?  Is there 
any validation that can be done?


---


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-06-01 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
So to test this?
- mvn package
- mvn package -P docker from nifi-docker
- ???
- docker run --rm -ti --entrypoint /bin/bash apache/nifi -c "env | grep 
NIFI" ? from nifi-docker dir?
- docker run --rm -ti --entrypoint /bin/bash apache/nifi -c "find /opt/nifi 
! -user nifi"  from nifi-docker dir?


---


[GitHub] nifi issue #2747: NIFI-5249 Dockerfile enhancements

2018-06-01 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2747
  
+1 fwiw|lgtm.
Ran the above steps to build and test the image from the maven snapshots ( 
not built locally ).
Everything ran fine.

Your integration test is awesome.  I'm totally going to steal it.

Super stuff.



---


[GitHub] nifi pull request #2760: NIFI-5266: Sanitize ES parameters in PutElasticsear...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2760#discussion_r193132253
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttp.java
 ---
@@ -227,7 +226,10 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 List flowFilesToTransfer = new LinkedList<>(flowFiles);
 
 final StringBuilder sb = new StringBuilder();
-final String baseUrl = 
trimToEmpty(context.getProperty(ES_URL).evaluateAttributeExpressions().getValue());
+final String baseUrl = 
context.getProperty(ES_URL).evaluateAttributeExpressions().getValue().trim();
+if (StringUtils.isEmpty(baseUrl)) {
--- End diff --

if baseUrl is empty should the exception message be that "Elasticsearch URL 
evaluates to empty"  or something?  Your message is always going to be "... not 
valid: " since baseUrl will be empty.



---


[GitHub] nifi pull request #2760: NIFI-5266: Sanitize ES parameters in PutElasticsear...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2760#discussion_r193132823
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttp.java
 ---
@@ -288,42 +290,23 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 session.read(file, in -> {
 json.append(IOUtils.toString(in, charset).replace("\r\n", 
" ").replace('\n', ' ').replace('\r', ' '));
 });
-if (indexOp.equalsIgnoreCase("index")) {
-sb.append("{\"index\": { \"_index\": \"");
-sb.append(index);
-sb.append("\", \"_type\": \"");
-sb.append(docType);
-sb.append("\"");
-if (!StringUtils.isEmpty(id)) {
-sb.append(", \"_id\": \"");
-sb.append(id);
-sb.append("\"");
-}
-sb.append("}}\n");
-sb.append(json);
-sb.append("\n");
-} else if (indexOp.equalsIgnoreCase("upsert") || 
indexOp.equalsIgnoreCase("update")) {
-sb.append("{\"update\": { \"_index\": \"");
-sb.append(index);
-sb.append("\", \"_type\": \"");
-sb.append(docType);
-sb.append("\", \"_id\": \"");
-sb.append(id);
-sb.append("\" }\n");
-sb.append("{\"doc\": ");
-sb.append(json);
-sb.append(", \"doc_as_upsert\": ");
-sb.append(indexOp.equalsIgnoreCase("upsert"));
-sb.append(" }\n");
-} else if (indexOp.equalsIgnoreCase("delete")) {
-sb.append("{\"delete\": { \"_index\": \"");
-sb.append(index);
-sb.append("\", \"_type\": \"");
-sb.append(docType);
-sb.append("\", \"_id\": \"");
-sb.append(id);
-sb.append("\" }\n");
+
+String jsonString = json.toString();
+
+// Ensure the JSON body is well-formed
+try {
--- End diff --

mapper should be static
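A minimal sketch of that suggestion, assuming Jackson's `ObjectMapper` (which is thread-safe once configured, so a single static instance avoids a per-FlowFile allocation):

```java
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonWellFormedCheck {
    // One shared, statically initialized mapper instead of new ObjectMapper()
    // on every onTrigger invocation.
    private static final ObjectMapper MAPPER = new ObjectMapper();

    static boolean isWellFormed(String json) {
        try {
            MAPPER.readTree(json);
            return true;
        } catch (Exception e) {
            return false;
        }
    }
}
```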


---


[GitHub] nifi pull request #2760: NIFI-5266: Sanitize ES parameters in PutElasticsear...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2760#discussion_r193133584
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttpRecord.java
 ---
@@ -261,7 +259,10 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 OkHttpClient okHttpClient = getClient();
 final ComponentLog logger = getLogger();
 
-final String baseUrl = 
trimToEmpty(context.getProperty(ES_URL).evaluateAttributeExpressions().getValue());
+final String baseUrl = 
context.getProperty(ES_URL).evaluateAttributeExpressions().getValue().trim();
+if (StringUtils.isEmpty(baseUrl)) {
--- End diff --

Same as previous wrt baseUrl always being empty in the message


---


[GitHub] nifi pull request #2760: NIFI-5266: Sanitize ES parameters in PutElasticsear...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2760#discussion_r193133140
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttp.java
 ---
@@ -288,42 +290,23 @@ public void onTrigger(final ProcessContext context, 
final ProcessSession session
 session.read(file, in -> {
 json.append(IOUtils.toString(in, charset).replace("\r\n", 
" ").replace('\n', ' ').replace('\r', ' '));
 });
-if (indexOp.equalsIgnoreCase("index")) {
-sb.append("{\"index\": { \"_index\": \"");
-sb.append(index);
-sb.append("\", \"_type\": \"");
-sb.append(docType);
-sb.append("\"");
-if (!StringUtils.isEmpty(id)) {
-sb.append(", \"_id\": \"");
-sb.append(id);
-sb.append("\"");
-}
-sb.append("}}\n");
-sb.append(json);
-sb.append("\n");
-} else if (indexOp.equalsIgnoreCase("upsert") || 
indexOp.equalsIgnoreCase("update")) {
-sb.append("{\"update\": { \"_index\": \"");
-sb.append(index);
-sb.append("\", \"_type\": \"");
-sb.append(docType);
-sb.append("\", \"_id\": \"");
-sb.append(id);
-sb.append("\" }\n");
-sb.append("{\"doc\": ");
-sb.append(json);
-sb.append(", \"doc_as_upsert\": ");
-sb.append(indexOp.equalsIgnoreCase("upsert"));
-sb.append(" }\n");
-} else if (indexOp.equalsIgnoreCase("delete")) {
-sb.append("{\"delete\": { \"_index\": \"");
-sb.append(index);
-sb.append("\", \"_type\": \"");
-sb.append(docType);
-sb.append("\", \"_id\": \"");
-sb.append(id);
-sb.append("\" }\n");
+
+String jsonString = json.toString();
+
+// Ensure the JSON body is well-formed
+try {
--- End diff --

I wonder if a generally available JSON validation utility wouldn't be better, 
something the next person could reuse.


---


[GitHub] nifi issue #2588: NIFI-5022 InvokeAWSGatewayApi processor

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2588
  
I can provide the reviewers with access to my PETSTORE endpoint for testing 
and we can work out how not to overload it, if that helps get this going


---


[GitHub] nifi issue #2760: NIFI-5266: Sanitize ES parameters in PutElasticsearchHttp ...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2760
  
+1 (non-binding), contrib-check build looks good, tests updated.


---


[GitHub] nifi pull request #2760: NIFI-5266: Sanitize ES parameters in PutElasticsear...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2760#discussion_r193183815
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/test/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttpRecordIT.java
 ---
@@ -227,6 +227,29 @@ public void testBadIndexName() throws Exception {
 runner.assertTransferCount(PutElasticsearchHttpRecord.REL_SUCCESS, 
0);
 }
 
+@Test
+public void testIndexNameWithJsonChar() throws Exception {
+// Undo some stuff from setup()
+runner.setProperty(PutElasticsearchHttpRecord.INDEX, 
"people}test");
+runner.setProperty(PutElasticsearchHttpRecord.TYPE, "person");
+recordReader.addRecord(1, new MapRecord(personSchema, new HashMap<String, Object>() {{
+put("name", "John Doe");
+put("age", 48);
+put("sport", null);
+}}));
+
+List<Map<String, String>> attrs = new ArrayList<>();
+Map<String, String> attr = new HashMap<>();
+attr.put("doc_id", "1");
+attrs.add(attr);
+
+runner.enqueue("");
+runner.run(1, true, true);
+runner.assertTransferCount(PutElasticsearchHttpRecord.REL_FAILURE, 
0);
+runner.assertTransferCount(PutElasticsearchHttpRecord.REL_RETRY, 
0);
--- End diff --

Should this succeed or fail?


---


[GitHub] nifi pull request #2760: NIFI-5266: Sanitize ES parameters in PutElasticsear...

2018-06-05 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2760#discussion_r193189510
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/test/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttpRecordIT.java
 ---
@@ -227,6 +227,29 @@ public void testBadIndexName() throws Exception {
 runner.assertTransferCount(PutElasticsearchHttpRecord.REL_SUCCESS, 
0);
 }
 
+@Test
+public void testIndexNameWithJsonChar() throws Exception {
+// Undo some stuff from setup()
+runner.setProperty(PutElasticsearchHttpRecord.INDEX, 
"people}test");
+runner.setProperty(PutElasticsearchHttpRecord.TYPE, "person");
+recordReader.addRecord(1, new MapRecord(personSchema, new HashMap<String, Object>() {{
+put("name", "John Doe");
+put("age", 48);
+put("sport", null);
+}}));
+
+List<Map<String, String>> attrs = new ArrayList<>();
+Map<String, String> attr = new HashMap<>();
+attr.put("doc_id", "1");
+attrs.add(attr);
+
+runner.enqueue("");
+runner.run(1, true, true);
+runner.assertTransferCount(PutElasticsearchHttpRecord.REL_FAILURE, 
0);
+runner.assertTransferCount(PutElasticsearchHttpRecord.REL_RETRY, 
0);
--- End diff --

ah, ok sorry.  Sometimes the intent isn't clear just looking at the test 
without a descriptive name or comment.


---


[GitHub] nifi pull request #2748: NIFI-4272 ReplaceText support multiple captures whe...

2018-06-06 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2748#discussion_r193428577
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestReplaceText.java
 ---
@@ -1074,12 +1090,11 @@ public void testRegexWithBadCaptureGroup() throws 
IOException {
 runner.setProperty(ReplaceText.REPLACEMENT_STRATEGY, 
ReplaceText.REGEX_REPLACE);
 runner.setProperty(ReplaceText.EVALUATION_MODE, 
ReplaceText.ENTIRE_TEXT);
 
+exception.expect(AssertionError.class);
+exception.expectMessage("java.lang.IndexOutOfBoundsException: No 
group 1");
--- End diff --

OK, I made it escape even if there are no named captures, reverted the 
changed test.


---


[GitHub] nifi pull request #2748: NIFI-4272 ReplaceText support multiple captures whe...

2018-06-06 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2748#discussion_r193423188
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestReplaceText.java
 ---
@@ -1074,12 +1090,11 @@ public void testRegexWithBadCaptureGroup() throws 
IOException {
 runner.setProperty(ReplaceText.REPLACEMENT_STRATEGY, 
ReplaceText.REGEX_REPLACE);
 runner.setProperty(ReplaceText.EVALUATION_MODE, 
ReplaceText.ENTIRE_TEXT);
 
+exception.expect(AssertionError.class);
+exception.expectMessage("java.lang.IndexOutOfBoundsException: No 
group 1");
--- End diff --

so the issue is this:
```java
 // If we find a back reference that is not valid, then we will treat it as 
a literal string. For example, if we have 3 capturing
// groups and the Replacement Value has the value "I owe $8 to him", 
then we want to treat the $8 as a literal "$8", rather
// than attempting to use it as a back reference.
private static String escapeLiteralBackReferences(final String 
unescaped, final int numCapturingGroups) {
if (numCapturingGroups == 0) {
return unescaped;
}
```

If there are no capture groups, we don't escape all the findings
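A minimal sketch of the idea behind that escaping — using `Matcher.quoteReplacement` for the replacement strings rather than NiFi's actual implementation:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BackReferenceEscapeDemo {
    // Any $n whose group number exceeds the pattern's capture count is not a
    // valid back reference, so it is turned into the literal text \$n; valid
    // back references pass through unchanged.
    static String escapeLiteralBackReferences(String unescaped, int numCapturingGroups) {
        Matcher m = Pattern.compile("\\$(\\d+)").matcher(unescaped);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            int group = Integer.parseInt(m.group(1));
            String replacement = group > numCapturingGroups
                    ? Matcher.quoteReplacement("\\$" + m.group(1))  // emit literal \$n
                    : Matcher.quoteReplacement("$" + m.group(1));   // keep $n as-is
            m.appendReplacement(sb, replacement);
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        // With 3 capturing groups, $8 is not a valid back reference: escape it.
        System.out.println(escapeLiteralBackReferences("I owe $8 to him", 3));
    }
}
```

The early return quoted above means this escaping is skipped entirely when the pattern has zero capture groups, which is the gap being discussed.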


---


[GitHub] nifi pull request #2748: NIFI-4272 ReplaceText support multiple captures whe...

2018-06-06 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2748#discussion_r193371373
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestReplaceText.java
 ---
@@ -81,15 +81,16 @@ public void testSimple() throws IOException {
 @Test
 public void testWithEscaped$InReplacement() throws IOException {
 final TestRunner runner = getRunner();
-runner.setProperty(ReplaceText.SEARCH_VALUE, "(?s:^.*$)");
+//runner.setProperty(ReplaceText.SEARCH_VALUE, "(?s:^.*$)");
+runner.setProperty(ReplaceText.SEARCH_VALUE, "(?s)(^.*$)");
 runner.setProperty(ReplaceText.REPLACEMENT_VALUE, "a\\$b");
 
 runner.enqueue("a$a,b,c,d");
 runner.run();
 
 runner.assertAllFlowFilesTransferred(ReplaceText.REL_SUCCESS, 1);
 final MockFlowFile out = 
runner.getFlowFilesForRelationship(ReplaceText.REL_SUCCESS).get(0);
-out.assertContentEquals("a\\$b".getBytes("UTF-8"));
+out.assertContentEquals("a$b".getBytes("UTF-8"));
--- End diff --

So, I think the behavior was incorrect before.

Since we do the Regex first now, and then the expression, what we end up 
with is
`input.replaceAll("(?s)(^.*$)", "a\$b")`
Passing \$ to the regex replacement evaluates as a literal '$', so the result is 
correctly a$b.
In other words, it is working as the java regex works.

To get what you are looking for, further escaping is required:
```java
runner.setProperty(ReplaceText.REPLACEMENT_VALUE, "a\\$b");
```

I'll add the test
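That escaping behavior can be checked directly against `String.replaceAll`, which is what the processor's replacement now goes through:

```java
public class DollarEscapeDemo {
    public static void main(String[] args) {
        String input = "a$a,b,c,d";

        // The replacement "a\\$b" is the four characters a \ $ b; replaceAll
        // treats \$ as an escaped, literal dollar sign, so the output is a$b.
        System.out.println(input.replaceAll("(?s)(^.*$)", "a\\$b"));

        // To keep a literal backslash-dollar in the output, the backslash
        // itself must also be escaped: the six characters a \ \ \ $ b
        // yield a\$b.
        System.out.println(input.replaceAll("(?s)(^.*$)", "a\\\\\\$b"));
    }
}
```

This matches the further-escaping point above: getting `\$` through to the output requires escaping the backslash as well as the dollar sign.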




---


[GitHub] nifi pull request #2748: NIFI-4272 ReplaceText support multiple captures whe...

2018-06-06 Thread ottobackwards
Github user ottobackwards commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2748#discussion_r193422077
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestReplaceText.java
 ---
@@ -1074,12 +1090,11 @@ public void testRegexWithBadCaptureGroup() throws 
IOException {
 runner.setProperty(ReplaceText.REPLACEMENT_STRATEGY, 
ReplaceText.REGEX_REPLACE);
 runner.setProperty(ReplaceText.EVALUATION_MODE, 
ReplaceText.ENTIRE_TEXT);
 
+exception.expect(AssertionError.class);
+exception.expectMessage("java.lang.IndexOutOfBoundsException: No 
group 1");
--- End diff --

I'm looking into it, but the processor already has handling for this at 
runtime, not at validation time, where it escapes back references that are over 
the actual capture count


---

