[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506820#comment-16506820
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194214335
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.attribute.expression.language.PreparedQuery;
+import org.apache.nifi.attribute.expression.language.Query;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic 
properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread ijokarumawak
Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194214335
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.attribute.expression.language.PreparedQuery;
+import org.apache.nifi.attribute.expression.language.Query;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic 
properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language 
is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+
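For context on the two property descriptions quoted above, a minimal OkHttp sketch of how such a lookup could assemble its request: the URL is the Expression Language result evaluated against the lookup coordinates, and each dynamic property becomes a request header. The method and variable names here (fetch, endpoint, headers) are hypothetical stand-ins, not the service's actual fields.

{code:java}
import java.io.IOException;
import java.io.InputStream;
import java.util.Collections;
import java.util.Map;

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class RestLookupSketch {
    // "endpoint" stands for the URL property after EL evaluation against the
    // lookup coordinates; "headers" stands for the dynamic property name/value pairs.
    static InputStream fetch(OkHttpClient client, String endpoint, Map<String, String> headers) throws IOException {
        Request.Builder builder = new Request.Builder().url(endpoint).get();
        for (Map.Entry<String, String> header : headers.entrySet()) {
            builder.addHeader(header.getKey(), header.getValue());
        }
        Response response = client.newCall(builder.build()).execute();
        return response.body().byteStream();   // a configured RecordReader would consume this
    }

    public static void main(String[] args) throws IOException {
        InputStream body = fetch(new OkHttpClient(), "http://example.com/api/users/42",
                Collections.singletonMap("X-Api-Key", "example-key"));
    }
}
{code}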

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread ijokarumawak
Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194214189
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.attribute.expression.language.PreparedQuery;
+import org.apache.nifi.attribute.expression.language.Query;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic 
properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language 
is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506817#comment-16506817
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user ijokarumawak commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194214189
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.attribute.expression.language.PreparedQuery;
+import org.apache.nifi.attribute.expression.language.Query;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic 
properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService 
implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[jira] [Created] (NIFI-5286) Update FasterXML Jackson version to 2.9.5

2018-06-08 Thread Sivaprasanna Sethuraman (JIRA)
Sivaprasanna Sethuraman created NIFI-5286:
-

 Summary: Update FasterXML Jackson version to 2.9.5
 Key: NIFI-5286
 URL: https://issues.apache.org/jira/browse/NIFI-5286
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core Framework, Extensions
Affects Versions: 1.6.0, 1.5.0, 1.4.0, 1.3.0
Reporter: Sivaprasanna Sethuraman
Assignee: Sivaprasanna Sethuraman


The version of FasterXML jackson-databind currently used is 2.9.4, which was 
supposed to fix several critical vulnerabilities but did not completely 
address them. A fix that addresses them was introduced in 2.9.5.

More details about the vulnerability can be found at: 
https://nvd.nist.gov/vuln/detail/CVE-2018-7489

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5285) Re-evaluate memory/time cost parameters for 2018

2018-06-08 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5285?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto updated NIFI-5285:

Description: 
There are some bcrypt, SCrypt, and PBKDF2 initial parameters which were 
determined to be secure against a default threat model given best known attacks 
in 2016. These should be re-evaluated for 2018. 

Administration Guide
* Line 1303
* Line 1311
* Line 1321
* Line 1637

If these values are updated, backward-compatibility for internal uses also 
needs to be evaluated. 

  was:
There are some bcrypt, SCrypt, and PBKDF2 initial parameters which were 
determined to be secure against a default threat model given best known attacks 
in 2016. These should be re-evaluated for 2018. 

Administration Guide
* Line 1303
* Line 1311
* Line 1321
* Line 1637




> Re-evaluate memory/time cost parameters for 2018
> 
>
> Key: NIFI-5285
> URL: https://issues.apache.org/jira/browse/NIFI-5285
> Project: Apache NiFi
>  Issue Type: Task
>  Components: Documentation & Website
>Affects Versions: 1.6.0
>Reporter: Andy LoPresto
>Priority: Major
>  Labels: documentation, security
>
> There are some bcrypt, SCrypt, and PBKDF2 initial parameters which were 
> determined to be secure against a default threat model given best known 
> attacks in 2016. These should be re-evaluated for 2018. 
> Administration Guide
> * Line 1303
> * Line 1311
> * Line 1321
> * Line 1637
> If these values are updated, backward-compatibility for internal uses also 
> needs to be evaluated. 
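A rough way to re-derive these numbers is to time each work factor on representative hardware and keep the largest setting that fits the latency budget. A hypothetical benchmarking sketch using Spring Security's BCryptPasswordEncoder (not a NiFi API, and only an assumption about how the evaluation might be done):

{code:java}
import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;

public class BcryptCostProbe {
    public static void main(String[] args) {
        // Time one hash per work factor; keep the highest factor whose cost still
        // fits the deployment's latency budget (for example, a few hundred ms).
        for (int workFactor = 10; workFactor <= 16; workFactor++) {
            BCryptPasswordEncoder encoder = new BCryptPasswordEncoder(workFactor);
            long start = System.nanoTime();
            encoder.encode("correct horse battery staple");
            long millis = (System.nanoTime() - start) / 1_000_000;
            System.out.println("work factor " + workFactor + ": " + millis + " ms");
        }
    }
}
{code}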



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5006) Update docs to reflect 2018 where applicable

2018-06-08 Thread Andy LoPresto (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5006?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506800#comment-16506800
 ] 

Andy LoPresto commented on NIFI-5006:
-

I searched through the docs for {{201\d}} but I don't see any year instances 
that can be automatically updated. There are some "As of [this date in 2016], 
these values are sufficient ..." sentences, but the data values they prescribe 
need to be evaluated. I've made a new Jira 
[NIFI-5285|https://issues.apache.org/jira/browse/NIFI-5285] for those dates 
because it will require additional effort. Are there any other instances where 
you think the dates should be updated?

> Update docs to reflect 2018 where applicable
> 
>
> Key: NIFI-5006
> URL: https://issues.apache.org/jira/browse/NIFI-5006
> Project: Apache NiFi
>  Issue Type: Task
>Reporter: Aldrin Piri
>Assignee: Andy LoPresto
>Priority: Major
> Fix For: 1.7.0
>
>
> While reviewing the RC1 for NiFi 1.6.0 I noticed that docs have not been 
> updated to reflect the new year.  We should update these when handling our 
> next release.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (NIFI-5285) Re-evaluate memory/time cost parameters for 2018

2018-06-08 Thread Andy LoPresto (JIRA)
Andy LoPresto created NIFI-5285:
---

 Summary: Re-evaluate memory/time cost parameters for 2018
 Key: NIFI-5285
 URL: https://issues.apache.org/jira/browse/NIFI-5285
 Project: Apache NiFi
  Issue Type: Task
  Components: Documentation & Website
Affects Versions: 1.6.0
Reporter: Andy LoPresto


There are some bcrypt, SCrypt, and PBKDF2 initial parameters which were 
determined to be secure against a default threat model given best known attacks 
in 2016. These should be re-evaluated for 2018. 

Administration Guide
* Line 1303
* Line 1311
* Line 1321
* Line 1637





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (NIFI-5006) Update docs to reflect 2018 where applicable

2018-06-08 Thread Andy LoPresto (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andy LoPresto reassigned NIFI-5006:
---

Assignee: Andy LoPresto

> Update docs to reflect 2018 where applicable
> 
>
> Key: NIFI-5006
> URL: https://issues.apache.org/jira/browse/NIFI-5006
> Project: Apache NiFi
>  Issue Type: Task
>Reporter: Aldrin Piri
>Assignee: Andy LoPresto
>Priority: Major
> Fix For: 1.7.0
>
>
> While reviewing the RC1 for NiFi 1.6.0 I noticed that docs have not been 
> updated to reflect the new year.  We should update these when handling our 
> next release.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5044) SelectHiveQL accept only one statement

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506567#comment-16506567
 ] 

ASF GitHub Bot commented on NIFI-5044:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2695#discussion_r194183320
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/test/java/org/apache/nifi/processors/hive/TestSelectHiveQL.java
 ---
@@ -198,6 +200,51 @@ public void testWithSqlException() throws SQLException 
{
 runner.assertAllFlowFilesTransferred(SelectHiveQL.REL_FAILURE, 1);
 }
 
+@Test
+public void invokeOnTriggerExceptionInPreQieriesNoIncomingFlows()
+throws InitializationException, ClassNotFoundException, 
SQLException, IOException {
+
+doOnTrigger(QUERY_WITHOUT_EL, false, CSV,
+"select 'no exception' from persons; select exception from 
persons",
+null);
+
+runner.assertAllFlowFilesTransferred(SelectHiveQL.REL_FAILURE, 1);
--- End diff --

I must have been thinking of some other processor or perhaps the behavior 
had changed at some point, sorry about that. As long as it behaves the same way 
it used to with respect to where/if FFs get transferred, then I'm good :)


> SelectHiveQL accept only one statement
> --
>
> Key: NIFI-5044
> URL: https://issues.apache.org/jira/browse/NIFI-5044
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.2.0, 1.3.0, 1.4.0, 1.5.0, 1.6.0
>Reporter: Davide Isoardi
>Assignee: Ed Berezitsky
>Priority: Critical
>  Labels: features, patch, pull-request-available
> Attachments: 
> 0001-NIFI-5044-SelectHiveQL-accept-only-one-statement.patch
>
>
> [This commit|https://github.com/apache/nifi/commit/bbc714e73ba245de7bc32fd9958667c847101f7d] 
> claims to add support for running multiple statements to both SelectHiveQL and 
> PutHiveQL; instead, it only adds that support to PutHiveQL, so SelectHiveQL still 
> lacks this important feature. @Matt Burgess, I saw that you worked on that, is 
> there any reason for this? If not, can we support it?
> If I try to execute this query:
> {quote}set hive.vectorized.execution.enabled = false; SELECT * FROM table_name
> {quote}
> I have this error:
>  
> {quote}2018-04-05 13:35:40,572 ERROR [Timer-Driven Process Thread-146] 
> o.a.nifi.processors.hive.SelectHiveQL 
> SelectHiveQL[id=243d4c17-b1fe-14af--ee8ce15e] Unable to execute 
> HiveQL select query set hive.vectorized.execution.enabled = false; SELECT * 
> FROM table_name for 
> StandardFlowFileRecord[uuid=0e035558-07ce-473b-b0d4-ac00b8b1df93,claim=StandardContentClaim
>  [resourceClaim=StandardResourceClaim[id=1522824912161-2753, 
> container=default, section=705], offset=838441, 
> length=25],offset=0,name=cliente_attributi.csv,size=25] due to 
> org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: 
> The query did not generate a result set!; routing to failure: {}
>  org.apache.nifi.processor.exception.ProcessException: java.sql.SQLException: 
> The query did not generate a result set!
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL$2.process(SelectHiveQL.java:305)
>  at 
> org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2529)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL.onTrigger(SelectHiveQL.java:275)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL.lambda$onTrigger$0(SelectHiveQL.java:215)
>  at 
> org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
>  at 
> org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:106)
>  at 
> org.apache.nifi.processors.hive.SelectHiveQL.onTrigger(SelectHiveQL.java:215)
>  at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>  at 
> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>  at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>  at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>  at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> 

[GitHub] nifi pull request #2695: NIFI-5044 SelectHiveQL accept only one statement

2018-06-08 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2695#discussion_r194183320
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/test/java/org/apache/nifi/processors/hive/TestSelectHiveQL.java
 ---
@@ -198,6 +200,51 @@ public void testWithSqlException() throws SQLException 
{
 runner.assertAllFlowFilesTransferred(SelectHiveQL.REL_FAILURE, 1);
 }
 
+@Test
+public void invokeOnTriggerExceptionInPreQieriesNoIncomingFlows()
+throws InitializationException, ClassNotFoundException, 
SQLException, IOException {
+
+doOnTrigger(QUERY_WITHOUT_EL, false, CSV,
+"select 'no exception' from persons; select exception from 
persons",
+null);
+
+runner.assertAllFlowFilesTransferred(SelectHiveQL.REL_FAILURE, 1);
--- End diff --

I must have been thinking of some other processor or perhaps the behavior 
had changed at some point, sorry about that. As long as it behaves the same way 
it used to with respect to where/if FFs get transferred, then I'm good :)


---
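The feature requested in NIFI-5044 boils down to splitting the incoming script and executing every statement except the last for its side effects, with only the final statement expected to return rows. A minimal JDBC sketch of that idea, assuming a naive split on ";" (a real implementation would also need to handle semicolons inside string literals):

{code:java}
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class MultiStatementSelectSketch {
    // Run scripts such as "set hive.vectorized.execution.enabled = false; SELECT * FROM table_name":
    // the leading statements are executed for their side effects, the last one returns the rows.
    static ResultSet runScript(Connection conn, String script) throws SQLException {
        String[] statements = script.split(";");
        Statement stmt = conn.createStatement();
        for (int i = 0; i < statements.length - 1; i++) {
            String sql = statements[i].trim();
            if (!sql.isEmpty()) {
                stmt.execute(sql);
            }
        }
        return stmt.executeQuery(statements[statements.length - 1].trim());
    }
}
{code}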


[jira] [Updated] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-08 Thread Matt Burgess (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-5200:
---
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Blocker
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional parameter to session.read of 
> allowSessionStreamManagement=true.
> Is this expected that nested reads used in this way will not work?
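Continuing the snippet above, a minimal sketch of the workaround the reporter mentions, assuming the ProcessSession.read overload that accepts the allowSessionStreamManagement flag:

{code:java}
// Passing allowSessionStreamManagement=true asks the session not to close the
// outer stream when the nested read on ff2 is opened.
session.read(ff1, true, (in1) -> {
    int a = in1.read();
    session.read(ff2, true, (in2) -> { int c = in2.read(); });
    int b = in1.read();   // the outer stream should still be open here
});
{code}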



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506547#comment-16506547
 ] 

ASF GitHub Bot commented on NIFI-5200:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2753


> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Blocker
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional parameter to session.read of 
> allowSessionStreamManagement=true.
> Is this expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-08 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506546#comment-16506546
 ] 

ASF subversion and git services commented on NIFI-5200:
---

Commit 00a63d17af3c82727b9119acb00fccfcf6639fc5 in nifi's branch 
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=00a63d1 ]

NIFI-5200: Fixed issue with InputStream being closed when calling 
ProcessSession.read() twice against sequential Content Claims

Signed-off-by: Matthew Burgess 

This closes #2753


> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Blocker
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional parameter to session.read of 
> allowSessionStreamManagement=true.
> Is this expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5200) Nested ProcessSession.read resulting in outer stream being closed.

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506545#comment-16506545
 ] 

ASF GitHub Bot commented on NIFI-5200:
--

Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2753
  
+1 LGTM, verified unit tests illustrate and validate the fix. Thanks for 
the fix, merging to master.


> Nested ProcessSession.read resulting in outer stream being closed.
> --
>
> Key: NIFI-5200
> URL: https://issues.apache.org/jira/browse/NIFI-5200
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Peter Radden
>Assignee: Mark Payne
>Priority: Blocker
> Fix For: 1.7.0
>
>
> Consider this example processor:
> {code:java}
> FlowFile ff1 = session.write(session.create(),
> (out) -> { out.write(new byte[]{ 'A', 'B' }); });
> FlowFile ff2 = session.write(session.create(),
> (out) -> { out.write('C'); });
> session.read(ff1,
> (in1) -> {
> int a = in1.read();
> session.read(ff2, (in2) -> { int c = in2.read(); });
> int b = in1.read();
> });
> session.transfer(ff1, REL_SUCCESS);
> session.transfer(ff2, REL_SUCCESS);{code}
> The expectation is that a='A', b='B' and c='C'.
> The actual result is that the final call to in1.read() throws due to the 
> underlying stream being closed by the previous session.read on ff2.
> A workaround seems to be to pass the optional parameter to session.read of 
> allowSessionStreamManagement=true.
> Is this expected that nested reads used in this way will not work?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2753: NIFI-5200: Fixed issue with InputStream being close...

2018-06-08 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2753


---


[GitHub] nifi issue #2753: NIFI-5200: Fixed issue with InputStream being closed when ...

2018-06-08 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2753
  
+1 LGTM, verified unit tests illustrate and validate the fix. Thanks for 
the fix, merging to master.


---


[jira] [Created] (NIFI-5284) RunMongoAggregation uses ObjectIdSerializer & SimpleDateFormat

2018-06-08 Thread Zambonilli (JIRA)
Zambonilli created NIFI-5284:


 Summary: RunMongoAggregation uses ObjectIdSerializer & 
SimpleDateFormat
 Key: NIFI-5284
 URL: https://issues.apache.org/jira/browse/NIFI-5284
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Affects Versions: 1.6.0
Reporter: Zambonilli


The RunMongoAggregation processor uses Jackson to serialize the document to 
JSON. However, Jackson's default serialization of Mongo ObjectIds and dates 
leaves a lot to be desired. ObjectIds are serialized as the decimal 
representation of each component of the ObjectId instead of the hex string of 
the full byte array, and Mongo dates are serialized as Unix time instead of an 
ISO 8601 Zulu string.

It looks like the GetMongo processor already sets the correct serializer flags 
on Jackson to fix this. The fix for GetMongo is here: 
https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-mongodb-bundle/nifi-mongodb-processors/src/main/java/org/apache/nifi/processors/mongodb/GetMongo.java#L213
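One way to get hex-string ObjectIds and ISO 8601 dates out of Jackson is to register a custom ObjectId serializer and disable timestamp-style date output. A hypothetical sketch of that configuration (not necessarily the exact flags GetMongo sets):

{code:java}
import java.io.IOException;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;
import org.bson.types.ObjectId;

public class MongoJsonMapperSketch {
    static ObjectMapper mapper() {
        SimpleModule module = new SimpleModule();
        // Emit ObjectIds as their 24-character hex form instead of a struct of ints.
        module.addSerializer(ObjectId.class, new JsonSerializer<ObjectId>() {
            @Override
            public void serialize(ObjectId value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
                gen.writeString(value.toHexString());
            }
        });
        return new ObjectMapper()
                .registerModule(module)
                // Dates become ISO 8601 strings rather than epoch milliseconds.
                .disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    }
}
{code}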



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (NIFI-5275) PostHTTP - Hung connections and zero reuse of existing connections

2018-06-08 Thread Michael Moser (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael Moser reassigned NIFI-5275:
---

Assignee: Michael Moser

> PostHTTP - Hung connections and zero reuse of existing connections
> --
>
> Key: NIFI-5275
> URL: https://issues.apache.org/jira/browse/NIFI-5275
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.6.0
>Reporter: Steven Youtsey
>Assignee: Michael Moser
>Priority: Major
>
> Connection setups, the HEAD request, and the DELETE request do not have any 
> timeout associated with them. When the remote server goes sideways, these 
> actions will wait indefinitely and appear to be hung. See 
> https://issues.apache.org/jira/browse/HTTPCLIENT-1892 for an explanation as 
> to why the initial connection setups are not timing out.
> Connections, though pooled, are not being re-used. A new connection is 
> established for every POST. This creates a burden on highly loaded remote 
> listener servers. Verified both with netstat and by turning on DEBUG logging 
> for org.apache.http.impl.conn.
>  
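For the HEAD and DELETE calls, the usual way to bound the wait with Apache HttpClient 4.x is a per-request RequestConfig; a minimal sketch follows (timeout values and URL are placeholders, and per HTTPCLIENT-1892 the initial TLS connection setup may still need separate handling):

{code:java}
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpHead;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class RequestTimeoutSketch {
    public static void main(String[] args) throws Exception {
        RequestConfig config = RequestConfig.custom()
                .setConnectTimeout(30_000)            // TCP connect
                .setSocketTimeout(30_000)             // max idle time between packets
                .setConnectionRequestTimeout(30_000)  // wait for a pooled connection
                .build();

        HttpHead head = new HttpHead("https://example.com/contentListener");
        head.setConfig(config);

        try (CloseableHttpClient client = HttpClients.createDefault();
             CloseableHttpResponse response = client.execute(head)) {
            System.out.println(response.getStatusLine());
        }
    }
}
{code}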



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-fds pull request #5: add flow_design_styles.css to demo-app index.html

2018-06-08 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi-fds/pull/5


---


[jira] [Commented] (NIFI-4508) AMQP Processor that uses basicConsume

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506490#comment-16506490
 ] 

ASF GitHub Bot commented on NIFI-4508:
--

GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2774

[WIP] NIFI-4508: Update ConsumeAMQP to use basicConsume API instead of 
basicGet in order to provide better performance


This PR is not ready to be merged. However, I wanted to go ahead and create 
a WIP PR so that those who are interested can look at, test, and provide 
feedback.

Currently, the unit tests do not work because the mocks don't implement all 
of the appropriate methods. As a result, they are @Ignore'd for now.

Also, more testing needs to be done to ensure that we are properly 
performing rollbacks of unacknowledged messages when the processor is stopped.



Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-4508

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2774.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2774


commit d16105ddd57cea38e9866b6a59fab9d806418a5a
Author: Mark Payne 
Date:   2018-06-08T19:37:17Z

NIFI-4508: Update ConsumeAMQP to use basicConsume API instead of basicGet 
in order to provide better performance




> AMQP Processor that uses basicConsume
> -
>
> Key: NIFI-4508
> URL: https://issues.apache.org/jira/browse/NIFI-4508
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.3.0, 1.4.0, 1.5.0, 1.6.0, 1.7.0
>Reporter: Randy Bovay
>Assignee: Mark Payne
>Priority: Major
>
> Due to poor performance of the AMQP Processor, we need to be able to have a 
> basicConsume based interface to RabbitMQ.
> https://community.hortonworks.com/questions/66799/consumeamqp-performance-issue-less-than-50-msgs-se.html
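The API difference behind the ticket: basicGet polls the broker with one round trip per message, while basicConsume registers a callback and lets the broker push deliveries. A minimal sketch with the RabbitMQ Java client (queue name, buffering, and acknowledgement strategy are hypothetical, not the processor's actual design):

{code:java}
import java.util.concurrent.BlockingQueue;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.DeliverCallback;
import com.rabbitmq.client.Delivery;
import com.rabbitmq.client.GetResponse;

public class ConsumeStyleSketch {
    // Polling style: one broker round trip per message.
    static byte[] poll(Channel channel, String queue) throws Exception {
        GetResponse response = channel.basicGet(queue, false);
        return response == null ? null : response.getBody();
    }

    // Push style: the broker delivers into a local buffer; onTrigger-style code can
    // drain the buffer and acknowledge by delivery tag (or reject on shutdown).
    static void subscribe(Channel channel, String queue, BlockingQueue<Delivery> buffer) throws Exception {
        DeliverCallback onDeliver = (consumerTag, delivery) -> buffer.offer(delivery);
        channel.basicConsume(queue, false, onDeliver, consumerTag -> { });
    }
}
{code}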



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2774: [WIP] NIFI-4508: Update ConsumeAMQP to use basicCon...

2018-06-08 Thread markap14
GitHub user markap14 opened a pull request:

https://github.com/apache/nifi/pull/2774

[WIP] NIFI-4508: Update ConsumeAMQP to use basicConsume API instead of 
basicGet in order to provide better performance


This PR is not ready to be merged. However, I wanted to go ahead and create 
a WIP PR so that those who are interested can look at, test, and provide 
feedback.

Currently, the unit tests do not work because the mocks don't implement all 
of the appropriate methods. As a result, they are @Ignore'd for now.

Also, more testing needs to be done to ensure that we are properly 
performing rollbacks of unacknowledged messages when the processor is stopped.



Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/markap14/nifi NIFI-4508

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2774.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2774


commit d16105ddd57cea38e9866b6a59fab9d806418a5a
Author: Mark Payne 
Date:   2018-06-08T19:37:17Z

NIFI-4508: Update ConsumeAMQP to use basicConsume API instead of basicGet 
in order to provide better performance




---


[jira] [Assigned] (NIFI-4508) AMQP Processor that uses basicConsume

2018-06-08 Thread Mark Payne (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4508?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne reassigned NIFI-4508:


Assignee: Mark Payne

> AMQP Processor that uses basicConsume
> -
>
> Key: NIFI-4508
> URL: https://issues.apache.org/jira/browse/NIFI-4508
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core Framework
>Affects Versions: 1.3.0, 1.4.0, 1.5.0, 1.6.0, 1.7.0
>Reporter: Randy Bovay
>Assignee: Mark Payne
>Priority: Major
>
> Due to poor performance of the AMQP Processor, we need to be able to have a 
> basicConsume based interface to RabbitMQ.
> https://community.hortonworks.com/questions/66799/consumeamqp-performance-issue-less-than-50-msgs-se.html



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi-fds pull request #5: add flow_design_styles.css to demo-app index.html

2018-06-08 Thread scottyaslan
GitHub user scottyaslan opened a pull request:

https://github.com/apache/nifi-fds/pull/5

add flow_design_styles.css to demo-app index.html

Thank you for submitting a contribution to Apache NiFi Flow Design System.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] Have you ensured that a full build and that the full suite of unit 
tests is executed via npm run clean:install at the root nifi-fds folder?
- [ ] Have you written or updated the Apache NiFi Flow Design System demo 
application to demonstrate any new functionality, provide examples of usage, 
and to verify your changes via npm start at the nifi-fds/target folder?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-fds?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-fds?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/scottyaslan/nifi-fds NIFI-5283

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi-fds/pull/5.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5


commit 300d2d678d065ebad2b61699005759658d265f2d
Author: Scott Aslan 
Date:   2018-06-08T19:32:43Z

add flow_design_styles.css to demo-app index.html




---


[jira] [Created] (NIFI-5283) Add flow design system core styles to demo-app

2018-06-08 Thread Scott Aslan (JIRA)
Scott Aslan created NIFI-5283:
-

 Summary: Add flow design system core styles to demo-app
 Key: NIFI-5283
 URL: https://issues.apache.org/jira/browse/NIFI-5283
 Project: Apache NiFi
  Issue Type: Bug
  Components: FDS
Reporter: Scott Aslan
Assignee: Scott Aslan
 Fix For: fds-0.1






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506475#comment-16506475
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157756
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractive.java
 ---
@@ -85,16 +85,17 @@ private static TestServer createServer() throws 
IOException {
 
 @Test
 public void testSparkSession() throws Exception {
-
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

Thanks for the review @mgaido91. I made a minor refactor to the tests and 
separated the cases.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript() which results 
> quotes to be escaped as \'. This breaks JSON payload of the Livy REST API.
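A minimal sketch of the failure mode and of one safe alternative, letting a JSON library do the escaping so that only what JSON requires is escaped (illustrative only, not necessarily the exact change in the pull request):

{code:java}
import org.apache.commons.lang.StringEscapeUtils;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class LivyPayloadSketch {
    public static void main(String[] args) {
        String code = "val s = \"it's a test\"";

        // escapeJavaScript turns ' into \', which is not legal inside a JSON string:
        String escaped = StringEscapeUtils.escapeJavaScript(code);
        System.out.println("{\"code\": \"" + escaped + "\"}");   // broken payload

        // Serializing the value with a JSON library escapes only what JSON needs:
        ObjectNode payload = new ObjectMapper().createObjectNode();
        payload.put("code", code);
        System.out.println(payload.toString());                  // valid payload
    }
}
{code}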



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506474#comment-16506474
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157588
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractiveSSL.java
 ---
@@ -109,13 +109,15 @@ private static TestServer createServer() throws 
IOException {
 public void testSslSparkSession() throws Exception {
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

Removed.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript() which results 
> quotes to be escaped as \'. This breaks JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506472#comment-16506472
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157501
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, 
HttpServletRequest reques
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", 
\"state\": \"idle\"}";
-} else if 
("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
--- End diff --

Fixed.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript() which results 
> quotes to be escaped as \'. This breaks JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread peter-toth
Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157756
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractive.java
 ---
@@ -85,16 +85,17 @@ private static TestServer createServer() throws 
IOException {
 
 @Test
 public void testSparkSession() throws Exception {
-
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

Thanks for the review @mgaido91. I made a minor refactor to the tests and 
separated the cases.


---


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506473#comment-16506473
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157547
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
+
+new ObjectMapper().readTree(requestBody);
--- End diff --

Done.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript(), which results in 
> quotes being escaped as \'. This breaks the JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread peter-toth
Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157547
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
+
+new ObjectMapper().readTree(requestBody);
--- End diff --

Done.


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread peter-toth
Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157588
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractiveSSL.java
 ---
@@ -109,13 +109,15 @@ private static TestServer createServer() throws IOException {
 public void testSslSparkSession() throws Exception {
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

Removed.


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread peter-toth
Github user peter-toth commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194157501
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
--- End diff --

Fixed.


---


[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506353#comment-16506353
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194140856
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/ListS3.java
 ---
@@ -307,6 +328,20 @@ private boolean commit(final ProcessContext context, final ProcessSession session
 return willCommit;
 }
 
+private Map writeObjectTags(AmazonS3 client, S3VersionSummary versionSummary) {
+final GetObjectTaggingResult taggingResult = client.getObjectTagging(new GetObjectTaggingRequest(versionSummary.getBucketName(), versionSummary.getKey()));
--- End diff --

I agree with @pvillard31 that it should be off by default.  From comments 
on the users/developer email lists, I understand ListS3 is used to process very 
large lists of objects, easily 10,000+ on a regular basis.  Even if the 
additional API calls are quick, they will add up to a lot of API calls.

Unfortunately, it does not look like the 
[S3ObjectSummary](https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/model/S3ObjectSummary.html)
 returned by the listing contains any hints on the number of tags present, if 
any.
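
For scale, a rough sketch of the per-object tagging call under discussion, using the AWS SDK for Java v1 (the helper name fetchTags is illustrative, not the PR's actual code):

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.model.GetObjectTaggingRequest;
    import com.amazonaws.services.s3.model.GetObjectTaggingResult;
    import com.amazonaws.services.s3.model.Tag;

    import java.util.HashMap;
    import java.util.Map;

    public class TagFetcher {
        // One GetObjectTagging request per listed object: for listings of 10,000+
        // keys that is one extra API call per key, which is why this should be opt-in.
        static Map<String, String> fetchTags(AmazonS3 client, String bucket, String key) {
            GetObjectTaggingResult result =
                    client.getObjectTagging(new GetObjectTaggingRequest(bucket, key));
            Map<String, String> tags = new HashMap<>();
            for (Tag tag : result.getTagSet()) {
                tags.put(tag.getKey(), tag.getValue());
            }
            return tags;
        }
    }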


> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced new set of functionalities that enable the S3 bucket and 
> objects to be tagged. This can be useful for data classification purposes and 
> with new regulatory process related to data are being introduced such as 
> GDPR, object tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2751: NIFI-5221: Added 'Object Tagging' functionalities t...

2018-06-08 Thread jvwing
Github user jvwing commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194140856
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/ListS3.java
 ---
@@ -307,6 +328,20 @@ private boolean commit(final ProcessContext context, final ProcessSession session
 return willCommit;
 }
 
+private Map writeObjectTags(AmazonS3 client, S3VersionSummary versionSummary) {
+final GetObjectTaggingResult taggingResult = client.getObjectTagging(new GetObjectTaggingRequest(versionSummary.getBucketName(), versionSummary.getKey()));
--- End diff --

I agree with @pvillard31 that it should be off by default.  From comments 
on the users/developer email lists, I understand ListS3 is used to process very 
large lists of objects, easily 10,000+ on a regular basis.  Even if the 
additional API calls are quick, they will add up to a lot of API calls.

Unfortunately, it does not look like the 
[S3ObjectSummary](https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/model/S3ObjectSummary.html)
 returned by the listing contains any hints on the number of tags present, if 
any.


---


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506199#comment-16506199
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194108434
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractiveSSL.java
 ---
@@ -109,13 +109,15 @@ private static TestServer createServer() throws IOException {
 public void testSslSparkSession() throws Exception {
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

I think that adding a UT for the non-SSL case is enough, isn't it? There is 
no difference between the SSL and non-SSL paths regarding escaping and the content, IIUC.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript(), which results in 
> quotes being escaped as \'. This breaks the JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506196#comment-16506196
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194108030
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractive.java
 ---
@@ -85,16 +85,17 @@ private static TestServer createServer() throws IOException {
 
 @Test
 public void testSparkSession() throws Exception {
-
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

Instead of changing the existing UT, what about creating a new one for this 
specific case? It is good for every UT to have a very narrow scope so that failures 
can clearly indicate to the developer what he/she broke when applying a patch...
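
A sketch of what such a narrowly scoped test could look like with the NiFi mock framework, reusing the runner field and addHandler helper shown in the diff above (the test name and payload here are illustrative, not the code that was eventually committed):

    @Test
    public void testSparkSessionWithSpecialCharactersInCode() throws Exception {
        addHandler(new LivyAPIHandler());

        // Only the special-character payload is exercised here, so a failure points
        // directly at the JSON escaping of the "code" parameter.
        runner.enqueue("print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\");
        runner.run();
        // Relationship assertions would follow, as in the existing testSparkSession().
    }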


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript(), which results in 
> quotes being escaped as \'. This breaks the JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506198#comment-16506198
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194106518
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
--- End diff --

I think this is a leftover from your tests and should be removed.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript(), which results in 
> quotes being escaped as \'. This breaks the JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506197#comment-16506197
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194107658
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
+
+new ObjectMapper().readTree(requestBody);
--- End diff --

Could you please add a comment explaining what you are doing here and why? It 
is clear in the context of this PR, but for future 
readers I think a comment would be very helpful.
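
For example, the requested comment might read roughly like this (a sketch, not the wording that was committed; ObjectMapper is Jackson's com.fasterxml.jackson.databind.ObjectMapper):

    // The mock Livy endpoint must receive syntactically valid JSON; readTree()
    // throws if the escaped "code" parameter broke the payload, which is exactly
    // the regression NIFI-5278 guards against.
    new ObjectMapper().readTree(requestBody);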


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript(), which results in 
> quotes being escaped as \'. This breaks the JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194107658
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
+
+new ObjectMapper().readTree(requestBody);
--- End diff --

Could you please add a comment explaining what you are doing here and why? It 
is clear in the context of this PR, but for future 
readers I think a comment would be very helpful.


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194106518
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/ExecuteSparkInteractiveTestBase.java
 ---
@@ -64,33 +66,37 @@ public void handle(String target, Request baseRequest, HttpServletRequest request
 }
 session1Requests++;
 }
-
-response.setContentLength(responseBody.length());
-
-try (PrintWriter writer = response.getWriter()) {
-writer.print(responseBody);
-writer.flush();
-}
-
 } else if ("POST".equalsIgnoreCase(request.getMethod())) {
-
-String responseBody = "{}";
-response.setContentType("application/json");
-
-if ("/sessions".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 1, \"kind\": \"spark\", \"state\": \"idle\"}";
-} else if ("/sessions/1/statements".equalsIgnoreCase(target)) {
-responseBody = "{\"id\": 7}";
+String requestBody = IOUtils.toString(request.getReader());
+try {
+System.out.println("requestBody: " + requestBody);
--- End diff --

I think this is a leftover from your tests and should be removed.


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194108030
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractive.java
 ---
@@ -85,16 +85,17 @@ private static TestServer createServer() throws IOException {
 
 @Test
 public void testSparkSession() throws Exception {
-
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

Instead of changing the existing UT, what about creating a new one for this 
specific case? It is good for every UT to have a very narrow scope so that failures 
can clearly indicate to the developer what he/she broke when applying a patch...


---


[GitHub] nifi pull request #2768: NIFI-5278: fixes JSON escaping of code parameter in...

2018-06-08 Thread mgaido91
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2768#discussion_r194108434
  
--- Diff: 
nifi-nar-bundles/nifi-spark-bundle/nifi-livy-processors/src/test/java/org/apache/nifi/processors/livy/TestExecuteSparkInteractiveSSL.java
 ---
@@ -109,13 +109,15 @@ private static TestServer createServer() throws IOException {
 public void testSslSparkSession() throws Exception {
 addHandler(new LivyAPIHandler());
 
-runner.enqueue("print \"hello world\"");
+String code = "print \"hello world\" //'?!<>[]{}()$&*=%;.|_-\\";
--- End diff --

I think that adding a UT for the non-SSL case is enough, isn't it? There is 
no difference between the SSL and non-SSL paths regarding escaping and the content, IIUC.


---


[jira] [Updated] (NIFI-4262) MergeContent - option to add merged uuid in original flow files

2018-06-08 Thread Sivaprasanna Sethuraman (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-4262?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sivaprasanna Sethuraman updated NIFI-4262:
--
   Resolution: Fixed
Fix Version/s: 1.7.0
   Status: Resolved  (was: Patch Available)

> MergeContent - option to add merged uuid in original flow files
> ---
>
> Key: NIFI-4262
> URL: https://issues.apache.org/jira/browse/NIFI-4262
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
> Fix For: 1.7.0
>
>
> With the introduction of the Wait/Notify processors it is now possible to 
> tackle new challenges when it comes to synchronizing the execution of different 
> parts of the workflow.
> The objective here is the following:
> Flow files are sent to a MergeContent processor. Merged flow files are then 
> sent to a processor A while original flow files are sent to a processor B. I 
> want to trigger processor B when and only when processor A has completed.
> To use the Wait/Notify approach, a common attribute must be available to be 
> used as a signal in the distributed cache. This JIRA is about adding a 
> processor property allowing a user to add the UUID of the merged flow file as 
> a new attribute of all the original flow files that are constituting the 
> merged flow file.
> The template attached to NIFI-4028 can be used for this use case. Note that 
> the fix for NIFI-4028 is needed to solve the use case described in this JIRA.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4262) MergeContent - option to add merged uuid in original flow files

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4262?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506146#comment-16506146
 ] 

ASF GitHub Bot commented on NIFI-4262:
--

Github user zenfenan commented on the issue:

https://github.com/apache/nifi/pull/2056
  
Pulled it and built it locally. Ran the template attached in 
[NIFI-4028](https://issues.apache.org/jira/browse/NIFI-4028) and verified it 
works as expected. Thanks @pvillard31 @markap14 @mattyb149. Merged to 
master. +1


> MergeContent - option to add merged uuid in original flow files
> ---
>
> Key: NIFI-4262
> URL: https://issues.apache.org/jira/browse/NIFI-4262
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> With the introduction of the Wait/Notify processors it is now possible to 
> tackle new challenges when it comes to synchronizing the execution of different 
> parts of the workflow.
> The objective here is the following:
> Flow files are sent to a MergeContent processor. Merged flow files are then 
> sent to a processor A while original flow files are sent to a processor B. I 
> want to trigger processor B when and only when processor A has completed.
> To use the Wait/Notify approach, a common attribute must be available to be 
> used as a signal in the distributed cache. This JIRA is about adding a 
> processor property allowing a user to add the UUID of the merged flow file as 
> a new attribute of all the original flow files that are constituting the 
> merged flow file.
> The template attached to NIFI-4028 can be used for this use case. Note that 
> the fix for NIFI-4028 is needed to solve the use case described in this JIRA.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2056: NIFI-4262 - MergeContent - option to add merged uuid in or...

2018-06-08 Thread zenfenan
Github user zenfenan commented on the issue:

https://github.com/apache/nifi/pull/2056
  
Pulled it and built it locally. Ran the template attached in 
[NIFI-4028](https://issues.apache.org/jira/browse/NIFI-4028) and verified it 
works as expected. Thanks @pvillard31 @markap14 @mattyb149. Merged to 
master. +1


---


[jira] [Commented] (NIFI-4262) MergeContent - option to add merged uuid in original flow files

2018-06-08 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4262?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506139#comment-16506139
 ] 

ASF subversion and git services commented on NIFI-4262:
---

Commit 05d7b6c6e7d640a300b580724e460d45eaed1938 in nifi's branch 
refs/heads/master from [~pvillard]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=05d7b6c ]

NIFI-4262 - MergeContent - option to add merged uuid in original flow files

This closes #2056

Signed-off-by: zenfenan 


> MergeContent - option to add merged uuid in original flow files
> ---
>
> Key: NIFI-4262
> URL: https://issues.apache.org/jira/browse/NIFI-4262
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> With the introduction of the Wait/Notify processors it is now possible to 
> tackle new challenges when it comes to synchronizing the execution of different 
> parts of the workflow.
> The objective here is the following:
> Flow files are sent to a MergeContent processor. Merged flow files are then 
> sent to a processor A while original flow files are sent to a processor B. I 
> want to trigger processor B when and only when processor A has completed.
> To use the Wait/Notify approach, a common attribute must be available to be 
> used as a signal in the distributed cache. This JIRA is about adding a 
> processor property allowing a user to add the UUID of the merged flow file as 
> a new attribute of all the original flow files that are constituting the 
> merged flow file.
> The template attached to NIFI-4028 can be used for this use case. Note that 
> the fix for NIFI-4028 is needed to solve the use case described in this JIRA.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4262) MergeContent - option to add merged uuid in original flow files

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4262?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506142#comment-16506142
 ] 

ASF GitHub Bot commented on NIFI-4262:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2056


> MergeContent - option to add merged uuid in original flow files
> ---
>
> Key: NIFI-4262
> URL: https://issues.apache.org/jira/browse/NIFI-4262
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Major
>
> With the introduction of the Wait/Notify processors it is now possible to 
> tackle new challenges when it comes to synchronizing the execution of different 
> parts of the workflow.
> The objective here is the following:
> Flow files are sent to a MergeContent processor. Merged flow files are then 
> sent to a processor A while original flow files are sent to a processor B. I 
> want to trigger processor B when and only when processor A has completed.
> To use the Wait/Notify approach, a common attribute must be available to be 
> used as a signal in the distributed cache. This JIRA is about adding a 
> processor property allowing a user to add the UUID of the merged flow file as 
> a new attribute of all the original flow files that are constituting the 
> merged flow file.
> The template attached to NIFI-4028 can be used for this use case. Note that 
> the fix for NIFI-4028 is needed to solve the use case described in this JIRA.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2056: NIFI-4262 - MergeContent - option to add merged uui...

2018-06-08 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2056


---


[jira] [Created] (NIFI-5282) GCPProcessor with HTTP Proxy with Authentication

2018-06-08 Thread Julian Gimbel (JIRA)
Julian Gimbel created NIFI-5282:
---

 Summary: GCPProcessor with HTTP Proxy with Authentication
 Key: NIFI-5282
 URL: https://issues.apache.org/jira/browse/NIFI-5282
 Project: Apache NiFi
  Issue Type: Improvement
Affects Versions: 1.6.0
Reporter: Julian Gimbel


The [AbstractGCPProcessor 
|https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java]
 already accepts HTTP proxy settings, but it would be even better if it also 
accepted authenticated proxies with a user and password.

In the best case it would support the ProxyService introduced in 
[NIFI-4199|https://issues.apache.org/jira/projects/NIFI/issues/NIFI-4199] and 
all of its options.
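
For reference, a generic JVM-level sketch of what authenticated HTTP proxying involves, independent of the NiFi ProxyConfiguration API (all names here are illustrative):

    import java.net.Authenticator;
    import java.net.InetSocketAddress;
    import java.net.PasswordAuthentication;
    import java.net.Proxy;

    public class AuthenticatedProxyExample {
        public static Proxy buildProxy(String host, int port, String user, String password) {
            // Answer HTTP 407 challenges issued by the proxy with the supplied credentials.
            Authenticator.setDefault(new Authenticator() {
                @Override
                protected PasswordAuthentication getPasswordAuthentication() {
                    if (getRequestorType() == RequestorType.PROXY) {
                        return new PasswordAuthentication(user, password.toCharArray());
                    }
                    return null;
                }
            });
            return new Proxy(Proxy.Type.HTTP, new InetSocketAddress(host, port));
        }
    }

A proxy controller service such as the one from NIFI-4199 would carry the host, port, user, and password so individual processors do not have to manage JVM-global state like Authenticator.setDefault().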



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5155) Bulletins do not include IP/hostname information

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506052#comment-16506052
 ] 

ASF GitHub Bot commented on NIFI-5155:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2773
  
I'm not exactly sure why the Bulletin itself has a `nodeAddress` field. 
However, I can offer that the corresponding BulletinDTO contains a 
`nodeAddress` to help differentiate between which node in the cluster is 
reporting the message. This is populated when the responses are merged at the 
cluster coordinator. In standalone mode, there is only a single instance so 
this field does not need to be populated. This is also how component 
`validationErrors` work.


> Bulletins do not include IP/hostname information
> 
>
> Key: NIFI-5155
> URL: https://issues.apache.org/jira/browse/NIFI-5155
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> There is an API call on Bulletin for setNodeAddress(), which is meant to set 
> the node on which a bulletin/error has occurred. The 
> SiteToSiteBulletinReportingTask uses getNodeAddress() to add said field to 
> the outgoing records (if available). However, the framework is not calling 
> setNodeAddress() anywhere, which results in the field being missing from the 
> outgoing records.
> NiFi should add the appropriate node address information to the outgoing 
> bulletin information.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2773: NIFI-5155: Add host address info to bulletins

2018-06-08 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2773
  
I'm not exactly sure why the Bulletin itself has a `nodeAddress` field. 
However, I can offer that the corresponding BulletinDTO contains a 
`nodeAddress` to help differentiate between which node in the cluster is 
reporting the message. This is populated when the responses are merged at the 
cluster coordinator. In standalone mode, there is only a single instance so 
this field does not need to be populated. This is also how component 
`validationErrors` work.


---


[jira] [Updated] (NIFI-5264) Add parsing failure message in ValidateCSV

2018-06-08 Thread Sivaprasanna Sethuraman (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5264?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sivaprasanna Sethuraman updated NIFI-5264:
--
   Resolution: Fixed
Fix Version/s: 1.7.0
   Status: Resolved  (was: Patch Available)

> Add parsing failure message in ValidateCSV 
> ---
>
> Key: NIFI-5264
> URL: https://issues.apache.org/jira/browse/NIFI-5264
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0
>Reporter: bsf
>Assignee: Pierre Villard
>Priority: Major
> Fix For: 1.7.0
>
>
> As a developer I would like to see an improvement to the ValidateCSV 
> component when using the line-by-line validation strategy. It would be nice to 
> have an option to append one or two new fields to the flow file routed to the 
> invalid relationship:
>  * field_name : the name of the field that failed the schema validation
>  * field_description : the description of the validation error
> This will greatly help the user understand the validation issue on each line.
> If that is too complex, anything that provides information on the failed 
> validation of the line against the schema will be more than welcome :)
> Pentaho DI does something like this by enabling error handling.
> Thanks a lot!
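
As a rough sketch of how such attributes could be inspected in a test, assuming a TestRunner configured for ValidateCsv with the line-by-line strategy (the attribute name validation.error.message and the relationship constant are assumptions for illustration, not necessarily what was committed):

    runner.enqueue("bad,row,data\n");
    runner.run();

    // Hypothetical attribute name; the actual key is defined by the commit.
    MockFlowFile invalid = runner.getFlowFilesForRelationship(ValidateCsv.REL_INVALID).get(0);
    System.out.println(invalid.getAttribute("validation.error.message"));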



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5278) ExecuteSparkInteractive processor fails on code containing a quote

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5278?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506047#comment-16506047
 ] 

ASF GitHub Bot commented on NIFI-5278:
--

Github user peter-toth commented on the issue:

https://github.com/apache/nifi/pull/2768
  
@joewitt , thanks for the feedback. I've added Apache Commons Text to the 
NOTICE of the nifi-livy-nar and nifi-assembly as you suggested. I checked that 
it does not bring in any new transitive dependencies and also amended the 
existing test.


> ExecuteSparkInteractive processor fails on code containing a quote
> --
>
> Key: NIFI-5278
> URL: https://issues.apache.org/jira/browse/NIFI-5278
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
>Reporter: Peter Toth
>Priority: Major
>
> ExecuteSparkInteractive uses 
> org.apache.commons.lang.StringEscapeUtils.escapeJavaScript(), which results in 
> quotes being escaped as \'. This breaks the JSON payload of the Livy REST API.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2768: NIFI-5278: fixes JSON escaping of code parameter in Execut...

2018-06-08 Thread peter-toth
Github user peter-toth commented on the issue:

https://github.com/apache/nifi/pull/2768
  
@joewitt , thanks for the feedback. I've added Apache Commons Text to the 
NOTICE of the nifi-livy-nar and nifi-assembly as you suggested. I checked that 
it does not bring in any new transitive dependencies and also amended the 
existing test.


---


[jira] [Commented] (NIFI-5264) Add parsing failure message in ValidateCSV

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506044#comment-16506044
 ] 

ASF GitHub Bot commented on NIFI-5264:
--

Github user zenfenan commented on the issue:

https://github.com/apache/nifi/pull/2769
  
Built it and tested it locally. Tested with a few runs and got the expected 
error messages in the attributes. Thanks @pvillard31 


> Add parsing failure message in ValidateCSV 
> ---
>
> Key: NIFI-5264
> URL: https://issues.apache.org/jira/browse/NIFI-5264
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0
>Reporter: bsf
>Assignee: Pierre Villard
>Priority: Major
>
> As a developer I would like to see an improvement to the ValidateCSV 
> component when using the line-by-line validation strategy. It would be nice to 
> have an option to append one or two new fields to the flow file routed to the 
> invalid relationship:
>  * field_name : the name of the field that failed the schema validation
>  * field_description : the description of the validation error
> This will greatly help the user understand the validation issue on each line.
> If that is too complex, anything that provides information on the failed 
> validation of the line against the schema will be more than welcome :)
> Pentaho DI does something like this by enabling error handling.
> Thanks a lot!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2769: NIFI-5264 - Added attribute for validation error message i...

2018-06-08 Thread zenfenan
Github user zenfenan commented on the issue:

https://github.com/apache/nifi/pull/2769
  
Built it and tested it locally. Tested with a few runs and got the expected 
error messages in the attributes. Thanks @pvillard31 


---


[jira] [Commented] (NIFI-5264) Add parsing failure message in ValidateCSV

2018-06-08 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506042#comment-16506042
 ] 

ASF subversion and git services commented on NIFI-5264:
---

Commit 6e067734d5bed7dbabc4af2dae1f23bb980e3957 in nifi's branch 
refs/heads/master from [~pvillard]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=6e06773 ]

NIFI-5264 - Added attribute for validation error message in ValidateCSV

This closes #2769

Signed-off-by: zenfenan 


> Add parsing failure message in ValidateCSV 
> ---
>
> Key: NIFI-5264
> URL: https://issues.apache.org/jira/browse/NIFI-5264
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0
>Reporter: bsf
>Assignee: Pierre Villard
>Priority: Major
>
> As a developer I would like to see an improvement to the ValidateCSV 
> component when using the line-by-line validation strategy. It would be nice to 
> have an option to append one or two new fields to the flow file routed to the 
> invalid relationship:
>  * field_name : the name of the field that failed the schema validation
>  * field_description : the description of the validation error
> This will greatly help the user understand the validation issue on each line.
> If that is too complex, anything that provides information on the failed 
> validation of the line against the schema will be more than welcome :)
> Pentaho DI does something like this by enabling error handling.
> Thanks a lot!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5264) Add parsing failure message in ValidateCSV

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506043#comment-16506043
 ] 

ASF GitHub Bot commented on NIFI-5264:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2769


> Add parsing failure message in ValidateCSV 
> ---
>
> Key: NIFI-5264
> URL: https://issues.apache.org/jira/browse/NIFI-5264
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0
>Reporter: bsf
>Assignee: Pierre Villard
>Priority: Major
>
> As a developer I would like to see an improvement to the ValidateCSV 
> component when using the line-by-line validation strategy. It would be nice to 
> have an option to append one or two new fields to the flow file routed to the 
> invalid relationship:
>  * field_name : the name of the field that failed the schema validation
>  * field_description : the description of the validation error
> This will greatly help the user understand the validation issue on each line.
> If that is too complex, anything that provides information on the failed 
> validation of the line against the schema will be more than welcome :)
> Pentaho DI does something like this by enabling error handling.
> Thanks a lot!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2769: NIFI-5264 - Added attribute for validation error me...

2018-06-08 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2769


---


[jira] [Commented] (NIFI-5237) Wrong redirect from login behind a context path, when using OpenID authentication

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506038#comment-16506038
 ] 

ASF GitHub Bot commented on NIFI-5237:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2763
  
@maciejbozemoj Yup. I've verified the existing code did not work and the 
proposed changes do work with my setup. That's why I was surprised with the 
behavior you reported originally. Thanks again for assisting with the review!


> Wrong redirect from login behind a context path, when using OpenID 
> authentication
> -
>
> Key: NIFI-5237
> URL: https://issues.apache.org/jira/browse/NIFI-5237
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
> Environment: NiFi behind a reverse proxy (HAProxy)
>Reporter: Damian Czaja
>Assignee: Matt Gilman
>Priority: Major
>
> When I deploy NiFi behind a custom context path, e.g. 
> ([https://my-nifi/my/context/path/)|https://nifi/my/context/path/)], and use 
> OpenID authentication, after the login I'm redirected to 
> [https://my-nifi/nifi/|https://my-nifi/nifi] instead of 
> [https://my-nifi/my/context/path/nifi/] .
> My presumption is that the relative redirect in 
> httpServletResponse.sendRedirect isn't respecting the contextPath provided in 
> the X-ProxyContextPath header:
> [https://github.com/apache/nifi/blob/rel/nifi-1.6.0/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/api/AccessResource.java#L269]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2763: NIFI-5237: Considering proxy headers following OIDC login

2018-06-08 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2763
  
@maciejbozemoj Yup. I've verified the existing code did not work and the 
proposed changes do work with my setup. That's why I was surprised with the 
behavior you reported originally. Thanks again for assisting with the review!


---


[jira] [Commented] (NIFI-3217) Resizing browser closes property entry textbox

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-3217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505987#comment-16505987
 ] 

ASF GitHub Bot commented on NIFI-3217:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2766
  
thanks @scottyaslan for the review!


> Resizing browser closes property entry textbox
> --
>
> Key: NIFI-3217
> URL: https://issues.apache.org/jira/browse/NIFI-3217
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core UI
>Affects Versions: 1.1.0
>Reporter: Joseph Percivall
>Assignee: Matt Gilman
>Priority: Minor
> Fix For: 1.7.0
>
>
> Steps to reproduce
> 1: Have processor on graph
> 2: Open processor config window
> 3: Click to edit the value of a property (opens the entry textbox)
> 4: With the textbox open, resize the Browser Window
> 5: See the textbox automatically close
> Preferably, this should instead stay open.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-3217) Resizing browser closes property entry textbox

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-3217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505988#comment-16505988
 ] 

ASF GitHub Bot commented on NIFI-3217:
--

Github user mcgilman closed the pull request at:

https://github.com/apache/nifi/pull/2766


> Resizing browser closes property entry textbox
> --
>
> Key: NIFI-3217
> URL: https://issues.apache.org/jira/browse/NIFI-3217
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core UI
>Affects Versions: 1.1.0
>Reporter: Joseph Percivall
>Assignee: Matt Gilman
>Priority: Minor
> Fix For: 1.7.0
>
>
> Steps to reproduce
> 1: Have processor on graph
> 2: Open processor config window
> 3: Click to edit the value of a property (opens the entry textbox)
> 4: With the textbox open, resize the Browser Window
> 5: See the textbox automatically close
> Preferably, this should instead stay open.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2766: NIFI-3217: Preventing editor close on window resize

2018-06-08 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2766
  
thanks @scottyaslan for the review!


---


[GitHub] nifi pull request #2766: NIFI-3217: Preventing editor close on window resize

2018-06-08 Thread mcgilman
Github user mcgilman closed the pull request at:

https://github.com/apache/nifi/pull/2766


---


[jira] [Commented] (NIFI-5264) Add parsing failure message in ValidateCSV

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505976#comment-16505976
 ] 

ASF GitHub Bot commented on NIFI-5264:
--

Github user zenfenan commented on the issue:

https://github.com/apache/nifi/pull/2769
  
Reviewing..


> Add parsing failure message in ValidateCSV 
> ---
>
> Key: NIFI-5264
> URL: https://issues.apache.org/jira/browse/NIFI-5264
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.6.0
>Reporter: bsf
>Assignee: Pierre Villard
>Priority: Major
>
> As a developer I would like to see an improvement on the ValidateCSV 
> component when using the line by line validation strategy. It will be nice to 
> have an option to append into the flowfile on the invalid relationship 1 or 2 
> new fields:
>  * field_name : the name of the field failed in the schema validation
>  * field_description : the description of the validation error
> This will help a lot the user to understand the validation issue on each line.
> If too complex, anything that provides information on the fail validation of 
> the line against the schema will be more than welcome :)
> Pentaho DI do something like this by enabling error handling.
> Thanks a lot!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2769: NIFI-5264 - Added attribute for validation error message i...

2018-06-08 Thread zenfenan
Github user zenfenan commented on the issue:

https://github.com/apache/nifi/pull/2769
  
Reviewing..


---


[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505973#comment-16505973
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194042057
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-web-utils/pom.xml ---
@@ -0,0 +1,39 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <parent>
+        <artifactId>nifi-standard-bundle</artifactId>
+        <groupId>org.apache.nifi</groupId>
+        <version>1.7.0-SNAPSHOT</version>
+    </parent>
+    <modelVersion>4.0.0</modelVersion>
+    <artifactId>nifi-standard-web-utils</artifactId>
--- End diff --

Yeah. I can see this functionality being reused in plenty of places and 
don't think we should tie the transitive dependencies of the standard package's 
test scope into other packages as that could complicate things down the road. 
I'll change the module name to `nifi-standard-web-test-utils`. How about that?


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: https://issues.apache.org/jira/browse/NIFI-5214
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> * Should have reader API support
>  * Should be able to drill down through complex XML and JSON responses to a 
> nested record.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194042057
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-web-utils/pom.xml ---
@@ -0,0 +1,39 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <parent>
+        <artifactId>nifi-standard-bundle</artifactId>
+        <groupId>org.apache.nifi</groupId>
+        <version>1.7.0-SNAPSHOT</version>
+    </parent>
+    <modelVersion>4.0.0</modelVersion>
+    <artifactId>nifi-standard-web-utils</artifactId>
--- End diff --

Yeah. I can see this functionality being reused in plenty of places and 
don't think we should tie the transitive dependencies of the standard package's 
test scope into other packages as that could complicate things down the road. 
I'll change the module name to `nifi-standard-web-test-utils`. How about that?


---


[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505972#comment-16505972
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194041176
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/resources/docs/org.apache.nifi.lookup.RestLookupService/additionalDetails.html
 ---
@@ -0,0 +1,54 @@
+
+
+
+
+
+RestLookupService
+
+
+
+General
+This lookup service has the following required keys:
+
+mime.type
+request.method; valid values:
+
+delete
+get
+post
+put
+
+
+
+In addition to the required keys, a key "body" can be added which 
contains a string representing JSON, XML, etc. to be sent with any
+of those methods except for "get."
+The record reader is used to consume the response of the REST 
service call and turn it into one or more records. The record path property
+is provided to allow for a lookup path to either a nested record or a 
single point deep in the REST response. Note: a valid schema must be
+built that encapsulates the REST response accurately in order for this 
service to work.
+Headers
+Headers are supported using dynamic properties. Just add a dynamic 
property and the name will be the header name and the value will be the value 
for the header. Expression language
+powered by input from the variable registry is supported.
+Dynamic URLs
+The URL property supports expression language in a non-standard 
way: through the lookup key/value pairs configured on the processor. The 
configuration specified by the user will be passed
--- End diff --

Done.
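
To make the coordinate-driven configuration described above concrete, a sketch of how a caller might use the service (the property values and the user.id key are illustrative; the method assumes an enabled RestLookupService instance):

    // Assumes the service's URL property is configured as, for example,
    // http://example.com/service/${user.id} (illustrative).
    Optional<Record> lookupExample(RestLookupService restLookupService) throws LookupFailureException {
        Map<String, Object> coordinates = new HashMap<>();
        coordinates.put("mime.type", "application/json"); // required key
        coordinates.put("request.method", "get");         // required key
        coordinates.put("user.id", "1234");               // key consumed by the URL's expression language

        // Headers come from dynamic properties on the service, not from the coordinates.
        return restLookupService.lookup(coordinates);
    }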


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: https://issues.apache.org/jira/browse/NIFI-5214
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mike Thomsen
>Assignee: Mike Thomsen
>Priority: Major
>
> * Should have reader API support
>  * Should be able to drill down through complex XML and JSON responses to a 
> nested record.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194041176
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/resources/docs/org.apache.nifi.lookup.RestLookupService/additionalDetails.html
 ---
@@ -0,0 +1,54 @@
+
+
+
+
+
+RestLookupService
+
+
+
+General
+This lookup service has the following required keys:
+
+mime.type
+request.method; valid values:
+
+delete
+get
+post
+put
+
+
+
+In addition to the required keys, a key "body" can be added which 
contains a string representing JSON, XML, etc. to be sent with any
+of those methods except for "get."
+The record reader is used to consume the response of the REST 
service call and turn it into one or more records. The record path property
+is provided to allow for a lookup path to either a nested record or a 
single point deep in the REST response. Note: a valid schema must be
+built that encapsulates the REST response accurately in order for this 
service to work.
+Headers
+Headers are supported using dynamic properties. Just add a dynamic 
property and the name will be the header name and the value will be the value 
for the header. Expression language
+powered by input from the variable registry is supported.
+Dynamic URLs
+The URL property supports expression language in a non-standard 
way: through the lookup key/value pairs configured on the processor. The 
configuration specified by the user will be passed
--- End diff --

Done.


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194041102
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.lookup;
+
+import com.burgstaller.okhttp.AuthenticationCacheInterceptor;
+import com.burgstaller.okhttp.CachingAuthenticatorDecorator;
+import com.burgstaller.okhttp.digest.CachingAuthenticator;
+import com.burgstaller.okhttp.digest.DigestAuthenticator;
+import okhttp3.Credentials;
+import okhttp3.MediaType;
+import okhttp3.OkHttpClient;
+import okhttp3.Request;
+import okhttp3.RequestBody;
+import okhttp3.Response;
+import org.apache.nifi.annotation.behavior.DynamicProperties;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.attribute.expression.language.PreparedQuery;
+import org.apache.nifi.attribute.expression.language.Query;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.proxy.ProxyConfiguration;
+import org.apache.nifi.proxy.ProxyConfigurationService;
+import org.apache.nifi.proxy.ProxySpec;
+import org.apache.nifi.record.path.FieldValue;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import javax.net.ssl.SSLContext;
+import java.io.IOException;
+import java.io.InputStream;
+import java.net.Proxy;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.regex.Pattern;
+import java.util.stream.Collectors;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+
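A note for anyone skimming the quoted diff: the URL property is documented as evaluating Expression Language against the lookup coordinates (the key/value pairs handed to the lookup call), not against flowfile attributes or the variable registry. A minimal, self-contained Java sketch of that substitution idea follows; the ${key} template syntax and the resolve() helper are illustrative assumptions, not the service's actual Expression Language engine.

import java.util.HashMap;
import java.util.Map;

public class CoordinateUrlSketch {

    // Hypothetical helper: substitute ${key} placeholders in a URL template
    // with values taken from the lookup coordinates map.
    static String resolve(String template, Map<String, Object> coordinates) {
        String resolved = template;
        for (Map.Entry<String, Object> entry : coordinates.entrySet()) {
            resolved = resolved.replace("${" + entry.getKey() + "}", String.valueOf(entry.getValue()));
        }
        return resolved;
    }

    public static void main(String[] args) {
        Map<String, Object> coordinates = new HashMap<>();
        coordinates.put("id", 42);
        // Prints: http://example.com/api/users/42
        System.out.println(resolve("http://example.com/api/users/${id}", coordinates));
    }
}

In the real service the evaluation is handled by NiFi's Expression Language, so available functions and quoting rules differ from this plain string replacement.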

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505971#comment-16505971
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194041102
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505963#comment-16505963
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194039553
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194039553
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+
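Since the quoted @DynamicProperty description states that every dynamic property is applied to the request as an HTTP header, the effect can be pictured with a short OkHttp sketch. The header names, URL, and the get() helper below are illustrative assumptions and not the code under review.

import java.io.IOException;
import java.util.LinkedHashMap;
import java.util.Map;

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class HeaderMapSketch {

    // Build a GET request and copy each name/value pair onto it as a header.
    static Response get(OkHttpClient client, String url, Map<String, String> headers) throws IOException {
        Request.Builder builder = new Request.Builder().url(url).get();
        for (Map.Entry<String, String> header : headers.entrySet()) {
            builder.addHeader(header.getKey(), header.getValue());
        }
        return client.newCall(builder.build()).execute();
    }

    public static void main(String[] args) throws IOException {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Accept", "application/json");
        headers.put("X-Api-Key", "example-key");
        try (Response response = get(new OkHttpClient(), "http://example.com/api/users/42", headers)) {
            System.out.println(response.code());
        }
    }
}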

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505961#comment-16505961
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194039429
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
--- End diff --

Done.


> Add a REST lookup service
> -
>
> Key: NIFI-5214
> URL: 

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194039429
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
--- End diff --

Done.


---


[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194039043
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505959#comment-16505959
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194039043
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505953#comment-16505953
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194038855
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194038826
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194038855
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505952#comment-16505952
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194038840
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[jira] [Commented] (NIFI-5214) Add a REST lookup service

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505951#comment-16505951
 ] 

ASF GitHub Bot commented on NIFI-5214:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194038826
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+

[GitHub] nifi pull request #2723: NIFI-5214 Added REST LookupService

2018-06-08 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2723#discussion_r194038840
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-lookup-services-bundle/nifi-lookup-services/src/main/java/org/apache/nifi/lookup/RestLookupService.java
 ---
@@ -0,0 +1,435 @@
+@Tags({ "rest", "lookup", "json", "xml", "http" })
+@CapabilityDescription("Use a REST service to enrich records.")
+@DynamicProperties({
+@DynamicProperty(name = "*", value = "*", description = "All dynamic properties are added as HTTP headers with the name " +
+"as the header name and the value as the header value.")
+})
+public class RestLookupService extends AbstractControllerService implements LookupService {
+static final PropertyDescriptor URL = new PropertyDescriptor.Builder()
+.name("rest-lookup-url")
+.displayName("URL")
+.description("The URL for the REST endpoint. Expression language 
is evaluated against the lookup key/value pairs, " +
+"not flowfile attributes or variable registry.")
+
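For reference, the quoted descriptors above capture two behaviors: the URL is built by evaluating expression language against the lookup key/value pairs (the coordinates passed to the lookup, not flowfile attributes), and every dynamic property becomes an HTTP header on the outgoing request. A minimal, self-contained sketch of those two ideas using plain Java and OkHttp follows; the coordinates map, the header values, and the naive ${key} substitution are illustrative assumptions, not the PR's actual code.

    import okhttp3.OkHttpClient;
    import okhttp3.Request;
    import okhttp3.Response;

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    public class RestLookupSketch {

        // Naive stand-in for expression language evaluation: replaces ${key}
        // tokens with values taken from the lookup coordinates map.
        static String resolveUrl(String template, Map<String, Object> coordinates) {
            String resolved = template;
            for (Map.Entry<String, Object> entry : coordinates.entrySet()) {
                resolved = resolved.replace("${" + entry.getKey() + "}", String.valueOf(entry.getValue()));
            }
            return resolved;
        }

        public static void main(String[] args) throws IOException {
            // Lookup coordinates supplied by the caller of lookup(), not flowfile attributes.
            Map<String, Object> coordinates = new HashMap<>();
            coordinates.put("id", "12345");

            // Dynamic properties would become HTTP headers, per the @DynamicProperty description.
            Map<String, String> dynamicHeaders = new HashMap<>();
            dynamicHeaders.put("X-Api-Key", "example-key");

            String url = resolveUrl("http://localhost:8080/api/users/${id}", coordinates);

            Request.Builder builder = new Request.Builder().url(url).get();
            dynamicHeaders.forEach(builder::addHeader);

            OkHttpClient client = new OkHttpClient();
            try (Response response = client.newCall(builder.build()).execute()) {
                System.out.println("Status: " + response.code());
            }
        }
    }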

[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505939#comment-16505939
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194034751
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3Object.java
 ---
@@ -205,11 +210,21 @@
 .defaultValue(NO_SERVER_SIDE_ENCRYPTION)
 .build();
 
+public static final PropertyDescriptor OBJECT_TAGS = new PropertyDescriptor.Builder()
--- End diff --

Yep, wouldn't it be more flexible?
The current approach assumes that you always have the same set of tags (even if 
the values can change based on flowfile attributes, the keys would always be 
the same). With a regular expression, you could easily handle the case where 
one flowfile has the attributes tagS3_country=FR and tagS3_security=topsecret 
while another has only tagS3_country=US. You would set the tag regular 
expression to tagS3.* and the tags would be created from the attributes 
matching the regex. Does that make sense, or am I missing something?
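A short sketch of the regex-driven idea described above, in plain Java (the attribute names and the tagS3.* pattern come from the example in the comment; everything else is illustrative): attributes whose names match the configured pattern are turned into tag key/value pairs, so different flowfiles can carry different tag sets.

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.regex.Pattern;

    public class TagRegexSketch {

        // Collect every flowfile attribute whose name matches the configured pattern
        // and treat it as an S3 object tag (key = attribute name, value = attribute value).
        static Map<String, String> tagsFromAttributes(Map<String, String> attributes, Pattern tagPattern) {
            Map<String, String> tags = new LinkedHashMap<>();
            for (Map.Entry<String, String> attribute : attributes.entrySet()) {
                if (tagPattern.matcher(attribute.getKey()).matches()) {
                    tags.put(attribute.getKey(), attribute.getValue());
                }
            }
            return tags;
        }

        public static void main(String[] args) {
            Map<String, String> flowFileAttributes = new LinkedHashMap<>();
            flowFileAttributes.put("tagS3_country", "FR");
            flowFileAttributes.put("tagS3_security", "topsecret");
            flowFileAttributes.put("filename", "data.csv");

            Map<String, String> tags = tagsFromAttributes(flowFileAttributes, Pattern.compile("tagS3.*"));
            System.out.println(tags); // {tagS3_country=FR, tagS3_security=topsecret}
        }
    }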


> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced a new set of functionalities that enable S3 buckets and 
> objects to be tagged. This can be useful for data classification purposes, and 
> with new regulatory processes related to data, such as GDPR, being introduced, 
> object tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505937#comment-16505937
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194033825
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/ListS3.java
 ---
@@ -307,6 +328,20 @@ private boolean commit(final ProcessContext context, final ProcessSession sessio
 return willCommit;
 }
 
+private Map writeObjectTags(AmazonS3 client, S3VersionSummary versionSummary) {
+final GetObjectTaggingResult taggingResult = client.getObjectTagging(new GetObjectTaggingRequest(versionSummary.getBucketName(), versionSummary.getKey()));
--- End diff --

I'm just imagining the case where a user is listing a lot of objects in S3, 
which could cause a large number of additional calls. I agree that the overhead 
should be fairly limited; I'm just wondering whether we want this new behavior 
to be enabled by default. I honestly don't have a strong opinion on that, I just 
wanted to mention it. Do you have an opinion on that @jvwing ?
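One way to keep the behavior opt-in would be a boolean property that defaults to false, sketched below with the standard NiFi PropertyDescriptor builder. The property name, display name, and wording are made-up examples for illustration, not what the PR actually adds.

    import org.apache.nifi.components.PropertyDescriptor;
    import org.apache.nifi.processor.util.StandardValidators;

    public class WriteTagsPropertySketch {

        // Hypothetical opt-in switch: tag fetching stays disabled unless the user enables it.
        public static final PropertyDescriptor WRITE_OBJECT_TAGS = new PropertyDescriptor.Builder()
                .name("write-s3-object-tags")
                .displayName("Write Object Tags")
                .description("If set to 'true', the tags associated with the S3 object are written as flowfile attributes. "
                        + "Note that this requires an additional GetObjectTagging call per listed object.")
                .required(true)
                .allowableValues("true", "false")
                .defaultValue("false")
                .addValidator(StandardValidators.BOOLEAN_VALIDATOR)
                .build();
    }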


> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced a new set of functionalities that enable S3 buckets and 
> objects to be tagged. This can be useful for data classification purposes, and 
> with new regulatory processes related to data, such as GDPR, being introduced, 
> object tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505883#comment-16505883
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194009838
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3Object.java
 ---
@@ -205,11 +210,21 @@
 .defaultValue(NO_SERVER_SIDE_ENCRYPTION)
 .build();
 
+public static final PropertyDescriptor OBJECT_TAGS = new PropertyDescriptor.Builder()
--- End diff --

I don't fully understand this one. Are you suggesting that we replace the JSON 
approach with a different one?


> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced a new set of functionalities that enable S3 buckets and 
> objects to be tagged. This can be useful for data classification purposes, and 
> with new regulatory processes related to data, such as GDPR, being introduced, 
> object tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5221) Add Object Tagging support for AWS S3 Processors

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505882#comment-16505882
 ] 

ASF GitHub Bot commented on NIFI-5221:
--

Github user zenfenan commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2751#discussion_r194009531
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/ListS3.java
 ---
@@ -307,6 +328,20 @@ private boolean commit(final ProcessContext context, final ProcessSession sessio
 return willCommit;
 }
 
+private Map writeObjectTags(AmazonS3 client, S3VersionSummary versionSummary) {
+final GetObjectTaggingResult taggingResult = client.getObjectTagging(new GetObjectTaggingRequest(versionSummary.getBucketName(), versionSummary.getKey()));
--- End diff --

Yep. It'll make an additional API call, but I don't think it will have any 
negative impact on existing flows. Do you see any cases where this might break? 
It simply calls the S3 service to check whether the key has any tags associated 
with it; if it does, they are added to the flowfile attributes.
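For context, a minimal sketch of what that extra call looks like with the AWS SDK for Java v1, flattening the object's tag set into a String map that could be put on the flowfile. The "s3.tag." attribute prefix and the bucket/key values are illustrative assumptions, not necessarily what the PR does.

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.GetObjectTaggingRequest;
    import com.amazonaws.services.s3.model.GetObjectTaggingResult;
    import com.amazonaws.services.s3.model.Tag;

    import java.util.HashMap;
    import java.util.Map;

    public class ObjectTagAttributesSketch {

        // One extra GetObjectTagging call per object; tags become candidate flowfile attributes.
        static Map<String, String> tagAttributes(AmazonS3 client, String bucket, String key) {
            GetObjectTaggingResult result = client.getObjectTagging(new GetObjectTaggingRequest(bucket, key));
            Map<String, String> attributes = new HashMap<>();
            for (Tag tag : result.getTagSet()) {
                attributes.put("s3.tag." + tag.getKey(), tag.getValue());
            }
            return attributes;
        }

        public static void main(String[] args) {
            AmazonS3 client = AmazonS3ClientBuilder.defaultClient();
            System.out.println(tagAttributes(client, "my-bucket", "path/to/object"));
        }
    }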


> Add Object Tagging support for AWS S3 Processors
> 
>
> Key: NIFI-5221
> URL: https://issues.apache.org/jira/browse/NIFI-5221
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Sivaprasanna Sethuraman
>Assignee: Sivaprasanna Sethuraman
>Priority: Major
>
> AWS has introduced a new set of functionalities that enable S3 buckets and 
> objects to be tagged. This can be useful for data classification purposes, and 
> with new regulatory processes related to data, such as GDPR, being introduced, 
> object tagging can be quite useful and helpful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5237) Wrong redirect from login behind a context path, when using OpenID authentication

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505852#comment-16505852
 ] 

ASF GitHub Bot commented on NIFI-5237:
--

Github user maciejbozemoj commented on the issue:

https://github.com/apache/nifi/pull/2763
  
@mcgilman, I put together a docker-compose setup and today I couldn't reproduce 
it; maybe it was a failure in my environment. I would like to know if you 
managed to test it locally with and without a proxy.


> Wrong redirect from login behind a context path, when using OpenID 
> authentication
> -
>
> Key: NIFI-5237
> URL: https://issues.apache.org/jira/browse/NIFI-5237
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.6.0
> Environment: NiFi behind a reverse proxy (HAProxy)
>Reporter: Damian Czaja
>Assignee: Matt Gilman
>Priority: Major
>
> When I deploy NiFi behind a custom context path, e.g. 
> [https://my-nifi/my/context/path/|https://nifi/my/context/path/], and use 
> OpenID authentication, after the login I'm redirected to 
> [https://my-nifi/nifi/|https://my-nifi/nifi] instead of 
> [https://my-nifi/my/context/path/nifi/].
> My presumption is that the relative redirect in 
> httpServletResponse.sendRedirect isn't respecting the contextPath provided in 
> the X-ProxyContextPath header:
> [https://github.com/apache/nifi/blob/rel/nifi-1.6.0/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/api/AccessResource.java#L269]
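The mechanics being suggested can be sketched with the plain Servlet API (an illustration only, not NiFi's actual AccessResource code; the fallback handling shown is an assumption): prefix the redirect target with the context path advertised by the proxy in X-ProxyContextPath, so the browser lands back under /my/context/path/nifi instead of /nifi.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import java.io.IOException;

    public class ProxyAwareRedirectSketch {

        // Prefix the redirect target with the proxy-supplied context path, if any.
        static void redirectToUi(HttpServletRequest request, HttpServletResponse response) throws IOException {
            String proxyContextPath = request.getHeader("X-ProxyContextPath");
            String prefix = (proxyContextPath == null || proxyContextPath.isEmpty()) ? "" : proxyContextPath;
            response.sendRedirect(prefix + "/nifi");
        }
    }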



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (NIFI-5281) JSON RecordSetWriter throws NullPointerException if value is not valid according to the schema's CHOICE type

2018-06-08 Thread Pierre Villard (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5281?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pierre Villard updated NIFI-5281:
-
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> JSON RecordSetWriter throws NullPointerException if value is not valid 
> according to the schema's CHOICE type
> 
>
> Key: NIFI-5281
> URL: https://issues.apache.org/jira/browse/NIFI-5281
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.7.0
>
>
> If the JSON Record Set Writer is used and a field is defined as a CHOICE 
> field in the schema, but the Record has a value that is not in the given 
> CHOICE, then the RecordSetWriter throws NullPointerException.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5281) JSON RecordSetWriter throws NullPointerException if value is not valid according to the schema's CHOICE type

2018-06-08 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5281?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505819#comment-16505819
 ] 

ASF GitHub Bot commented on NIFI-5281:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2772


> JSON RecordSetWriter throws NullPointerException if value is not valid 
> according to the schema's CHOICE type
> 
>
> Key: NIFI-5281
> URL: https://issues.apache.org/jira/browse/NIFI-5281
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.7.0
>
>
> If the JSON Record Set Writer is used and a field is defined as a CHOICE 
> field in the schema, but the Record has a value that is not in the given 
> CHOICE, then the RecordSetWriter throws NullPointerException.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5281) JSON RecordSetWriter throws NullPointerException if value is not valid according to the schema's CHOICE type

2018-06-08 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5281?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16505818#comment-16505818
 ] 

ASF subversion and git services commented on NIFI-5281:
---

Commit 49228aa5dcbae5c4216310117daba586a63e6778 in nifi's branch 
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=49228aa ]

NIFI-5281: If value is not valid according to the schema's CHOICE  field, JSON 
Writer should write null value instead of throwing NullPointerException

Signed-off-by: Pierre Villard 

This closes #2772.
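The defensive pattern behind that change, in a generic, self-contained form (an illustration of the idea only, not the actual writer code; the predicate-based "types" are stand-ins): when none of the CHOICE's candidate types accepts the value, fall back to writing a JSON null instead of dereferencing a missing type.

    import java.util.Arrays;
    import java.util.List;
    import java.util.function.Predicate;

    public class ChoiceFallbackSketch {

        // Pick the first candidate type that accepts the value; null means "no match".
        static <T> Predicate<T> chooseType(T value, List<Predicate<T>> candidateTypes) {
            for (Predicate<T> candidate : candidateTypes) {
                if (candidate.test(value)) {
                    return candidate;
                }
            }
            return null;
        }

        static String writeJsonValue(Object value, List<Predicate<Object>> choiceTypes) {
            Predicate<Object> chosen = chooseType(value, choiceTypes);
            if (chosen == null) {
                // Previously this case dereferenced the missing type and threw NullPointerException;
                // the fix writes a JSON null instead.
                return "null";
            }
            return String.valueOf(value);
        }

        public static void main(String[] args) {
            // A CHOICE of INT or LONG, expressed here as simple predicates for illustration.
            List<Predicate<Object>> intOrLong = Arrays.asList(
                    v -> v instanceof Integer,
                    v -> v instanceof Long);

            System.out.println(writeJsonValue(42, intOrLong));             // 42
            System.out.println(writeJsonValue("not-a-number", intOrLong)); // null
        }
    }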


> JSON RecordSetWriter throws NullPointerException if value is not valid 
> according to the schema's CHOICE type
> 
>
> Key: NIFI-5281
> URL: https://issues.apache.org/jira/browse/NIFI-5281
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.7.0
>
>
> If the JSON Record Set Writer is used and a field is defined as a CHOICE 
> field in the schema, but the Record has a value that is not in the given 
> CHOICE, then the RecordSetWriter throws NullPointerException.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


