[GitHub] [nifi] sjyang18 opened a new pull request #3737: NIFI-6583 metrics reporting to azure log ws

2019-09-13 Thread GitBox
sjyang18 opened a new pull request #3737: NIFI-6583 metrics reporting to azure log ws
URL: https://github.com/apache/nifi/pull/3737
 
 
   Thank you for submitting a contribution to Apache NiFi.
   
   
    #### Description of PR
    
    NIFI-6583: Added a service that sends the same NiFi metrics data used for Graphite reporting to an Azure Log Analytics workspace. In addition, this common service provides the connection info so that new NiFi standard metrics reporting tasks can reuse the same connection metadata.
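    For readers unfamiliar with that endpoint, here is a minimal, hedged sketch of what posting a metrics JSON payload to an Azure Log Analytics workspace via the HTTP Data Collector API involves; the workspace ID, shared key, table name, and payload are placeholders, not values from this PR:
    
    ```java
    // Hedged sketch of the Azure Log Analytics HTTP Data Collector API call.
    // All identifiers below are placeholders; this is not the PR's code.
    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.time.ZoneOffset;
    import java.time.ZonedDateTime;
    import java.time.format.DateTimeFormatter;
    import java.util.Base64;
    
    public class AzureLogWsSketch {
        public static void main(String[] args) throws Exception {
            final String workspaceId = "YOUR-WORKSPACE-ID"; // placeholder
            final String sharedKey = "YOUR-BASE64-KEY";     // placeholder
            final String logType = "nifiMetrics";           // placeholder custom log table
            final byte[] body = "[{\"metric\":\"flowFilesReceived\",\"value\":42}]"
                    .getBytes(StandardCharsets.UTF_8);
    
            // The API signs: VERB \n content-length \n content-type \n x-ms-date:<date> \n resource
            final String date = DateTimeFormatter.RFC_1123_DATE_TIME.format(ZonedDateTime.now(ZoneOffset.UTC));
            final String stringToSign = "POST\n" + body.length + "\napplication/json\nx-ms-date:" + date + "\n/api/logs";
            final Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(Base64.getDecoder().decode(sharedKey), "HmacSHA256"));
            final String signature = Base64.getEncoder().encodeToString(
                    mac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8)));
    
            final URL url = new URL("https://" + workspaceId + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01");
            final HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setRequestProperty("Log-Type", logType); // destination table name
            conn.setRequestProperty("x-ms-date", date);
            conn.setRequestProperty("Authorization", "SharedKey " + workspaceId + ":" + signature);
            conn.getOutputStream().write(body);
            System.out.println("Response: " + conn.getResponseCode()); // 200 on success
        }
    }
    ```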
   
   ### For all changes:
   - [X] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message?
   
   - [X] Does your PR title start with **NIFI-XXXX** where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.
   
   - [X] Has your PR been rebased against the latest commit within the target branch (typically `master`)?
   
   - [X] Is your initial contribution a single, squashed commit? _Additional commits in response to PR reviewer feedback should be made on this branch and pushed to allow change tracking. Do not `squash` or use `--force` when pushing to allow for clean monitoring of changes._
   
   ### For code changes:
   - [ ] Have you ensured that the full suite of tests is executed via `mvn -Pcontrib-check clean install` at the root `nifi` folder?
   - [ ] Have you written or updated unit tests to verify your changes?
   - [ ] Have you verified that the full build is successful on both JDK 8 and JDK 11?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE` file, including the main `LICENSE` file under `nifi-assembly`?
   - [X] If applicable, have you updated the `NOTICE` file, including the main `NOTICE` file found under `nifi-assembly`?
   - [ ] If adding new Properties, have you added `.displayName` in addition to `.name` (programmatic access) for each of the new properties?
   
   ### For documentation related changes:
   - [X] Have you ensured that format looks appropriate for the output in which it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] sjyang18 closed pull request #3674: Nifi 6583

2019-09-13 Thread GitBox
sjyang18 closed pull request #3674: Nifi 6583
URL: https://github.com/apache/nifi/pull/3674
 
 
   




[GitHub] [nifi] SamHjelmfelt commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
SamHjelmfelt commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324381204
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/test/java/org/apache/nifi/controller/kudu/TestKuduLookupService.java
 ##
 @@ -0,0 +1,222 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.CreateTableOptions;
+import org.apache.kudu.client.Insert;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduSession;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.PartialRow;
+import org.apache.kudu.test.KuduTestHarness;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+
+public class TestKuduLookupService {
+
+    // The KuduTestHarness automatically starts and stops a real Kudu cluster
+    // when each test is run. Kudu persists its on-disk state in a temporary
+    // directory under a location defined by the environment variable TEST_TMPDIR
+    // if set, or under /tmp otherwise. That cluster data is deleted on
+    // successful exit of the test. The cluster output is logged through slf4j.
+    @Rule
+    public KuduTestHarness harness = new KuduTestHarness();
+    private TestRunner testRunner;
+    private long nowMillis = System.currentTimeMillis();
+    private KuduLookupService kuduLookupService;
+
+    public static class SampleProcessor extends AbstractProcessor {
+        @Override
+        public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
+        }
+    }
+
+    @Before
+    public void init() throws Exception {
+        testRunner = TestRunners.newTestRunner(SampleProcessor.class);
+        testRunner.setValidateExpressionUsage(false);
+        final String tableName = "table1";
+
+        KuduClient client = harness.getClient();
+        List<ColumnSchema> columns = new ArrayList<>();
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("string", Type.STRING).key(true).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("binary", Type.BINARY).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("bool", Type.BOOL).build());
+        //columns.add(new ColumnSchema.ColumnSchemaBuilder("decimal", Type.DECIMAL).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("double", Type.DOUBLE).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("float", Type.FLOAT).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("int8", Type.INT8).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("int16", Type.INT16).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("int32", Type.INT32).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("int64", Type.INT64).build());
+        columns.add(new ColumnSchema.ColumnSchemaBuilder("unixtime_micros", Type.UNIXTIME_MICROS).build());
+        Schema schema = new Schema(columns);
+
+        CreateTableOptions opts = new CreateTableOptions().setRangePartitionColumns(Collections.singletonList("string"));
+        client.createTable(tableName, schema, opts);
+
+        KuduTable table = client.openTable(tableName);
+        KuduSession session = client.newSession();
+
+        Insert
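
The quoted hunk is cut off above. For context only, a hedged sketch of how a row insert typically continues from this setup with the Kudu client API; the values are illustrative placeholders, not the PR's actual test data:

```java
// Sketch only: inserting one row into the table created in init() above.
// Column values are placeholders, not the PR's test data.
Insert insert = table.newInsert();
PartialRow row = insert.getRow();
row.addString("string", "key-1");
row.addBinary("binary", new byte[]{1, 2, 3});
row.addBoolean("bool", true);
row.addDouble("double", 1.5d);
row.addFloat("float", 2.5f);
row.addByte("int8", (byte) 1);
row.addShort("int16", (short) 2);
row.addInt("int32", 3);
row.addLong("int64", 4L);
row.addLong("unixtime_micros", nowMillis * 1000L); // UNIXTIME_MICROS takes microseconds
session.apply(insert);
session.close();
```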

[GitHub] [nifi] sjyang18 commented on issue #3674: Nifi 6583

2019-09-13 Thread GitBox
sjyang18 commented on issue #3674: Nifi 6583
URL: https://github.com/apache/nifi/pull/3674#issuecomment-531398821
 
 
   Any update on my pull request? The build fails with the PR generated from the web page. Please let me know what I am missing.




[GitHub] [nifi] natural commented on a change in pull request #3594: NIFI-3833 Support for Encrypted Flow File Repositories

2019-09-13 Thread GitBox
natural commented on a change in pull request #3594: NIFI-3833 Support for Encrypted Flow File Repositories
URL: https://github.com/apache/nifi/pull/3594#discussion_r324368404
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/administration-guide.adoc
 ##
 @@ -2477,6 +2477,13 @@ implementation.
 |`nifi.flowfile.repository.always.sync`|If set to `true`, any change to the repository will be synchronized to the disk, meaning that NiFi will ask the operating system not to cache the information. This is very expensive and can significantly reduce NiFi performance. However, if it is `false`, there could be the potential for data loss if either there is a sudden power loss or the operating system crashes. The default value is `false`.
 |
 
+ Encryption
+
+The FlowFile repository can be configured to encrypt all files as they are written to disk. To enable this encryption,
+set the `nifi.flowfile.repository.always.key.1` property to a 16 or 32 byte value like this:
 
 Review comment:
   I've added explicit key lookup to the nifi props class, and support for retrieving key material where needed.
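
For illustration only, such a configuration might look like the following; the property name comes from the draft docs quoted above (the final implementation may differ), and the hex value is a placeholder 16 byte key, not a real one:

```properties
# Hypothetical nifi.properties excerpt; placeholder key, not a real one.
# Property name taken from the draft documentation quoted above.
nifi.flowfile.repository.always.key.1=0123456789ABCDEFFEDCBA9876543210
```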




[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324276718
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##
 @@ -0,0 +1,392 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.AsyncKuduClient;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduException;
+import org.apache.kudu.client.KuduPredicate;
+import org.apache.kudu.client.KuduScanner;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.RowResult;
+import org.apache.kudu.client.RowResultIterator;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.controller.ControllerServiceInitializationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.kerberos.KerberosCredentialsService;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.lookup.RecordLookupService;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.krb.KerberosAction;
+import org.apache.nifi.security.krb.KerberosKeytabUser;
+import org.apache.nifi.security.krb.KerberosUser;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import javax.security.auth.login.LoginException;
+import java.math.BigDecimal;
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+
+@CapabilityDescription("Lookup a record from Kudu Server associated with the specified key. Binary columns are base64 encoded")
+@Tags({"lookup", "enrich", "key", "value", "kudu"})
+public class KuduLookupService extends AbstractControllerService implements RecordLookupService {
+
+    static final PropertyDescriptor KUDU_MASTERS = new PropertyDescriptor.Builder()
+            .name("Kudu Masters")
+            .description("Comma separated addresses of the Kudu masters to connect to.")
+            .required(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+            .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+            .build();
+
+    static final PropertyDescriptor KERBEROS_CREDENTIALS_SERVICE = new PropertyDescriptor.Builder()
+            .name("kerberos-credentials-service")
+            .displayName("Kerberos Credentials Service")
+            .description("Specifies the Kerberos Credentials to use for authentication")
+            .required(false)
+            .identifiesControllerService(KerberosCredentialsService.class)
+            .build();
+
+    static final PropertyDescriptor KUDU_OPERATION_TIMEOUT_MS = new PropertyDescriptor.Builder()
+            .name("kudu-operations-timeout-ms")
+            .displayName("Kudu Operation Timeout")
+            .description("Default timeout used for user operations (using sessions and scanners)")
+            .required(false)
+
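
The quoted hunk is cut off above. For context, a hedged sketch of how NiFi's standard RecordLookupService contract is invoked by a caller such as LookupRecord; the coordinate key name "key" is an assumption for illustration, not necessarily the name this service uses:

```java
// Sketch of the generic NiFi lookup contract, not this PR's internals.
// lookup() returns Optional.empty() on a miss and throws
// LookupFailureException when the lookup itself fails.
Map<String, Object> coordinates = Collections.singletonMap("key", "row-1"); // key name assumed
try {
    Optional<Record> result = kuduLookupService.lookup(coordinates);
    result.ifPresent(record -> System.out.println(record.toMap()));
} catch (LookupFailureException e) {
    e.printStackTrace();
}
```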

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324277897
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324366570
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324281890
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324365098
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/test/java/org/apache/nifi/controller/kudu/TestKuduLookupService.java
 ##
 @@ -0,0 +1,222 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.CreateTableOptions;
+import org.apache.kudu.client.Insert;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduSession;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.PartialRow;
+import org.apache.kudu.test.KuduTestHarness;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+
+public class TestKuduLookupService {
+
+// The KuduTestHarness automatically starts and stops a real Kudu cluster
+// when each test is run. Kudu persists its on-disk state in a temporary
+// directory under a location defined by the environment variable 
TEST_TMPDIR
+// if set, or under /tmp otherwise. That cluster data is deleted on
+// successful exit of the test. The cluster output is logged through slf4j.
+@Rule
+public KuduTestHarness harness = new KuduTestHarness();
+private TestRunner testRunner;
+private long nowMillis = System.currentTimeMillis();
+private KuduLookupService kuduLookupService;
+
+public static class SampleProcessor extends AbstractProcessor {
+@Override
+public void onTrigger(ProcessContext context, ProcessSession session) 
throws ProcessException {
+}
+}
+
+@Before
+public void init() throws Exception {
+testRunner = TestRunners.newTestRunner(SampleProcessor.class);
+testRunner.setValidateExpressionUsage(false);
+final String tableName = "table1";
+
+KuduClient client =  harness.getClient();
+List columns = new ArrayList<>();
+columns.add(new ColumnSchema.ColumnSchemaBuilder("string", 
Type.STRING).key(true).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("binary", 
Type.BINARY).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("bool", 
Type.BOOL).build());
+//columns.add(new ColumnSchema.ColumnSchemaBuilder("decimal", 
Type.DECIMAL).build());
 
 Review comment:
   Why is this commented out? It should be fixed/tested. 
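
A plausible reason, as an assumption on my part rather than anything stated in the thread: unlike the other column types in this test, Kudu DECIMAL columns require precision and scale to be supplied through type attributes, so the builder call needs an extra step, roughly:

```java
// Hedged sketch: a Kudu DECIMAL column needs precision/scale supplied via
// org.apache.kudu.ColumnTypeAttributes; the other types in this test do not.
// Precision 10 / scale 2 are arbitrary example values.
columns.add(new ColumnSchema.ColumnSchemaBuilder("decimal", Type.DECIMAL)
        .typeAttributes(new ColumnTypeAttributes.ColumnTypeAttributesBuilder()
                .precision(10) // total digits (example)
                .scale(2)      // fractional digits (example)
                .build())
        .build());
```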




[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324282157
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324364288
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324275659
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324363969
 
 

 ##
 File path: nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu 
Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324365782
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/test/java/org/apache/nifi/controller/kudu/TestKuduLookupService.java
 ##
 @@ -0,0 +1,222 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.CreateTableOptions;
+import org.apache.kudu.client.Insert;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduSession;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.PartialRow;
+import org.apache.kudu.test.KuduTestHarness;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertFalse;
+
+public class TestKuduLookupService {
+
+// The KuduTestHarness automatically starts and stops a real Kudu cluster
+// when each test is run. Kudu persists its on-disk state in a temporary
+// directory under a location defined by the environment variable 
TEST_TMPDIR
+// if set, or under /tmp otherwise. That cluster data is deleted on
+// successful exit of the test. The cluster output is logged through slf4j.
+@Rule
+public KuduTestHarness harness = new KuduTestHarness();
+private TestRunner testRunner;
+private long nowMillis = System.currentTimeMillis();
+private KuduLookupService kuduLookupService;
+
+public static class SampleProcessor extends AbstractProcessor {
+@Override
+public void onTrigger(ProcessContext context, ProcessSession session) 
throws ProcessException {
+}
+}
+
+@Before
+public void init() throws Exception {
+testRunner = TestRunners.newTestRunner(SampleProcessor.class);
+testRunner.setValidateExpressionUsage(false);
+final String tableName = "table1";
+
+KuduClient client =  harness.getClient();
+List<ColumnSchema> columns = new ArrayList<>();
+columns.add(new ColumnSchema.ColumnSchemaBuilder("string", 
Type.STRING).key(true).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("binary", 
Type.BINARY).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("bool", 
Type.BOOL).build());
+//columns.add(new ColumnSchema.ColumnSchemaBuilder("decimal", 
Type.DECIMAL).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("double", 
Type.DOUBLE).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("float", 
Type.FLOAT).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("int8", 
Type.INT8).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("int16", 
Type.INT16).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("int32", 
Type.INT32).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("int64", 
Type.INT64).build());
+columns.add(new ColumnSchema.ColumnSchemaBuilder("unixtime_micros", 
Type.UNIXTIME_MICROS).build());
+Schema schema = new Schema(columns);
+
+CreateTableOptions opts = new 
CreateTableOptions().setRangePartitionColumns(Collections.singletonList("string"));
+client.createTable(tableName, schema, opts);
+
+KuduTable table = client.openTable(tableName);
+KuduSession session = client.newSession();
+
+Insert 
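
The quoted test is truncated just as the first row insert begins. A minimal
sketch of how such an insert typically continues with the Kudu client API
(column names follow the schema created above; the values are hypothetical):

    // Hypothetical continuation, not the PR's actual test code.
    Insert insert = table.newInsert();
    PartialRow row = insert.getRow();
    row.addString("string", "key1");            // range-partitioned key column
    row.addBinary("binary", "bytes".getBytes());
    row.addBoolean("bool", true);
    row.addDouble("double", 1.0d);
    row.addFloat("float", 2.0f);
    row.addByte("int8", (byte) 3);
    row.addShort("int16", (short) 4);
    row.addInt("int32", 5);
    row.addLong("int64", 6L);
    row.addTimestamp("unixtime_micros", new Timestamp(nowMillis));
    session.apply(insert);
    session.close();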

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu 
Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324363180
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##
 @@ -0,0 +1,392 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.AsyncKuduClient;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduException;
+import org.apache.kudu.client.KuduPredicate;
+import org.apache.kudu.client.KuduScanner;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.RowResult;
+import org.apache.kudu.client.RowResultIterator;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.controller.ControllerServiceInitializationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.kerberos.KerberosCredentialsService;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.lookup.RecordLookupService;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.krb.KerberosAction;
+import org.apache.nifi.security.krb.KerberosKeytabUser;
+import org.apache.nifi.security.krb.KerberosUser;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import javax.security.auth.login.LoginException;
+import java.math.BigDecimal;
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+
+@CapabilityDescription("Lookup a record from Kudu Server associated with the 
specified key. Binary columns are base64 encoded")
+@Tags({"lookup", "enrich", "key", "value", "kudu"})
+public class KuduLookupService extends AbstractControllerService implements 
RecordLookupService {
+
+static final PropertyDescriptor KUDU_MASTERS = new 
PropertyDescriptor.Builder()
+.name("Kudu Masters")
+.description("Comma separated addresses of the Kudu masters to 
connect to.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.build();
+
+static final PropertyDescriptor KERBEROS_CREDENTIALS_SERVICE = new 
PropertyDescriptor.Builder()
+.name("kerberos-credentials-service")
+.displayName("Kerberos Credentials Service")
+.description("Specifies the Kerberos Credentials to use for 
authentication")
+.required(false)
+.identifiesControllerService(KerberosCredentialsService.class)
+.build();
+
+static final PropertyDescriptor KUDU_OPERATION_TIMEOUT_MS = new 
PropertyDescriptor.Builder()
+.name("kudu-operations-timeout-ms")
+.displayName("Kudu Operation Timeout")
+.description("Default timeout used for user operations (using 
sessions and scanners)")
+.required(false)
+
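
The quoted descriptor is cut off before its validator and default value. A
typical completion for a timeout property in NiFi looks like the following
sketch (the default value shown is hypothetical, not necessarily the PR's):

    static final PropertyDescriptor KUDU_OPERATION_TIMEOUT_MS = new PropertyDescriptor.Builder()
            .name("kudu-operations-timeout-ms")
            .displayName("Kudu Operation Timeout")
            .description("Default timeout used for user operations (using sessions and scanners)")
            .required(false)
            // Hypothetical completion: a time-period default plus its validator.
            .defaultValue("30 seconds")
            .addValidator(StandardValidators.TIME_PERIOD_VALIDATOR)
            .build();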

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu 
Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324282584
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##
 @@ -0,0 +1,392 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.AsyncKuduClient;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduException;
+import org.apache.kudu.client.KuduPredicate;
+import org.apache.kudu.client.KuduScanner;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.RowResult;
+import org.apache.kudu.client.RowResultIterator;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.controller.ControllerServiceInitializationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.kerberos.KerberosCredentialsService;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.lookup.RecordLookupService;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.krb.KerberosAction;
+import org.apache.nifi.security.krb.KerberosKeytabUser;
+import org.apache.nifi.security.krb.KerberosUser;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import javax.security.auth.login.LoginException;
+import java.math.BigDecimal;
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+
+@CapabilityDescription("Lookup a record from Kudu Server associated with the 
specified key. Binary columns are base64 encoded")
+@Tags({"lookup", "enrich", "key", "value", "kudu"})
+public class KuduLookupService extends AbstractControllerService implements 
RecordLookupService {
+
+static final PropertyDescriptor KUDU_MASTERS = new 
PropertyDescriptor.Builder()
+.name("Kudu Masters")
+.description("Comma separated addresses of the Kudu masters to 
connect to.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.build();
+
+static final PropertyDescriptor KERBEROS_CREDENTIALS_SERVICE = new 
PropertyDescriptor.Builder()
+.name("kerberos-credentials-service")
+.displayName("Kerberos Credentials Service")
+.description("Specifies the Kerberos Credentials to use for 
authentication")
+.required(false)
+.identifiesControllerService(KerberosCredentialsService.class)
+.build();
+
+static final PropertyDescriptor KUDU_OPERATION_TIMEOUT_MS = new 
PropertyDescriptor.Builder()
+.name("kudu-operations-timeout-ms")
+.displayName("Kudu Operation Timeout")
+.description("Default timeout used for user operations (using 
sessions and scanners)")
+.required(false)
+

[GitHub] [nifi] granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu Lookup Service

2019-09-13 Thread GitBox
granthenke commented on a change in pull request #3732: NIFI-6662: Adding Kudu 
Lookup Service
URL: https://github.com/apache/nifi/pull/3732#discussion_r324364601
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-kudu-bundle/nifi-kudu-controller-service/src/main/java/org/apache/nifi/controller/kudu/KuduLookupService.java
 ##
 @@ -0,0 +1,392 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.controller.kudu;
+
+import org.apache.kudu.ColumnSchema;
+import org.apache.kudu.Schema;
+import org.apache.kudu.Type;
+import org.apache.kudu.client.AsyncKuduClient;
+import org.apache.kudu.client.KuduClient;
+import org.apache.kudu.client.KuduException;
+import org.apache.kudu.client.KuduPredicate;
+import org.apache.kudu.client.KuduScanner;
+import org.apache.kudu.client.KuduTable;
+import org.apache.kudu.client.RowResult;
+import org.apache.kudu.client.RowResultIterator;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.controller.ControllerServiceInitializationContext;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.kerberos.KerberosCredentialsService;
+import org.apache.nifi.lookup.LookupFailureException;
+import org.apache.nifi.lookup.RecordLookupService;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.security.krb.KerberosAction;
+import org.apache.nifi.security.krb.KerberosKeytabUser;
+import org.apache.nifi.security.krb.KerberosUser;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import javax.security.auth.login.LoginException;
+import java.math.BigDecimal;
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+
+@CapabilityDescription("Lookup a record from Kudu Server associated with the 
specified key. Binary columns are base64 encoded")
+@Tags({"lookup", "enrich", "key", "value", "kudu"})
+public class KuduLookupService extends AbstractControllerService implements 
RecordLookupService {
+
+static final PropertyDescriptor KUDU_MASTERS = new 
PropertyDescriptor.Builder()
+.name("Kudu Masters")
+.description("Comma separated addresses of the Kudu masters to 
connect to.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.build();
+
+static final PropertyDescriptor KERBEROS_CREDENTIALS_SERVICE = new 
PropertyDescriptor.Builder()
+.name("kerberos-credentials-service")
+.displayName("Kerberos Credentials Service")
+.description("Specifies the Kerberos Credentials to use for 
authentication")
+.required(false)
+.identifiesControllerService(KerberosCredentialsService.class)
+.build();
+
+static final PropertyDescriptor KUDU_OPERATION_TIMEOUT_MS = new 
PropertyDescriptor.Builder()
+.name("kudu-operations-timeout-ms")
+.displayName("Kudu Operation Timeout")
+.description("Default timeout used for user operations (using 
sessions and scanners)")
+.required(false)
+
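
Because the service implements RecordLookupService, callers use the generic
lookup contract. A sketch of a lookup call from a processor (the coordinate
key name "key" is hypothetical; it depends on how the service maps its key
column):

    Map<String, Object> coordinates = Collections.singletonMap("key", "row-1");
    try {
        Optional<Record> result = kuduLookupService.lookup(coordinates);
        result.ifPresent(record -> getLogger().info("Matched record: {}", new Object[]{record}));
    } catch (LookupFailureException e) {
        // Lookup errors surface as exceptions rather than empty results.
        throw new ProcessException(e);
    }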

[jira] [Updated] (NIFI-6381) Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread Matt Gilman (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6381?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-6381:
--
Fix Version/s: 1.10.0
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

> Make Parameters and Parameter Contexts searchable in UI
> ---
>
> Key: NIFI-6381
> URL: https://issues.apache.org/jira/browse/NIFI-6381
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Robert Fellows
>Priority: Minor
> Fix For: 1.10.0
>
>  Time Spent: 50m
>  Remaining Estimate: 0h
>






[jira] [Commented] (NIFI-6381) Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929528#comment-16929528
 ] 

ASF subversion and git services commented on NIFI-6381:
---

Commit 5ca3655dbff449ad9536e143c27bf2218c43695a in nifi's branch 
refs/heads/master from Rob Fellows
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=5ca3655 ]

NIFI-6381 - Make Parameters and Parameter Contexts searchable in UI

NIFI-6381 - remove wildcard imports

This closes #3728


> Make Parameters and Parameter Contexts searchable in UI
> ---
>
> Key: NIFI-6381
> URL: https://issues.apache.org/jira/browse/NIFI-6381
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Robert Fellows
>Priority: Minor
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>






[jira] [Commented] (NIFI-6381) Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6381?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929529#comment-16929529
 ] 

ASF subversion and git services commented on NIFI-6381:
---

Commit 5ca3655dbff449ad9536e143c27bf2218c43695a in nifi's branch 
refs/heads/master from Rob Fellows
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=5ca3655 ]

NIFI-6381 - Make Parameters and Parameter Contexts searchable in UI

NIFI-6381 - remove wildcard imports

This closes #3728


> Make Parameters and Parameter Contexts searchable in UI
> ---
>
> Key: NIFI-6381
> URL: https://issues.apache.org/jira/browse/NIFI-6381
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core Framework
>Reporter: Mark Payne
>Assignee: Robert Fellows
>Priority: Minor
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>






[GitHub] [nifi] mcgilman commented on issue #3728: NIFI-6381 - Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread GitBox
mcgilman commented on issue #3728: NIFI-6381 - Make Parameters and Parameter 
Contexts searchable in UI
URL: https://github.com/apache/nifi/pull/3728#issuecomment-531388188
 
 
   Thanks @rfellows! This has been merged to master.




[GitHub] [nifi] asfgit closed pull request #3728: NIFI-6381 - Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread GitBox
asfgit closed pull request #3728: NIFI-6381 - Make Parameters and Parameter 
Contexts searchable in UI
URL: https://github.com/apache/nifi/pull/3728
 
 
   




[GitHub] [nifi] mcgilman commented on a change in pull request #3728: NIFI-6381 - Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread GitBox
mcgilman commented on a change in pull request #3728: NIFI-6381 - Make 
Parameters and Parameter Contexts searchable in UI
URL: https://github.com/apache/nifi/pull/3728#discussion_r324354865
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/test/java/org/apache/nifi/web/controller/ControllerSearchServiceTest.java
 ##
 @@ -19,39 +19,56 @@
 import org.apache.nifi.authorization.Authorizer;
 import org.apache.nifi.authorization.RequestAction;
 import org.apache.nifi.authorization.user.NiFiUser;
+import org.apache.nifi.controller.FlowController;
 import org.apache.nifi.controller.ProcessorNode;
 import org.apache.nifi.controller.StandardProcessorNode;
+import org.apache.nifi.controller.flow.FlowManager;
 import org.apache.nifi.groups.ProcessGroup;
+import org.apache.nifi.parameter.Parameter;
+import org.apache.nifi.parameter.ParameterContext;
+import org.apache.nifi.parameter.ParameterContextManager;
+import org.apache.nifi.parameter.ParameterDescriptor;
 import org.apache.nifi.processor.Processor;
 import org.apache.nifi.registry.VariableRegistry;
 import org.apache.nifi.registry.flow.StandardVersionControlInformation;
 import org.apache.nifi.registry.flow.VersionControlInformation;
 import org.apache.nifi.registry.variable.MutableVariableRegistry;
 import org.apache.nifi.web.api.dto.search.SearchResultsDTO;
+import org.apache.nifi.web.dao.ParameterContextDAO;
 import org.junit.Before;
 import org.junit.Test;
 import org.mockito.AdditionalMatchers;
 import org.mockito.Mockito;
 
-import java.util.HashSet;
-import java.util.Optional;
+import java.util.*;
 
 Review comment:
   Please do not wildcard imports.
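
For reference, the checkstyle-friendly form lists each class explicitly; a
sketch of what the wildcard would expand to for this test (the exact set
depends on what the final code uses):

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Optional;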




[jira] [Updated] (NIFI-6671) UI:Parameters listed in "Reference parameter..." drop-down not listed alphabetically

2019-09-13 Thread Robert Fellows (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6671?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Fellows updated NIFI-6671:
-
Status: Patch Available  (was: In Progress)

> UI:Parameters listed in "Reference parameter..." drop-down not listed 
> alphabetically
> 
>
> Key: NIFI-6671
> URL: https://issues.apache.org/jira/browse/NIFI-6671
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Andrew Lim
>Assignee: Robert Fellows
>Priority: Major
> Attachments: alphabetical_order.png, wrong-order.png
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> I created a Parameter Context and then added three parameters in this order:
> level1
> level2
> level3
> But these parameters are listed in the following order when I try to select 
> them for a property value:
> level1
> level3
> level2
>  
> See attached screenshots. If not easily reproducible, I can attach a video of 
> my exact steps.





[GitHub] [nifi] mcgilman commented on a change in pull request #3728: NIFI-6381 - Make Parameters and Parameter Contexts searchable in UI

2019-09-13 Thread GitBox
mcgilman commented on a change in pull request #3728: NIFI-6381 - Make 
Parameters and Parameter Contexts searchable in UI
URL: https://github.com/apache/nifi/pull/3728#discussion_r324346503
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/controller/ControllerSearchService.java
 ##
 @@ -52,10 +55,7 @@
 import org.apache.nifi.web.api.dto.search.SearchResultGroupDTO;
 import org.apache.nifi.web.api.dto.search.SearchResultsDTO;
 
-import java.util.ArrayList;
-import java.util.Collection;
-import java.util.List;
-import java.util.Map;
+import java.util.*;
 
 Review comment:
   Please do not wildcard imports. To be honest, I'm a little surprised this 
passes checkstyle.




[GitHub] [nifi] rfellows opened a new pull request #3736: NIFI-6671 - sort parameters by name in the 'Reference Parameter' drop…

2019-09-13 Thread GitBox
rfellows opened a new pull request #3736: NIFI-6671 - sort parameters by name 
in the 'Reference Parameter' drop…
URL: https://github.com/apache/nifi/pull/3736
 
 
   …down.
   
   Thank you for submitting a contribution to Apache NiFi.
   
   Please provide a short description of the PR here:
   
    Description of PR
   
   Sorts the list of parameters used in the `Reference Parameter` dropdown
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [X] Is there a JIRA ticket associated with this PR? Is it referenced 
in the commit message?
   
   - [X] Does your PR title start with **NIFI-XXXX** where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [X] Has your PR been rebased against the latest commit within the target 
branch (typically `master`)?
   
   - [X] Is your initial contribution a single, squashed commit? _Additional 
commits in response to PR reviewer feedback should be made on this branch and 
pushed to allow change tracking. Do not `squash` or use `--force` when pushing 
to allow for clean monitoring of changes._
   
   ### For code changes:
   - [ ] Have you ensured that the full suite of tests is executed via `mvn 
-Pcontrib-check clean install` at the root `nifi` folder?
   - [ ] Have you written or updated unit tests to verify your changes?
   - [ ] Have you verified that the full build is successful on both JDK 8 and 
JDK 11?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
   - [ ] If applicable, have you updated the `LICENSE` file, including the main 
`LICENSE` file under `nifi-assembly`?
   - [ ] If applicable, have you updated the `NOTICE` file, including the main 
`NOTICE` file found under `nifi-assembly`?
   - [ ] If adding new Properties, have you added `.displayName` in addition to 
.name (programmatic access) for each of the new properties?
   
   ### For documentation related changes:
   - [ ] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
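
The actual change is in the UI's JavaScript, but the ordering rule itself is
simple; a Java sketch of the case-insensitive name sort the dropdown applies,
using the parameter names from the JIRA:

    List<String> parameterNames = new ArrayList<>(Arrays.asList("level1", "level3", "level2"));
    parameterNames.sort(String.CASE_INSENSITIVE_ORDER);
    // parameterNames is now [level1, level2, level3]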
   




[jira] [Assigned] (NIFI-6671) UI:Parameters listed in "Reference parameter..." drop-down not listed alphabetically

2019-09-13 Thread Robert Fellows (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6671?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Fellows reassigned NIFI-6671:


Assignee: Robert Fellows

> UI:Parameters listed in "Reference parameter..." drop-down not listed 
> alphabetically
> 
>
> Key: NIFI-6671
> URL: https://issues.apache.org/jira/browse/NIFI-6671
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Andrew Lim
>Assignee: Robert Fellows
>Priority: Major
> Attachments: alphabetical_order.png, wrong-order.png
>
>
> I created a Parameter Context and then added three parameters in this order:
> level1
> level2
> level3
> But these parameters are listed in the following order when I try to select 
> them for a property value:
> level1
> level3
> level2
>  
> See attached screenshots. If not easily reproducible, I can attach a video of 
> my exact steps.





[jira] [Updated] (NIFI-6671) UI:Parameters listed in "Reference parameter..." drop-down not listed alphabetically

2019-09-13 Thread Andrew Lim (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6671?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim updated NIFI-6671:
-
Attachment: alphabetical_order.png

> UI:Parameters listed in "Reference parameter..." drop-down not listed 
> alphabetically
> 
>
> Key: NIFI-6671
> URL: https://issues.apache.org/jira/browse/NIFI-6671
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Andrew Lim
>Priority: Major
> Attachments: alphabetical_order.png, wrong-order.png
>
>
> I created a Parameter Context and then added three parameters in this order:
> level1
> level2
> level3
> But these parameters are listed in the following order when I try to select 
> them for a property value:
> level1
> level3
> level2
>  
> See attached screenshots. If not easily reproducible, I can attach a video of 
> my exact steps.





[jira] [Updated] (NIFI-6671) UI:Parameters listed in "Reference parameter..." drop-down not listed alphabetically

2019-09-13 Thread Andrew Lim (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6671?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim updated NIFI-6671:
-
Attachment: wrong-order.png

> UI:Parameters listed in "Reference parameter..." drop-down not listed 
> alphabetically
> 
>
> Key: NIFI-6671
> URL: https://issues.apache.org/jira/browse/NIFI-6671
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Andrew Lim
>Priority: Major
> Attachments: alphabetical_order.png, wrong-order.png
>
>
> I created a Parameter Context and then added three parameters in this order:
> level1
> level2
> level3
> But these parameters are listed in the following order when I try to select 
> them for a property value:
> level1
> level3
> level2
>  
> See attached screenshots. If not easily reproducible, I can attach a video of 
> my exact steps.





[jira] [Created] (NIFI-6671) UI:Parameters listed in "Reference parameter..." drop-down not listed alphabetically

2019-09-13 Thread Andrew Lim (Jira)
Andrew Lim created NIFI-6671:


 Summary: UI:Parameters listed in "Reference parameter..." 
drop-down not listed alphabetically
 Key: NIFI-6671
 URL: https://issues.apache.org/jira/browse/NIFI-6671
 Project: Apache NiFi
  Issue Type: Sub-task
  Components: Core UI
Reporter: Andrew Lim


I created a Parameter Context and then added three parameters in this order:

level1

level2

level3

But these parameters are listed in the following order when I try to select 
them for a property value:

level1

level3

level2

 

See attached screenshots. If not easily reproducible, I can attach a video of 
my exact steps.





[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324346114
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/user-guide.adoc
 ##
 @@ -711,15 +711,187 @@ whatever comments are appropriate for this component. 
Use of the Comments tab is
 image::comments-tab.png["Comments Tab"]
 
 
-=== Additional Help
+==== Additional Help
 
 You can access additional documentation about each Processor's usage by 
right-clicking on the Processor and selecting 'Usage' from the context menu. 
Alternatively, select Help from the Global Menu in the top-right corner of the 
UI to display a Help page with all of the documentation, including usage 
documentation for all the Processors that are available. Click on the desired 
Processor to view usage documentation.
 
+[[Parameters]]
+=== Parameters
+The values of properties in the flow, including sensitive properties, can be 
parameterized using Parameters. Parameters are created and configured within 
the NiFi UI. Any property can be configured to reference a Parameter with the 
following conditions:
+
+ - A sensitive property can only reference a Sensitive Parameter
+ - A non-sensitive property can only reference a non-Sensitive Parameter
+ - Properties that reference Controller Services can not use Parameters
+
+NOTE: NiFi automatically picks up new or modified parameters.
+
+[[parameter-contexts]]
+==== Parameter Contexts
+Parameters are created within Parameter Contexts. Parameter Contexts are 
globally defined/accessible to the NiFi instance. Access policies can be 
applied to Parameter Contexts to determine which users can create them. Once 
created, policies to read and write to a specific Parameter Context can also be 
applied (see <> for more information).
+
+===== Creating a Parameter Context
+To create a Parameter Context, select Parameter Contexts from the Global Menu:
+
+image:parameter-contexts-selection.png["Global Menu - Parameter Contexts"]
+
+In the Parameter Contexts window, click the `+` button in the upper-right 
corner and the Add Parameter Context window opens. The window has two tabs: 
Settings and Parameters.
+
+image:parameter-contexts-settings.png["Parameter Contexts - Settings"]
+
+On the "Settings" tab, add a name for the Parameter Context and a description 
if desired.  Select "Apply" to save the Parameter Context or select the 
"Parameters" tab to add parameters to the context.
+
+===== Adding a Parameter to a Parameter Context
+Parameters can be added during Parameter Context creation or added to existing 
Parameter Contexts.
+
+During Parameter Context creation, select the "Parameters" tab. Click the `+` 
button to open the Add Parameter window.
+
+image:add-parameter-during-parameter-context-creation.png[Add Parameter]
+
+To add parameters to an existing Parameter Context, open the Parameter Context 
window and click the Edit button (image:iconEdit.png["Edit"]) in the row of the 
desired Parameter Context.
+
+image:edit-parameter-context.png[Edit Parameter Context]
+
+On the "Parameters" tab, click the `+` button to open the Add Parameter window.
+
+The Add Parameter window has the following settings:
+
+- *Name* - A name that is used to denote the Parameter. Only alpha-numeric 
characters (a-z, A-Z, 0-9), hyphens ( - ), underscores ( _ ), periods ( . ), 
and spaces are allowed.
+
+- *Value* - The value that will be used when the Parameter is referenced.
+
+- *Set empty string* - Check to explicitly set the value of the Parameter to an 
empty string. Unchecked by default.
+
+- *Sensitive Value* -  Set to "Yes" if the Parameter's Value should be 
considered sensitive. If sensitive, the value of the Parameter will not be 
shown in the UI once applied. The default setting is "No".
+
+- *Description* - A description that explains what the Parameter is, how it is 
to be used, etc. This field is optional.
+
+Once these settings are configured, select "Apply". Add additional Parameters 
or edit any existing Parameters.
+
+image:update-parameter-context.png[Update Parameter Context]
+
+To complete the process, select "Apply" from the Parameter Context window. The 
following operations are performed to validate all components that reference 
the added or modified parameters: Stopping/Restarting affected Processors, 
Disabling/Re-enabling affected Controller Services, Updating Parameter Context.
+
+image:parameters-validate-affected-components.png[Validate Affected Components]
+
+The Referencing Components section lists any components referencing the 
parameters in the parameter context organized by process group.
+
+[[assigning_parameter_context_to_PG]]
+==== Assigning a Parameter Context to a Process Group
+For a component to reference a Parameter, its Process Group must first be 
assigned to a Parameter Context. Once assigned, processors and controller 
services within that Process Group may only reference Parameters 
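
The quoted guide text is cut off above. As a concrete illustration of the
referencing it describes, a property value refers to a Parameter with the
#{...} syntax; for example, a hypothetical sensitive password property could
be set to:

    #{db.password}

where db.password is a Sensitive Parameter defined in the Parameter Context
assigned to the enclosing Process Group.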

[jira] [Updated] (NIFI-6670) Create a RecordReader that reads lines of text into single-field records

2019-09-13 Thread Matt Burgess (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-6670:
---
Status: Patch Available  (was: In Progress)

> Create a RecordReader that reads lines of text into single-field records
> 
>
> Key: NIFI-6670
> URL: https://issues.apache.org/jira/browse/NIFI-6670
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> It would be nice to have a reader that can take any textual input and treat 
> each "line" as a single-field record. This is like CSVReader but there 
> wouldn't be a field delimiter; rather, a property to specify the name of the 
> field, and each line becomes a value for that field in the record.
> Additional capabilities could be added as well, such as skipping header 
> lines, grouping lines together as a single field value, ignoring empty lines, 
> etc.





[GitHub] [nifi] mattyb149 opened a new pull request #3735: NIFI-6670: Add TextLineReader to read lines of text as single-field records

2019-09-13 Thread GitBox
mattyb149 opened a new pull request #3735: NIFI-6670: Add TextLineReader to 
read lines of text as single-field records
URL: https://github.com/apache/nifi/pull/3735
 
 
   Thank you for submitting a contribution to Apache NiFi.
   
   Please provide a short description of the PR here:
   
    Description of PR
   
   _Enables X functionality; fixes bug NIFI-XXXX._
   
   In order to streamline the review of the contribution we ask you
   to ensure the following steps have been taken:
   
   ### For all changes:
   - [x] Is there a JIRA ticket associated with this PR? Is it referenced 
in the commit message?
   
   - [x] Does your PR title start with **NIFI-XXXX** where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [x] Has your PR been rebased against the latest commit within the target 
branch (typically `master`)?
   
   - [x] Is your initial contribution a single, squashed commit? _Additional 
commits in response to PR reviewer feedback should be made on this branch and 
pushed to allow change tracking. Do not `squash` or use `--force` when pushing 
to allow for clean monitoring of changes._
   
   ### For code changes:
   - [x] Have you ensured that the full suite of tests is executed via `mvn 
-Pcontrib-check clean install` at the root `nifi` folder?
   - [x] Have you written or updated unit tests to verify your changes?
   - [ ] Have you verified that the full build is successful on both JDK 8 and 
JDK 11?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
   - [ ] If applicable, have you updated the `LICENSE` file, including the main 
`LICENSE` file under `nifi-assembly`?
   - [ ] If applicable, have you updated the `NOTICE` file, including the main 
`NOTICE` file found under `nifi-assembly`?
   - [x] If adding new Properties, have you added `.displayName` in addition to 
.name (programmatic access) for each of the new properties?
   
   ### For documentation related changes:
   - [x] Have you ensured that format looks appropriate for the output in which 
it is rendered?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
   




[jira] [Commented] (NIFI-6670) Create a RecordReader that reads lines of text into single-field records

2019-09-13 Thread Matt Burgess (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929438#comment-16929438
 ] 

Matt Burgess commented on NIFI-6670:


This might also be possible with GrokReader, but you have to supply the nominal 
schema rather than just specifying the field name. TextLineReader would be much 
more straightforward. One of the use cases is to take a file full of SQL 
statements and execute them as a transaction using PutDatabaseRecord. That 
avoids the need for splitting and using PutSQL, and results in a more efficient 
flow.

> Create a RecordReader that reads lines of text into single-field records
> 
>
> Key: NIFI-6670
> URL: https://issues.apache.org/jira/browse/NIFI-6670
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Priority: Major
>
> It would be nice to have a reader that can take any textual input and treat 
> each "line" as a single-field record. This is like CSVReader but there 
> wouldn't be a field delimiter; rather, a property to specify the name of the 
> field, and each line becomes a value for that field in the record.
> Additional capabilities could be added as well, such as skipping header 
> lines, grouping lines together as a single field value, ignoring empty lines, 
> etc.





[jira] [Assigned] (NIFI-6670) Create a RecordReader that reads lines of text into single-field records

2019-09-13 Thread Matt Burgess (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess reassigned NIFI-6670:
--

Assignee: Matt Burgess

> Create a RecordReader that reads lines of text into single-field records
> 
>
> Key: NIFI-6670
> URL: https://issues.apache.org/jira/browse/NIFI-6670
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Matt Burgess
>Assignee: Matt Burgess
>Priority: Major
>
> It would be nice to have a reader that can take any textual input and treat 
> each "line" as a single-field record. This is like CSVReader but there 
> wouldn't be a field delimiter; rather, a property to specify the name of the 
> field, and each line becomes a value for that field in the record.
> Additional capabilities could be added as well, such as skipping header 
> lines, grouping lines together as a single field value, ignoring empty lines, 
> etc.





[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324303061
 
 

 ##
 File path: 
nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java
 ##
 @@ -225,17 +232,109 @@ public static boolean isCompatibleDataType(final Object 
value, final DataType da
 }
 
 public static DataType chooseDataType(final Object value, final 
ChoiceDataType choiceType) {
-for (final DataType subType : choiceType.getPossibleSubTypes()) {
-if (isCompatibleDataType(value, subType)) {
-if (subType.getFieldType() == RecordFieldType.CHOICE) {
-return chooseDataType(value, (ChoiceDataType) subType);
-}
+Queue<DataType> possibleSubTypes = new 
LinkedList<>(choiceType.getPossibleSubTypes());
+Set<DataType> possibleSimpleSubTypes = new HashSet<>();
 
-return subType;
+while (possibleSubTypes.peek() != null) {
+DataType subType = possibleSubTypes.poll();
+if (subType instanceof ChoiceDataType) {
+possibleSubTypes.addAll(((ChoiceDataType) 
subType).getPossibleSubTypes());
+} else {
+possibleSimpleSubTypes.add(subType);
 }
 }
 
-return null;
+List<DataType> compatibleSimpleSubTypes = 
possibleSimpleSubTypes.stream()
 
 Review comment:
   This method is invoked quite a lot, by many different writers, which means 
that performance is quite a large concern here. We need to avoid the use of 
Streams, as creation of streams is quite expensive. To demonstrate, I created a 
simple unit test that creates a schema containing 3 fields. Each is a CHOICE 
between int, float, string. Then I used JSON Writer to write a Record 1M times. 
It took about 1500 milliseconds on my laptop (on average, after letting the JVM 
warm up). Then I wrote another test that did the same thing but for the fields 
made one an INT, one a FLOAT, and one a String. It took only 400 milliseconds 
(on average, after letting the JVM warm up). A quick profiling of the 
application does indeed show that the majority of the time spent was in calls 
to `stream()`, `ReferencePipeline.collect()`, `ReferencePipeline.findFirst()` 
and `HashSet.add()`. We may not be able to eliminate the calls to 
`HashSet.add()` but we can eliminate the use of Streams (ReferencePipeline).
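
An imperative rewrite of the kind suggested might look like this sketch
(names follow the quoted code; not the final patch):

    // Filter compatible simple sub-types without allocating a Stream pipeline.
    final List<DataType> compatibleSimpleSubTypes = new ArrayList<>();
    for (final DataType subType : possibleSimpleSubTypes) {
        if (isCompatibleDataType(value, subType)) {
            compatibleSimpleSubTypes.add(subType);
        }
    }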




[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324308283
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/AbstractTestConversion.java
 ##
 @@ -0,0 +1,396 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.avro.AvroReader;
+import org.apache.nifi.avro.AvroReaderWithEmbeddedSchema;
+import org.apache.nifi.avro.AvroRecordSetWriter;
+import org.apache.nifi.csv.CSVReader;
+import org.apache.nifi.csv.CSVRecordSetWriter;
+import org.apache.nifi.json.JsonRecordSetWriter;
+import org.apache.nifi.json.JsonTreeReader;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.apache.nifi.xml.XMLReader;
+import org.apache.nifi.xml.XMLRecordSetWriter;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
+import java.util.function.Consumer;
+
+import static org.junit.Assert.assertEquals;
+
+public abstract class AbstractTestConversion {
+protected RecordReaderFactory reader;
+protected Consumer inputHandler;
+protected Consumer readerConfigurer;
+
+protected RecordSetWriterFactory writer;
+protected Consumer resultHandler;
+protected Consumer writerConfigurer;
+
+@Before
+public void setUp() throws Exception {
+reader = null;
+inputHandler = null;
+readerConfigurer = null;
+
+writer = null;
+resultHandler = null;
+writerConfigurer = null;
+}
+
+@Test
+public void testCsvToJson() throws Exception {
+fromCsv(csvPostfix());
+toJson(jsonPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvro() throws Exception {
+fromCsv(csvPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvroToCsv() throws Exception {
+fromCsv(csvPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toCsv(csvPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testCsvToXml() throws Exception {
+fromCsv(csvPostfix());
+toXml(xmlPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToCsv() throws Exception {
+fromJson(jsonPostfix());
+toCsv(csvPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvro() throws Exception {
+fromJson(jsonPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvroToJson() throws Exception {
+fromJson(jsonPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toJson(jsonPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testAvroToCsv() throws Exception {
+fromAvro(avroPostfix());
+

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324307616
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/AbstractTestConversion.java
 ##
 @@ -0,0 +1,396 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.avro.AvroReader;
+import org.apache.nifi.avro.AvroReaderWithEmbeddedSchema;
+import org.apache.nifi.avro.AvroRecordSetWriter;
+import org.apache.nifi.csv.CSVReader;
+import org.apache.nifi.csv.CSVRecordSetWriter;
+import org.apache.nifi.json.JsonRecordSetWriter;
+import org.apache.nifi.json.JsonTreeReader;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.apache.nifi.xml.XMLReader;
+import org.apache.nifi.xml.XMLRecordSetWriter;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
+import java.util.function.Consumer;
+
+import static org.junit.Assert.assertEquals;
+
+public abstract class AbstractTestConversion {
+protected RecordReaderFactory reader;
+protected Consumer inputHandler;
+protected Consumer readerConfigurer;
+
+protected RecordSetWriterFactory writer;
+protected Consumer resultHandler;
+protected Consumer writerConfigurer;
+
+@Before
+public void setUp() throws Exception {
+reader = null;
+inputHandler = null;
+readerConfigurer = null;
+
+writer = null;
+resultHandler = null;
+writerConfigurer = null;
+}
+
+@Test
+public void testCsvToJson() throws Exception {
+fromCsv(csvPostfix());
+toJson(jsonPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvro() throws Exception {
+fromCsv(csvPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvroToCsv() throws Exception {
+fromCsv(csvPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toCsv(csvPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testCsvToXml() throws Exception {
+fromCsv(csvPostfix());
+toXml(xmlPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToCsv() throws Exception {
+fromJson(jsonPostfix());
+toCsv(csvPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvro() throws Exception {
+fromJson(jsonPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvroToJson() throws Exception {
+fromJson(jsonPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toJson(jsonPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testAvroToCsv() throws Exception {
+fromAvro(avroPostfix());
+

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324306962
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-extension-utils/nifi-record-utils/nifi-avro-record-utils/src/main/java/org/apache/nifi/avro/AvroTypeUtil.java
 ##
 @@ -877,6 +878,18 @@ private static Object convertToAvroObject(final Object 
rawValue, final Schema fi
  */
 private static Object convertUnionFieldValue(final Object originalValue, 
final Schema fieldSchema, final Function<Schema, Object> conversion, final 
String fieldName) {
 boolean foundNonNull = false;
+
+Optional<DataType> mostSuitableType = DataTypeUtils.findMostSuitableType(
+originalValue,
+fieldSchema.getTypes().stream().filter(schema -> 
schema.getType() != Type.NULL).collect(Collectors.toList()),
+subSchema -> AvroTypeUtil.determineDataType(subSchema)
+);
+if (mostSuitableType.isPresent()) {
+Object convertedValue = conversion.apply(mostSuitableType.get());
 
 Review comment:
   This would be made simpler by simply returning `conversion.apply(...)`
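
i.e., something along the lines of this sketch (surrounding code elided):

    if (mostSuitableType.isPresent()) {
        // Return the converted value directly instead of assigning it first.
        return conversion.apply(mostSuitableType.get());
    }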




[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324310345
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/schema/inference/FieldTypeInference.java
 ##
 @@ -62,36 +62,44 @@ public void addPossibleDataType(final DataType dataType) {
 final RecordSchema newSchema = ((RecordDataType) 
dataType).getChildSchema();
 
 final RecordSchema mergedSchema = 
DataTypeUtils.merge(singleDataTypeSchema, newSchema);
+possibleDataTypes.remove(singleDataType);
 singleDataType = 
RecordFieldType.RECORD.getRecordDataType(mergedSchema);
+possibleDataTypes.add(singleDataType);
 return;
 }
 
-if (singleFieldType.isWiderThan(additionalFieldType)) {
-// Assigned type is already wide enough to encompass the given type
-return;
+if (possibleDataTypes.isEmpty()) {
+possibleDataTypes.add(singleDataType);
 }
 
-if (additionalFieldType.isWiderThan(singleFieldType)) {
-// The given type is wide enough to encompass the assigned type. 
So changed the assigned type to the given type.
-singleDataType = dataType;
-return;
-}
+boolean hasWiderNonString = possibleDataTypes.stream()
 
 Review comment:
   This method is also evaluated against potentially every field in any record 
whose schema is being inferred, so performance is important. As such, we should 
avoid use of `Stream`s and instead prefer imperative style programming.
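   
   An imperative equivalent of the Stream pipeline in the diff (a sketch of this 
suggestion, reusing the names from the diff above):
   
   boolean hasWiderNonString = false;
   for (final DataType possibleDataType : possibleDataTypes) {
       final RecordFieldType fieldType = possibleDataType.getFieldType();
       // Ignore STRING; look for any type that is already wider than the new one.
       if (fieldType != RecordFieldType.STRING && fieldType.isWiderThan(additionalFieldType)) {
           hasWiderNonString = true;
           break;
       }
   }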


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324311037
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/schema/inference/FieldTypeInference.java
 ##
 @@ -62,36 +62,44 @@ public void addPossibleDataType(final DataType dataType) {
 final RecordSchema newSchema = ((RecordDataType) 
dataType).getChildSchema();
 
 final RecordSchema mergedSchema = 
DataTypeUtils.merge(singleDataTypeSchema, newSchema);
+possibleDataTypes.remove(singleDataType);
 singleDataType = 
RecordFieldType.RECORD.getRecordDataType(mergedSchema);
+possibleDataTypes.add(singleDataType);
 return;
 }
 
-if (singleFieldType.isWiderThan(additionalFieldType)) {
-// Assigned type is already wide enough to encompass the given type
-return;
+if (possibleDataTypes.isEmpty()) {
+possibleDataTypes.add(singleDataType);
 }
 
-if (additionalFieldType.isWiderThan(singleFieldType)) {
-// The given type is wide enough to encompass the assigned type. 
So changed the assigned type to the given type.
-singleDataType = dataType;
-return;
-}
+boolean hasWiderNonString = possibleDataTypes.stream()
+.map(DataType::getFieldType)
+.filter(possibleDataType -> 
!possibleDataType.equals(RecordFieldType.STRING))
+.filter(possibleDataType -> 
possibleDataType.isWiderThan(additionalFieldType))
+.findAny()
+.isPresent();
+
+if (!hasWiderNonString) {
+java.util.Iterator<DataType> possibleDataTypeIterator = 
possibleDataTypes.iterator();
 
 Review comment:
   Should import `java.util.Iterator` rather than declaring the full package 
inline.
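   
   I.e. (trivially):
   
   import java.util.Iterator;
   ...
   Iterator<DataType> possibleDataTypeIterator = possibleDataTypes.iterator();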


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324307989
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/AbstractTestConversion.java
 ##
 @@ -0,0 +1,396 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.avro.AvroReader;
+import org.apache.nifi.avro.AvroReaderWithEmbeddedSchema;
+import org.apache.nifi.avro.AvroRecordSetWriter;
+import org.apache.nifi.csv.CSVReader;
+import org.apache.nifi.csv.CSVRecordSetWriter;
+import org.apache.nifi.json.JsonRecordSetWriter;
+import org.apache.nifi.json.JsonTreeReader;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.apache.nifi.xml.XMLReader;
+import org.apache.nifi.xml.XMLRecordSetWriter;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
+import java.util.function.Consumer;
+
+import static org.junit.Assert.assertEquals;
+
+public abstract class AbstractTestConversion {
+protected RecordReaderFactory reader;
+protected Consumer inputHandler;
+protected Consumer readerConfigurer;
+
+protected RecordSetWriterFactory writer;
+protected Consumer resultHandler;
+protected Consumer writerConfigurer;
+
+@Before
+public void setUp() throws Exception {
+reader = null;
+inputHandler = null;
+readerConfigurer = null;
+
+writer = null;
+resultHandler = null;
+writerConfigurer = null;
+}
+
+@Test
+public void testCsvToJson() throws Exception {
+fromCsv(csvPostfix());
+toJson(jsonPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvro() throws Exception {
+fromCsv(csvPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvroToCsv() throws Exception {
+fromCsv(csvPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toCsv(csvPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testCsvToXml() throws Exception {
+fromCsv(csvPostfix());
+toXml(xmlPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToCsv() throws Exception {
+fromJson(jsonPostfix());
+toCsv(csvPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvro() throws Exception {
+fromJson(jsonPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvroToJson() throws Exception {
+fromJson(jsonPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toJson(jsonPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testAvroToCsv() throws Exception {
+fromAvro(avroPostfix());
+

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324308808
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestConversionWithExplicitSchema.java
 ##
 @@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.avro.Schema;
+import org.apache.nifi.avro.AvroReaderWithExplicitSchema;
+import org.apache.nifi.avro.AvroTypeUtil;
+import org.apache.nifi.csv.CSVUtils;
+import org.apache.nifi.schema.access.SchemaAccessUtils;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.util.TestRunner;
+import org.junit.Before;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Map;
+
+public class TestConversionWithExplicitSchema extends AbstractTestConversion {
 
 Review comment:
  Any of these unit tests that are testing conversion from A to B are fine, but 
they should probably be integration tests, not unit tests, so this one should 
be named ConversionWithExplicitSchemaIT.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324307403
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/AbstractTestConversion.java
 ##
 @@ -0,0 +1,396 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.avro.AvroReader;
+import org.apache.nifi.avro.AvroReaderWithEmbeddedSchema;
+import org.apache.nifi.avro.AvroRecordSetWriter;
+import org.apache.nifi.csv.CSVReader;
+import org.apache.nifi.csv.CSVRecordSetWriter;
+import org.apache.nifi.json.JsonRecordSetWriter;
+import org.apache.nifi.json.JsonTreeReader;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.apache.nifi.xml.XMLReader;
+import org.apache.nifi.xml.XMLRecordSetWriter;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
+import java.util.function.Consumer;
+
+import static org.junit.Assert.assertEquals;
+
+public abstract class AbstractTestConversion {
+protected RecordReaderFactory reader;
+protected Consumer inputHandler;
+protected Consumer readerConfigurer;
+
+protected RecordSetWriterFactory writer;
+protected Consumer resultHandler;
+protected Consumer writerConfigurer;
+
+@Before
+public void setUp() throws Exception {
+reader = null;
+inputHandler = null;
+readerConfigurer = null;
+
+writer = null;
+resultHandler = null;
+writerConfigurer = null;
+}
+
+@Test
+public void testCsvToJson() throws Exception {
+fromCsv(csvPostfix());
+toJson(jsonPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvro() throws Exception {
+fromCsv(csvPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvroToCsv() throws Exception {
+fromCsv(csvPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toCsv(csvPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testCsvToXml() throws Exception {
+fromCsv(csvPostfix());
+toXml(xmlPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToCsv() throws Exception {
+fromJson(jsonPostfix());
+toCsv(csvPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvro() throws Exception {
+fromJson(jsonPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvroToJson() throws Exception {
+fromJson(jsonPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toJson(jsonPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testAvroToCsv() throws Exception {
+fromAvro(avroPostfix());
+

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324306453
 
 

 ##
 File path: 
nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java
 ##
 @@ -440,12 +539,12 @@ public static DataType inferDataType(final Object value, 
final DataType defaultT
 //final DataType elementDataType = inferDataType(valueFromMap, 
RecordFieldType.STRING.getDataType());
 //return RecordFieldType.MAP.getMapDataType(elementDataType);
 }
-if (value instanceof Object[]) {
-final Object[] array = (Object[]) value;
-
+if (value.getClass().isArray()) {
 
 Review comment:
   Neat! I don't think I realized that `Class.isArray()` and 
`Array.getLength()` / `Array.get()` were a thing.
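   
   For reference, a small self-contained illustration of those reflection calls 
(printElements is a made-up name, not NiFi code):
   
   import java.lang.reflect.Array;
   
   public class ArrayReflectionDemo {
       // Works for any array, including primitive arrays such as int[],
       // which an `instanceof Object[]` check would miss.
       static void printElements(final Object value) {
           if (value.getClass().isArray()) {
               final int length = Array.getLength(value);
               for (int i = 0; i < length; i++) {
                   System.out.println(Array.get(value, i)); // boxes primitives automatically
               }
           }
       }
   
       public static void main(final String[] args) {
           printElements(new int[] {1, 2, 3});
       }
   }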


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324307536
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/AbstractTestConversion.java
 ##
 @@ -0,0 +1,396 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.avro.AvroReader;
+import org.apache.nifi.avro.AvroReaderWithEmbeddedSchema;
+import org.apache.nifi.avro.AvroRecordSetWriter;
+import org.apache.nifi.csv.CSVReader;
+import org.apache.nifi.csv.CSVRecordSetWriter;
+import org.apache.nifi.json.JsonRecordSetWriter;
+import org.apache.nifi.json.JsonTreeReader;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.apache.nifi.xml.XMLReader;
+import org.apache.nifi.xml.XMLRecordSetWriter;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Map;
+import java.util.Optional;
+import java.util.UUID;
+import java.util.function.Consumer;
+
+import static org.junit.Assert.assertEquals;
+
+public abstract class AbstractTestConversion {
+protected RecordReaderFactory reader;
+protected Consumer inputHandler;
+protected Consumer readerConfigurer;
+
+protected RecordSetWriterFactory writer;
+protected Consumer resultHandler;
+protected Consumer writerConfigurer;
+
+@Before
+public void setUp() throws Exception {
+reader = null;
+inputHandler = null;
+readerConfigurer = null;
+
+writer = null;
+resultHandler = null;
+writerConfigurer = null;
+}
+
+@Test
+public void testCsvToJson() throws Exception {
+fromCsv(csvPostfix());
+toJson(jsonPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvro() throws Exception {
+fromCsv(csvPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testCsvToAvroToCsv() throws Exception {
+fromCsv(csvPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toCsv(csvPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testCsvToXml() throws Exception {
+fromCsv(csvPostfix());
+toXml(xmlPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToCsv() throws Exception {
+fromJson(jsonPostfix());
+toCsv(csvPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvro() throws Exception {
+fromJson(jsonPostfix());
+toAvro(avroPostfix());
+
+testConversion(reader, readerConfigurer, writer, writerConfigurer, 
inputHandler, resultHandler);
+}
+
+@Test
+public void testJsonToAvroToJson() throws Exception {
+fromJson(jsonPostfix());
+
+AvroRecordSetWriter writer2 = new AvroRecordSetWriter();
+AvroReader reader2 = new AvroReader();
+
+toJson(jsonPostfix());
+
+testChain(writer2, reader2);
+}
+
+@Test
+public void testAvroToCsv() throws Exception {
+fromAvro(avroPostfix());
+

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324304179
 
 

 ##
 File path: 
nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java
 ##
 @@ -225,17 +232,109 @@ public static boolean isCompatibleDataType(final Object 
value, final DataType da
 }
 
 public static DataType chooseDataType(final Object value, final 
ChoiceDataType choiceType) {
-for (final DataType subType : choiceType.getPossibleSubTypes()) {
-if (isCompatibleDataType(value, subType)) {
-if (subType.getFieldType() == RecordFieldType.CHOICE) {
-return chooseDataType(value, (ChoiceDataType) subType);
-}
+Queue<DataType> possibleSubTypes = new 
LinkedList<>(choiceType.getPossibleSubTypes());
+Set<DataType> possibleSimpleSubTypes = new HashSet<>();
 
-return subType;
+while (possibleSubTypes.peek() != null) {
+DataType subType = possibleSubTypes.poll();
+if (subType instanceof ChoiceDataType) {
+possibleSubTypes.addAll(((ChoiceDataType) 
subType).getPossibleSubTypes());
+} else {
+possibleSimpleSubTypes.add(subType);
 }
 }
 
-return null;
+List<DataType> compatibleSimpleSubTypes = 
possibleSimpleSubTypes.stream()
+.filter(subType -> isCompatibleDataType(value, subType))
+.collect(Collectors.toList());
+
+int nrOfCompatibleSimpleSubTypes = compatibleSimpleSubTypes.size();
+
+DataType chosenSimpleType;
+if (nrOfCompatibleSimpleSubTypes == 0) {
+chosenSimpleType = null;
+} else if (nrOfCompatibleSimpleSubTypes == 1) {
+chosenSimpleType = compatibleSimpleSubTypes.get(0);
+} else {
+chosenSimpleType = findMostSuitableType(value, 
compatibleSimpleSubTypes, Function.identity())
+.orElse(compatibleSimpleSubTypes.get(0));
+}
+
+return chosenSimpleType;
+}
+
+public static <T> Optional<T> findMostSuitableType(Object value, List<T> 
types, Function<T, DataType> dataTypeMapper) {
+final Optional<T> mostSuitableType;
+
+Optional<DataType> inferredDataTypeOptional = 
Optional.ofNullable(inferDataType(value, null))
+.filter(dataType -> 
!dataType.getFieldType().equals(RecordFieldType.STRING));
+
+if (value instanceof String) {
+mostSuitableType = findMostSuitableTypeByStringValue((String) 
value, types, dataTypeMapper);
+} else if (inferredDataTypeOptional.isPresent()) {
+DataType inferredDataType = inferredDataTypeOptional.get();
+
+Optional<T> inferredTypeOptional = types.stream()
+.filter(type -> 
dataTypeMapper.apply(type).equals(inferredDataType))
+.findFirst();
+
+if (inferredTypeOptional.isPresent()) {
+mostSuitableType = inferredTypeOptional;
+} else {
+Optional<T> widerAvailableTypeOptional = types.stream()
+.map(type -> getWiderType(dataTypeMapper.apply(type), 
inferredDataType).isPresent() ? type : null)
+.filter(Objects::nonNull)
+.findFirst();
+
+if (widerAvailableTypeOptional.isPresent()) {
+mostSuitableType = widerAvailableTypeOptional;
+} else {
+mostSuitableType = Optional.empty();
+}
+}
+} else {
+mostSuitableType = Optional.empty();
+}
+
+return mostSuitableType;
+}
+
+public static <T> Optional<T> findMostSuitableTypeByStringValue(String 
valueAsString, List<T> types, Function<T, DataType> dataTypeMapper) {
+Optional<T> mostSuitableType = types.stream()
+// Sorting based on the RecordFieldType enum ordering looks 
appropriate here as we want simpler types
+//  first and the enum's ordering seems to reflect that
+.sorted((type1, type2) -> {
+int comparison;
+
+RecordFieldType dataType1 = 
dataTypeMapper.apply(type1).getFieldType();
+RecordFieldType dataType2 = 
dataTypeMapper.apply(type2).getFieldType();
+
+// Moving TIMESTAMP to the front (at least it 
should precede DATE)
+if (dataType1 == RecordFieldType.TIMESTAMP) {
 
 Review comment:
   This seems a bit convoluted. Does it not make sense to instead just move the 
TIMESTAMP before DATE in the enum, if this is what we want?
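   
   If the enum were reordered that way, the custom comparator above could 
collapse to the natural enum order, e.g. (hypothetical, assuming TIMESTAMP is 
declared before DATE in RecordFieldType):
   
   .sorted(Comparator.comparing(type -> dataTypeMapper.apply(type).getFieldType()))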


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324281234
 
 

 ##
 File path: 
nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java
 ##
 @@ -225,17 +232,109 @@ public static boolean isCompatibleDataType(final Object 
value, final DataType da
 }
 
 public static DataType chooseDataType(final Object value, final 
ChoiceDataType choiceType) {
-for (final DataType subType : choiceType.getPossibleSubTypes()) {
-if (isCompatibleDataType(value, subType)) {
-if (subType.getFieldType() == RecordFieldType.CHOICE) {
-return chooseDataType(value, (ChoiceDataType) subType);
-}
+Queue<DataType> possibleSubTypes = new 
LinkedList<>(choiceType.getPossibleSubTypes());
 
 Review comment:
   Is there a reason to use a Queue/LinkedList here? I think this should just 
be an ArrayList and iterate over the values, rather than using `peek` & `poll`
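   
   A sketch of that suggestion (names taken from the diff above; appending to 
the list while indexing through it replaces the peek/poll semantics):
   
   final List<DataType> pending = new ArrayList<>(choiceType.getPossibleSubTypes());
   final Set<DataType> possibleSimpleSubTypes = new HashSet<>();
   for (int i = 0; i < pending.size(); i++) {
       final DataType subType = pending.get(i);
       if (subType instanceof ChoiceDataType) {
           // Nested CHOICE: append its subtypes so they are visited later.
           pending.addAll(((ChoiceDataType) subType).getPossibleSubTypes());
       } else {
           possibleSimpleSubTypes.add(subType);
       }
   }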


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324305054
 
 

 ##
 File path: 
nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java
 ##
 @@ -225,17 +232,109 @@ public static boolean isCompatibleDataType(final Object 
value, final DataType da
 }
 
 public static DataType chooseDataType(final Object value, final 
ChoiceDataType choiceType) {
-for (final DataType subType : choiceType.getPossibleSubTypes()) {
-if (isCompatibleDataType(value, subType)) {
-if (subType.getFieldType() == RecordFieldType.CHOICE) {
-return chooseDataType(value, (ChoiceDataType) subType);
-}
+Queue<DataType> possibleSubTypes = new 
LinkedList<>(choiceType.getPossibleSubTypes());
+Set<DataType> possibleSimpleSubTypes = new HashSet<>();
 
-return subType;
+while (possibleSubTypes.peek() != null) {
+DataType subType = possibleSubTypes.poll();
+if (subType instanceof ChoiceDataType) {
+possibleSubTypes.addAll(((ChoiceDataType) 
subType).getPossibleSubTypes());
+} else {
+possibleSimpleSubTypes.add(subType);
 }
 }
 
-return null;
+List<DataType> compatibleSimpleSubTypes = 
possibleSimpleSubTypes.stream()
+.filter(subType -> isCompatibleDataType(value, subType))
+.collect(Collectors.toList());
+
+int nrOfCompatibleSimpleSubTypes = compatibleSimpleSubTypes.size();
+
+DataType chosenSimpleType;
+if (nrOfCompatibleSimpleSubTypes == 0) {
+chosenSimpleType = null;
+} else if (nrOfCompatibleSimpleSubTypes == 1) {
+chosenSimpleType = compatibleSimpleSubTypes.get(0);
+} else {
+chosenSimpleType = findMostSuitableType(value, 
compatibleSimpleSubTypes, Function.identity())
+.orElse(compatibleSimpleSubTypes.get(0));
+}
+
+return chosenSimpleType;
+}
+
+public static <T> Optional<T> findMostSuitableType(Object value, List<T> 
types, Function<T, DataType> dataTypeMapper) {
+final Optional<T> mostSuitableType;
+
+Optional<DataType> inferredDataTypeOptional = 
Optional.ofNullable(inferDataType(value, null))
+.filter(dataType -> 
!dataType.getFieldType().equals(RecordFieldType.STRING));
+
+if (value instanceof String) {
+mostSuitableType = findMostSuitableTypeByStringValue((String) 
value, types, dataTypeMapper);
+} else if (inferredDataTypeOptional.isPresent()) {
+DataType inferredDataType = inferredDataTypeOptional.get();
+
+Optional<T> inferredTypeOptional = types.stream()
+.filter(type -> 
dataTypeMapper.apply(type).equals(inferredDataType))
+.findFirst();
+
+if (inferredTypeOptional.isPresent()) {
+mostSuitableType = inferredTypeOptional;
+} else {
+Optional<T> widerAvailableTypeOptional = types.stream()
+.map(type -> getWiderType(dataTypeMapper.apply(type), 
inferredDataType).isPresent() ? type : null)
+.filter(Objects::nonNull)
+.findFirst();
+
+if (widerAvailableTypeOptional.isPresent()) {
+mostSuitableType = widerAvailableTypeOptional;
+} else {
+mostSuitableType = Optional.empty();
+}
+}
+} else {
+mostSuitableType = Optional.empty();
+}
+
+return mostSuitableType;
+}
+
+public static <T> Optional<T> findMostSuitableTypeByStringValue(String 
valueAsString, List<T> types, Function<T, DataType> dataTypeMapper) {
+Optional<T> mostSuitableType = types.stream()
+// Sorting based on the RecordFieldType enum ordering looks 
appropriate here as we want simpler types
+//  first and the enum's ordering seems to reflect that
+.sorted((type1, type2) -> {
+int comparison;
+
+RecordFieldType dataType1 = 
dataTypeMapper.apply(type1).getFieldType();
+RecordFieldType dataType2 = 
dataTypeMapper.apply(type2).getFieldType();
+
+// Moving TIMESTAMP to the front (at least it 
should precede DATE)
+if (dataType1 == RecordFieldType.TIMESTAMP) {
+comparison = -1;
+} else if (dataType2 == RecordFieldType.TIMESTAMP) 
{
+comparison = 1;
+} else {
+comparison = dataType1.compareTo(dataType2);
+}
+
+return comparison;
+ 

[GitHub] [nifi] markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
markap14 commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324304574
 
 

 ##
 File path: 
nifi-commons/nifi-record/src/main/java/org/apache/nifi/serialization/record/util/DataTypeUtils.java
 ##
 @@ -225,17 +232,109 @@ public static boolean isCompatibleDataType(final Object 
value, final DataType da
 }
 
 public static DataType chooseDataType(final Object value, final 
ChoiceDataType choiceType) {
-for (final DataType subType : choiceType.getPossibleSubTypes()) {
-if (isCompatibleDataType(value, subType)) {
-if (subType.getFieldType() == RecordFieldType.CHOICE) {
-return chooseDataType(value, (ChoiceDataType) subType);
-}
+Queue<DataType> possibleSubTypes = new 
LinkedList<>(choiceType.getPossibleSubTypes());
+Set<DataType> possibleSimpleSubTypes = new HashSet<>();
 
-return subType;
+while (possibleSubTypes.peek() != null) {
+DataType subType = possibleSubTypes.poll();
+if (subType instanceof ChoiceDataType) {
+possibleSubTypes.addAll(((ChoiceDataType) 
subType).getPossibleSubTypes());
+} else {
+possibleSimpleSubTypes.add(subType);
 }
 }
 
-return null;
+List<DataType> compatibleSimpleSubTypes = 
possibleSimpleSubTypes.stream()
+.filter(subType -> isCompatibleDataType(value, subType))
+.collect(Collectors.toList());
+
+int nrOfCompatibleSimpleSubTypes = compatibleSimpleSubTypes.size();
+
+DataType chosenSimpleType;
+if (nrOfCompatibleSimpleSubTypes == 0) {
+chosenSimpleType = null;
+} else if (nrOfCompatibleSimpleSubTypes == 1) {
+chosenSimpleType = compatibleSimpleSubTypes.get(0);
+} else {
+chosenSimpleType = findMostSuitableType(value, 
compatibleSimpleSubTypes, Function.identity())
+.orElse(compatibleSimpleSubTypes.get(0));
+}
+
+return chosenSimpleType;
+}
+
+public static <T> Optional<T> findMostSuitableType(Object value, List<T> 
types, Function<T, DataType> dataTypeMapper) {
+final Optional<T> mostSuitableType;
+
+Optional<DataType> inferredDataTypeOptional = 
Optional.ofNullable(inferDataType(value, null))
+.filter(dataType -> 
!dataType.getFieldType().equals(RecordFieldType.STRING));
+
+if (value instanceof String) {
+mostSuitableType = findMostSuitableTypeByStringValue((String) 
value, types, dataTypeMapper);
+} else if (inferredDataTypeOptional.isPresent()) {
+DataType inferredDataType = inferredDataTypeOptional.get();
+
+Optional<T> inferredTypeOptional = types.stream()
+.filter(type -> 
dataTypeMapper.apply(type).equals(inferredDataType))
+.findFirst();
+
+if (inferredTypeOptional.isPresent()) {
+mostSuitableType = inferredTypeOptional;
+} else {
+Optional<T> widerAvailableTypeOptional = types.stream()
+.map(type -> getWiderType(dataTypeMapper.apply(type), 
inferredDataType).isPresent() ? type : null)
+.filter(Objects::nonNull)
+.findFirst();
+
+if (widerAvailableTypeOptional.isPresent()) {
+mostSuitableType = widerAvailableTypeOptional;
+} else {
+mostSuitableType = Optional.empty();
+}
+}
+} else {
+mostSuitableType = Optional.empty();
+}
+
+return mostSuitableType;
+}
+
+public static <T> Optional<T> findMostSuitableTypeByStringValue(String 
valueAsString, List<T> types, Function<T, DataType> dataTypeMapper) {
+Optional<T> mostSuitableType = types.stream()
+// Sorting based on the RecordFieldType enum ordering looks 
appropriate here as we want simpler types
+//  first and the enum's ordering seems to reflect that
+.sorted((type1, type2) -> {
+int comparison;
+
+RecordFieldType dataType1 = 
dataTypeMapper.apply(type1).getFieldType();
+RecordFieldType dataType2 = 
dataTypeMapper.apply(type2).getFieldType();
+
+// Moving TIMESTAMP to the front (at least it 
should precede DATE)
+if (dataType1 == RecordFieldType.TIMESTAMP) {
+comparison = -1;
+} else if (dataType2 == RecordFieldType.TIMESTAMP) 
{
+comparison = 1;
+} else {
+comparison = dataType1.compareTo(dataType2);
+}
+
+return comparison;
+ 

[jira] [Created] (NIFI-6670) Create a RecordReader that reads lines of text into single-field records

2019-09-13 Thread Matt Burgess (Jira)
Matt Burgess created NIFI-6670:
--

 Summary: Create a RecordReader that reads lines of text into 
single-field records
 Key: NIFI-6670
 URL: https://issues.apache.org/jira/browse/NIFI-6670
 Project: Apache NiFi
  Issue Type: New Feature
  Components: Extensions
Reporter: Matt Burgess


It would be nice to have a reader that can take any textual input and treat 
each "line" as a single-field record. This is like CSVReader but there wouldn't 
be a field delimiter; rather, a property to specify the name of the field, and 
each line becomes a value for that field in the record.

Additional capabilities could be added as well, such as skipping header lines, 
grouping lines together as a single field value, ignoring empty lines, etc.
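
A minimal sketch of such a reader (class and constructor here are hypothetical, 
not an actual NiFi implementation; header skipping, line grouping and empty-line 
handling are left out):

{code}
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

import org.apache.nifi.serialization.RecordReader;
import org.apache.nifi.serialization.SimpleRecordSchema;
import org.apache.nifi.serialization.record.MapRecord;
import org.apache.nifi.serialization.record.Record;
import org.apache.nifi.serialization.record.RecordField;
import org.apache.nifi.serialization.record.RecordFieldType;
import org.apache.nifi.serialization.record.RecordSchema;

public class LineRecordReader implements RecordReader {

    private final BufferedReader reader;
    private final RecordSchema schema;
    private final String fieldName;

    public LineRecordReader(final InputStream in, final String fieldName) {
        this.reader = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8));
        this.fieldName = fieldName;
        // Single STRING field whose name would come from a reader property.
        this.schema = new SimpleRecordSchema(Collections.singletonList(
                new RecordField(fieldName, RecordFieldType.STRING.getDataType())));
    }

    @Override
    public Record nextRecord(final boolean coerceTypes, final boolean dropUnknownFields) throws IOException {
        final String line = reader.readLine();
        if (line == null) {
            return null; // end of input
        }
        // Each line of text becomes one single-field record.
        return new MapRecord(schema, Collections.<String, Object>singletonMap(fieldName, line));
    }

    @Override
    public RecordSchema getSchema() {
        return schema;
    }

    @Override
    public void close() throws IOException {
        reader.close();
    }
}
{code}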



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324297610
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/user-guide.adoc
 ##
 @@ -711,15 +711,187 @@ whatever comments are appropriate for this component. 
Use of the Comments tab is
 image::comments-tab.png["Comments Tab"]
 
 
-=== Additional Help
+==== Additional Help
 
 You can access additional documentation about each Processor's usage by 
right-clicking on the Processor and selecting 'Usage' from the context menu. 
Alternatively, select Help from the Global Menu in the top-right corner of the 
UI to display a Help page with all of the documentation, including usage 
documentation for all the Processors that are available. Click on the desired 
Processor to view usage documentation.
 
+[[Parameters]]
+=== Parameters
+The values of properties in the flow, including sensitive properties, can be 
parameterized using Parameters. Parameters are created and configured within 
the NiFi UI. Any property can be configured to reference a Parameter with the 
following conditions:
+
+ - A sensitive property can only reference a Sensitive Parameter
+ - A non-sensitive property can only reference a non-Sensitive Parameter
+ - Properties that reference Controller Services can not use Parameters
+
+NOTE: NiFi automatically picks up new or modified parameters.
+
+[[parameter-contexts]]
+ Parameter Contexts
+Parameters are created within Parameter Contexts. Parameter Contexts are 
globally defined/accessible to the NiFi instance. Access policies can be 
applied to Parameter Contexts to determine which users can create them. Once 
created, policies to read and write to a specific Parameter Context can also be 
applied (see <> for more information).
+
+===== Creating a Parameter Context
+To create a Parameter Context, select Parameter Contexts from the Global Menu:
+
+image:parameter-contexts-selection.png["Global Menu - Parameter Contexts"]
+
+In the Parameter Contexts window, click the `+` button in the upper-right 
corner and the Add Parameter Context window opens. The window has two tabs: 
Settings and Parameters.
+
+image:parameter-contexts-settings.png["Parameter Contexts - Settings"]
+
+On the "Settings" tab, add a name for the Parameter Context and a description 
if desired.  Select "Apply" to save the Parameter Context or select the 
"Parameters" tab to add parameters to the context.
+
+==== Adding a Parameter to a Parameter Context
+Parameters can be added during Parameter Context creation or added to existing 
Parameter Contexts.
+
+During Parameter Context creation, select the "Parameters" tab. Click the `+` 
button to open the Add Parameter window.
+
+image:add-parameter-during-parameter-context-creation.png[Add Parameter]
+
+To add parameters to an existing Parameter Context, open the Parameter Context 
window and click the Edit button (image:iconEdit.png["Edit"]) in the row of the 
desired Parameter Context.
+
+image:edit-parameter-context.png[Edit Parameter Context]
+
+On the "Parameters" tab, click the `+` button to open the Add Parameter window.
+
+The Add Parameter window has the following settings:
+
+- *Name* - A name that is used to denote the Parameter. Only alpha-numeric 
characters (a-z, A-Z, 0-9), hyphens ( - ), underscores ( _ ), periods ( . ), 
and spaces are allowed.
+
+- *Value* - The value that will be used when the Parameter is referenced.
+
+- *Set empty string* - Check to explicitly set the value of the Parameter to an 
empty string. Unchecked by default.
+
+- *Sensitive Value* -  Set to "Yes" if the Parameter's Value should be 
considered sensitive. If sensitive, the value of the Parameter will not be 
shown in the UI once applied. The default setting is "No".
+
+- *Description* - A description that explains what the Parameter is, how it is 
to be used, etc. This field is optional.
+
+Once these settings are configured, select "Apply". Add additional Parameters 
or edit any existing Parameters.
+
+image:update-parameter-context.png[Update Parameter Context]
+
+To complete the process, select "Apply" from the Parameter Context window. The 
following operations are performed to validate all components that reference 
the added or modified parameters: Stopping/Restarting affected Processors, 
Disabling/Re-enabling affected Controller Services, Updating Parameter Context.
+
+image:parameters-validate-affected-components.png[Validate Affected Components]
+
+The Referencing Components section lists any components referencing the 
parameters in the parameter context organized by process group.
+
+[[assigning_parameter_context_to_PG]]
+==== Assigning a Parameter Context to a Process Group
+For a component to reference a Parameter, its Process Group must first be 
assigned to a Parameter Context. Once assigned, processors and controller 
services within that Process Group may only reference Parameters 

[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324285008
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/user-guide.adoc
 ##
 @@ -711,15 +711,187 @@ whatever comments are appropriate for this component. 
Use of the Comments tab is
 image::comments-tab.png["Comments Tab"]
 
 
-=== Additional Help
+==== Additional Help
 
 You can access additional documentation about each Processor's usage by 
right-clicking on the Processor and selecting 'Usage' from the context menu. 
Alternatively, select Help from the Global Menu in the top-right corner of the 
UI to display a Help page with all of the documentation, including usage 
documentation for all the Processors that are available. Click on the desired 
Processor to view usage documentation.
 
+[[Parameters]]
+=== Parameters
+The values of properties in the flow, including sensitive properties, can be 
parameterized using Parameters. Parameters are created and configured within 
the NiFi UI. Any property can be configured to reference a Parameter with the 
following conditions:
+
+ - A sensitive property can only reference a Sensitive Parameter
+ - A non-sensitive property can only reference a non-Sensitive Parameter
+ - Properties that reference Controller Services can not use Parameters
+
+NOTE: NiFi automatically picks up new or modified parameters.
+
+[[parameter-contexts]]
+==== Parameter Contexts
+Parameters are created within Parameter Contexts. Parameter Contexts are 
globally defined/accessible to the NiFi instance. Access policies can be 
applied to Parameter Contexts to determine which users can create them. Once 
created, policies to read and write to a specific Parameter Context can also be 
applied (see <> for more information).
+
+===== Creating a Parameter Context
+To create a Parameter Context, select Parameter Contexts from the Global Menu:
+
+image:parameter-contexts-selection.png["Global Menu - Parameter Contexts"]
+
+In the Parameter Contexts window, click the `+` button in the upper-right 
corner and the Add Parameter Context window opens. The window has two tabs: 
Settings and Parameters.
+
+image:parameter-contexts-settings.png["Parameter Contexts - Settings"]
+
+On the "Settings" tab, add a name for the Parameter Context and a description 
if desired.  Select "Apply" to save the Parameter Context or select the 
"Parameters" tab to add parameters to the context.
+
+==== Adding a Parameter to a Parameter Context
+Parameters can be added during Parameter Context creation or added to existing 
Parameter Contexts.
+
+During Parameter Context creation, select the "Parameters" tab. Click the `+` 
button to open the Add Parameter window.
+
+image:add-parameter-during-parameter-context-creation.png[Add Parameter]
+
+To add parameters to an existing Parameter Context, open the Parameter Context 
window and click the Edit button (image:iconEdit.png["Edit"]) in the row of the 
desired Parameter Context.
+
+image:edit-parameter-context.png[Edit Parameter Context]
+
+On the "Parameters" tab, click the `+` button to open the Add Parameter window.
+
+The Add Parameter window has the following settings:
+
+- *Name* - A name that is used to denote the Parameter. Only alpha-numeric 
characters (a-z, A-Z, 0-9), hyphens ( - ), underscores ( _ ), periods ( . ), 
and spaces are allowed.
+
+- *Value* - The value that will be used when the Parameter is referenced.
+
+- *Set empty string* - Check to explicitly set the value of the Parameter to an 
empty string. Unchecked by default.
+
+- *Sensitive Value* -  Set to "Yes" if the Parameter's Value should be 
considered sensitive. If sensitive, the value of the Parameter will not be 
shown in the UI once applied. The default setting is "No".
+
+- *Description* - A description that explains what the Parameter is, how it is 
to be used, etc. This field is optional.
+
+Once these settings are configured, select "Apply". Add additional Parameters 
or edit any existing Parameters.
+
+image:update-parameter-context.png[Update Parameter Context]
+
+To complete the process, select "Apply" from the Parameter Context window. The 
following operations are performed to validate all components that reference 
the added or modified parameters: Stopping/Restarting affected Processors, 
Disabling/Re-enabling affected Controller Services, Updating Parameter Context.
+
+image:parameters-validate-affected-components.png[Validate Affected Components]
+
+The Referencing Components section lists any components referencing the 
parameters in the parameter context organized by process group.
+
+[[assigning_parameter_context_to_PG]]
+==== Assigning a Parameter Context to a Process Group
+For a component to reference a Parameter, its Process Group must first be 
assigned to a Parameter Context. Once assigned, processors and controller 
services within that Process Group may only reference Parameters 

[jira] [Created] (NIFI-6669) Allow arbitrary IV values in EncryptContent processor when running in decrypt mode

2019-09-13 Thread Andy LoPresto (Jira)
Andy LoPresto created NIFI-6669:
---

 Summary: Allow arbitrary IV values in EncryptContent processor 
when running in decrypt mode
 Key: NIFI-6669
 URL: https://issues.apache.org/jira/browse/NIFI-6669
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Extensions
Affects Versions: 1.9.2
Reporter: Andy LoPresto
Assignee: Andy LoPresto


As discussed in the Apache NiFi Slack instance recently:

 

{quote}
[RuthMizzi|https://app.slack.com/team/UN6KDKGMB]  [3 days 
ago|https://apachenifi.slack.com/archives/C0L9VCD47/p1568103042185700]
Hello - wondering whether anyone can help me with question I have using the 
EncryptContent nifi component for decryption of messages received from maxwell 
daemon .. has anyone ever done this before?
 
[Andy LoPresto|https://app.slack.com/team/U0LA8HR55]  [3 days 
ago|https://apachenifi.slack.com/archives/C0L9VCD47/p1568138070190500?thread_ts=1568103042.185700=C0L9VCD47]
This won’t work out of the box. Maxwell’s Daemon uses {{AES/CBC/NoPadding}} 
cipher mode of operation ([https://maxwells-daemon.io/encryption/]), which NiFi 
does support, but it generates a unique IV and sends both encoded in Base64. 
The underlying code in NiFi supports unique IVs but it is not currently exposed 
to the {{EncryptContent}} processor.  I would suggest the following approach:
1. Submit a Jira requesting this feature and assign it to me. By adding an 
optional IV property, we can allow unique input for every decryption operation, 
and this value can be extracted from the JSON to a flowfile attribute in a 
preceding {{EvaluateJsonPath}} processor. We will need to eagerly detect Base64 
encoding vs. Hex encoding for this input.
2. Write an {{ExecuteScript}} processor which consumes the two JSON values and 
calls simple decrypt logic in Groovy. I can help generate this if you need it.
 
[RuthMizzi|https://app.slack.com/team/UN6KDKGMB][9 hours 
ago|https://apachenifi.slack.com/archives/C0L9VCD47/p1568361404228800?thread_ts=1568103042.185700=C0L9VCD47]
thanks so much for your detailed response! In the meantime we convinced our 
data sources to send the data unencrypted but over a secured ssl kafka 
connection and therefore got around the issue -- but i really appreciate the 
knowledge shared. Thanks again
{quote}
 
The IV property descriptor should not be required, should support Expression 
Language, and should detect Base64 and Hex encoded values and validate them 
against the correct block length for the selected mode of operation via custom 
validation. 
 
We may also want a dropdown of "IV strategies" such as "IV prepended", "IV 
prepended with delimiter", etc. to handle incoming flowfile content which 
already has a per-ciphertext IV prepended. 
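 
A hedged sketch of such a descriptor, as it might appear in the processor class 
(property name, description and validator are assumptions, not shipped 
EncryptContent code; the Base64/Hex detection and block-length check would live 
in a custom validator):
 
{code}
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.expression.ExpressionLanguageScope;
import org.apache.nifi.processor.util.StandardValidators;

public static final PropertyDescriptor IV = new PropertyDescriptor.Builder()
        .name("initialization-vector")
        .displayName("Initialization Vector")
        .description("Optional IV to use for decryption. Accepts Base64- or Hex-encoded "
                + "values; must match the cipher block length for the selected mode.")
        .required(false)
        .expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
        // Placeholder: the real validator would detect the encoding and check block length.
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        .build();
{code}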
 



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (NIFI-6640) Schema Inference of UNION/CHOICE types not handled correctly

2019-09-13 Thread Mark Payne (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6640?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-6640:
-
Component/s: (was: Core Framework)
 Extensions

> Schema Inference of UNION/CHOICE types not handled correctly
> 
>
> Key: NIFI-6640
> URL: https://issues.apache.org/jira/browse/NIFI-6640
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: Tamas Palfy
>Assignee: Tamas Palfy
>Priority: Major
>  Labels: Record, inference, schema
> Attachments: NIFI-6640.template.xml
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> When reading the following CSV:
> {code}
> Id|Value
> 1|3
> 2|3.75
> 3|3.85
> 4|8
> 5|2.0
> 6|4.0
> 7|some_string
> {code}
> And trying to channel it through a {{ConvertRecord}} processor, the following 
> exception is thrown:
> {code}
> 2019-09-06 18:25:48,936 ERROR [Timer-Driven Process Thread-2] 
> o.a.n.processors.standard.ConvertRecord 
> ConvertRecord[id=07635c71-016d-1000-3847-ff916164b32a] Failed to process 
> StandardFlowFileRecord[uuid=4b4ab01a-b349-4f83-9b25-6a58d0b29
> 7c1,claim=StandardContentClaim 
> [resourceClaim=StandardResourceClaim[id=1567786888281-1, container=default, 
> section=1], offset=326669, 
> length=56],offset=0,name=4b4ab01a-b349-4f83-9b25-6a58d0b297c1,size=56]; will 
> route to failure: org.apa
> che.nifi.processor.exception.ProcessException: Could not parse incoming data
> org.apache.nifi.processor.exception.ProcessException: Could not parse 
> incoming data
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:170)
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2925)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
> at 
> org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:205)
> at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
> at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.nifi.serialization.MalformedRecordException: Error 
> while getting next record. Root cause: 
> org.apache.nifi.serialization.record.util.IllegalTypeConversionException: 
> Cannot convert value [some_string] of type class j
> ava.lang.String for field Value to any of the following available Sub-Types 
> for a Choice: [FLOAT, INT]
> at 
> org.apache.nifi.csv.CSVRecordReader.nextRecord(CSVRecordReader.java:119)
> at 
> org.apache.nifi.serialization.RecordReader.nextRecord(RecordReader.java:50)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:156)
> ... 13 common frames omitted
> Caused by: 
> org.apache.nifi.serialization.record.util.IllegalTypeConversionException: 
> Cannot convert value [some_string] of type class java.lang.String for field 
> Value to any of the following available Sub-Types for a Choice: [FLOAT, INT
> ]
> at 
> org.apache.nifi.serialization.record.util.DataTypeUtils.convertType(DataTypeUtils.java:166)
> at 
> org.apache.nifi.serialization.record.util.DataTypeUtils.convertType(DataTypeUtils.java:116)
> at 
> org.apache.nifi.csv.AbstractCSVRecordReader.convert(AbstractCSVRecordReader.java:86)
> at 
> org.apache.nifi.csv.CSVRecordReader.nextRecord(CSVRecordReader.java:105)
> ... 15 common frames omitted
> {code}
> The problem is that {{FieldTypeInference}} has both a list of 
> {{possibleDataTypes}} and a {{singleDataType}} and as long as an added 
> dataType is not in a "wider" relationship with the previous types it is added 
> to the {{possibleDataTypes}}. But once a "wider" type is added, it actually 
> gets set as the {{singleDataType}} and the {{possibleDataTypes}} remains 
> intact.
> However when we 

[jira] [Updated] (NIFI-6640) Schema Inference of UNION/CHOICE types not handled correctly

2019-09-13 Thread Mark Payne (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6640?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-6640:
-
Labels: Record inference schema  (was: )

> Schema Inference of UNION/CHOICE types not handled correctly
> 
>
> Key: NIFI-6640
> URL: https://issues.apache.org/jira/browse/NIFI-6640
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Reporter: Tamas Palfy
>Assignee: Tamas Palfy
>Priority: Major
>  Labels: Record, inference, schema
> Attachments: NIFI-6640.template.xml
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> When reading the following CSV:
> {code}
> Id|Value
> 1|3
> 2|3.75
> 3|3.85
> 4|8
> 5|2.0
> 6|4.0
> 7|some_string
> {code}
> And trying to channel it through a {{ConvertRecord}} processor, the following 
> exception is thrown:
> {code}
> 2019-09-06 18:25:48,936 ERROR [Timer-Driven Process Thread-2] 
> o.a.n.processors.standard.ConvertRecord 
> ConvertRecord[id=07635c71-016d-1000-3847-ff916164b32a] Failed to process 
> StandardFlowFileRecord[uuid=4b4ab01a-b349-4f83-9b25-6a58d0b29
> 7c1,claim=StandardContentClaim 
> [resourceClaim=StandardResourceClaim[id=1567786888281-1, container=default, 
> section=1], offset=326669, 
> length=56],offset=0,name=4b4ab01a-b349-4f83-9b25-6a58d0b297c1,size=56]; will 
> route to failure: org.apa
> che.nifi.processor.exception.ProcessException: Could not parse incoming data
> org.apache.nifi.processor.exception.ProcessException: Could not parse 
> incoming data
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:170)
> at 
> org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2925)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor.onTrigger(AbstractRecordProcessor.java:122)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
> at 
> org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:205)
> at 
> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
> at 
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
> at 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.nifi.serialization.MalformedRecordException: Error 
> while getting next record. Root cause: 
> org.apache.nifi.serialization.record.util.IllegalTypeConversionException: 
> Cannot convert value [some_string] of type class java.lang.String for field Value to any of the following available Sub-Types 
> for a Choice: [FLOAT, INT]
> at 
> org.apache.nifi.csv.CSVRecordReader.nextRecord(CSVRecordReader.java:119)
> at 
> org.apache.nifi.serialization.RecordReader.nextRecord(RecordReader.java:50)
> at 
> org.apache.nifi.processors.standard.AbstractRecordProcessor$1.process(AbstractRecordProcessor.java:156)
> ... 13 common frames omitted
> Caused by: 
> org.apache.nifi.serialization.record.util.IllegalTypeConversionException: 
> Cannot convert value [some_string] of type class java.lang.String for field 
> Value to any of the following available Sub-Types for a Choice: [FLOAT, INT]
> at 
> org.apache.nifi.serialization.record.util.DataTypeUtils.convertType(DataTypeUtils.java:166)
> at 
> org.apache.nifi.serialization.record.util.DataTypeUtils.convertType(DataTypeUtils.java:116)
> at 
> org.apache.nifi.csv.AbstractCSVRecordReader.convert(AbstractCSVRecordReader.java:86)
> at 
> org.apache.nifi.csv.CSVRecordReader.nextRecord(CSVRecordReader.java:105)
> ... 15 common frames omitted
> {code}
> The problem is that {{FieldTypeInference}} has both a list of 
> {{possibleDataTypes}} and a {{singleDataType}}. As long as an added 
> dataType is not in a "wider" relationship with the previous types, it is added 
> to the {{possibleDataTypes}}. But once a "wider" type is added, it actually 
> gets set as the {{singleDataType}} while the {{possibleDataTypes}} remain 
> intact.
> However when we try to determine the 

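To make the quoted analysis concrete, here is a minimal, hypothetical Java sketch of the split state it describes (illustrative names and widening rule only, not NiFi's actual {{FieldTypeInference}}):

{code:java}
// Hypothetical sketch of the split inference state described above.
// Names and the widening rule are illustrative, not NiFi's actual code.
import java.util.LinkedHashSet;
import java.util.Set;

class FieldTypeInferenceSketch {

    private final Set<String> possibleDataTypes = new LinkedHashSet<>();
    private String singleDataType;

    void addPossibleDataType(final String dataType) {
        if (isWiderThanAll(dataType)) {
            // A "wider" type is promoted to the single type, but the
            // previously collected possibleDataTypes are left intact...
            singleDataType = dataType;
        } else {
            // ...while non-widening types keep accumulating here, so the
            // two fields can disagree about what the field's type is.
            possibleDataTypes.add(dataType);
        }
    }

    String toDataType() {
        // If the final type is derived from only part of this state, values
        // matching the other part fail later, e.g. a STRING value against
        // CHOICE[FLOAT, INT] as in the stack trace above.
        return singleDataType != null ? singleDataType : "CHOICE" + possibleDataTypes;
    }

    private boolean isWiderThanAll(final String candidate) {
        // Illustrative widening rule: FLOAT is wider than INT.
        return "FLOAT".equals(candidate) && possibleDataTypes.contains("INT");
    }
}
{code}

With the CSV above, INT and then STRING accumulate in possibleDataTypes while FLOAT becomes the single type; the two fields are never reconciled, which is consistent with the CHOICE of [FLOAT, INT] (with no STRING sub-type) seen in the error.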
[jira] [Updated] (NIFI-6640) Schema Inference of UNION/CHOICE types not handled correctly

2019-09-13 Thread Mark Payne (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6640?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-6640:
-
Summary: Schema Inference of UNION/CHOICE types not handled correctly  
(was: UNION/CHOICE types not handled correctly)

> Schema Inference of UNION/CHOICE types not handled correctly
> 
>
> Key: NIFI-6640
> URL: https://issues.apache.org/jira/browse/NIFI-6640
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Reporter: Tamas Palfy
>Assignee: Tamas Palfy
>Priority: Major
> Attachments: NIFI-6640.template.xml
>
>  Time Spent: 20m
>  Remaining Estimate: 0h

[GitHub] [nifi] tpalfy commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE types not handled correctly

2019-09-13 Thread GitBox
tpalfy commented on a change in pull request #3724: NIFI-6640 - UNION/CHOICE 
types not handled correctly
URL: https://github.com/apache/nifi/pull/3724#discussion_r324275668
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-extension-utils/nifi-record-utils/nifi-avro-record-utils/src/test/java/org/apache/nifi/avro/TestAvroTypeUtil.java
 ##
 @@ -587,4 +588,49 @@ public void testListAndMapConversion() {
 assertNotNull(((Record)inner).get("Message"));
 }
 }
+
+@Test
+public void testConvertToAvroObjectWhenFloatVSUnion_INT_FLOAT_ThenReturnFloat() {
 
 Review comment:
   These two tests can be done more elegantly. Also need to add a version where 
INT is expected (and maybe test some other types).
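
   As a sketch of the extra case being asked for, something along these lines could work (this assumes the two-argument {{AvroTypeUtil.convertToAvroObject(Object, Schema)}} overload exercised by the tests above; the class and test names here are illustrative):

{code:java}
// Hedged sketch of the suggested additional test case; assumes the
// AvroTypeUtil.convertToAvroObject(Object, Schema) overload used above.
import static org.junit.Assert.assertEquals;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.nifi.avro.AvroTypeUtil;
import org.junit.Test;

public class TestAvroUnionConversionSketch {

    private final Schema intFloatUnion = SchemaBuilder.unionOf()
            .intType().and().floatType().endUnion();

    @Test
    public void testConvertToAvroObjectWhenIntVSUnion_INT_FLOAT_ThenReturnInt() {
        // An integral value should bind to the INT branch of the union...
        assertEquals(3, AvroTypeUtil.convertToAvroObject(3, intFloatUnion));
    }

    @Test
    public void testConvertToAvroObjectWhenFloatVSUnion_INT_FLOAT_ThenReturnFloat() {
        // ...while a fractional value should bind to the FLOAT branch.
        assertEquals(3.75f, AvroTypeUtil.convertToAvroObject(3.75f, intFloatUnion));
    }
}
{code}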


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Resolved] (NIFI-6480) PutORC/PutParquet can't overwrite file even if set 'Overwrite Files' to true

2019-09-13 Thread Bryan Bende (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende resolved NIFI-6480.
---
Fix Version/s: 1.10.0
   Resolution: Fixed

> PutORC/PutParquet can't overwrite file even if set 'Overwrite Files' to true
> 
>
> Key: NIFI-6480
> URL: https://issues.apache.org/jira/browse/NIFI-6480
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: archon gum
>Assignee: archon gum
>Priority: Major
> Fix For: 1.10.0
>
> Attachments: Snipaste_2019-07-16_11-07-05.jpg
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> !Snipaste_2019-07-16_11-07-05.jpg!
>  
> Solution:
> As in 
> [PutHDFS|https://github.com/apache/nifi/blob/72244d09ff193131119c10492b9327af35a64f02/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java],
>  delete the destination file before renaming the temp file to it.
> Tested on my local machine.
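
For illustration, a minimal sketch of that fix using the standard Hadoop FileSystem API (the method and class names are placeholders, not the actual PutORC/PutParquet code):

{code:java}
// Hedged sketch of "delete the destination before the rename", using the
// standard Hadoop FileSystem API; not the actual PutORC/PutParquet code.
import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

final class OverwriteRenameSketch {

    static void commit(final FileSystem fs, final Path tempFile, final Path destFile,
                       final boolean overwrite) throws IOException {
        // rename() will not replace an existing file, so an existing
        // destination has to be deleted first when overwriting is allowed.
        if (fs.exists(destFile)) {
            if (!overwrite) {
                throw new IOException("Destination " + destFile + " already exists");
            }
            fs.delete(destFile, false); // non-recursive delete of the old file
        }
        if (!fs.rename(tempFile, destFile)) {
            throw new IOException("Failed to rename " + tempFile + " to " + destFile);
        }
    }

    private OverwriteRenameSketch() {
    }
}
{code}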



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (NIFI-6480) PutORC/PutParquet can't overwrite file even if set 'Overwrite Files' to true

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6480?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929321#comment-16929321
 ] 

ASF subversion and git services commented on NIFI-6480:
---

Commit 8a8852e73dbcc9653ac49662df44576ec338ebb1 in nifi's branch 
refs/heads/master from archon
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=8a8852e ]

NIFI-6480: PutORC/PutParquet can't overwrite file even if set 'Overwrite Files' 
to true

This closes #3599.

Signed-off-by: Bryan Bende 


> PutORC/PutParquet can't overwrite file even if set 'Overwrite Files' to true
> 
>
> Key: NIFI-6480
> URL: https://issues.apache.org/jira/browse/NIFI-6480
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Reporter: archon gum
>Assignee: archon gum
>Priority: Major
> Attachments: Snipaste_2019-07-16_11-07-05.jpg
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> !Snipaste_2019-07-16_11-07-05.jpg!
>  
> Solution:
> As in 
> [PutHDFS|https://github.com/apache/nifi/blob/72244d09ff193131119c10492b9327af35a64f02/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java],
>  delete the destination file before renaming the temp file to it.
> Tested on my local machine.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] asfgit closed pull request #3599: NIFI-6480: PutORC/PutParquet can't overwrite file even if set 'Overwr…

2019-09-13 Thread GitBox
asfgit closed pull request #3599: NIFI-6480: PutORC/PutParquet can't overwrite 
file even if set 'Overwr…
URL: https://github.com/apache/nifi/pull/3599
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] bbende commented on issue #3599: NIFI-6480: PutORC/PutParquet can't overwrite file even if set 'Overwr…

2019-09-13 Thread GitBox
bbende commented on issue #3599: NIFI-6480: PutORC/PutParquet can't overwrite 
file even if set 'Overwr…
URL: https://github.com/apache/nifi/pull/3599#issuecomment-531297553
 
 
   @archongum sorry for taking so long to get to this... changes look good, 
going to merge, thanks!


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (NIFI-6668) List Provenance and Lineage Queries

2019-09-13 Thread Bryan Rosander (Jira)
Bryan Rosander created NIFI-6668:


 Summary: List Provenance and Lineage Queries
 Key: NIFI-6668
 URL: https://issues.apache.org/jira/browse/NIFI-6668
 Project: Apache NiFi
  Issue Type: Improvement
Reporter: Bryan Rosander


It would be useful to be able to list Provenance and Lineage queries in order 
to clean up queries left over from testing or from a misbehaving or crashed tool.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] bbende commented on issue #3682: NIFI-6604: Removing large NARs from generated assemblies to reduce size.

2019-09-13 Thread GitBox
bbende commented on issue #3682: NIFI-6604: Removing large NARs from generated 
assemblies to reduce size.
URL: https://github.com/apache/nifi/pull/3682#issuecomment-531287636
 
 
   This looks good. Do we want a profile to control inclusion of any of these? 
   
   For something like Kafka 0.8 it seems like we could just drop it, but maybe 
for something like Druid we would want a -Pinclude-druid profile?
   
   cc @joewitt 


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] bbende commented on a change in pull request #3723: NIFI-6656 Added a default visibility expression configuration item to…

2019-09-13 Thread GitBox
bbende commented on a change in pull request #3723: NIFI-6656 Added a default 
visibility expression configuration item to…
URL: https://github.com/apache/nifi/pull/3723#discussion_r324249667
 
 

 ##
 File path: 
nifi-nar-bundles/nifi-standard-services/nifi-hbase-client-service-api/pom.xml
 ##
 @@ -31,5 +31,16 @@
 nifi-api
 provided
 
+
 
 Review comment:
   I think these dependencies can be removed since the property descriptor has 
now moved out of the API?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (NIFI-6024) Registry Buckets Inconsistently Sorted

2019-09-13 Thread Bryan Bende (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6024?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-6024:
--
  Assignee: Mark Payne
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> Registry Buckets Inconsistently Sorted
> --
>
> Key: NIFI-6024
> URL: https://issues.apache.org/jira/browse/NIFI-6024
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.8.0
>Reporter: Alan Jackoway
>Assignee: Mark Payne
>Priority: Major
>  Labels: SDLC
> Fix For: 1.10.0
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> When importing a process group from a registry with multiple buckets, the 
> buckets should always be listed in alphanumeric order. Currently the order in 
> which they come back is inconsistent.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (NIFI-6024) Registry Buckets Inconsistently Sorted

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6024?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929289#comment-16929289
 ] 

ASF subversion and git services commented on NIFI-6024:
---

Commit 7623e6f5a1bd135ab58bbb9fb4b25686931d8fbf in nifi's branch 
refs/heads/master from Mark Payne
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=7623e6f ]

NIFI-6024: When fetching names of buckets and flows for registry, sort them 
alphanumerically

This closes #3709.

Signed-off-by: Bryan Bende 
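
For illustration, the sort the commit message describes might look like this hedged sketch ({{Bucket}} and {{getName()}} here are placeholder names, not necessarily the registry client's actual types):

{code:java}
// Hedged sketch of an alphanumeric, case-insensitive sort of bucket names;
// Bucket and getName() are placeholders for the actual registry types.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

final class BucketSortSketch {

    static List<Bucket> sortedByName(final List<Bucket> buckets) {
        final List<Bucket> sorted = new ArrayList<>(buckets);
        // A fixed, case-insensitive ordering makes the list consistent
        // across requests instead of reflecting arbitrary storage order.
        sorted.sort(Comparator.comparing(Bucket::getName, String.CASE_INSENSITIVE_ORDER));
        return sorted;
    }

    static final class Bucket {
        private final String name;

        Bucket(final String name) {
            this.name = name;
        }

        String getName() {
            return name;
        }
    }

    private BucketSortSketch() {
    }
}
{code}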


> Registry Buckets Inconsistently Sorted
> --
>
> Key: NIFI-6024
> URL: https://issues.apache.org/jira/browse/NIFI-6024
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.8.0
>Reporter: Alan Jackoway
>Priority: Major
>  Labels: SDLC
> Fix For: 1.10.0
>
>  Time Spent: 20m
>  Remaining Estimate: 0h
>
> When importing a process group from a registry with multiple buckets, the 
> buckets should always be listed in alphanumeric order. Currently the order in 
> which they come back is inconsistent.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] asfgit closed pull request #3709: NIFI-6024: When fetching names of buckets and flows for registry, sor…

2019-09-13 Thread GitBox
asfgit closed pull request #3709: NIFI-6024: When fetching names of buckets and 
flows for registry, sor…
URL: https://github.com/apache/nifi/pull/3709
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] bbende commented on issue #3709: NIFI-6024: When fetching names of buckets and flows for registry, sor…

2019-09-13 Thread GitBox
bbende commented on issue #3709: NIFI-6024: When fetching names of buckets and 
flows for registry, sor…
URL: https://github.com/apache/nifi/pull/3709#issuecomment-531280213
 
 
   Looks good, merging


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324242476
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/user-guide.adoc
 ##
 @@ -711,15 +711,187 @@ whatever comments are appropriate for this component. 
Use of the Comments tab is
 image::comments-tab.png["Comments Tab"]
 
 
-=== Additional Help
+==== Additional Help
 
 You can access additional documentation about each Processor's usage by 
right-clicking on the Processor and selecting 'Usage' from the context menu. 
Alternatively, select Help from the Global Menu in the top-right corner of the 
UI to display a Help page with all of the documentation, including usage 
documentation for all the Processors that are available. Click on the desired 
Processor to view usage documentation.
 
+[[Parameters]]
+=== Parameters
+The values of properties in the flow, including sensitive properties, can be 
parameterized using Parameters. Parameters are created and configured within 
the NiFi UI. Any property can be configured to reference a Parameter with the 
following conditions:
+
+ - A sensitive property can only reference a Sensitive Parameter
+ - A non-sensitive property can only reference a non-Sensitive Parameter
+ - Properties that reference Controller Services can not use Parameters
+
+NOTE: NiFi automatically picks up new or modified parameters.
+
+[[parameter-contexts]]
+==== Parameter Contexts
+Parameters are created within Parameter Contexts. Parameter Contexts are 
globally defined/accessible to the NiFi instance. Access policies can be 
applied to Parameter Contexts to determine which users can create them. Once 
created, policies to read and write to a specific Parameter Context can also be 
applied (see <> for more information).
+
+===== Creating a Parameter Context
+To create a Parameter Context, select Parameter Contexts from the Global Menu:
+
+image:parameter-contexts-selection.png["Global Menu - Parameter Contexts"]
+
+In the Parameter Contexts window, click the `+` button in the upper-right 
corner and the Add Parameter Context window opens. The window has two tabs: 
Settings and Parameters.
+
+image:parameter-contexts-settings.png["Parameter Contexts - Settings"]
+
+On the "Settings" tab, add a name for the Parameter Context and a description 
if desired.  Select "Apply" to save the Parameter Context or select the 
"Parameters" tab to add parameters to the context.
+
+===== Adding a Parameter to a Parameter Context
+Parameters can be added during Parameter Context creation or added to existing 
Parameter Contexts.
+
+During Parameter Context creation, select the "Parameters" tab. Click the `+` 
button to open the Add Parameter window.
+
+image:add-parameter-during-parameter-context-creation.png[Add Parameter]
+
+To add parameters to an existing Parameter Context, open the Parameter Context 
window and click the Edit button (image:iconEdit.png["Edit"]) in the row of the 
desired Parameter Context.
+
+image:edit-parameter-context.png[Edit Parameter Context]
+
+On the "Parameters" tab, click the `+` button to open the Add Parameter window.
+
+The Add Parameter window has the following settings:
+
+- *Name* - A name that is used to denote the Parameter. Only alpha-numeric 
characters (a-z, A-Z, 0-9), hyphens ( - ), underscores ( _ ), periods ( . ), 
and spaces are allowed.
+
+- *Value* - The value that will be used when the Parameter is referenced.
+
+- *Set empty string* - Check to explicity set the value of the Parameter to an 
empty string. Unchecked by default.
+
+- *Sensitive Value* -  Set to "Yes" if the Parameter's Value should be 
considered sensitive. If sensitive, the value of the Parameter will not be 
shown in the UI once applied. The default setting is "No".
+
+- *Description* - A description that explains what the Parameter is, how it is 
to be used, etc. This field is optional.
+
+Once these settings are configured, select "Apply". Add additional Parameters 
or edit any existing Parameters.
+
+image:update-parameter-context.png[Update Parameter Context]
+
+To complete the process, select "Apply" from the Parameter Context window. The 
following operations are performed to validate all components that reference 
the added or modified parameters: Stopping/Restarting affected Processors, 
Disabling/Re-enabling affected Controller Services, Updating Parameter Context.
+
+image:parameters-validate-affected-components.png[Validate Affected Components]
+
+The Referencing Components section lists any components referencing the 
parameters in the parameter context organized by process group.
+
+[[assigning_parameter_context_to_PG]]
+==== Assigning a Parameter Context to a Process Group
+For a component to reference a Parameter, its Process Group must first be 
assigned to a Parameter Context. Once assigned, processors and controller 
services within that Process Group may only reference Parameters 

[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324241813
 
 


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324240880
 
 


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324238782
 
 


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324237736
 
 


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324236849
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/user-guide.adoc
 ##
+- *Sensitive Value* -  Set to "Yes" if the Parameter's Value should be 
considered sensitive. If sensitive, the value of the Parameter will not be 
shown in the UI once applied. The default setting is "No".
 
 Review comment:
   Made this update.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] kevdoran commented on issue #3722: NIFI-6653 Change bootstrap port command handling

2019-09-13 Thread GitBox
kevdoran commented on issue #3722: NIFI-6653 Change bootstrap port command 
handling
URL: https://github.com/apache/nifi/pull/3722#issuecomment-531272580
 
 
   @markap14 any chance you can take a look at this small PR? I'm hoping to 
include it in the 1.10.0 release.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] thenatog edited a comment on issue #3715: NIFI-6578 - Upgraded zookeeper framework version. Some code changes r…

2019-09-13 Thread GitBox
thenatog edited a comment on issue #3715: NIFI-6578 - Upgraded zookeeper 
framework version. Some code changes r…
URL: https://github.com/apache/nifi/pull/3715#issuecomment-531271643
 
 
   To test this PR:
   
   With embedded zookeeper:
   - nifi.zookeeper.connect.string=localhost:2181
   - nifi.state.management.embedded.zookeeper.start=true
   - zookeeper.properties file:
   - Remove clientPort=2181
   - Add port to end of server string: server.1=localhost:2888:3888;2181
   
   With external zookeeper quorum:
   - Write the docker-compose.yml file from this page: 
https://hub.docker.com/_/zookeeper to a directory, run docker-compose up 
   - Configure NiFi to use the docker zookeeper nodes:
   - nifi.properties: 
nifi.zookeeper.connect.string=zoo1:2281,zoo2:2282,zoo3:2283
   - state-management.xml: zoo1:2281,zoo2:2282,zoo3:2283
   
   In zookeeper.properties file define:
   - server.1=zoo1:2888:3888;2281
   - server.2=zoo2:2888:3888;2282
   - server.3=zoo3:2888:3888;2283
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] andrewmlim commented on a change in pull request #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on a change in pull request #3725: NIFI-6558 Added 
Parameters to User Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#discussion_r324234880
 
 

 ##
 File path: nifi-docs/src/main/asciidoc/user-guide.adoc
 ##
+- *Set empty string* - Check to explicity set the value of the Parameter to an 
empty string. Unchecked by default.
 
 Review comment:
   Added a note for this. Also corrected "explicity" to "explicitly".


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] thenatog commented on issue #3715: NIFI-6578 - Upgraded zookeeper framework version. Some code changes r…

2019-09-13 Thread GitBox
thenatog commented on issue #3715: NIFI-6578 - Upgraded zookeeper framework 
version. Some code changes r…
URL: https://github.com/apache/nifi/pull/3715#issuecomment-531271643
 
 
   To test this PR:
   
   With embedded zookeeper:
   - nifi.zookeeper.connect.string=localhost:2181
   - nifi.state.management.embedded.zookeeper.start=true
   - zookeeper.properties file:
   - Remove clientPort=2181
   - Add port to end of server string: server.1=localhost:2888:3888;2181
   
   With external zookeeper quorum:
   - Write the docker-compose.yml file from this page: 
https://hub.docker.com/_/zookeeper to a directory, run docker-compose up 
   - Configure NiFi to use the docker zookeeper nodes:
   - nifi.properties: 
nifi.zookeeper.connect.string=zoo1:2281,zoo2:2282,zoo3:2283
   - state-management.xml: zoo1:2281,zoo2:2282,zoo3:2283
   
   In zookeeper.properties file define:
   - server.1=zoo1:2888:3888;2281
   - server.2=zoo2:2888:3888;2282
   - server.3=zoo3:2888:3888;2283
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (NIFI-6667) Nifi Docker doesn't have a way to set sensitive props key

2019-09-13 Thread James Sevener (Jira)
James Sevener created NIFI-6667:
---

 Summary: Nifi Docker doesn't have a way to set sensitive props key
 Key: NIFI-6667
 URL: https://issues.apache.org/jira/browse/NIFI-6667
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Docker
Affects Versions: 1.9.2
Reporter: James Sevener


secure.sh doesn't accept a variable to set the sensitive props key.

Something like the following works:
{code:java}
: ${SENSITIVE_PROPS_KEY:?"Must specify an encryption passphrase to be used for sensitive properties."}
prop_replace 'nifi.sensitive.props.key' "${SENSITIVE_PROPS_KEY}"
{code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Updated] (NIFI-6582) Add diff-flow-versions command to CLI

2019-09-13 Thread Mark Payne (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-6582:
-
Fix Version/s: 1.10.0
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

> Add diff-flow-versions command to CLI
> -
>
> Key: NIFI-6582
> URL: https://issues.apache.org/jira/browse/NIFI-6582
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.9.2
>Reporter: Bryan Bende
>Assignee: Bryan Bende
>Priority: Minor
>  Labels: CLI
> Fix For: 1.10.0
>
>  Time Spent: 50m
>  Remaining Estimate: 0h
>
> The NiFi Registry REST API has an end-point to diff two flow versions. We should 
> have a CLI command that calls this end-point and lists the differences in a 
> table.
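
A minimal usage sketch of the resulting command (placing it under the `registry` 
group follows the CLI's existing conventions; the exact argument names are not 
listed here, so ask the command for its help):
{code:bash}
./bin/cli.sh registry diff-flow-versions -h
{code}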



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (NIFI-6582) Add diff-flow-versions command to CLI

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929268#comment-16929268
 ] 

ASF subversion and git services commented on NIFI-6582:
---

Commit f42d5e56fc1a11d0d8851ce3c642029df691b8bc in nifi's branch 
refs/heads/master from Bryan Bende
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=f42d5e5 ]

NIFI-6582 Removing bucketId argument to be consistent with other commands

This closes #3667.

Signed-off-by: Mark Payne 


> Add diff-flow-versions command to CLI
> -
>
> Key: NIFI-6582
> URL: https://issues.apache.org/jira/browse/NIFI-6582
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.9.2
>Reporter: Bryan Bende
>Assignee: Bryan Bende
>Priority: Minor
>  Labels: CLI
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> The NiFi Registry REST API has an end-point to diff two flow versions. We should 
> have a CLI command that calls this end-point and lists the differences in a 
> table.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (NIFI-6582) Add diff-flow-versions command to CLI

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929267#comment-16929267
 ] 

ASF subversion and git services commented on NIFI-6582:
---

Commit 5fd6b873feac38d2e5f5a8e767c87f16c710ec8f in nifi's branch 
refs/heads/master from Bryan Bende
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=5fd6b87 ]

NIFI-6582 Add diff-flow-versions command to CLI


> Add diff-flow-versions command to CLI
> -
>
> Key: NIFI-6582
> URL: https://issues.apache.org/jira/browse/NIFI-6582
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.9.2
>Reporter: Bryan Bende
>Assignee: Bryan Bende
>Priority: Minor
>  Labels: CLI
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> The NiFi Registry REST API has an end-point to diff two flow versions. We should 
> have a CLI command that calls this end-point and lists the differences in a 
> table.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] markap14 commented on issue #3667: NIFI-6582 Add diff-flow-versions command to CLI

2019-09-13 Thread GitBox
markap14 commented on issue #3667: NIFI-6582 Add diff-flow-versions command to 
CLI
URL: https://github.com/apache/nifi/pull/3667#issuecomment-531269936
 
 
   Thanks @bbende, all looks good. +1, merged to master.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] asfgit closed pull request #3667: NIFI-6582 Add diff-flow-versions command to CLI

2019-09-13 Thread GitBox
asfgit closed pull request #3667: NIFI-6582 Add diff-flow-versions command to 
CLI
URL: https://github.com/apache/nifi/pull/3667
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (NIFI-6658) Provide capability for obtaining diagnostic information for admins

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-6658?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929257#comment-16929257
 ] 

ASF subversion and git services commented on NIFI-6658:
---

Commit eb6085a31d9b8e3f041ff59fc3dcbb2a110b3f95 in nifi's branch 
refs/heads/master from Mark Payne
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=eb6085a ]

NIFI-6658: Implement new bin/nifi.sh diagnostics command that is responsible 
for obtaining diagnostic information about many different parts of nifi, the 
operating system, etc.

This closes #3727.

Signed-off-by: Bryan Bende 


> Provide capability for obtaining diagnostic information for admins
> --
>
> Key: NIFI-6658
> URL: https://issues.apache.org/jira/browse/NIFI-6658
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.10.0
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> When users run into issues, there are several different questions that we end 
> up asking very often:
>  * What version of NiFi?
>  * What does the thread dump look like?
>  * What version of Java?
>  * Operating system info
>  * Disk space usage
>  * Cluster information
>  * How many open file handles are used / allowed?
> And several others that are along those same lines.
> We already have the ability for an admin to run `bin/nifi.sh dump ` 
> to gather the thread dump. We should expand this capability to provide more 
> than just a thread dump and to provide the answers to these common questions, 
> so that a user can obtain the diagnostic information easily with a single 
> command.
> It probably makes more sense to introduce a new command, `bin/nifi.sh 
> diagnostics ` rather than just adding this to the `dump` command 
> because there are times that we need to gather several thread dumps, and we 
> don't need to gather all of this information each time. Some may already have 
> scripts, etc. that are set up to parse the thread dumps, as well.
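
A minimal usage sketch (the output filename is illustrative; the command is 
expected to write the diagnostics bundle to the given file):
{code:bash}
./bin/nifi.sh diagnostics diag1.txt
{code}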



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] andrewmlim commented on issue #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim commented on issue #3725: NIFI-6558 Added Parameters to User Guide 
and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#issuecomment-531266412
 
 
   > Thanks for updating the docs @andrewmlim! I pointed out a handful of notes 
that I think could be clarified slightly. It may also make sense to point out a 
few additional notes:
   > 
   > * Parameters cannot be referenced in Reporting Tasks currently, or in 
Controller-Level Controller Services.
   > * While most processor/controller service properties are not validated 
when referencing variables / using Expression Language, they will be validated 
(against the substituted value) when using Parameters. This is a nice advantage 
of Parameters over variables.
   > * Variable values cannot reference other variables or make use of 
Expression Language.
   > * It may be worth explaining how Parameters work when interacting with the 
Flow Registry. When exporting a flow to the flow registry, the name of the 
parameter context is sent for each process group that is stored. Additionally, 
the parameters (names, descriptions, values, whether or not sensitive) are 
stored with the flow (for sensitive params, the values are not stored). When a 
flow is imported / updated, if no parameter context is already set, NiFi will 
set the Parameter Context with the same name as the one stored in flow 
registry. If none exists with that name, it will create one. It will also add 
in values for any missing parameters, based on the values that were exported to 
the flow registry. Even though sensitive values are not stored in the flow 
registry, this is a huge improvement over variables, because if Parameters are 
used, it's very easy for the user, on import/upgrade, to just go in and define 
the values for the Parameters, including sensitive values. If using Variables, 
users had to search for sensitive properties to fill in the values, etc. And 
since all properties in the flow can now be parameterized, it is now easy, when 
designing the flow, to parameterize things that will change per-deployment, and 
then have just one place to fill in all of these values when importing the flow.
   
   Thanks for the comments @markap14 ! I'll edit my PR with these Notes. For 
the last bullet re: Flow Registry integration, I agree we need to cover this 
topic in more depth, however I was planning on doing that in a separate Jira/PR.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] andrewmlim edited a comment on issue #3725: NIFI-6558 Added Parameters to User Guide and Sys Admin Guide

2019-09-13 Thread GitBox
andrewmlim edited a comment on issue #3725: NIFI-6558 Added Parameters to User 
Guide and Sys Admin Guide
URL: https://github.com/apache/nifi/pull/3725#issuecomment-531266412
 
 
   > Thanks for updating the docs @andrewmlim! I pointed out a handful of notes 
that I think could be clarified slightly. It may also make sense to point out a 
few additional notes:
   > 
   > * Parameters cannot be referenced in Reporting Tasks currently, or in 
Controller-Level Controller Services.
   > * While most processor/controller service properties are not validated 
when referencing variables / using Expression Language, they will be validated 
(against the substituted value) when using Parameters. This is a nice advantage 
of Parameters over variables.
   > * Variable values cannot reference other variables or make use of 
Expression Language.
   > * It may be worth explaining how Parameters work when interacting with the 
Flow Registry. When exporting a flow to the flow registry, the name of the 
parameter context is sent for each process group that is stored. Additionally, 
the parameters (names, descriptions, values, whether or not sensitive) are 
stored with the flow (for sensitive params, the values are not stored). When a 
flow is imported / updated, if no parameter context is already set, NiFi will 
set the Parameter Context with the same name as the one stored in flow 
registry. If none exists with that name, it will create one. It will also add 
in values for any missing parameters, based on the values that were exported to 
the flow registry. Even though sensitive values are not stored in the flow 
registry, this is a huge improvement over variables, because if Parameters are 
used, it's very easy for the user, on import/upgrade, to just go in and define 
the values for the Parameters, including sensitive values. If using Variables, 
users had to search for sensitive properties to fill in the values, etc. And 
since all properties in the flow can now be parameterized, it is now easy, when 
designing the flow, to parameterize things that will change per-deployment, and 
then have just one place to fill in all of these values when importing the flow.
   
   Thanks for the comments @markap14 ! I'll edit my PR with these Notes. For 
the last bullet re: Flow Registry integration, I agree we need to cover this 
topic in more depth, however I was planning on doing that in a separate Jira/PR.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (NIFI-6658) Provide capability for obtaining diagnostic information for admins

2019-09-13 Thread Bryan Bende (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-6658?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-6658:
--
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> Provide capability for obtaining diagnostic information for admins
> --
>
> Key: NIFI-6658
> URL: https://issues.apache.org/jira/browse/NIFI-6658
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Mark Payne
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.10.0
>
>  Time Spent: 40m
>  Remaining Estimate: 0h
>
> When users run into issues, there are several different questions that we end 
> up asking very often:
>  * What version of NiFi?
>  * What does the thread dump look like?
>  * What version of Java?
>  * Operating system info
>  * Disk space usage
>  * Cluster information
>  * How many open file handles are used / allowed?
> And several others that are along those same lines.
> We already have the ability for an admin to run `bin/nifi.sh dump ` 
> to gather the thread dump. We should expand this capability to provide more 
> than just a thread dump and to provide the answers to these common questions, 
> so that a user can obtain the diagnostic information easily with a single 
> command.
> It probably makes more sense to introduce a new command, `bin/nifi.sh 
> diagnostics ` rather than just adding this to the `dump` command 
> because there are times that we need to gather several thread dumps, and we 
> don't need to gather all of this information each time. Some may already have 
> scripts, etc. that are set up to parse the thread dumps, as well.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] asfgit closed pull request #3727: NIFI-6658: Implement new bin/nifi.sh diagnostics command that is resp…

2019-09-13 Thread GitBox
asfgit closed pull request #3727: NIFI-6658: Implement new bin/nifi.sh 
diagnostics command that is resp…
URL: https://github.com/apache/nifi/pull/3727
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] bbende commented on issue #3727: NIFI-6658: Implement new bin/nifi.sh diagnostics command that is resp…

2019-09-13 Thread GitBox
bbende commented on issue #3727: NIFI-6658: Implement new bin/nifi.sh 
diagnostics command that is resp…
URL: https://github.com/apache/nifi/pull/3727#issuecomment-531265841
 
 
   Everything looks good, and this is going to be extremely helpful. Going to 
merge shortly.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] nielsbasjes commented on issue #3734: NIFI-6666 Add Useragent Header to InvokeHTTP requests

2019-09-13 Thread GitBox
nielsbasjes commented on issue #3734: NIFI-6666 Add Useragent Header to 
InvokeHTTP requests
URL: https://github.com/apache/nifi/pull/3734#issuecomment-531262838
 
 
   Can someone please add me as a contributor in Jira?
   And assign https://issues.apache.org/jira/browse/NIFI-6666 to me?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] nielsbasjes opened a new pull request #3734: NIFI-6666 Add Useragent Header to InvokeHTTP requests

2019-09-13 Thread GitBox
nielsbasjes opened a new pull request #3734: NIFI-6666 Add Useragent Header to 
InvokeHTTP requests
URL: https://github.com/apache/nifi/pull/3734
 
 
   Thank you for submitting a contribution to Apache NiFi.
   
   Please provide a short description of the PR here:
   
    Description of PR
   
   This enables the option of sending a Useragent header with all InvokeHTTP 
requests and also adds a sensible default value.
   
   Something like this was present before in the (now deprecated) GetHTTP (but 
without the default value).
   
   The default useragent value is based on a combination of the version of NiFi 
at hand, the Java version, and the operating system version.
   
   In my local environment the end result was this default useragent:
   
   `Apache Nifi/1.10.0-SNAPSHOT (git:rel/nifi-1.9.0-347-gec3ea46; 
Java/1.8.0_222; Linux 4.4.0-159-generic; amd64; https://nifi.apache.org/)`
   
   To make this possible I changed:
   1. The nifi-api now has a generated class that contains several build time 
variables (like project version, build time and git information).
   1. The VariableRegistry ENVIRONMENT_SYSTEM_REGISTRY includes these variables.
   1. The StandardProcessorTestRunner now loads the ENVIRONMENT_SYSTEM_REGISTRY 
variables by default to allow any testing on these variables.
   1. InvokeHTTP now has a property that holds the useragent value and supports 
the Expression Language to make it dynamic. It is evaluated per flow file, so 
people can write a flow that changes the useragent for each flowfile going 
through (see the example below).
   
   This improvement fixes NIFI-6666.
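   
   For example (a hypothetical value, not part of this PR), the property could 
   be set to `MyCrawler/1.0 (flowfile ${uuid})`, using the core `uuid` flowfile 
   attribute, so each request carries the flowfile's UUID in its useragent.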
   
   ### For all changes:
   - [x] Is there a JIRA ticket associated with this PR? Is it referenced in 
the commit message?
   
   - [x] Does your PR title start with **NIFI-XXXX** where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
   
   - [x] Has your PR been rebased against the latest commit within the target 
branch (typically `master`)?
   
   - [x] Is your initial contribution a single, squashed commit? _Additional 
commits in response to PR reviewer feedback should be made on this branch and 
pushed to allow change tracking. Do not `squash` or use `--force` when pushing 
to allow for clean monitoring of changes._
   
   ### For code changes:
   - [x] Have you ensured that the full suite of tests is executed via `mvn 
-Pcontrib-check clean install` at the root `nifi` folder?
   - [x] Have you written or updated unit tests to verify your changes?
   - [ ] Have you verified that the full build is successful on both JDK 8 and 
JDK 11?
   **Waiting for Travis to check this**
   
   - [x] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
   - [x] If applicable, have you updated the `LICENSE` file, including the main 
`LICENSE` file under `nifi-assembly`?
   - [x] If applicable, have you updated the `NOTICE` file, including the main 
`NOTICE` file found under `nifi-assembly`?
   - [x] If adding new Properties, have you added `.displayName` in addition to 
.name (programmatic access) for each of the new properties?
   
   ### Note:
   Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (NIFI-5952) RAW Site-to-Site fails with java.nio.channels.IllegalBlockingModeException

2019-09-13 Thread Mark Payne (Jira)


 [ 
https://issues.apache.org/jira/browse/NIFI-5952?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Payne updated NIFI-5952:
-
Fix Version/s: 1.10.0
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

I have resolved this as fixed, after merging PR #3265. That PR does address the 
concern described here. If we want to provide an NIO-based implementation, we 
can do that in a new Jira.

> RAW Site-to-Site fails with java.nio.channels.IllegalBlockingModeException
> --
>
> Key: NIFI-5952
> URL: https://issues.apache.org/jira/browse/NIFI-5952
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core Framework
> Environment: jdk-11.0.1
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>Priority: Blocker
>  Labels: Java11
> Fix For: 1.10.0
>
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> During the review cycle of NIFI-5820, I found that while HTTP S2S works 
> without issue, RAW S2S is failing with the following Exception:
> {code:java}
>  2018-12-19 16:19:26,811 ERROR [Site-to-Site Listener] org.apache.nifi.NiFi
>  java.nio.channels.IllegalBlockingModeException: null
>  at 
> java.base/sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:121)
>  at 
> org.apache.nifi.remote.SocketRemoteSiteListener$1.run(SocketRemoteSiteListener.java:125)
>  at java.base/java.lang.Thread.run(Thread.java:834)
> {code}
> Despite the fact that RAW S2S has worked with older Java versions, it 
> seems the current nio usage in RAW S2S is not correct, and JDK 11 starts 
> complaining about it.
> Here are a few things I've discovered with current NiFi and nio SocketChannel:
>  - NiFi accepts RAW S2S client connections with SocketRemoteSiteListener, 
> which uses ServerSocketChannel in a non-blocking manner [1]
>  - But SocketRemoteSiteListener doesn't use the Selector API to accept incoming 
> connections and transfer data with the channel. This is the cause of the above 
> exception.
>  - SocketRemoteSiteListener spawns a new thread when it accepts a connection. 
> This is how connections are handled in non-nio, standard Socket 
> programming. If we want to use non-blocking NIO, we need to use channels with 
> a Selector.
>  - But using non-blocking IO with the current NiFi S2S protocol would add 
> little or no benefit. [2]
> To make RAW S2S work with Java 11, we need either:
>  A. Stop using nio packages.
>  B. Implement correct nio usage, meaning use the Selector API; we would 
> probably need another thread pool.
> I'm going to take approach A above, because B would require much more 
> refactoring.
> [1] 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java#L120]
>  [2] 
> [https://stackoverflow.com/questions/12338204/in-java-nio-is-a-selector-useful-for-a-client-socketchannel]



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] markap14 commented on issue #3578: NIFI-5952 [WIP] Fixing RAW S2S illegal blocking mode on Java 11

2019-09-13 Thread GitBox
markap14 commented on issue #3578: NIFI-5952 [WIP] Fixing RAW S2S illegal 
blocking mode on Java 11
URL: https://github.com/apache/nifi/pull/3578#issuecomment-531261196
 
 
   @ijokarumawak I merged in PR #3265, which is an alternative to this PR. 
Perhaps it makes sense to close this PR for now, and if you're able to get the 
issues resolved for this NIO-based implementation we can create a new PR 
against a new Jira?


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] [nifi] markap14 commented on issue #3265: NIFI-5952 Refactor RAW S2S from nio to socket

2019-09-13 Thread GitBox
markap14 commented on issue #3265:  NIFI-5952 Refactor RAW S2S from nio to 
socket
URL: https://github.com/apache/nifi/pull/3265#issuecomment-531260308
 
 
   Thanks for the update @ijokarumawak. I don't love that we are going from 
NIO to the "regular IO" based API, but I agree with your assessment that it's a 
fairly straightforward change that will allow us to use site-to-site with Java 
11. I did test this against both Java 8 and Java 11 and things seemed to be 
working well. +1. Have merged to master.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (NIFI-5952) RAW Site-to-Site fails with java.nio.channels.IllegalBlockingModeException

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-5952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929247#comment-16929247
 ] 

ASF subversion and git services commented on NIFI-5952:
---

Commit 70c428f0970e1c79c14532169a727db85cb1bd66 in nifi's branch 
refs/heads/master from Mark Payne
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=70c428f ]

NIFI-5952: This closes #3265


> RAW Site-to-Site fails with java.nio.channels.IllegalBlockingModeException
> --
>
> Key: NIFI-5952
> URL: https://issues.apache.org/jira/browse/NIFI-5952
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core Framework
> Environment: jdk-11.0.1
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>Priority: Blocker
>  Labels: Java11
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> During the review cycle of NIFI-5820, I found that while HTTP S2S works 
> without issue, RAW S2S is failing with the following Exception:
> {code:java}
>  2018-12-19 16:19:26,811 ERROR [Site-to-Site Listener] org.apache.nifi.NiFi
>  java.nio.channels.IllegalBlockingModeException: null
>  at 
> java.base/sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:121)
>  at 
> org.apache.nifi.remote.SocketRemoteSiteListener$1.run(SocketRemoteSiteListener.java:125)
>  at java.base/java.lang.Thread.run(Thread.java:834)
> {code}
> Despite the fact that RAW S2S has worked with older Java versions, it 
> seems the current nio usage in RAW S2S is not correct, and JDK 11 starts 
> complaining about it.
> Here are a few things I've discovered with current NiFi and nio SocketChannel:
>  - NiFi accepts RAW S2S client connections with SocketRemoteSiteListener, 
> which uses ServerSocketChannel in a non-blocking manner [1]
>  - But SocketRemoteSiteListener doesn't use the Selector API to accept incoming 
> connections and transfer data with the channel. This is the cause of the above 
> exception.
>  - SocketRemoteSiteListener spawns a new thread when it accepts a connection. 
> This is how connections are handled in non-nio, standard Socket 
> programming. If we want to use non-blocking NIO, we need to use channels with 
> a Selector.
>  - But using non-blocking IO with the current NiFi S2S protocol would add 
> little or no benefit. [2]
> To make RAW S2S work with Java 11, we need either:
>  A. Stop using nio packages.
>  B. Implement correct nio usage, meaning use the Selector API; we would 
> probably need another thread pool.
> I'm going to take approach A above, because B would require much more 
> refactoring.
> [1] 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java#L120]
>  [2] 
> [https://stackoverflow.com/questions/12338204/in-java-nio-is-a-selector-useful-for-a-client-socketchannel]



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (NIFI-5952) RAW Site-to-Site fails with java.nio.channels.IllegalBlockingModeException

2019-09-13 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/NIFI-5952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16929246#comment-16929246
 ] 

ASF subversion and git services commented on NIFI-5952:
---

Commit e659e3b606cc3e41816081046f07d9a8d33c88f6 in nifi's branch 
refs/heads/master from Koji Kawamura
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=e659e3b ]

NIFI-5952 Refactor RAW S2S from nio to socket


> RAW Site-to-Site fails with java.nio.channels.IllegalBlockingModeException
> --
>
> Key: NIFI-5952
> URL: https://issues.apache.org/jira/browse/NIFI-5952
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core Framework
> Environment: jdk-11.0.1
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>Priority: Blocker
>  Labels: Java11
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> During the review cycle of NIFI-5820, I found that while HTTP S2S works 
> without issue, RAW S2S is failing with the following Exception:
> {code:java}
>  2018-12-19 16:19:26,811 ERROR [Site-to-Site Listener] org.apache.nifi.NiFi
>  java.nio.channels.IllegalBlockingModeException: null
>  at 
> java.base/sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:121)
>  at 
> org.apache.nifi.remote.SocketRemoteSiteListener$1.run(SocketRemoteSiteListener.java:125)
>  at java.base/java.lang.Thread.run(Thread.java:834)
> {code}
> Despite the fact that RAW S2S has worked with older Java versions, it 
> seems the current nio usage in RAW S2S is not correct, and JDK 11 starts 
> complaining about it.
> Here are a few things I've discovered with current NiFi and nio SocketChannel:
>  - NiFi accepts RAW S2S client connections with SocketRemoteSiteListener, 
> which uses ServerSocketChannel in a non-blocking manner [1]
>  - But SocketRemoteSiteListener doesn't use the Selector API to accept incoming 
> connections and transfer data with the channel. This is the cause of the above 
> exception.
>  - SocketRemoteSiteListener spawns a new thread when it accepts a connection. 
> This is how connections are handled in non-nio, standard Socket 
> programming. If we want to use non-blocking NIO, we need to use channels with 
> a Selector.
>  - But using non-blocking IO with the current NiFi S2S protocol would add 
> little or no benefit. [2]
> To make RAW S2S work with Java 11, we need either:
>  A. Stop using nio packages.
>  B. Implement correct nio usage, meaning use the Selector API; we would 
> probably need another thread pool.
> I'm going to take approach A above, because B would require much more 
> refactoring.
> [1] 
> [https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java#L120]
>  [2] 
> [https://stackoverflow.com/questions/12338204/in-java-nio-is-a-selector-useful-for-a-client-socketchannel]



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[GitHub] [nifi] asfgit closed pull request #3265: NIFI-5952 Refactor RAW S2S from nio to socket

2019-09-13 Thread GitBox
asfgit closed pull request #3265:  NIFI-5952 Refactor RAW S2S from nio to socket
URL: https://github.com/apache/nifi/pull/3265
 
 
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

