[jira] [Updated] (NIFI-2298) Add missing futures for ConsumeKafka

2016-07-17 Thread sumanth chinthagunta (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2298?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sumanth chinthagunta updated NIFI-2298:
---
Description: 
The new ConsumeKafka processor is missing some capabilities that were present 
in the old GetKafka processor. 
1. The new ConsumeKafka does not write critical Kafka attributes, i.e., kafka.key, 
kafka.offset, kafka.partition, etc., into FlowFile attributes (see the sketch 
after this list). 

Old GetKafka processor: 
{quote}
Standard FlowFile Attributes
Key: 'entryDate'
   Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'lineageStartDate'
   Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'fileSize'
   Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
   Value: '19709945781167274'
Key: 'kafka.key'
   Value: '\{"database":"test","table":"sc_job","pk.systemid":1\}'
Key: 'kafka.offset'
   Value: '1184010261'
Key: 'kafka.partition'
   Value: '0'
Key: 'kafka.topic'
   Value: 'data'
Key: 'path'
   Value: './'
Key: 'uuid'
   Value: '244059bb-9ad9-4d74-b1fb-312eee72124a'
 {quote}
 
New ConsumeKafka processor: 
 {quote}
Standard FlowFile Attributes
Key: 'entryDate'
   Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'lineageStartDate'
   Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'fileSize'
   Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
   Value: '19710046870478139'
Key: 'path'
   Value: './'
Key: 'uuid'
   Value: '349fbeb3-e342-4533-be4c-424793fa5c59'
{quote}

2. GetKafka/PutKafka are compatible with Kafka 0.8.x and 0.9.x. 
Please base the new PublishKafka/ConsumeKafka processors on the Kafka 0.10 
client. 

3. Support subscribing to multiple topics, i.e., topic: topic1,topic2. 

4. Support a configurable Serializer/Deserializer for String, JSON, Avro, etc. 
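
A rough sketch of what item 1 is asking for, assuming the Kafka client's 
ConsumerRecord and NiFi's ProcessSession (the helper class and method below are 
made up for illustration):
{code}
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.ProcessSession;

public class KafkaAttributeSketch {

    // Hypothetical helper: copy Kafka record metadata into FlowFile attributes,
    // mirroring what the old GetKafka processor emitted (see attribute dump above).
    static FlowFile addKafkaAttributes(final ProcessSession session, final FlowFile flowFile,
                                       final ConsumerRecord<byte[], byte[]> record) {
        final Map<String, String> attrs = new HashMap<>();
        if (record.key() != null) {
            attrs.put("kafka.key", new String(record.key(), StandardCharsets.UTF_8));
        }
        attrs.put("kafka.offset", String.valueOf(record.offset()));
        attrs.put("kafka.partition", String.valueOf(record.partition()));
        attrs.put("kafka.topic", record.topic());
        return session.putAllAttributes(flowFile, attrs);
    }
}
{code}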

  was:
The new ConsumeKafka processor is missing some capabilities that were present 
in the old GetKafka processor. 
1. The new ConsumeKafka does not write critical Kafka attributes, i.e., kafka.key, 
kafka.offset, kafka.partition, etc., into FlowFile attributes. 

Old GetKafka processor: 
{quote}
Standard FlowFile Attributes
Key: 'entryDate'
   Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'lineageStartDate'
   Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'fileSize'
   Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
   Value: '19709945781167274'
Key: 'kafka.key'
   Value: '\{"database":"test","table":"sc_job","pk.systemid":1\}'
Key: 'kafka.offset'
   Value: '1184010261'
Key: 'kafka.partition'
   Value: '0'
Key: 'kafka.topic'
   Value: 'data'
Key: 'path'
   Value: './'
Key: 'uuid'
   Value: '244059bb-9ad9-4d74-b1fb-312eee72124a'
 {quote}
 
New ConsumeKafka processor: 
 {quote}
Standard FlowFile Attributes
Key: 'entryDate'
   Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'lineageStartDate'
   Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'fileSize'
   Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
   Value: '19710046870478139'
Key: 'path'
   Value: './'
Key: 'uuid'
   Value: '349fbeb3-e342-4533-be4c-424793fa5c59'
{quote}

2. GetKafka/PutKafka are compatible with Kafka 0.8.x and 0.9.x. 
Please base the new PublishKafka/ConsumeKafka processors on the Kafka 0.10 
client. 

3. Support subscribing to multiple topics, i.e., topic: topic1,topic2. 


> Add missing futures for ConsumeKafka
> 
>
> Key: NIFI-2298
> URL: https://issues.apache.org/jira/browse/NIFI-2298
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 0.7.0
>Reporter: sumanth chinthagunta
>  Labels: kafka
> Fix For: 0.8.0
>
>
> The new ConsumeKafka processor is missing some capabilities that were 
> present in the old GetKafka processor. 
> 1. The new ConsumeKafka does not write critical Kafka attributes, i.e., 
> kafka.key, kafka.offset, kafka.partition, etc., into FlowFile attributes. 
> Old GetKafka processor: 
> {quote}
> Standard FlowFile Attributes
> Key: 'entryDate'
>Value: 'Sun Jul 17 15:17:00 CDT 2016'
> Key: 'lineageStartDate'
>Value: 'Sun Jul 17 15:17:00 CDT 2016'
> Key: 'fileSize'
>Value: '183'
> FlowFile Attribute Map Content
> Key: 'filename'
>Value: '19709945781167274'
> Key: 'kafka.key'
>Value: '\{"database":"test","table":"sc_job","pk.systemid":1\}'
> Key: 'kafka.offset'
>Value: '1184010261'
> Key: 'kafka.partition'
>Value: '0'
> Key: 'kafka.topic'
>Value: 'data'
> Key: 'path'
>Value: './'
> Key: 'uuid'
>

[jira] [Created] (NIFI-2299) Add standard services API dependency to Scripting Processors

2016-07-17 Thread sumanth chinthagunta (JIRA)
sumanth chinthagunta created NIFI-2299:
--

 Summary: Add standard services API dependency to Scripting 
Processors   
 Key: NIFI-2299
 URL: https://issues.apache.org/jira/browse/NIFI-2299
 Project: Apache NiFi
  Issue Type: Bug
  Components: Extensions
Affects Versions: 0.6.1, 0.7.0
Reporter: sumanth chinthagunta
 Fix For: 0.8.0


Scripting Processors cannot use Controller Services such as 
*DistributedMapCacheClientService*, because the required dependencies are missing 
from the *ExecuteScript* NAR. By adding the following dependencies we can open new 
possibilities for Scripting Processors.  

Add this dependency to 
nifi/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-nar/pom.xml
{code}
<dependency>
    <groupId>org.apache.nifi</groupId>
    <artifactId>nifi-standard-services-api-nar</artifactId>
    <type>nar</type>
</dependency>
{code}

Add this dependency to 
nifi/nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/pom.xml

{code}
<dependency>
    <groupId>org.apache.nifi</groupId>
    <artifactId>nifi-distributed-cache-client-service-api</artifactId>
</dependency>
{code}

Then we can create a scripting processor like:
{code}
import org.apache.nifi.controller.ControllerService
import com.crossbusiness.nifi.processors.StringSerDe

final StringSerDe stringSerDe = new StringSerDe();

// 'context' and 'log' are bindings provided to the script by ExecuteScript
def lookup = context.controllerServiceLookup

// 'DistributedMapCacheClientServiceName' is a dynamic property on the ExecuteScript
// processor holding the name of the DistributedMapCacheClientService to use
def cacheServiceName = DistributedMapCacheClientServiceName.value

log.error "cacheServiceName: ${cacheServiceName}"

// Resolve the controller service identifier by matching the configured service name
def cacheServiceId =
    lookup.getControllerServiceIdentifiers(ControllerService).find {
        cs -> lookup.getControllerServiceName(cs) == cacheServiceName
    }

log.error "cacheServiceId: ${cacheServiceId}"

// Fetch the cache client and read a value using the String serializer/deserializer
def cache = lookup.getControllerService(cacheServiceId)
log.error cache.get("aaa", stringSerDe, stringSerDe)
{code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (NIFI-2298) Add missing futures for ConsumeKafka

2016-07-17 Thread sumanth chinthagunta (JIRA)
sumanth chinthagunta created NIFI-2298:
--

 Summary: Add missing futures for ConsumeKafka
 Key: NIFI-2298
 URL: https://issues.apache.org/jira/browse/NIFI-2298
 Project: Apache NiFi
  Issue Type: Bug
  Components: Extensions
Affects Versions: 0.7.0
Reporter: sumanth chinthagunta
 Fix For: 0.8.0


The new ConsumeKafka processor is missing some capabilities that were present in 
the old GetKafka processor. 
1. The new ConsumeKafka does not write critical Kafka attributes, i.e., kafka.key, 
kafka.offset, kafka.partition, etc., into FlowFile attributes. 

Old GetKafka processor: 
{quote}
Standard FlowFile Attributes
Key: 'entryDate'
   Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'lineageStartDate'
   Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'fileSize'
   Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
   Value: '19709945781167274'
Key: 'kafka.key'
   Value: '\{"database":"test","table":"sc_job","pk.systemid":1\}'
Key: 'kafka.offset'
   Value: '1184010261'
Key: 'kafka.partition'
   Value: '0'
Key: 'kafka.topic'
   Value: 'data'
Key: 'path'
   Value: './'
Key: 'uuid'
   Value: '244059bb-9ad9-4d74-b1fb-312eee72124a'
 {quote}
 
New ConsumeKafka processor: 
 {quote}
Standard FlowFile Attributes
Key: 'entryDate'
   Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'lineageStartDate'
   Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'fileSize'
   Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
   Value: '19710046870478139'
Key: 'path'
   Value: './'
Key: 'uuid'
   Value: '349fbeb3-e342-4533-be4c-424793fa5c59'
{quote}

2. GetKafka/PutKafka are compatible with Kafka 0.8.x and 0.9.x. Please base the 
new PublishKafka/ConsumeKafka processors on the Kafka 0.10 client. 

3. Support subscribing to multiple topics, i.e., topic: topic1,topic2. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (NIFI-2297) Describe differences for QueryDatabaseTable between 0.x and 1.0

2016-07-17 Thread Matt Burgess (JIRA)
Matt Burgess created NIFI-2297:
--

 Summary: Describe differences for QueryDatabaseTable between 0.x 
and 1.0
 Key: NIFI-2297
 URL: https://issues.apache.org/jira/browse/NIFI-2297
 Project: Apache NiFi
  Issue Type: Task
  Components: Documentation & Website
Reporter: Matt Burgess
Assignee: Andrew Lim
 Fix For: 1.0.0


QueryDatabaseTable was refactored to support a common interface 
(DatabaseAdapter) shared by DB-related processors that need to know the "type" 
of database, for SQL generation and similar concerns. Since QueryDatabaseTable 
exists in 0.7.x, documentation is needed to describe the change from the "SQL 
Preprocessing Strategy" property to the "Database Type" property. The former 
offered a choice of "None" or "Oracle"; in the new version that property is no 
longer valid, and "Database Type" offers a choice of "Generic" or "Oracle".



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi issue #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread mans2singh
Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/502
  
Hi @JPercivall - I've updated the code based on your comments.  Please let 
me know if you have any other comments.  Thanks.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread mans2singh
Github user mans2singh commented on a diff in the pull request:

https://github.com/apache/nifi/pull/502#discussion_r71093727
  
--- Diff: 
nifi-nar-bundles/nifi-ignite-bundle/nifi-ignite-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,26 @@
+nifi-ignite-nar
+Copyright 2016 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
+
+===
+Apache Software License v2
+===
+
+The following binary components are provided under the Apache Software 
License v2
+
+  (ASLv2) Apache Ignite
--- End diff --

Updated NOTICE.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2156) Add ListDatabaseTables processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381598#comment-15381598
 ] 

ASF GitHub Bot commented on NIFI-2156:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71093459
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the 

[GitHub] nifi pull request #642: NIFI-2156: Add ListDatabaseTables processor

2016-07-17 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71093459
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the timestamp of the query is stored. "
++ "This allows the Processor to not re-list tables the next time 
that the Processor is run. Changing any of the processor properties will "
++ "indicate that the processor should 

[jira] [Commented] (NIFI-2156) Add ListDatabaseTables processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381570#comment-15381570
 ] 

ASF GitHub Bot commented on NIFI-2156:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71091826
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the 

[GitHub] nifi pull request #642: NIFI-2156: Add ListDatabaseTables processor

2016-07-17 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71091826
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the timestamp of the query is stored. "
--- End diff --

Wasn't sure about that but makes sense to me :) will change to cluster.
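
For reference, a minimal sketch of what that change would look like, assuming 
NiFi's @Stateful annotation and Scope enum (the class name below is made up; only 
the state scope differs from the diff above):
{code}
import org.apache.nifi.annotation.behavior.Stateful;
import org.apache.nifi.components.state.Scope;
import org.apache.nifi.processor.AbstractProcessor;

// Hypothetical class name; illustrates cluster-scoped state so a new primary node
// picks up the stored listing timestamp instead of re-listing all tables.
@Stateful(scopes = {Scope.CLUSTER}, description = "After performing a listing of tables, the timestamp "
        + "of the query is stored so that tables are not re-listed, even if the primary node changes.")
public abstract class ClusterScopedTableListing extends AbstractProcessor {
}
{code}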


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub 

[jira] [Commented] (NIFI-2156) Add ListDatabaseTables processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381560#comment-15381560
 ] 

ASF GitHub Bot commented on NIFI-2156:
--

Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71091358
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the 

[GitHub] nifi pull request #642: NIFI-2156: Add ListDatabaseTables processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71091358
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the timestamp of the query is stored. "
++ "This allows the Processor to not re-list tables the next time 
that the Processor is run. Changing any of the processor properties will "
++ "indicate that the processor should 

[jira] [Commented] (NIFI-2156) Add ListDatabaseTables processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381559#comment-15381559
 ] 

ASF GitHub Bot commented on NIFI-2156:
--

Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71091290
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the 

[GitHub] nifi pull request #642: NIFI-2156: Add ListDatabaseTables processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/642#discussion_r71091290
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ListDatabaseTables.java
 ---
@@ -0,0 +1,304 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.util.StringUtils;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DatabaseMetaData;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+/**
+ * A processor to retrieve a list of tables (and their metadata) from a 
database connection
+ */
+@TriggerSerially
+@InputRequirement(InputRequirement.Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "list", "jdbc", "table", "database"})
+@CapabilityDescription("Generates a set of flow files, each containing 
attributes corresponding to metadata about a table from a database connection.")
+@WritesAttributes({
+@WritesAttribute(attribute = "db.table.name", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.catalog", description = 
"Contains the name of the catalog to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.schema", description = 
"Contains the name of the schema to which the table belongs (may be null)"),
+@WritesAttribute(attribute = "db.table.fullname", description = 
"Contains the fully-qualifed table name (possibly including catalog, schema, 
etc.)"),
+@WritesAttribute(attribute = "db.table.type",
+description = "Contains the type of the database table 
from the connection. Typical types are \"TABLE\", \"VIEW\", \"SYSTEM TABLE\", "
++ "\"GLOBAL TEMPORARY\", \"LOCAL TEMPORARY\", 
\"ALIAS\", \"SYNONYM\""),
+@WritesAttribute(attribute = "db.table.remarks", description = 
"Contains the name of a database table from the connection"),
+@WritesAttribute(attribute = "db.table.count", description = 
"Contains the number of rows in the table")
+})
+@Stateful(scopes = {Scope.LOCAL}, description = "After performing a 
listing of tables, the timestamp of the query is stored. "
--- End diff --

Shouldn't this be "cluster"? That way, when the primary node changes, it will 
keep the same listing of tables.


---
If your project is set up for it, you can reply to this email and 

[jira] [Commented] (NIFI-2157) Add GenerateTableFetch processor

2016-07-17 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381547#comment-15381547
 ] 

ASF subversion and git services commented on NIFI-2157:
---

Commit 01cae237454c0d7ffedbe7dd08dbe705bc22cb09 in nifi's branch 
refs/heads/master from [~mattyb149]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=01cae23 ]

NIFI-2157: Add GenerateTableFetch processor

This closes #645

Signed-off-by: jpercivall 


> Add GenerateTableFetch processor
> 
>
> Key: NIFI-2157
> URL: https://issues.apache.org/jira/browse/NIFI-2157
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Matt Burgess
>Assignee: Matt Burgess
> Fix For: 1.0.0
>
>
> This processor would presumably operate like QueryDatabaseTable, except it 
> will contain a "Partition Size" property, and rather than executing the SQL 
> statement(s) to fetch rows, it would generate flow files containing SQL 
> statements that will select rows from a table. If the partition size is 
> indicated, then the SELECT statements will refer to a range of rows, such 
> that each statement will grab only a portion of the table. If max-value 
> columns are specified, then only rows whose observed values for those columns 
> exceed the current maximum will be fetched (i.e. like QueryDatabaseTable). 
> These flow files (due to NIFI-1973) can be passed to ExecuteSQL processors 
> for the actual fetching of rows, and ExecuteSQL can be distributed across 
> cluster nodes and/or multiple tasks. These features enable distributed 
> incremental fetching of rows from database table(s).
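
As an illustration of the partitioning idea described above only, a rough sketch 
that builds such range-limited statements in Java; the real processor delegates 
SQL generation to a DatabaseAdapter, so the exact syntax differs, and the table 
and column names here are made up:
{code}
import java.util.ArrayList;
import java.util.List;

public class PartitionedFetchSketch {
    // Generate one SELECT per partition of rows beyond the last observed max value.
    static List<String> generateFetches(String table, String maxValueColumn,
                                        long lastMaxValue, long rowCount, long partitionSize) {
        final List<String> statements = new ArrayList<>();
        for (long offset = 0; offset < rowCount; offset += partitionSize) {
            statements.add("SELECT * FROM " + table
                    + " WHERE " + maxValueColumn + " > " + lastMaxValue
                    + " ORDER BY " + maxValueColumn
                    + " LIMIT " + partitionSize + " OFFSET " + offset);
        }
        return statements;
    }
}
{code}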



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi pull request #645: NIFI-2157: Add GenerateTableFetch processor

2016-07-17 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/645


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2157) Add GenerateTableFetch processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381548#comment-15381548
 ] 

ASF GitHub Bot commented on NIFI-2157:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/645


> Add GenerateTableFetch processor
> 
>
> Key: NIFI-2157
> URL: https://issues.apache.org/jira/browse/NIFI-2157
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Matt Burgess
>Assignee: Matt Burgess
> Fix For: 1.0.0
>
>
> This processor would presumably operate like QueryDatabaseTable, except it 
> will contain a "Partition Size" property, and rather than executing the SQL 
> statement(s) to fetch rows, it would generate flow files containing SQL 
> statements that will select rows from a table. If the partition size is 
> indicated, then the SELECT statements will refer to a range of rows, such 
> that each statement will grab only a portion of the table. If max-value 
> columns are specified, then only rows whose observed values for those columns 
> exceed the current maximum will be fetched (i.e. like QueryDatabaseTable). 
> These flow files (due to NIFI-1973) can be passed to ExecuteSQL processors 
> for the actual fetching of rows, and ExecuteSQL can be distributed across 
> cluster nodes and/or multiple tasks. These features enable distributed 
> incremental fetching of rows from database table(s).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (NIFI-2157) Add GenerateTableFetch processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381543#comment-15381543
 ] 

ASF GitHub Bot commented on NIFI-2157:
--

Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/645
  
+1 

Went through the code and all comments were addressed. Ran a contrib-check 
build and verified against a MySQL DB. Thanks @mattyb149, I will merge it in.


> Add GenerateTableFetch processor
> 
>
> Key: NIFI-2157
> URL: https://issues.apache.org/jira/browse/NIFI-2157
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Matt Burgess
>Assignee: Matt Burgess
> Fix For: 1.0.0
>
>
> This processor would presumably operate like QueryDatabaseTable, except it 
> will contain a "Partition Size" property, and rather than executing the SQL 
> statement(s) to fetch rows, it would generate flow files containing SQL 
> statements that will select rows from a table. If the partition size is 
> indicated, then the SELECT statements will refer to a range of rows, such 
> that each statement will grab only a portion of the table. If max-value 
> columns are specified, then only rows whose observed values for those columns 
> exceed the current maximum will be fetched (i.e. like QueryDatabaseTable). 
> These flow files (due to NIFI-1973) can be passed to ExecuteSQL processors 
> for the actual fetching of rows, and ExecuteSQL can be distributed across 
> cluster nodes and/or multiple tasks. These features enable distributed 
> incremental fetching of rows from database table(s).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi issue #645: NIFI-2157: Add GenerateTableFetch processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/645
  
+1 

Went through the code and all comments were addressed. Ran a contrib-check 
build and verified against a MySQL DB. Thanks @mattyb149, I will merge it in.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (NIFI-2296) The hover text for the Search icon in the Upload Template window is confusing

2016-07-17 Thread Andrew Lim (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2296?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim updated NIFI-2296:
-
Attachment: NIFI-2296_xmlChosen.png
NIFI-2296_noFileChosen.png

> The hover text for the Search icon in the Upload Template window is confusing
> -
>
> Key: NIFI-2296
> URL: https://issues.apache.org/jira/browse/NIFI-2296
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Affects Versions: 1.0.0
>Reporter: Andrew Lim
>Priority: Minor
>  Labels: UI
> Attachments: NIFI-2296_noFileChosen.png, NIFI-2296_xmlChosen.png
>
>
> When the user hovers over the Search icon in the Upload Template window, the 
> text displayed is either "No file chosen" or the name of the chosen template 
> XML file. This text doesn't aid the user in determining what the icon/button 
> is for, and it also displays redundant information when a template has been 
> selected. See attached screenshots.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (NIFI-2296) The hover text for the Search icon in the Upload Template window is confusing

2016-07-17 Thread Andrew Lim (JIRA)
Andrew Lim created NIFI-2296:


 Summary: The hover text for the Search icon in the Upload Template 
window is confusing
 Key: NIFI-2296
 URL: https://issues.apache.org/jira/browse/NIFI-2296
 Project: Apache NiFi
  Issue Type: Sub-task
  Components: Core UI
Affects Versions: 1.0.0
Reporter: Andrew Lim
Priority: Minor


When the user hovers over the Search icon in the Upload Template window, the 
text displayed is either "No file chosen" or the name of the chosen template XML 
file. This text doesn't aid the user in determining what the icon/button is for, 
and it also displays redundant information when a template has been selected.  
See attached screenshots.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (NIFI-2295) Replay Permissions bug in unsecure standalone instance

2016-07-17 Thread Joseph Percivall (JIRA)
Joseph Percivall created NIFI-2295:
--

 Summary: Replay Permissions bug in unsecure standalone instance
 Key: NIFI-2295
 URL: https://issues.apache.org/jira/browse/NIFI-2295
 Project: Apache NiFi
  Issue Type: Bug
Reporter: Joseph Percivall
 Fix For: 1.0.0


I have an unsecured default instance. When I go to replay a message, I see an 
unexpected error that says "Unable to perform the desired action due to 
insufficient permissions. Contact the system administrator." 

That said, after I see the error, when I go back to the canvas and refresh, I see 
that the message was properly marked for replay (it exists in the queue and I can 
see the provenance of it getting replayed).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (NIFI-1942) Create a processor to validate CSV against a user-supplied schema

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-1942?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381508#comment-15381508
 ] 

ASF GitHub Bot commented on NIFI-1942:
--

Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/476
  
I'm trying to use a couple of the cell processors listed in the accompanying 
doc you link to [1], but some of them don't seem to be available (notably 
"IsIncludedIn" [2]). I was trying to make sure a column had a value in a fixed 
set of strings ("male" or "female").

Is there a reason for not including all of the available cell processors?

[1] http://super-csv.github.io/super-csv/cell_processors.html
[2] 
http://super-csv.github.io/super-csv/apidocs/org/supercsv/cellprocessor/constraint/IsIncludedIn.html
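
For context, a minimal standalone Super CSV example of the IsIncludedIn cell processor referenced in [2]; the CSV content and column layout here are made up for illustration:

    import java.io.StringReader;
    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.List;

    import org.supercsv.cellprocessor.ParseInt;
    import org.supercsv.cellprocessor.constraint.IsIncludedIn;
    import org.supercsv.cellprocessor.constraint.NotNull;
    import org.supercsv.cellprocessor.ift.CellProcessor;
    import org.supercsv.io.CsvListReader;
    import org.supercsv.prefs.CsvPreference;

    public class IsIncludedInExample {
        public static void main(String[] args) throws Exception {
            // Made-up rows: name, gender, age
            String csv = "alice,female,30\nbob,male,42\n";

            // One cell processor per column; IsIncludedIn throws a
            // SuperCsvConstraintViolationException for values outside the set.
            CellProcessor[] processors = new CellProcessor[] {
                    new NotNull(),
                    new IsIncludedIn(new HashSet<Object>(Arrays.asList("male", "female"))),
                    new ParseInt()
            };

            try (CsvListReader reader = new CsvListReader(
                    new StringReader(csv), CsvPreference.STANDARD_PREFERENCE)) {
                List<Object> row;
                while ((row = reader.read(processors)) != null) {
                    System.out.println(row);
                }
            }
        }
    }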


> Create a processor to validate CSV against a user-supplied schema
> -
>
> Key: NIFI-1942
> URL: https://issues.apache.org/jira/browse/NIFI-1942
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Minor
> Fix For: 1.0.0
>
> Attachments: ValidateCSV.xml
>
>
> In order to extend the set of "quality control" processors, it would be 
> interesting to have a processor validating CSV-formatted flow files against a 
> user-specified schema.
> A flow file that validates against the schema would be routed to the "valid" 
> relationship, while a flow file that does not validate would be routed to the 
> "invalid" relationship.
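
As a rough sketch of the valid/invalid routing idea (plain Java, with a trivial stand-in check instead of the user-supplied cell-processor schema; not the submitted processor's code):

    import java.util.Arrays;
    import java.util.List;

    // Sketch only: each "flow file" (here just a CSV string) is routed to
    // "valid" when every row has the expected number of non-empty cells,
    // otherwise to "invalid". The real processor applies the user-supplied
    // cell-processor schema instead of this trivial check.
    public class RoutingSketch {

        static String route(String csvContent, int expectedColumns) {
            for (String line : csvContent.split("\n")) {
                String[] cells = line.split(",", -1);
                if (cells.length != expectedColumns
                        || Arrays.stream(cells).anyMatch(String::isEmpty)) {
                    return "invalid";
                }
            }
            return "valid";
        }

        public static void main(String[] args) {
            List<String> flowFiles = Arrays.asList("a,b,c\nd,e,f", "a,b\nc");
            flowFiles.forEach(ff -> System.out.println(route(ff, 3))); // valid, invalid
        }
    }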



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi issue #476: NIFI-1942 Processor to validate CSV against user-supplied s...

2016-07-17 Thread JPercivall
Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/476
  
I'm trying to use a couple of the cell processors listed in the accompanying 
doc you link to [1], but some of them don't seem to be available (notably 
"IsIncludedIn" [2]). I was trying to make sure a column had a value in a fixed 
set of strings ("male" or "female").

Is there a reason for not including all of the available cell processors?

[1] http://super-csv.github.io/super-csv/cell_processors.html
[2] 
http://super-csv.github.io/super-csv/apidocs/org/supercsv/cellprocessor/constraint/IsIncludedIn.html


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-1942) Create a processor to validate CSV against a user-supplied schema

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-1942?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381503#comment-15381503
 ] 

ASF GitHub Bot commented on NIFI-1942:
--

Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/476#discussion_r71088596
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ValidateCsv.java
 ---
@@ -0,0 +1,433 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.supercsv.cellprocessor.Optional;
+import org.supercsv.cellprocessor.ParseBigDecimal;
+import org.supercsv.cellprocessor.ParseBool;
+import org.supercsv.cellprocessor.ParseChar;
+import org.supercsv.cellprocessor.ParseDate;
+import org.supercsv.cellprocessor.ParseDouble;
+import org.supercsv.cellprocessor.ParseInt;
+import org.supercsv.cellprocessor.ParseLong;
+import org.supercsv.cellprocessor.constraint.DMinMax;
+import org.supercsv.cellprocessor.constraint.Equals;
+import org.supercsv.cellprocessor.constraint.ForbidSubStr;
+import org.supercsv.cellprocessor.constraint.LMinMax;
+import org.supercsv.cellprocessor.constraint.NotNull;
+import org.supercsv.cellprocessor.constraint.RequireHashCode;
+import org.supercsv.cellprocessor.constraint.RequireSubStr;
+import org.supercsv.cellprocessor.constraint.StrMinMax;
+import org.supercsv.cellprocessor.constraint.StrNotNullOrEmpty;
+import org.supercsv.cellprocessor.constraint.StrRegEx;
+import org.supercsv.cellprocessor.constraint.Strlen;
+import org.supercsv.cellprocessor.constraint.Unique;
+import org.supercsv.cellprocessor.constraint.UniqueHashCode;
+import org.supercsv.cellprocessor.ift.CellProcessor;
+import org.supercsv.exception.SuperCsvCellProcessorException;
+import org.supercsv.io.CsvListReader;
+import org.supercsv.prefs.CsvPreference;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"csv", "schema", "validation"})
+@CapabilityDescription("Validates the contents of FlowFiles against a 
user-specified CSV schema. " +
+"Take a look at the additional documentation of this processor for 
some schema examples.")
+public class ValidateCsv extends AbstractProcessor {
+
+private final static List allowedOperators = 

[GitHub] nifi pull request #476: NIFI-1942 Processor to validate CSV against user-sup...

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/476#discussion_r71088596
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ValidateCsv.java
 ---
@@ -0,0 +1,433 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.supercsv.cellprocessor.Optional;
+import org.supercsv.cellprocessor.ParseBigDecimal;
+import org.supercsv.cellprocessor.ParseBool;
+import org.supercsv.cellprocessor.ParseChar;
+import org.supercsv.cellprocessor.ParseDate;
+import org.supercsv.cellprocessor.ParseDouble;
+import org.supercsv.cellprocessor.ParseInt;
+import org.supercsv.cellprocessor.ParseLong;
+import org.supercsv.cellprocessor.constraint.DMinMax;
+import org.supercsv.cellprocessor.constraint.Equals;
+import org.supercsv.cellprocessor.constraint.ForbidSubStr;
+import org.supercsv.cellprocessor.constraint.LMinMax;
+import org.supercsv.cellprocessor.constraint.NotNull;
+import org.supercsv.cellprocessor.constraint.RequireHashCode;
+import org.supercsv.cellprocessor.constraint.RequireSubStr;
+import org.supercsv.cellprocessor.constraint.StrMinMax;
+import org.supercsv.cellprocessor.constraint.StrNotNullOrEmpty;
+import org.supercsv.cellprocessor.constraint.StrRegEx;
+import org.supercsv.cellprocessor.constraint.Strlen;
+import org.supercsv.cellprocessor.constraint.Unique;
+import org.supercsv.cellprocessor.constraint.UniqueHashCode;
+import org.supercsv.cellprocessor.ift.CellProcessor;
+import org.supercsv.exception.SuperCsvCellProcessorException;
+import org.supercsv.io.CsvListReader;
+import org.supercsv.prefs.CsvPreference;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"csv", "schema", "validation"})
+@CapabilityDescription("Validates the contents of FlowFiles against a 
user-specified CSV schema. " +
+"Take a look at the additional documentation of this processor for 
some schema examples.")
+public class ValidateCsv extends AbstractProcessor {
+
+private final static List allowedOperators = 
Arrays.asList("ParseBigDecimal", "ParseBool", "ParseChar", "ParseDate",
+"ParseDouble", "ParseInt", "ParseLong", "Optional", "DMinMax", 
"Equals", "ForbidSubStr", "LMinMax", "NotNull", "Null",
+

[GitHub] nifi pull request #476: NIFI-1942 Processor to validate CSV against user-sup...

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/476#discussion_r71088438
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/ValidateCsv.java
 ---
@@ -0,0 +1,433 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.supercsv.cellprocessor.Optional;
+import org.supercsv.cellprocessor.ParseBigDecimal;
+import org.supercsv.cellprocessor.ParseBool;
+import org.supercsv.cellprocessor.ParseChar;
+import org.supercsv.cellprocessor.ParseDate;
+import org.supercsv.cellprocessor.ParseDouble;
+import org.supercsv.cellprocessor.ParseInt;
+import org.supercsv.cellprocessor.ParseLong;
+import org.supercsv.cellprocessor.constraint.DMinMax;
+import org.supercsv.cellprocessor.constraint.Equals;
+import org.supercsv.cellprocessor.constraint.ForbidSubStr;
+import org.supercsv.cellprocessor.constraint.LMinMax;
+import org.supercsv.cellprocessor.constraint.NotNull;
+import org.supercsv.cellprocessor.constraint.RequireHashCode;
+import org.supercsv.cellprocessor.constraint.RequireSubStr;
+import org.supercsv.cellprocessor.constraint.StrMinMax;
+import org.supercsv.cellprocessor.constraint.StrNotNullOrEmpty;
+import org.supercsv.cellprocessor.constraint.StrRegEx;
+import org.supercsv.cellprocessor.constraint.Strlen;
+import org.supercsv.cellprocessor.constraint.Unique;
+import org.supercsv.cellprocessor.constraint.UniqueHashCode;
+import org.supercsv.cellprocessor.ift.CellProcessor;
+import org.supercsv.exception.SuperCsvCellProcessorException;
+import org.supercsv.io.CsvListReader;
+import org.supercsv.prefs.CsvPreference;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"csv", "schema", "validation"})
+@CapabilityDescription("Validates the contents of FlowFiles against a 
user-specified CSV schema. " +
+"Take a look at the additional documentation of this processor for 
some schema examples.")
+public class ValidateCsv extends AbstractProcessor {
+
+private final static List allowedOperators = 
Arrays.asList("ParseBigDecimal", "ParseBool", "ParseChar", "ParseDate",
+"ParseDouble", "ParseInt", "ParseLong", "Optional", "DMinMax", 
"Equals", "ForbidSubStr", "LMinMax", "NotNull", "Null",
+

[jira] [Commented] (NIFI-1942) Create a processor to validate CSV against a user-supplied schema

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-1942?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381482#comment-15381482
 ] 

ASF GitHub Bot commented on NIFI-1942:
--

Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/476#discussion_r71088067
  
--- Diff: nifi-assembly/NOTICE ---
@@ -193,6 +193,8 @@ The following binary components are provided under the 
Apache Software License v
 
   (ASLv2) opencsv (net.sf.opencsv:opencsv:2.3)
 
+  (ASLv2) Super CSV (net.sf.supercsv:super-csv:2.4.0)
--- End diff --

This is an Apache 2.0 licensed import with no NOTICE of its own, so it can be 
used without adding anything to the NOTICE or LICENSE file. This applies to 
both this assembly NOTICE and the nar NOTICE.


> Create a processor to validate CSV against a user-supplied schema
> -
>
> Key: NIFI-1942
> URL: https://issues.apache.org/jira/browse/NIFI-1942
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Minor
> Fix For: 1.0.0
>
> Attachments: ValidateCSV.xml
>
>
> In order to extend the set of "quality control" processors, it would be 
> interesting to have a processor validating CSV-formatted flow files against a 
> user-specified schema.
> A flow file that validates against the schema would be routed to the "valid" 
> relationship, while a flow file that does not validate would be routed to the 
> "invalid" relationship.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi pull request #476: NIFI-1942 Processor to validate CSV against user-sup...

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/476#discussion_r71088067
  
--- Diff: nifi-assembly/NOTICE ---
@@ -193,6 +193,8 @@ The following binary components are provided under the 
Apache Software License v
 
   (ASLv2) opencsv (net.sf.opencsv:opencsv:2.3)
 
+  (ASLv2) Super CSV (net.sf.supercsv:super-csv:2.4.0)
--- End diff --

This is an Apache 2.0 licensed import with no NOTICE of its own, so it can be 
used without adding anything to the NOTICE or LICENSE file. This applies to 
both this assembly NOTICE and the nar NOTICE.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #645: NIFI-2157: Add GenerateTableFetch processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/645#discussion_r71087720
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GenerateTableFetch.java
 ---
@@ -0,0 +1,253 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.DatabaseAdapter;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+
+@TriggerSerially
+@InputRequirement(Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "select", "jdbc", "query", "database", "fetch", "generate"})
+@SeeAlso(QueryDatabaseTable.class)
--- End diff --

That's fair, as long as you have a reason for the choice, I'm fine with 
either.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2157) Add GenerateTableFetch processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381471#comment-15381471
 ] 

ASF GitHub Bot commented on NIFI-2157:
--

Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/645#discussion_r71087671
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GenerateTableFetch.java
 ---
@@ -0,0 +1,253 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.DatabaseAdapter;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+
+@TriggerSerially
+@InputRequirement(Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "select", "jdbc", "query", "database", "fetch", "generate"})
+@SeeAlso(QueryDatabaseTable.class)
--- End diff --

Probably. I wasn't sure whether to be liberal or conservative with the 
SeeAlso annotation.
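
For reference, the more "liberal" option discussed here would simply list the related processors explicitly; a small sketch under that assumption (the final set of classes was not settled in this thread, and the class below exists only to show the annotation usage):

    import org.apache.nifi.annotation.documentation.SeeAlso;
    import org.apache.nifi.annotation.documentation.Tags;
    import org.apache.nifi.processor.AbstractProcessor;
    import org.apache.nifi.processor.ProcessContext;
    import org.apache.nifi.processor.ProcessSession;
    import org.apache.nifi.processor.exception.ProcessException;
    import org.apache.nifi.processors.standard.ExecuteSQL;
    import org.apache.nifi.processors.standard.QueryDatabaseTable;

    // Sketch only: a more "liberal" @SeeAlso that also points at the SQL
    // processors that can consume the generated statements.
    @Tags({"sql", "select", "jdbc", "query", "database", "fetch", "generate"})
    @SeeAlso({QueryDatabaseTable.class, ExecuteSQL.class})
    public class SeeAlsoSketch extends AbstractProcessor {
        @Override
        public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
            // no-op: this class exists only to illustrate the annotations above
        }
    }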


> Add GenerateTableFetch processor
> 
>
> Key: NIFI-2157
> URL: https://issues.apache.org/jira/browse/NIFI-2157
> Project: Apache NiFi
>  Issue Type: Sub-task
>Reporter: Matt Burgess
>Assignee: Matt Burgess
> Fix For: 1.0.0
>
>
> This processor would presumably operate like QueryDatabaseTable, except it 
> will contain a "Partition Size" property, and rather than executing the SQL 
> statement(s) to fetch rows, it would generate flow files containing SQL 
> statements that will select rows from a table. If the partition size is 
> indicated, then the SELECT statements will refer to a range of rows, such 
> that each statement will grab only a portion of the table. If max-value 
> columns are specified, then only rows whose observed values for those columns 
> exceed the current maximum will be fetched (i.e. like QueryDatabaseTable). 
> These flow files (due to NIFI-1973) can be passed to ExecuteSQL processors 
> for the actual fetching of rows, and ExecuteSQL can be distributed across 
> cluster nodes and/or multiple tasks. These features enable distributed 
> incremental fetching of rows from database table(s).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

[GitHub] nifi pull request #645: NIFI-2157: Add GenerateTableFetch processor

2016-07-17 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/645#discussion_r71087671
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GenerateTableFetch.java
 ---
@@ -0,0 +1,253 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.DatabaseAdapter;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+
+@TriggerSerially
+@InputRequirement(Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "select", "jdbc", "query", "database", "fetch", "generate"})
+@SeeAlso(QueryDatabaseTable.class)
--- End diff --

Probably. I wasn't sure whether to be liberal or conservative with the 
SeeAlso annotation.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (NIFI-1942) Create a processor to validate CSV against a user-supplied schema

2016-07-17 Thread Pierre Villard (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-1942?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pierre Villard updated NIFI-1942:
-
Attachment: ValidateCSV.xml

Template to validate the processor.

> Create a processor to validate CSV against a user-supplied schema
> -
>
> Key: NIFI-1942
> URL: https://issues.apache.org/jira/browse/NIFI-1942
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Reporter: Pierre Villard
>Assignee: Pierre Villard
>Priority: Minor
> Fix For: 1.0.0
>
> Attachments: ValidateCSV.xml
>
>
> In order to extend the set of "quality control" processors, it would be 
> interesting to have a processor validating CSV-formatted flow files against a 
> user-specified schema.
> A flow file that validates against the schema would be routed to the "valid" 
> relationship, while a flow file that does not validate would be routed to the 
> "invalid" relationship.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (NIFI-2157) Add GenerateTableFetch processor

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381466#comment-15381466
 ] 

ASF GitHub Bot commented on NIFI-2157:
--

Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/645#discussion_r71087619
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GenerateTableFetch.java
 ---
@@ -0,0 +1,253 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.DatabaseAdapter;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+
+@TriggerSerially
+@InputRequirement(Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "select", "jdbc", "query", "database", "fetch", "generate"})
+@SeeAlso(QueryDatabaseTable.class)
+@CapabilityDescription("Generates SQL select queries that fetch \"pages\" 
of rows from a table. The partition size property, along with the table's row 
count, "
++ "determine the size and number of pages and generated FlowFiles. 
In addition, incremental fetching can be achieved by setting Maximum-Value 
Columns, "
++ "which causes the processor to track the columns' maximum 
values, thus only fetching rows whose columns' values exceed the observed 
maximums. This "
++ "processor is intended to be run on the Primary Node only.")
+@Stateful(scopes = Scope.CLUSTER, description = "After performing a query 
on the specified table, the maximum values for "
++ "the specified column(s) will be retained for use in future 
executions of the query. This allows the Processor "
++ "to fetch only those records that have max values greater than 
the retained values. This can be used for "
++ "incremental fetching, fetching of newly added rows, etc. To 
clear the maximum values, clear the state of the processor "
++ "per the State Management documentation")
+public class GenerateTableFetch extends AbstractDatabaseFetchProcessor {
+
+public static final PropertyDescriptor PARTITION_SIZE = new 
PropertyDescriptor.Builder()
+.name("gen-table-fetch-partition-size")
+
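
The @Stateful description quoted above relies on NiFi's state manager; as a rough sketch of that pattern (hypothetical helper methods, not the pull request's code), retaining per-column maximum values in cluster scope might look like:

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.nifi.components.state.Scope;
    import org.apache.nifi.components.state.StateManager;
    import org.apache.nifi.components.state.StateMap;

    // Sketch only (hypothetical helpers): read the previously observed maximum
    // for a column from cluster-scoped state and store an updated maximum after
    // a query completes. Clearing the processor's state resets the maximums.
    public class MaxValueStateSketch {

        static String getStoredMax(StateManager stateManager, String column) throws IOException {
            StateMap state = stateManager.getState(Scope.CLUSTER);
            return state.get(column); // null if nothing has been stored yet
        }

        static void storeMax(StateManager stateManager, String column, String newMax) throws IOException {
            Map<String, String> updated = new HashMap<>(stateManager.getState(Scope.CLUSTER).toMap());
            updated.put(column, newMax);
            stateManager.setState(updated, Scope.CLUSTER);
        }
    }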

[GitHub] nifi pull request #645: NIFI-2157: Add GenerateTableFetch processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/645#discussion_r71087628
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GenerateTableFetch.java
 ---
@@ -0,0 +1,253 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.behavior.TriggerSerially;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.components.state.StateMap;
+import org.apache.nifi.dbcp.DBCPService;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessSessionFactory;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.standard.db.DatabaseAdapter;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.ResultSet;
+import java.sql.ResultSetMetaData;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+
+@TriggerSerially
+@InputRequirement(Requirement.INPUT_FORBIDDEN)
+@Tags({"sql", "select", "jdbc", "query", "database", "fetch", "generate"})
+@SeeAlso(QueryDatabaseTable.class)
--- End diff --

Should the other SQL processors (especially the ones that could accept this 
as input) be added?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #476: NIFI-1942 Processor to validate CSV against user-supplied s...

2016-07-17 Thread pvillard31
Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/476
  
Hey @JPercivall, thanks for the review! I believe I addressed all of your 
comments. I also added the new dependency to the NOTICE files (let me know if 
that is OK). To validate the processor, I used a simple flow that generates CSV 
data with a ReplaceText processor; I'm going to add the template to the JIRA 
right now.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #664: NIFI-2280 marked methods as deprecated which will be...

2016-07-17 Thread joewitt
Github user joewitt closed the pull request at:

https://github.com/apache/nifi/pull/664


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2280) Annotate methods as deprecated on 0.x which aren't needed on 1.x

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381451#comment-15381451
 ] 

ASF GitHub Bot commented on NIFI-2280:
--

Github user joewitt closed the pull request at:

https://github.com/apache/nifi/pull/664


> Annotate methods as deprecated on 0.x which aren't needed on 1.x
> 
>
> Key: NIFI-2280
> URL: https://issues.apache.org/jira/browse/NIFI-2280
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Joseph Witt
>Assignee: Joseph Witt
> Fix For: 0.8.0
>
>
> During the deprecated method removal/cleanup on 1.0 a few methods became no 
> longer necessary that weren't deprecated on 0.x.
> So this ticket is to mark them as deprecated on 0.x.
> They are:
> - Encrypt Process interface method for getCipher can be Deprecated. Was 
> deprecated in a couple impls.  Discussion on NIFI-1157
> - AbstractControllerService 'onConfigurationChange' is no longer needed.  Was 
> an empty impl and invoked by annotation.  Deprecate.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (NIFI-2294) Scroll bar exists in the Settings tab of the Configure Connection window

2016-07-17 Thread Andrew Lim (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2294?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Lim updated NIFI-2294:
-
Attachment: NIFI-2294_scrollBar.png

> Scroll bar exists in the Settings tab of the Configure Connection window
> 
>
> Key: NIFI-2294
> URL: https://issues.apache.org/jira/browse/NIFI-2294
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Affects Versions: 1.0.0
> Environment: Mac OS X, Chrome
>Reporter: Andrew Lim
>Priority: Minor
>  Labels: UI
> Attachments: NIFI-2294_scrollBar.png
>
>
> The existence of a scroll bar in this window is inconsistent with the rest of 
> the application and also doesn't seem necessary.
> Screenshot attached.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi pull request #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/502#discussion_r71086795
  
--- Diff: 
nifi-nar-bundles/nifi-ignite-bundle/nifi-ignite-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,26 @@
+nifi-ignite-nar
--- End diff --

Any NOTICE information in this NOTICE file that is not in the nifi-assembly 
NOTICE should be added to the assembly NOTICE as well.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/502#discussion_r71086770
  
--- Diff: 
nifi-nar-bundles/nifi-ignite-bundle/nifi-ignite-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,26 @@
+nifi-ignite-nar
+Copyright 2016 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
+
+===
+Apache Software License v2
+===
+
+The following binary components are provided under the Apache Software 
License v2
+
+  (ASLv2) Apache Ignite
--- End diff --

This NOTICE should match this one: 
https://github.com/apache/ignite/blob/1.6.0/NOTICE


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/502#discussion_r71086749
  
--- Diff: pom.xml ---
@@ -227,6 +227,29 @@ language governing permissions and limitations under 
the License. -->
 <version>1.5.3-M1</version>
 </dependency>
 <dependency>
+<groupId>org.apache.ignite</groupId>
+<artifactId>ignite-core</artifactId>
+<version>1.6.0</version>
+</dependency>
+<dependency>
+<groupId>org.apache.ignite</groupId>
+<artifactId>ignite-spring</artifactId>
+<version>1.6.0</version>
+</dependency>
+<dependency>
+<groupId>org.apached.ignite</groupId>
+<artifactId>ignite-core</artifactId>
+<version>1.6.0</version>
+<type>test-jar</type>
+<scope>test</scope>
--- End diff --

I'd suggest against providing the scope here and instead putting it in the 
pom where the dependency is used. Adding the test scope here will propagate to 
any module that brings in this dependency (even transitively).

This comment also applies to ignite-log4j2.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread JPercivall
Github user JPercivall commented on a diff in the pull request:

https://github.com/apache/nifi/pull/502#discussion_r71086694
  
--- Diff: 
nifi-nar-bundles/nifi-ignite-bundle/nifi-ignite-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,26 @@
+nifi-ignite-nar
+Copyright 2016 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
+
+===
+Apache Software License v2
+===
+
+The following binary components are provided under the Apache Software 
License v2
+
+  (ASLv2) Apache Ignite
+The following NOTICE information applies:
+  Apache Ignite
+  Copyright 2015 The Apache Software Foundation
+
+  (ASLv2) Apache Commons IO
+The following NOTICE information applies:
+  Apache Commons IO
+  Copyright 2002-2012 The Apache Software Foundation
+
+  (ASLv2) Guava
--- End diff --

Since this is only a test-scoped dependency and is not bundled into the nar, 
it does not need to be a part of the NOTICE.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Resolved] (NIFI-2280) Annotate methods as deprecated on 0.x which aren't needed on 1.x

2016-07-17 Thread Joseph Percivall (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joseph Percivall resolved NIFI-2280.

Resolution: Fixed

> Annotate methods as deprecated on 0.x which aren't needed on 1.x
> 
>
> Key: NIFI-2280
> URL: https://issues.apache.org/jira/browse/NIFI-2280
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Joseph Witt
>Assignee: Joseph Witt
> Fix For: 0.8.0
>
>
> During the deprecated method removal/cleanup on 1.0 a few methods became no 
> longer necessary that weren't deprecated on 0.x.
> So this ticket is to mark them as deprecated on 0.x.
> They are:
> - Encrypt Process interface method for getCipher can be Deprecated. Was 
> deprecated in a couple impls.  Discussion on NIFI-1157
> - AbstractControllerService 'onConfigurationChange' is no longer needed.  Was 
> an empty impl and invoked by annotation.  Deprecate.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (NIFI-2280) Annotate methods as deprecated on 0.x which aren't needed on 1.x

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381442#comment-15381442
 ] 

ASF GitHub Bot commented on NIFI-2280:
--

Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/664
  
@joewitt just pushed out the change but since it's only on the 0.x branch 
you'll need to close this PR manually.

https://git1-us-west.apache.org/repos/asf?p=nifi.git;a=commit;h=537b3d0d


> Annotate methods as deprecated on 0.x which aren't needed on 1.x
> 
>
> Key: NIFI-2280
> URL: https://issues.apache.org/jira/browse/NIFI-2280
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Joseph Witt
>Assignee: Joseph Witt
> Fix For: 0.8.0
>
>
> During the deprecated method removal/cleanup on 1.0 a few methods became no 
> longer necessary that weren't deprecated on 0.x.
> So this ticket is to mark them as deprecated on 0.x.
> They are:
> - Encrypt Process interface method for getCipher can be Deprecated. Was 
> deprecated in a couple impls.  Discussion on NIFI-1157
> - AbstractControllerService 'onConfigurationChange' is no longer needed.  Was 
> an empty impl and invoked by annotation.  Deprecate.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi issue #664: NIFI-2280 marked methods as deprecated which will be remove...

2016-07-17 Thread JPercivall
Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/664
  
@joewitt just pushed out the change but since it's only on the 0.x branch 
you'll need to close this PR manually.

https://git1-us-west.apache.org/repos/asf?p=nifi.git;a=commit;h=537b3d0d


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-1157) Remove deprecated classes and methods

2016-07-17 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-1157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381441#comment-15381441
 ] 

ASF subversion and git services commented on NIFI-1157:
---

Commit 537b3d0d73f95e07e3c69cedc374cb3cd19cd21c in nifi's branch refs/heads/0.x 
from [~joewitt]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=537b3d0 ]

NIFI-2280 marked methods as deprecated as appropriate to the cleanup 
found in NIFI-1157

This closes #664

Signed-off-by: jpercivall 


> Remove deprecated classes and methods
> -
>
> Key: NIFI-1157
> URL: https://issues.apache.org/jira/browse/NIFI-1157
> Project: Apache NiFi
>  Issue Type: Task
>Reporter: Tony Kurc
>Assignee: Joseph Witt
>Priority: Minor
>  Labels: migration
> Fix For: 1.0.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (NIFI-2280) Annotate methods as deprecated on 0.x which aren't needed on 1.x

2016-07-17 Thread Joseph Witt (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381439#comment-15381439
 ] 

Joseph Witt commented on NIFI-2280:
---

ah nice catch and thanks for taking care of it!

> Annotate methods as deprecated on 0.x which aren't needed on 1.x
> 
>
> Key: NIFI-2280
> URL: https://issues.apache.org/jira/browse/NIFI-2280
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Joseph Witt
>Assignee: Joseph Witt
> Fix For: 0.8.0
>
>
> During the deprecated method removal/cleanup on 1.0 a few methods became no 
> longer necessary that weren't deprecated on 0.x.
> So this ticket is to mark them as deprecated on 0.x.
> They are:
> - Encrypt Process interface method for getCipher can be Deprecated. Was 
> deprecated in a couple impls.  Discussion on NIFI-1157
> - AbstractControllerService 'onConfigurationChange' is no longer needed.  Was 
> an empty impl and invoked by annotation.  Deprecate.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (NIFI-2280) Annotate methods as deprecated on 0.x which aren't needed on 1.x

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381430#comment-15381430
 ] 

ASF GitHub Bot commented on NIFI-2280:
--

Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/664
  
Looks good overall; a couple of the deprecated markings are missing a 
corresponding "@deprecated use other getCipher requiring a salt" Javadoc tag. 
I'll just add them for you since everything else looks good.
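
For context, the kind of marking being described pairs the @Deprecated annotation with an explanatory @deprecated Javadoc tag; a small sketch with a hypothetical interface (not the actual NiFi Encryptor code):

    import javax.crypto.Cipher;

    // Sketch only, on a hypothetical interface: the @Deprecated annotation is
    // paired with an @deprecated Javadoc tag that points callers at the
    // replacement method, which requires a salt.
    public interface CipherProviderSketch {

        /**
         * @deprecated use {@link #getCipher(String, byte[])}, which requires a salt.
         */
        @Deprecated
        Cipher getCipher(String password) throws Exception;

        Cipher getCipher(String password, byte[] salt) throws Exception;
    }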


> Annotate methods as deprecated on 0.x which aren't needed on 1.x
> 
>
> Key: NIFI-2280
> URL: https://issues.apache.org/jira/browse/NIFI-2280
> Project: Apache NiFi
>  Issue Type: Bug
>Reporter: Joseph Witt
>Assignee: Joseph Witt
> Fix For: 0.8.0
>
>
> During the deprecated method removal/cleanup on 1.0 a few methods became no 
> longer necessary that weren't deprecated on 0.x.
> So this ticket is to mark them as deprecated on 0.x.
> They are:
> - Encrypt Process interface method for getCipher can be Deprecated. Was 
> deprecated in a couple impls.  Discussion on NIFI-1157
> - AbstractControllerService 'onConfigurationChange' is no longer needed.  Was 
> an empty impl and invoked by annotation.  Deprecate.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[GitHub] nifi issue #664: NIFI-2280 marked methods as deprecated which will be remove...

2016-07-17 Thread JPercivall
Github user JPercivall commented on the issue:

https://github.com/apache/nifi/pull/664
  
Looks good overall; a couple of the deprecated markings are missing a 
corresponding "@deprecated use other getCipher requiring a salt" Javadoc tag. 
I'll just add them for you since everything else looks good.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2110) Queue Position doesn't show up in List Queue Details dialog

2016-07-17 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381401#comment-15381401
 ] 

Matt Gilman commented on NIFI-2110:
---

[~markap14] The queue position isn't showing up because this dialog queries for 
the FlowFileRecord based on the FlowFileSummary in the table, and a 
FlowFileRecord does not contain its position. A couple of options:

1) Remove the queue position from the dialog.
2) Get the position for the FlowFileRecord so it can be returned to the client 
for the dialog.

I could possibly grab the value from the table, but it may not be accurate by 
the time the dialog is opened. Because of this, I would probably vote for 
option 1. Thoughts?
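
As a rough sketch of what option 2 could look like (hypothetical classes, not 
the actual NiFi FlowFileSummary/FlowFileRecord API): capture the position once 
when the listing is generated and carry it on the entry that backs the dialog, 
accepting that it is only as fresh as that snapshot.

{code:java}
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of option 2 -- not the real NiFi API.
class QueuedFlowFile {
    final String uuid;

    QueuedFlowFile(String uuid) {
        this.uuid = uuid;
    }
}

class FlowFileListingEntry {
    final String uuid;
    final int position; // captured once, when the listing snapshot is taken

    FlowFileListingEntry(String uuid, int position) {
        this.uuid = uuid;
        this.position = position;
    }
}

class QueueListing {
    // Snapshot the queue, recording each flowfile's 1-based position so the
    // details dialog can display it without re-querying the underlying record.
    static List<FlowFileListingEntry> snapshot(List<QueuedFlowFile> queue) {
        List<FlowFileListingEntry> entries = new ArrayList<>();
        for (int i = 0; i < queue.size(); i++) {
            entries.add(new FlowFileListingEntry(queue.get(i).uuid, i + 1));
        }
        return entries;
    }
}
{code}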

> Queue Position doesn't show up in List Queue Details dialog
> ---
>
> Key: NIFI-2110
> URL: https://issues.apache.org/jira/browse/NIFI-2110
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Matt Burgess
> Attachments: no-queue-position.png
>
>
> After bringing up the List Queue dialog, if you select an entry and click the 
> View Details button, on the Details tab there is a Queue Position field but 
> it always reads "no value set"





[jira] [Comment Edited] (NIFI-2110) Queue Position doesn't show up in List Queue Details dialog

2016-07-17 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381401#comment-15381401
 ] 

Matt Gilman edited comment on NIFI-2110 at 7/17/16 3:54 PM:


[~markap14] The queue position isn't showing up because this dialog queries for 
the FlowFileRecord based on the FlowFileSummary in the table, and a 
FlowFileRecord does not contain its position. A couple of options:

1) Remove the queue position from the dialog.
2) Get the position for the FlowFileRecord so it can be returned to the client 
for the dialog.

I could possibly grab the value from the table, but it may not be accurate by 
the time the dialog is opened. Because of this, I would probably vote for 
option 1. Thoughts?


was (Author: mcgilman):
[~markap14] Queue listing isn't showing up because this dialog queries for the 
FlowFileRecord based on the FlowFileSummary in the table, and a FlowFileRecord 
does not contain its position. A couple of options:

1) Remove the queue position from the dialog.
2) Get the position for the FlowFileRecord so it can be returned to the client 
for the dialog.

I could possibly grab the value from the table, but it may not be accurate by 
the time the dialog is opened. Because of this, I would probably vote for 
option 1. Thoughts?

> Queue Position doesn't show up in List Queue Details dialog
> ---
>
> Key: NIFI-2110
> URL: https://issues.apache.org/jira/browse/NIFI-2110
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Matt Burgess
> Attachments: no-queue-position.png
>
>
> After bringing up the List Queue dialog, if you select an entry and click the 
> View Details button, on the Details tab there is a Queue Position field but 
> it always reads "no value set"





[jira] [Assigned] (NIFI-2096) Can't upload template using Firefox

2016-07-17 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman reassigned NIFI-2096:
-

Assignee: Matt Gilman

> Can't upload template using Firefox
> ---
>
> Key: NIFI-2096
> URL: https://issues.apache.org/jira/browse/NIFI-2096
> Project: Apache NiFi
>  Issue Type: Sub-task
>  Components: Core UI
>Reporter: Rob Moran
>Assignee: Matt Gilman
>Priority: Blocker
>
> Tried in FF 40.0.2 and 47.
> From the Templates shell, clicking the + icon does not trigger the system 
> prompt to choose a file.





[jira] [Commented] (NIFI-2282) Purge history not working

2016-07-17 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2282?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381386#comment-15381386
 ] 

Matt Gilman commented on NIFI-2282:
---

[~rmoran] Can you please try this again? I was unable to replicate it. Make 
sure the end date of your purge request is correct: the default value is about 
one month ago, so if you don't change the end date it will only remove history 
older than one month. Let me know if you're still seeing issues.
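
To make the end-date semantics concrete, here is a small illustrative sketch of 
the behaviour described above (hypothetical types, not NiFi's actual audit 
service): the end date defaults to roughly one month back, and only actions 
older than that date are purged.

{code:java}
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical illustration of the purge semantics -- not NiFi's audit code.
class ConfigAction {
    final String description;
    final Instant timestamp;

    ConfigAction(String description, Instant timestamp) {
        this.description = description;
        this.timestamp = timestamp;
    }
}

class HistoryPurgeExample {
    // Default end date: about one month before now.
    static Instant defaultEndDate() {
        return Instant.now().minus(30, ChronoUnit.DAYS);
    }

    // Purging removes actions older than the end date; newer actions are kept,
    // which is why recent entries still appear in the table after a purge.
    static List<ConfigAction> purge(List<ConfigAction> history, Instant endDate) {
        return history.stream()
                .filter(action -> !action.timestamp.isBefore(endDate))
                .collect(Collectors.toList());
    }
}
{code}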

> Purge history not working
> -
>
> Key: NIFI-2282
> URL: https://issues.apache.org/jira/browse/NIFI-2282
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.0.0
>Reporter: Rob Moran
>
> Tried to purge config history from the Flow Configuration History shell. The 
> purge operation is captured and displayed in the table, but previous 
> operations are not removed.





[GitHub] nifi issue #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread mans2singh
Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/502
  
@pvillard31 - The build is passing locally for me, so I am not sure what the 
issue with the Travis build could be.




[jira] [Updated] (NIFI-2293) Add controls for accessing each Node's History

2016-07-17 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2293?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-2293:
--
Issue Type: Improvement  (was: Bug)

> Add controls for accessing each Node's History
> --
>
> Key: NIFI-2293
> URL: https://issues.apache.org/jira/browse/NIFI-2293
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Matt Gilman
> Fix For: 1.1.0
>
>
> Each node in a cluster tracks and maintains its own history. A given node's 
> events should be accessible from any node in the cluster. As part of 
> NIFI-2263, the user must go to each node directly in the browser to access 
> those events.





[jira] [Updated] (NIFI-2263) Update History to work in a Cluster

2016-07-17 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2263?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-2263:
--
Description: Each node in a cluster records its own independent history of 
configuration actions. Based on how these actions are audited, it's not 
possible to guarantee the same identifiers for each action on each node. As a 
result, we will simply return the history for the node that is currently loaded 
in the browser.  (was: Each node in a cluster records its own independent 
history of configuration actions. Based on how these actions are audited it's 
not possible to guarantee the same )

> Update History to work in a Cluster
> ---
>
> Key: NIFI-2263
> URL: https://issues.apache.org/jira/browse/NIFI-2263
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework, Core UI
>Reporter: Matt Gilman
>Assignee: Matt Gilman
>Priority: Blocker
> Fix For: 1.0.0
>
>
> Each node in a cluster records its own independent history of configuration 
> actions. Based on how these actions are audited, it's not possible to 
> guarantee the same identifiers for each action on each node. As a result, we 
> will simply return the history for the node that is currently loaded in the 
> browser.





[jira] [Updated] (NIFI-2263) Update History to work in a Cluster

2016-07-17 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-2263?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-2263:
--
Description: Each node in a cluster records its own independent history of 
configuration actions. Based on how these actions are audited, it's not 
possible to guarantee the same 

> Update History to work in a Cluster
> ---
>
> Key: NIFI-2263
> URL: https://issues.apache.org/jira/browse/NIFI-2263
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework, Core UI
>Reporter: Matt Gilman
>Assignee: Matt Gilman
>Priority: Blocker
> Fix For: 1.0.0
>
>
> Each node in a cluster records its own independent history of configuration 
> actions. Based on how these actions are audited, it's not possible to 
> guarantee the same 





[jira] [Commented] (NIFI-2026) nifi-hadoop-libraries-nar should use profiles to point to different hadoop distro artifacts

2016-07-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2026?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15381353#comment-15381353
 ] 

ASF GitHub Bot commented on NIFI-2026:
--

Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/475
  
@mattyb149 - PR updated to reflect the no-profile / just-repos `pom.xml`

Users should be able to build NiFi against non-Apache artifacts by 
specifying the appropriate versions (e.g. `-Dhadoop.version=2.0.0-mr1-cdh4.2.0`).


> nifi-hadoop-libraries-nar should use profiles to point to different hadoop 
> distro artifacts
> ---
>
> Key: NIFI-2026
> URL: https://issues.apache.org/jira/browse/NIFI-2026
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Andre
>
> Raising a JIRA issue as discussed with [~mattyb149] as part of PR-475. 
> Users running particular Hadoop versions may struggle to use the *HDFS 
> processors against a cluster running proprietary or unusual versions of HDFS.
> Therefore, until we find a cleaner bring-your-own-Hadoop mechanism (as 
> suggested in NIFI-710), we should consider introducing Maven profiles to 
> support different Hadoop libraries, enabling users to compile against them.
> This should cause no changes to default behaviour, just eliminate the need to 
> clone, modify, build and copy NAR bundles over the standard NiFi artifacts.
> Unless a profile is explicitly requested, the build will still include just 
> the Apache-licensed artifacts.





[GitHub] nifi issue #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-07-17 Thread mans2singh
Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/502
  
@pvillard31 - I am looking into the build failures.

