[jira] [Commented] (NIFI-5519) Allow ListDatabaseTables to accept incoming connections

2018-09-11 Thread Colin Dean (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611551#comment-16611551
 ] 

Colin Dean commented on NIFI-5519:
--

I'm having a lot of problems getting the above to work with a remote database 
server. It seems to work just fine with SQLite, but when accessing MSSQL 2012 
through jTDS it has issues:


{code}
2018-09-11 22:31:27,499 ERROR [Timer-Driven Process Thread-48] o.a.n.p.groovyx.ExecuteGroovyScript ExecuteGroovyScript[id=2d9c363f-af36-15a2-a399-46b2fd0f66d4] Scripting error: org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=d8bc0870-68b3-4ec1-a8e9-cab176c45211,claim=,offset=0,name=demowarehouse.json,size=0] is already marked for transfer
org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=d8bc0870-68b3-4ec1-a8e9-cab176c45211,claim=,offset=0,name=demowarehouse.json,size=0] is already marked for transfer
at 
org.apache.nifi.controller.repository.StandardProcessSession.validateRecordState(StandardProcessSession.java:3136)
at 
org.apache.nifi.controller.repository.StandardProcessSession.validateRecordState(StandardProcessSession.java:3118)
at 
org.apache.nifi.controller.repository.StandardProcessSession.putAttribute(StandardProcessSession.java:1759)
at 
org.apache.nifi.processors.groovyx.flow.ProcessSessionWrap.putAttribute(ProcessSessionWrap.java:471)
at 
org.apache.nifi.processors.groovyx.flow.ProcessSessionWrap.putAttribute(ProcessSessionWrap.java:52)
at org.apache.nifi.processor.ProcessSession$putAttribute$0.call(Unknown 
Source)
at Script78c15242.run(Script78c15242.groovy:134)
at 
org.apache.nifi.processors.groovyx.ExecuteGroovyScript.onTrigger(ExecuteGroovyScript.java:449)
at 
org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at 
org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
at 
org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
at 
org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
2018-09-11 22:31:27,502 ERROR [Timer-Driven Process Thread-48] o.a.n.p.groovyx.ExecuteGroovyScript ExecuteGroovyScript[id=2d9c363f-af36-15a2-a399-46b2fd0f66d4] org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=d8bc0870-68b3-4ec1-a8e9-cab176c45211,claim=,offset=0,name=demowarehouse.json,size=0] is not known in this session (StandardProcessSession[id=26460506]): org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=d8bc0870-68b3-4ec1-a8e9-cab176c45211,claim=,offset=0,name=demowarehouse.json,size=0] is not known in this session (StandardProcessSession[id=26460506])
org.apache.nifi.processor.exception.FlowFileHandlingException: StandardFlowFileRecord[uuid=d8bc0870-68b3-4ec1-a8e9-cab176c45211,claim=,offset=0,name=demowarehouse.json,size=0] is not known in this session (StandardProcessSession[id=26460506])
at 
org.apache.nifi.controller.repository.StandardProcessSession.validateRecordState(StandardProcessSession.java:3132)
at 
org.apache.nifi.controller.repository.StandardProcessSession.validateRecordState(StandardProcessSession.java:3118)
at 
org.apache.nifi.controller.repository.StandardProcessSession.putAttribute(StandardProcessSession.java:1759)
at 
org.apache.nifi.processors.groovyx.flow.ProcessSessionWrap.putAttribute(ProcessSessionWrap.java:471)
at 
org.apache.nifi.processors.groovyx.flow.ProcessSessionWrap.putAttribute(ProcessSessionWrap.java:52)
at org.apache.nifi.processor.ProcessSession$putAttribute$0.call(Unknown 
Source)
at Script78c15242.run(Script78c15242.groovy:151)
at 
org.apache.nifi.processors.groovyx.ExecuteGroovyScript.onTrigger(ExecuteGroovyScript.java:449)
at 
org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at 
org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
at 
org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
at 

[jira] [Commented] (NIFI-5566) Bring HashContent inline with HashService and rename legacy components

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5566?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611497#comment-16611497
 ] 

ASF GitHub Bot commented on NIFI-5566:
--

Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2983
  
@thenatog are you seeing behavior that conflicts with the description of 
expected behavior for configuration 1 if you reproduce the test case Otto 
described in his PR above?


> Bring HashContent inline with HashService and rename legacy components
> --
>
> Key: NIFI-5566
> URL: https://issues.apache.org/jira/browse/NIFI-5566
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Major
>  Labels: backwards-compatibility, hash, security
>
> As documented in [NIFI-5147|https://issues.apache.org/jira/browse/NIFI-5147] 
> and [PR 2980|https://github.com/apache/nifi/pull/2980], the {{HashAttribute}} 
> processor and {{HashContent}} processor are lacking some features, do not 
> offer consistent algorithms across platforms, etc. 
> I propose the following:
> * Rename {{HashAttribute}} (which does not provide the service of calculating 
> a hash over one or more attributes) to {{HashAttributeLegacy}}
> * Rename {{CalculateAttributeHash}} to {{HashAttribute}} to make semantic 
> sense
> * Rename {{HashContent}} to {{HashContentLegacy}} for users who need obscure 
> digest algorithms which may or may not have been offered on their platform
> * Implement a processor {{HashContent}} with similar semantics to the 
> existing processor but with consistent algorithm offerings and using the 
> common {{HashService}} offering
> With the new component versioning features provided as part of the flow 
> versioning behavior, silently disrupting existing flows which use these 
> processors is no longer a concern. Rather, any flow currently using the 
> existing processors will either:
> 1. continue normal operation
> 1. require flow manager interaction and provide documentation about the change
>   1. migration notes and upgrade instructions will be provided



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...

2018-09-11 Thread alopresto
Github user alopresto commented on the issue:

https://github.com/apache/nifi/pull/2983
  
@thenatog are you seeing behavior that conflicts with the description of 
expected behavior for configuration 1 if you reproduce the test case Otto 
described in his PR above?


---


[jira] [Created] (NIFI-5588) Unable to set indefinite max wait time on DBCPConnectionPool

2018-09-11 Thread Colin Dean (JIRA)
Colin Dean created NIFI-5588:


 Summary: Unable to set indefinite max wait time on 
DBCPConnectionPool
 Key: NIFI-5588
 URL: https://issues.apache.org/jira/browse/NIFI-5588
 Project: Apache NiFi
  Issue Type: Bug
  Components: Core Framework
Affects Versions: 1.7.1
 Environment: macOS, Java 8
Reporter: Colin Dean


The DBCPConnectionPool controller service accepts a "Max Wait Time" that 
configures 

bq. The maximum amount of time that the pool will wait (when there are no 
available connections) for a connection to be returned before failing, or -1 to 
wait indefinitely. 

This value must validate as a time period. *There is no valid way to set 
{{-1}}* with the current validator.

The validator [in 
use|https://github.com/apache/nifi/blob/0274bd4ff3f4199838ff1307c9c01d98fcc9150b/nifi-nar-bundles/nifi-standard-services/nifi-dbcp-service-bundle/nifi-dbcp-service/src/main/java/org/apache/nifi/dbcp/DBCPConnectionPool.java#L110]
 is {{StandardValidators.TIME_PERIOD_VALIDATOR}}. The 
[TIME_PERIOD_VALIDATOR|https://github.com/apache/nifi/blob/0274bd4ff3f4199838ff1307c9c01d98fcc9150b/nifi-commons/nifi-utils/src/main/java/org/apache/nifi/processor/util/StandardValidators.java#L443]
 uses [a regex built  in 
FormatUtils|https://github.com/apache/nifi/blob/0274bd4ff3f4199838ff1307c9c01d98fcc9150b/nifi-commons/nifi-utils/src/main/java/org/apache/nifi/util/FormatUtils.java#L44]
 that must have a time unit:

{code:java}
public static final String TIME_DURATION_REGEX = "(\\d+)\\s*(" + 
VALID_TIME_UNITS + ")";
{code}

The regex does not allow for a value such as {{-1}} or {{-1 secs}}, etc.
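
For illustration, a minimal sketch of why a negative value can never pass: the pattern only admits an unsigned integer followed by a unit. (The unit list here is abbreviated and assumed; the real {{VALID_TIME_UNITS}} in FormatUtils is longer.)

{code:java}
import java.util.regex.Pattern;

public class TimeDurationRegexDemo {
    // Abbreviated, assumed unit list for illustration only.
    private static final String VALID_TIME_UNITS = "sec|secs|second|seconds|min|mins|minute|minutes";
    private static final Pattern TIME_DURATION = Pattern.compile("(\\d+)\\s*(" + VALID_TIME_UNITS + ")");

    public static void main(String[] args) {
        System.out.println(TIME_DURATION.matcher("30 secs").matches()); // true
        System.out.println(TIME_DURATION.matcher("-1").matches());      // false: leading '-' and no unit
        System.out.println(TIME_DURATION.matcher("-1 secs").matches()); // false: \d+ rejects the sign
    }
}
{code}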

The obvious workaround is to set that _very_ high.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5586) Add capability to generate ECDSA keys to TLS Toolkit

2018-09-11 Thread Andy LoPresto (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5586?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611436#comment-16611436
 ] 

Andy LoPresto commented on NIFI-5586:
-

If keystores with multiple aliases are supported, this could also be used to 
allow NiFi to have both an RSA and ECDSA key to optimize connections with 
clients depending on their support for various key types. 

See [RSA and ECDSA 
performance|https://securitypitfalls.wordpress.com/2014/10/06/rsa-and-ecdsa-performance/]
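
As a rough illustration of the multi-alias idea, a short sketch that lists the key algorithm behind each alias in a keystore (file name and password are assumed to match the keytool example quoted below):

{code:java}
import java.io.FileInputStream;
import java.security.KeyStore;
import java.util.Collections;

public class ListKeystoreAliases {
    public static void main(String[] args) throws Exception {
        KeyStore keyStore = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("ec-keystore.jks")) {
            keyStore.load(in, "passwordpassword".toCharArray());
        }
        // With both an EC and an RSA entry present, this prints e.g. "ec -> EC" and "rsa -> RSA",
        // which is what would let the server pick a certificate per negotiated cipher suite.
        for (String alias : Collections.list(keyStore.aliases())) {
            String algorithm = keyStore.getCertificate(alias).getPublicKey().getAlgorithm();
            System.out.println(alias + " -> " + algorithm);
        }
    }
}
{code}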

> Add capability to generate ECDSA keys to TLS Toolkit
> 
>
> Key: NIFI-5586
> URL: https://issues.apache.org/jira/browse/NIFI-5586
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Tools and Build
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Priority: Major
>  Labels: cryptography, ecc, ecdsa, security, tls, tls-toolkit
>
> The TLS Toolkit should be able to generate ECDSA keys to enable NiFi to 
> support ECDSA cipher suites. 
> Currently, ECDSA keys can be manually generated using external tools and 
> loaded into a keystore and truststore that are compatible with NiFi. 
> {code}
> keytool -genkeypair -alias ec -keyalg EC -keysize 256 -sigalg SHA256withECDSA 
> -validity 365 -storetype JKS -keystore ec-keystore.jks -storepass 
> passwordpassword
> keytool -export -alias ec -keystore ec-keystore.jks -file ec-public.pem
> keytool -import -alias ec -file ec-public.pem -keystore ec-truststore.jks 
> -storepass passwordpassword
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611103#comment-16611103
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216781672
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,31 @@
+nifi-neo4j-nar
--- End diff --

LGTM for L&N. Matches what I see in the NAR module when I do `mvn 
dependency:tree | grep -i compile`


> Create Neo4J cypher execution processor
> ---
>
> Key: NIFI-5537
> URL: https://issues.apache.org/jira/browse/NIFI-5537
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: graph, neo4j, node, relationship
> Fix For: 1.8.0
>
>
> Create Nifi Neo4J cypher queries processor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611107#comment-16611107
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216783486
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
+.name("neo4j-connection-url")
+.displayName("Neo4j Connection URL")
+.description("Neo4J endpoing to connect to.")
+.required(true)
+.defaultValue("bolt://localhost:7687")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("neo4j-username")
+.displayName("Username")
+.description("Username for accessing Neo4J")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("neo4j-password")
+.displayName("Password")
+.description("Password for Neo4J user")
+.required(true)
+.sensitive(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_ROUND_ROBIN = new 
AllowableValue(LoadBalancingStrategy.ROUND_ROBIN.name(), "Round Robin", "Round 
Robin Strategy");
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_LEAST_CONNECTED = 
new 

[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611108#comment-16611108
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216785691
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/test/java/org/apache/nifi/processors/neo4j/ITNeo4JCyperExecutor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.nio.charset.Charset;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Neo4J Cypher integration tests.  Please set the neo4j url, user and 
password according to your setup.
+ * The steps to setup neo4j are
+ * 
+ *Install Neo4J
+ *  brew install neo4j
+ *   Setup neo4j
+ * neo4j start
+ *Log into cypher shell using default username/password - 
neo4j/neo4j
+ * cypher-shell
+ *Changel password to admin
+ * CALL dbms.changePassword('admin')
+ * Restart neo4j
+ * neo4j restart
+ *Log into cypher shell using new password (admin)
+ * cypher-shell
+ * 
+ */
+public class ITNeo4JCyperExecutor {
+protected TestRunner runner;
+protected Driver driver;
+protected String neo4jUrl = "bolt://localhost:7687";
+protected String user = "neo4j";
+protected String password = "admin";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(Neo4JCypherExecutor.class);
+runner.setProperty(AbstractNeo4JCypherExecutor.CONNECTION_URL, 
neo4jUrl);
+runner.setProperty(AbstractNeo4JCypherExecutor.USERNAME, user);
+runner.setProperty(AbstractNeo4JCypherExecutor.PASSWORD, password);
+runner.setProperty(AbstractNeo4JCypherExecutor.QUERY, "match (n) 
return n");
+driver = GraphDatabase.driver(neo4jUrl, AuthTokens.basic(user, 
password));
+executeSession("match (n) detach delete n");
+
+StatementResult result = executeSession("match (n) return n");
+
+assertEquals("nodes should be equal", 0, result.list().size());
+}
+
+protected StatementResult executeSession(String statement) {
+try (Session session = driver.session()) {
+return session.run(statement);
+}
+}
+
+@After
+public void tearDown() throws Exception {
+runner = null;
+if (driver != null)
--- End diff --

Curly brackets.
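
For reference, a sketch of the braced form being requested; the body of the null check is assumed, since the quoted diff is cut off above:

{code:java}
@After
public void tearDown() throws Exception {
    runner = null;
    if (driver != null) {
        driver.close();
        driver = null;
    }
}
{code}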


> Create Neo4J cypher execution processor
> ---
>
> Key: NIFI-5537
> URL: https://issues.apache.org/jira/browse/NIFI-5537
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: graph, neo4j, node, relationship
> Fix For: 1.8.0
>
>
> Create Nifi Neo4J cypher queries processor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611104#comment-16611104
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216781949
  
--- Diff: nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/pom.xml 
---
@@ -0,0 +1,87 @@
+
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <parent>
+        <groupId>org.apache.nifi</groupId>
+        <artifactId>nifi-neo4j-bundle</artifactId>
+        <version>1.8.0-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>nifi-neo4j-processors</artifactId>
+    <packaging>jar</packaging>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.neo4j.driver</groupId>
+            <artifactId>neo4j-java-driver</artifactId>
+            <version>1.6.2</version>
+            <scope>compile</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.nifi</groupId>
+            <artifactId>nifi-api</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.nifi</groupId>
+            <artifactId>nifi-utils</artifactId>
+            <version>1.8.0-SNAPSHOT</version>
+        </dependency>
+        <dependency>
+            <groupId>commons-io</groupId>
+            <artifactId>commons-io</artifactId>
+            <version>2.6</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.commons</groupId>
+            <artifactId>commons-lang3</artifactId>
+            <version>3.7</version>
+        </dependency>
+        <dependency>
+            <groupId>com.google.code.gson</groupId>
+            <artifactId>gson</artifactId>
+            <version>2.7</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.nifi</groupId>
+            <artifactId>nifi-mock</artifactId>
+            <version>1.8.0-SNAPSHOT</version>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>com.google.guava</groupId>
+            <artifactId>guava</artifactId>
+            <version>18.0</version>
--- End diff --

There's a ticket for upgrading to the latest Guava (v26 or something); 
harmonizing with that would be a good idea.


> Create Neo4J cypher execution processor
> ---
>
> Key: NIFI-5537
> URL: https://issues.apache.org/jira/browse/NIFI-5537
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: graph, neo4j, node, relationship
> Fix For: 1.8.0
>
>
> Create Nifi Neo4J cypher queries processor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=1660#comment-1660
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216784220
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/Neo4JCypherExecutor.java
 ---
@@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.neo4j.driver.v1.summary.ResultSummary;
+import org.neo4j.driver.v1.summary.SummaryCounters;
+
+import com.google.gson.Gson;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"neo4j", "graph", "network", "insert", "update", "delete", "put", 
"get", "node", "relationship", "connection", "executor"})
+@CapabilityDescription("This processor executes a Neo4J Query 
(https://www.neo4j.com/) defined in the 'Neo4j Query' property of the "
++ "FlowFile and writes the result to the FlowFile body in JSON format. 
The processor has been tested with Neo4j version 3.4.5")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ERROR_MESSAGE, description = "Neo4J error message"),
+@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.LABELS_ADDED, 
description = "Number of labels added"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_CREATED, description = "Number of nodes 
created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_DELETED, description = "Number of nodes 
deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.PROPERTIES_SET, description = "Number of properties 
set"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_CREATED, description = "Number of 
relationships created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_DELETED, description = "Number of 
relationships deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ROWS_RETURNED, description = "Number of rows 
returned"),
+})
+public class Neo4JCypherExecutor extends AbstractNeo4JCypherExecutor {
+
+    private static final Set<Relationship> relationships;
+    private static final List<PropertyDescriptor> propertyDescriptors;
+
+static {
+final Set 

[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611106#comment-16611106
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216784900
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/Neo4JCypherExecutor.java
 ---
@@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.neo4j.driver.v1.summary.ResultSummary;
+import org.neo4j.driver.v1.summary.SummaryCounters;
+
+import com.google.gson.Gson;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"neo4j", "graph", "network", "insert", "update", "delete", "put", 
"get", "node", "relationship", "connection", "executor"})
+@CapabilityDescription("This processor executes a Neo4J Query 
(https://www.neo4j.com/) defined in the 'Neo4j Query' property of the "
++ "FlowFile and writes the result to the FlowFile body in JSON format. 
The processor has been tested with Neo4j version 3.4.5")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ERROR_MESSAGE, description = "Neo4J error message"),
+@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.LABELS_ADDED, 
description = "Number of labels added"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_CREATED, description = "Number of nodes 
created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_DELETED, description = "Number of nodes 
deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.PROPERTIES_SET, description = "Number of properties 
set"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_CREATED, description = "Number of 
relationships created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_DELETED, description = "Number of 
relationships deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ROWS_RETURNED, description = "Number of rows 
returned"),
+})
+public class Neo4JCypherExecutor extends AbstractNeo4JCypherExecutor {
+
+    private static final Set<Relationship> relationships;
+    private static final List<PropertyDescriptor> propertyDescriptors;
+
+static {
+final Set 

[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611102#comment-16611102
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216781060
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-nar/src/main/resources/META-INF/LICENSE
 ---
@@ -0,0 +1,202 @@
+
+ Apache License
+   Version 2.0, January 2004
+http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+  "License" shall mean the terms and conditions for use, reproduction,
+  and distribution as defined by Sections 1 through 9 of this document.
+
+  "Licensor" shall mean the copyright owner or entity authorized by
+  the copyright owner that is granting the License.
+
+  "Legal Entity" shall mean the union of the acting entity and all
+  other entities that control, are controlled by, or are under common
+  control with that entity. For the purposes of this definition,
+  "control" means (i) the power, direct or indirect, to cause the
+  direction or management of such entity, whether by contract or
+  otherwise, or (ii) ownership of fifty percent (50%) or more of the
+  outstanding shares, or (iii) beneficial ownership of such entity.
+
+  "You" (or "Your") shall mean an individual or Legal Entity
+  exercising permissions granted by this License.
+
+  "Source" form shall mean the preferred form for making modifications,
+  including but not limited to software source code, documentation
+  source, and configuration files.
+
+  "Object" form shall mean any form resulting from mechanical
+  transformation or translation of a Source form, including but
+  not limited to compiled object code, generated documentation,
+  and conversions to other media types.
+
+  "Work" shall mean the work of authorship, whether in Source or
+  Object form, made available under the License, as indicated by a
+  copyright notice that is included in or attached to the work
+  (an example is provided in the Appendix below).
+
+  "Derivative Works" shall mean any work, whether in Source or Object
+  form, that is based on (or derived from) the Work and for which the
+  editorial revisions, annotations, elaborations, or other 
modifications
+  represent, as a whole, an original work of authorship. For the 
purposes
+  of this License, Derivative Works shall not include works that remain
+  separable from, or merely link (or bind by name) to the interfaces 
of,
+  the Work and Derivative Works thereof.
+
+  "Contribution" shall mean any work of authorship, including
+  the original version of the Work and any modifications or additions
+  to that Work or Derivative Works thereof, that is intentionally
+  submitted to Licensor for inclusion in the Work by the copyright 
owner
+  or by an individual or Legal Entity authorized to submit on behalf of
+  the copyright owner. For the purposes of this definition, "submitted"
+  means any form of electronic, verbal, or written communication sent
+  to the Licensor or its representatives, including but not limited to
+  communication on electronic mailing lists, source code control 
systems,
+  and issue tracking systems that are managed by, or on behalf of, the
+  Licensor for the purpose of discussing and improving the Work, but
+  excluding communication that is conspicuously marked or otherwise
+  designated in writing by the copyright owner as "Not a Contribution."
+
+  "Contributor" shall mean Licensor and any individual or Legal Entity
+  on behalf of whom a Contribution has been received by Licensor and
+  subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+  this License, each Contributor hereby grants to You a perpetual,
+  worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+  copyright license to reproduce, prepare Derivative Works of,
+  publicly display, publicly perform, sublicense, and distribute the
+  Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+  this License, each Contributor hereby grants to You a perpetual,
+  worldwide, non-exclusive, no-charge, 

[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611105#comment-16611105
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216782816
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
--- End diff --

If you can wrap these into a controller service that would act as a 
connection pool, now would be the right time to do that. That's the direction 
I'm taking the Mongo ones, because it makes configuration a lot easier and can 
prevent NiFi from spinning up unnecessary connections/connection pools.
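
A minimal sketch of that shape (names are hypothetical, not part of this PR): the driver lives in a shared controller service and processors just borrow sessions from it.

{code:java}
package org.apache.nifi.neo4j.service;

import org.apache.nifi.controller.ControllerService;
import org.neo4j.driver.v1.Session;

/**
 * Hypothetical controller-service interface: one pooled Driver per service instance,
 * shared by every processor that references the service.
 */
public interface Neo4JConnectionService extends ControllerService {

    /** Returns a session backed by the service's shared Driver. */
    Session getSession();
}
{code}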


> Create Neo4J cypher execution processor
> ---
>
> Key: NIFI-5537
> URL: https://issues.apache.org/jira/browse/NIFI-5537
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: graph, neo4j, node, relationship
> Fix For: 1.8.0
>
>
> Create Nifi Neo4J cypher queries processor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611100#comment-16611100
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216780813
  
--- Diff: nifi-assembly/NOTICE ---
@@ -1135,6 +1135,11 @@ The following binary components are provided under 
the Apache Software License v
 
   Copyright 2010-2012 RethinkDB
 
+(ASLv2) Neo4j Java Driver
--- End diff --

LGTM


> Create Neo4J cypher execution processor
> ---
>
> Key: NIFI-5537
> URL: https://issues.apache.org/jira/browse/NIFI-5537
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: graph, neo4j, node, relationship
> Fix For: 1.8.0
>
>
> Create Nifi Neo4J cypher queries processor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=1662#comment-1662
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216783723
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
+.name("neo4j-connection-url")
+.displayName("Neo4j Connection URL")
+.description("Neo4J endpoing to connect to.")
+.required(true)
+.defaultValue("bolt://localhost:7687")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("neo4j-username")
+.displayName("Username")
+.description("Username for accessing Neo4J")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("neo4j-password")
+.displayName("Password")
+.description("Password for Neo4J user")
+.required(true)
+.sensitive(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_ROUND_ROBIN = new 
AllowableValue(LoadBalancingStrategy.ROUND_ROBIN.name(), "Round Robin", "Round 
Robin Strategy");
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_LEAST_CONNECTED = 
new 

[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=1661#comment-1661
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216783144
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
+.name("neo4j-connection-url")
+.displayName("Neo4j Connection URL")
+.description("Neo4J endpoing to connect to.")
+.required(true)
+.defaultValue("bolt://localhost:7687")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("neo4j-username")
+.displayName("Username")
+.description("Username for accessing Neo4J")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("neo4j-password")
+.displayName("Password")
+.description("Password for Neo4J user")
+.required(true)
+.sensitive(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_ROUND_ROBIN = new 
AllowableValue(LoadBalancingStrategy.ROUND_ROBIN.name(), "Round Robin", "Round 
Robin Strategy");
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_LEAST_CONNECTED = 
new 

[jira] [Commented] (NIFI-5537) Create Neo4J cypher execution processor

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16611109#comment-16611109
 ] 

ASF GitHub Bot commented on NIFI-5537:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216785597
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/test/java/org/apache/nifi/processors/neo4j/ITNeo4JCyperExecutor.java
 ---
@@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.nio.charset.Charset;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.junit.Assert.assertEquals;
+
--- End diff --

I'll see about creating a docker alternative to the instructions below.
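
One possible shape for that, as a sketch assuming the Testcontainers library and the official neo4j image (image tag and password are placeholders):

{code:java}
import org.testcontainers.containers.GenericContainer;

public class Neo4jDockerTestSupport {

    @SuppressWarnings({"rawtypes", "unchecked"})
    public static GenericContainer startNeo4j() {
        GenericContainer neo4j = new GenericContainer("neo4j:3.4.5")
                .withEnv("NEO4J_AUTH", "neo4j/admin") // sets the initial password to "admin"
                .withExposedPorts(7687);              // Bolt port
        neo4j.start();
        // The integration test would then point CONNECTION_URL at the mapped Bolt address:
        String boltUrl = "bolt://" + neo4j.getContainerIpAddress() + ":" + neo4j.getMappedPort(7687);
        System.out.println(boltUrl);
        return neo4j;
    }
}
{code}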


> Create Neo4J cypher execution processor
> ---
>
> Key: NIFI-5537
> URL: https://issues.apache.org/jira/browse/NIFI-5537
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.7.1
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: graph, neo4j, node, relationship
> Fix For: 1.8.0
>
>
> Create Nifi Neo4J cypher queries processor



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216783486
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
+.name("neo4j-connection-url")
+.displayName("Neo4j Connection URL")
+.description("Neo4J endpoing to connect to.")
+.required(true)
+.defaultValue("bolt://localhost:7687")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("neo4j-username")
+.displayName("Username")
+.description("Username for accessing Neo4J")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("neo4j-password")
+.displayName("Password")
+.description("Password for Neo4J user")
+.required(true)
+.sensitive(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_ROUND_ROBIN = new 
AllowableValue(LoadBalancingStrategy.ROUND_ROBIN.name(), "Round Robin", "Round 
Robin Strategy");
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_LEAST_CONNECTED = 
new AllowableValue(LoadBalancingStrategy.LEAST_CONNECTED.name(), "Least 
Connected", "Least Connected Strategy");
+
+protected static final PropertyDescriptor LOAD_BALANCING_STRATEGY = 
new PropertyDescriptor.Builder()
+.name("neo4j-load-balancing-strategy")
+  

[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216785597
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/test/java/org/apache/nifi/processors/neo4j/ITNeo4JCyperExecutor.java
 ---
@@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.nio.charset.Charset;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.junit.Assert.assertEquals;
+
--- End diff --

I'll see about creating a docker alternative to the instructions below.


---


[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216781949
  
--- Diff: nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/pom.xml 
---
@@ -0,0 +1,87 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <parent>
+        <groupId>org.apache.nifi</groupId>
+        <artifactId>nifi-neo4j-bundle</artifactId>
+        <version>1.8.0-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>nifi-neo4j-processors</artifactId>
+    <packaging>jar</packaging>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.neo4j.driver</groupId>
+            <artifactId>neo4j-java-driver</artifactId>
+            <version>1.6.2</version>
+            <scope>compile</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.nifi</groupId>
+            <artifactId>nifi-api</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.nifi</groupId>
+            <artifactId>nifi-utils</artifactId>
+            <version>1.8.0-SNAPSHOT</version>
+        </dependency>
+        <dependency>
+            <groupId>commons-io</groupId>
+            <artifactId>commons-io</artifactId>
+            <version>2.6</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.commons</groupId>
+            <artifactId>commons-lang3</artifactId>
+            <version>3.7</version>
+        </dependency>
+        <dependency>
+            <groupId>com.google.code.gson</groupId>
+            <artifactId>gson</artifactId>
+            <version>2.7</version>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.nifi</groupId>
+            <artifactId>nifi-mock</artifactId>
+            <version>1.8.0-SNAPSHOT</version>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>com.google.guava</groupId>
+            <artifactId>guava</artifactId>
+            <version>18.0</version>
--- End diff --

There's a ticket for upgrading to the latest Guava (v26 or something); 
harmonizing with that would be a good idea.


---


[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216782816
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
--- End diff --

If you can wrap these into a controller service that would act as a 
connection pool, now would be the right time to do that. That's the direction 
I'm taking the Mongo ones, because it makes configuration a lot easier and can 
prevent NiFi from spinning up unnecessary connections/connection pools.
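
For illustration only, a minimal sketch of what such a controller service 
interface could look like, assuming a design similar to the Mongo client 
service; the name Neo4JConnectionService and its single method are 
hypothetical, not part of this PR:

{code}
// Sketch only: a hypothetical controller service that owns the Neo4j Driver so all
// Cypher processors share one connection pool instead of each creating their own.
import org.apache.nifi.controller.ControllerService;
import org.neo4j.driver.v1.Driver;

public interface Neo4JConnectionService extends ControllerService {

    /**
     * Returns the shared, already-configured Driver; the Driver itself is
     * thread-safe and pools connections internally.
     */
    Driver getDriver();
}
{code}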


---


[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216784900
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/Neo4JCypherExecutor.java
 ---
@@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.neo4j.driver.v1.summary.ResultSummary;
+import org.neo4j.driver.v1.summary.SummaryCounters;
+
+import com.google.gson.Gson;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"neo4j", "graph", "network", "insert", "update", "delete", "put", 
"get", "node", "relationship", "connection", "executor"})
+@CapabilityDescription("This processor executes a Neo4J Query 
(https://www.neo4j.com/) defined in the 'Neo4j Query' property of the "
++ "FlowFile and writes the result to the FlowFile body in JSON format. 
The processor has been tested with Neo4j version 3.4.5")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ERROR_MESSAGE, description = "Neo4J error message"),
+@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.LABELS_ADDED, 
description = "Number of labels added"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_CREATED, description = "Number of nodes 
created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_DELETED, description = "Number of nodes 
deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.PROPERTIES_SET, description = "Number of properties 
set"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_CREATED, description = "Number of 
relationships created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_DELETED, description = "Number of 
relationships deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ROWS_RETURNED, description = "Number of rows 
returned"),
+})
+public class Neo4JCypherExecutor extends AbstractNeo4JCypherExecutor {
+
+private static final Set<Relationship> relationships;
+private static final List<PropertyDescriptor> propertyDescriptors;
+
+static {
+final Set<Relationship> tempRelationships = new HashSet<>();
+tempRelationships.add(REL_SUCCESS);
+tempRelationships.add(REL_FAILURE);
+relationships = Collections.unmodifiableSet(tempRelationships);
+
+final 

[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216781060
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-nar/src/main/resources/META-INF/LICENSE
 ---
@@ -0,0 +1,202 @@
+
+ Apache License
+   Version 2.0, January 2004
+http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+  "License" shall mean the terms and conditions for use, reproduction,
+  and distribution as defined by Sections 1 through 9 of this document.
+
+  "Licensor" shall mean the copyright owner or entity authorized by
+  the copyright owner that is granting the License.
+
+  "Legal Entity" shall mean the union of the acting entity and all
+  other entities that control, are controlled by, or are under common
+  control with that entity. For the purposes of this definition,
+  "control" means (i) the power, direct or indirect, to cause the
+  direction or management of such entity, whether by contract or
+  otherwise, or (ii) ownership of fifty percent (50%) or more of the
+  outstanding shares, or (iii) beneficial ownership of such entity.
+
+  "You" (or "Your") shall mean an individual or Legal Entity
+  exercising permissions granted by this License.
+
+  "Source" form shall mean the preferred form for making modifications,
+  including but not limited to software source code, documentation
+  source, and configuration files.
+
+  "Object" form shall mean any form resulting from mechanical
+  transformation or translation of a Source form, including but
+  not limited to compiled object code, generated documentation,
+  and conversions to other media types.
+
+  "Work" shall mean the work of authorship, whether in Source or
+  Object form, made available under the License, as indicated by a
+  copyright notice that is included in or attached to the work
+  (an example is provided in the Appendix below).
+
+  "Derivative Works" shall mean any work, whether in Source or Object
+  form, that is based on (or derived from) the Work and for which the
+  editorial revisions, annotations, elaborations, or other 
modifications
+  represent, as a whole, an original work of authorship. For the 
purposes
+  of this License, Derivative Works shall not include works that remain
+  separable from, or merely link (or bind by name) to the interfaces 
of,
+  the Work and Derivative Works thereof.
+
+  "Contribution" shall mean any work of authorship, including
+  the original version of the Work and any modifications or additions
+  to that Work or Derivative Works thereof, that is intentionally
+  submitted to Licensor for inclusion in the Work by the copyright 
owner
+  or by an individual or Legal Entity authorized to submit on behalf of
+  the copyright owner. For the purposes of this definition, "submitted"
+  means any form of electronic, verbal, or written communication sent
+  to the Licensor or its representatives, including but not limited to
+  communication on electronic mailing lists, source code control 
systems,
+  and issue tracking systems that are managed by, or on behalf of, the
+  Licensor for the purpose of discussing and improving the Work, but
+  excluding communication that is conspicuously marked or otherwise
+  designated in writing by the copyright owner as "Not a Contribution."
+
+  "Contributor" shall mean Licensor and any individual or Legal Entity
+  on behalf of whom a Contribution has been received by Licensor and
+  subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+  this License, each Contributor hereby grants to You a perpetual,
+  worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+  copyright license to reproduce, prepare Derivative Works of,
+  publicly display, publicly perform, sublicense, and distribute the
+  Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+  this License, each Contributor hereby grants to You a perpetual,
+  worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+  (except as stated in this section) patent license to make, have made,
+  use, offer to sell, sell, import, and otherwise transfer the Work,
+  where such license applies only to those patent 

[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216783144
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
+.name("neo4j-connection-url")
+.displayName("Neo4j Connection URL")
+.description("Neo4J endpoint to connect to.")
+.required(true)
+.defaultValue("bolt://localhost:7687")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("neo4j-username")
+.displayName("Username")
+.description("Username for accessing Neo4J")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("neo4j-password")
+.displayName("Password")
+.description("Password for Neo4J user")
+.required(true)
+.sensitive(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_ROUND_ROBIN = new 
AllowableValue(LoadBalancingStrategy.ROUND_ROBIN.name(), "Round Robin", "Round 
Robin Strategy");
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_LEAST_CONNECTED = 
new AllowableValue(LoadBalancingStrategy.LEAST_CONNECTED.name(), "Least 
Connected", "Least Connected Strategy");
+
+protected static final PropertyDescriptor LOAD_BALANCING_STRATEGY = 
new PropertyDescriptor.Builder()
+.name("neo4j-load-balancing-strategy")
+  

[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216785691
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/test/java/org/apache/nifi/processors/neo4j/ITNeo4JCyperExecutor.java
 ---
@@ -0,0 +1,218 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.nio.charset.Charset;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Neo4J Cypher integration tests.  Please set the neo4j url, user and 
password according to your setup.
+ * The steps to setup neo4j are
+ * 
+ *Install Neo4J
+ *  brew install neo4j
+ *   Setup neo4j
+ * neo4j start
+ *Log into cypher shell using default username/password - 
neo4j/neo4j
+ * cypher-shell
+ *Change password to admin
+ * CALL dbms.changePassword('admin')
+ * Restart neo4j
+ * neo4j restart
+ *Log into cypher shell using new password (admin)
+ * cypher-shell
+ * 
+ */
+public class ITNeo4JCyperExecutor {
+protected TestRunner runner;
+protected Driver driver;
+protected String neo4jUrl = "bolt://localhost:7687";
+protected String user = "neo4j";
+protected String password = "admin";
+
+@Before
+public void setUp() throws Exception {
+runner = TestRunners.newTestRunner(Neo4JCypherExecutor.class);
+runner.setProperty(AbstractNeo4JCypherExecutor.CONNECTION_URL, 
neo4jUrl);
+runner.setProperty(AbstractNeo4JCypherExecutor.USERNAME, user);
+runner.setProperty(AbstractNeo4JCypherExecutor.PASSWORD, password);
+runner.setProperty(AbstractNeo4JCypherExecutor.QUERY, "match (n) 
return n");
+driver = GraphDatabase.driver(neo4jUrl, AuthTokens.basic(user, 
password));
+executeSession("match (n) detach delete n");
+
+StatementResult result = executeSession("match (n) return n");
+
+assertEquals("nodes should be equal", 0, result.list().size());
+}
+
+protected StatementResult executeSession(String statement) {
+try (Session session = driver.session()) {
+return session.run(statement);
+}
+}
+
+@After
+public void tearDown() throws Exception {
+runner = null;
+if (driver != null)
--- End diff --

Curly brackets.
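
For clarity, a minimal sketch of the requested style fix, with braces around 
the single-statement null check (the driver field here is only a stand-in for 
the test's Neo4j Driver):

{code}
// Illustrative only: the null check from tearDown(), written with curly brackets.
public class TearDownExample {

    private AutoCloseable driver; // stands in for the Neo4j Driver used by the test

    public void tearDown() throws Exception {
        if (driver != null) {
            driver.close();
        }
    }
}
{code}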


---


[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216780813
  
--- Diff: nifi-assembly/NOTICE ---
@@ -1135,6 +1135,11 @@ The following binary components are provided under 
the Apache Software License v
 
   Copyright 2010-2012 RethinkDB
 
+(ASLv2) Neo4j Java Driver
--- End diff --

LGTM


---


[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216784220
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/Neo4JCypherExecutor.java
 ---
@@ -0,0 +1,203 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.ByteArrayInputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.neo4j.driver.v1.Session;
+import org.neo4j.driver.v1.StatementResult;
+import org.neo4j.driver.v1.summary.ResultSummary;
+import org.neo4j.driver.v1.summary.SummaryCounters;
+
+import com.google.gson.Gson;
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"neo4j", "graph", "network", "insert", "update", "delete", "put", 
"get", "node", "relationship", "connection", "executor"})
+@CapabilityDescription("This processor executes a Neo4J Query 
(https://www.neo4j.com/) defined in the 'Neo4j Query' property of the "
++ "FlowFile and writes the result to the FlowFile body in JSON format. 
The processor has been tested with Neo4j version 3.4.5")
+@WritesAttributes({
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ERROR_MESSAGE, description = "Neo4J error message"),
+@WritesAttribute(attribute = AbstractNeo4JCypherExecutor.LABELS_ADDED, 
description = "Number of labels added"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_CREATED, description = "Number of nodes 
created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.NODES_DELETED, description = "Number of nodes 
deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.PROPERTIES_SET, description = "Number of properties 
set"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_CREATED, description = "Number of 
relationships created"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.RELATIONS_DELETED, description = "Number of 
relationships deleted"),
+@WritesAttribute(attribute = 
AbstractNeo4JCypherExecutor.ROWS_RETURNED, description = "Number of rows 
returned"),
+})
+public class Neo4JCypherExecutor extends AbstractNeo4JCypherExecutor {
+
+private static final Set<Relationship> relationships;
+private static final List<PropertyDescriptor> propertyDescriptors;
+
+static {
+final Set<Relationship> tempRelationships = new HashSet<>();
+tempRelationships.add(REL_SUCCESS);
+tempRelationships.add(REL_FAILURE);
+relationships = Collections.unmodifiableSet(tempRelationships);
+
+final 

[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216783723
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-processors/src/main/java/org/apache/nifi/processors/neo4j/AbstractNeo4JCypherExecutor.java
 ---
@@ -0,0 +1,279 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.neo4j;
+
+import java.io.File;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.neo4j.driver.v1.AuthTokens;
+import org.neo4j.driver.v1.Config;
+import org.neo4j.driver.v1.Config.ConfigBuilder;
+import org.neo4j.driver.v1.Config.LoadBalancingStrategy;
+import org.neo4j.driver.v1.Config.TrustStrategy;
+import org.neo4j.driver.v1.Driver;
+import org.neo4j.driver.v1.GraphDatabase;
+
+/**
+ * Abstract base class for Neo4JCypherExecutor processors
+ */
+abstract class AbstractNeo4JCypherExecutor extends AbstractProcessor {
+
+protected static final PropertyDescriptor QUERY = new 
PropertyDescriptor.Builder()
+.name("neo4J-query")
+.displayName("Neo4J Query")
+.description("Specifies the Neo4j Query.")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CONNECTION_URL = new 
PropertyDescriptor.Builder()
+.name("neo4j-connection-url")
+.displayName("Neo4j Connection URL")
+.description("Neo4J endpoint to connect to.")
+.required(true)
+.defaultValue("bolt://localhost:7687")
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor.Builder()
+.name("neo4j-username")
+.displayName("Username")
+.description("Username for accessing Neo4J")
+.required(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor PASSWORD = new 
PropertyDescriptor.Builder()
+.name("neo4j-password")
+.displayName("Password")
+.description("Password for Neo4J user")
+.required(true)
+.sensitive(true)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_BLANK_VALIDATOR)
+.build();
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_ROUND_ROBIN = new 
AllowableValue(LoadBalancingStrategy.ROUND_ROBIN.name(), "Round Robin", "Round 
Robin Strategy");
+
+public static AllowableValue LOAD_BALANCING_STRATEGY_LEAST_CONNECTED = 
new AllowableValue(LoadBalancingStrategy.LEAST_CONNECTED.name(), "Least 
Connected", "Least Connected Strategy");
+
+protected static final PropertyDescriptor LOAD_BALANCING_STRATEGY = 
new PropertyDescriptor.Builder()
+.name("neo4j-load-balancing-strategy")
+  

[GitHub] nifi pull request #2956: NIFI-5537 Create Neo4J cypher execution processor

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2956#discussion_r216781672
  
--- Diff: 
nifi-nar-bundles/nifi-neo4j-bundle/nifi-neo4j-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,31 @@
+nifi-neo4j-nar
--- End diff --

LGTM for L&N. Matches what I see in the NAR module when I do `mvn 
dependency:tree | grep -i compile`


---


[jira] [Created] (NIFI-5587) Implement HPKP header

2018-09-11 Thread Andy LoPresto (JIRA)
Andy LoPresto created NIFI-5587:
---

 Summary: Implement HPKP header
 Key: NIFI-5587
 URL: https://issues.apache.org/jira/browse/NIFI-5587
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core Framework
Affects Versions: 1.7.1
Reporter: Andy LoPresto


[HTTPS Public Key 
Pinning|https://en.wikipedia.org/wiki/HTTP_Public_Key_Pinning] allows for 
explicit public keys to be transmitted to a client instructing the client to 
only trust those keys for the service. This should only be implemented in 
conjunction with a strong certificate management strategy, as pinning a public 
key that is later compromised or expired without having a backup can lead to 
clients being blocked from using the legitimate service. 

More details on HPKP are available in [RFC 
7469|https://tools.ietf.org/html/rfc7469]. 
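
As a rough illustration only (not NiFi's implementation, which this ticket has 
yet to define), a servlet filter could emit the header described in RFC 7469; 
the filter name and pin values below are hypothetical placeholders:

{code}
// Sketch: add an HPKP header on every HTTPS response. Real deployments must use
// SHA-256 hashes of their own leaf and backup keys, per RFC 7469.
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

public class HpkpHeaderFilter implements Filter {

    // Hypothetical pin values and max-age (60 days); a backup pin is mandatory.
    private static final String HPKP_VALUE =
            "pin-sha256=\"primaryKeyHashBase64=\"; pin-sha256=\"backupKeyHashBase64=\"; "
            + "max-age=5184000; includeSubDomains";

    @Override
    public void init(FilterConfig filterConfig) {
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        if (response instanceof HttpServletResponse) {
            ((HttpServletResponse) response).setHeader("Public-Key-Pins", HPKP_VALUE);
        }
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
    }
}
{code}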




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5327) NetFlow Processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610860#comment-16610860
 ] 

ASF GitHub Bot commented on NIFI-5327:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2820
  
@PrashanthVenkatesan any updates?


> NetFlow Processors
> --
>
> Key: NIFI-5327
> URL: https://issues.apache.org/jira/browse/NIFI-5327
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework
>Affects Versions: 1.6.0
>Reporter: Prashanth Venkatesan
>Assignee: Prashanth Venkatesan
>Priority: Major
>
> As network traffic data comes into scope for big data use cases, it would be good for 
> NiFi to have processors that support parsing those protocols.
> Netflow is a protocol introduced by Cisco that provides the ability to 
> collect IP network traffic as it enters or exits an interface and is 
> described in detail in here:
> [https://www.cisco.com/c/en/us/td/docs/net_mgmt/netflow_collection_engine/3-6/user/guide/format.html]
>  
> Currently, I have created the following processor:
> *ParseNetflowv5*: Parses the ingress netflowv5 bytes and ingests them as either 
> NiFi flowfile attributes or as JSON content. This also sends a 
> one-time template.
>  
> Further ahead, we can add many processor specific to network protocols in 
> this nar bundle.
> I will create a pull request.
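
For context, a minimal sketch of reading the standard 24-byte NetFlow v5 packet 
header described in the Cisco format document linked above; this is not the 
PR's parser, and the class name is illustrative:

{code}
// Illustrative NetFlow v5 header parsing, assuming the documented 24-byte header layout.
import java.nio.ByteBuffer;

public class NetflowV5HeaderSketch {

    public static void main(String[] args) {
        byte[] datagram = new byte[24];              // placeholder; real bytes come from the UDP socket
        ByteBuffer buf = ByteBuffer.wrap(datagram);  // NetFlow is big-endian, ByteBuffer's default order

        int version       = Short.toUnsignedInt(buf.getShort());  // should be 5
        int count         = Short.toUnsignedInt(buf.getShort());  // flow records in this packet (1-30)
        long sysUptime    = Integer.toUnsignedLong(buf.getInt()); // ms since the exporter booted
        long unixSecs     = Integer.toUnsignedLong(buf.getInt()); // seconds since epoch
        long unixNsecs    = Integer.toUnsignedLong(buf.getInt()); // residual nanoseconds
        long flowSequence = Integer.toUnsignedLong(buf.getInt()); // running count of flows seen

        System.out.printf("NetFlow v%d packet, %d records, uptime %d ms, seq %d%n",
                version, count, sysUptime, flowSequence);
    }
}
{code}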



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2820: NIFI-5327 Adding Netflowv5 protocol parser

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2820
  
@PrashanthVenkatesan any updates?


---


[GitHub] nifi issue #2671: NiFi-5102 - Adding Processors for MarkLogic DB

2018-09-11 Thread MikeThomsen
Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2671
  
@vivekmuniyandi you have a merge conflict now.


---


[jira] [Updated] (NIFI-5569) Add tags to Route* processors for discoverability

2018-09-11 Thread Pierre Villard (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5569?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pierre Villard updated NIFI-5569:
-
   Resolution: Fixed
Fix Version/s: 1.8.0
   Status: Resolved  (was: Patch Available)

> Add tags to Route* processors for discoverability
> -
>
> Key: NIFI-5569
> URL: https://issues.apache.org/jira/browse/NIFI-5569
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Trivial
>  Labels: beginner, documentation, keyword, label
> Fix For: 1.8.0
>
>
> In a fit of forgetfulness, I could not remember the {{RouteOn*}} processors 
> when trying to detect the presence of some bytes in flowfile content. I 
> propose adding the keywords "find", "search", "scan", and "detect" to the 
> following processors, as they are used for those functions but do not come up 
> in a search for those keywords. 
> * {{RouteOnAttribute}}
> * {{RouteOnContent}}
> * {{RouteText}}
> Additionally, {{ScanContent}} and {{ReplaceText}} can have additional 
> keywords added to improve discoverability. 
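
A small sketch of what the proposal amounts to, assuming the keywords are added 
to the existing {{@Tags}} annotation that the processor search indexes; the tag 
list below is illustrative, not the exact list in RouteOnAttribute:

{code}
// Illustrative only: extra discoverability keywords added to the @Tags annotation.
import org.apache.nifi.annotation.documentation.Tags;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.exception.ProcessException;

@Tags({"attributes", "routing", "expression language", "regexp",
       "find", "search", "detect", "scan"})
public class RouteOnAttributeExample extends AbstractProcessor {

    @Override
    public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
        // routing logic omitted; only the annotation above matters for this sketch
    }
}
{code}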



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5569) Add tags to Route* processors for discoverability

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5569?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610856#comment-16610856
 ] 

ASF GitHub Bot commented on NIFI-5569:
--

Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2984
  
+1, merging to master, thanks @alopresto 


> Add tags to Route* processors for discoverability
> -
>
> Key: NIFI-5569
> URL: https://issues.apache.org/jira/browse/NIFI-5569
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Trivial
>  Labels: beginner, documentation, keyword, label
> Fix For: 1.8.0
>
>
> In a fit of forgetfulness, I could not remember the {{RouteOn*}} processors 
> when trying to detect the presence of some bytes in flowfile content. I 
> propose adding the keywords "find", "search", "scan", and "detect" to the 
> following processors, as they are used for those functions but do not come up 
> in a search for those keywords. 
> * {{RouteOnAttribute}}
> * {{RouteOnContent}}
> * {{RouteText}}
> Additionally, {{ScanContent}} and {{ReplaceText}} can have additional 
> keywords added to improve discoverability. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5569) Add tags to Route* processors for discoverability

2018-09-11 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5569?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610855#comment-16610855
 ] 

ASF subversion and git services commented on NIFI-5569:
---

Commit 7e8b7752ca8ef03498e76ad2a07b51b195f219ed in nifi's branch 
refs/heads/master from [~alopresto]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=7e8b775 ]

NIFI-5569 Added keywords to Route* and ScanAttribute processors to improve 
discoverability.

Signed-off-by: Pierre Villard 

This closes #2984.


> Add tags to Route* processors for discoverability
> -
>
> Key: NIFI-5569
> URL: https://issues.apache.org/jira/browse/NIFI-5569
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Trivial
>  Labels: beginner, documentation, keyword, label
> Fix For: 1.8.0
>
>
> In a fit of forgetfulness, I could not remember the {{RouteOn*}} processors 
> when trying to detect the presence of some bytes in flowfile content. I 
> propose adding the keywords "find", "search", "scan", and "detect" to the 
> following processors, as they are used for those functions but do not come up 
> in a search for those keywords. 
> * {{RouteOnAttribute}}
> * {{RouteOnContent}}
> * {{RouteText}}
> Additionally, {{ScanContent}} and {{ReplaceText}} can have additional 
> keywords added to improve discoverability. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2984: NIFI-5569 Added keywords to Route* and ScanAttribute proce...

2018-09-11 Thread pvillard31
Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2984
  
+1, merging to master, thanks @alopresto 


---


[jira] [Commented] (NIFI-5318) Implement NiFi test harness

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610813#comment-16610813
 ] 

ASF GitHub Bot commented on NIFI-5318:
--

Github user peter-gergely-horvath commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2872#discussion_r216716909
  
--- Diff: 
nifi-testharness/src/test/java/org/apache/nifi/testharness/samples/Constants.java
 ---
@@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+
+package org.apache.nifi.testharness.samples;
+
+import java.io.File;
+
+public final class Constants {
+
+static final File OUTPUT_DIR = new File("./NiFiTest/NiFiReadTest");
+static final File NIFI_ZIP_DIR = new 
File("../../nifi-assembly/target");
--- End diff --

No, these are in the test cases directory: they are there as a *sample* 
(referred to in the documentation), and will not be packaged into the actual API 
artifact. 
The paths there refer to the NiFi bundle produced during the NiFi build, which 
seems to be a good default in case you are working on the test harness itself.


> Implement NiFi test harness
> ---
>
> Key: NIFI-5318
> URL: https://issues.apache.org/jira/browse/NIFI-5318
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Peter Horvath
>Priority: Major
>
> Currently, it is not really possible to automatically test the behaviour of a 
> specific NiFi flow and make unit test type asserts if it works as expected. 
> For example, if the expected behaviour of a NiFi flow is that a file placed 
> to a specific directory will trigger some operation after which some output 
> file will appear at another directory, one can currently only do one thing: 
> test the NiFi flow manually. 
> Manual testing is especially hard to manage if a NiFi flow is being actively 
> developed: any change to a complex, existing NiFi flow might require a lot of 
> manual testing just to ensure there are no regressions introduced. 
> Some kind of Java API that allows managing a NiFi instance and manipulating 
> flow deployments like for example, [Codehaus 
> Cargo|https://codehaus-cargo.github.io/] would be of great help. 
>  
>  
>  
>  
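
For illustration, the kind of flow-level assertion being asked for, sketched 
with plain JUnit and file polling rather than any particular harness API; the 
directory names are hypothetical:

{code}
// Illustrative flow-level test: drop a file into the flow's input directory and
// wait for the expected output file to appear.
import org.junit.Assert;
import org.junit.Test;

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FlowOutputIT {

    private static final Path INPUT_DIR   = Paths.get("target/flow-input");
    private static final Path OUTPUT_FILE = Paths.get("target/flow-output/result.json");

    @Test
    public void fileDroppedIntoInputDirProducesOutput() throws Exception {
        Files.createDirectories(INPUT_DIR);
        Files.write(INPUT_DIR.resolve("trigger.json"), "{}".getBytes(StandardCharsets.UTF_8));

        long deadline = System.currentTimeMillis() + 30_000;  // poll for up to 30 seconds
        while (!Files.exists(OUTPUT_FILE) && System.currentTimeMillis() < deadline) {
            Thread.sleep(500);
        }
        Assert.assertTrue("flow did not produce " + OUTPUT_FILE, Files.exists(OUTPUT_FILE));
    }
}
{code}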



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2872: NIFI-5318 Implement NiFi test harness: initial comm...

2018-09-11 Thread peter-gergely-horvath
Github user peter-gergely-horvath commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2872#discussion_r216716909
  
--- Diff: 
nifi-testharness/src/test/java/org/apache/nifi/testharness/samples/Constants.java
 ---
@@ -0,0 +1,28 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+
+package org.apache.nifi.testharness.samples;
+
+import java.io.File;
+
+public final class Constants {
+
+static final File OUTPUT_DIR = new File("./NiFiTest/NiFiReadTest");
+static final File NIFI_ZIP_DIR = new 
File("../../nifi-assembly/target");
--- End diff --

No, these are in the test cases directory: they are there as a *sample* 
(referred to in the documentation), and will not be packaged into the actual API 
artifact. 
The paths there refer to the NiFi bundle produced during the NiFi build, which 
seems to be a good default in case you are working on the test harness itself.


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610805#comment-16610805
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2682
  
Hey @danieljimenez - sorry it took so long... I left comments on the PR 
while playing with the processor on my side (+ a rebase is needed with the 
recent proxy PR being merged). Once all the comments are addressed, I believe 
we can quickly be in a position to merge this in.


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into Nifi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/2682
  
Hey @danieljimenez - sorry it took so long... I left comments on the PR 
while playing with the processor on my side (+ a rebase is needed with the 
recent proxy PR being merged). Once all the comments are addressed, I believe 
we can quickly be in a position to merge this in.


---


[jira] [Commented] (NIFI-5318) Implement NiFi test harness

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610799#comment-16610799
 ] 

ASF GitHub Bot commented on NIFI-5318:
--

Github user peter-gergely-horvath commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2872#discussion_r216713844
  
--- Diff: 
nifi-testharness/src/main/java/org/apache/nifi/testharness/util/FileUtils.java 
---
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+
+
+package org.apache.nifi.testharness.util;
+
+import java.io.File;
+import java.io.IOException;
+import java.nio.file.FileVisitResult;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.nio.file.SimpleFileVisitor;
+import java.nio.file.attribute.BasicFileAttributes;
+import java.util.Arrays;
+
+public final class FileUtils {
+
+
+private static final String MAC_DS_STORE_NAME = ".DS_Store";
+
+private FileUtils() {
+// no instances
+}
+
+public static void deleteDirectoryRecursive(Path directory) throws 
IOException {
+Files.walkFileTree(directory, new SimpleFileVisitor<Path>() {
+@Override
+public FileVisitResult visitFile(Path file, 
BasicFileAttributes attrs) throws IOException {
+Files.delete(file);
+return FileVisitResult.CONTINUE;
+}
+
+@Override
+public FileVisitResult postVisitDirectory(Path dir, 
IOException exc) throws IOException {
+Files.delete(dir);
+return FileVisitResult.CONTINUE;
+}
+});
+}
+
+public static void deleteDirectoryRecursive(File dir) {
+try {
+deleteDirectoryRecursive(dir.toPath());
+} catch (IOException e) {
+throw new RuntimeException(e);
+}
+}
+
+public static void createLink(Path newLink, Path existingFile)  {
+try {
+Files.createSymbolicLink(newLink, existingFile);
--- End diff --

No, I have not tested this on Windows. I would expect it to work properly, 
since [Windows 10 does support symbolic 
links](https://blogs.windows.com/buildingapps/2016/12/02/symlinks-windows-10/). 
(you just need the [correct 
permission](https://docs.microsoft.com/en-us/windows/security/threat-protection/security-policy-settings/create-symbolic-links))
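
If Windows support turns out to matter, one option (a sketch only, not part of 
this PR) is to fall back to copying when the symbolic link cannot be created, 
e.g. when the required privilege is missing:

{code}
// Sketch only: fall back to copying when the platform (or a missing Windows
// privilege) prevents symbolic link creation.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public final class LinkOrCopy {

    private LinkOrCopy() {
    }

    public static void linkOrCopy(Path newLink, Path existingFile) throws IOException {
        try {
            Files.createSymbolicLink(newLink, existingFile);
        } catch (UnsupportedOperationException | IOException e) {
            // e.g. SeCreateSymbolicLinkPrivilege not granted on Windows
            Files.copy(existingFile, newLink, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
{code}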



> Implement NiFi test harness
> ---
>
> Key: NIFI-5318
> URL: https://issues.apache.org/jira/browse/NIFI-5318
> Project: Apache NiFi
>  Issue Type: New Feature
>Reporter: Peter Horvath
>Priority: Major
>
> Currently, it is not really possible to automatically test the behaviour of a 
> specific NiFi flow and make unit test type asserts if it works as expected. 
> For example, if the expected behaviour of a NiFi flow is that a file placed 
> to a specific directory will trigger some operation after which some output 
> file will appear at another directory, one can currently only do one thing: 
> test the NiFi flow manually. 
> Manual testing is especially hard to manage if a NiFi flow is being actively 
> developed: any change to a complex, existing NiFi flow might require a lot of 
> manual testing just to ensure there are no regressions introduced. 
> Some kind of Java API that allows managing a NiFi instance and manipulating 
> flow deployments like for example, [Codehaus 
> Cargo|https://codehaus-cargo.github.io/] would be of great help. 
>  
>  
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2872: NIFI-5318 Implement NiFi test harness: initial comm...

2018-09-11 Thread peter-gergely-horvath
Github user peter-gergely-horvath commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2872#discussion_r216713844
  
--- Diff: 
nifi-testharness/src/main/java/org/apache/nifi/testharness/util/FileUtils.java 
---
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+
+
+package org.apache.nifi.testharness.util;
+
+import java.io.File;
+import java.io.IOException;
+import java.nio.file.FileVisitResult;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.nio.file.SimpleFileVisitor;
+import java.nio.file.attribute.BasicFileAttributes;
+import java.util.Arrays;
+
+public final class FileUtils {
+
+
+private static final String MAC_DS_STORE_NAME = ".DS_Store";
+
+private FileUtils() {
+// no instances
+}
+
+public static void deleteDirectoryRecursive(Path directory) throws 
IOException {
+Files.walkFileTree(directory, new SimpleFileVisitor<Path>() {
+@Override
+public FileVisitResult visitFile(Path file, 
BasicFileAttributes attrs) throws IOException {
+Files.delete(file);
+return FileVisitResult.CONTINUE;
+}
+
+@Override
+public FileVisitResult postVisitDirectory(Path dir, 
IOException exc) throws IOException {
+Files.delete(dir);
+return FileVisitResult.CONTINUE;
+}
+});
+}
+
+public static void deleteDirectoryRecursive(File dir) {
+try {
+deleteDirectoryRecursive(dir.toPath());
+} catch (IOException e) {
+throw new RuntimeException(e);
+}
+}
+
+public static void createLink(Path newLink, Path existingFile)  {
+try {
+Files.createSymbolicLink(newLink, existingFile);
--- End diff --

No, I have not tested this on Windows. I would expect it to work properly, 
since [Windows 10 does support symbolic 
links](https://blogs.windows.com/buildingapps/2016/12/02/symlinks-windows-10/). 
(you just need the [correct 
permission](https://docs.microsoft.com/en-us/windows/security/threat-protection/security-policy-settings/create-symbolic-links))



---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610796#comment-16610796
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216713400
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216713400
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[jira] [Updated] (NIFI-5474) ReplaceText RegexReplace evaluates payload as Expression language

2018-09-11 Thread Matt Burgess (JIRA)


 [ 
https://issues.apache.org/jira/browse/NIFI-5474?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Burgess updated NIFI-5474:
---
   Resolution: Fixed
Fix Version/s: 1.8.0
   Status: Resolved  (was: Patch Available)

> ReplaceText RegexReplace evaluates payload as Expression language
> -
>
> Key: NIFI-5474
> URL: https://issues.apache.org/jira/browse/NIFI-5474
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.7.0, 1.7.1
>Reporter: Joseph Percivall
>Assignee: Mark Payne
>Priority: Major
> Fix For: 1.8.0
>
>
> To reproduce, add "${this will fail}" to the ReplaceTest unit test resource 
> "hello.txt" and run one of the tests (like testRegexWithExpressionLanguage). 
> You'll end up seeing an error message like this: 
> {quote}java.lang.AssertionError: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:201)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:160)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:155)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:150)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:145)
> at 
> org.apache.nifi.processors.standard.TestReplaceText.testRegexWithExpressionLanguage(TestReplaceText.java:382)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> at 
> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
> at 
> com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
> at 
> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
> Caused by: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.attribute.expression.language.InvalidPreparedQuery.evaluateExpressions(InvalidPreparedQuery.java:49)
> at 
> org.apache.nifi.attribute.expression.language.StandardPropertyValue.evaluateAttributeExpressions(StandardPropertyValue.java:160)
> at 
> org.apache.nifi.util.MockPropertyValue.evaluateAttributeExpressions(MockPropertyValue.java:257)
> at 
> org.apache.nifi.util.MockPropertyValue.evaluateAttributeExpressions(MockPropertyValue.java:244)
> at 
> org.apache.nifi.processors.standard.ReplaceText$RegexReplace.replace(ReplaceText.java:564)
> at 
> org.apache.nifi.processors.standard.ReplaceText.onTrigger(ReplaceText.java:299)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner$RunProcessor.call(StandardProcessorTestRunner.java:251)
> at 
> 

[jira] [Commented] (NIFI-5474) ReplaceText RegexReplace evaluates payload as Expression language

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5474?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610786#comment-16610786
 ] 

ASF GitHub Bot commented on NIFI-5474:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2951


> ReplaceText RegexReplace evaluates payload as Expression language
> -
>
> Key: NIFI-5474
> URL: https://issues.apache.org/jira/browse/NIFI-5474
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.7.0, 1.7.1
>Reporter: Joseph Percivall
>Assignee: Mark Payne
>Priority: Major
>
> To reproduce, add "${this will fail}" to the ReplaceTest unit test resource 
> "hello.txt" and run one of the tests (like testRegexWithExpressionLanguage). 
> You'll end up seeing an error message like this: 
> {quote}java.lang.AssertionError: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:201)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:160)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:155)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:150)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:145)
> at 
> org.apache.nifi.processors.standard.TestReplaceText.testRegexWithExpressionLanguage(TestReplaceText.java:382)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> at 
> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
> at 
> com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
> at 
> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
> Caused by: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.attribute.expression.language.InvalidPreparedQuery.evaluateExpressions(InvalidPreparedQuery.java:49)
> at 
> org.apache.nifi.attribute.expression.language.StandardPropertyValue.evaluateAttributeExpressions(StandardPropertyValue.java:160)
> at 
> org.apache.nifi.util.MockPropertyValue.evaluateAttributeExpressions(MockPropertyValue.java:257)
> at 
> org.apache.nifi.util.MockPropertyValue.evaluateAttributeExpressions(MockPropertyValue.java:244)
> at 
> org.apache.nifi.processors.standard.ReplaceText$RegexReplace.replace(ReplaceText.java:564)
> at 
> org.apache.nifi.processors.standard.ReplaceText.onTrigger(ReplaceText.java:299)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner$RunProcessor.call(StandardProcessorTestRunner.java:251)
> at 
> 

[jira] [Commented] (NIFI-4272) ReplaceText processor does not properly iterate multiple replacement values when EL is used

2018-09-11 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610781#comment-16610781
 ] 

ASF subversion and git services commented on NIFI-4272:
---

Commit 0274bd4ff3f4199838ff1307c9c01d98fcc9150b in nifi's branch 
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=0274bd4 ]

Revert "NIFI-4272 support multiple captures when EL is present in replacement 
value"

This reverts commit f7f809c3d3632eea5234b31740984b73de322464.

NIFI-5474, NIFI-4272: When using Regex Replace with ReplaceText, and there are 
capturing groups, ensure that we populate the 'additionalVariables' map for 
each match of the regex. This allows Expression Language to reference the 
back-references properly even when there are multiple matches

Signed-off-by: Matthew Burgess 

This closes #2951


> ReplaceText processor does not properly iterate multiple replacement values 
> when EL is used
> ---
>
> Key: NIFI-4272
> URL: https://issues.apache.org/jira/browse/NIFI-4272
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.0, 1.2.0, 1.3.0
>Reporter: Matthew Clarke
>Assignee: Otto Fowler
>Priority: Major
> Fix For: 1.7.0
>
>
> I am using the ReplaceText processor to take a string input (example:   
> {"name":"Smith","middle":"nifi","firstname":"John"} ) and change all the 
> field names to all uppercase.
> Using above input as an example, I expect output like 
> {"NAME":"Smith","MIDDLE":"nifi","FIRSTNAME":"John"}
> I expect I should be able to do this with ReplaceText processor; however, I 
> see some unexpected behavior:
> ---
> Test 1:  (uses EL in the replacement value property)
> Search value:  \"([a-z]+?)\":\"(.+?)\"
> Replacement Value: \"*${'$1':toUpper()}*":\"$2\"
> Result: {"NAME":"Smith","NAME":"nifi","NAME":"John"}
> ---
> Test 2:  (Does not use EL in the replacement Value property)
> Search value:  \"([a-z]+?)\":\"(.+?)\"
> Replacement Value: \"new$1":\"$2\"
> Result: {"newname":"Smith","newmiddle":"nifi","newfirstname":"John"}
> 
> As you can see, if I use a NiFi Expression Language statement in the 
> Replacement Value property, it no longer iterates as expected through the 
> various $1 captured values. It repeatedly uses the EL result from the first 
> EL evaluation in every iteration while $2 correctly iterates through the 
> search values.
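
To make the expected behaviour concrete, here is a small, self-contained sketch 
(plain java.util.regex, not NiFi's actual ReplaceText code) of the per-match 
handling described in the commit message above: the back-reference variables 
are rebuilt for every match before the replacement is evaluated, so $1 changes 
on each iteration just like $2 does:

{code}
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PerMatchBackReferences {

    // 'evaluate' stands in for NiFi's Expression Language evaluation against
    // the additionalVariables map; the real fix lives in ReplaceText itself.
    static String replaceAll(String input, Pattern pattern,
                             Function<Map<String, String>, String> evaluate) {
        Matcher matcher = pattern.matcher(input);
        StringBuffer out = new StringBuffer();
        while (matcher.find()) {
            // Re-populate the back-reference variables for *this* match instead
            // of reusing the values captured from the first match.
            Map<String, String> additionalVariables = new HashMap<>();
            for (int i = 0; i <= matcher.groupCount(); i++) {
                additionalVariables.put("$" + i, matcher.group(i));
            }
            String replacement = evaluate.apply(additionalVariables);
            matcher.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        matcher.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Pattern p = Pattern.compile("\"([a-z]+?)\":\"(.+?)\"");
        String json = "{\"name\":\"Smith\",\"middle\":\"nifi\",\"firstname\":\"John\"}";
        // Uppercase each captured field name, analogous to ${'$1':toUpper()}.
        String result = replaceAll(json, p,
                vars -> "\"" + vars.get("$1").toUpperCase() + "\":\"" + vars.get("$2") + "\"");
        System.out.println(result); // {"NAME":"Smith","MIDDLE":"nifi","FIRSTNAME":"John"}
    }
}
{code}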



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4272) ReplaceText processor does not properly iterate multiple replacement values when EL is used

2018-09-11 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610783#comment-16610783
 ] 

ASF subversion and git services commented on NIFI-4272:
---

Commit 0274bd4ff3f4199838ff1307c9c01d98fcc9150b in nifi's branch 
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=0274bd4 ]

Revert "NIFI-4272 support multiple captures when EL is present in replacement 
value"

This reverts commit f7f809c3d3632eea5234b31740984b73de322464.

NIFI-5474, NIFI-4272: When using Regex Replace with ReplaceText, and there are 
capturing groups, ensure that we populate the 'additionalVariables' map for 
each match of the regex. This allows Expression Language to reference the 
back-references properly even when there are multiple matches

Signed-off-by: Matthew Burgess 

This closes #2951


> ReplaceText processor does not properly iterate multiple replacement values 
> when EL is used
> ---
>
> Key: NIFI-4272
> URL: https://issues.apache.org/jira/browse/NIFI-4272
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.1.0, 1.2.0, 1.3.0
>Reporter: Matthew Clarke
>Assignee: Otto Fowler
>Priority: Major
> Fix For: 1.7.0
>
>
> I am using the ReplaceText processor to take a string input (example:   
> {"name":"Smith","middle":"nifi","firstname":"John"} ) and change all the 
> field names to all uppercase.
> Using above input as an example, I expect output like 
> {"NAME":"Smith","MIDDLE":"nifi","FIRSTNAME":"John"}
> I expect I should be able to do this with ReplaceText processor; however, I 
> see some unexpected behavior:
> ---
> Test 1:  (uses EL in the replacement value property)
> Search value:  \"([a-z]+?)\":\"(.+?)\"
> Replacement Value: \"*${'$1':toUpper()}*":\"$2\"
> Result: {"NAME":"Smith","NAME":"nifi","NAME":"John"}
> ---
> Test 2:  (Does not use EL in the replacement Value property)
> Search value:  \"([a-z]+?)\":\"(.+?)\"
> Replacement Value: \"new$1":\"$2\"
> Result: {"newname":"Smith","newmiddle":"nifi","newfirstname":"John"}
> 
> As you can see, if I use a NiFi Expression Language statement in the 
> Replacement Value property, it no longer iterates as expected through the 
> various $1 captured values. It repeatedly uses the EL result from the first 
> EL evaluation in every iteration while $2 correctly iterates through the 
> search values.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-5474) ReplaceText RegexReplace evaluates payload as Expression language

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5474?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610780#comment-16610780
 ] 

ASF GitHub Bot commented on NIFI-5474:
--

Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2951
  
+1 LGTM, ran unit tests and contrib-check, tried on a live NiFi instance. 
Thanks for the fix! Merging to master


> ReplaceText RegexReplace evaluates payload as Expression language
> -
>
> Key: NIFI-5474
> URL: https://issues.apache.org/jira/browse/NIFI-5474
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.7.0, 1.7.1
>Reporter: Joseph Percivall
>Assignee: Mark Payne
>Priority: Major
>
> To reproduce, add "${this will fail}" to the ReplaceTest unit test resource 
> "hello.txt" and run one of the tests (like testRegexWithExpressionLanguage). 
> You'll end up seeing an error message like this: 
> {quote}java.lang.AssertionError: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:201)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:160)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:155)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:150)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:145)
> at 
> org.apache.nifi.processors.standard.TestReplaceText.testRegexWithExpressionLanguage(TestReplaceText.java:382)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> at 
> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
> at 
> com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
> at 
> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
> Caused by: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.attribute.expression.language.InvalidPreparedQuery.evaluateExpressions(InvalidPreparedQuery.java:49)
> at 
> org.apache.nifi.attribute.expression.language.StandardPropertyValue.evaluateAttributeExpressions(StandardPropertyValue.java:160)
> at 
> org.apache.nifi.util.MockPropertyValue.evaluateAttributeExpressions(MockPropertyValue.java:257)
> at 
> org.apache.nifi.util.MockPropertyValue.evaluateAttributeExpressions(MockPropertyValue.java:244)
> at 
> org.apache.nifi.processors.standard.ReplaceText$RegexReplace.replace(ReplaceText.java:564)
> at 
> org.apache.nifi.processors.standard.ReplaceText.onTrigger(ReplaceText.java:299)
> at 
> org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
> at 
> 

[GitHub] nifi pull request #2951: NIFI-5474: When using Regex Replace with ReplaceTex...

2018-09-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2951


---


[jira] [Commented] (NIFI-5474) ReplaceText RegexReplace evaluates payload as Expression language

2018-09-11 Thread ASF subversion and git services (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5474?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610782#comment-16610782
 ] 

ASF subversion and git services commented on NIFI-5474:
---

Commit 0274bd4ff3f4199838ff1307c9c01d98fcc9150b in nifi's branch 
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=0274bd4 ]

Revert "NIFI-4272 support multiple captures when EL is present in replacement 
value"

This reverts commit f7f809c3d3632eea5234b31740984b73de322464.

NIFI-5474, NIFI-4272: When using Regex Replace with ReplaceText, and there are 
capturing groups, ensure that we populate the 'additionalVariables' map for 
each match of the regex. This allows Expression Language to reference the 
back-references properly even when there are multiple matches

Signed-off-by: Matthew Burgess 

This closes #2951


> ReplaceText RegexReplace evaluates payload as Expression language
> -
>
> Key: NIFI-5474
> URL: https://issues.apache.org/jira/browse/NIFI-5474
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.7.0, 1.7.1
>Reporter: Joseph Percivall
>Assignee: Mark Payne
>Priority: Major
>
> To reproduce, add "${this will fail}" to the ReplaceTest unit test resource 
> "hello.txt" and run one of the tests (like testRegexWithExpressionLanguage). 
> You'll end up seeing an error message like this: 
> {quote}java.lang.AssertionError: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:201)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:160)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:155)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:150)
> at 
> org.apache.nifi.util.StandardProcessorTestRunner.run(StandardProcessorTestRunner.java:145)
> at 
> org.apache.nifi.processors.standard.TestReplaceText.testRegexWithExpressionLanguage(TestReplaceText.java:382)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
> at 
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at 
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
> at 
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at 
> org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:239)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
> at 
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> at 
> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
> at 
> com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
> at 
> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
> Caused by: 
> org.apache.nifi.attribute.expression.language.exception.AttributeExpressionLanguageException:
>  Invalid Expression: ${replaceValue}, World! ${this will fail} due to 
> Unexpected token 'will' at line 1, column 7. Query: ${this will fail}
> at 
> org.apache.nifi.attribute.expression.language.InvalidPreparedQuery.evaluateExpressions(InvalidPreparedQuery.java:49)
> at 
> org.apache.nifi.attribute.expression.language.StandardPropertyValue.evaluateAttributeExpressions(StandardPropertyValue.java:160)
> at 
> 

[GitHub] nifi issue #2951: NIFI-5474: When using Regex Replace with ReplaceText, and ...

2018-09-11 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/2951
  
+1 LGTM, ran unit tests and contrib-check, tried on a live NiFi instance. 
Thanks for the fix! Merging to master


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610778#comment-16610778
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216708343
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216708343
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216704031
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/AbstractBigQueryProcessor.java
 ---
@@ -0,0 +1,122 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.api.gax.retrying.RetrySettings;
+import com.google.auth.oauth2.GoogleCredentials;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.AbstractGCPProcessor;
+import org.apache.nifi.util.StringUtils;
+
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+/**
+ * Base class for creating processors that connect to GCP BiqQuery service
+ */
+public abstract class AbstractBigQueryProcessor extends 
AbstractGCPProcessor {
+static final int BUFFER_SIZE = 65536;
+public static final Relationship REL_SUCCESS =
+new Relationship.Builder().name("success")
+.description("FlowFiles are routed to this 
relationship after a successful Google BigQuery operation.")
+.build();
+public static final Relationship REL_FAILURE =
+new Relationship.Builder().name("failure")
+.description("FlowFiles are routed to this 
relationship if the Google BigQuery operation fails.")
+.build();
+
+public static final Set relationships = 
Collections.unmodifiableSet(
+new HashSet<>(Arrays.asList(REL_SUCCESS, REL_FAILURE)));
+
+public static final PropertyDescriptor DATASET = new PropertyDescriptor
+.Builder().name(BigQueryAttributes.DATASET_ATTR)
+.displayName("Dataset")
+.description(BigQueryAttributes.DATASET_DESC)
+.required(true)
+.defaultValue("${" + BigQueryAttributes.DATASET_ATTR + "}")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor TABLE_NAME = new 
PropertyDescriptor
+.Builder().name(BigQueryAttributes.TABLE_NAME_ATTR)
+.displayName("Table Name")
+.description(BigQueryAttributes.TABLE_NAME_DESC)
+.required(true)
+.defaultValue("${" + BigQueryAttributes.TABLE_NAME_ATTR + "}")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor TABLE_SCHEMA = new 
PropertyDescriptor
+.Builder().name(BigQueryAttributes.TABLE_SCHEMA_ATTR)
+.displayName("Table Schema")
+.description(BigQueryAttributes.TABLE_SCHEMA_DESC)
+.required(false)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor READ_TIMEOUT = new 
PropertyDescriptor
--- End diff --

You missed that property in the list of supported property descriptors for 
the processor. It causes an NPE:

2018-09-11 17:00:21,606 ERROR [Timer-Driven Process Thread-6] 
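
For context, a hypothetical excerpt (names taken from the diff above, but not 
the PR's actual code) of what that implies: READ_TIMEOUT has to be part of the 
list returned by getSupportedPropertyDescriptors(), otherwise its default value 
is never applied and evaluating the property at runtime can blow up with the 
NPE mentioned above.

{code}
// Inside the processor class; PropertyDescriptor, List and ImmutableList are
// already imported in the diff above.
@Override
public List<PropertyDescriptor> getSupportedPropertyDescriptors() {
    return ImmutableList.<PropertyDescriptor>builder()
            .addAll(super.getSupportedPropertyDescriptors())
            .add(DATASET)
            .add(TABLE_NAME)
            .add(TABLE_SCHEMA)
            .add(READ_TIMEOUT) // the descriptor the comment says was missing
            .build();
}
{code}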

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610760#comment-16610760
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216704031
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/AbstractBigQueryProcessor.java
 ---
@@ -0,0 +1,122 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.api.gax.retrying.RetrySettings;
+import com.google.auth.oauth2.GoogleCredentials;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.BigQueryOptions;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.AbstractGCPProcessor;
+import org.apache.nifi.util.StringUtils;
+
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+/**
+ * Base class for creating processors that connect to GCP BiqQuery service
+ */
+public abstract class AbstractBigQueryProcessor extends 
AbstractGCPProcessor {
+static final int BUFFER_SIZE = 65536;
+public static final Relationship REL_SUCCESS =
+new Relationship.Builder().name("success")
+.description("FlowFiles are routed to this 
relationship after a successful Google BigQuery operation.")
+.build();
+public static final Relationship REL_FAILURE =
+new Relationship.Builder().name("failure")
+.description("FlowFiles are routed to this 
relationship if the Google BigQuery operation fails.")
+.build();
+
+public static final Set relationships = 
Collections.unmodifiableSet(
+new HashSet<>(Arrays.asList(REL_SUCCESS, REL_FAILURE)));
+
+public static final PropertyDescriptor DATASET = new PropertyDescriptor
+.Builder().name(BigQueryAttributes.DATASET_ATTR)
+.displayName("Dataset")
+.description(BigQueryAttributes.DATASET_DESC)
+.required(true)
+.defaultValue("${" + BigQueryAttributes.DATASET_ATTR + "}")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor TABLE_NAME = new 
PropertyDescriptor
+.Builder().name(BigQueryAttributes.TABLE_NAME_ATTR)
+.displayName("Table Name")
+.description(BigQueryAttributes.TABLE_NAME_DESC)
+.required(true)
+.defaultValue("${" + BigQueryAttributes.TABLE_NAME_ATTR + "}")
+
.expressionLanguageSupported(ExpressionLanguageScope.FLOWFILE_ATTRIBUTES)
+.addValidator(StandardValidators.NON_EMPTY_EL_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor TABLE_SCHEMA = new 
PropertyDescriptor
+.Builder().name(BigQueryAttributes.TABLE_SCHEMA_ATTR)
+.displayName("Table Schema")
+.description(BigQueryAttributes.TABLE_SCHEMA_DESC)
+.required(false)
+
.expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor READ_TIMEOUT = new 

[jira] [Updated] (MINIFICPP-605) Determine feasability of input requirements

2018-09-11 Thread Mr TheSegfault (JIRA)


 [ 
https://issues.apache.org/jira/browse/MINIFICPP-605?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mr TheSegfault updated MINIFICPP-605:
-
Description: 
Determine what the most effective and quickest way is to implement input 
requirements. 

 

This can easily be made part of the constructor, but a compile-time check would 
be preferable to ensure that this requirement isn't broken via function calls. 

  was:Determine what the most effective and quickest way is to implement input 
requirements. 


> Determine feasability of input requirements
> ---
>
> Key: MINIFICPP-605
> URL: https://issues.apache.org/jira/browse/MINIFICPP-605
> Project: NiFi MiNiFi C++
>  Issue Type: Improvement
>Reporter: Mr TheSegfault
>Assignee: Mr TheSegfault
>Priority: Major
>
> Determine what the most effective and quickest way is to implement input 
> requirements. 
>  
> This can easily be made part of the constructor, but a compile-time check would 
> be preferable to ensure that this requirement isn't broken via function calls. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610742#comment-16610742
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216699464
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/BigQueryAttributes.java
 ---
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import org.apache.nifi.components.PropertyDescriptor;
+import 
org.apache.nifi.processors.gcp.credentials.factory.CredentialPropertyDescriptors;
+
+/**
+ * Attributes associated with the BigQuery processors
+ */
+public class BigQueryAttributes {
+private BigQueryAttributes() {}
+
+public static final PropertyDescriptor SERVICE_ACCOUNT_JSON_FILE = 
CredentialPropertyDescriptors.SERVICE_ACCOUNT_JSON_FILE;
+
+public static final String DATASET_ATTR = "bq.dataset";
+public static final String DATASET_DESC = "BigQuery dataset";
--- End diff --

Can we add to the description that the dataset must exist before trying to 
add data into it (new table or not)?
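
Something along these lines perhaps (suggested wording only, not the merged 
text):

{code}
public static final String DATASET_ATTR = "bq.dataset";
public static final String DATASET_DESC = "BigQuery dataset. The dataset must already exist "
        + "before loading data into it, whether the target table is new or not.";
{code}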


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into Nifi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216699464
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/BigQueryAttributes.java
 ---
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import org.apache.nifi.components.PropertyDescriptor;
+import 
org.apache.nifi.processors.gcp.credentials.factory.CredentialPropertyDescriptors;
+
+/**
+ * Attributes associated with the BigQuery processors
+ */
+public class BigQueryAttributes {
+private BigQueryAttributes() {}
+
+public static final PropertyDescriptor SERVICE_ACCOUNT_JSON_FILE = 
CredentialPropertyDescriptors.SERVICE_ACCOUNT_JSON_FILE;
+
+public static final String DATASET_ATTR = "bq.dataset";
+public static final String DATASET_DESC = "BigQuery dataset";
--- End diff --

Can we add to the description that the dataset must exist before trying to 
add data into it (new table or not)?


---


[jira] [Created] (MINIFICPP-605) Determine feasability of input requirements

2018-09-11 Thread Mr TheSegfault (JIRA)
Mr TheSegfault created MINIFICPP-605:


 Summary: Determine feasability of input requirements
 Key: MINIFICPP-605
 URL: https://issues.apache.org/jira/browse/MINIFICPP-605
 Project: NiFi MiNiFi C++
  Issue Type: Improvement
Reporter: Mr TheSegfault
Assignee: Mr TheSegfault


Determine what the most effective and quickest way is to implement input 
requirements. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216697497
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610736#comment-16610736
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216697497
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610722#comment-16610722
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216694142
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/BigQueryAttributes.java
 ---
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import org.apache.nifi.components.PropertyDescriptor;
+import 
org.apache.nifi.processors.gcp.credentials.factory.CredentialPropertyDescriptors;
+
+/**
+ * Attributes associated with the BigQuery processors
+ */
+public class BigQueryAttributes {
+private BigQueryAttributes() {}
+
+public static final PropertyDescriptor SERVICE_ACCOUNT_JSON_FILE = 
CredentialPropertyDescriptors.SERVICE_ACCOUNT_JSON_FILE;
+
+public static final String DATASET_ATTR = "bq.dataset";
+public static final String DATASET_DESC = "BigQuery dataset";
+
+public static final String TABLE_NAME_ATTR = "bq.table.name";
+public static final String TABLE_NAME_DESC = "BigQuery table name";
+
+public static final String TABLE_SCHEMA_ATTR = "bq.table.schema";
+public static final String TABLE_SCHEMA_DESC = "BigQuery table name";
--- End diff --

Copy/paste error.
Besides, the description should specify that the schema must be provided as 
JSON data. At first I tried to specify the schema with 
``name:STRING,age:INTEGER,email:STRING``, which is also an option in the 
BigQuery UI.
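For reference, a sketch of the same three columns expressed as a JSON field list, assuming the processor accepts the standard BigQuery JSON schema shape (the exact format the property expects is not confirmed in this thread):

{code}
[
  {"name": "name",  "type": "STRING"},
  {"name": "age",   "type": "INTEGER"},
  {"name": "email", "type": "STRING"}
]
{code}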


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found in this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into NiFi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216694142
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/BigQueryAttributes.java
 ---
@@ -0,0 +1,80 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import org.apache.nifi.components.PropertyDescriptor;
+import 
org.apache.nifi.processors.gcp.credentials.factory.CredentialPropertyDescriptors;
+
+/**
+ * Attributes associated with the BigQuery processors
+ */
+public class BigQueryAttributes {
+private BigQueryAttributes() {}
+
+public static final PropertyDescriptor SERVICE_ACCOUNT_JSON_FILE = 
CredentialPropertyDescriptors.SERVICE_ACCOUNT_JSON_FILE;
+
+public static final String DATASET_ATTR = "bq.dataset";
+public static final String DATASET_DESC = "BigQuery dataset";
+
+public static final String TABLE_NAME_ATTR = "bq.table.name";
+public static final String TABLE_NAME_DESC = "BigQuery table name";
+
+public static final String TABLE_SCHEMA_ATTR = "bq.table.schema";
+public static final String TABLE_SCHEMA_DESC = "BigQuery table name";
--- End diff --

Copy/paste error.
Besides, the description should specify that the schema must be provided as 
JSON data. At first I tried to specify the schema with 
``name:STRING,age:INTEGER,email:STRING``, which is also an option in the 
BigQuery UI.


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610717#comment-16610717
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216692916
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216692916
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610711#comment-16610711
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216691828
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216691828
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610699#comment-16610699
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216690236
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610700#comment-16610700
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216690423
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216690423
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216690236
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[jira] [Commented] (NIFI-5573) Allow overriding of nifi-env.sh

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610693#comment-16610693
 ] 

ASF GitHub Bot commented on NIFI-5573:
--

Github user pepov commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2985#discussion_r216688824
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/bin/nifi-env.sh
 ---
@@ -16,16 +16,38 @@
 #limitations under the License.
 #
 
+# By default this file will unconditionally override whatever environment 
variables you have set
+# and set them to defaults defined here.
+# If you want to define your own versions outside of this script please 
set the environment variable
+# NIFI_OVERRIDE_NIFIENV to "true". That will then use whatever variables 
you used outside of
+# this script.
+
 # The java implementation to use.
 #export JAVA_HOME=/usr/java/jdk1.8.0/
 
-export NIFI_HOME=$(cd "${SCRIPT_DIR}" && cd .. && pwd)
+setOrDefault() {
+  declare envvar="$1" default="$2"
+
+  local res="$envvar"
+  if [ -z "$envvar" ] || [ "$NIFI_OVERRIDE_NIFIENV" != "true" ]
--- End diff --

Okay, thanks for checking in!


> Allow overriding of nifi-env.sh
> ---
>
> Key: NIFI-5573
> URL: https://issues.apache.org/jira/browse/NIFI-5573
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
>
> (as discussed in 
> https://lists.apache.org/thread.html/ddfbff7f371d47c6da013ff14e28bce3b353716653a01649a408d0ce@%3Cdev.nifi.apache.org%3E)
> Currently nifi-env.sh unconditionally sets NIFI_HOME, NIFI_PID_DIR, 
> NIFI_LOG_DIR and NIFI_ALLOW_EXPLICIT_KEYTAB so they can only be overridden by 
> changing nifi-env.sh.
> Other *-env.sh files I looked at (e.g. from Hadoop or HBase) have most/all 
> their settings commented out or only override variables if they have not 
> already been set outside of the *-env.sh script.
> Peter and [~joewitt] from the mailing list are in favor of keeping the 
> current behavior of the file unchanged due to the fear that it might break 
> something for some people out there.
> There are a few different options I can think of on how to work around this:
>  # Have another environment variable NIFI_DISABLE_NIFIENV that basically 
> exits the nifi-env.sh script if it's set
>  # NIFI_OVERRIDE_NIFIENV which - if set to true - allows externally set 
> environment variables to override the ones in nifi-env.sh
> I'm sure there are more but those are the ones I can think of now.
> I'm in favor of option 2 as that allows me to selectively use the defaults 
> from nifi-env.sh
>  
> I can provide a patch once we've agreed on a way to go forward.
>  
> This would help me tremendously in an environment where I cannot easily alter 
> the nifi-env.sh file. This is also useful in the Docker image which currently 
> wipes out the nifi-env.sh script so its own environment variables take effect.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2985: NIFI-5573 Allow overriding of nifi-env.sh

2018-09-11 Thread pepov
Github user pepov commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2985#discussion_r216688824
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/bin/nifi-env.sh
 ---
@@ -16,16 +16,38 @@
 #limitations under the License.
 #
 
+# By default this file will unconditionally override whatever environment 
variables you have set
+# and set them to defaults defined here.
+# If you want to define your own versions outside of this script please 
set the environment variable
+# NIFI_OVERRIDE_NIFIENV to "true". That will then use whatever variables 
you used outside of
+# this script.
+
 # The java implementation to use.
 #export JAVA_HOME=/usr/java/jdk1.8.0/
 
-export NIFI_HOME=$(cd "${SCRIPT_DIR}" && cd .. && pwd)
+setOrDefault() {
+  declare envvar="$1" default="$2"
+
+  local res="$envvar"
+  if [ -z "$envvar" ] || [ "$NIFI_OVERRIDE_NIFIENV" != "true" ]
--- End diff --

Okay, thanks for checking in!


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610690#comment-16610690
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216687983
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -43,7 +43,8 @@
 .Builder().name("gcp-project-id")
 .displayName("Project ID")
 .description("Google Cloud Project ID")
-.required(true)
--- End diff --

Can you add a custom validation check to ensure this property is set in processors 
where it is required (such as the one you're proposing here)?
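A rough sketch of the kind of check meant here, using NiFi's standard customValidate() hook and assuming the descriptor constant is named PROJECT_ID and the usual java.util / org.apache.nifi.components imports (illustrative only, not code from the PR):

{code}
@Override
protected Collection<ValidationResult> customValidate(final ValidationContext context) {
    final Collection<ValidationResult> results = new ArrayList<>(super.customValidate(context));
    // Project ID is now optional in the abstract class, so require it here where it is needed.
    if (!context.getProperty(PROJECT_ID).isSet()) {
        results.add(new ValidationResult.Builder()
                .subject(PROJECT_ID.getDisplayName())
                .valid(false)
                .explanation("Project ID must be set for this processor")
                .build());
    }
    return results;
}
{code}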


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into NiFi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216687983
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -43,7 +43,8 @@
 .Builder().name("gcp-project-id")
 .displayName("Project ID")
 .description("Google Cloud Project ID")
-.required(true)
--- End diff --

Can you add a custom validate to ensure this property is set in processors 
where it is required (such as the one you're proposing here)?


---


[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610666#comment-16610666
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216683722
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610665#comment-16610665
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216683619
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216683722
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216683619
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery/PutBigQueryBatch.java
 ---
@@ -0,0 +1,269 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.gcp.bigquery;
+
+import com.google.cloud.RetryOption;
+import com.google.cloud.bigquery.BigQuery;
+import com.google.cloud.bigquery.FormatOptions;
+import com.google.cloud.bigquery.Job;
+import com.google.cloud.bigquery.JobInfo;
+import com.google.cloud.bigquery.Schema;
+import com.google.cloud.bigquery.TableDataWriteChannel;
+import com.google.cloud.bigquery.TableId;
+import com.google.cloud.bigquery.WriteChannelConfiguration;
+import com.google.common.collect.ImmutableList;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.PropertyValue;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.LogLevel;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.processors.gcp.storage.DeleteGCSObject;
+import org.apache.nifi.processors.gcp.storage.PutGCSObject;
+import org.apache.nifi.util.StringUtils;
+import org.threeten.bp.Duration;
+import org.threeten.bp.temporal.ChronoUnit;
+
+import java.nio.ByteBuffer;
+import java.nio.channels.Channels;
+import java.nio.channels.ReadableByteChannel;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+/**
+ * A processor for batch loading data into a Google BigQuery table
+ */
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@Tags({"google", "google cloud", "bq", "bigquery"})
+@CapabilityDescription("Batch loads flow files to a Google BigQuery 
table.")
+@SeeAlso({PutGCSObject.class, DeleteGCSObject.class})
+
+@WritesAttributes({
+@WritesAttribute(attribute = BigQueryAttributes.DATASET_ATTR, 
description = BigQueryAttributes.DATASET_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_NAME_ATTR, 
description = BigQueryAttributes.TABLE_NAME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.TABLE_SCHEMA_ATTR, 
description = BigQueryAttributes.TABLE_SCHEMA_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.SOURCE_TYPE_ATTR, 
description = BigQueryAttributes.SOURCE_TYPE_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.IGNORE_UNKNOWN_ATTR, description = 
BigQueryAttributes.IGNORE_UNKNOWN_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.CREATE_DISPOSITION_ATTR, description = 
BigQueryAttributes.CREATE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.WRITE_DISPOSITION_ATTR, description = 
BigQueryAttributes.WRITE_DISPOSITION_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.MAX_BADRECORDS_ATTR, description = 
BigQueryAttributes.MAX_BADRECORDS_DESC),
+@WritesAttribute(attribute = 
BigQueryAttributes.JOB_CREATE_TIME_ATTR, description = 
BigQueryAttributes.JOB_CREATE_TIME_DESC),
+@WritesAttribute(attribute = BigQueryAttributes.JOB_END_TIME_ATTR, 
description = BigQueryAttributes.JOB_END_TIME_DESC),
+@WritesAttribute(attribute = 

[jira] [Commented] (NIFI-4731) BigQuery processors

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-4731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610553#comment-16610553
 ] 

ASF GitHub Bot commented on NIFI-4731:
--

Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216662096
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -43,7 +43,8 @@
 .Builder().name("gcp-project-id")
 .displayName("Project ID")
 .description("Google Cloud Project ID")
-.required(true)
--- End diff --

OK, got it. So it's for the other JIRA. Good to mention it to be sure we 
update the JIRA when we merge this one.


> BigQuery processors
> ---
>
> Key: NIFI-4731
> URL: https://issues.apache.org/jira/browse/NIFI-4731
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Reporter: Mikhail Sosonkin
>Priority: Major
>
> NIFI should have processors for putting data into BigQuery (Streaming and 
> Batch).
> Initial working processors can be found this repository: 
> https://github.com/nologic/nifi/tree/NIFI-4731/nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/bigquery
> I'd like to get them into NiFi proper.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2682: NIFI-4731: BQ Processors and GCP library update.

2018-09-11 Thread pvillard31
Github user pvillard31 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2682#discussion_r216662096
  
--- Diff: 
nifi-nar-bundles/nifi-gcp-bundle/nifi-gcp-processors/src/main/java/org/apache/nifi/processors/gcp/AbstractGCPProcessor.java
 ---
@@ -43,7 +43,8 @@
 .Builder().name("gcp-project-id")
 .displayName("Project ID")
 .description("Google Cloud Project ID")
-.required(true)
--- End diff --

OK, got it. So it's for the other JIRA. Good to mention it to be sure we 
update the JIRA when we merge this one.


---


[jira] [Commented] (NIFI-375) New user role: Operator who can start and stop components

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610528#comment-16610528
 ] 

ASF GitHub Bot commented on NIFI-375:
-

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2990
  
@ijokarumawak Thanks for the additional commits! Will review...


> New user role: Operator who can start and stop components
> -
>
> Key: NIFI-375
> URL: https://issues.apache.org/jira/browse/NIFI-375
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Core Framework
>Reporter: Daniel Ueberfluss
>Assignee: Koji Kawamura
>Priority: Major
>
> Would like to have a user role that allows a user to stop/start processors 
> but perform no other changes to the dataflow.
> This would allow users to address simple problems without providing full 
> access to modifying a data flow. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2990: NIFI-375: Added operation policy

2018-09-11 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2990
  
@ijokarumawak Thanks for the additional commits! Will review...


---


[jira] [Commented] (NIFI-5566) Bring HashContent inline with HashService and rename legacy components

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5566?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610523#comment-16610523
 ] 

ASF GitHub Bot commented on NIFI-5566:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2983
  
There are checkstyle errors not related to my PR on this PR as well.

```bash
[INFO] --- maven-checkstyle-plugin:2.17:check (check-style) @ 
nifi-standard-processors ---
[WARNING] 
src/main/java/org/apache/nifi/processors/standard/CryptographicHashAttribute.java:[202]
 (sizes) LineLength: Line is longer than 200 characters (found 202).
[WARNING] 
src/main/java/org/apache/nifi/security/util/crypto/HashService.java:[84] 
(sizes) LineLength: Line is longer than 200 characters (found 207).
[INFO] 

```


> Bring HashContent inline with HashService and rename legacy components
> --
>
> Key: NIFI-5566
> URL: https://issues.apache.org/jira/browse/NIFI-5566
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Major
>  Labels: backwards-compatibility, hash, security
>
> As documented in [NIFI-5147|https://issues.apache.org/jira/browse/NIFI-5147] 
> and [PR 2980|https://github.com/apache/nifi/pull/2980], the {{HashAttribute}} 
> processor and {{HashContent}} processor are lacking some features, do not 
> offer consistent algorithms across platforms, etc. 
> I propose the following:
> * Rename {{HashAttribute}} (which does not provide the service of calculating 
> a hash over one or more attributes) to {{HashAttributeLegacy}}
> * Rename {{CalculateAttributeHash}} to {{HashAttribute}} to make semantic 
> sense
> * Rename {{HashContent}} to {{HashContentLegacy}} for users who need obscure 
> digest algorithms which may or may not have been offered on their platform
> * Implement a processor {{HashContent}} with similar semantics to the 
> existing processor but with consistent algorithm offerings and using the 
> common {{HashService}} offering
> With the new component versioning features provided as part of the flow 
> versioning behavior, silently disrupting existing flows which use these 
> processors is no longer a concern. Rather, any flow currently using the 
> existing processors will either:
> 1. continue normal operation
> 1. require flow manager interaction and provide documentation about the change
>   1. migration notes and upgrade instructions will be provided
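
A minimal sketch of the "consistent algorithm offerings" idea described above, 
using a fixed enum of algorithm names rather than whatever the local security 
provider happens to expose; the names are illustrative and this is not the 
actual {{HashService}} API:

{code:java}
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public final class HashExample {

    // Hypothetical, fixed list of algorithms; the real HashService list may differ.
    enum HashAlgorithm {
        MD5("MD5"), SHA_1("SHA-1"), SHA_256("SHA-256"), SHA_512("SHA-512");

        private final String jcaName;

        HashAlgorithm(String jcaName) {
            this.jcaName = jcaName;
        }
    }

    // Digest the content with an explicitly named algorithm and hex-encode it,
    // so the same flow produces the same value on every platform.
    static String hashHex(HashAlgorithm algorithm, byte[] content) throws NoSuchAlgorithmException {
        final byte[] digest = MessageDigest.getInstance(algorithm.jcaName).digest(content);
        final StringBuilder hex = new StringBuilder(digest.length * 2);
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        System.out.println(hashHex(HashAlgorithm.SHA_256, "nifi".getBytes(StandardCharsets.UTF_8)));
    }
}
{code}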



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...

2018-09-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2983
  
There are checkstyle errors not related to my PR on this PR as well.

```bash
[INFO] --- maven-checkstyle-plugin:2.17:check (check-style) @ 
nifi-standard-processors ---
[WARNING] 
src/main/java/org/apache/nifi/processors/standard/CryptographicHashAttribute.java:[202]
 (sizes) LineLength: Line is longer than 200 characters (found 202).
[WARNING] 
src/main/java/org/apache/nifi/security/util/crypto/HashService.java:[84] 
(sizes) LineLength: Line is longer than 200 characters (found 207).
[INFO] 

```


---


[jira] [Commented] (NIFI-5566) Bring HashContent inline with HashService and rename legacy components

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5566?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610478#comment-16610478
 ] 

ASF GitHub Bot commented on NIFI-5566:
--

Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2983
  
https://github.com/alopresto/nifi/pull/6


> Bring HashContent inline with HashService and rename legacy components
> --
>
> Key: NIFI-5566
> URL: https://issues.apache.org/jira/browse/NIFI-5566
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.7.1
>Reporter: Andy LoPresto
>Assignee: Andy LoPresto
>Priority: Major
>  Labels: backwards-compatibility, hash, security
>
> As documented in [NIFI-5147|https://issues.apache.org/jira/browse/NIFI-5147] 
> and [PR 2980|https://github.com/apache/nifi/pull/2980], the {{HashAttribute}} 
> processor and {{HashContent}} processor are lacking some features, do not 
> offer consistent algorithms across platforms, etc. 
> I propose the following:
> * Rename {{HashAttribute}} (which does not provide the service of calculating 
> a hash over one or more attributes) to {{HashAttributeLegacy}}
> * Rename {{CalculateAttributeHash}} to {{HashAttribute}} to make semantic 
> sense
> * Rename {{HashContent}} to {{HashContentLegacy}} for users who need obscure 
> digest algorithms which may or may not have been offered on their platform
> * Implement a processor {{HashContent}} with similar semantics to the 
> existing processor but with consistent algorithm offerings and using the 
> common {{HashService}} offering
> With the new component versioning features provided as part of the flow 
> versioning behavior, silently disrupting existing flows which use these 
> processors is no longer a concern. Rather, any flow currently using the 
> existing processors will either:
> 1. continue normal operation
> 1. require flow manager interaction and provide documentation about the change
>   1. migration notes and upgrade instructions will be provided



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi issue #2983: NIFI-5566 Improve HashContent processor and standardize Ha...

2018-09-11 Thread ottobackwards
Github user ottobackwards commented on the issue:

https://github.com/apache/nifi/pull/2983
  
https://github.com/alopresto/nifi/pull/6


---


[jira] [Commented] (NIFI-5573) Allow overriding of nifi-env.sh

2018-09-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610436#comment-16610436
 ] 

ASF GitHub Bot commented on NIFI-5573:
--

Github user lfrancke commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2985#discussion_r216626331
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/bin/nifi-env.sh
 ---
@@ -16,16 +16,38 @@
 #limitations under the License.
 #
 
+# By default this file will unconditionally override whatever environment 
variables you have set
+# and set them to defaults defined here.
+# If you want to define your own versions outside of this script please 
set the environment variable
+# NIFI_OVERRIDE_NIFIENV to "true". That will then use whatever variables 
you used outside of
+# this script.
+
 # The java implementation to use.
 #export JAVA_HOME=/usr/java/jdk1.8.0/
 
-export NIFI_HOME=$(cd "${SCRIPT_DIR}" && cd .. && pwd)
+setOrDefault() {
+  declare envvar="$1" default="$2"
+
+  local res="$envvar"
+  if [ -z "$envvar" ] || [ "$NIFI_OVERRIDE_NIFIENV" != "true" ]
--- End diff --

Sorry @pepov ! No worries at all. I'm on vacation and won't be back for 
another week. I'll get to it then. Your comments make sense. I just haven't 
gotten around to look at them in detail.


> Allow overriding of nifi-env.sh
> ---
>
> Key: NIFI-5573
> URL: https://issues.apache.org/jira/browse/NIFI-5573
> Project: Apache NiFi
>  Issue Type: Improvement
>Reporter: Lars Francke
>Assignee: Lars Francke
>Priority: Minor
>
> (as discussed in 
> https://lists.apache.org/thread.html/ddfbff7f371d47c6da013ff14e28bce3b353716653a01649a408d0ce@%3Cdev.nifi.apache.org%3E)
> Currently nifi-env.sh unconditionally sets NIFI_HOME, NIFI_PID_DIR, 
> NIFI_LOG_DIR and NIFI_ALLOW_EXPLICIT_KEYTAB so they can only be overridden by 
> changing nifi-env.sh.
> Other *-env.sh files I looked at (e.g. from Hadoop or HBase) have most/all 
> their settings commented out or only override variables if they have not 
> already been set outside of the *-env.sh script.
> Peter and [~joewitt] from the mailing list are in favor of keeping the 
> current behavior of the file unchanged due to the fear that it might break 
> something for some people out there.
> There are a few different options I can think of on how to work around this:
>  # Have another environment variable NIFI_DISABLE_NIFIENV that basically 
> exits the nifi-env.sh script if it's set
>  # NIFI_OVERRIDE_NIFIENV which - if set to true - allows externally set 
> environment variables to override the ones in nifi-env.sh
> I'm sure there are more but those are the ones I can think of now.
> I'm in favor of option 2 as that allows me to selectively use the defaults 
> from nifi-env.sh.
>  
> I can provide a patch once we've agreed on a way to go forward.
>  
> This would help me tremendously in an environment where I cannot easily alter 
> the nifi-env.sh file. This is also useful in the Docker image, which currently 
> wipes out the nifi-env.sh script so its own environment variables take effect.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] nifi pull request #2985: NIFI-5573 Allow overriding of nifi-env.sh

2018-09-11 Thread lfrancke
Github user lfrancke commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2985#discussion_r216626331
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-resources/src/main/resources/bin/nifi-env.sh
 ---
@@ -16,16 +16,38 @@
 #limitations under the License.
 #
 
+# By default this file will unconditionally override whatever environment 
variables you have set
+# and set them to defaults defined here.
+# If you want to define your own versions outside of this script please 
set the environment variable
+# NIFI_OVERRIDE_NIFIENV to "true". That will then use whatever variables 
you used outside of
+# this script.
+
 # The java implementation to use.
 #export JAVA_HOME=/usr/java/jdk1.8.0/
 
-export NIFI_HOME=$(cd "${SCRIPT_DIR}" && cd .. && pwd)
+setOrDefault() {
+  declare envvar="$1" default="$2"
+
+  local res="$envvar"
+  if [ -z "$envvar" ] || [ "$NIFI_OVERRIDE_NIFIENV" != "true" ]
--- End diff --

Sorry @pepov ! No worries at all. I'm on vacation and won't be back for 
another week. I'll get to it then. Your comments make sense. I just haven't 
gotten around to look at them in detail.


---


[jira] [Commented] (NIFI-5081) Lack of guidance and inability to deal with ISO-8601 dates

2018-09-11 Thread Matt Forrester (JIRA)


[ 
https://issues.apache.org/jira/browse/NIFI-5081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16610252#comment-16610252
 ] 

Matt Forrester commented on NIFI-5081:
--

Thanks for your response and for trying to help. It sounds like the right 
direction, but I've been looking for the API docs you mentioned for 15 minutes 
and I cannot find the examples...

I don't mean to sound grumpy, but the bug is about poor documentation. I'd help 
out if I could because I have a significant amount invested in NiFi, but I can't 
figure out how to achieve this, and having to understand "flavors" seems to 
completely miss the point of ISO-8601.

> Lack of guidance and inability to deal with ISO-8601 dates
> --
>
> Key: NIFI-5081
> URL: https://issues.apache.org/jira/browse/NIFI-5081
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Documentation & Website
>Affects Versions: 1.6.0
> Environment: Ubuntu / Chromeium
>Reporter: Matt Forrester
>Priority: Minor
> Attachments: y.xml
>
>
> I've got a Node process that outputs in JSON onto an SQS queue. The dates it 
> spits out are ISO-8601 dates within a string, which is the normal, default 
> and best way to do this in JSON.
> I tried putting them into MongoDB with PutMongo and they go in as strings, 
> which is not good ( https://issues.apache.org/jira/browse/NIFI-2079 ).
> Gave up on Mongo and tried PostgreSQL...
> Figuring I was in Java land, I used an esoteric path of GetSQS > 
> EvaluateJsonPath > UpdateAttribute [ 
> "$\{time:toDate("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'", "GMT")} ] > PutSQL to get it 
> into what I assume is a java.util.Date. It took me forever to find the 
> sql.args.N.type values required, but for some reason PutSQL does not like 
> java.util.Dates.
> Eventually found the ConvertJSONToSQL processor and this created my SQL for 
> me, but it doesn't work as it leaves ISO-8601 dates as ISO-8601 dates, which 
> don't seem to work.
> Eventually found this 
> [https://community.hortonworks.com/questions/84772/putsql-with-date-as-argument.html]
>  and now I have something working, but I'm using my esoteric GetSQS -> 
> EvaluateJsonPath -> UpdateAttribute -> PutSQL path again.
> I think there should be at least some documentation around this because it's 
> very non-obvious.
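
A minimal plain-Java sketch of the conversion the flow above performs by hand, 
parsing an ISO-8601 string into a java.sql.Timestamp; the class name and sample 
value are illustrative, and this is not NiFi Expression Language:

{code:java}
import java.sql.Timestamp;
import java.time.OffsetDateTime;

public final class Iso8601Example {

    // Parse an ISO-8601 timestamp (with offset or trailing 'Z') into a java.sql.Timestamp.
    static Timestamp toTimestamp(String iso8601) {
        return Timestamp.from(OffsetDateTime.parse(iso8601).toInstant());
    }

    public static void main(String[] args) {
        final Timestamp ts = toTimestamp("2018-04-12T10:15:30.123Z");
        System.out.println(ts.getTime()); // epoch milliseconds
    }
}
{code}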



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)