[GitHub] nifi pull request #599: NIFI-2156: Add ListDatabaseTables processor

2016-07-06 Thread mattyb149
Github user mattyb149 closed the pull request at:

https://github.com/apache/nifi/pull/599


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #613: NIFI-2157: Add GenerateTableFetch processor

2016-07-06 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/613

NIFI-2157: Add GenerateTableFetch processor

NOTE: This PR depends on #599. I put the DB-specific stuff in an interface 
and used ServiceLoader to discover the implementations, but I welcome comments 
on, for example, adding it to the DBCPService interface instead. I didn't want 
to over-engineer here, but if we end up with more processors that need a 
DB-specific interface, I wanted to make sure it was generic enough. Perhaps it 
belongs in nifi-api or something like that in that case.
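A minimal sketch of the ServiceLoader pattern described above. The DatabaseAdapter interface and its methods here are illustrative stand-ins, not NiFi's actual API:

```java
import java.util.ServiceLoader;

// Hypothetical adapter interface for DB-specific SQL dialects.
interface DatabaseAdapter {
    String getName();
    // Builds a paged SELECT for one "page" of rows in this DB's dialect.
    String getSelectStatement(String table, String columns, long limit, long offset);
}

// One concrete implementation. In a real deployment it would be registered
// in META-INF/services/DatabaseAdapter so ServiceLoader can discover it.
class GenericDatabaseAdapter implements DatabaseAdapter {
    public String getName() { return "Generic"; }
    public String getSelectStatement(String table, String columns, long limit, long offset) {
        return "SELECT " + columns + " FROM " + table + " LIMIT " + limit + " OFFSET " + offset;
    }
}

public class AdapterLookup {
    // Scans the classpath for registered adapters and picks one by name,
    // falling back to the generic dialect if none matches.
    static DatabaseAdapter find(String name) {
        for (DatabaseAdapter a : ServiceLoader.load(DatabaseAdapter.class)) {
            if (a.getName().equalsIgnoreCase(name)) {
                return a;
            }
        }
        return new GenericDatabaseAdapter();
    }

    public static void main(String[] args) {
        DatabaseAdapter adapter = find("Generic");
        // prints: SELECT * FROM users LIMIT 100 OFFSET 0
        System.out.println(adapter.getSelectStatement("users", "*", 100, 0));
    }
}
```

The advantage of this layout is that adding support for a new database is just a new jar with a service registration, with no change to the processor itself.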

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-2157

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/613.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #613


commit c33da1d7f3635225f74ae4a722fbb49440e5e72a
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-07-07T02:34:37Z

NIFI-2156: Add ListDatabaseTables processor

commit 4037dbe540df0d3f98927cd66296f1310c53b9ef
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-07-07T02:35:55Z

NIFI-2157: Add GenerateTableFetch processor






[GitHub] nifi issue #602: NIFI-2165: fix support for inserting timestamps into cassan...

2016-07-01 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/602
  
Can you put your template in a Gist? I'll give it a try too.




[GitHub] nifi issue #602: NIFI-2165: fix support for inserting timestamps into cassan...

2016-07-01 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/602
  
Were there quotes (single or double) in the real flow's value?




[GitHub] nifi issue #602: NIFI-2165: fix support for inserting timestamps into cassan...

2016-07-01 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/602
  
I cherry-picked in the commits that add unit tests, expecting them to fail 
without your fix, but they pass. Any idea how the tests differ from the error 
that spawned the Jira?




[GitHub] nifi issue #602: NIFI-2165: fix support for inserting timestamps into cassan...

2016-07-01 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/602
  
An example could be another test just like that one, with cql.args.10.value 
set to "I'm not a timestamp" or to a timestamp that isn't in ISO-8601 format, 
like "07.01.2016". I'm not saying the operation should succeed, but it should 
be handled appropriately (error logged, flow file transferred to failure, etc.)
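The defensive handling suggested above can be sketched like this. The method and relationship names are illustrative, not NiFi's actual API; the point is that a malformed value is caught and routed, not thrown:

```java
import java.time.Instant;
import java.time.format.DateTimeParseException;

// Sketch: a bad timestamp value should not blow up the processor,
// it should route the flow file to failure.
public class TimestampCheck {

    // Returns the name of the relationship the flow file would take.
    static String routeForTimestamp(String value) {
        try {
            // Expects an ISO-8601 instant, e.g. 2016-07-01T12:00:00Z
            Instant.parse(value);
            return "success";
        } catch (DateTimeParseException e) {
            // In a real processor: log the error and transfer to REL_FAILURE.
            return "failure";
        }
    }

    public static void main(String[] args) {
        System.out.println(routeForTimestamp("2016-07-01T12:00:00Z")); // success
        System.out.println(routeForTimestamp("I'm not a timestamp"));  // failure
        System.out.println(routeForTimestamp("07.01.2016"));           // failure
    }
}
```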




[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69355562
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearchHttp.java
 ---
@@ -0,0 +1,330 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.elasticsearch;
+
+import com.squareup.okhttp.MediaType;
+import com.squareup.okhttp.OkHttpClient;
+import com.squareup.okhttp.RequestBody;
+import com.squareup.okhttp.Response;
+import com.squareup.okhttp.ResponseBody;
+import org.apache.commons.io.IOUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.expression.AttributeExpression;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.ByteArrayInputStream;
+import org.apache.nifi.util.StringUtils;
+import org.codehaus.jackson.JsonNode;
+import org.codehaus.jackson.node.ArrayNode;
+
+import java.io.IOException;
+import java.net.URL;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Set;
+
+import static org.apache.commons.lang3.StringUtils.trimToEmpty;
+
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"elasticsearch", "insert", "update", "upsert", "delete", "write", 
"put", "http"})
+@CapabilityDescription("Writes the contents of a FlowFile to 
Elasticsearch, using the specified parameters such as "
++ "the index to insert into and the type of the document.")
+public class PutElasticsearchHttp extends 
AbstractElasticsearchHttpProcessor {
+
+public static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("All FlowFiles that are written to Elasticsearch 
are routed to this relationship").build();
+
+public static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("All FlowFiles that cannot be written to 
Elasticsearch are routed to this relationship").build();
+
+public static final Relationship REL_RETRY = new 
Relationship.Builder().name("retry")
+.description("A FlowFile is routed to this relationship if the 
database cannot be updated but attempting the operation again may succeed")
+.build();
+
+public static final PropertyDescriptor ID_ATTRIBUTE = new 
PropertyDescriptor.Builder()
+.name("Identifier Attribute")
+.description("The name of the FlowFile attribute containing 
the identifier for the document. If the Index Operation is \"index\", "
++ "this property may be left empty or evalua

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69353249
  

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r6935
  

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69351172
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/FetchElasticsearchHttp.java
 ---
@@ -0,0 +1,293 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.elasticsearch;
+
+import com.squareup.okhttp.HttpUrl;
+import com.squareup.okhttp.OkHttpClient;
+import com.squareup.okhttp.Response;
+import com.squareup.okhttp.ResponseBody;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.ByteArrayInputStream;
+import org.codehaus.jackson.JsonNode;
+
+import java.io.IOException;
+import java.net.MalformedURLException;
+import java.net.URL;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+
+@InputRequirement(InputRequirement.Requirement.INPUT_ALLOWED)
+@EventDriven
+@SupportsBatching
+@Tags({"elasticsearch", "fetch", "read", "get", "http"})
+@CapabilityDescription("Retrieves a document from Elasticsearch using the 
specified connection properties and the "
++ "identifier of the document to retrieve.")
+@WritesAttributes({
+@WritesAttribute(attribute = "filename", description = "The 
filename attribute is set to the document identifier"),
+@WritesAttribute(attribute = "es.index", description = "The 
Elasticsearch index containing the document"),
+@WritesAttribute(attribute = "es.type", description = "The 
Elasticsearch document type")
+})
+public class FetchElasticsearchHttp extends 
AbstractElasticsearchHttpProcessor {
+
+private static final String FIELD_INCLUDE_QUERY_PARAM = 
"_source_include";
+
+public static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("All FlowFiles that are read from Elasticsearch 
are routed to this relationship").build();
+
+public static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("All FlowFiles that cannot be read from 
Elasticsearch are routed to this relationship").build();
+
+public static final Relationship REL_RETRY = new 
Relationship.Builder().name("retry")
+.description("A FlowFile is routed to this relationship if the 
document cannot be fetched but attempting the operation again may succeed")
+.build();
+
+public static final Relationship REL_NOT_FOUND = new 
Relationship.Builder().name("not found")
+

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69346410
  

[GitHub] nifi issue #602: NIFI-2165: fix support for inserting timestamps into cassan...

2016-07-01 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/602
  
Looks good. Mind adding unit test(s) that try various good and bad values?




[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69325929
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/pom.xml
 ---
@@ -56,6 +56,11 @@ language governing permissions and limitations under the 
License. -->
 ${es.version}
 
 
+com.squareup.okhttp
+okhttp
+2.7.1
--- End diff --

Turns out the API is very similar (mostly just switched from get/set to a 
Builder), so we're good to go :)
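The get/set-to-Builder migration mentioned here is essentially the following pattern, sketched with a hypothetical Request type rather than OkHttp's real classes:

```java
// Hypothetical stand-in for the kind of API change described above:
// the older style configured requests with setters; the newer style
// produces an immutable object from a chained Builder.
public class BuilderSketch {

    static final class Request {
        final String url;
        final String method;

        private Request(Builder b) {
            this.url = b.url;
            this.method = b.method;
        }

        static final class Builder {
            private String url;
            private String method = "GET";

            Builder url(String url) { this.url = url; return this; }
            Builder method(String method) { this.method = method; return this; }
            Request build() { return new Request(this); }
        }
    }

    public static void main(String[] args) {
        // Builder style: one chained expression, immutable result.
        Request req = new Request.Builder()
                .url("http://localhost:9200/doc/1")
                .method("PUT")
                .build();
        System.out.println(req.method + " " + req.url); // prints: PUT http://localhost:9200/doc/1
    }
}
```

Since the resulting objects are immutable, most call sites only change at construction time, which is why the upgrade was low-risk.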




[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69320414
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/test/java/org/apache/nifi/processors/elasticsearch/TestFetchElasticsearchHttp.java
 ---
@@ -0,0 +1,292 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.elasticsearch;
+
+import com.squareup.okhttp.Call;
+import com.squareup.okhttp.MediaType;
+import com.squareup.okhttp.OkHttpClient;
+import com.squareup.okhttp.Protocol;
+import com.squareup.okhttp.Request;
+import com.squareup.okhttp.Response;
+import com.squareup.okhttp.ResponseBody;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+import org.mockito.invocation.InvocationOnMock;
+import org.mockito.stubbing.Answer;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.HashMap;
+
+import static org.junit.Assert.assertNotNull;
+import static org.mockito.Matchers.any;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
+
+public class TestFetchElasticsearchHttp {
+
+private InputStream docExample;
+private TestRunner runner;
+
+@Before
+public void setUp() throws IOException {
+ClassLoader classloader = 
Thread.currentThread().getContextClassLoader();
+docExample = 
classloader.getResourceAsStream("DocumentExample.json");
--- End diff --

Not necessary. But historically when I have the content in the test I get 
review comments saying to put them in a file, and vice versa :P




[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69319909
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/FetchElasticsearchHttp.java
 ---
@@ -0,0 +1,293 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.elasticsearch;
+
+import com.squareup.okhttp.HttpUrl;
+import com.squareup.okhttp.OkHttpClient;
+import com.squareup.okhttp.Response;
+import com.squareup.okhttp.ResponseBody;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.ByteArrayInputStream;
+import org.codehaus.jackson.JsonNode;
+
+import java.io.IOException;
+import java.net.MalformedURLException;
+import java.net.URL;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+
+@InputRequirement(InputRequirement.Requirement.INPUT_ALLOWED)
+@EventDriven
+@SupportsBatching
+@Tags({"elasticsearch", "fetch", "read", "get", "http"})
+@CapabilityDescription("Retrieves a document from Elasticsearch using the specified connection properties and the "
+        + "identifier of the document to retrieve.")
+@WritesAttributes({
+        @WritesAttribute(attribute = "filename", description = "The filename attribute is set to the document identifier"),
+        @WritesAttribute(attribute = "es.index", description = "The Elasticsearch index containing the document"),
+        @WritesAttribute(attribute = "es.type", description = "The Elasticsearch document type")
+})
+public class FetchElasticsearchHttp extends AbstractElasticsearchHttpProcessor {
+
+    private static final String FIELD_INCLUDE_QUERY_PARAM = "_source_include";
+
+    public static final Relationship REL_SUCCESS = new Relationship.Builder().name("success")
+            .description("All FlowFiles that are read from Elasticsearch are routed to this relationship").build();
+
+    public static final Relationship REL_FAILURE = new Relationship.Builder().name("failure")
+            .description("All FlowFiles that cannot be read from Elasticsearch are routed to this relationship").build();
+
+    public static final Relationship REL_RETRY = new Relationship.Builder().name("retry")
+            .description("A FlowFile is routed to this relationship if the document cannot be fetched but attempting the operation again may succeed")
+            .build();
+
+    public static final Relationship REL_NOT_FOUND = new Relationship.Builder().name("not found")
+

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69319637
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/FetchElasticsearchHttp.java
 ---

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69319591
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/FetchElasticsearchHttp.java
 ---

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69318800
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/AbstractElasticsearchProcessor.java
 ---
@@ -17,129 +17,25 @@
 package org.apache.nifi.processors.elasticsearch;
 
 import org.apache.nifi.components.PropertyDescriptor;
-import org.apache.nifi.components.ValidationContext;
-import org.apache.nifi.components.ValidationResult;
-import org.apache.nifi.components.Validator;
-import org.apache.nifi.logging.ComponentLog;
 import org.apache.nifi.processor.AbstractProcessor;
 import org.apache.nifi.processor.ProcessContext;
 import org.apache.nifi.processor.exception.ProcessException;
 import org.apache.nifi.processor.util.StandardValidators;
 import org.apache.nifi.ssl.SSLContextService;
-import org.apache.nifi.util.StringUtils;
-import org.elasticsearch.client.Client;
-import org.elasticsearch.client.transport.TransportClient;
-import org.elasticsearch.common.settings.Settings;
-import org.elasticsearch.common.transport.InetSocketTransportAddress;
-
-import java.io.File;
-import java.lang.reflect.Constructor;
-import java.lang.reflect.InvocationTargetException;
-import java.lang.reflect.Method;
-import java.net.InetSocketAddress;
-import java.net.MalformedURLException;
-import java.net.URL;
-import java.net.URLClassLoader;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.Collection;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-import java.util.concurrent.atomic.AtomicReference;
-
 
+/**
+ * A base class for all Elasticsearch processors
+ */
public abstract class AbstractElasticsearchProcessor extends AbstractProcessor {

-    /**
-     * This validator ensures the Elasticsearch hosts property is a valid list of hostname:port entries
-     */
-    private static final Validator HOSTNAME_PORT_VALIDATOR = new Validator() {
-        @Override
-        public ValidationResult validate(final String subject, final String input, final ValidationContext context) {
-            final List<String> esList = Arrays.asList(input.split(","));
-            for (String hostnamePort : esList) {
-                String[] addresses = hostnamePort.split(":");
-                // Protect against invalid input like http://127.0.0.1:9300 (URL scheme should not be there)
-                if (addresses.length != 2) {
-                    return new ValidationResult.Builder().subject(subject).input(input).explanation(
-                            "Must be in hostname:port form (no scheme such as http://)").valid(false).build();
-                }
-            }
-            return new ValidationResult.Builder().subject(subject).input(input).explanation(
-                    "Valid cluster definition").valid(true).build();
-        }
-    };
-
-    protected static final PropertyDescriptor CLUSTER_NAME = new PropertyDescriptor.Builder()
-            .name("Cluster Name")
-            .description("Name of the ES cluster (for example, elasticsearch_brew). Defaults to 'elasticsearch'")
-            .required(true)
-            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
-            .defaultValue("elasticsearch")
-            .build();
-
-    protected static final PropertyDescriptor HOSTS = new PropertyDescriptor.Builder()
-            .name("ElasticSearch Hosts")
-            .description("ElasticSearch Hosts, which should be comma separated and colon for hostname/port "
-                    + "host1:port,host2:port,  For example testcluster:9300.")
-            .required(true)
-            .expressionLanguageSupported(false)
-            .addValidator(HOSTNAME_PORT_VALIDATOR)
-            .build();
-
    public static final PropertyDescriptor PROP_SSL_CONTEXT_SERVICE = new PropertyDescriptor.Builder()
            .name("SSL Context Service")
            .description("The SSL Context Service used to provide client certificate information for TLS/SSL "
-                    + "connections. This service only applies if the Shield plugin is available.")
+                    + "connections. This service only applies if the Elasticsearch endpoints have been protected by SSL.")
--- End diff --

I've already changed that locally but didn't push since I knew there'd be 
more comments forthcoming :)
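As an aside, the hostname:port rule enforced by the removed HOSTNAME_PORT_VALIDATOR quoted above boils down to a simple check. A standalone sketch of that check (the real validator returns a ValidationResult rather than a boolean, and the class name here is invented):

```java
public class HostListCheck {
    // Re-implementation of the hostname:port check for illustration only.
    static boolean isValidHostList(String input) {
        for (String hostnamePort : input.split(",")) {
            // A URL scheme like http:// introduces an extra ':' segment,
            // so anything other than exactly two segments is rejected
            if (hostnamePort.trim().split(":").length != 2) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isValidHostList("host1:9300,host2:9300")); // prints "true"
        System.out.println(isValidHostList("http://127.0.0.1:9300")); // prints "false"
    }
}
```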



[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69318859
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/FetchElasticsearchHttp.java
 ---

[GitHub] nifi pull request #576: NIFI-2068: Add Elasticsearch HTTP processors

2016-07-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/576#discussion_r69318666
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/pom.xml
 ---
@@ -56,6 +56,11 @@ language governing permissions and limitations under the 
License. -->
 ${es.version}
 
 
+com.squareup.okhttp
+okhttp
+2.7.1
--- End diff --

Mostly to be able to borrow code from InvokeHttp, and in the hopes that we 
might have a single "common" NAR with oft-used libraries someday, rather than 
multiple versions of OkHttp (etc.) in various NARs. I will update the version 
(and the code that uses the API) to latest




[GitHub] nifi issue #595: pull new code

2016-06-30 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/595
  
This PR appears to be a collection of merge commits; were you trying to issue a PR with new code against the master branch? If so, can you point me at the commit? I can help get the PR into the correct form. Thanks!




[GitHub] nifi issue #500: NIFI 1922

2016-06-27 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/500
  
Also it looks like you've got a merge commit in your chain here. Can you start with a fresh master branch, apply your patch, then push the pull request? If you have two commits, perhaps you could squash them after applying, before pushing the branch/PR. Please let me know if you have any questions; I can help with these operations if need be :)




[GitHub] nifi issue #500: NIFI 1922

2016-06-27 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/500
  
The file "NIFI-1922-patch" is included in this commit and should not be; only the changes it implies (which have already been applied) belong. Can you remove this file from the commit? Please and thanks!




[GitHub] nifi issue #558: NIFI-2065: When a provenance query matches the max number o...

2016-06-22 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/558
  
+1 LGTM, tested with plenty of provenance events, verified the "Skipping 
search" entry appeared in the log.




[GitHub] nifi issue #550: NIFI-1900: Verify that connection's destination is not runn...

2016-06-22 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/550
  
+1 LGTM, reproduced the error then applied the patch and retested, verified 
that a connection cannot be moved if its destination has active threads (even 
if stopped).




[GitHub] nifi pull request #477: NIFI-1663: Add ConvertAvroToORC processor

2016-06-22 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/477#discussion_r68073235
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/processors/hive/ConvertAvroToORC.java
 ---
@@ -0,0 +1,303 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.hive;
+
+import org.apache.avro.Schema;
+import org.apache.avro.file.DataFileStream;
+import org.apache.avro.generic.GenericDatumReader;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.commons.lang3.mutable.MutableInt;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.StreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.util.hive.HiveJdbcCommon;
+import org.apache.nifi.util.orc.OrcFlowFileWriter;
+import org.apache.nifi.util.orc.OrcUtils;
+import org.apache.orc.CompressionKind;
+import org.apache.orc.OrcFile;
+import org.apache.orc.TypeDescription;
+
+import java.io.BufferedInputStream;
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.concurrent.atomic.AtomicReference;
+
+/**
+ * The ConvertAvroToORC processor takes an Avro-formatted flow file as input and converts it into ORC format.
+ */
+@SideEffectFree
+@SupportsBatching
+@Tags({"avro", "orc", "hive", "convert"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Converts an Avro record into ORC file format. This processor provides a direct mapping of an Avro record to an ORC record, such "
+        + "that the resulting ORC file will have the same hierarchical structure as the Avro document. If an incoming FlowFile contains a stream of "
+        + "multiple Avro records, the resultant FlowFile will contain an ORC file containing all of the Avro records.  If an incoming FlowFile does "
+        + "not contain any records, an empty ORC file is the output.")
+@WritesAttributes({
+        @WritesAttribute(attribute = "mime.type", description = "Sets the mime type to application/octet-stream"),
+        @WritesAttribute(attribute = "filename", description = "Sets the filename to the existing filename with the extension replaced by / added to by .orc"),
+        @WritesAttribute(attribute = "record.count", description = "Sets the number of records in the ORC file."),
+        @WritesAttribute(attribute = "hive.ddl", description =

[GitHub] nifi pull request #477: NIFI-1663: Add ConvertAvroToORC processor

2016-06-22 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/477#discussion_r68072879
  
--- Diff: 
nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/processors/hive/ConvertAvroToORC.java
 ---
@@ -0,0 +1,303 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.hive;
+
+import org.apache.avro.Schema;
+import org.apache.avro.file.DataFileStream;
+import org.apache.avro.generic.GenericDatumReader;
+import org.apache.avro.generic.GenericRecord;
+import org.apache.commons.lang3.mutable.MutableInt;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.StreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.util.hive.HiveJdbcCommon;
+import org.apache.nifi.util.orc.OrcFlowFileWriter;
+import org.apache.nifi.util.orc.OrcUtils;
+import org.apache.orc.CompressionKind;
+import org.apache.orc.OrcFile;
+import org.apache.orc.TypeDescription;
+
+import java.io.BufferedInputStream;
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.concurrent.atomic.AtomicReference;
+
+/**
+ * The ConvertAvroToORC processor takes an Avro-formatted flow file as input and converts it into ORC format.
+ */
+@SideEffectFree
+@SupportsBatching
+@Tags({"avro", "orc", "hive", "convert"})
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Converts an Avro record into ORC file format. This processor provides a direct mapping of an Avro record to an ORC record, such "
++ "that the resulting ORC file will have the same hierarchical structure as the Avro document. If an incoming FlowFile contains a stream of "
++ "multiple Avro records, the resultant FlowFile will contain an ORC file containing all of the Avro records.  If an incoming FlowFile does "
++ "not contain any records, an empty ORC file is the output.")
+@WritesAttributes({
+@WritesAttribute(attribute = "mime.type", description = "Sets the mime type to application/octet-stream"),
+@WritesAttribute(attribute = "filename", description = "Sets the filename to the existing filename with the extension replaced by / added to by .orc"),
+@WritesAttribute(attribute = "record.count", description = "Sets the number of records in the ORC file."),
+@WritesAttribute(attribute = "hive.ddl", description =

[GitHub] nifi pull request #477: NIFI-1663: Add ConvertAvroToORC processor

2016-06-22 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/477#discussion_r68072906
  
--- Diff: nifi-nar-bundles/nifi-hive-bundle/nifi-hive-processors/src/main/java/org/apache/nifi/processors/hive/ConvertAvroToORC.java ---

[GitHub] nifi pull request #534: Fix for NIFI-1838 & NIFI-1152

2016-06-21 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/534#discussion_r67970007
  
--- Diff: 
nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/InvokeScriptedProcessor.java
 ---
@@ -92,11 +92,10 @@
 logger.error(message, t);
 }
 }
-} else {
-// Return defaults for now
-relationships.add(REL_SUCCESS);
-relationships.add(REL_FAILURE);
 }
+// Add defaults
+relationships.add(REL_SUCCESS);
+relationships.add(REL_FAILURE);
--- End diff --

Yeah, if this is failing tests, I'd say take a look at the tests. We don't want to add default relationships, especially when there's a legit instance of a Processor, as it is the Processor's responsibility to define all relationships.
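The principle above — the processor declares its relationships, and the wrapper adds no success/failure fallback on its behalf — can be sketched in a framework-free way (this is an illustrative model, not the actual NiFi `InvokeScriptedProcessor` API):

```java
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

// Simplified model of a scripted-processor wrapper: it exposes only the
// relationships the script itself declares. No defaults are injected, so an
// empty set surfaces as "the script has not defined any relationships yet"
// instead of being silently papered over with success/failure.
class ScriptedWrapper {
    private final Set<String> relationships = new HashSet<>();

    // Called when the script (re)loads and reports what it declared.
    void onScriptReloaded(Set<String> declared) {
        relationships.clear();
        relationships.addAll(declared);
    }

    Set<String> getRelationships() {
        return Collections.unmodifiableSet(relationships);
    }
}
```

With this shape, tests that expect default relationships before a valid script is loaded would rightly fail, which matches the review comment.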


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #248: NIFI-1568: Add Filter Capability to UnpackContent

2016-06-18 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/248#discussion_r67605391
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/UnpackContent.java ---
@@ -154,75 +171,88 @@ protected void init(final ProcessorInitializationContext context) {
 return properties;
 }
 
-@Override
-public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
-FlowFile flowFile = session.get();
-if (flowFile == null) {
-return;
-}
+@OnStopped
+public void onStopped() {
+unpacker = null;
+fileFilter = null;
+}

-final ComponentLog logger = getLogger();
-String packagingFormat = context.getProperty(PACKAGING_FORMAT).getValue().toLowerCase();
-if (AUTO_DETECT_FORMAT.equals(packagingFormat)) {
-final String mimeType = flowFile.getAttribute(CoreAttributes.MIME_TYPE.key());
-if (mimeType == null) {
-logger.error("No mime.type attribute set for {}; routing to failure", new Object[]{flowFile});
-session.transfer(flowFile, REL_FAILURE);
-return;
-}
+@OnScheduled
+public void onScheduled(ProcessContext context) throws ProcessException {
+if (fileFilter == null) {
+fileFilter = Pattern.compile(context.getProperty(FILE_FILTER).getValue());
+tarUnpacker = new TarUnpacker(fileFilter);
+zipUnpacker = new ZipUnpacker(fileFilter);
+flowFileStreamV3Unpacker = new FlowFileStreamUnpacker(new FlowFileUnpackagerV3());
+flowFileStreamV2Unpacker = new FlowFileStreamUnpacker(new FlowFileUnpackagerV2());
+flowFileTarUnpacker = new FlowFileStreamUnpacker(new FlowFileUnpackagerV1());
+}

-switch (mimeType.toLowerCase()) {
-case "application/tar":
-packagingFormat = TAR_FORMAT;
-break;
-case "application/x-tar":
-packagingFormat = TAR_FORMAT;
-break;
-case "application/zip":
-packagingFormat = ZIP_FORMAT;
-break;
-case "application/flowfile-v3":
-packagingFormat = FLOWFILE_STREAM_FORMAT_V3;
-break;
-case "application/flowfile-v2":
-packagingFormat = FLOWFILE_STREAM_FORMAT_V2;
-break;
-case "application/flowfile-v1":
-packagingFormat = FLOWFILE_TAR_FORMAT;
-break;
-default: {
-logger.info("Cannot unpack {} because its mime.type attribute is set to '{}', which is not a format that can be unpacked; routing to 'success'", new Object[]{flowFile, mimeType});
-session.transfer(flowFile, REL_SUCCESS);
-return;
-}
-}
+PackageFormat format = PackageFormat.getFormat(context.getProperty(PACKAGING_FORMAT).getValue());
+if (format != PackageFormat.AUTO_DETECT_FORMAT && unpacker == null) {
+initUnpacker(format);
 }
+}

-final Unpacker unpacker;
-final boolean addFragmentAttrs;
+public void initUnpacker(PackageFormat packagingFormat) {
 switch (packagingFormat) {
 case TAR_FORMAT:
-unpacker = new TarUnpacker();
+case X_TAR_FORMAT:
+unpacker = tarUnpacker;
 addFragmentAttrs = true;
 break;
 case ZIP_FORMAT:
-unpacker = new ZipUnpacker();
+unpacker = zipUnpacker;
 addFragmentAttrs = true;
 break;
 case FLOWFILE_STREAM_FORMAT_V2:
-unpacker = new FlowFileStreamUnpacker(new FlowFileUnpackagerV2());
+unpacker = flowFileStreamV2Unpacker;
 addFragmentAttrs = false;
 break;
 case FLOWFILE_STREAM_FORMAT_V3:
-unpacker = new FlowFileStreamUnpacker(new FlowFileUnpackagerV3());
+unpacker = flowFileStreamV3Unpacker;
 addFragmentAttrs = false;
 break;
 case FLOWFILE_TAR_FORMAT:
-unpacker 
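For reference, the mime.type auto-detection in the old onTrigger above is a lookup from content type to packaging format. A minimal, framework-free sketch of that mapping (the format names are illustrative placeholders, not the actual NiFi PackageFormat constants):

```java
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;

// Illustrative sketch of UnpackContent's mime.type auto-detection: map the
// known content types to a packaging format name, case-insensitively.
final class MimeFormats {
    private static final Map<String, String> BY_MIME = new HashMap<>();
    static {
        BY_MIME.put("application/tar", "TAR");
        BY_MIME.put("application/x-tar", "TAR");
        BY_MIME.put("application/zip", "ZIP");
        BY_MIME.put("application/flowfile-v3", "FLOWFILE_STREAM_V3");
        BY_MIME.put("application/flowfile-v2", "FLOWFILE_STREAM_V2");
        BY_MIME.put("application/flowfile-v1", "FLOWFILE_TAR");
    }

    // Returns null for an unknown or missing mime type; the processor then
    // routes the FlowFile onward unchanged, mirroring the default branch.
    static String formatFor(String mimeType) {
        if (mimeType == null) {
            return null;
        }
        return BY_MIME.get(mimeType.toLowerCase(Locale.ROOT));
    }
}
```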

[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-17 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67562782
  
--- Diff: nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java ---
@@ -0,0 +1,245 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+public static final PropertyDescriptor WEBHOOK_URL = new PropertyDescriptor
+.Builder()
+.name("webhook-url")
+.displayName("Webhook URL")
+.description("The POST URL provided by Slack to send messages into a channel.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.sensitive(true)
+.build();

+public static final PropertyDescriptor WEBHOOK_TEXT = new PropertyDescriptor
+.Builder()
+.name("webhook-text")
+.displayName("Webhook Text")
+.description("The text sent in the webhook message")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();

+public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+.Builder()
+.name("channel")
+.displayName("Channel")
+.description("A public channel using #channel or direct message using @username. If not specified, " +
+"the default webhook channel as specified in Slack's Incoming Webhooks web interface is used.")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();

+public static final PropertyDescriptor USERNAME = new PropertyDescriptor
+.Builder()
+.name("username")
+.displayName("Username")
+.description("The displayed Slack username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();

[GitHub] nifi issue #255: NIFI-1594: Add option to bulk using Index or Update.

2016-06-17 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/255
  
+1 LGTM. Built and ran tests, and also ran a few sample documents through the flow to exercise the index, update, and upsert logic. Thanks for the contribution! Will merge to 0.x and master.




[GitHub] nifi pull request #397: NIFI-1815

2016-06-16 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/397#discussion_r67451461
  
--- Diff: nifi-nar-bundles/nifi-ocr-bundle/nifi-ocr-processors/src/test/resources/tessdata/tessconfigs/nobatch ---
@@ -0,0 +1 @@
+
--- End diff --

Is this supposed to be an empty file or one that was created (which is probably better located in target/)?




[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-16 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67392432
  
--- Diff: nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java ---

[GitHub] nifi issue #256: NIFI-1578: Create PutSlack processor

2016-06-15 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/256
  
+1 LGTM. Can you squash your commits into 1? It should make it easier to rebase (you will need to change the version to 0.7.0-SNAPSHOT), unless after you squash you want to issue another PR against 0.x. If not, I can do the merge to both branches and will of course maintain your ID as the author. Thanks again for the sweet contribution!




[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67281209
  
--- Diff: nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java ---
@@ -0,0 +1,243 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+public static final PropertyDescriptor WEBHOOK_URL = new PropertyDescriptor
+.Builder()
+.name("webhook-url")
+.displayName("Webhook URL")
+.description("The POST URL provided by Slack to send messages into a channel.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.sensitive(true)
+.build();

+public static final PropertyDescriptor WEBHOOK_TEXT = new PropertyDescriptor
+.Builder()
+.name("webhook-text")
+.displayName("Webhook Text")
+.description("The text sent in the webhook message")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();

+public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+.Builder()
+.name("channel")
+.displayName("Channel")
+.description("A public channel using #channel or direct message using @username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();

+public static final PropertyDescriptor USERNAME = new PropertyDescriptor
+.Builder()
+.name("username")
+.displayName("Username")
+.description("The displayed Slack username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();

+public static final PropertyDescriptor ICON_URL = new PropertyDescriptor
+.Builder()
+.name("icon-url")
+

[GitHub] nifi pull request #255: NIFI-1594: Add option to bulk using Index or Update.

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/255#discussion_r67277159
  
--- Diff: nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearch.java ---
@@ -178,8 +190,20 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 public void process(final InputStream in) throws IOException {
 String json = IOUtils.toString(in, charset)
 .replace("\r\n", " ").replace('\n', ' ').replace('\r', ' ');
-bulk.add(esClient.get().prepareIndex(index, docType, id)
-.setSource(json.getBytes(charset)));
+
+if (indexOp.equalsIgnoreCase("index")) {
+bulk.add(esClient.get().prepareIndex(index, docType, id)
+.setSource(json.getBytes(charset)));
+} else if (indexOp.equalsIgnoreCase("upsert")) {
--- End diff --

I haven't seen a difference in behavior between index and upsert; I tried setting the document identifier attribute to a constant number. What should I do to try upsert or update?
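One reason index and upsert can look identical in such a test: when every request carries the full document under a fixed id, a full replace and an insert-or-merge produce the same stored document. A toy model of the two semantics (this is not the Elasticsearch API, just an illustration of the difference):

```java
import java.util.HashMap;
import java.util.Map;

// Toy document store contrasting "index" and "upsert" semantics: index
// replaces the stored document wholesale, while upsert inserts when absent
// and otherwise merges the partial document into what is already stored.
class ToyDocStore {
    private final Map<String, Map<String, Object>> docs = new HashMap<>();

    void index(String id, Map<String, Object> doc) {
        docs.put(id, new HashMap<>(doc)); // full replace
    }

    void upsert(String id, Map<String, Object> partial) {
        // insert-or-merge: keep existing fields not present in the partial doc
        docs.computeIfAbsent(id, k -> new HashMap<>()).putAll(partial);
    }

    Map<String, Object> get(String id) {
        return docs.get(id);
    }
}
```

The difference only shows with partial documents: upserting a document containing just one new field keeps the fields already stored, while re-indexing drops them.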




[GitHub] nifi pull request #248: NIFI-1568: Add Filter Capability to UnpackContent

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/248#discussion_r67246009
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/UnpackContent.java ---

[GitHub] nifi pull request #458: NIFI-1829 - Create new DebugFlow processor.

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/458#discussion_r67230976
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DebugFlow.java ---
@@ -0,0 +1,499 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.http.annotation.ThreadSafe;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+
+import java.lang.reflect.InvocationTargetException;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+@ThreadSafe()
+@EventDriven()
+@Tags({"test", "debug", "processor", "utility", "flow", "FlowFile"})
+@CapabilityDescription("The DebugFlow processor aids testing and debugging 
the FlowFile framework by allowing various "
++ "responses to be explicitly triggered in response to the receipt 
of a FlowFile or a timer event without a "
++ "FlowFile if using timer or cron based scheduling.  It can force 
responses needed to exercise or test "
++ "various failure modes that can occur when a processor runs.\n"
++ "\n"
++ "When triggered, the processor loops through the appropriate 
response list (based on whether or not it "
++ "received a FlowFile).  A response is produced the configured 
number of times for each pass through its"
++ "response list, as long as the processor is running.\n"
++ "\n"
++ "Triggered by a FlowFile, the processor can produce the 
following responses."
++ "  1. transfer FlowFile to success relationship.\n"
++ "  2. transfer FlowFile to failure relationship.\n"
++ "  3. rollback the FlowFile without penalty.\n"
++ "  4. rollback the FlowFile and yield the context.\n"
++ "  5. rollback the FlowFile with penalty.\n"
++ "  6. throw an exception.\n"
++ "\n"
++ "Triggered without a FlowFile, the processor can produce the 
following responses."
++ "  1. do nothing and return.\n"
++ "  2. throw an exception.\n"
++ "  3. yield the context.\n")
--- End diff --

I wonder if there's an abbreviated version of the doc for the capability 
description; the rest could be moved out to an additional details page 


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #475: NIFI-2026 - Add Maven profile to compile nifi-hadoop-librar...

2016-06-15 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/475
  
The arg part is useful for more than just this case; perhaps it could be 
set to a more generic property such as "nifi.additional.java.args" or 
something, and done as a separate improvement.

Perhaps the addition of repositories to anywhere in NiFi should be discussed 
on the dev mailing list so everyone can weigh in (as they may not 
be watching this PR). The alternative is to add the repositories to a profile 
in your own settings.xml (where you have your own profiles with the java.arg 
stuff and hadoop.version set appropriately).


---


[GitHub] nifi issue #475: NIFI-2026 - Add Maven profile to compile nifi-hadoop-librar...

2016-06-15 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/475
  
The more I think about this, especially with the inclusion of <repository> 
elements for specific vendors, I tend to agree with the settings.xml approach. 
It should be easy to describe in a blog post or email to the dev list on how to 
do this. I agree that messing with settings.xml is not always a good thing, but 
it supports profiles, repositories, properties, etc. that would make it fairly 
easy for someone to add to their settings.xml without putting vendor-specific 
stuff in Apache NiFi proper.
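
As a concrete sketch of that approach — the profile id, repository URL, and 
version value below are illustrative placeholders, not anything from the NiFi 
build:

```xml
<!-- ~/.m2/settings.xml (sketch): a local profile supplying a vendor repo
     and hadoop.version without touching the Apache NiFi poms.
     All ids, URLs, and versions here are made-up placeholders. -->
<settings>
  <profiles>
    <profile>
      <id>vendor-hadoop</id>
      <properties>
        <hadoop.version>2.7.0.vendor-1</hadoop.version>
      </properties>
      <repositories>
        <repository>
          <id>vendor-releases</id>
          <url>https://repo.example.com/releases</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <!-- or activate per build with: mvn -Pvendor-hadoop clean install -->
  <activeProfiles>
    <activeProfile>vendor-hadoop</activeProfile>
  </activeProfiles>
</settings>
```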


---


[GitHub] nifi pull request #475: NIFI-2026 - Add Maven profile to compile nifi-hadoop...

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/475#discussion_r67174016
  
--- Diff: 
nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml 
---
@@ -21,19 +21,70 @@
 <maven.javadoc.skip>true</maven.javadoc.skip>
 <source.skip>true</source.skip>
 
-
-
+<profiles>
+    <profile>
+        <id>vanilla</id>
+        <activation>
+            <activeByDefault>true</activeByDefault>
+        </activation>
+        <dependencies>
+            <dependency>
+                <groupId>org.apache.nifi</groupId>
+                <artifactId>nifi-standard-services-api-nar</artifactId>
+                <type>nar</type>
+            </dependency>
+            <dependency>
+                <groupId>org.apache.hadoop</groupId>
+                <artifactId>hadoop-client</artifactId>
+            </dependency>
+            <dependency>
+                <groupId>org.apache.avro</groupId>
+                <artifactId>avro</artifactId>
+            </dependency>
+        </dependencies>
+    </profile>
+    <profile>
+        <id>custom_hadoop_libraries</id>
--- End diff --

This profile should probably be up at a higher-level, for example 
hadoop-utils cannot build when I set hadoop.version to a vendor-specific value
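
Hoisting it might look like the following sketch in the root pom, so that 
hadoop.version is overridable for every module, including hadoop-utils (the 
profile id is kept from the PR; the default version shown is an assumption):

```xml
<!-- Root pom.xml (sketch): declaring the property at the top level lets a
     -Dhadoop.version=... override, or a profile, reach all modules alike.
     The default value below is illustrative. -->
<properties>
  <hadoop.version>2.6.2</hadoop.version>
</properties>
<profiles>
  <profile>
    <id>custom_hadoop_libraries</id>
    <!-- vendor-specific repositories / extra dependencies would go here -->
  </profile>
</profiles>
```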


---


[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67162014
  
--- Diff: 
nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java
 ---
@@ -0,0 +1,243 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+public static final PropertyDescriptor WEBHOOK_URL = new 
PropertyDescriptor
+.Builder()
+.name("webhook-url")
+.displayName("Webhook URL")
+.description("The POST URL provided by Slack to send messages 
into a channel.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.sensitive(true)
+.build();
+
+public static final PropertyDescriptor WEBHOOK_TEXT = new 
PropertyDescriptor
+.Builder()
+.name("webhook-text")
+.displayName("Webhook Text")
+.description("The text sent in the webhook message")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+.Builder()
+.name("channel")
+.displayName("Channel")
+.description("A public channel using #channel or direct 
message using @username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor
+.Builder()
+.name("username")
+.displayName("Username")
+.description("The displayed Slack username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor ICON_URL = new 
PropertyDescriptor
+.Builder()
+.name("icon-url")
+

[GitHub] nifi issue #477: NIFI-1663: Add ConvertAvroToORC processor

2016-06-15 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/477
  
Just updated the POM to use ORC 1.1.1 (and changed the hard-coded version 
to a property)


---


[GitHub] nifi pull request #475: NIFI-2026 - Add Maven profile to compile nifi-hadoop...

2016-06-15 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/475#discussion_r67157439
  
--- Diff: nifi-assembly/pom.xml ---
@@ -470,6 +470,15 @@ language governing permissions and limitations under 
the License. -->
 
 
 
+            <id>mapr</id>
--- End diff --

Seems like with the addition of the custom_hadoop_libraries profile and the 
nifi.hadoop.distro.params, we don't need this explicit profile anymore (or it 
should be renamed to match the one in the NAR)? I like the generic 
profile/options approach versus anything vendor-specific.


---


[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-14 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67039275
  
--- Diff: 
nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java
 ---
@@ -0,0 +1,238 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+public static final PropertyDescriptor WEBHOOK_URL = new 
PropertyDescriptor
+.Builder()
+.name("webhook-url")
+.displayName("Webhook URL")
+.description("The POST URL provided by Slack to send messages 
into a channel.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.sensitive(true)
+.build();
+
+public static final PropertyDescriptor WEBHOOK_TEXT = new 
PropertyDescriptor
+.Builder()
+.name("webhook-text")
+.displayName("Webhook Text")
+.description("The text sent in the webhook message")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+.Builder()
+.name("channel")
+.displayName("Channel")
+.description("A public channel using #channel or direct 
message using @username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor
+.Builder()
+.name("username")
+.displayName("Username")
+.description("The displayed Slack username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor ICON_URL = new 
PropertyDe

[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-14 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67039181
  
--- Diff: 
nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java
 ---
@@ -0,0 +1,238 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+public static final PropertyDescriptor WEBHOOK_URL = new 
PropertyDescriptor
+.Builder()
+.name("webhook-url")
+.displayName("Webhook URL")
+.description("The POST URL provided by Slack to send messages 
into a channel.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.sensitive(true)
+.build();
+
+public static final PropertyDescriptor WEBHOOK_TEXT = new 
PropertyDescriptor
+.Builder()
+.name("webhook-text")
+.displayName("Webhook Text")
+.description("The text sent in the webhook message")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+.Builder()
+.name("channel")
+.displayName("Channel")
+.description("A public channel using #channel or direct 
message using @username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor
+.Builder()
+.name("username")
+.displayName("Username")
+.description("The displayed Slack username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor ICON_URL = new 
PropertyDe

[GitHub] nifi pull request #248: NIFI-1568: Add Filter Capability to UnpackContent

2016-06-14 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/248#discussion_r67017365
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/UnpackContent.java
 ---
@@ -154,75 +171,88 @@ protected void init(final 
ProcessorInitializationContext context) {
 return properties;
 }
 
-@Override
-public void onTrigger(final ProcessContext context, final 
ProcessSession session) throws ProcessException {
-FlowFile flowFile = session.get();
-if (flowFile == null) {
-return;
-}
+@OnStopped
+public void onStopped() {
+unpacker = null;
+fileFilter = null;
+}
 
-final ComponentLog logger = getLogger();
-String packagingFormat = 
context.getProperty(PACKAGING_FORMAT).getValue().toLowerCase();
-if (AUTO_DETECT_FORMAT.equals(packagingFormat)) {
-final String mimeType = 
flowFile.getAttribute(CoreAttributes.MIME_TYPE.key());
-if (mimeType == null) {
-logger.error("No mime.type attribute set for {}; routing 
to failure", new Object[]{flowFile});
-session.transfer(flowFile, REL_FAILURE);
-return;
-}
+@OnScheduled
+public void onScheduled(ProcessContext context) throws 
ProcessException {
+if (fileFilter == null) {
+fileFilter = 
Pattern.compile(context.getProperty(FILE_FILTER).getValue());
+tarUnpacker = new TarUnpacker(fileFilter);
+zipUnpacker = new ZipUnpacker(fileFilter);
+flowFileStreamV3Unpacker = new FlowFileStreamUnpacker(new 
FlowFileUnpackagerV3());
+flowFileStreamV2Unpacker = new FlowFileStreamUnpacker(new 
FlowFileUnpackagerV2());
+flowFileTarUnpacker = new FlowFileStreamUnpacker(new 
FlowFileUnpackagerV1());
+}
 
-switch (mimeType.toLowerCase()) {
-case "application/tar":
-packagingFormat = TAR_FORMAT;
-break;
-case "application/x-tar":
-packagingFormat = TAR_FORMAT;
-break;
-case "application/zip":
-packagingFormat = ZIP_FORMAT;
-break;
-case "application/flowfile-v3":
-packagingFormat = FLOWFILE_STREAM_FORMAT_V3;
-break;
-case "application/flowfile-v2":
-packagingFormat = FLOWFILE_STREAM_FORMAT_V2;
-break;
-case "application/flowfile-v1":
-packagingFormat = FLOWFILE_TAR_FORMAT;
-break;
-default: {
-logger.info("Cannot unpack {} because its mime.type 
attribute is set to '{}', which is not a format that can be unpacked; routing 
to 'success'", new Object[]{flowFile, mimeType});
-session.transfer(flowFile, REL_SUCCESS);
-return;
-}
-}
+PackageFormat format = 
PackageFormat.getFormat(context.getProperty(PACKAGING_FORMAT).getValue());
+if (format != PackageFormat.AUTO_DETECT_FORMAT && unpacker == 
null) {
+initUnpacker(format);
 }
+}
 
-final Unpacker unpacker;
-final boolean addFragmentAttrs;
+public void initUnpacker(PackageFormat packagingFormat) {
 switch (packagingFormat) {
 case TAR_FORMAT:
-unpacker = new TarUnpacker();
+case X_TAR_FORMAT:
+unpacker = tarUnpacker;
 addFragmentAttrs = true;
 break;
 case ZIP_FORMAT:
-unpacker = new ZipUnpacker();
+unpacker = zipUnpacker;
 addFragmentAttrs = true;
 break;
 case FLOWFILE_STREAM_FORMAT_V2:
-unpacker = new FlowFileStreamUnpacker(new 
FlowFileUnpackagerV2());
+unpacker = flowFileStreamV2Unpacker;
 addFragmentAttrs = false;
 break;
 case FLOWFILE_STREAM_FORMAT_V3:
-unpacker = new FlowFileStreamUnpacker(new 
FlowFileUnpackagerV3());
+unpacker = flowFileStreamV3Unpacker;
 addFragmentAttrs = false;
 break;
 case FLOWFILE_TAR_FORMAT:
-unpacker 

[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-14 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67014112
  
--- Diff: 
nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java
 ---
@@ -0,0 +1,238 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+public static final PropertyDescriptor WEBHOOK_URL = new 
PropertyDescriptor
+.Builder()
+.name("webhook-url")
+.displayName("Webhook URL")
+.description("The POST URL provided by Slack to send messages 
into a channel.")
+.required(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.addValidator(StandardValidators.URL_VALIDATOR)
+.sensitive(true)
+.build();
+
+public static final PropertyDescriptor WEBHOOK_TEXT = new 
PropertyDescriptor
+.Builder()
+.name("webhook-text")
+.displayName("Webhook Text")
+.description("The text sent in the webhook message")
+.required(true)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+.Builder()
+.name("channel")
+.displayName("Channel")
+.description("A public channel using #channel or direct 
message using @username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor USERNAME = new 
PropertyDescriptor
+.Builder()
+.name("username")
+.displayName("Username")
+.description("The displayed Slack username")
+.required(false)
+.expressionLanguageSupported(true)
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+public static final PropertyDescriptor ICON_URL = new 
PropertyDe

[GitHub] nifi pull request #256: NIFI-1578: Create PutSlack processor

2016-06-14 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/256#discussion_r67014211
  
--- Diff: 
nifi-nar-bundles/nifi-slack-bundle/nifi-slack-processors/src/main/java/org/apache/nifi/processors/slack/PutSlack.java
 ---
@@ -0,0 +1,238 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.slack;
+
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.DataOutputStream;
+
+import javax.json.Json;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+import javax.json.JsonWriter;
+import java.io.IOException;
+import java.io.StringWriter;
+import java.net.HttpURLConnection;
+import java.net.URL;
+import java.net.URLEncoder;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+@Tags({"put", "slack", "notify"})
+@CapabilityDescription("Sends a message to your team on slack.com")
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+public class PutSlack extends AbstractProcessor {
+
+    public static final PropertyDescriptor WEBHOOK_URL = new PropertyDescriptor
+            .Builder()
+            .name("webhook-url")
+            .displayName("Webhook URL")
+            .description("The POST URL provided by Slack to send messages into a channel.")
+            .required(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+            .addValidator(StandardValidators.URL_VALIDATOR)
+            .sensitive(true)
+            .build();
+
+    public static final PropertyDescriptor WEBHOOK_TEXT = new PropertyDescriptor
+            .Builder()
+            .name("webhook-text")
+            .displayName("Webhook Text")
+            .description("The text sent in the webhook message")
+            .required(true)
+            .expressionLanguageSupported(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+            .build();
+
+    public static final PropertyDescriptor CHANNEL = new PropertyDescriptor
+            .Builder()
+            .name("channel")
+            .displayName("Channel")
+            .description("A public channel using #channel or direct message using @username")
+            .required(false)
+            .expressionLanguageSupported(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+            .build();
+
+    public static final PropertyDescriptor USERNAME = new PropertyDescriptor
+            .Builder()
+            .name("username")
+            .displayName("Username")
+            .description("The displayed Slack username")
+            .required(false)
+            .expressionLanguageSupported(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+            .build();
+
+    public static final PropertyDescriptor ICON_URL = new PropertyDe

[GitHub] nifi pull request #508: NIFI-1981 Cluster communication without client certi...

2016-06-14 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/508#discussion_r66990173
  
--- Diff: nifi-commons/nifi-security-utils/pom.xml ---
@@ -30,6 +30,21 @@
         <dependency>
             <groupId>org.apache.commons</groupId>
             <artifactId>commons-lang3</artifactId>
         </dependency>
+        <dependency>
+            <groupId>org.bouncycastle</groupId>
+            <artifactId>bcprov-jdk15on</artifactId>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.bouncycastle</groupId>
+            <artifactId>bcpkix-jdk15on</artifactId>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.bouncycastle</groupId>
+            <artifactId>bcpkix-jdk15on</artifactId>
--- End diff --

Missed this the first time: a duplicate declaration of this artifact. It gives the following warning at build time:

[WARNING] Some problems were encountered while building the effective model for org.apache.nifi:nifi-security-utils:jar:1.0.0-SNAPSHOT
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.bouncycastle:bcpkix-jdk15on:jar -> duplicate declaration of version (?) @ org.apache.nifi:nifi-security-utils:[unknown-version], /Users/mburgess/git-apache/nifi/nifi-commons/nifi-security-utils/pom.xml, line 43, column 21
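For reference, the fix is to keep a single bcpkix-jdk15on declaration. A sketch of the intended dependency block (assuming both artifacts stay test-scoped, as in the diff above):

```xml
<dependency>
    <groupId>org.bouncycastle</groupId>
    <artifactId>bcprov-jdk15on</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.bouncycastle</groupId>
    <artifactId>bcpkix-jdk15on</artifactId>
    <scope>test</scope>
</dependency>
```

Maven only warns about the duplicate today, but the effective-model validation is stricter in newer versions, so it is worth cleaning up now.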



---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #508: NIFI-1981 Cluster communication without client certificate

2016-06-14 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/508
  
+1 LGTM, built and ran with Java 7 and 8 on 0.x branch. Mind squashing the 
commits? Then I'll merge to 0.x and master, thanks!




[GitHub] nifi pull request #508: NIFI-1981 Cluster communication without client certi...

2016-06-13 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/508#discussion_r66900343
  
--- Diff: nifi-commons/nifi-security-utils/src/test/groovy/org/apache/nifi/security/util/CertificateUtilsTest.groovy ---
@@ -272,4 +275,179 @@ class CertificateUtilsTest extends GroovyTestCase {
 assert convertedCertificate instanceof X509Certificate
 assert convertedCertificate == EXPECTED_NEW_CERTIFICATE
 }
+
+@Test
+void testShouldDetermineClientAuthStatusFromSocket() {
+// Arrange
+        SSLSocket needSocket = [getNeedClientAuth: { -> true }] as SSLSocket
--- End diff --

nevermind, I should've noticed these were Groovy coercions. 
<-- embarrassed




[GitHub] nifi pull request #508: NIFI-1981 Cluster communication without client certi...

2016-06-13 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/508#discussion_r66900061
  
--- Diff: nifi-commons/nifi-security-utils/src/test/groovy/org/apache/nifi/security/util/CertificateUtilsTest.groovy ---
@@ -272,4 +275,179 @@ class CertificateUtilsTest extends GroovyTestCase {
 assert convertedCertificate instanceof X509Certificate
 assert convertedCertificate == EXPECTED_NEW_CERTIFICATE
 }
+
+@Test
+void testShouldDetermineClientAuthStatusFromSocket() {
+// Arrange
+        SSLSocket needSocket = [getNeedClientAuth: { -> true }] as SSLSocket
--- End diff --

I'm seeing some Java 8 stuff (lambdas, e.g.), but this PR is against 0.x, which needs the Java 7 language target.
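For contributors hitting the same issue, the rewrite is mechanical. A minimal sketch (class and method names hypothetical, not from the PR) of converting a Java 8 lambda to its Java 7-compatible anonymous inner class:

```java
import java.util.concurrent.Callable;

public class Java7Compat {

    // Java 8 (master branch) would allow: Callable<Boolean> check = () -> true;
    // The 0.x branch compiles with -source 1.7, so the equivalent construct
    // is an anonymous inner class:
    static boolean needsClientAuth() throws Exception {
        Callable<Boolean> check = new Callable<Boolean>() {
            @Override
            public Boolean call() {
                return Boolean.TRUE;
            }
        };
        return check.call();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(needsClientAuth()); // prints "true"
    }
}
```

The anonymous-class form is more verbose but compiles identically on both language targets, which is why it is the safe choice for a PR that must land on 0.x and master.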




[GitHub] nifi issue #255: NIFI-1594: Add option to bulk using Index or Update.

2016-06-13 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/255
  
@joaohf will you have a chance to make the suggested updates? The PMC is 
looking to cut a 0.7.0 release soon and is wondering if this ticket can/should 
be included. Thanks!




[GitHub] nifi issue #477: NIFI-1663: Add ConvertAvroToORC processor

2016-06-10 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/477
  
@omalley Thanks! I updated the version in the POM to 1.1.0 and force-pushed 
the branch. Hopefully Travis will find the release JARs and complete 
successfully :)




[GitHub] nifi issue #475: - Add Maven profile to compile nifi-hadoop-libraries-nar us...

2016-06-10 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/475
  
Do you mind filing a Jira case for this, and updating the PR title with the 
Jira case number? It will make things easier to track, please and thanks!

I am having trouble with PutHDFS at the moment; it is hanging forever 
during the FileSystem.getFileStatus() call. Were you able to run PutHDFS 
successfully against these JARs?




[GitHub] nifi pull request #521: NIFI-1998: Upgraded Cassandra driver to 3.0.2

2016-06-10 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/521

NIFI-1998: Upgraded Cassandra driver to 3.0.2

This should apply cleanly to both master and 0.x branches

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi cassandra3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/521.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #521


commit c360c657cea74e8ae79874bedc0b82fcb9b7fc62
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-06-10T19:25:09Z

NIFI-1998: Upgraded Cassandra driver to 3.0.2






[GitHub] nifi issue #516: NIFI-1993 upgraded CGLIB to 3.2.2

2016-06-10 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/516
  
+1 LGTM, tested #515 with and without this commit, verified the test fails 
without this commit. Merged to master




[GitHub] nifi issue #475: - Add Maven profile to compile nifi-hadoop-libraries-nar us...

2016-06-09 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/475
  
As an optional Maven build profile I would imagine it's ok but will defer 
to @joewitt and others. In the meantime I will review and test this profile 
(and its absence) against various Hadoop distros.




[GitHub] nifi issue #516: NIFI-1993 upgraded CGLIB to 3.2.2

2016-06-09 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/516
  
What's a good test? Successful build? Or something more?




[GitHub] nifi issue #492: NIFI-1975 - Processor for parsing evtx files

2016-06-09 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/492
  
+1 LGTM, built and ran tests and contrib-check. Ran a NiFi flow with 
multiple EVTX files exercising all relationships and granularities.

Great contribution, thanks much!  Merging to master




[GitHub] nifi pull request #492: NIFI-1975 - Processor for parsing evtx files

2016-06-08 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/492#discussion_r66292230
  
--- Diff: nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/ParseEvtxTest.java ---
@@ -0,0 +1,318 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx;
+
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.evtx.parser.ChunkHeader;
+import org.apache.nifi.processors.evtx.parser.FileHeader;
+import org.apache.nifi.processors.evtx.parser.FileHeaderFactory;
+import org.apache.nifi.processors.evtx.parser.MalformedChunkException;
+import org.apache.nifi.processors.evtx.parser.Record;
+import org.apache.nifi.processors.evtx.parser.bxml.RootNode;
+import org.junit.Before;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.mockito.Mock;
+import org.mockito.runners.MockitoJUnitRunner;
+
+import javax.xml.stream.XMLStreamException;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.junit.Assert.assertEquals;
+import static org.mockito.Mockito.any;
+import static org.mockito.Mockito.anyString;
+import static org.mockito.Mockito.eq;
+import static org.mockito.Mockito.isA;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.verifyNoMoreInteractions;
+import static org.mockito.Mockito.when;
+
+@RunWith(MockitoJUnitRunner.class)
+public class ParseEvtxTest {
--- End diff --

This class tests the individual methods in the ParseEvtx processor, but not 
the processor lifecycle (like onTrigger). Can you add some more tests that 
exercise the processor? An example of using the nifi-mock framework can be 
found in TestEvaluateXPath; it has the TestRunner setup with flow files, 
relationships, asserts, etc. You will likely want a test file or two to be used 
as input, although if line endings/whitespace are important in the format, you 
may just need the data directly in the Java code.
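For anyone following along, a lifecycle test along those lines might look roughly like the sketch below. It assumes the nifi-mock dependency on the test classpath; the sample resource name is hypothetical, and the exact assertions would depend on the processor's relationships and granularity settings.

```java
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;
import org.junit.Test;

public class ParseEvtxLifecycleTest {

    @Test
    public void testOnTriggerWithWellFormedInput() throws Exception {
        // nifi-mock drives the full processor lifecycle, including onTrigger
        final TestRunner runner = TestRunners.newTestRunner(ParseEvtx.class);

        // hypothetical sample file on the test classpath
        runner.enqueue(getClass().getResourceAsStream("/application-logs.evtx"));
        runner.run();

        // with well-formed input, nothing should route to failure
        runner.assertTransferCount(ParseEvtx.REL_FAILURE, 0);
    }
}
```

TestEvaluateXPath shows the same pattern extended with flow-file content and attribute assertions, which is probably the right level of coverage here.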




[GitHub] nifi issue #510: NIFI-1984: Ensure that locks are always cleaned up by Naive...

2016-06-08 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/510
  
+1 LGTM, built and ran tests, also started NiFi and tried various delete 
operations including an attempt to delete a processor that had an incoming 
connection. All behavior was as expected




[GitHub] nifi issue #492: NIFI-1975 - Processor for parsing evtx files

2016-06-08 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/492
  
The nifi-evtx-nar needs to be added to the top-level POM and the 
nifi-assembly POM, otherwise it will not be included in the distro.




[GitHub] nifi issue #477: NIFI-1663: Add ConvertAvroToORC processor

2016-06-07 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/477
  
They're evaluating RC3 as of this writing; I will be doing the same, but I 
expect good results and an imminent release :)




[GitHub] nifi issue #504: NIFI-1979 Moved HL7 test file into TestExtractHL7Attributes...

2016-06-07 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/504
  
+1 LGTM, tested on Apache and GitHub and tests are passing. Merging to 0.x 
and master




[GitHub] nifi issue #504: NIFI-1979 Moved HL7 test file into TestExtractHL7Attributes...

2016-06-07 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/504
  
Reviewing...




[GitHub] nifi pull request #255: NIFI-1594: Add option to bulk using Index or Update.

2016-06-07 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/255#discussion_r66067616
  
--- Diff: nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-processors/src/main/java/org/apache/nifi/processors/elasticsearch/PutElasticsearch.java ---
@@ -178,8 +190,18 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
                             public void process(final InputStream in) throws IOException {
                                 String json = IOUtils.toString(in, charset)
                                         .replace("\r\n", " ").replace('\n', ' ').replace('\r', ' ');
-                                bulk.add(esClient.get().prepareIndex(index, docType, id)
-                                        .setSource(json.getBytes(charset)));
+
+                                if (indexOp.equalsIgnoreCase("index")) {
+                                    bulk.add(esClient.get().prepareIndex(index, docType, id)
+                                            .setSource(json.getBytes(charset)));
+                                } else if (indexOp.equalsIgnoreCase("upsert")) {
+                                    bulk.add(esClient.get().prepareUpdate(index, docType, id)
+                                            .setDoc(json.getBytes(charset))
+                                            .setDocAsUpsert(true));
+                                } else if (indexOp.equalsIgnoreCase("update")) {
+                                    bulk.add(esClient.get().prepareUpdate(index, docType, id)
+                                            .setDoc(json.getBytes(charset)));
+                                }
--- End diff --

If the index operation is not one of these three values, it looks like the 
file is quietly ignored and still transferred to success. Can you add an else 
clause that throws an IOException, and unit test(s) to test these code paths?
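The missing else branch is easy to sketch in isolation. Below is a hedged, standalone version of the dispatch (the class and method names are hypothetical, not part of the PR); the point is that an unrecognized operation raises instead of falling through silently:

```java
import java.io.IOException;

public class IndexOpDispatch {

    // Maps the configured index operation to the bulk action name, throwing
    // instead of silently routing unknown values to success.
    static String resolveIndexOp(String indexOp) throws IOException {
        if ("index".equalsIgnoreCase(indexOp)) {
            return "index";
        } else if ("upsert".equalsIgnoreCase(indexOp)) {
            return "upsert";
        } else if ("update".equalsIgnoreCase(indexOp)) {
            return "update";
        } else {
            throw new IOException("Index operation '" + indexOp
                    + "' is not one of index, upsert, update");
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(resolveIndexOp("Index")); // case-insensitive, prints "index"
    }
}
```

In the processor itself, throwing from the session callback would surface the bad configuration and let the flow file route to failure, and each branch plus the throwing path is then a one-line unit test.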




[GitHub] nifi issue #248: NIFI-1568: Add Filter Capability to UnpackContent

2016-06-07 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/248
  
I can continue the review on this one, mind rebasing against the latest 
master or 0.x branch?  Thanks!




[GitHub] nifi pull request #492: NIFI-1975 - Processor for parsing evtx files

2016-06-06 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/492#discussion_r66000949
  
--- Diff: nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/ParseEvtx.java ---
@@ -0,0 +1,353 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx;
+
+import com.google.common.annotations.VisibleForTesting;
+import com.google.common.net.MediaType;
+import com.google.common.primitives.UnsignedLong;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processors.evtx.parser.ChunkHeader;
+import org.apache.nifi.processors.evtx.parser.FileHeader;
+import org.apache.nifi.processors.evtx.parser.FileHeaderFactory;
+import org.apache.nifi.processors.evtx.parser.MalformedChunkException;
+import org.apache.nifi.processors.evtx.parser.Record;
+import org.apache.nifi.processors.evtx.parser.XmlBxmlNodeVisitor;
+import org.apache.nifi.processors.evtx.parser.bxml.RootNode;
+
+import javax.xml.stream.XMLOutputFactory;
+import javax.xml.stream.XMLStreamException;
+import javax.xml.stream.XMLStreamWriter;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+@SideEffectFree
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"logs", "windows", "event", "evtx", "message", "file"})
+@CapabilityDescription("Parses the contents of a Windows Event Log file (evtx) and writes the resulting xml to the FlowFile")
+public class ParseEvtx extends AbstractProcessor {
+    public static final String RECORD = "Record";
+    public static final String CHUNK = "Chunk";
+    public static final String FILE = "File";
+    public static final String EVENTS = "Events";
+    public static final XMLOutputFactory XML_OUTPUT_FACTORY = XMLOutputFactory.newFactory();
+    public static final String EVTX_EXTENSION = ".evtx";
+    public static final String UNABLE_TO_PROCESS_DUE_TO = "Unable to process {} due to {}";
+    public static final String XML_EXTENSION = ".xml";
+
+    @VisibleForTesting
+    static final Relationship REL_SUCCESS = new Relationship.Builder()
+            .name("success")
+            .description("Any FlowFile that was successfully converted from evtx to xml")
+            .build();
+
+    @VisibleForTesting
+    static final Relationship REL_FAILURE = new Relationship.Builder()
+            .name("failure")
+            .description("Any FlowFile that encountered an exception during conversion will be transferred to this relationship with as much parsing as possible done")
+

[GitHub] nifi issue #495: NIFI-1968 ExtractHL7Attributes is squashing empty component...

2016-06-06 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/495
  
+1 LGTM, built and ran the tests; also ran a NiFi flow and verified that 
attributes with empty segment components are displayed correctly.

Mind squashing your commits? Also, I was having trouble applying the patch; 
I may need to intervene manually, but I will keep you as author of the commit(s).




[GitHub] nifi issue #495: NIFI-1968 ExtractHL7Attributes is squashing empty component...

2016-06-06 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/495
  
Reviewing...




[GitHub] nifi pull request #498: NIFI-1973 Allow ExecuteSQL to use flow file content ...

2016-06-06 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/498

NIFI-1973 Allow ExecuteSQL to use flow file content as SQL query



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-1973

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/498.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #498


commit 5ae16ec261428d19acd20acc67c25f414146854a
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-06-06T15:19:32Z

NIFI-1973 Allow ExecuteSQL to use flow file content as SQL query






[GitHub] nifi issue #474: NIFI-1919 Add replaceFirst method to expression language

2016-06-01 Thread mattyb149
Github user mattyb149 commented on the issue:

https://github.com/apache/nifi/pull/474
  
I built and ran various tests with the new (and existing) replaceXYZ 
functions, everything looks good.

+1




[GitHub] nifi pull request #486: NIFI-1929: Improvements for PutHDFS attribute handli...

2016-06-01 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/486

NIFI-1929: Improvements for PutHDFS attribute handling



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-1929

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/486.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #486


commit 856c7d157643cbc77313f0a56987d8cf4a0d843b
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-06-01T17:16:10Z

NIFI-1929: Improvements for PutHDFS attribute handling






[GitHub] nifi pull request: NIFI-1919 Add replaceFirst method to expression language

2016-06-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/474#discussion_r65358298
  
--- Diff: 
nifi-commons/nifi-expression-language/src/main/java/org/apache/nifi/attribute/expression/language/evaluation/functions/ReplaceEvaluator.java
 ---
@@ -17,7 +17,6 @@
 package org.apache.nifi.attribute.expression.language.evaluation.functions;
 
 import java.util.Map;
-
--- End diff --

Nitpick: an unnecessary whitespace change, with no other changes in the file.




[GitHub] nifi pull request: NIFI-1919 Add replaceFirst method to expression language

2016-06-01 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/474#discussion_r65358238
  
--- Diff: nifi-docs/src/main/asciidoc/expression-language-guide.adoc ---
@@ -884,7 +916,7 @@ Expressions will provide the following results:
 
 
 
-.replaceAll Examples
+.ReplaceAll Examples
--- End diff --

Shouldn't this remain lowercase? Also is it necessary for the markdown 
table or should it be removed to be consistent with other examples sections?




[GitHub] nifi pull request: NIFI-1663: Add ConvertAvroToORC processor

2016-06-01 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/477
  
This currently depends on a SNAPSHOT of Apache ORC 1.1.0, which is why the 
CI builds are failing. I will contact the ORC crew to see when the release will 
be ready




[GitHub] nifi pull request: NIFI-1663: Add ConvertAvroToORC processor

2016-05-30 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/477

NIFI-1663: Add ConvertAvroToORC processor

There is a test template here: 
https://gist.github.com/mattyb149/3644803e8e0642346cf02b89ef62b411

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-1663

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/477.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #477


commit 9e55151346a015b17713976637f10ee4cc90e6ce
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-05-31T02:03:35Z

NIFI-1663: Add ConvertAvroToORC processor






[GitHub] nifi pull request: NIFI-1868: Add PutHiveStreaming processor

2016-05-27 Thread mattyb149
Github user mattyb149 closed the pull request at:

https://github.com/apache/nifi/pull/434




[GitHub] nifi pull request: NIFI-1660 - Enhance the expression language wit...

2016-05-21 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/303#discussion_r64142229
  
--- Diff: nifi-docs/src/main/asciidoc/expression-language-guide.adoc ---
@@ -1161,7 +1161,61 @@ Expressions will provide the following results:
 
|===
 
 
+[.function]
+=== jsonPath
+
+*Description*: [.description]#The `jsonPath` function generates a string by evaluating the Subject as JSON and applying a JSON
+  path expression. An empty string is generated if the Subject does not contain valid JSON, the _jsonPath_ is invalid, or the path
+  does not exist in the Subject. If the evaluation results in a scalar value, the string representation of the scalar value is
+  generated. Otherwise a string representation of the JSON result is generated. A JSON array of length 1 is special-cased:
+  when `[0]` is a scalar, the string representation of `[0]` is generated.^1^#
+
+*Subject Type*: [.subject]#String#
+
+*Arguments*:
[.argName]#_jsonPath_# : [.argDesc]#the JSON path expression used to evaluate the Subject.#
 
+*Return Type*: [.returnType]#String#
+
+*Examples*: If the "myJson" attribute is
+
+..
+{
+  "firstName": "John",
+  "lastName": "Smith",
+  "isAlive": true,
+  "age": 25,
+  "address": {
+    "streetAddress": "21 2nd Street",
+    "city": "New York",
+    "state": "NY",
+    "postalCode": "10021-3100"
+  },
+  "phoneNumbers": [
+    {
+      "type": "home",
+      "number": "212 555-1234"
+    },
+    {
+      "type": "office",
+      "number": "646 555-4567"
+    }
+  ],
+  "children": [],
+  "spouse": null
+}
+..
+
+.jsonPath Examples
+|===
+| Expression | Value
+| `${myJson:jsonPath('$.firstName')}` | `John`
+| `${myJson:jsonPath('$.address.postalCode')}` | `10021-3100`
| `${myJson:jsonPath('$.phoneNumbers[?(@.type=="home")].number')}`^1^ | `212 555-1234`
| `${myJson:jsonPath('$.phoneNumbers')}` | `[{"type":"home","number":"212 555-1234"},{"type":"office","number":"646 555-4567"}]`
+| `${myJson:jsonPath('$.missing-path')}` |
+| `${myJson:jsonPath('$.bad@expression')}` |
--- End diff --

For the bad JSON Paths (missing or invalid), perhaps it should explicitly 
say the value is "_empty_" or something like that. I couldn't find other 
examples of table entries where the values were empty, but I did see one where 
a space was called out by _space_ to make it more explicit and visible.
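The empty-result behavior being discussed can be illustrated outside NiFi. The sketch below is a stdlib-only stand-in, not NiFi's Jayway-based implementation: `json_path_lite` is a hypothetical helper that handles only simple `$.a.b` dot paths and returns an empty string for invalid JSON or a missing path, mirroring the documented semantics:

```python
import json

def json_path_lite(subject: str, path: str) -> str:
    """Mimic the documented jsonPath semantics for simple '$.a.b' paths:
    return '' for invalid JSON, a malformed path, or a missing key."""
    try:
        doc = json.loads(subject)
    except (json.JSONDecodeError, TypeError):
        return ""
    if not path.startswith("$."):
        return ""
    node = doc
    for key in path[2:].split("."):
        if not isinstance(node, dict) or key not in node:
            return ""
        node = node[key]
    # scalar strings render as-is; everything else as compact JSON
    return node if isinstance(node, str) else json.dumps(node, separators=(",", ":"))

my_json = '{"firstName": "John", "address": {"postalCode": "10021-3100"}}'
print(json_path_lite(my_json, "$.firstName"))           # John
print(json_path_lite(my_json, "$.address.postalCode"))  # 10021-3100
print(json_path_lite(my_json, "$.missing-path"))        # (empty string)
print(json_path_lite("not json", "$.firstName"))        # (empty string)
```

Calling out the empty result explicitly in the table, as suggested, would make the last two rows much easier to read.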




[GitHub] nifi pull request: NIFI-1660 - Enhance the expression language wit...

2016-05-21 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/303#discussion_r64142194
  
--- Diff: nifi-commons/nifi-expression-language/src/test/java/org/apache/nifi/attribute/expression/language/TestQuery.java ---
@@ -233,6 +233,24 @@ public void testEmbeddedExpressionsAndQuotes() {
 }
 
     @Test
+    public void testJsonPath() {
+        final Map<String, String> attributes = new HashMap<>();
+        attributes.put("json",
+                "{" + "\n  \"firstName\": \"John\"," + "\n  \"lastName\": \"Smith\"," + "\n  \"isAlive\": true," + "\n  \"age\": 25," + "\n  \"address\": {"
+                        + "\n    \"streetAddress\": \"21 2nd Street\"," + "\n    \"city\": \"New York\"," + "\n    \"state\": \"NY\","
+                        + "\n    \"postalCode\": \"10021-3100\"" + "\n  }," + "\n  \"phoneNumbers\": [" + "\n    {" + "\n      \"type\": \"home\","
+                        + "\n      \"number\": \"212 555-1234\"" + "\n    }," + "\n    {" + "\n      \"type\": \"office\","
+                        + "\n      \"number\": \"646 555-4567\"" + "\n    }" + "\n  ]," + "\n  \"children\": []," + "\n  \"spouse\": null" + "\n}");
+        verifyEquals("${json:jsonPath('$.firstName')}", attributes, "John");
+        verifyEquals("${json:jsonPath('$.address.postalCode')}", attributes, "10021-3100");
+        verifyEquals("${json:jsonPath(\"$.phoneNumbers[?(@.type=='home')].number\")}", attributes, "212 555-1234");
+        verifyEquals("${json:jsonPath('$.phoneNumbers')}", attributes,
+                "[{\"type\":\"home\",\"number\":\"212 555-1234\"},{\"type\":\"office\",\"number\":\"646 555-4567\"}]");
+        verifyEquals("${json:jsonPath('$.missing-path')}", attributes, "");
+        verifyEquals("${missing:jsonPath('$.bad@expression')}", attributes, "");
+    }
--- End diff --

Perhaps add a test for an existing attribute containing invalid JSON.  If 
the behavior on bad JSON is changed to throw an exception (see above comment), 
a test could verify that; otherwise it should be tested for null.
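A sketch of the suggested extra case: an attribute whose value is not valid JSON. `evaluate_json_path` here is a hypothetical Python stand-in for the EL call (not NiFi's `verifyEquals` harness), and its `strict` flag models the throw-an-exception alternative mentioned in the referenced comment:

```python
import json

def evaluate_json_path(subject: str, path: str, strict: bool = False) -> str:
    """Toy stand-in for the jsonPath EL call: parse, then look up one key.
    strict=True models the 'throw on bad JSON' alternative."""
    try:
        doc = json.loads(subject)
    except json.JSONDecodeError:
        if strict:
            raise ValueError("subject is not valid JSON")
        return ""
    value = doc.get(path.removeprefix("$."), "")
    return value if isinstance(value, str) else json.dumps(value)

# lenient behavior: invalid JSON quietly yields an empty string
assert evaluate_json_path("{not-json", "$.firstName") == ""

# strict behavior: invalid JSON surfaces immediately as an exception
try:
    evaluate_json_path("{not-json", "$.firstName", strict=True)
    assert False, "expected an exception"
except ValueError:
    print("bad JSON rejected up front")
```

Either way, a dedicated test pins down the chosen behavior so it cannot regress silently.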




[GitHub] nifi pull request: NIFI-1660 - Enhance the expression language wit...

2016-05-21 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/303#discussion_r64142183
  
--- Diff: nifi-commons/nifi-expression-language/src/main/java/org/apache/nifi/attribute/expression/language/evaluation/functions/JsonPathEvaluator.java ---
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.attribute.expression.language.evaluation.functions;
+
+import java.util.List;
+import java.util.Map;
+import java.util.Objects;
+
+import org.apache.nifi.attribute.expression.language.evaluation.Evaluator;
+import org.apache.nifi.attribute.expression.language.evaluation.QueryResult;
+import org.apache.nifi.attribute.expression.language.evaluation.StringEvaluator;
+import org.apache.nifi.attribute.expression.language.evaluation.StringQueryResult;
+
+import com.jayway.jsonpath.Configuration;
+import com.jayway.jsonpath.DocumentContext;
+import com.jayway.jsonpath.InvalidJsonException;
+import com.jayway.jsonpath.JsonPath;
+import com.jayway.jsonpath.spi.json.JacksonJsonProvider;
+import com.jayway.jsonpath.spi.json.JsonProvider;
+
+
+public class JsonPathEvaluator extends StringEvaluator {
+
+    private static final StringQueryResult NULL_RESULT = new StringQueryResult("");
+    private static final Configuration STRICT_PROVIDER_CONFIGURATION = Configuration.builder().jsonProvider(new JacksonJsonProvider()).build();
+    private static final JsonProvider JSON_PROVIDER = STRICT_PROVIDER_CONFIGURATION.jsonProvider();
+
+    private final Evaluator<String> subject;
+    private final Evaluator<String> jsonPathExp;
+
+    public JsonPathEvaluator(final Evaluator<String> subject, final Evaluator<String> jsonPathExp) {
+        this.subject = subject;
+        this.jsonPathExp = jsonPathExp;
+    }
+
+    @Override
+    public QueryResult<String> evaluate(final Map<String, String> attributes) {
+        final String subjectValue = subject.evaluate(attributes).getValue();
+        if (subjectValue == null || subjectValue.length() == 0) {
+            return NULL_RESULT;
+        }
+        DocumentContext documentContext = null;
+        try {
+            documentContext = validateAndEstablishJsonContext(subjectValue);
+        } catch (InvalidJsonException e) {
+            return NULL_RESULT;
--- End diff --

Instead of returning null when the incoming JSON is not valid, perhaps we 
should wrap the InvalidJsonException in an AttributeExpressionLanguageException 
and throw that, to make the error more visible to the user. Otherwise something 
like UpdateAttribute could succeed, but getting a null value for the JSON Path 
result might not show up as a problem until later in the flow, making it harder 
to debug. If the JSON itself is bad, no JSON Path will succeed, so it is probably 
better to alert the user immediately (e.g., with a bulletin).
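The wrapping suggested here is standard exception chaining: fail fast but keep the root cause attached. A minimal Python sketch (the exception class name mirrors the Java type mentioned above; the function name is illustrative):

```python
import json

class AttributeExpressionLanguageException(Exception):
    """Stand-in for the NiFi EL exception type discussed above."""

def establish_json_context(subject: str) -> dict:
    """Parse the subject, converting a parse failure into a domain
    exception instead of silently returning an empty result."""
    try:
        return json.loads(subject)
    except json.JSONDecodeError as e:
        # 'from e' chains the cause so the original parse error stays visible
        raise AttributeExpressionLanguageException(
            f"Subject contains invalid JSON: {e}") from e

try:
    establish_json_context("{broken")
except AttributeExpressionLanguageException as e:
    print(type(e.__cause__).__name__)  # JSONDecodeError
```

The chained cause gives the user an immediate, precise error at the point of failure rather than a mystery null downstream.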




[GitHub] nifi pull request: NIFI-1660 - Enhance the expression language wit...

2016-05-21 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/303#issuecomment-220807201
  
This appears to have been based on the 0.x branch, not master (so the PR 
has conflicts). If you'd like to submit the PR against 0.x, do you mind 
updating the PR? If you'd like to submit against master, can you rebase against 
the latest master and force push the branch again? Please and thanks :)




[GitHub] nifi pull request: NIFI-1660 - Enhance the expression language wit...

2016-05-21 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/303#discussion_r64142161
  
--- Diff: nifi-commons/nifi-expression-language/pom.xml ---
@@ -56,5 +56,13 @@
 org.apache.nifi
 nifi-utils
 
+
+com.jayway.jsonpath
+json-path
+
+   
--- End diff --

Nit-pick for consistent whitespace/indentation for the dependencies




[GitHub] nifi pull request: NIFI-1822: Allow concurrent execution in Execut...

2016-05-14 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/387#issuecomment-219265046
  
Thanks much for the corrections/improvements/additions!




[GitHub] nifi pull request: NIFI-1822: Allow concurrent execution in Execut...

2016-05-14 Thread mattyb149
Github user mattyb149 closed the pull request at:

https://github.com/apache/nifi/pull/387




[GitHub] nifi pull request: NIFI-1822: Allow concurrent execution in Execut...

2016-05-13 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/387#discussion_r63269288
  
--- Diff: nifi-nar-bundles/nifi-scripting-bundle/nifi-scripting-processors/src/main/java/org/apache/nifi/processors/script/AbstractScriptProcessor.java ---
@@ -248,11 +253,13 @@ protected void setupEngine() {
 
Thread.currentThread().setContextClassLoader(scriptEngineModuleClassLoader);
 }
 
-scriptEngine = createScriptEngine();
+ScriptEngine scriptEngine = createScriptEngine();
--- End diff --

D'oh! will fix




[GitHub] nifi pull request: NIFI-1868: Add PutHiveStreaming processor

2016-05-11 Thread mattyb149
GitHub user mattyb149 opened a pull request:

https://github.com/apache/nifi/pull/434

NIFI-1868: Add PutHiveStreaming processor

Also refactored some common Kerberos code out of HiveControllerService into 
a new HiveConfigurator class

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mattyb149/nifi NIFI-1868

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/434.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #434


commit d739dfc9ff2b1e91189985b72a17b073ae0ec8ab
Author: Matt Burgess <mattyb...@apache.org>
Date:   2016-05-11T20:37:14Z

NIFI-1868: Add PutHiveStreaming processor






[GitHub] nifi pull request: NIFI-1826 Expression Language: add function to ...

2016-05-11 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/396#issuecomment-218468773
  
LGTM +1, thanks!




[GitHub] nifi pull request: NIFI-1865 Close StreamThrottler when processor ...

2016-05-10 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/432#issuecomment-218343371
  
LGTM, merged to master and 0.x. Do you mind closing this PR?





[GitHub] nifi pull request: NIFI-1865 Close StreamThrottler when processor ...

2016-05-10 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/432#issuecomment-218341768
  
Reviewing




[GitHub] nifi pull request: NIFI-1865 Close StreamThrottler when processor ...

2016-05-10 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/429#issuecomment-218325829
  
May the --force be with you ;)




[GitHub] nifi pull request: NIFI-1865 Close StreamThrottler when processor ...

2016-05-10 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/429#issuecomment-218322096
  
You're both right :) You can do the squash, then a rebase and force-push to 
the same PR. Doesn't matter to me though; I'll grab the latest and 
review/merge, thanks!
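The squash-then-force-push workflow described here can be demonstrated end to end in a throwaway repository. Everything below is illustrative: on a real PR the remote would be your fork, the branch would be the PR branch, and you would also rebase on the latest `upstream/master` (omitted here since there is no upstream):

```shell
set -eu
tmp=$(mktemp -d)
git init -q --bare "$tmp/origin.git"     # stands in for the GitHub fork
git init -q "$tmp/work" && cd "$tmp/work"
git config user.email dev@example.com
git config user.name Dev
git remote add origin "$tmp/origin.git"

echo one   > file.txt && git add file.txt && git commit -qm "NIFI-1660: initial work"
echo two  >> file.txt && git commit -qam "review fixups"
echo three >> file.txt && git commit -qam "more fixups"

# squash everything into one commit (non-interactive form of 'rebase -i' squashing)
git reset -q --soft "$(git rev-list --max-parents=0 HEAD)"
git commit -q --amend -m "NIFI-1660: add jsonPath expression language function"

# rewritten history requires a force push to update the same PR branch
git push -q --force origin HEAD
count=$(git rev-list --count HEAD)
echo "$count commit(s) on the branch"
```

The "same PR" part works because GitHub tracks the branch rather than the commits, so a force push updates the open PR in place.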




[GitHub] nifi pull request: NIFI-1863 extended caught exception in cleaning...

2016-05-10 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/428#issuecomment-218235856
  
LGTM (and seconded off-line by @markap14), +1 will merge to master




[GitHub] nifi pull request: NIFI-1865 Close StreamThrottler when processor ...

2016-05-10 Thread mattyb149
Github user mattyb149 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/429#discussion_r62712326
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestPostHTTP.java ---
@@ -407,4 +407,31 @@ public void testSendChunked() throws Exception {
 Assert.assertNull(lastPostHeaders.get("Content-Length"));
 
Assert.assertEquals("chunked",lastPostHeaders.get("Transfer-Encoding"));
 }
+
+    @Test
+    public void testSendWithThrottler() throws Exception {
+        setup(null);
+
+        final String suppliedMimeType = "text/plain";
+        runner.setProperty(PostHTTP.URL, server.getUrl());
+        runner.setProperty(PostHTTP.CONTENT_TYPE, suppliedMimeType);
+        runner.setProperty(PostHTTP.CHUNKED_ENCODING, "false");
+        runner.setProperty(PostHTTP.MAX_DATA_RATE, "10kb");
+
+        final Map<String, String> attrs = new HashMap<>();
+        attrs.put(CoreAttributes.MIME_TYPE.key(), "text/plain");
+
+        runner.enqueue(StringUtils.repeat("This is the song that never ends. It goes on and on my friend.", 100).getBytes(), attrs);
--- End diff --

These song lyrics are copyrighted :-/ They need to be replaced with some other 
generic sample text.




[GitHub] nifi pull request: NIFI-1721 updated mock framework to return corr...

2016-05-03 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/408#issuecomment-216634155
  
Built and ran all tests including the added unit tests, everything LGTM, 
thanks! +1




[GitHub] nifi pull request: NIFI-1721 updated mock framework to return corr...

2016-05-03 Thread mattyb149
Github user mattyb149 commented on the pull request:

https://github.com/apache/nifi/pull/408#issuecomment-216632581
  
Reviewing




[GitHub] nifi pull request: NIFI-981: Added SelectHiveQL and PutHiveQL proc...

2016-05-03 Thread mattyb149
Github user mattyb149 closed the pull request at:

https://github.com/apache/nifi/pull/384



