[GitHub] nifi pull request #1280: NIFI-3109 Initial refactoring of TimerDrivenSchedul...

2017-07-15 Thread olegz
Github user olegz closed the pull request at:

https://github.com/apache/nifi/pull/1280


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #1278: NIFI-2886 Fixed the lifecycle delay

2017-07-15 Thread olegz
Github user olegz closed the pull request at:

https://github.com/apache/nifi/pull/1278




[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110541830
  
--- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/grok/GrokRecordReader.java ---
@@ -0,0 +1,318 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.grok;
+
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.text.ParseException;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Date;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.TimeZone;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+import org.apache.commons.lang3.time.FastDateFormat;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import io.thekraken.grok.api.Grok;
+import io.thekraken.grok.api.GrokUtils;
+import io.thekraken.grok.api.Match;
+
+public class GrokRecordReader implements RecordReader {
+    private final BufferedReader reader;
+    private final Grok grok;
+    private RecordSchema schema;
+
+    private String nextLine;
+
+    static final String STACK_TRACE_COLUMN_NAME = "STACK_TRACE";
+    private static final Pattern STACK_TRACE_PATTERN = Pattern.compile(
+        "^\\s*(?:(?:|\\t)+at )|"
+            + "(?:(?:|\\t)+\\[CIRCULAR REFERENCE\\:)|"
+            + "(?:Caused by\\: )|"
+            + "(?:Suppressed\\: )|"
+            + "(?:\\s+... \\d+ (?:more|common frames? omitted)$)");
+
+    private static final FastDateFormat TIME_FORMAT_DATE;
+    private static final FastDateFormat TIME_FORMAT_TIME;
+    private static final FastDateFormat TIME_FORMAT_TIMESTAMP;
+
+    static {
+        final TimeZone gmt = TimeZone.getTimeZone("GMT");
+        TIME_FORMAT_DATE = FastDateFormat.getInstance("yyyy-MM-dd", gmt);
+        TIME_FORMAT_TIME = FastDateFormat.getInstance("HH:mm:ss", gmt);
+        TIME_FORMAT_TIMESTAMP = FastDateFormat.getInstance("yyyy-MM-dd HH:mm:ss", gmt);
+    }
+
+    public GrokRecordReader(final InputStream in, final Grok grok, final RecordSchema schema) {
+        this.reader = new BufferedReader(new InputStreamReader(in));
+        this.grok = grok;
+        this.schema = schema;
+    }
+
+    @Override
+    public void close() throws IOException {
+        reader.close();
+    }
+
+    @Override
+    public Record nextRecord() throws IOException, MalformedRecordException {
+        final String line = nextLine == null ? reader.readLine() : nextLine;
+        nextLine = null; // ensure that we don't process nextLine again
+        if (line == null) {
+            return null;
+        }
+
+        final RecordSchema schema = getSchema();
+
+        final Match match = grok.match(line);
+        match.captures();
+        final Map<String, Object> valueMap = match.toMap();
+        if (valueMap.isEmpty()) {   // We were unable to match the pattern so return an empty Object array.
+            return new MapRecord(schema, Collections.emptyMap());
+        }
+
+        // Read the next line to see if it ma

[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110541217
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/QueryFlowFile.java ---
@@ -0,0 +1,550 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import java.io.Closeable;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.PreparedStatement;
+import java.sql.ResultSet;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.Set;
+import java.util.concurrent.BlockingQueue;
+import java.util.concurrent.LinkedBlockingQueue;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicReference;
+import java.util.function.Supplier;
+
+import org.apache.calcite.config.CalciteConnectionProperty;
+import org.apache.calcite.config.Lex;
+import org.apache.calcite.jdbc.CalciteConnection;
+import org.apache.calcite.schema.SchemaPlus;
+import org.apache.calcite.sql.parser.SqlParser;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.DynamicRelationship;
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.SideEffectFree;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.queryflowfile.FlowFileTable;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+import org.apache.nifi.serialization.RowRecordReaderFactory;
+import org.apache.nifi.serialization.WriteResult;
+import org.apache.nifi.serialization.record.ResultSetRecordSet;
+import org.apache.nifi.util.StopWatch;
+
+@EventDriven
+@SideEffectFree
+@SupportsBatching
+@Tags({"sql", "query", "calcite", "route", "record", "transform", "select", "update", "modify", "etl", "filter", "record", "csv", "json", "logs", "text", "avro", "aggregate"})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@CapabilityDescription("Evaluates one or more SQL queries against the contents of a FlowFile. The result of the "
+    + "SQL query then becomes the content of the output FlowFile. This can be used, for example, "
+    + 

[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110541076
  
--- Diff: nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/StandardProcessSession.java ---
@@ -2201,7 +2202,10 @@ public int read(final byte[] b, final int off, final int len) throws IOException
 
         @Override
         public void close() throws IOException {
-            StandardProcessSession.this.bytesRead += countingStream.getBytesRead();
+            if (!closed) {
+                StandardProcessSession.this.bytesRead += countingStream.getBytesRead();
+                closed = true;
+            }
--- End diff --

Technically the above is not thread safe; consider adding some synchronization.
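One way to address the reviewer's concern is to guard the accumulation with an atomic compare-and-set so that `close()` is idempotent and thread-safe without a `synchronized` block. The sketch below is illustrative only; the class and field names (`CountingStream`, `sessionBytesRead`) are hypothetical stand-ins, not taken from the NiFi code base.

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative sketch: an idempotent, thread-safe close() guarded by an
// AtomicBoolean. Names are hypothetical, not from the NiFi code base.
class CountingStream extends FilterInputStream {
    private final AtomicBoolean closed = new AtomicBoolean(false);
    private long streamBytesRead;   // bytes read through this stream
    private long sessionBytesRead;  // stands in for the session-level counter

    CountingStream(final InputStream in) {
        super(in);
    }

    @Override
    public int read() throws IOException {
        final int b = super.read();
        if (b != -1) {
            streamBytesRead++;
        }
        return b;
    }

    @Override
    public void close() throws IOException {
        // compareAndSet ensures the byte count is accumulated exactly once,
        // even when close() is called concurrently or repeatedly.
        if (closed.compareAndSet(false, true)) {
            sessionBytesRead += streamBytesRead;
        }
        super.close();
    }

    long getSessionBytesRead() {
        return sessionBytesRead;
    }
}
```

A plain `boolean` flag, as in the diff above, leaves a window where two threads both see `closed == false` and double-count; the compare-and-set closes that window.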




[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110541128
  
--- Diff: nifi-nar-bundles/nifi-registry-bundle/nifi-registry-service/src/main/java/org/apache/nifi/schemaregistry/services/AvroSchemaRegistry.java ---
@@ -0,0 +1,217 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.schemaregistry.services;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+
+import org.apache.avro.LogicalType;
+import org.apache.avro.Schema;
+import org.apache.avro.Schema.Field;
+import org.apache.avro.Schema.Type;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnDisabled;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.SimpleRecordSchema;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.RecordField;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+@Tags({ "schema", "registry", "avro", "json", "csv" })
+@CapabilityDescription("Provides a service for registering and accessing schemas. You can register a schema "
+    + "as a dynamic property where 'name' represents the schema name and 'value' represents the textual "
+    + "representation of the actual schema following the syntax and semantics of Avro's Schema format.")
+public class AvroSchemaRegistry extends AbstractControllerService implements SchemaRegistry {
+
+    private final Map<String, String> schemaNameToSchemaMap;
+
+    private static final String LOGICAL_TYPE_DATE = "date";
+    private static final String LOGICAL_TYPE_TIME_MILLIS = "time-millis";
+    private static final String LOGICAL_TYPE_TIME_MICROS = "time-micros";
+    private static final String LOGICAL_TYPE_TIMESTAMP_MILLIS = "timestamp-millis";
+    private static final String LOGICAL_TYPE_TIMESTAMP_MICROS = "timestamp-micros";
+
+
+    public AvroSchemaRegistry() {
+        this.schemaNameToSchemaMap = new HashMap<>();
+    }
+
+    @OnEnabled
+    public void enable(ConfigurationContext configurationContext) throws InitializationException {
+        this.schemaNameToSchemaMap.putAll(configurationContext.getProperties().entrySet().stream()
+            .filter(propEntry -> propEntry.getKey().isDynamic())
+            .collect(Collectors.toMap(propEntry -> propEntry.getKey().getName(), propEntry -> propEntry.getValue())));
+    }
+
+    @Override
+    public String retrieveSchemaText(String schemaName) {
+        if (!this.schemaNameToSchemaMap.containsKey(schemaName)) {
+            throw new IllegalArgumentException("Failed to find schema; Name: '" + schemaName + "'.");
+        } else {
+            return this.schemaNameToSchemaMap.get(schemaName);
+        }
+    }
+
+    @Override
+    public String retrieveSchemaText(String schemaName, Map<String, String> attributes) {
+        throw new UnsupportedOperationException("This version of schema registry does not "
+            + "support this operation, since schemas are only identified by name.");
--- End diff --

Perhaps instead of throwing the exception we should just delegate to the `retrieveSchemaText(
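The reviewer's suggestion can be sketched as follows: the two-argument overload simply ignores the attributes and delegates to the single-argument lookup. This is an illustrative sketch of the idea, not the committed fix; `SimpleSchemaRegistry` and `register` are hypothetical names, though the `retrieveSchemaText` signatures mirror the diff.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the reviewer's suggestion: delegate instead of
// throwing UnsupportedOperationException. Not the actual NiFi implementation.
class SimpleSchemaRegistry {
    private final Map<String, String> schemaNameToSchemaMap = new HashMap<>();

    void register(final String name, final String schemaText) {
        schemaNameToSchemaMap.put(name, schemaText);
    }

    String retrieveSchemaText(final String schemaName) {
        final String text = schemaNameToSchemaMap.get(schemaName);
        if (text == null) {
            throw new IllegalArgumentException("Failed to find schema; Name: '" + schemaName + "'.");
        }
        return text;
    }

    String retrieveSchemaText(final String schemaName, final Map<String, String> attributes) {
        // Schemas are identified by name only in this registry, so the
        // attributes are ignored and the call is delegated.
        return retrieveSchemaText(schemaName);
    }
}
```

Delegating keeps callers that pass attributes working against a name-only registry, rather than forcing them to special-case this implementation.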

[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110544739
  
--- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/grok/GrokReader.java ---
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.grok;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.Reader;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RowRecordReaderFactory;
+import org.apache.nifi.serialization.SchemaRegistryRecordReader;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import io.thekraken.grok.api.Grok;
+import io.thekraken.grok.api.exception.GrokException;
+
+@Tags({"grok", "logs", "logfiles", "parse", "unstructured", "text", "record", "reader", "regex", "pattern", "logstash"})
+@CapabilityDescription("Provides a mechanism for reading unstructured text data, such as log files, and structuring the data "
+    + "so that it can be processed. The service is configured using Grok patterns. "
+    + "The service reads from a stream of data and splits each message that it finds into a separate Record, each containing the fields that are configured. "
+    + "If a line in the input does not match the expected message pattern, the line of text is considered to be part of the previous "
+    + "message, with the exception of stack traces. A stack trace that is found at the end of a log message is considered to be part "
+    + "of the previous message but is added to the 'STACK_TRACE' field of the Record. If a record has no stack trace, it will have a NULL value "
+    + "for the STACK_TRACE field. All fields that are parsed are considered to be of type String by default. If there is need to change the type of a field, "
+    + "this can be accomplished by configuring the Schema Registry to use and adding the appropriate schema.")
+public class GrokReader extends SchemaRegistryRecordReader implements RowRecordReaderFactory {
+    private volatile Grok grok;
+    private volatile boolean useSchemaRegistry;
+
+    private static final String DEFAULT_PATTERN_NAME = "/default-grok-patterns.txt";
+
+    static final PropertyDescriptor PATTERN_FILE = new PropertyDescriptor.Builder()
+        .name("Grok Pattern File")
+        .description("Path to a file that contains Grok Patterns to use for parsing logs. If not specified, a built-in default Pattern file "
+            + "will be used. If specified, all patterns in the given pattern file will override the default patterns. See the Controller Service's "
+            + "Additional Details for a list of pre-defined patterns.")
+        .addValidator(StandardValidators.FILE_EXISTS_VALIDATOR)
+        .expressionLanguageSupported(true)
+        .required(false)
+        .build();
+
+    static final PropertyDescriptor GROK_EXPRESSION = new PropertyDescriptor.Builder()
+        .name("Grok Expression")
+        .description("Specifies the format of

[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110541788
  
--- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/grok/GrokRecordReader.java ---
@@ -0,0 +1,318 @@

[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110541764
  
--- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/json/JsonPathRowRecordReader.java ---
@@ -0,0 +1,158 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.json;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.HashMap;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.record.DataType;
+import org.apache.nifi.serialization.record.MapRecord;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.type.ArrayDataType;
+import org.apache.nifi.serialization.record.type.RecordDataType;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+import org.apache.nifi.serialization.record.util.IllegalTypeConversionException;
+import org.codehaus.jackson.JsonNode;
+
+import com.jayway.jsonpath.Configuration;
+import com.jayway.jsonpath.DocumentContext;
+import com.jayway.jsonpath.JsonPath;
+import com.jayway.jsonpath.PathNotFoundException;
+import com.jayway.jsonpath.spi.json.JacksonJsonProvider;
+
+public class JsonPathRowRecordReader extends AbstractJsonRowRecordReader {
+    private static final Configuration STRICT_PROVIDER_CONFIGURATION = Configuration.builder().jsonProvider(new JacksonJsonProvider()).build();
+
+    private final LinkedHashMap<String, JsonPath> jsonPaths;
+    private final InputStream in;
+    private RecordSchema schema;
+    private final String dateFormat;
+    private final String timeFormat;
+    private final String timestampFormat;
+
+    public JsonPathRowRecordReader(final LinkedHashMap<String, JsonPath> jsonPaths, final RecordSchema schema, final InputStream in, final ComponentLog logger,
+        final String dateFormat, final String timeFormat, final String timestampFormat)
+        throws MalformedRecordException, IOException {
+        super(in, logger);
+
+        this.dateFormat = dateFormat;
+        this.timeFormat = timeFormat;
+        this.timestampFormat = timestampFormat;
+
+        this.schema = schema;
+        this.jsonPaths = jsonPaths;
+        this.in = in;
+    }
+
+    @Override
+    public void close() throws IOException {
+        in.close();
+    }
+
+    @Override
+    public RecordSchema getSchema() {
+        return schema;
+    }
+
+    @Override
+    protected Record convertJsonNodeToRecord(final JsonNode jsonNode, final RecordSchema schema) throws IOException {
+        if (jsonNode == null) {
+            return null;
+        }
+
+        final DocumentContext ctx = JsonPath.using(STRICT_PROVIDER_CONFIGURATION).parse(jsonNode.toString());
+        final Map<String, Object> values = new HashMap<>(schema.getFieldCount());
+
+        for (final Map.Entry<String, JsonPath> entry : jsonPaths.entrySet()) {
+            final String fieldName = entry.getKey();
+            final DataType desiredType = schema.getDataType(fieldName).orElse(null);
+            if (desiredType == null) {
+                continue;
+            }
+
+            final JsonPath jsonPath = entry.getValue();
+
+            Object value;
+            try {
+                value = ctx.read(jsonPath);
+            } catch (final PathNotFoundException pnfe) {
+                value = null;

[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110544779
  
--- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/src/main/java/org/apache/nifi/csv/CSVRecordSetWriter.java ---
@@ -0,0 +1,67 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.csv;
+
+import java.util.ArrayList;
+import java.util.List;
+
+import org.apache.commons.csv.CSVFormat;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnEnabled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.serialization.DateTimeTextRecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriter;
+import org.apache.nifi.serialization.RecordSetWriterFactory;
+
+@Tags({"csv", "result", "set", "recordset", "record", "writer", "serializer", "row", "tsv", "tab", "separated", "delimited"})
+@CapabilityDescription("Writes the contents of a RecordSet as CSV data. The first line written "
+    + "will be the column names. All subsequent lines will be the values corresponding to those columns.")
+public class CSVRecordSetWriter extends DateTimeTextRecordSetWriter implements RecordSetWriterFactory {
+
+    private volatile CSVFormat csvFormat;
+
+    @Override
+    protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+        final List<PropertyDescriptor> properties = new ArrayList<>(super.getSupportedPropertyDescriptors());
+        properties.add(CSVUtils.CSV_FORMAT);
+        properties.add(CSVUtils.VALUE_SEPARATOR);
+        properties.add(CSVUtils.QUOTE_CHAR);
+        properties.add(CSVUtils.ESCAPE_CHAR);
+        properties.add(CSVUtils.COMMENT_MARKER);
+        properties.add(CSVUtils.NULL_STRING);
+        properties.add(CSVUtils.TRIM_FIELDS);
+        properties.add(CSVUtils.QUOTE_MODE);
+        properties.add(CSVUtils.RECORD_SEPARATOR);
+        properties.add(CSVUtils.TRAILING_DELIMITER);
+        return properties;
+    }
--- End diff --

Similar to before: this will be called at least three times, so the list could be built once in a static initializer. This pattern appears in many places throughout this PR, so consider fixing it everywhere.
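The reviewer's point can be sketched like this: build the list once in a static initializer and hand out an immutable view, instead of re-allocating it on every call. This is an illustrative sketch only; to keep it self-contained, `CSVWriterProperties` is a hypothetical class and the descriptors are stubbed as plain `String`s rather than NiFi `PropertyDescriptor`s.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical sketch of the reviewer's suggestion: a static initializer
// builds the property list once; every call returns the same immutable list.
class CSVWriterProperties {
    private static final List<String> PROPERTIES;

    static {
        final List<String> props = new ArrayList<>();
        props.add("CSV Format");
        props.add("Value Separator");
        props.add("Quote Character");
        PROPERTIES = Collections.unmodifiableList(props);
    }

    // No per-call allocation: the same instance is returned every time.
    static List<String> getSupportedPropertyDescriptors() {
        return PROPERTIES;
    }
}
```

Since property descriptors never change after class loading, sharing one immutable list avoids the repeated `ArrayList` construction the diff performs on each invocation.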




[GitHub] nifi pull request #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFil...

2017-04-09 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1652#discussion_r110544874
  
--- Diff: nifi-nar-bundles/nifi-standard-services/nifi-record-serialization-services-bundle/nifi-record-serialization-services/.gitignore ---
@@ -0,0 +1 @@
+/bin/
--- End diff --

I think it's covered by the global .gitignore




[GitHub] nifi issue #1652: NIFI-1280: Renamed FilterCSVColumns to QueryFlowFile; intr...

2017-04-06 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1652
  
Reviewing. . .




[GitHub] nifi pull request #1651: NIFI-3509 Fixed ID duplication

2017-04-06 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1651

NIFI-3509 Fixed ID duplication

Fixed the possibility of duplicated MSB part of the component during 
template creation

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3509

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1651.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1651


commit 7c74d8ce36f59d09266f8fcb0c78b0fced213085
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-04-06T13:27:25Z

NIFI-3509 Fixed ID duplication
Fixed the possibility of duplicated MSB part of the component during 
template creation






[GitHub] nifi pull request #1629: NIFI-3441 added @Ignore to integration tests due to...

2017-03-28 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1629

NIFI-3441 added @Ignore to integration tests due to intermittent failures

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3441

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1629.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1629


commit f982525bcb414ec421fbc4f6de5665599a743810
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-03-28T16:34:38Z

NIFI-3441 added @Ignore to integration tests due to intermittent failures






[GitHub] nifi issue #1548: NIFI-3543 Fixed a bug that nifi-jms-cf-service cannot have...

2017-03-03 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1548
  
Thank you @ShellyLC, will review and merge.




[GitHub] nifi issue #1555: NIFI-3548: Fixed bug that caused failed requests to not ge...

2017-03-03 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1555
  
Wow, I was just about to hit merge myself ;) Race condition ;)




[GitHub] nifi pull request #1553: NIFI-1449 - Migrate PutEmail tests from Mock class ...

2017-03-02 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1553#discussion_r104022318
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestPutEmail.java
 ---
@@ -230,27 +234,42 @@ public void testOutgoingMessageAttachment() throws 
Exception {
 runner.assertAllFlowFilesTransferred(PutEmail.REL_SUCCESS);
 
 // Verify that the Message was populated correctly
-assertEquals("Expected a single message to be sent", 1, 
processor.getMessages().size());
-Message message = processor.getMessages().get(0);
-assertEquals("t...@apache.org", message.getFrom()[0].toString());
-assertEquals("X-Mailer Header", "TestingNiFi", 
message.getHeader("X-Mailer")[0]);
-assertEquals("recipi...@apache.org", 
message.getRecipients(RecipientType.TO)[0].toString());
-
-assertTrue(message.getContent() instanceof MimeMultipart);
-
-final MimeMultipart multipart = (MimeMultipart) 
message.getContent();
-final BodyPart part = multipart.getBodyPart(0);
-final InputStream is = part.getDataHandler().getInputStream();
-final String decodedText = 
StringUtils.newStringUtf8(Base64.decodeBase64(IOUtils.toString(is, "UTF-8")));
-assertEquals("Message Body", decodedText);
-
-final BodyPart attachPart = multipart.getBodyPart(1);
-final InputStream attachIs = 
attachPart.getDataHandler().getInputStream();
-final String text = IOUtils.toString(attachIs, "UTF-8");
-assertEquals("Some text", text);
-
-assertNull(message.getRecipients(RecipientType.BCC));
-assertNull(message.getRecipients(RecipientType.CC));
+assertEquals("Expected a single message to be sent", 1, 
smtpRunner.getFlowFilesForRelationship(PutEmail.REL_SUCCESS).size());
+
+MockFlowFile message;
+
+message = 
smtpRunner.getFlowFilesForRelationship(PutEmail.REL_SUCCESS).get(0);
+
+extractHeadersRunner.enqueue(message);
+extractHeadersRunner.run();
+extractHeadersRunner.shutdown();
+
+message = 
extractHeadersRunner.getFlowFilesForRelationship(ExtractEmailHeaders.REL_SUCCESS).get(0);
+
+assertEquals("t...@apache.org", 
message.getAttribute("email.headers.from.0"));
+assertEquals("X-Mailer Header", "TestingNiFi", 
message.getAttribute("email.headers.x-mailer"));
+
+// Needed in order to avoid errors like
+// "already in use for an active callback or InputStream created 
by ProcessSession.read(FlowFile) has not been closed"
+
+
+message = 
extractHeadersRunner.getFlowFilesForRelationship(ExtractEmailHeaders.REL_SUCCESS).get(0);
+
+//extractAttachmentsRunner.enqueue(message);
+//extractAttachmentsRunner.run();
+//
+//final List<MockFlowFile> splits = extractAttachmentsRunner.getFlowFilesForRelationship(ExtractEmailAttachments.REL_ATTACHMENTS);
+//
+//final MimeMultipart multipart = (MimeMultipart) 
message.getContent();
+//final BodyPart part = multipart.getBodyPart(0);
+//final InputStream is = part.getDataHandler().getInputStream();
+//final String decodedText = 
StringUtils.newStringUtf8(Base64.decodeBase64(IOUtils.toString(is, "UTF-8")));
+//assertEquals("Message Body", decodedText);
+//
+//final BodyPart attachPart = multipart.getBodyPart(1);
+//final InputStream attachIs = 
attachPart.getDataHandler().getInputStream();
+//final String text = IOUtils.toString(attachIs, "UTF-8");
+//assertEquals("Some text", text);
--- End diff --

I see. Then we need to look at it, as it may be a nasty bug in the processor 
itself in the way FlowFile(s) interact with the session. The test helped uncover 
an issue that IMHO needs to be addressed before the merge. If you want 
I can take a look at it. LMK.




[GitHub] nifi issue #1548: NIFI-3543 Fixed a bug that nifi-jms-cf-service cannot have...

2017-03-02 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1548
  
@ShellyLC also, please amend your commit message, prefixing it with the JIRA 
number so it can be automatically linked in JIRA (i.e., "NIFI-3543 Fixed a 
bug that nifi. . ."). Also, please read my comments on the differences between 
BUG and IMPROVEMENT.




[GitHub] nifi pull request #1548: NIFI-3543 Fixed a bug that nifi-jms-cf-service cann...

2017-03-02 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1548#discussion_r103945068
  
--- Diff: 
nifi-nar-bundles/nifi-jms-bundle/nifi-jms-cf-service/src/main/java/org/apache/nifi/jms/cf/JMSConnectionFactoryProvider.java
 ---
@@ -178,7 +178,7 @@ private void 
setConnectionFactoryProperties(ConfigurationContext context) {
 } else {
 if (propertyName.equals(BROKER)) {
 if 
(context.getProperty(CONNECTION_FACTORY_IMPL).evaluateAttributeExpressions().getValue().startsWith("org.apache.activemq"))
 {
-this.setProperty("brokerURL", entry.getValue());
+this.setProperty("brokerURL", 
context.getProperty(descriptor).evaluateAttributeExpressions().getValue());
--- End diff --

Unfortunately this only addresses the IF and not the ELSE, where the value 
is treated differently to account for providers other than ActiveMQ. So, if 
we're adding EL support for 'brokerURL' we should do it for all.

Also, with EL one must account for the possibility of null.

Anyway, please address it when you get a chance, or let us know if you 
don't have time and one of us can collaborate and address the 
comments above.
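The null concern can be shown in isolation. This is a hedged sketch: `resolveBrokerUrl` and its `Supplier` argument are made up here to stand in for `context.getProperty(descriptor).evaluateAttributeExpressions().getValue()`, which may return null when the expression resolves to nothing.

```java
import java.util.function.Supplier;

class BrokerUrlDemo {
    // Guard against the EL evaluation yielding null/blank before using it.
    static String resolveBrokerUrl(Supplier<String> evaluatedProperty, String fallback) {
        final String value = evaluatedProperty.get();
        // Fall back (or, in a real component, raise a configuration error)
        // instead of passing null to the connection factory.
        return (value == null || value.trim().isEmpty()) ? fallback : value;
    }

    public static void main(String[] args) {
        // Expression resolved to nothing -> fallback is used.
        System.out.println(resolveBrokerUrl(() -> null, "tcp://localhost:61616"));
        // Expression resolved normally -> the evaluated value is used.
        System.out.println(resolveBrokerUrl(() -> "tcp://broker:61616", "tcp://localhost:61616"));
    }
}
```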




[GitHub] nifi pull request #1553: NIFI-1449 - Migrate PutEmail tests from Mock class ...

2017-03-02 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1553#discussion_r103939988
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestPutEmail.java
 ---
@@ -102,13 +91,16 @@ public void testExceptionWhenSending() {
 
 runner.assertQueueEmpty();
 runner.assertAllFlowFilesTransferred(PutEmail.REL_FAILURE);
-assertEquals("Expected an attempt to send a single message", 1, 
processor.getMessages().size());
+List<MockFlowFile> results = runner.getFlowFilesForRelationship(PutEmail.REL_FAILURE);
+assertEquals("Expected an attempt to send a single message", 1, 
results.size());
 }
 
 @Test
 public void testOutgoingMessage() throws Exception {
 // verifies that are set on the outgoing Message correctly
-runner.setProperty(PutEmail.SMTP_HOSTNAME, "smtp-host");
+runner.setProperty(PutEmail.SMTP_AUTH, "false");
+runner.setProperty(PutEmail.SMTP_HOSTNAME, "localhost");
+runner.setProperty(PutEmail.SMTP_PORT, String.valueOf(port));
 runner.setProperty(PutEmail.HEADER_XMAILER, "TestingNiFi");
 runner.setProperty(PutEmail.FROM, "t...@apache.org");
--- End diff --

Not sure if it's OK to use a legitimate email address as test data, 
especially as the TO/FROM email address. Not saying it's wrong, simply raising the 
question.




[GitHub] nifi pull request #1553: NIFI-1449 - Migrate PutEmail tests from Mock class ...

2017-03-02 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1553#discussion_r103938291
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestPutEmail.java
 ---
@@ -230,27 +234,42 @@ public void testOutgoingMessageAttachment() throws 
Exception {
 runner.assertAllFlowFilesTransferred(PutEmail.REL_SUCCESS);
 
 // Verify that the Message was populated correctly
-assertEquals("Expected a single message to be sent", 1, 
processor.getMessages().size());
-Message message = processor.getMessages().get(0);
-assertEquals("t...@apache.org", message.getFrom()[0].toString());
-assertEquals("X-Mailer Header", "TestingNiFi", 
message.getHeader("X-Mailer")[0]);
-assertEquals("recipi...@apache.org", 
message.getRecipients(RecipientType.TO)[0].toString());
-
-assertTrue(message.getContent() instanceof MimeMultipart);
-
-final MimeMultipart multipart = (MimeMultipart) 
message.getContent();
-final BodyPart part = multipart.getBodyPart(0);
-final InputStream is = part.getDataHandler().getInputStream();
-final String decodedText = 
StringUtils.newStringUtf8(Base64.decodeBase64(IOUtils.toString(is, "UTF-8")));
-assertEquals("Message Body", decodedText);
-
-final BodyPart attachPart = multipart.getBodyPart(1);
-final InputStream attachIs = 
attachPart.getDataHandler().getInputStream();
-final String text = IOUtils.toString(attachIs, "UTF-8");
-assertEquals("Some text", text);
-
-assertNull(message.getRecipients(RecipientType.BCC));
-assertNull(message.getRecipients(RecipientType.CC));
+assertEquals("Expected a single message to be sent", 1, 
smtpRunner.getFlowFilesForRelationship(PutEmail.REL_SUCCESS).size());
+
+MockFlowFile message;
+
+message = 
smtpRunner.getFlowFilesForRelationship(PutEmail.REL_SUCCESS).get(0);
+
+extractHeadersRunner.enqueue(message);
+extractHeadersRunner.run();
+extractHeadersRunner.shutdown();
+
+message = 
extractHeadersRunner.getFlowFilesForRelationship(ExtractEmailHeaders.REL_SUCCESS).get(0);
+
+assertEquals("t...@apache.org", 
message.getAttribute("email.headers.from.0"));
+assertEquals("X-Mailer Header", "TestingNiFi", 
message.getAttribute("email.headers.x-mailer"));
+
+// Needed in order to avoid errors like
+// "already in use for an active callback or InputStream created 
by ProcessSession.read(FlowFile) has not been closed"
+
+
+message = 
extractHeadersRunner.getFlowFilesForRelationship(ExtractEmailHeaders.REL_SUCCESS).get(0);
+
+//extractAttachmentsRunner.enqueue(message);
+//extractAttachmentsRunner.run();
+//
+//final List<MockFlowFile> splits = extractAttachmentsRunner.getFlowFilesForRelationship(ExtractEmailAttachments.REL_ATTACHMENTS);
+//
+//final MimeMultipart multipart = (MimeMultipart) 
message.getContent();
+//final BodyPart part = multipart.getBodyPart(0);
+//final InputStream is = part.getDataHandler().getInputStream();
+//final String decodedText = 
StringUtils.newStringUtf8(Base64.decodeBase64(IOUtils.toString(is, "UTF-8")));
+//assertEquals("Message Body", decodedText);
+//
+//final BodyPart attachPart = multipart.getBodyPart(1);
+//final InputStream attachIs = 
attachPart.getDataHandler().getInputStream();
+//final String text = IOUtils.toString(attachIs, "UTF-8");
+//assertEquals("Some text", text);
--- End diff --

Are these comments necessary?




[GitHub] nifi pull request #1530: NIFI-3490 added SAN option for TLS toolkit in stand...

2017-03-01 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1530#discussion_r103752061
  
--- Diff: 
nifi-toolkit/nifi-toolkit-tls/src/main/java/org/apache/nifi/toolkit/tls/util/TlsHelper.java
 ---
@@ -215,4 +210,17 @@ public static JcaPKCS10CertificationRequest 
generateCertificationRequest(String
 JcaContentSignerBuilder jcaContentSignerBuilder = new 
JcaContentSignerBuilder(signingAlgorithm);
 return new 
JcaPKCS10CertificationRequest(jcaPKCS10CertificationRequestBuilder.build(jcaContentSignerBuilder.build(keyPair.getPrivate(;
 }
+
+public static Extensions createDomainAlternativeNamesExtensions(String domainAlternativeNames) throws IOException {
+List<GeneralName> namesList = new ArrayList<>();
+for (String alternativeName : domainAlternativeNames.split(",")) {
+namesList.add(new GeneralName(GeneralName.dNSName, alternativeName));
+}
+
--- End diff --

Since we're on Java 8, maybe we should start embracing it a bit more, at 
least for simple things like the above:
```
List<GeneralName> namesList = Stream.of(domainAlternativeNames.split(","))
        .map(alternativeName -> new GeneralName(GeneralName.dNSName, alternativeName))
        .collect(Collectors.toList());
```




[GitHub] nifi pull request #1539: NIFI-3520 Updates classloading for components annot...

2017-03-01 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1539#discussion_r103750596
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-nar-utils/src/main/java/org/apache/nifi/nar/ExtensionManager.java
 ---
@@ -199,8 +199,20 @@ public static ClassLoader getClassLoader(final String 
classType, final String in
 // then make a new InstanceClassLoader that is a full copy of 
the NAR Class Loader, otherwise create an empty
 // InstanceClassLoader that has the NAR ClassLoader as a parent
 if (requiresInstanceClassLoading.contains(classType) && 
(registeredClassLoader instanceof URLClassLoader)) {
-final URLClassLoader registeredUrlClassLoader = 
(URLClassLoader) registeredClassLoader;
-instanceClassLoader = new 
InstanceClassLoader(instanceIdentifier, classType, 
registeredUrlClassLoader.getURLs(), registeredUrlClassLoader.getParent());
+final Set<URL> classLoaderResources = new HashSet<>();
+
+// if the registered class loader is a NAR class loader 
then walk the chain of class loaders until
+// finding a non-NAR class loader, and build up a list of 
all URLs along the way
+while (registeredClassLoader != null && 
registeredClassLoader instanceof NarClassLoader) {
+final NarClassLoader narClassLoader = (NarClassLoader) 
registeredClassLoader;
+for (URL classLoaderResource : 
narClassLoader.getURLs()) {
+classLoaderResources.add(classLoaderResource);
+}
+registeredClassLoader = narClassLoader.getParent();
+}
--- End diff --

Perhaps this logic belongs in 
```org.apache.nifi.util.file.classloader.ClassLoaderUtils```




[GitHub] nifi pull request #1542: NIFI-3523: Ensure that if we are unable to read Pro...

2017-03-01 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1542#discussion_r103749486
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/serialization/CompressableRecordReader.java
 ---
@@ -283,7 +283,20 @@ public StandardProvenanceEventRecord nextRecord() 
throws IOException {
 }
 
 if (isData()) {
-return nextRecord(dis, serializationVersion);
+while (true) {
+try {
+return nextRecord(dis, serializationVersion);
+} catch (final IOException ioe) {
+throw ioe;
+} catch (final Exception e) {
+// This would only happen if a bug were to exist such 
that an 'invalid' event were written
+// out. For example an Event that has no FlowFile 
UUID. While there is in fact an underlying
+// cause that would need to be sorted out in this 
case, the Provenance Repository should be
+// resilient enough to handle this. Otherwise, we end 
up throwing an Exception, which may
+// prevent iterating over additional events in the 
repository.
+logger.error("Failed to read Provenance Event from " + 
filename + "; will skip this event and continue reading subsequent events", e);
+}
+}
--- End diff --

Perhaps a style preference, but IMHO it would look better like this:
```
StandardProvenanceEventRecord record = null;
do {
    try {
        record = nextRecord(dis, serializationVersion);
    } catch (IOException e) {
        // . . .
    } catch (Exception e) {
        // . . .
    }
} while (record == null);
return record;
```
First, it avoids the ```while(true)``` which essentially relies on the hope 
that something in the loop will eventually break it.
Second, it follows the general pattern of reading (e.g., from InputStream) 
used in Reader implementations.




[GitHub] nifi pull request #1214: NIFI-2876 refactored demarcators into a common abst...

2017-02-23 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1214#discussion_r102830048
  
--- Diff: 
nifi-commons/nifi-utils/src/main/java/org/apache/nifi/stream/io/util/AbstractDemarcator.java
 ---
@@ -0,0 +1,138 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.stream.io.util;
+
+import java.io.Closeable;
+import java.io.IOException;
+import java.io.InputStream;
+
+import org.apache.nifi.stream.io.exception.TokenTooLargeException;
+
+/**
+ * Base class for implementing streaming demarcators.
+ * <p>
+ * NOTE: Not intended for multi-thread usage hence not Thread-safe.
+ * </p>
+ */
+abstract class AbstractDemarcator implements Closeable {
+
+final static int INIT_BUFFER_SIZE = 8192;
+
+private final InputStream is;
+
+private final int initialBufferSize;
+
+private final int maxDataSize;
+
+byte[] buffer;
+
+int index;
+
+int mark;
+
+long offset;
+
+int bufferLength;
+
+/**
+ * Constructs an instance of demarcator with provided {@link InputStream}
+ * and max buffer size. Each demarcated token must fit within max buffer
+ * size, otherwise the exception will be raised.
+ */
+AbstractDemarcator(InputStream is, int maxDataSize) {
+this(is, maxDataSize, INIT_BUFFER_SIZE);
+}
+
+/**
+ * Constructs an instance of demarcator with provided {@link InputStream}
+ * and max buffer size and initial buffer size. Each demarcated token must
+ * fit within max buffer size, otherwise the exception will be raised.
+ */
+AbstractDemarcator(InputStream is, int maxDataSize, int initialBufferSize) {
+this.validate(is, maxDataSize, initialBufferSize);
+this.is = is;
+this.initialBufferSize = initialBufferSize;
+this.buffer = new byte[initialBufferSize];
+this.maxDataSize = maxDataSize;
+}
+
+@Override
+public void close() throws IOException {
+// noop
+}
+
+/**
+ * Will fill the current buffer from current 'index' position, expanding it
+ * and/or shuffling it if necessary. If buffer exceeds max buffer size a
+ * {@link TokenTooLargeException} will be thrown.
+ *
+ * @throws IOException
+ * if unable to read from the stream
+ */
+void fill() throws IOException {
+if (this.index >= this.buffer.length) {
+if (this.mark == 0) { // expand
+byte[] newBuff = new byte[this.buffer.length + this.initialBufferSize];
+System.arraycopy(this.buffer, 0, newBuff, 0, this.buffer.length);
+this.buffer = newBuff;
+} else { // shuffle
+int length = this.index - this.mark;
+System.arraycopy(this.buffer, this.mark, this.buffer, 0, length);
+this.index = length;
+this.mark = 0;
+}
+}
+
+int bytesRead;
+do {
+bytesRead = this.is.read(this.buffer, this.index, this.buffer.length - this.index);
--- End diff --

Good point @mosermw . Mark and I did discuss it earlier and the I kept the 
code as is for exactly that reason. I did add an inline comment to explain the 
do/while loop.
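The do/while guard being discussed can be shown in isolation. This is a sketch, not the NiFi code: `InputStream.read` is permitted to return 0 bytes on some implementations, so only -1 reliably signals end of stream.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

class GuardedReadDemo {
    // Keep reading until at least one byte arrives or EOF (-1) is seen,
    // instead of treating a 0-byte read as end of stream.
    static int readGuarded(InputStream is, byte[] buf, int off, int len) throws IOException {
        int bytesRead;
        do {
            bytesRead = is.read(buf, off, len);
        } while (bytesRead == 0);
        return bytesRead;
    }

    public static void main(String[] args) throws IOException {
        byte[] buf = new byte[16];
        InputStream in = new ByteArrayInputStream("abc".getBytes());
        System.out.println(readGuarded(in, buf, 0, buf.length)); // 3
        System.out.println(readGuarded(in, buf, 0, buf.length)); // -1 (EOF)
    }
}
```

Note the sketch assumes a non-zero `len`; a zero-length request would legitimately return 0 and spin the loop.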




[GitHub] nifi pull request #1214: NIFI-2876 refactored demarcators into a common abst...

2017-02-23 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1214#discussion_r102775809
  
--- Diff: 
nifi-commons/nifi-utils/src/main/java/org/apache/nifi/stream/io/util/AbstractDemarcator.java
 ---
@@ -98,22 +136,26 @@ void fill() throws IOException {
 }
 
 int bytesRead;
+/*
+ * The do/while pattern is used here similar to the way it is used in
+ * BufferedReader essentially protecting from assuming the EOS until it
+ * actually is since not every implementation of InputStream guarantees
+ * that bytes are always available while the stream is open.
+ */
 do {
 bytesRead = this.is.read(this.buffer, this.index, this.buffer.length - this.index);
 } while (bytesRead == 0);
-this.bufferLength = bytesRead != -1 ? this.index + bytesRead : -1;
-if (this.bufferLength > this.maxDataSize) {
+this.availableBytesLength = bytesRead != -1 ? this.index + bytesRead : -1;
+if (this.availableBytesLength > this.maxDataSize) {
--- End diff --

Hmm, good catch. It should only be compared to a token as it's being 
computed, will fix



---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #1532: NIFI-3522: When creating a clone of a FlowFile, ensure tha...

2017-02-22 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1532
  
+1 merging




[GitHub] nifi issue #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON transform...

2017-02-17 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1436
  
Merged




[GitHub] nifi pull request #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON tr...

2017-02-17 Thread olegz
Github user olegz closed the pull request at:

https://github.com/apache/nifi/pull/1436


---


[GitHub] nifi issue #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON transform...

2017-02-17 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1436
  
@joewitt added new JIRAs to handle multi-line CSV input and multi-char 
delimiters suggested by @patricker 
https://issues.apache.org/jira/browse/NIFI-3499
https://issues.apache.org/jira/browse/NIFI-3500


---


[GitHub] nifi pull request #1518: NIFI-3495 fixed the index issue with TextLineDemarc...

2017-02-16 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1518

NIFI-3495 fixed the index issue with TextLineDemarcator

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3495

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1518.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1518


commit d65ceef58adc448cfa321411d5f4651459063e1c
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-02-17T02:05:59Z

NIFI-3495 fixed the index issue with TextLineDemarcator




---


[GitHub] nifi issue #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON transform...

2017-02-16 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1436
  
@joewitt with regard to content type and name change comment, 100% agree. 
Will make a change.


---


[GitHub] nifi issue #1179: NIFI-969 Added support for standard JSON

2017-02-14 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1179
  
The issue described in the underlying JIRA and this PR is essentially 
addressed, or will be addressed/maintained, by the new Schema Registry-based 
transformer effort - https://github.com/apache/nifi/pull/1436
So, closing it and suggesting to close the underlying JIRA as well.


---


[GitHub] nifi pull request #1179: NIFI-969 Added support for standard JSON

2017-02-14 Thread olegz
Github user olegz closed the pull request at:

https://github.com/apache/nifi/pull/1179


---


[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101130301
  
--- Diff: 
nifi-framework-api/src/main/java/org/apache/nifi/provenance/IdentifierLookup.java
 ---
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.provenance;
+
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * Provides a mechanism for obtaining the identifiers of components, queues, etc.
+ */
+public interface IdentifierLookup {
+
+/**
+ * @return the identifiers of components that may generate Provenance Events
+ */
+List<String> getComponentIdentifiers();
+
+/**
+ * @return a list of component types that may generate Provenance Events
+ */
+List<String> getComponentTypes();
+
+/**
+ *
+ * @return the identifiers of FlowFile Queues that are in the flow
+ */
+List<String> getQueueIdentifiers();
+
+default Map<String, Integer> invertQueueIdentifiers() {
+return invertList(getQueueIdentifiers());
+}
+
+default Map<String, Integer> invertComponentTypes() {
+return invertList(getComponentTypes());
+}
+
+default Map<String, Integer> invertComponentIdentifiers() {
+return invertList(getComponentIdentifiers());
+}
+
+default Map<String, Integer> invertList(final List<String> values) {
--- End diff --

Obviously a List can have duplicate entries, so different indexes may 
correspond to the same value. Just wanted to make sure that this is acceptable.
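The duplicate-entry concern can be demonstrated with a small sketch; the class below is illustrative and not the NiFi `IdentifierLookup` itself:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class InvertDemo {
    // Maps each value to its list index; with duplicate values the
    // later index wins, so two positions collapse onto one key.
    static Map<String, Integer> invertList(List<String> values) {
        Map<String, Integer> inverted = new HashMap<>();
        for (int i = 0; i < values.size(); i++) {
            inverted.put(values.get(i), i);
        }
        return inverted;
    }

    public static void main(String[] args) {
        // "a" appears at indexes 0 and 2; only index 2 survives.
        System.out.println(invertList(List.of("a", "b", "a")));
    }
}
```

Whether the last-index-wins collapse is acceptable depends on whether callers only need *some* index per value, which is exactly the question raised in the review comment.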


---


[GitHub] nifi pull request #584: closing InputStream after execution completes

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/584#discussion_r101118510
  
--- Diff: 
nifi-external/nifi-spark-receiver/src/main/java/org/apache/nifi/spark/NiFiReceiver.java
 ---
@@ -162,17 +162,19 @@ public void run() {
 }
 
 final List<NiFiDataPacket> dataPackets = new ArrayList<>();
+InputStream inStream =null;
 do {
 // Read the data into a byte array and wrap it along with the attributes
 // into a NiFiDataPacket.
-final InputStream inStream = dataPacket.getData();
+inStream = dataPacket.getData();
 final byte[] data = new byte[(int) dataPacket.getSize()];
 StreamUtils.fillBuffer(inStream, data);
 
 final Map<String, String> attributes = dataPacket.getAttributes();
 final NiFiDataPacket NiFiDataPacket = new StandardNiFiDataPacket(data, attributes);
 dataPackets.add(NiFiDataPacket);
 dataPacket = transaction.receive();
+inStream.close();
--- End diff --

While it is generally necessary to close resources to avoid memory leaks, 
this is not the right place to do so, since NiFiReceiver is not the 
creator/owner of the stream; if another handle exists for the stream, closing 
it here could lead to unexpected results. We should instead investigate where 
this stream is created and ensure that it is closed there.
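The ownership rule implied above — whoever opens a stream should also close it — is typically enforced at the creation site with try-with-resources. A minimal illustrative sketch (not the NiFiReceiver code):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamOwnership {
    // The consumer only reads; it neither creates nor closes the stream.
    static int consume(InputStream in) throws IOException {
        int count = 0;
        while (in.read() != -1) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        // The creator owns the stream, so it closes it -- even if
        // consume() throws -- via try-with-resources.
        try (InputStream in = new ByteArrayInputStream("data".getBytes())) {
            System.out.println(consume(in)); // 4
        }
    }
}
```

Keeping close() at the creation site avoids exactly the double-ownership hazard described in the comment: no other holder of the stream can be surprised by a premature close.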


---


[GitHub] nifi issue #584: closing InputStream after execution completes

2017-02-14 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/584
  
@vcharmcaster Since it appears this is your initial participation with 
Apache NiFi, I'd like to welcome you and thank you for your contribution and 
participation. 
Aside from making things run smoothly (as @joewitt pointed out), the core 
reason for having a JIRA corresponding to every PR is traceability, so that 
users and developers like you can clearly see what issues are being 
addressed. So please file a JIRA - https://issues.apache.org/jira/browse/NIFI 
- and update your PR and commit message. And don't hesitate to ask for help 
with this process if needed.


---


[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101099586
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/claim/ContentClaimWriteCache.java
 ---
@@ -0,0 +1,168 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.controller.repository.claim;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.HashMap;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.Queue;
+
+import org.apache.nifi.controller.repository.ContentRepository;
+import org.apache.nifi.stream.io.ByteCountingOutputStream;
+
+public class ContentClaimWriteCache {
+private final ContentRepository contentRepo;
+private final Map<ResourceClaim, ByteCountingOutputStream> streamMap = new HashMap<>();
+private final Queue<ContentClaim> queue = new LinkedList<>();
+private final int bufferSize;
+
+public ContentClaimWriteCache(final ContentRepository contentRepo) {
+this(contentRepo, 8192);
+}
+
+public ContentClaimWriteCache(final ContentRepository contentRepo, 
final int bufferSize) {
+this.contentRepo = contentRepo;
+this.bufferSize = bufferSize;
+}
+
+public void reset() throws IOException {
+try {
+forEachStream(OutputStream::close);
+} finally {
+streamMap.clear();
+queue.clear();
+}
+}
+
+public ContentClaim getContentClaim() throws IOException {
+final ContentClaim contentClaim = queue.poll();
+if (contentClaim != null) {
+contentRepo.incrementClaimaintCount(contentClaim);
+return contentClaim;
+}
+
+final ContentClaim claim = contentRepo.create(false);
+registerStream(claim);
+return claim;
+}
+
+private ByteCountingOutputStream registerStream(final ContentClaim 
contentClaim) throws IOException {
+final OutputStream out = contentRepo.write(contentClaim);
+final OutputStream buffered = new BufferedOutputStream(out, 
bufferSize);
+final ByteCountingOutputStream bcos = new 
ByteCountingOutputStream(buffered);
+streamMap.put(contentClaim.getResourceClaim(), bcos);
+return bcos;
+}
+
+public OutputStream write(final ContentClaim claim) throws IOException 
{
+OutputStream out = streamMap.get(claim.getResourceClaim());
+if (out == null) {
+out = registerStream(claim);
+}
+
+if (!(claim instanceof StandardContentClaim)) {
+// we know that we will only create Content Claims that are of 
type StandardContentClaim, so if we get anything
+// else, just throw an Exception because it is not valid for 
this Repository
+throw new IllegalArgumentException("Cannot write to " + claim 
+ " because that Content Claim does belong to this Claim Cache");
+}
+
+final StandardContentClaim scc = (StandardContentClaim) claim;
+final long initialLength = Math.max(0L, scc.getLength());
+
+final OutputStream bcos = out;
+return new OutputStream() {
+private long bytesWritten = 0L;
+
+@Override
+public void write(final int b) throws IOException {
+bcos.write(b);
+bytesWritten++;
+scc.setLength(initialLength + bytesWritten);
+}
+
+@Override
+public void write(byte[] b, int off, int len) throws 
IOException {
+bcos.write(b, off, len);
+bytesWritten += len;
+

[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101097614
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/claim/ContentClaimWriteCache.java
 ---
@@ -0,0 +1,168 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.controller.repository.claim;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.HashMap;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.Queue;
+
+import org.apache.nifi.controller.repository.ContentRepository;
+import org.apache.nifi.stream.io.ByteCountingOutputStream;
+
+public class ContentClaimWriteCache {
+private final ContentRepository contentRepo;
+private final Map<ResourceClaim, ByteCountingOutputStream> streamMap = new HashMap<>();
+private final Queue<ContentClaim> queue = new LinkedList<>();
+private final int bufferSize;
+
+public ContentClaimWriteCache(final ContentRepository contentRepo) {
+this(contentRepo, 8192);
+}
+
+public ContentClaimWriteCache(final ContentRepository contentRepo, 
final int bufferSize) {
+this.contentRepo = contentRepo;
+this.bufferSize = bufferSize;
+}
+
+public void reset() throws IOException {
+try {
+forEachStream(OutputStream::close);
+} finally {
+streamMap.clear();
+queue.clear();
+}
+}
+
+public ContentClaim getContentClaim() throws IOException {
+final ContentClaim contentClaim = queue.poll();
+if (contentClaim != null) {
+contentRepo.incrementClaimaintCount(contentClaim);
+return contentClaim;
+}
+
+final ContentClaim claim = contentRepo.create(false);
+registerStream(claim);
+return claim;
+}
+
+private ByteCountingOutputStream registerStream(final ContentClaim 
contentClaim) throws IOException {
+final OutputStream out = contentRepo.write(contentClaim);
+final OutputStream buffered = new BufferedOutputStream(out, 
bufferSize);
+final ByteCountingOutputStream bcos = new 
ByteCountingOutputStream(buffered);
+streamMap.put(contentClaim.getResourceClaim(), bcos);
+return bcos;
+}
+
+public OutputStream write(final ContentClaim claim) throws IOException 
{
+OutputStream out = streamMap.get(claim.getResourceClaim());
+if (out == null) {
+out = registerStream(claim);
+}
+
+if (!(claim instanceof StandardContentClaim)) {
+// we know that we will only create Content Claims that are of 
type StandardContentClaim, so if we get anything
+// else, just throw an Exception because it is not valid for 
this Repository
+throw new IllegalArgumentException("Cannot write to " + claim 
+ " because that Content Claim does belong to this Claim Cache");
+}
+
+final StandardContentClaim scc = (StandardContentClaim) claim;
+final long initialLength = Math.max(0L, scc.getLength());
+
+final OutputStream bcos = out;
+return new OutputStream() {
+private long bytesWritten = 0L;
+
+@Override
+public void write(final int b) throws IOException {
+bcos.write(b);
+bytesWritten++;
+scc.setLength(initialLength + bytesWritten);
+}
+
+@Override
+public void write(byte[] b, int off, int len) throws 
IOException {
+bcos.write(b, off, len);
+bytesWritten += len;
+

[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101094014
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/claim/ContentClaimWriteCache.java
 ---
@@ -0,0 +1,168 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.controller.repository.claim;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.HashMap;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.Queue;
+
+import org.apache.nifi.controller.repository.ContentRepository;
+import org.apache.nifi.stream.io.ByteCountingOutputStream;
+
+public class ContentClaimWriteCache {
+private final ContentRepository contentRepo;
+private final Map<ResourceClaim, ByteCountingOutputStream> streamMap = new HashMap<>();
+private final Queue<ContentClaim> queue = new LinkedList<>();
+private final int bufferSize;
+
+public ContentClaimWriteCache(final ContentRepository contentRepo) {
+this(contentRepo, 8192);
+}
+
+public ContentClaimWriteCache(final ContentRepository contentRepo, 
final int bufferSize) {
+this.contentRepo = contentRepo;
+this.bufferSize = bufferSize;
+}
+
+public void reset() throws IOException {
+try {
+forEachStream(OutputStream::close);
+} finally {
+streamMap.clear();
+queue.clear();
+}
+}
+
+public ContentClaim getContentClaim() throws IOException {
+final ContentClaim contentClaim = queue.poll();
+if (contentClaim != null) {
+contentRepo.incrementClaimaintCount(contentClaim);
+return contentClaim;
+}
+
+final ContentClaim claim = contentRepo.create(false);
+registerStream(claim);
+return claim;
+}
+
+private ByteCountingOutputStream registerStream(final ContentClaim 
contentClaim) throws IOException {
+final OutputStream out = contentRepo.write(contentClaim);
+final OutputStream buffered = new BufferedOutputStream(out, 
bufferSize);
+final ByteCountingOutputStream bcos = new 
ByteCountingOutputStream(buffered);
+streamMap.put(contentClaim.getResourceClaim(), bcos);
+return bcos;
+}
+
+public OutputStream write(final ContentClaim claim) throws IOException 
{
--- End diff --

The name _write(..)_ is quite confusing, since no writing actually happens 
in this method. I think something along the lines of _obtainStream(..)_ 
would be clearer.
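The pattern under discussion — wrapping an OutputStream so that every write also advances a counter — looks roughly like this. The class names are illustrative, not the NiFi types:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class CountingStreamDemo {
    // Delegating wrapper that tracks how many bytes pass through,
    // mirroring the length-tracking wrapper returned in the diff.
    static class CountingOutputStream extends OutputStream {
        private final OutputStream delegate;
        private long bytesWritten = 0L;

        CountingOutputStream(OutputStream delegate) {
            this.delegate = delegate;
        }

        @Override
        public void write(int b) throws IOException {
            delegate.write(b);
            bytesWritten++;
        }

        @Override
        public void write(byte[] b, int off, int len) throws IOException {
            delegate.write(b, off, len);
            bytesWritten += len;
        }

        long getBytesWritten() {
            return bytesWritten;
        }
    }

    public static void main(String[] args) throws IOException {
        CountingOutputStream out = new CountingOutputStream(new ByteArrayOutputStream());
        out.write("hello".getBytes());
        System.out.println(out.getBytesWritten()); // 5
    }
}
```

Note that a method returning such a wrapper performs no I/O itself, which is the crux of the naming complaint: a name like _obtainStream(..)_ describes the behavior better than _write(..)_.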


---


[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101088999
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/test/java/org/apache/nifi/controller/repository/TestWriteAheadFlowFileRepository.java
 ---
@@ -74,6 +86,295 @@ public void clearRepo() throws IOException {
 }
 }
 
+
+@Test
+public void testUpdatePerformance() throws IOException, InterruptedException {
--- End diff --

I don't think you meant to leave this test as is without @Ignore. It 
certainly takes time to complete.


---


[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101084757
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/claim/ContentClaimWriteCache.java
 ---
@@ -0,0 +1,168 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.controller.repository.claim;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.HashMap;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.Queue;
+
+import org.apache.nifi.controller.repository.ContentRepository;
+import org.apache.nifi.stream.io.ByteCountingOutputStream;
+
+public class ContentClaimWriteCache {
+private final ContentRepository contentRepo;
+private final Map<ResourceClaim, ByteCountingOutputStream> streamMap = new HashMap<>();
+private final Queue<ContentClaim> queue = new LinkedList<>();
+private final int bufferSize;
+
+public ContentClaimWriteCache(final ContentRepository contentRepo) {
+this(contentRepo, 8192);
+}
+
+public ContentClaimWriteCache(final ContentRepository contentRepo, 
final int bufferSize) {
+this.contentRepo = contentRepo;
+this.bufferSize = bufferSize;
+}
+
+public void reset() throws IOException {
+try {
+forEachStream(OutputStream::close);
+} finally {
+streamMap.clear();
+queue.clear();
+}
+}
+
+public ContentClaim getContentClaim() throws IOException {
+final ContentClaim contentClaim = queue.poll();
+if (contentClaim != null) {
+contentRepo.incrementClaimaintCount(contentClaim);
+return contentClaim;
+}
+
+final ContentClaim claim = contentRepo.create(false);
+registerStream(claim);
+return claim;
+}
+
+private ByteCountingOutputStream registerStream(final ContentClaim 
contentClaim) throws IOException {
--- End diff --

Also, since additional functionality (i.e., counter) of 
_ByteCountingOutputStream_ is used, do you even need the additional wrapper?


---


[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101084265
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/claim/ContentClaimWriteCache.java
 ---
@@ -0,0 +1,168 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.controller.repository.claim;
+
+import java.io.BufferedOutputStream;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.HashMap;
+import java.util.LinkedList;
+import java.util.Map;
+import java.util.Queue;
+
+import org.apache.nifi.controller.repository.ContentRepository;
+import org.apache.nifi.stream.io.ByteCountingOutputStream;
+
+public class ContentClaimWriteCache {
+private final ContentRepository contentRepo;
+private final Map<ResourceClaim, ByteCountingOutputStream> streamMap = new HashMap<>();
+private final Queue<ContentClaim> queue = new LinkedList<>();
+private final int bufferSize;
+
+public ContentClaimWriteCache(final ContentRepository contentRepo) {
+this(contentRepo, 8192);
+}
+
+public ContentClaimWriteCache(final ContentRepository contentRepo, 
final int bufferSize) {
+this.contentRepo = contentRepo;
+this.bufferSize = bufferSize;
+}
+
+public void reset() throws IOException {
+try {
+forEachStream(OutputStream::close);
+} finally {
+streamMap.clear();
+queue.clear();
+}
+}
+
+public ContentClaim getContentClaim() throws IOException {
+final ContentClaim contentClaim = queue.poll();
+if (contentClaim != null) {
+contentRepo.incrementClaimaintCount(contentClaim);
+return contentClaim;
+}
+
+final ContentClaim claim = contentRepo.create(false);
+registerStream(claim);
+return claim;
+}
+
+private ByteCountingOutputStream registerStream(final ContentClaim 
contentClaim) throws IOException {
--- End diff --

Does it have to return _ByteCountingOutputStream_? It seems like everywhere 
it is referenced it is used as _OutputStream_.


---


[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101080280
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -2074,7 +2074,25 @@ The Provenance Repository contains the information 
related to Data Provenance. T
 
 |
 |*Property*|*Description*
-|nifi.provenance.repository.implementation|The Provenance Repository 
implementation. The default value is 
org.apache.nifi.provenance.PersistentProvenanceRepository and should only be 
changed with caution. To store provenance events in memory instead of on disk 
(at the risk of data loss in the event of power/machine failure), set this 
property to org.apache.nifi.provenance.VolatileProvenanceRepository.
+|nifi.provenance.repository.implementation|The Provenance Repository 
implementation. The default value is 
org.apache.nifi.provenance.PersistentProvenanceRepository.
+Two additional repositories are available as well and should only be chosen 
with caution.
+To store provenance events in memory instead of on disk (at the risk of 
data loss in the event of power/machine failure),
+set this property to 
org.apache.nifi.provenance.VolatileProvenanceRepository. This leaves a 
configurable number of Provenance Events in the Java heap, so the number
+of events that can be retained is very limited. It has been used 
essentially as a no-op repository and is not recommended.
--- End diff --

I think all we need to say here is that _VolatileProvenanceRepository_ 
stores events in memory and is configurable. However, once the event buffer 
capacity is exceeded, events are evicted and essentially lost, and that is 
expected. There is no *risk* here, just different expectations.
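
A minimal sketch of that eviction semantic (not the actual 
VolatileProvenanceRepository code): a fixed-capacity buffer that silently 
drops the oldest event once full, by design rather than as a failure mode.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical fixed-capacity event buffer; names are illustrative only.
class BoundedEventBuffer<E> {
    private final int capacity;
    private final Deque<E> events = new ArrayDeque<>();

    BoundedEventBuffer(int capacity) {
        this.capacity = capacity;
    }

    synchronized void add(E event) {
        if (events.size() == capacity) {
            events.removeFirst(); // oldest event is evicted by design
        }
        events.addLast(event);
    }

    synchronized int size() {
        return events.size();
    }

    synchronized E oldest() {
        return events.peekFirst();
    }
}
```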




[GitHub] nifi pull request #1493: NIFI-3356: Initial implementation of writeahead pro...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1493#discussion_r101079356
  
--- Diff: nifi-docs/src/main/asciidoc/administration-guide.adoc ---
@@ -2074,7 +2074,25 @@ The Provenance Repository contains the information 
related to Data Provenance. T
 
 |
 |*Property*|*Description*
-|nifi.provenance.repository.implementation|The Provenance Repository 
implementation. The default value is 
org.apache.nifi.provenance.PersistentProvenanceRepository and should only be 
changed with caution. To store provenance events in memory instead of on disk 
(at the risk of data loss in the event of power/machine failure), set this 
property to org.apache.nifi.provenance.VolatileProvenanceRepository.
+|nifi.provenance.repository.implementation|The Provenance Repository 
implementation. The default value is 
org.apache.nifi.provenance.PersistentProvenanceRepository.
+Two additional repositories are available as and should only be changed 
with caution.
--- End diff --

I am not sure '_should only be changed with caution_' is necessary. It 
sounds like the other two are broken or may break something. In reality they 
don't. They just behave differently.




[GitHub] nifi issue #1493: NIFI-3356: Initial implementation of writeahead provenance...

2017-02-14 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1493
  
@markap14 there seem to be some unintended files that made it into the 
commit (e.g., /bin/.., .gitignore, etc.)




[GitHub] nifi issue #1394: NIFI-3255 removed dependency on session.merge from SplitTe...

2017-02-14 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1394
  
@ijokarumawak addressed some of the PR comments and commented on the 
others. Changed the name of the _merge(..)_ operation to 
_concatenateContents(..)_.
Pushed. . .




[GitHub] nifi pull request #1394: NIFI-3255 removed dependency on session.merge from ...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1394#discussion_r101050962
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/SplitText.java
 ---
@@ -326,6 +329,27 @@ public void process(InputStream in) throws IOException 
{
 return splitFlowFiles;
 }
 
+/**
+ * Will merge the two files. This operation has different semantics than
+ * session.merge. Also, it always expects 2 and no more than 2 flow files,
+ * so the vararg signature is primarily for simplification.
+ */
+private FlowFile merge(ProcessSession session, FlowFile... flowFiles) {
+FlowFile mergedFlowFile = session.create();
--- End diff --

Koji, as you can see, there are no source/target files here. Just an array 
of FlowFiles that need to be concatenated in the order they were provided. It 
is a simple internal utility operation which requires no provenance. 
The fact that I am storing header information in what is essentially a 
temporary FlowFile does not mean I am deriving A from B.




[GitHub] nifi pull request #1394: NIFI-3255 removed dependency on session.merge from ...

2017-02-14 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1394#discussion_r101050107
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/SplitText.java
 ---
@@ -326,6 +329,27 @@ public void process(InputStream in) throws IOException 
{
 return splitFlowFiles;
 }
 
+/**
+ * Will merge the two files. This operation has different semantics than
+ * session.merge. Also, it always expects 2 and no more than 2 flow files,
+ * so the vararg signature is primarily for simplification.
+ */
+private FlowFile merge(ProcessSession session, FlowFile... flowFiles) {
--- End diff --

So, in the context of this specific processor what you're saying is true - 
_prepend header_. But the operation itself has nothing to do with either 
_headers_ or _prepend_. It simply merges two files together in the order they 
have been supplied, and it should probably be considered a candidate for a 
static utility operation eventually. Anyway, I'll keep the name as is, but 
will update the javadoc.
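
A hedged sketch of that "merge in the order supplied" semantic, using plain 
java.io streams rather than NiFi's ProcessSession API; the class and method 
names here are hypothetical.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.io.UncheckedIOException;

// Hypothetical helper: concatenates byte arrays by chaining streams
// in the exact order supplied.
class ContentConcat {
    static byte[] concatenate(byte[]... parts) {
        InputStream combined = new ByteArrayInputStream(new byte[0]);
        for (byte[] part : parts) {
            // SequenceInputStream reads the first stream fully, then the second.
            combined = new SequenceInputStream(combined, new ByteArrayInputStream(part));
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int read;
        try {
            while ((read = combined.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e); // in-memory streams do not fail
        }
        return out.toByteArray();
    }
}
```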




[GitHub] nifi issue #1394: NIFI-3255 removed dependency on session.merge from SplitTe...

2017-02-14 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1394
  
@ijokarumawak sorry, didn't get a chance to get to it yet. Will definitely 
address.




[GitHub] nifi pull request #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON tr...

2017-02-07 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1436#discussion_r99842926
  
--- Diff: 
nifi-nar-bundles/nifi-registry-bundle/nifi-registry-processors/src/main/java/org/apache/nifi/schemaregistry/processors/AbstractCSVTransformerViaRegistryProcessor.java
 ---
@@ -0,0 +1,57 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.schemaregistry.processors;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.schemaregistry.services.SchemaRegistry;
+
+/**
+ * Base processor for implementing transform-like processors for CSV
+ * transformations that integrate with Schema Registry (see
+ * {@link SchemaRegistry})
+ */
+abstract class AbstractCSVTransformerViaRegistryProcessor extends 
BaseContentTransformerViaSchemaRegistry {
+
+static final List<PropertyDescriptor> BASE_CSV_DESCRIPTORS;
+
+static {
+List<PropertyDescriptor> descriptors = new ArrayList<>();
+descriptors.addAll(BASE_DESCRIPTORS);
+descriptors.add(DELIMITER);
+BASE_CSV_DESCRIPTORS = Collections.unmodifiableList(descriptors);
+}
+
+protected volatile char delimiter;
+
+@Override
+public List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+return BASE_CSV_DESCRIPTORS;
+}
+
+@Override
+@OnScheduled
+public void onScheduled(ProcessContext context) {
+super.onScheduled(context);
+this.delimiter = 
context.getProperty(DELIMITER).getValue().charAt(0);
--- End diff --

Very valid point - we do indeed have a utility class that parses based on 
multi-char delimiters. Will make that happen. Thanks for bringing this up.
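
The NiFi utility class mentioned above is not shown here; as a generic 
sketch of the idea, the full delimiter string can be honored (rather than 
just its first character) by quoting it so it is treated literally:

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical splitter: Pattern.quote makes the delimiter literal, so a
// multi-char delimiter such as "||" is not interpreted as regex alternation.
class MultiCharDelimiterSplitter {
    static List<String> split(String line, String delimiter) {
        // limit -1 keeps trailing empty fields
        return Arrays.asList(line.split(Pattern.quote(delimiter), -1));
    }
}
```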




[GitHub] nifi issue #1449: NIFI-3223 added support for expression language

2017-01-31 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1449
  
@pvillard31 yes you can
```
runner.setProperty(MyProcessor.SOME_PROPERTY, "${some.attribute.key}");
Map<String, String> attributes = new HashMap<>();
attributes.put("some.attribute.key", "Bonjour Pierre");
runner.enqueue("Hello World\nGoodbye".getBytes(StandardCharsets.UTF_8), attributes);
```




[GitHub] nifi issue #1452: NIFI-3179 Added support for default UTF-8 char encoding

2017-01-30 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1452
  
@pvillard31 should be all good now, sorry about that




[GitHub] nifi pull request #1452: NIFI-3179 Added support for default UTF-8 char enco...

2017-01-30 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1452#discussion_r98547614
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/MergeContent.java
 ---
@@ -780,7 +781,7 @@ public void process(final InputStream rawIn) throws 
IOException {
 if 
(attributes.containsKey(CoreAttributes.MIME_TYPE.key())) {
 attributes.put("content-type", 
attributes.get(CoreAttributes.MIME_TYPE.key()));
 }
-
+StandCh
--- End diff --

Wow! Indeed strange




[GitHub] nifi pull request #1401: NIFI-3290 Reporting task to send bulletins with S2S

2017-01-29 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1401#discussion_r98357001
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteBulletinReportingTask.java
 ---
@@ -0,0 +1,235 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.reporting;
+
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.text.DateFormat;
+import java.text.SimpleDateFormat;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.TimeZone;
+import java.util.UUID;
+import java.util.concurrent.TimeUnit;
+
+import javax.json.Json;
+import javax.json.JsonArray;
+import javax.json.JsonArrayBuilder;
+import javax.json.JsonBuilderFactory;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+
+import org.apache.nifi.annotation.behavior.Restricted;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.configuration.DefaultSchedule;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.remote.Transaction;
+import org.apache.nifi.remote.TransferDirection;
+import org.apache.nifi.scheduling.SchedulingStrategy;
+
+@Tags({"bulletin", "site", "site to site", "restricted"})
+@CapabilityDescription("Publishes Bulletin events using the Site To Site 
protocol. Note: only up to 5 bulletins are stored per component and up to "
++ "10 bulletins at controller level for a duration of up to 5 
minutes. If this reporting task is not scheduled frequently enough some 
bulletins "
++ "may not be sent.")
+@Stateful(scopes = Scope.LOCAL, description = "Stores the Reporting Task's 
last bulletin ID so that on restart the task knows where it left off.")
+@Restricted("Provides operator the ability to send sensitive details 
contained in bulletin events to any external system.")
+@DefaultSchedule(strategy = SchedulingStrategy.TIMER_DRIVEN, period = "1 
min")
+public class SiteToSiteBulletinReportingTask extends 
AbstractSiteToSiteReportingTask {
+
+static final String TIMESTAMP_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
+static final String LAST_EVENT_ID_KEY = "last_event_id";
+
+static final PropertyDescriptor PLATFORM = new 
PropertyDescriptor.Builder()
+.name("Platform")
+.description("The value to use for the platform field in each 
provenance event.")
+.required(true)
+.expressionLanguageSupported(true)
+.defaultValue("nifi")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+private volatile long lastSentBulletinId = -1L;
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = new ArrayList<>();
+properties.add(DESTINATION_URL);
+properties.add(PORT_NAME);
+properties.add(SSL_CONTEXT);
+properties.add(COMPRESS);
+properties.add(TIMEOUT);
+properties.add(PLATFORM);
+return properties;
+}
+
+@Override
+public void onTrigger(final ReportingContext context) {
+
+final boolean isClustered = context.isClustered();
+fin

[GitHub] nifi pull request #1401: NIFI-3290 Reporting task to send bulletins with S2S

2017-01-29 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1401#discussion_r98356790
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteBulletinReportingTask.java
 ---
@@ -0,0 +1,235 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.reporting;
+
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.text.DateFormat;
+import java.text.SimpleDateFormat;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.TimeZone;
+import java.util.UUID;
+import java.util.concurrent.TimeUnit;
+
+import javax.json.Json;
+import javax.json.JsonArray;
+import javax.json.JsonArrayBuilder;
+import javax.json.JsonBuilderFactory;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+
+import org.apache.nifi.annotation.behavior.Restricted;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.configuration.DefaultSchedule;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.remote.Transaction;
+import org.apache.nifi.remote.TransferDirection;
+import org.apache.nifi.scheduling.SchedulingStrategy;
+
+@Tags({"bulletin", "site", "site to site", "restricted"})
+@CapabilityDescription("Publishes Bulletin events using the Site To Site 
protocol. Note: only up to 5 bulletins are stored per component and up to "
++ "10 bulletins at controller level for a duration of up to 5 
minutes. If this reporting task is not scheduled frequently enough some 
bulletins "
++ "may not be sent.")
+@Stateful(scopes = Scope.LOCAL, description = "Stores the Reporting Task's 
last bulletin ID so that on restart the task knows where it left off.")
+@Restricted("Provides operator the ability to send sensitive details 
contained in bulletin events to any external system.")
+@DefaultSchedule(strategy = SchedulingStrategy.TIMER_DRIVEN, period = "1 
min")
+public class SiteToSiteBulletinReportingTask extends 
AbstractSiteToSiteReportingTask {
+
+static final String TIMESTAMP_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
+static final String LAST_EVENT_ID_KEY = "last_event_id";
+
+static final PropertyDescriptor PLATFORM = new 
PropertyDescriptor.Builder()
+.name("Platform")
+.description("The value to use for the platform field in each 
provenance event.")
+.required(true)
+.expressionLanguageSupported(true)
+.defaultValue("nifi")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+private volatile long lastSentBulletinId = -1L;
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = new ArrayList<>();
+properties.add(DESTINATION_URL);
+properties.add(PORT_NAME);
+properties.add(SSL_CONTEXT);
+properties.add(COMPRESS);
+properties.add(TIMEOUT);
+properties.add(PLATFORM);
+return properties;
+}
+
+@Override
+public void onTrigger(final ReportingContext context) {
+
+final boolean isClustered = context.isClustered();
+fin

[GitHub] nifi pull request #1401: NIFI-3290 Reporting task to send bulletins with S2S

2017-01-29 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1401#discussion_r98356724
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteBulletinReportingTask.java
 ---
@@ -0,0 +1,235 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.reporting;
+
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.text.DateFormat;
+import java.text.SimpleDateFormat;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.TimeZone;
+import java.util.UUID;
+import java.util.concurrent.TimeUnit;
+
+import javax.json.Json;
+import javax.json.JsonArray;
+import javax.json.JsonArrayBuilder;
+import javax.json.JsonBuilderFactory;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+
+import org.apache.nifi.annotation.behavior.Restricted;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.configuration.DefaultSchedule;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.remote.Transaction;
+import org.apache.nifi.remote.TransferDirection;
+import org.apache.nifi.scheduling.SchedulingStrategy;
+
+@Tags({"bulletin", "site", "site to site", "restricted"})
+@CapabilityDescription("Publishes Bulletin events using the Site To Site 
protocol. Note: only up to 5 bulletins are stored per component and up to "
++ "10 bulletins at controller level for a duration of up to 5 
minutes. If this reporting task is not scheduled frequently enough some 
bulletins "
++ "may not be sent.")
+@Stateful(scopes = Scope.LOCAL, description = "Stores the Reporting Task's 
last bulletin ID so that on restart the task knows where it left off.")
+@Restricted("Provides operator the ability to send sensitive details 
contained in bulletin events to any external system.")
+@DefaultSchedule(strategy = SchedulingStrategy.TIMER_DRIVEN, period = "1 
min")
+public class SiteToSiteBulletinReportingTask extends 
AbstractSiteToSiteReportingTask {
+
+static final String TIMESTAMP_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
+static final String LAST_EVENT_ID_KEY = "last_event_id";
+
+static final PropertyDescriptor PLATFORM = new 
PropertyDescriptor.Builder()
+.name("Platform")
+.description("The value to use for the platform field in each 
provenance event.")
+.required(true)
+.expressionLanguageSupported(true)
+.defaultValue("nifi")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+private volatile long lastSentBulletinId = -1L;
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = new ArrayList<>();
+properties.add(DESTINATION_URL);
+properties.add(PORT_NAME);
+properties.add(SSL_CONTEXT);
+properties.add(COMPRESS);
+properties.add(TIMEOUT);
+properties.add(PLATFORM);
+return properties;
+}
+
+@Override
+public void onTrigger(final ReportingContext context) {
+
+final boolean isClustered = context.isClustered();
+fin

[GitHub] nifi pull request #1401: NIFI-3290 Reporting task to send bulletins with S2S

2017-01-29 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1401#discussion_r98356661
  
--- Diff: 
nifi-nar-bundles/nifi-site-to-site-reporting-bundle/nifi-site-to-site-reporting-task/src/main/java/org/apache/nifi/reporting/SiteToSiteBulletinReportingTask.java
 ---
@@ -0,0 +1,235 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.reporting;
+
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.text.DateFormat;
+import java.text.SimpleDateFormat;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.TimeZone;
+import java.util.UUID;
+import java.util.concurrent.TimeUnit;
+
+import javax.json.Json;
+import javax.json.JsonArray;
+import javax.json.JsonArrayBuilder;
+import javax.json.JsonBuilderFactory;
+import javax.json.JsonObject;
+import javax.json.JsonObjectBuilder;
+
+import org.apache.nifi.annotation.behavior.Restricted;
+import org.apache.nifi.annotation.behavior.Stateful;
+import org.apache.nifi.annotation.configuration.DefaultSchedule;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.state.Scope;
+import org.apache.nifi.components.state.StateManager;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.remote.Transaction;
+import org.apache.nifi.remote.TransferDirection;
+import org.apache.nifi.scheduling.SchedulingStrategy;
+
+@Tags({"bulletin", "site", "site to site", "restricted"})
+@CapabilityDescription("Publishes Bulletin events using the Site To Site 
protocol. Note: only up to 5 bulletins are stored per component and up to "
++ "10 bulletins at controller level for a duration of up to 5 
minutes. If this reporting task is not scheduled frequently enough some 
bulletins "
++ "may not be sent.")
+@Stateful(scopes = Scope.LOCAL, description = "Stores the Reporting Task's 
last bulletin ID so that on restart the task knows where it left off.")
+@Restricted("Provides operator the ability to send sensitive details 
contained in bulletin events to any external system.")
+@DefaultSchedule(strategy = SchedulingStrategy.TIMER_DRIVEN, period = "1 
min")
+public class SiteToSiteBulletinReportingTask extends 
AbstractSiteToSiteReportingTask {
+
+static final String TIMESTAMP_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
+static final String LAST_EVENT_ID_KEY = "last_event_id";
+
+static final PropertyDescriptor PLATFORM = new 
PropertyDescriptor.Builder()
+.name("Platform")
+.description("The value to use for the platform field in each 
provenance event.")
+.required(true)
+.expressionLanguageSupported(true)
+.defaultValue("nifi")
+.addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
+.build();
+
+private volatile long lastSentBulletinId = -1L;
+
+@Override
+protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
+final List<PropertyDescriptor> properties = new ArrayList<>();
+properties.add(DESTINATION_URL);
+properties.add(PORT_NAME);
+properties.add(SSL_CONTEXT);
+properties.add(COMPRESS);
+properties.add(TIMEOUT);
+properties.add(PLATFORM);
+return properties;
+}
--- End diff --

@pvillard31 given the fact that this operation is invoked multiple times 
consider applying the same pattern we use in Processors where property 
descr

[GitHub] nifi issue #1401: NIFI-3290 Reporting task to send bulletins with S2S

2017-01-29 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1401
  
Reviewing. . .




[GitHub] nifi pull request #1400: NIFI-957 Added the possibility to use DefaultSchedu...

2017-01-29 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1400#discussion_r98356568
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/reporting/AbstractReportingTaskNode.java
 ---
@@ -72,6 +77,27 @@ public AbstractReportingTaskNode(final ReportingTask 
reportingTask, final String
 this.reportingTask = reportingTask;
 this.processScheduler = processScheduler;
 this.serviceLookup = controllerServiceProvider;
+
+final Class<?> reportingClass = reportingTask.getClass();
+
+try {
+if (reportingClass.isAnnotationPresent(DefaultSchedule.class)) 
{
+DefaultSchedule dsc = 
reportingClass.getAnnotation(DefaultSchedule.class);
+try {
+this.setSchedulingStrategy(dsc.strategy());
+} catch (Throwable ex) {
+LOG.error(String.format("Error while setting 
scheduling strategy from DefaultSchedule annotation: %s", ex.getMessage()), ex);
+}
+try {
+this.setSchedulingPeriod(dsc.period());
+} catch (Throwable ex) {
+
this.setSchedulingStrategy(SchedulingStrategy.TIMER_DRIVEN);
+LOG.error(String.format("Error while setting 
scheduling period from DefaultSchedule annotation: %s", ex.getMessage()), ex);
+}
+}
+} catch (Throwable ex) {
+LOG.error(String.format("Error while setting default schedule 
from DefaultSchedule annotation: %s",ex.getMessage()),ex);
--- End diff --

Wondering, based on the code above and the internal try/catch blocks: is it 
even possible to get here?




[GitHub] nifi pull request #1400: NIFI-957 Added the possibility to use DefaultSchedu...

2017-01-29 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1400#discussion_r98356528
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/reporting/AbstractReportingTaskNode.java
 ---
@@ -72,6 +77,27 @@ public AbstractReportingTaskNode(final ReportingTask 
reportingTask, final String
 this.reportingTask = reportingTask;
 this.processScheduler = processScheduler;
 this.serviceLookup = controllerServiceProvider;
+
+final Class<?> reportingClass = reportingTask.getClass();
+
+try {
+if (reportingClass.isAnnotationPresent(DefaultSchedule.class)) 
{
+DefaultSchedule dsc = 
reportingClass.getAnnotation(DefaultSchedule.class);
--- End diff --

I am wondering if we should instead use Spring's AnnotationUtils and its 
discovery operations there. Basically the question is: if you are *only* 
looking for the annotation on the specific class _reportingTask.getClass()_, 
then the above is OK, I guess, but I am afraid that if the impl of the 
ReportingTask has a more complex hierarchy and the annotated operation is in 
some super class, then the above will not work. WDYT?
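The concern can be illustrated with a small sketch: `Class.getAnnotation` only sees annotations declared on the class itself unless the annotation type is meta-annotated `@Inherited`, so a hierarchy-walking lookup (which is essentially what Spring's `AnnotationUtils.findAnnotation` does) is needed to find an annotation declared on a super class. The `@Sched` annotation below is a hypothetical stand-in for NiFi's `@DefaultSchedule`:

```java
import java.lang.annotation.Annotation;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Hypothetical stand-in for @DefaultSchedule; note it is NOT @Inherited,
// so Class.getAnnotation on a subclass will not see it.
@Retention(RetentionPolicy.RUNTIME)
@interface Sched {
    String period() default "5 min";
}

@Sched(period = "1 min")
abstract class BaseTask { }

// The annotation lives on the super class, not here.
class ConcreteTask extends BaseTask { }

class AnnotationLookup {
    // Walk up the class hierarchy, the way AnnotationUtils.findAnnotation
    // does, so annotations declared on a super class are still discovered.
    static <A extends Annotation> A findAnnotation(Class<?> clazz, Class<A> type) {
        for (Class<?> c = clazz; c != null; c = c.getSuperclass()) {
            A ann = c.getAnnotation(type);
            if (ann != null) {
                return ann;
            }
        }
        return null;
    }
}
```

With this setup, `ConcreteTask.class.getAnnotation(Sched.class)` returns null while the hierarchy walk finds the super class annotation, which is exactly the failure mode the comment is worried about.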




[GitHub] nifi issue #1400: NIFI-957 Added the possibility to use DefaultSchedule anno...

2017-01-29 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1400
  
Reviewing. . .
@pvillard31 could you please address the merge conflict? It looks trivial.




[GitHub] nifi issue #1419: NIFI-3314 Adjusting Dockerfile for Docker Hub to use defau...

2017-01-29 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1419
  
LGTM, merging





[GitHub] nifi pull request #1453: NIFI-3161 added ASM exclusion

2017-01-29 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1453

NIFI-3161 added ASM exclusion

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3161

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1453.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1453


commit b56fee8fb8d409081a463646cbfd2b8538c1bea0
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-29T15:34:54Z

NIFI-3161 added ASM exclusion






[GitHub] nifi pull request #1452: NIFI-3179 Added support for default UTF-8 char enco...

2017-01-29 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1452

NIFI-3179 Added support for default UTF-8 char encoding

removed deprecated usage of BAOS and BAIS



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3179

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1452.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1452


commit 2d346aa5b3c47d9b3e792eb027034109dba1c181
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-29T15:11:09Z

NIFI-3179 Added support for default UTF-8 char encoding
removed deprecated usage of BAOS and BAIS






[GitHub] nifi pull request #1451: NIFI-3180 Fixed NPE in TemplateUtils

2017-01-29 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1451

NIFI-3180 Fixed NPE in TemplateUtils

added null check for ProcessorDTO.getRelationship()
removed deprecated usage of ByteArrayInputStream



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3180

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1451.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1451


commit a22c7911a0a19530737ecfbe346f3e6e170366da
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-29T14:17:35Z

NIFI-3180 Fixed NPE in TemplateUtils
added null check for ProcessorDTO.getRelationship()
removed deprecated usage of ByteArrayInputStream






[GitHub] nifi pull request #1449: NIFI-3223 added support for expression language

2017-01-27 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1449

NIFI-3223 added support for expression language

- EXCHANGE
- ROUTING_KEY



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3223

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1449.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1449


commit 0b2d1417a948804cebe121acde19bc5f2184da44
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-27T18:41:54Z

NIFI-3223 added support for expression language
- EXCHANGE
- ROUTING_KEY






[GitHub] nifi issue #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON transform...

2017-01-27 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1436
  
@joewitt copyrights are fixed. Please see the latest commit.




[GitHub] nifi issue #1425: NIFI-3363: PutKafka NPE with User-Defined partition

2017-01-27 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1425
  
Tested with Kafka 0.8 broker, looks good, will merge once build completes




[GitHub] nifi issue #1425: NIFI-3363: PutKafka NPE with User-Defined partition

2017-01-26 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1425
  
Reviewing. . .




[GitHub] nifi issue #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON transform...

2017-01-24 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1436
  
@joewitt Thanks Joe. That was indeed a copy/paste issue from an unrelated 
internal effort. Will fix.




[GitHub] nifi pull request #1434: NIFI-2615 Adding GetTCP processor

2017-01-23 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1434#discussion_r97348509
  
--- Diff: 
nifi-nar-bundles/nifi-tcp-bundle/nifi-tcp-nar/src/main/resources/META-INF/NOTICE
 ---
@@ -0,0 +1,24 @@
+nifi-gettcp-nar
--- End diff --

Change name to _nifi-tcp-nar_




[GitHub] nifi pull request #1434: NIFI-2615 Adding GetTCP processor

2017-01-23 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1434#discussion_r97348408
  
--- Diff: nifi-nar-bundles/nifi-tcp-bundle/.gitignore ---
@@ -0,0 +1,16 @@
+target
+.project
+.settings
+.classpath
+nbactions.xml
+nb-configuration.xml
+.DS_Store
+.metadata
+.recommenders
+
+# Intellij
+.idea/
+*.iml
+*.iws
+*~
+
--- End diff --

You may want to remove this .gitignore since all of these entries are already 
covered by the root one.




[GitHub] nifi pull request #1434: NIFI-2615 Adding GetTCP processor

2017-01-23 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1434#discussion_r97349937
  
--- Diff: 
nifi-nar-bundles/nifi-tcp-bundle/nifi-tcp-processors/src/main/java/org/apache/nifi/processors/gettcp/DisconnectListener.java
 ---
@@ -0,0 +1,6 @@
+package org.apache.nifi.processors.gettcp;
+
+public interface DisconnectListener {
+
+void onDisconnect(ReceivingClient client);
+}
--- End diff --

This class is actually not used anywhere. Accidental leftover (my bad), so 
please remove it.




[GitHub] nifi pull request #1436: NIFI-3354 Added support for simple AVRO/CSV/JSON tr...

2017-01-22 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1436

NIFI-3354 Added support for simple AVRO/CSV/JSON transformers that ut…

…ilize external Schema

Added support for simple Key/Value Schema Registry as Controller Service
Added support for registering multiple schemas as dynamic properties of 
Schema Registry  Controller Service
Added the following 8 processors
- ExtractAvroFieldsViaSchemaRegistry
- TransformAvroToCSVViaSchemaRegistry
- TransformAvroToJsonViaSchemaRegistry
- TransformCSVToAvroViaSchemaRegistry
- TransformCSVToJsonViaSchemaRegistry
- TransformJsonToAvroViaSchemaRegistry
- TransformJsonToCSVViaSchemaRegistry
- UpdateAttributeWithSchemaViaSchemaRegistry



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3354

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1436.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1436


commit e4fda15f573a8ff4a40b0e98dab5833ef9c6c7f5
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-20T15:04:48Z

NIFI-3354 Added support for simple AVRO/CSV/JSON transformers that utilize 
external Schema
Added support for simple Key/Value Schema Registry as Controller Service
Added support for registering multiple schemas as dynamic properties of 
Schema Registry  Controller Service
Added the following 8 processors
- ExtractAvroFieldsViaSchemaRegistry
- TransformAvroToCSVViaSchemaRegistry
- TransformAvroToJsonViaSchemaRegistry
- TransformCSVToAvroViaSchemaRegistry
- TransformCSVToJsonViaSchemaRegistry
- TransformJsonToAvroViaSchemaRegistry
- TransformJsonToCSVViaSchemaRegistry
- UpdateAttributeWithSchemaViaSchemaRegistry






[GitHub] nifi pull request #1435: NIFI-3345 Added support for simple AVRO/CSV/JSON tr...

2017-01-22 Thread olegz
Github user olegz closed the pull request at:

https://github.com/apache/nifi/pull/1435




[GitHub] nifi issue #1435: NIFI-3345 Added support for simple AVRO/CSV/JSON transform...

2017-01-22 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1435
  
Closing due to wrong commit message, will resubmit 




[GitHub] nifi pull request #1435: NIFI-3345 Added support for simple AVRO/CSV/JSON tr...

2017-01-22 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1435

NIFI-3345 Added support for simple AVRO/CSV/JSON transformers that ut…

…ilize external Schema

Added support for simple Key/Value Schema Registry as Controller Service
Added support for registering multiple schemas as dynamic properties of 
Schema Registry  Controller Service
Added the following 8 processors
- ExtractAvroFieldsViaSchemaRegistry
- TransformAvroToCSVViaSchemaRegistry
- TransformAvroToJsonViaSchemaRegistry
- TransformCSVToAvroViaSchemaRegistry
- TransformCSVToJsonViaSchemaRegistry
- TransformJsonToAvroViaSchemaRegistry
- TransformJsonToCSVViaSchemaRegistry
- UpdateAttributeWithSchemaViaSchemaRegistry



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3354

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1435.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1435


commit 61cc69b3df687b3468fb82de8cc29122993dcb30
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-20T15:04:48Z

NIFI-3345 Added support for simple AVRO/CSV/JSON transformers that utilize 
external Schema
Added support for simple Key/Value Schema Registry as Controller Service
Added support for registering multiple schemas as dynamic properties of 
Schema Registry  Controller Service
Added the following 8 processors
- ExtractAvroFieldsViaSchemaRegistry
- TransformAvroToCSVViaSchemaRegistry
- TransformAvroToJsonViaSchemaRegistry
- TransformCSVToAvroViaSchemaRegistry
- TransformCSVToJsonViaSchemaRegistry
- TransformJsonToAvroViaSchemaRegistry
- TransformJsonToCSVViaSchemaRegistry
- UpdateAttributeWithSchemaViaSchemaRegistry






[GitHub] nifi issue #1433: NIFI-2615 Adding support for GetTCP processor

2017-01-19 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1433
  
@mosermw +1 to the name change!




[GitHub] nifi issue #1118: NIFI-2716 initial implementation of JMS Broker processor

2017-01-08 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1118
  
There seems to be no traction at the moment, closing for now.




[GitHub] nifi issue #1393: NIFI-3225 removed dependency on session.merge from SplitTe...

2017-01-04 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1393
  
Closing due to typo in JIRA #




[GitHub] nifi pull request #1393: NIFI-3225 removed dependency on session.merge from ...

2017-01-04 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1393

NIFI-3225 removed dependency on session.merge from SplitText

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI- where  is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3225B

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1393.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1393


commit ca0103719b02a884f2d70f88a6160bbed5c604b1
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-04T20:37:30Z

NIFI-3225 removed dependency on session.merge from SplitText






[GitHub] nifi pull request #1389: NIFI-3278 Fixed ArrayIndexOutOfBoundsException in T...

2017-01-04 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1389#discussion_r94649300
  
--- Diff: 
nifi-commons/nifi-utils/src/main/java/org/apache/nifi/stream/io/util/TextLineDemarcator.java
 ---
@@ -159,8 +159,11 @@ private int isEol(byte currentByte, int currentIndex) {
 this.index = currentIndex + 1;
 this.fill();
 }
-currentByte = this.buffer[currentIndex + 1];
-crlfLength = currentByte == '\n' ? 2 : 1;
+crlfLength = 1;
+if (this.bufferLength != -1) {
--- End diff --

Hmm, why? Basically after the above call to _fill()_ it will be -1 or . . . 
Am I missing your point?
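
A minimal stand-alone sketch of the guard being discussed, assuming _fill()_ 
signals end of stream by leaving the buffer length at -1 (the method name and 
signature here are illustrative, not the actual TextLineDemarcator code):

```java
// Illustrative stand-in for the isEol() look-ahead: after fill(), a
// bufferLength of -1 means there is no byte after the '\r', so indexing
// buffer[crIndex + 1] would throw ArrayIndexOutOfBoundsException.
public class EolLengthSketch {
    public static int eolLength(byte[] buffer, int crIndex, int bufferLength) {
        if (bufferLength == -1 || crIndex + 1 >= bufferLength) {
            return 1; // lone '\r' at end of stream: the EOL is one byte long
        }
        return buffer[crIndex + 1] == '\n' ? 2 : 1; // CRLF vs. bare CR
    }

    public static void main(String[] args) {
        System.out.println(eolLength(new byte[]{'a', '\r'}, 1, -1)); // 1
        System.out.println(eolLength(new byte[]{'\r', '\n'}, 0, 2)); // 2
    }
}
```

The point of contention above is exactly the first branch: without the 
bufferLength check, the look-ahead index runs past the refilled buffer.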




[GitHub] nifi pull request #1389: NIFI-3278 Fixed ArrayIndexOutOfBoundsException in T...

2017-01-04 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1389

NIFI-3278 Fixed ArrayIndexOutOfBoundsException in TextLineDemarcator



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-3278

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1389.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1389


commit de3a4e0a62b06fdc97465b85043f35f6d31168f1
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2017-01-04T17:09:34Z

NIFI-3278 Fixed ArrayIndexOutOfBoundsException in TextLineDemarcator






[GitHub] nifi issue #1289: NIFI-3141: Fixed TailFile ArrayIndexOutOfBounds.

2016-12-02 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1289
  
Reviewing. . .




[GitHub] nifi pull request #1278: NIFI-2886 Fixed the lifecycle delay

2016-11-29 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1278

NIFI-2886 Fixed the lifecycle delay


- Fixed the intrinsic lifecycle delay on the Processor caused by the 
administrativeYield wait on the re-submitted OnScheduled task

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi NIFI-2886

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1278.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1278


commit 706e167ea282b37008c2140bbc2c1838f382198e
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2016-11-29T13:42:09Z

NIFI-2886 Fixed the lifecycle delay
- Fixed the intrinsic lifecycle delay on the Processor caused by the 
administrativeYield wait on the re-submitted OnScheduled task






[GitHub] nifi pull request #1259: Added posix guard to ignore tests on windows

2016-11-22 Thread olegz
GitHub user olegz opened a pull request:

https://github.com/apache/nifi/pull/1259

Added posix guard to ignore tests on windows




You can merge this pull request into a Git repository by running:

$ git pull https://github.com/olegz/nifi GetFile

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/1259.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1259


commit 6296242acdd38bedcf7298664743e28a6c90c78b
Author: Oleg Zhurakousky <o...@suitcase.io>
Date:   2016-11-22T21:31:52Z

Added posix guard to ignore tests on windows






[GitHub] nifi issue #1221: NIFI-3024 - Encrypt-toolkit flow.xml.gz support

2016-11-21 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1221
  
@brosander not sure if you noticed there is a merge conflict that needs to 
be addressed




[GitHub] nifi issue #1245: Nifi 3066

2016-11-21 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1245
  
+1, merging




[GitHub] nifi pull request #1233: NIFI-3011: Added Elasticsearch5 processors

2016-11-21 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1233#discussion_r88927684
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-5-processors/src/main/java/org/apache/nifi/processors/elasticsearch/FetchElasticsearch5.java
 ---
@@ -0,0 +1,222 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.elasticsearch;
+
+import org.apache.nifi.annotation.behavior.EventDriven;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.SupportsBatching;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.elasticsearch.ElasticsearchTimeoutException;
+import org.elasticsearch.action.get.GetRequestBuilder;
+import org.elasticsearch.action.get.GetResponse;
+import org.elasticsearch.client.transport.NoNodeAvailableException;
+import org.elasticsearch.node.NodeClosedException;
+import org.elasticsearch.transport.ReceiveTimeoutTransportException;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import java.nio.charset.Charset;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+
+@InputRequirement(InputRequirement.Requirement.INPUT_REQUIRED)
+@EventDriven
+@SupportsBatching
+@Tags({"elasticsearch", "elasticsearch 5", "fetch", "read", "get"})
+@CapabilityDescription("Retrieves a document from Elasticsearch using the 
specified connection properties and the "
++ "identifier of the document to retrieve. If the cluster has been 
configured for authorization and/or secure "
++ "transport (SSL/TLS), and the X-Pack plugin is available, secure 
connections can be made. This processor "
++ "supports Elasticsearch 5.x clusters.")
+@WritesAttributes({
+@WritesAttribute(attribute = "filename", description = "The 
filename attributes is set to the document identifier"),
+@WritesAttribute(attribute = "es.index", description = "The 
Elasticsearch index containing the document"),
+@WritesAttribute(attribute = "es.type", description = "The 
Elasticsearch document type")
+})
+public class FetchElasticsearch5 extends 
AbstractElasticsearch5TransportClientProcessor {
+
+public static final Relationship REL_SUCCESS = new 
Relationship.Builder().name("success")
+.description("All FlowFiles that are read from Elasticsearch 
are routed to this relationship").build();
+
+public static final Relationship REL_FAILURE = new 
Relationship.Builder().name("failure")
+.description("All FlowFiles that cannot be read from 
Elasticsearch are routed to this relationship").build();
+
+public static final Relationship REL_RETRY = new 
Relationship.Builder().name("retry")
+.description("A FlowFile is routed to this relationship if the 
document cannot b

[GitHub] nifi pull request #1233: NIFI-3011: Added Elasticsearch5 processors

2016-11-20 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1233#discussion_r88806479
  
--- Diff: 
nifi-nar-bundles/nifi-elasticsearch-bundle/nifi-elasticsearch-5-processors/src/main/java/org/apache/nifi/processors/elasticsearch/AbstractElasticsearch5Processor.java
 ---
@@ -0,0 +1,97 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.elasticsearch;
+
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.StringUtils;
+
+import java.util.Collection;
+import java.util.HashSet;
+import java.util.Set;
+
+/**
+ * A base class for all Elasticsearch processors
+ */
+public abstract class AbstractElasticsearch5Processor extends 
AbstractProcessor {
--- End diff --

Minor, but looking at AbstractElasticsearch5TransportClientProcessor, does 
this one have to be public?




[GitHub] nifi issue #1236: LDAP - Configurable strategy to identify users

2016-11-19 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1236
  
@mcgilman @ijokarumawak squashing and merging




[GitHub] nifi issue #1236: LDAP - Configurable strategy to identify users

2016-11-18 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1236
  
I am +1 now. If @ijokarumawak has no issues let's merge




[GitHub] nifi issue #1202: NIFI-2854: Refactor repositories and swap files to use sch...

2016-11-18 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1202
  
LGTM, merging




[GitHub] nifi pull request #1244: NIFI-2954 Moved StandardPropertyValidator to nifi-u...

2016-11-18 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1244#discussion_r88666833
  
--- Diff: 
nifi-nar-bundles/nifi-avro-bundle/nifi-avro-processors/src/main/java/org/apache/nifi/processors/avro/ExtractAvroMetadata.java
 ---
@@ -54,7 +54,7 @@
 import org.apache.nifi.processor.Relationship;
 import org.apache.nifi.processor.exception.ProcessException;
 import org.apache.nifi.processor.io.InputStreamCallback;
-import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.util.validator.StandardValidators;
--- End diff --

Wouldn't this break all the existing processors that are not part of NiFi 
distribution? Basically any custom/in-house developed processor?




[GitHub] nifi pull request #1244: NIFI-2954 Moved StandardPropertyValidator to nifi-u...

2016-11-18 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1244#discussion_r88665853
  
--- Diff: 
nifi-commons/nifi-utils/src/main/java/org/apache/nifi/util/file/classloader/ClassLoaderUtils.java
 ---
@@ -66,12 +66,16 @@ public static ClassLoader getCustomClassLoader(String 
modulePath, ClassLoader pa
 if (modulePaths != null) {
 modulePaths.stream()
 .flatMap(path -> Arrays.stream(path.split(",")))
-.filter(StringUtils::isNotBlank)
+.filter(path -> isNotBlank(path))
 .map(String::trim)
 .forEach(m -> modules.add(m));
 }
 return toURLs(modules, filenameFilter, suppressExceptions);
 }
+
+private static boolean isNotBlank(final String value){
+return value != null && !value.trim().isEmpty();
+}
--- End diff --

Not sure why this method was added, since StringUtils.isNotBlank already 
checks for whitespace.
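
A minimal sketch of the hand-rolled check as it appears in the diff above; 
for plain whitespace its behavior matches Commons Lang's 
StringUtils.isNotBlank, which is the reviewer's point (the class name here is 
illustrative):

```java
// Local re-implementation of a not-blank check, as in the PR under review.
// For ordinary whitespace this duplicates StringUtils.isNotBlank.
public class BlankCheckSketch {
    public static boolean isNotBlank(final String value) {
        // trim() strips leading/trailing whitespace, so a whitespace-only
        // string trims down to the empty string and is treated as blank.
        return value != null && !value.trim().isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(isNotBlank(null));  // false
        System.out.println(isNotBlank("   ")); // false: whitespace-only
        System.out.println(isNotBlank(" a ")); // true
    }
}
```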




[GitHub] nifi pull request #1236: LDAP - Configurable strategy to identify users

2016-11-17 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1236#discussion_r88571378
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/main/java/org/apache/nifi/web/contextlistener/ApplicationStartupContextListener.java
 ---
@@ -121,4 +111,16 @@ public void contextDestroyed(ServletContextEvent sce) {
 }
 }
 }
+
+private void shutdown(final FlowService flowService, final 
RequestReplicator requestReplicator) {
+// ensure the flow service is terminated
+if (flowService != null && flowService.isRunning()) {
+flowService.stop(false);
+}
+
+// ensure the request replicator is shutdown
+if (requestReplicator != null) {
+requestReplicator.shutdown();
+}
--- End diff --

Shouldn't _stop_ and _shutdown_ be guarded against exceptions (wrapped in 
try/catch)? I mean, if something goes wrong in _stop_, _shutdown_ won't 
execute. Not sure if that's ok.
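
A minimal sketch of the suggested guarding, with stand-in interfaces for 
FlowService and RequestReplicator (the real NiFi types have richer APIs); it 
shows the replicator still shutting down even when stop() throws:

```java
// Sketch: catch a failure in stop() so shutdown() always runs.
public class GuardedShutdownSketch {
    interface FlowService { boolean isRunning(); void stop(boolean force); }
    interface RequestReplicator { void shutdown(); }

    static boolean replicatorShutDown = false;

    static void shutdown(FlowService flowService, RequestReplicator requestReplicator) {
        try {
            if (flowService != null && flowService.isRunning()) {
                flowService.stop(false);
            }
        } catch (RuntimeException e) {
            // log and continue so the replicator is still shut down
            System.out.println("stop failed: " + e.getMessage());
        }
        if (requestReplicator != null) {
            requestReplicator.shutdown();
        }
    }

    public static void main(String[] args) {
        FlowService failing = new FlowService() {
            public boolean isRunning() { return true; }
            public void stop(boolean force) { throw new RuntimeException("boom"); }
        };
        shutdown(failing, () -> replicatorShutDown = true);
        // true despite the failing stop():
        System.out.println("replicator shut down: " + replicatorShutDown);
    }
}
```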




[GitHub] nifi issue #1242: NIFI-3059 Adds --ignore-source option to the ZooKeeper Mig...

2016-11-17 Thread olegz
Github user olegz commented on the issue:

https://github.com/apache/nifi/pull/1242
  
Reviewing. . .





[GitHub] nifi pull request #1202: NIFI-2854: Refactor repositories and swap files to ...

2016-11-17 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1202#discussion_r88488867
  
--- Diff: 
nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-core/src/main/java/org/apache/nifi/controller/repository/schema/ResourceClaimFieldMap.java
 ---
@@ -0,0 +1,85 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.controller.repository.schema;
+
+import org.apache.nifi.controller.repository.claim.ResourceClaim;
+import org.apache.nifi.controller.repository.claim.ResourceClaimManager;
+import org.apache.nifi.repository.schema.Record;
+import org.apache.nifi.repository.schema.RecordSchema;
+
+public class ResourceClaimFieldMap implements Record {
+private final ResourceClaim resourceClaim;
+private final RecordSchema schema;
+
+public ResourceClaimFieldMap(final ResourceClaim resourceClaim, final 
RecordSchema schema) {
+this.resourceClaim = resourceClaim;
--- End diff --

Perhaps add null checks?




[GitHub] nifi pull request #1202: NIFI-2854: Refactor repositories and swap files to ...

2016-11-17 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1202#discussion_r88486311
  
--- Diff: 
nifi-nar-bundles/nifi-provenance-repository-bundle/nifi-persistent-provenance-repository/src/main/java/org/apache/nifi/provenance/AbstractRecordWriter.java
 ---
@@ -0,0 +1,173 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.provenance;
+
+import java.io.File;
+import java.io.IOException;
+import java.io.OutputStream;
+import java.util.concurrent.locks.Lock;
+import java.util.concurrent.locks.ReentrantLock;
+
+import org.apache.nifi.provenance.serialization.RecordWriter;
+import org.apache.nifi.provenance.toc.TocWriter;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public abstract class AbstractRecordWriter implements RecordWriter {
+private static final Logger logger = 
LoggerFactory.getLogger(AbstractRecordWriter.class);
+
+private final File file;
+private final TocWriter tocWriter;
+private final Lock lock = new ReentrantLock();
+
+private volatile boolean dirty = false;
+private volatile boolean closed = false;
+
+private int recordsWritten = 0;
+
+public AbstractRecordWriter(final File file, final TocWriter writer) 
throws IOException {
+logger.trace("Creating Record Writer for {}", file);
+
+this.file = file;
+this.tocWriter = writer;
+}
+
+@Override
+public synchronized void close() throws IOException {
+closed = true;
+
+logger.trace("Closing Record Writer for {}", file == null ? null : 
file.getName());
+
+lock();
--- End diff --

Wondering what the purpose of the explicit locking is when _synchronized_ is 
already used?
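
A small stand-alone sketch of the underlying point (not the NiFi class): the 
intrinsic monitor taken by _synchronized_ and an explicit ReentrantLock are 
two independent locks, so combining them in close() only matters if some 
other, non-synchronized code path takes the ReentrantLock on its own.

```java
import java.util.concurrent.locks.ReentrantLock;

public class TwoMonitorsSketch {
    static final ReentrantLock lock = new ReentrantLock();

    public static boolean tryLockWhileSynchronized() throws InterruptedException {
        final boolean[] acquired = new boolean[1];
        synchronized (TwoMonitorsSketch.class) {
            // Another thread can still take the ReentrantLock even though
            // this thread holds the class's intrinsic monitor.
            Thread t = new Thread(() -> {
                acquired[0] = lock.tryLock();
                if (acquired[0]) {
                    lock.unlock();
                }
            });
            t.start();
            t.join();
        }
        return acquired[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(tryLockWhileSynchronized()); // true
    }
}
```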



