[GitHub] [nifi] adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor

2020-06-29 Thread GitBox


adamfisher commented on a change in pull request #3317:
URL: https://github.com/apache/nifi/pull/3317#discussion_r447339535



##
File path: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DetectDuplicateRecord.java
##
@@ -0,0 +1,620 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import com.google.common.base.Joiner;
+import com.google.common.hash.BloomFilter;
+import com.google.common.hash.Funnels;
+import org.apache.commons.codec.binary.Hex;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.codec.digest.MessageDigestAlgorithms;
+import org.apache.nifi.annotation.behavior.*;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.*;
+import org.apache.nifi.distributed.cache.client.Deserializer;
+import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;
+import org.apache.nifi.distributed.cache.client.Serializer;
+import org.apache.nifi.distributed.cache.client.exception.DeserializationException;
+import org.apache.nifi.distributed.cache.client.exception.SerializationException;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.*;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.RecordPathResult;
+import org.apache.nifi.record.path.util.RecordPathCache;
+import org.apache.nifi.record.path.validation.RecordPathPropertyNameValidator;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.*;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.*;
+import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
+import java.security.MessageDigest;
+import java.util.*;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+import static java.util.stream.Collectors.toList;
+import static org.apache.commons.codec.binary.StringUtils.getBytesUtf8;
+import static org.apache.commons.lang3.StringUtils.*;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@SystemResourceConsideration(resource = SystemResource.MEMORY,
+description = "Caches records from each incoming FlowFile and 
determines if the cached record has " +
+"already been seen. The name of user-defined properties 
determines the RecordPath values used to " +
+"determine if a record is unique. If no user-defined 
properties are present, the entire record is " +
+"used as the input to determine uniqueness. All duplicate 
records are routed to 'duplicate'. " +
+"If the record is not determined to be a duplicate, the 
Processor routes the record to 'non-duplicate'.")
+@Tags({"text", "record", "update", "change", "replace", "modify", "distinct", 
"unique",
+"filter", "hash", "dupe", "duplicate", "dedupe"})
+@CapabilityDescription("Caches records from each incoming FlowFile and 
determines if the cached record has " +
+"already been seen. The name of user-defined properties determines the 
RecordPath values used to " +
+"determine if a record is unique. If no user-defined properties are 
present, the entire record is " +
+"used as the input to determine uniqueness. All duplicate records are 
routed to 'duplicate'. " +
+"If the record is not determined to be a 

[GitHub] [nifi] adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor

2020-06-29 Thread GitBox


adamfisher commented on a change in pull request #3317:
URL: https://github.com/apache/nifi/pull/3317#discussion_r447318893



##
File path: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DetectDuplicateRecord.java
##
@@ -0,0 +1,620 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import com.google.common.base.Joiner;
+import com.google.common.hash.BloomFilter;
+import com.google.common.hash.Funnels;
+import org.apache.commons.codec.binary.Hex;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.codec.digest.MessageDigestAlgorithms;
+import org.apache.nifi.annotation.behavior.*;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.*;
+import org.apache.nifi.distributed.cache.client.Deserializer;
+import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;
+import org.apache.nifi.distributed.cache.client.Serializer;
+import org.apache.nifi.distributed.cache.client.exception.DeserializationException;
+import org.apache.nifi.distributed.cache.client.exception.SerializationException;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.*;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.RecordPathResult;
+import org.apache.nifi.record.path.util.RecordPathCache;
+import org.apache.nifi.record.path.validation.RecordPathPropertyNameValidator;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.*;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+
+import java.io.*;
+import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
+import java.security.MessageDigest;
+import java.util.*;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+import static java.util.stream.Collectors.toList;
+import static org.apache.commons.codec.binary.StringUtils.getBytesUtf8;
+import static org.apache.commons.lang3.StringUtils.*;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@SystemResourceConsideration(resource = SystemResource.MEMORY,
+description = "Caches records from each incoming FlowFile and 
determines if the cached record has " +
+"already been seen. The name of user-defined properties 
determines the RecordPath values used to " +
+"determine if a record is unique. If no user-defined 
properties are present, the entire record is " +
+"used as the input to determine uniqueness. All duplicate 
records are routed to 'duplicate'. " +
+"If the record is not determined to be a duplicate, the 
Processor routes the record to 'non-duplicate'.")
+@Tags({"text", "record", "update", "change", "replace", "modify", "distinct", 
"unique",
+"filter", "hash", "dupe", "duplicate", "dedupe"})
+@CapabilityDescription("Caches records from each incoming FlowFile and 
determines if the cached record has " +

Review comment:
   File added in commits:
   
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/resources/docs/org/apache/nifi/processors/standard/DetectDuplicateRecord/additionalDetails.html





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

[GitHub] [nifi] adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor

2020-06-29 Thread GitBox


adamfisher commented on a change in pull request #3317:
URL: https://github.com/apache/nifi/pull/3317#discussion_r447317427



##
File path: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestDetectDuplicateRecord.java
##
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.record.MockRecordParser;
+import org.apache.nifi.serialization.record.MockRecordWriter;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.util.*;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.apache.nifi.processors.standard.DetectDuplicateRecord.*;
+import static org.junit.Assert.assertEquals;
+
+public class TestDetectDuplicateRecord {
+
+static {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "info");

Review comment:
   @MikeThomsen What would the values be for the `@AfterClass`? That whole section was copied from another test and I don't see any `@BeforeClass` or `@AfterClass` annotations on it.
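   For reference, a minimal JUnit 4 sketch of what a matching tear-down could look like if one were added. This is not part of the PR: it assumes the only goal is to undo the simple-logger system properties set before the class runs, it would sit inside the test class next to the quoted setup code, and it needs `import org.junit.AfterClass;`. The property keys mirror the ones configured in the test; the method name is illustrative.

    // Hypothetical tear-down; clears the simple-logger properties so
    // later test classes in the same JVM start from the defaults again.
    @AfterClass
    public static void afterClass() {
        System.clearProperty("org.slf4j.simpleLogger.defaultLogLevel");
        System.clearProperty("org.slf4j.simpleLogger.showDateTime");
        System.clearProperty("org.slf4j.simpleLogger.log.nifi.io.nio");
        System.clearProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.DetectDuplicateRecord");
        System.clearProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestDetectDuplicateRecord");
    }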





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [nifi] adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor

2020-06-29 Thread GitBox


adamfisher commented on a change in pull request #3317:
URL: https://github.com/apache/nifi/pull/3317#discussion_r447314976



##
File path: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestDetectDuplicateRecord.java
##
@@ -0,0 +1,209 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.record.MockRecordParser;
+import org.apache.nifi.serialization.record.MockRecordWriter;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.util.*;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import static org.apache.nifi.processors.standard.DetectDuplicateRecord.*;
+import static org.junit.Assert.assertEquals;
+
+public class TestDetectDuplicateRecord {
+
+private TestRunner runner;
+private MockCacheService cache;
+private MockRecordParser reader;
+private MockRecordWriter writer;
+
+@BeforeClass
+public static void beforeClass() {
+System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "info");
+System.setProperty("org.slf4j.simpleLogger.showDateTime", "true");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.io.nio", "debug");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.DetectDuplicateRecord", "debug");
+System.setProperty("org.slf4j.simpleLogger.log.nifi.processors.standard.TestDetectDuplicateRecord", "debug");
+}
+
+@Before
+public void setup() throws InitializationException {
+runner = TestRunners.newTestRunner(DetectDuplicateRecord.class);
+
+// RECORD_READER, RECORD_WRITER
+reader = new MockRecordParser();
+writer = new MockRecordWriter("header", false);
+
+runner.addControllerService("reader", reader);
+runner.enableControllerService(reader);
+runner.addControllerService("writer", writer);
+runner.enableControllerService(writer);
+
+runner.setProperty(RECORD_READER, "reader");
+runner.setProperty(RECORD_WRITER, "writer");
+
+reader.addSchemaField("firstName", RecordFieldType.STRING);
+reader.addSchemaField("middleName", RecordFieldType.STRING);
+reader.addSchemaField("lastName", RecordFieldType.STRING);
+
+// INCLUDE_ZERO_RECORD_FLOWFILES
+runner.setProperty(INCLUDE_ZERO_RECORD_FLOWFILES, "true");
+
+// CACHE_IDENTIFIER
+runner.setProperty(CACHE_IDENTIFIER, "true");
+
+// DISTRIBUTED_CACHE_SERVICE
+cache = new MockCacheService();
+runner.addControllerService("cache", cache);
+runner.setProperty(DISTRIBUTED_CACHE_SERVICE, "cache");
+runner.enableControllerService(cache);
+
+// CACHE_ENTRY_IDENTIFIER
+final Map<String, String> props = new HashMap<>();
+props.put("hash.value", "1000");
+runner.enqueue(new byte[]{}, props);
+
+// AGE_OFF_DURATION
+runner.setProperty(AGE_OFF_DURATION, "48 hours");
+
+runner.assertValid();
+}
+
+ @Test
+ public void testDetectDuplicatesHashSet() {
+runner.setProperty(FILTER_TYPE, HASH_SET_VALUE);
+runner.setProperty("/middleName", "${field.value}");
+reader.addRecord("John", "Q", "Smith");
+reader.addRecord("John", "Q", "Smith");
+reader.addRecord("Jane", "X", "Doe");
+
+runner.enqueue("");
+runner.run();
+
+doCountTests(0, 1, 1, 1, 2, 1);
+}
+
+@Test
+public void testDetectDuplicatesBloomFilter() {
+runner.setProperty(FILTER_TYPE, BLOOM_FILTER_VALUE);
+runner.setProperty(BLOOM_FILTER_FPP, "0.10");
+runner.setProperty("/middleName", "${field.value}");
+reader.addRecord("John", "Q", "Smith");
+reader.addRecord("John", "Q", "Smith");
+reader.addRecord("Jane", "X", "Doe");
+
+runner.enqueue("");
+runner.run();
+
+doCountTests(0, 1, 1, 1, 2, 1);
+}
+
+@Test
+public 
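Tangentially, on the BloomFilter variant exercised above: a small self-contained sketch of how Guava's BloomFilter (the class the processor imports) behaves with a false-positive probability such as the 0.10 set via BLOOM_FILTER_FPP in the test. The expected-insertions figure and the key format are illustrative only and are not taken from the processor's implementation.

    import com.google.common.hash.BloomFilter;
    import com.google.common.hash.Funnels;
    import java.nio.charset.StandardCharsets;

    public class BloomFilterSketch {
        public static void main(String[] args) {
            // Sized for ~1000 distinct keys with a 10% false-positive probability (illustrative numbers).
            BloomFilter<String> seen = BloomFilter.create(
                    Funnels.stringFunnel(StandardCharsets.UTF_8), 1000, 0.10);

            // put() returns true only when the filter's bits change, i.e. the key was definitely new.
            System.out.println(seen.put("John|Q|Smith"));        // true  -> first occurrence
            System.out.println(seen.put("John|Q|Smith"));        // false -> treated as a duplicate
            System.out.println(seen.mightContain("Jane|X|Doe")); // false, unless a false positive occurs
        }
    }

Unlike a HashSet, the filter's memory footprint is fixed at creation time regardless of how many records pass through, which is the trade-off described in the processor's @SystemResourceConsideration text.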

[GitHub] [nifi] adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor

2020-06-29 Thread GitBox


adamfisher commented on a change in pull request #3317:
URL: https://github.com/apache/nifi/pull/3317#discussion_r447314009



##
File path: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/DetectDuplicateRecord.java
##
@@ -0,0 +1,646 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.standard;
+
+import com.google.common.base.Joiner;
+import com.google.common.hash.BloomFilter;
+import com.google.common.hash.Funnels;
+import org.apache.commons.codec.binary.Hex;
+import org.apache.commons.codec.digest.DigestUtils;
+import org.apache.commons.codec.digest.MessageDigestAlgorithms;
+import org.apache.nifi.annotation.behavior.*;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.components.*;
+import org.apache.nifi.distributed.cache.client.Deserializer;
+import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;
+import org.apache.nifi.distributed.cache.client.Serializer;
+import org.apache.nifi.distributed.cache.client.exception.DeserializationException;
+import org.apache.nifi.distributed.cache.client.exception.SerializationException;
+import org.apache.nifi.expression.AttributeExpression.ResultType;
+import org.apache.nifi.expression.ExpressionLanguageScope;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.*;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.record.path.RecordPath;
+import org.apache.nifi.record.path.RecordPathResult;
+import org.apache.nifi.record.path.util.RecordPathCache;
+import org.apache.nifi.record.path.validation.RecordPathPropertyNameValidator;
+import org.apache.nifi.record.path.validation.RecordPathValidator;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.*;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.serialization.record.RecordSchema;
+import org.apache.nifi.serialization.record.util.DataTypeUtils;
+
+import java.io.*;
+import java.nio.charset.Charset;
+import java.nio.charset.StandardCharsets;
+import java.security.MessageDigest;
+import java.util.*;
+import java.util.concurrent.TimeUnit;
+import java.util.stream.Collectors;
+
+import static java.util.stream.Collectors.toList;
+import static org.apache.commons.codec.binary.StringUtils.getBytesUtf8;
+import static org.apache.commons.lang3.StringUtils.*;
+
+@EventDriven
+@SupportsBatching
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@SystemResourceConsideration(resource = SystemResource.MEMORY,
+description = "The HashSet filter type will grow memory space 
proportionate to the number of unique records processed. " +
+"The BloomFilter type will use constant memory regardless of the 
number of records processed.")
+@Tags({"text", "record", "update", "change", "replace", "modify", "distinct", 
"unique",
+"filter", "hash", "dupe", "duplicate", "dedupe"})
+@CapabilityDescription("Caches records from each incoming FlowFile and 
determines if the record " +
+"has already been seen. If so, routes the record to 'duplicate'. If the 
record is " +
+"not determined to be a duplicate, it is routed to 'non-duplicate'."
+)
+@WritesAttribute(attribute = "record.count", description = "The number of 
records processed.")
+@DynamicProperty(
+name = "RecordPath",
+value = "An expression language statement used to determine how the 
RecordPath is resolved. " +
+"The following variables are availble: ${field.name}, 
${field.value}, ${field.type}",
+description = "The name of each user-defined property must be a valid 
RecordPath.")
+@SeeAlso(classNames = {
+
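To make the @DynamicProperty contract above concrete, a short sketch using the same TestRunner pattern as the unit tests quoted earlier: each dynamic property name is a RecordPath and its value is an Expression Language statement. The field names below are illustrative (the quoted tests use "/middleName"), and the snippet only shows property configuration, not a full flow run.

    import org.apache.nifi.processors.standard.DetectDuplicateRecord;
    import org.apache.nifi.util.TestRunner;
    import org.apache.nifi.util.TestRunners;

    public class DynamicPropertySketch {
        public static void main(String[] args) {
            TestRunner runner = TestRunners.newTestRunner(DetectDuplicateRecord.class);
            // Each dynamic property name must be a valid RecordPath; the EL value can reference
            // ${field.name}, ${field.value} and ${field.type} for the field the path matches.
            runner.setProperty("/firstName", "${field.value}");
            runner.setProperty("/lastName", "${field.value}");
        }
    }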

[GitHub] [nifi] adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor

2019-05-26 Thread GitBox
adamfisher commented on a change in pull request #3317: NIFI-6047 Add DetectDuplicateRecord Processor
URL: https://github.com/apache/nifi/pull/3317#discussion_r287618059
 
 

 ##
 File path: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/MockCacheService.groovy
 ##
 @@ -0,0 +1,77 @@
+/*
 
 Review comment:
   I moved the groovy implementation to the Groovy test folder.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services