[GitHub] nifi pull request: NIFI-1107 - Create new PutS3ObjectMultipart pro...

2015-11-12 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/121#discussion_r44715683
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3ObjectMultipart.java
 ---
@@ -0,0 +1,550 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.aws.s3;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.services.s3.AmazonS3;
+import com.amazonaws.services.s3.AmazonS3Client;
+import com.amazonaws.services.s3.model.AccessControlList;
+import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest;
+import com.amazonaws.services.s3.model.CompleteMultipartUploadResult;
+import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
+import com.amazonaws.services.s3.model.InitiateMultipartUploadResult;
+import com.amazonaws.services.s3.model.ObjectMetadata;
+import com.amazonaws.services.s3.model.PartETag;
+import com.amazonaws.services.s3.model.StorageClass;
+import com.amazonaws.services.s3.model.UploadPartRequest;
+import com.amazonaws.services.s3.model.UploadPartResult;
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.behavior.ReadsAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttribute;
+import org.apache.nifi.annotation.behavior.WritesAttributes;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.SeeAlso;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.processor.DataUnit;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.io.InputStreamCallback;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.stream.io.BufferedInputStream;
+
+import java.io.File;
+import java.io.FileInputStream;
+import java.io.FileOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.Serializable;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+import java.util.concurrent.TimeUnit;
+
+@SeeAlso({FetchS3Object.class, PutS3Object.class, DeleteS3Object.class})
+@InputRequirement(Requirement.INPUT_REQUIRED)
+@Tags({"Amazon", "S3", "AWS", "Archive", "Put", "Multi", "Multipart", 
"Upload"})
+@CapabilityDescription("Puts FlowFiles to an Amazon S3 Bucket using the 
MultipartUpload API method.  " +
+"This upload consists of three steps 1) initiate upload, 2) upload 
the parts, and 3) complete the upload.\n" +
+"Since the intent for this processor involves large files, the 
processor saves state locally after each step " +
+"so that an upload can be resumed without having to restart from 
the beginning of the file.\n" +
+"The AWS libraries default to using standard AWS regions but the 
'Endpoint Override URL' allows this to be " +
+"overridden.")
+@DynamicProperty(name = "The name of a User-Defined Metadata field to add 
to the S3 Object",
+value = "The value of a User-Defined Metadata field to add to the 
S3 Object",
+description = "Allows user-defined metadata to be added to the S3 
object as key/value pairs",
+supportsExpressionLanguage = true)
+@ReadsAttribute(attribute = "filename", description = "Uses the FlowFile's 
filename as the filename for 
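
(For background, the three-step flow named in that description maps onto the AWS SDK classes imported above roughly as follows. This is a minimal standalone sketch with placeholder bucket, key, and file values; it is not the processor's code, which additionally persists state between steps so an interrupted upload can be resumed.)

    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3Client;
    import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest;
    import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
    import com.amazonaws.services.s3.model.PartETag;
    import com.amazonaws.services.s3.model.UploadPartRequest;

    public class MultipartUploadSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = new AmazonS3Client();            // default credential provider chain
            String bucket = "my-bucket";                   // placeholder
            String key = "large-object.bin";               // placeholder
            File file = new File("/tmp/large-object.bin"); // placeholder
            long partSize = 5L * 1024 * 1024;              // 5 MB, the S3 minimum part size

            // 1) initiate the upload; the uploadId identifies it until it is completed or aborted
            String uploadId = s3.initiateMultipartUpload(
                    new InitiateMultipartUploadRequest(bucket, key)).getUploadId();

            // 2) upload the parts, collecting each part's ETag
            List<PartETag> partETags = new ArrayList<>();
            long offset = 0;
            for (int partNumber = 1; offset < file.length(); partNumber++) {
                long size = Math.min(partSize, file.length() - offset);
                partETags.add(s3.uploadPart(new UploadPartRequest()
                        .withBucketName(bucket).withKey(key).withUploadId(uploadId)
                        .withPartNumber(partNumber).withFile(file)
                        .withFileOffset(offset).withPartSize(size)).getPartETag());
                offset += size;
            }

            // 3) complete the upload by presenting the collected ETags
            s3.completeMultipartUpload(
                    new CompleteMultipartUploadRequest(bucket, key, uploadId, partETags));
        }
    }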

Re: ExecuteStreamCommand tests

2015-11-12 Thread Joe Witt
well that explains these goofball classes I deleted the other day

https://issues.apache.org/jira/browse/NIFI-1134

These classes were used to make those jars.  Those jars are used to
test ExecuteStreamCommand.  We've now removed the source that was floating
around randomly.  We need the build to automatically create whatever we
execute against if we're going to do this.  Those tests should be
replaced by something else.
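
(For context, those jars wrap trivial command-line programs that the ExecuteStreamCommand tests run against FlowFile content. A hypothetical example of the kind of class involved, just to illustrate the pattern; the actual classes removed under NIFI-1134 may differ:)

    import java.io.IOException;

    // A tiny command ExecuteStreamCommand could invoke: it copies stdin to stdout
    // byte for byte, so a test can verify that FlowFile content round-trips.
    public class EchoCommand {
        public static void main(String[] args) throws IOException {
            int b;
            while ((b = System.in.read()) != -1) {
                System.out.write(b);
            }
            System.out.flush();
        }
    }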


On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
 wrote:
> Tony,
>
> I did a bit of digging through the history and the jars were a part of the 
> initial code import so unless if Joe or someone else knows where they came 
> from then we may be out of luck.
>
> Joe
>
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
>
> On Wednesday, November 11, 2015 6:23 PM, Tony Kurc  wrote:
>
>
>
> All, I was code reviewing and something occurred to me. This raised my
> eyebrow:
> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
>
> If I'm reading it right, the test "runs" a jar that we've got in our source
> tree
>
> What code made those jars in src/test/resources?


[GitHub] nifi pull request: NIFI-1107 - Create new PutS3ObjectMultipart pro...

2015-11-12 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/121#discussion_r44713942
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3ObjectMultipart.java
 ---

[GitHub] nifi pull request: NIFI-1107 - Create new PutS3ObjectMultipart pro...

2015-11-12 Thread jskora
Github user jskora commented on a diff in the pull request:

https://github.com/apache/nifi/pull/121#discussion_r44714959
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3ObjectMultipart.java
 ---

[GitHub] nifi pull request: NIFI-1107 - Create new PutS3ObjectMultipart pro...

2015-11-12 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/121#discussion_r44716157
  
--- Diff: 
nifi-nar-bundles/nifi-aws-bundle/nifi-aws-processors/src/main/java/org/apache/nifi/processors/aws/s3/PutS3ObjectMultipart.java
 ---

Re: ExecuteStreamCommand tests

2015-11-12 Thread Tony Kurc
Do you plan to undo the removal?
On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:

> well that explains these goofball classes I deleted the other day
>
> https://issues.apache.org/jira/browse/NIFI-1134
>
> These classes were used to make those Jars.  Those jars are used to
> test execute command.  We've now removed the source that was floating
> randomly.  We need the built to automatically create whatever we
> execute against if we're going to do this.  Those tests should be
> replaced by something else.
>
>
> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
>  wrote:
> > Tony,
> >
> > I did a bit of digging through the history and the jars were a part of
> the initial code import so unless if Joe or someone else knows where they
> came from then we may be out of luck.
> >
> > Joe
> >
> > - - - - - -
> > Joseph Percivall
> > linkedin.com/in/Percivall
> > e: joeperciv...@yahoo.com
> >
> >
> >
> >
> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc 
> wrote:
> >
> >
> >
> > All, I was code reviewing and something occurred to me. This raised my
> > eyebrow:
> >
> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
> >
> > If I'm reading it right, the test "runs" a jar that we've got in our
> source
> > tree
> >
> > What code made those jars in src/test/resources?
>


Re: ExecuteStreamCommand tests

2015-11-12 Thread Joe Percivall
All of the current ExecuteStreamCommand tests rely on those jars.
 - - - - - - 
Joseph Percivall
linkedin.com/in/Percivall
e: joeperciv...@yahoo.com




On Thursday, November 12, 2015 5:34 PM, Joe Witt  wrote:
i think we should kill those tests which depend on the build of those
jars personally.  But if the view is to undo the removal of those
three classes i can do that.

Thanks
Joe


On Thu, Nov 12, 2015 at 5:32 PM, Tony Kurc  wrote:
> Do you plan to undo the removal?
> On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:
>
>> well that explains these goofball classes I deleted the other day
>>
>> https://issues.apache.org/jira/browse/NIFI-1134
>>
>> These classes were used to make those Jars.  Those jars are used to
>> test execute command.  We've now removed the source that was floating
>> randomly.  We need the built to automatically create whatever we
>> execute against if we're going to do this.  Those tests should be
>> replaced by something else.
>>
>>
>> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
>>  wrote:
>> > Tony,
>> >
>> > I did a bit of digging through the history and the jars were a part of
>> the initial code import so unless if Joe or someone else knows where they
>> came from then we may be out of luck.
>> >
>> > Joe
>> >
>> > - - - - - -
>> > Joseph Percivall
>> > linkedin.com/in/Percivall
>> > e: joeperciv...@yahoo.com
>> >
>> >
>> >
>> >
>> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc 
>> wrote:
>> >
>> >
>> >
>> > All, I was code reviewing and something occurred to me. This raised my
>> > eyebrow:
>> >
>> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
>> >
>> > If I'm reading it right, the test "runs" a jar that we've got in our
>> source
>> > tree
>> >
>> > What code made those jars in src/test/resources?
>>


Re: ExecuteStreamCommand tests

2015-11-12 Thread Joe Witt
ok - will undo the commit.  I get to learn a new git trick?  Or just
add them back?  I must admit I'm not sure how best to do that.

On Thu, Nov 12, 2015 at 5:39 PM, Brandon DeVries  wrote:
> I would undo the removal for now, and make a point of doing the test
> properly. I don't like the idea of removing the test and saying we'll add
> new ones eventually (those sorts of things tend to not happen...).
>
> Brandon
> On Thu, Nov 12, 2015 at 5:36 PM Tony Kurc  wrote:
>
>> Shipping built jars that tests depend on is icky. Not shipping the source
>> to those tests is ickier.
>> On Nov 12, 2015 5:34 PM, "Joe Witt"  wrote:
>>
>> > i think we should kill those tests which depend on the build of those
>> > jars personally.  But if the view is to undo the removal of those
>> > three classes i can do that.
>> >
>> > Thanks
>> > Joe
>> >
>> > On Thu, Nov 12, 2015 at 5:32 PM, Tony Kurc  wrote:
>> > > Do you plan to undo the removal?
>> > > On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:
>> > >
>> > >> well that explains these goofball classes I deleted the other day
>> > >>
>> > >> https://issues.apache.org/jira/browse/NIFI-1134
>> > >>
>> > >> These classes were used to make those Jars.  Those jars are used to
>> > >> test execute command.  We've now removed the source that was floating
>> > >> randomly.  We need the built to automatically create whatever we
>> > >> execute against if we're going to do this.  Those tests should be
>> > >> replaced by something else.
>> > >>
>> > >>
>> > >> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
>> > >>  wrote:
>> > >> > Tony,
>> > >> >
>> > >> > I did a bit of digging through the history and the jars were a part
>> of
>> > >> the initial code import so unless if Joe or someone else knows where
>> > they
>> > >> came from then we may be out of luck.
>> > >> >
>> > >> > Joe
>> > >> >
>> > >> > - - - - - -
>> > >> > Joseph Percivall
>> > >> > linkedin.com/in/Percivall
>> > >> > e: joeperciv...@yahoo.com
>> > >> >
>> > >> >
>> > >> >
>> > >> >
>> > >> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc <
>> trk...@gmail.com>
>> > >> wrote:
>> > >> >
>> > >> >
>> > >> >
>> > >> > All, I was code reviewing and something occurred to me. This raised
>> my
>> > >> > eyebrow:
>> > >> >
>> > >>
>> >
>> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
>> > >> >
>> > >> > If I'm reading it right, the test "runs" a jar that we've got in our
>> > >> source
>> > >> > tree
>> > >> >
>> > >> > What code made those jars in src/test/resources?
>> > >>
>> >
>>


Re: ExecuteStreamCommand tests

2015-11-12 Thread Joe Witt
I think we should kill those tests which depend on the build of those
jars, personally.  But if the view is to undo the removal of those
three classes, I can do that.

Thanks
Joe

On Thu, Nov 12, 2015 at 5:32 PM, Tony Kurc  wrote:
> Do you plan to undo the removal?
> On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:
>
>> well that explains these goofball classes I deleted the other day
>>
>> https://issues.apache.org/jira/browse/NIFI-1134
>>
>> These classes were used to make those Jars.  Those jars are used to
>> test execute command.  We've now removed the source that was floating
>> randomly.  We need the built to automatically create whatever we
>> execute against if we're going to do this.  Those tests should be
>> replaced by something else.
>>
>>
>> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
>>  wrote:
>> > Tony,
>> >
>> > I did a bit of digging through the history and the jars were a part of
>> the initial code import so unless if Joe or someone else knows where they
>> came from then we may be out of luck.
>> >
>> > Joe
>> >
>> > - - - - - -
>> > Joseph Percivall
>> > linkedin.com/in/Percivall
>> > e: joeperciv...@yahoo.com
>> >
>> >
>> >
>> >
>> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc 
>> wrote:
>> >
>> >
>> >
>> > All, I was code reviewing and something occurred to me. This raised my
>> > eyebrow:
>> >
>> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
>> >
>> > If I'm reading it right, the test "runs" a jar that we've got in our
>> source
>> > tree
>> >
>> > What code made those jars in src/test/resources?
>>


Re: ExecuteStreamCommand tests

2015-11-12 Thread Joe Witt
thanks mike/adam.  Doing now.  Hold onto your source codes.

On Thu, Nov 12, 2015 at 5:54 PM, Adam Taft  wrote:
> git revert is your friend.
>
> https://git-scm.com/docs/git-revert
>
> It's not "rollback" -- it's another new commit with the changes reinstated.
>
> On Thu, Nov 12, 2015 at 5:45 PM, Joe Witt  wrote:
>
>> ok - will undo the commit.  I get to learn a new git trick?  Or just
>> add them back?  I must admit I'm not sure how best to do that.
>>
>> On Thu, Nov 12, 2015 at 5:39 PM, Brandon DeVries  wrote:
>> > I would undo the removal for now, and make a point of doing the test
>> > properly. I don't like the idea of removing the test and saying we'll add
>> > new ones eventually (those sorts of things tend to not happen...).
>> >
>> > Brandon
>> > On Thu, Nov 12, 2015 at 5:36 PM Tony Kurc  wrote:
>> >
>> >> Shipping built jars that tests depend on is icky. Not shipping the
>> source
>> >> to those tests is ickier.
>> >> On Nov 12, 2015 5:34 PM, "Joe Witt"  wrote:
>> >>
>> >> > i think we should kill those tests which depend on the build of those
>> >> > jars personally.  But if the view is to undo the removal of those
>> >> > three classes i can do that.
>> >> >
>> >> > Thanks
>> >> > Joe
>> >> >
>> >> > On Thu, Nov 12, 2015 at 5:32 PM, Tony Kurc  wrote:
>> >> > > Do you plan to undo the removal?
>> >> > > On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:
>> >> > >
>> >> > >> well that explains these goofball classes I deleted the other day
>> >> > >>
>> >> > >> https://issues.apache.org/jira/browse/NIFI-1134
>> >> > >>
>> >> > >> These classes were used to make those Jars.  Those jars are used to
>> >> > >> test execute command.  We've now removed the source that was
>> floating
>> >> > >> randomly.  We need the built to automatically create whatever we
>> >> > >> execute against if we're going to do this.  Those tests should be
>> >> > >> replaced by something else.
>> >> > >>
>> >> > >>
>> >> > >> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
>> >> > >>  wrote:
>> >> > >> > Tony,
>> >> > >> >
>> >> > >> > I did a bit of digging through the history and the jars were a
>> part
>> >> of
>> >> > >> the initial code import so unless if Joe or someone else knows
>> where
>> >> > they
>> >> > >> came from then we may be out of luck.
>> >> > >> >
>> >> > >> > Joe
>> >> > >> >
>> >> > >> > - - - - - -
>> >> > >> > Joseph Percivall
>> >> > >> > linkedin.com/in/Percivall
>> >> > >> > e: joeperciv...@yahoo.com
>> >> > >> >
>> >> > >> >
>> >> > >> >
>> >> > >> >
>> >> > >> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc <
>> >> trk...@gmail.com>
>> >> > >> wrote:
>> >> > >> >
>> >> > >> >
>> >> > >> >
>> >> > >> > All, I was code reviewing and something occurred to me. This
>> raised
>> >> my
>> >> > >> > eyebrow:
>> >> > >> >
>> >> > >>
>> >> >
>> >>
>> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
>> >> > >> >
>> >> > >> > If I'm reading it right, the test "runs" a jar that we've got in
>> our
>> >> > >> source
>> >> > >> > tree
>> >> > >> >
>> >> > >> > What code made those jars in src/test/resources?
>> >> > >>
>> >> >
>> >>
>>


Re: ExecuteStreamCommand tests

2015-11-12 Thread Tony Kurc
Shipping built jars that tests depend on is icky. Not shipping the source
to those tests is ickier.
On Nov 12, 2015 5:34 PM, "Joe Witt"  wrote:

> i think we should kill those tests which depend on the build of those
> jars personally.  But if the view is to undo the removal of those
> three classes i can do that.
>
> Thanks
> Joe
>
> On Thu, Nov 12, 2015 at 5:32 PM, Tony Kurc  wrote:
> > Do you plan to undo the removal?
> > On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:
> >
> >> well that explains these goofball classes I deleted the other day
> >>
> >> https://issues.apache.org/jira/browse/NIFI-1134
> >>
> >> These classes were used to make those Jars.  Those jars are used to
> >> test execute command.  We've now removed the source that was floating
> >> randomly.  We need the built to automatically create whatever we
> >> execute against if we're going to do this.  Those tests should be
> >> replaced by something else.
> >>
> >>
> >> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
> >>  wrote:
> >> > Tony,
> >> >
> >> > I did a bit of digging through the history and the jars were a part of
> >> the initial code import so unless if Joe or someone else knows where
> they
> >> came from then we may be out of luck.
> >> >
> >> > Joe
> >> >
> >> > - - - - - -
> >> > Joseph Percivall
> >> > linkedin.com/in/Percivall
> >> > e: joeperciv...@yahoo.com
> >> >
> >> >
> >> >
> >> >
> >> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc 
> >> wrote:
> >> >
> >> >
> >> >
> >> > All, I was code reviewing and something occurred to me. This raised my
> >> > eyebrow:
> >> >
> >>
> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
> >> >
> >> > If I'm reading it right, the test "runs" a jar that we've got in our
> >> source
> >> > tree
> >> >
> >> > What code made those jars in src/test/resources?
> >>
>


Re: Master Branch Tests Failing on Windows

2015-11-12 Thread Tony Kurc
Joe, I built on Windows 7 without issue. I have a Windows 10 box and will try
tonight to recreate there.
On Nov 11, 2015 11:08 PM, "Joe Percivall" 
wrote:

> I like the thought but I just tried replacing all the "/" with
> File.separator and it still failed the same way. I also tried it also on
> testPutFileWithException but I still got the same NPE as I was getting
> before on this line:
>
> fs.setPermission(p,permission);
>
>
>
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
> e: joeperciv...@yahoo.com
>
>
>
>
>
> On Wednesday, November 11, 2015 10:59 PM, Oleg Zhurakousky <
> ozhurakou...@hortonworks.com> wrote:
> Joe
>
> I am gonna go out on the limb here, but do you think it may have something
> to do with forward slashes “target/test-classes”?
> Perhaps we may need to start using File.separator?
>
> Oleg
>
> On Nov 11, 2015, at 8:41 PM, Joe Percivall wrote:
>
> Hey Dev,
>
> Is anyone building the project using Windows (specifically 8)?
>
> I wanted to verify a patch I'm doing for ExecuteStreamCommand works on
> Windows. So I did a fresh clone of the git repo and ran "mvn clean install"
> but I get seemingly random errors. One such error is on PutHDFSTest line
> 178 (added an extra line to break up the logic):
>
> List<MockFlowFile> failedFlowFiles =
> runner.getFlowFilesForRelationship(new Relationship.Builder().name("failure").build());
> boolean isEmpty = failedFlowFiles.isEmpty();
> assertTrue(isEmpty);
>
> I get an AssertionError:
>
> testPutFile(org.apache.nifi.processors.hadoop.PutHDFSTest)  Time elapsed:
> 5.223 sec  <<< FAILURE!
> java.lang.AssertionError: null
> at org.junit.Assert.fail(Assert.java:86)
> at org.junit.Assert.assertTrue(Assert.java:41)
> at org.junit.Assert.assertTrue(Assert.java:52)
> at
> org.apache.nifi.processors.hadoop.PutHDFSTest.testPutFile(PutHDFSTest.java:179)
>
> Except that I was stepping through and saw that isEmpty was not null going
> into the assertion.
>
> I have the feeling something is funky with my local system but can anyone
> verify that a clean clone of the repo is working on Windows?
>
> Thanks,
> Joe
>
> - - - - - -
> Joseph Percivall
> linkedin.com/in/Percivall
>
> e: joeperciv...@yahoo.com
>
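
As an aside on the forward-slash question raised in this thread: replacing "/" with File.separator did not fix the failure above, but for reference, the platform-independent spellings of a path like "target/test-classes" look roughly like this (a minimal sketch, not taken from the test code):

    import java.io.File;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class PortablePaths {
        public static void main(String[] args) {
            // A hard-coded "/" is usually tolerated by java.io.File even on Windows...
            File hardCoded = new File("target/test-classes");

            // ...but letting the platform supply the separator sidesteps the question entirely.
            File viaSeparator = new File("target" + File.separator + "test-classes");
            Path viaNio = Paths.get("target", "test-classes");

            System.out.println(hardCoded + " | " + viaSeparator + " | " + viaNio);
        }
    }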


[GitHub] nifi pull request: NIFI-1103: Add support for long polling in GetS...

2015-11-12 Thread adamonduty
GitHub user adamonduty opened a pull request:

https://github.com/apache/nifi/pull/122

NIFI-1103: Add support for long polling in GetSQS processor

This adds long polling support to the GetSQS processor.

By default, it sets the Receive Message Wait Time to zero, which is the 
same as the current behavior. Setting the Receive Message Wait Time to 1 second 
or greater causes the request to long poll for that period of time before 
returning.
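
For reference, this presumably maps to the WaitTimeSeconds parameter on the SQS receive call; a minimal sketch outside NiFi, with a placeholder queue URL (SQS allows waits of up to 20 seconds):

    import com.amazonaws.services.sqs.AmazonSQSClient;
    import com.amazonaws.services.sqs.model.Message;
    import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

    public class LongPollSketch {
        public static void main(String[] args) {
            AmazonSQSClient sqs = new AmazonSQSClient();   // default credential provider chain
            String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"; // placeholder

            // 0 is a short poll (the current GetSQS behavior); 1..20 makes the call
            // block for up to that many seconds waiting for a message to arrive.
            ReceiveMessageRequest request = new ReceiveMessageRequest(queueUrl)
                    .withWaitTimeSeconds(10)
                    .withMaxNumberOfMessages(10);

            for (Message message : sqs.receiveMessage(request).getMessages()) {
                System.out.println(message.getBody());
            }
        }
    }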

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/adamonduty/nifi NIFI-1103

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/122.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #122


commit 428b20fc2553ac6ae56867630fa10bd3ac355582
Author: Adam Lamar 
Date:   2015-11-13T05:03:36Z

NIFI-1103: Add support for long polling in GetSQS processor






Re: Release wrangling: 1 week until our hopeful 0.4.0 release

2015-11-12 Thread Matt Gilman
https://issues.apache.org/jira/browse/NIFI-655 is getting there. Still need
to iron out a few corner cases, do some code clean up, wrap up a few unit
tests, and find a way to test Active Directory integration (this is an
ongoing effort and will hopefully have something soon). I will likely be
working this ticket into next week and will then need some help with
reviewing/testing from other folks who have time. Thanks.

Matt

On Thu, Nov 12, 2015 at 9:13 AM, Tony Kurc  wrote:

> https://issues.apache.org/jira/browse/NIFI-61 - awaiting an answer before
> patch can be completed
> https://issues.apache.org/jira/browse/NIFI-655 - Based on feature branch
> activity, is close?
> https://issues.apache.org/jira/browse/NIFI-696 - awaiting a patch marking
> method as deprecated (assigned to me, but if someone else wants to take it
> and I review, thats cool too)
> https://issues.apache.org/jira/browse/NIFI-812 -  a bit confused about
> this
> one. patch in NIFI-1086 will close this?
> https://issues.apache.org/jira/browse/NIFI-973 - awaiting review?
> https://issues.apache.org/jira/browse/NIFI-980 - (see 812 confusion)
> presumably closed when NIFI-1086 is closed
> https://issues.apache.org/jira/browse/NIFI-1009 (same!)
> https://issues.apache.org/jira/browse/NIFI-1054 I'm at the ready to submit
> a patch at the 11th hour. .gitattributes may be a candidate for 0.5.0
> https://issues.apache.org/jira/browse/NIFI-1073 - some bug fixes awaiting
> review. if we don't have review bandwidth can be pushed to next release
> https://issues.apache.org/jira/browse/NIFI-1081 - reviewed, probably needs
> revision. Presuming the revisions can be done
> https://issues.apache.org/jira/browse/NIFI-1082 - Unclear as to status. Is
> this awaiting review?
> https://issues.apache.org/jira/browse/NIFI-1086 - Seems like this is the
> lynchpin for several tickets, probably should be a priority
> https://issues.apache.org/jira/browse/NIFI-1097 - Awaiting review?
> https://issues.apache.org/jira/browse/NIFI-1108 - Awaiting comment from
> Mark Payne.
> https://issues.apache.org/jira/browse/NIFI-1109 - awaiting a merge to
> master?
> https://issues.apache.org/jira/browse/NIFI-1127 - awaiting review?
> https://issues.apache.org/jira/browse/NIFI-1132 - awaiting review
> https://issues.apache.org/jira/browse/NIFI-1133 - awaiting review
> https://issues.apache.org/jira/browse/NIFI-1153 - awaiting review
> https://issues.apache.org/jira/browse/NIFI-1155 - blocker bug awaiting
> patch
>
> Seems like we're in good shape. I will have some review bandwidth this
> evening, so if you start reviewing, please note that in the jiras.
>
> Can we get answers for 1108 and 61 and contemplate a slip to 0.5.0?
>
> Tony
>


Re: Merging csv files based on criterias

2015-11-12 Thread Joe Witt
Hello

Certainly ways to slice this but it would be helpful to understand
your use case a bit more in the context of an automated flow of data.
Can you describe how this applies in that context?

If you had two streams of data feeding in and could pair data from
one stream with data from another stream and run them through a sort
of combiner function, then this could be fairly straightforward, but it
does require building a processor that doesn't exist as of now (as far
as I know).

But let's understand the context of your use case a bit more to see if
helping with a NiFi answer is the right thing or not.

Thanks
Joe

On Thu, Nov 12, 2015 at 10:02 AM, Yaismel Miranda Pons
 wrote:
> Hi all,
>
> I am testing nifi and I am wondering if it is possible to merge two CSV
> files based on specific criterias.
> For example, given these csv files:
>
> business.csv
> businessid,businessname,categoryid
> 1,McDonalds,1
> 2,Burger King,1
> 3,Walmart,2
> 4,Publix,2
>
> categories.csv
> categoryid,categoryname
> 1,Fast food chain
> 2,Super market
>
> I would like to know if there is an effective way in nifi to combine both
> csv files using the categoryid as criteria in this case:
>
> generated.csv
> businessid,businessname,categoryid,categoryname
> 1,McDonalds,1,Fast food chain
> 2,Burger King,1,Fast food chain
> 3,Walmart,2,Super market
> 4,Publix,2,Super market
>
> Thank you for your time.
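
(For reference, the kind of join being asked about can be sketched in a few lines of plain Java against the two example files above; this is only an illustration of the combine step, not an existing NiFi processor:)

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class CsvJoinSketch {
        public static void main(String[] args) throws IOException {
            // Load categories.csv into a lookup map: categoryid -> categoryname
            Map<String, String> categories = new HashMap<>();
            List<String> categoryLines = Files.readAllLines(Paths.get("categories.csv"));
            for (String line : categoryLines.subList(1, categoryLines.size())) {
                String[] cols = line.split(",", 2);
                categories.put(cols[0], cols[1]);
            }

            // Stream business.csv and append the matching categoryname to each row
            List<String> businessLines = Files.readAllLines(Paths.get("business.csv"));
            System.out.println(businessLines.get(0) + ",categoryname");
            for (String line : businessLines.subList(1, businessLines.size())) {
                String categoryId = line.substring(line.lastIndexOf(',') + 1);
                System.out.println(line + "," + categories.getOrDefault(categoryId, ""));
            }
        }
    }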


Merging csv files based on criterias

2015-11-12 Thread Yaismel Miranda Pons
Hi all,

I am testing NiFi and I am wondering if it is possible to merge two CSV
files based on specific criteria.
For example, given these CSV files:

business.csv
businessid,businessname,categoryid
1,McDonalds,1
2,Burger King,1
3,Walmart,2
4,Publix,2

categories.csv
categoryid,categoryname
1,Fast food chain
2,Super market

I would like to know if there is an effective way in NiFi to combine both
CSV files using the categoryid as the criterion in this case:

generated.csv
businessid,businessname,categoryid,categoryname
1,McDonalds,1,Fast food chain
2,Burger King,1,Fast food chain
3,Walmart,2,Super market
4,Publix,2,Super market

Thank you for your time.


Re: ExecuteStreamCommand tests

2015-11-12 Thread Adam Taft
git revert is your friend.

https://git-scm.com/docs/git-revert

It's not "rollback" -- it's another new commit with the changes reinstated.

On Thu, Nov 12, 2015 at 5:45 PM, Joe Witt  wrote:

> ok - will undo the commit.  I get to learn a new git trick?  Or just
> add them back?  I must admit I'm not sure how best to do that.
>
> On Thu, Nov 12, 2015 at 5:39 PM, Brandon DeVries  wrote:
> > I would undo the removal for now, and make a point of doing the test
> > properly. I don't like the idea of removing the test and saying we'll add
> > new ones eventually (those sorts of things tend to not happen...).
> >
> > Brandon
> > On Thu, Nov 12, 2015 at 5:36 PM Tony Kurc  wrote:
> >
> >> Shipping built jars that tests depend on is icky. Not shipping the
> source
> >> to those tests is ickier.
> >> On Nov 12, 2015 5:34 PM, "Joe Witt"  wrote:
> >>
> >> > i think we should kill those tests which depend on the build of those
> >> > jars personally.  But if the view is to undo the removal of those
> >> > three classes i can do that.
> >> >
> >> > Thanks
> >> > Joe
> >> >
> >> > On Thu, Nov 12, 2015 at 5:32 PM, Tony Kurc  wrote:
> >> > > Do you plan to undo the removal?
> >> > > On Nov 12, 2015 4:46 PM, "Joe Witt"  wrote:
> >> > >
> >> > >> well that explains these goofball classes I deleted the other day
> >> > >>
> >> > >> https://issues.apache.org/jira/browse/NIFI-1134
> >> > >>
> >> > >> These classes were used to make those Jars.  Those jars are used to
> >> > >> test execute command.  We've now removed the source that was
> floating
> >> > >> randomly.  We need the built to automatically create whatever we
> >> > >> execute against if we're going to do this.  Those tests should be
> >> > >> replaced by something else.
> >> > >>
> >> > >>
> >> > >> On Thu, Nov 12, 2015 at 3:02 PM, Joe Percivall
> >> > >>  wrote:
> >> > >> > Tony,
> >> > >> >
> >> > >> > I did a bit of digging through the history and the jars were a
> part
> >> of
> >> > >> the initial code import so unless if Joe or someone else knows
> where
> >> > they
> >> > >> came from then we may be out of luck.
> >> > >> >
> >> > >> > Joe
> >> > >> >
> >> > >> > - - - - - -
> >> > >> > Joseph Percivall
> >> > >> > linkedin.com/in/Percivall
> >> > >> > e: joeperciv...@yahoo.com
> >> > >> >
> >> > >> >
> >> > >> >
> >> > >> >
> >> > >> > On Wednesday, November 11, 2015 6:23 PM, Tony Kurc <
> >> trk...@gmail.com>
> >> > >> wrote:
> >> > >> >
> >> > >> >
> >> > >> >
> >> > >> > All, I was code reviewing and something occurred to me. This
> raised
> >> my
> >> > >> > eyebrow:
> >> > >> >
> >> > >>
> >> >
> >>
> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/test/java/org/apache/nifi/processors/standard/TestExecuteStreamCommand.java#L63
> >> > >> >
> >> > >> > If I'm reading it right, the test "runs" a jar that we've got in
> our
> >> > >> source
> >> > >> > tree
> >> > >> >
> >> > >> > What code made those jars in src/test/resources?
> >> > >>
> >> >
> >>
>