[jira] [Commented] (NIFI-4326) ExtractEmailHeaders.java unhandled Exceptions

2017-08-28 Thread Benjamin Wood (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144345#comment-16144345
 ] 

Benjamin Wood commented on NIFI-4326:
-

Using grepcode as a reference, it should be possible to substitute "getRecipients" 
with "getHeaderAsInternetAddresses" and pass it a "non-strict" address 
parameter, because getRecipients just calls that function under the hood.

Will do some rewrites of the processor and see whether the exception is avoided 
by the "non-strict" parsing of mail addresses.

http://grepcode.com/file/repo1.maven.org/maven2/org.apache.geronimo.specs/geronimo-javamail_1.4_spec/1.1/javax/mail/internet/MimeMessage.java#MimeMessage.getHeaderAsInternetAddresses%28java.lang.String%2Cboolean%29
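
For illustration, a minimal sketch of the same non-strict idea using the public 
JavaMail API (InternetAddress.parseHeader accepts the same strict flag that the 
Geronimo-internal getHeaderAsInternetAddresses uses); the helper class and method 
names below are hypothetical, not the actual processor code:

{code:java}
import javax.mail.MessagingException;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

// Hypothetical helper illustrating the non-strict approach described above:
// read the raw header and parse it with strict=false so that empty addresses
// (<> or " ") do not trigger an "Empty Address" AddressException, and a
// missing header yields an empty array instead of null.
public class LenientRecipients {
    static InternetAddress[] parse(MimeMessage message, String headerName) throws MessagingException {
        final String raw = message.getHeader(headerName, ",");   // null if the header is absent
        if (raw == null || raw.trim().isEmpty()) {
            return new InternetAddress[0];
        }
        return InternetAddress.parseHeader(raw, false);           // strict = false
    }
}
{code}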

> ExtractEmailHeaders.java unhandled Exceptions
> -
>
> Key: NIFI-4326
> URL: https://issues.apache.org/jira/browse/NIFI-4326
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.3.0
> Environment: jdk 1.8.0_121-b13
>Reporter: Benjamin Wood
>Priority: Minor
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> The ExtractEmailHeaders processor throws a NullPointerException if there are 
> no TO, CC, or BCC recipients.
> If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
> rather than a zero-length array.
> If an address is empty (<> or " "), getRecipients() will throw an "Empty 
> Address" AddressException.
> It's possible this is only an issue with Oracle Java.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4326) ExtractEmailHeaders.java unhandled Exceptions

2017-08-28 Thread Benjamin Wood (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144279#comment-16144279
 ] 

Benjamin Wood commented on NIFI-4326:
-

The patch needs work because of the newly discovered issue with getRecipients().

Will continue to iterate on patch-test until a workable solution is found.

The contract states that it will collect data "if available", meaning it should 
accept messages with no addressee or lists containing empty addresses.
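
For illustration, a minimal sketch of that defensive behavior around 
getAllRecipients() (not the actual processor code; the class and method names 
are hypothetical):

{code:java}
import javax.mail.Address;
import javax.mail.MessagingException;
import javax.mail.internet.AddressException;
import javax.mail.internet.MimeMessage;

// Hypothetical defensive wrapper around getAllRecipients(): a null return
// (no TO/CC/BCC recipients) and an "Empty Address" AddressException (<> or " ")
// are both treated as "no recipients available".
public class RecipientSafety {
    static Address[] allRecipientsOrEmpty(MimeMessage message) throws MessagingException {
        try {
            final Address[] recipients = message.getAllRecipients();
            return recipients != null ? recipients : new Address[0];
        } catch (AddressException e) {
            return new Address[0];
        }
    }
}
{code}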

> ExtractEmailHeaders.java unhandled Exceptions
> -
>
> Key: NIFI-4326
> URL: https://issues.apache.org/jira/browse/NIFI-4326
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.3.0
> Environment: jdk 1.8.0_121-b13
>Reporter: Benjamin Wood
>Priority: Minor
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> The ExtractEmailHeaders processor throws a NullPointerException if there are 
> no TO, CC, or BCC recipients.
> If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
> rather than a zero-length array.
> If an address is empty (<> or " "), getRecipients() will throw an "Empty 
> Address" AddressException.
> It's possible this is only an issue with Oracle Java.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-4326) ExtractEmailHeaders.java unhandled Exceptions

2017-08-28 Thread Benjamin Wood (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Benjamin Wood updated NIFI-4326:

Environment: jdk 1.8.0_121-b13
Description: 
The ExtractEmailHeaders processor throws a NullPointerException if there are no 
TO, CC, or BCC recipients.

If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
rather than a zero-length array.

If an address is empty (<> or " "), getRecipients() will throw an "Empty 
Address" AddressException.

It's possible this is only an issue with Oracle Java.

  was:
The ExtractEmailHeaders processor throws a NullPointerException if there are no 
TO, CC, or BCC recipients.

If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
rather than a zero-length array.

I've already written a patch and submitted it to GitHub as pull request #2111.

Summary: ExtractEmailHeaders.java unhandled Exceptions  (was: 
ExtractEmailHeaders.java unhandled NullPointerException)

> ExtractEmailHeaders.java unhandled Exceptions
> -
>
> Key: NIFI-4326
> URL: https://issues.apache.org/jira/browse/NIFI-4326
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.3.0
> Environment: jdk 1.8.0_121-b13
>Reporter: Benjamin Wood
>Priority: Minor
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> The ExtractEmailHeaders processor throws a NullPointerException if there are 
> no TO, CC, or BCC recipients.
> If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
> rather than a zero-length array.
> If an address is empty (<> or " "), getRecipients() will throw an "Empty 
> Address" AddressException.
> It's possible this is only an issue with Oracle Java.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144216#comment-16144216
 ] 

ASF GitHub Bot commented on NIFI-4327:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2115


> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
> Fix For: 1.4.0
>
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread Matt Gilman (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt Gilman updated NIFI-4327:
--
   Resolution: Fixed
Fix Version/s: 1.4.0
   Status: Resolved  (was: Patch Available)

> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
> Fix For: 1.4.0
>
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144215#comment-16144215
 ] 

ASF GitHub Bot commented on NIFI-4327:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2115
  
Thanks @scottyaslan! This has been merged to master.


> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
> Fix For: 1.4.0
>
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi pull request #2115: [NIFI-4327] Parameterize node and npm in poms

2017-08-28 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2115


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #2115: [NIFI-4327] Parameterize node and npm in poms

2017-08-28 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2115
  
Thanks @scottyaslan! This has been merged to master.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144214#comment-16144214
 ] 

ASF subversion and git services commented on NIFI-4327:
---

Commit e2b8be53cbd0485f5aeec713100ddf1f7b6ed3f4 in nifi's branch 
refs/heads/master from [~scottyaslan]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=e2b8be5 ]

[NIFI-4327] Parameterize node and npm in poms. This closes #2115


> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4328) Invalid swagger.json generated from ControllerServiceReferencingComponentDTO#referencingCompoents

2017-08-28 Thread Michael Werle (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144203#comment-16144203
 ] 

Michael Werle commented on NIFI-4328:
-

How odd -- I just generated it locally with commit 
{{e68ff153e81ddb82d1136d44a96bdb7a70da86d1}} and it was correct, though earlier 
versions had exactly that error.

> Invalid swagger.json generated from 
> ControllerServiceReferencingComponentDTO#referencingCompoents
> -
>
> Key: NIFI-4328
> URL: https://issues.apache.org/jira/browse/NIFI-4328
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Michael Werle
>
> The referencingComponents field in 
> {{nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-client-dto/src/main/java/org/apache/nifi/web/api/dto/ControllerServiceReferencingComponentDTO.java}}
> generates the following lines in swagger.json:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "$ref" : "#/definitions/Set"
> }
>   }
> }
> {code}
> Which causes this invalid object definition:
> {code:javascript}
> "Set" : {
>   "properties" : {
> "empty" : {
>   "type" : "boolean",
>   "default" : false
> }
>   }
> }
> {code}
> It is not clear how to fix the annotation, but the generated swagger.json 
> should be:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "type" : "array",
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "uniqueItems" : true,
>   "items" : {
> "$ref" : 
> "#/definitions/ControllerServiceReferencingComponentEntity"
>   }
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144205#comment-16144205
 ] 

ASF GitHub Bot commented on NIFI-4327:
--

Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2115
  
Will review...


> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #2115: [NIFI-4327] Parameterize node and npm in poms

2017-08-28 Thread mcgilman
Github user mcgilman commented on the issue:

https://github.com/apache/nifi/pull/2115
  
Will review...


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Comment Edited] (NIFI-4328) Invalid swagger.json generated from ControllerServiceReferencingComponentDTO#referencingCompoents

2017-08-28 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144188#comment-16144188
 ] 

Matt Gilman edited comment on NIFI-4328 at 8/28/17 6:59 PM:


I totally agree with the assessment here. I'm wondering if the particular 
version of swagger we're currently using does not handle recursive models well. 
Will need to investigate updating the swagger dependency and ensure it's still 
compatible with the swagger-maven-plugin. When this was initially added, there 
were some compatibility issues, as Swagger had just released the 2.0 
specification and a lot of the tooling was still catching up.


was (Author: mcgilman):
I totally agree with the asset here. I'm wondering if the particular version of 
swagger we're currently using does not handle recursive models well. Will need 
to investigate updating the swagger dependency and ensure it's still compatible 
with the swagger-maven-plugin. When this was initially added, there were some 
compatibility issues as Swagger just releasing the 2.0 specification and a lot 
of the tooling was still catching up.

> Invalid swagger.json generated from 
> ControllerServiceReferencingComponentDTO#referencingCompoents
> -
>
> Key: NIFI-4328
> URL: https://issues.apache.org/jira/browse/NIFI-4328
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Michael Werle
>
> The referencingComponents field in 
> {{nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-client-dto/src/main/java/org/apache/nifi/web/api/dto/ControllerServiceReferencingComponentDTO.java}}
> generates the following lines in swagger.json:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "$ref" : "#/definitions/Set"
> }
>   }
> }
> {code}
> Which causes this invalid object definition:
> {code:javascript}
> "Set" : {
>   "properties" : {
> "empty" : {
>   "type" : "boolean",
>   "default" : false
> }
>   }
> }
> {code}
> It is not clear how to fix the annotation, but the generated swagger.json 
> should be:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "type" : "array",
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "uniqueItems" : true,
>   "items" : {
> "$ref" : 
> "#/definitions/ControllerServiceReferencingComponentEntity"
>   }
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4328) Invalid swagger.json generated from ControllerServiceReferencingComponentDTO#referencingCompoents

2017-08-28 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144202#comment-16144202
 ] 

Matt Gilman commented on NIFI-4328:
---

I'm still seeing issues with that field.

{code}
"processGroups" : {
  "description" : "The process groups in this flow snippet.",
  "$ref" : "#/definitions/Set"
},
{code}

> Invalid swagger.json generated from 
> ControllerServiceReferencingComponentDTO#referencingCompoents
> -
>
> Key: NIFI-4328
> URL: https://issues.apache.org/jira/browse/NIFI-4328
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Michael Werle
>
> The referencingComponents field in 
> {{nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-client-dto/src/main/java/org/apache/nifi/web/api/dto/ControllerServiceReferencingComponentDTO.java}}
> generates the following lines in swagger.json:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "$ref" : "#/definitions/Set"
> }
>   }
> }
> {code}
> Which causes this invalid object definition:
> {code:javascript}
> "Set" : {
>   "properties" : {
> "empty" : {
>   "type" : "boolean",
>   "default" : false
> }
>   }
> }
> {code}
> It is not clear how to fix the annotation, but the generated swagger.json 
> should be:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "type" : "array",
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "uniqueItems" : true,
>   "items" : {
> "$ref" : 
> "#/definitions/ControllerServiceReferencingComponentEntity"
>   }
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4328) Invalid swagger.json generated from ControllerServiceReferencingComponentDTO#referencingCompoents

2017-08-28 Thread Michael Werle (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144196#comment-16144196
 ] 

Michael Werle commented on NIFI-4328:
-

Strangely, {{FlowSnippetDTO#processGroups}} had the same issue until recently, 
and whatever fixed it does not seem to have been in FlowSnippetDTO.java or 
ProcessGroupDTO.java.  It is not at all clear why "Set" is sometimes generated 
and why, at other times, a proper array with "uniqueItems" set is produced.

> Invalid swagger.json generated from 
> ControllerServiceReferencingComponentDTO#referencingCompoents
> -
>
> Key: NIFI-4328
> URL: https://issues.apache.org/jira/browse/NIFI-4328
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Michael Werle
>
> The referencingComponents field in 
> {{nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-client-dto/src/main/java/org/apache/nifi/web/api/dto/ControllerServiceReferencingComponentDTO.java}}
> generates the following lines in swagger.json:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "$ref" : "#/definitions/Set"
> }
>   }
> }
> {code}
> Which causes this invalid object definition:
> {code:javascript}
> "Set" : {
>   "properties" : {
> "empty" : {
>   "type" : "boolean",
>   "default" : false
> }
>   }
> }
> {code}
> It is not clear how to fix the annotation, but the generated swagger.json 
> should be:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "type" : "array",
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "uniqueItems" : true,
>   "items" : {
> "$ref" : 
> "#/definitions/ControllerServiceReferencingComponentEntity"
>   }
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4328) Invalid swagger.json generated from ControllerServiceReferencingComponentDTO#referencingCompoents

2017-08-28 Thread Matt Gilman (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144188#comment-16144188
 ] 

Matt Gilman commented on NIFI-4328:
---

I totally agree with the assessment here. I'm wondering if the particular 
version of swagger we're currently using does not handle recursive models well. 
Will need to investigate updating the swagger dependency and ensure it's still 
compatible with the swagger-maven-plugin. When this was initially added, there 
were some compatibility issues, as Swagger had just released the 2.0 
specification and a lot of the tooling was still catching up.

> Invalid swagger.json generated from 
> ControllerServiceReferencingComponentDTO#referencingCompoents
> -
>
> Key: NIFI-4328
> URL: https://issues.apache.org/jira/browse/NIFI-4328
> Project: Apache NiFi
>  Issue Type: Bug
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Michael Werle
>
> The referencingComponents field in 
> {{nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-client-dto/src/main/java/org/apache/nifi/web/api/dto/ControllerServiceReferencingComponentDTO.java}}
> generates the following lines in swagger.json:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "$ref" : "#/definitions/Set"
> }
>   }
> }
> {code}
> Which causes this invalid object definition:
> {code:javascript}
> "Set" : {
>   "properties" : {
> "empty" : {
>   "type" : "boolean",
>   "default" : false
> }
>   }
> }
> {code}
> It is not clear how to fix the annotation, but the generated swagger.json 
> should be:
> {code:javascript}
> "ControllerServiceReferencingComponentDTO" : {
>   "properties" : {
> //... (omitted for brevity)
> "referencingComponents" : {
>   "type" : "array",
>   "description" : "If the referencing component represents a 
> controller service, these are the components that reference it.",
>   "uniqueItems" : true,
>   "items" : {
> "$ref" : 
> "#/definitions/ControllerServiceReferencingComponentEntity"
>   }
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4326) ExtractEmailHeaders.java unhandled NullPointerException

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144186#comment-16144186
 ] 

ASF GitHub Bot commented on NIFI-4326:
--

Github user btwood commented on the issue:

https://github.com/apache/nifi/pull/2111
  
getAllRecipients() will also throw an Empty Address exception. This causes 
blank addresses (<> or "") to also not be parsed correctly.


> ExtractEmailHeaders.java unhandled NullPointerException
> ---
>
> Key: NIFI-4326
> URL: https://issues.apache.org/jira/browse/NIFI-4326
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.3.0
>Reporter: Benjamin Wood
>Priority: Minor
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> The ExtractEmailHeaders processor throws a NullPointerException if there are 
> no TO, CC, or BCC recipients.
> If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
> rather than a zero-length array.
> I've already written a patch and submitted it to GitHub as pull request #2111.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #2111: NIFI-4326 - ExtractEmailHeaders.java unhandled NullPointer...

2017-08-28 Thread btwood
Github user btwood commented on the issue:

https://github.com/apache/nifi/pull/2111
  
getAllRecipients() will also throw an Empty Address exception. This causes 
blank addresses (<> or "") to also not be parsed correctly.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Created] (NIFI-4328) Invalid swagger.json generated from ControllerServiceReferencingComponentDTO#referencingCompoents

2017-08-28 Thread Michael Werle (JIRA)
Michael Werle created NIFI-4328:
---

 Summary: Invalid swagger.json generated from 
ControllerServiceReferencingComponentDTO#referencingCompoents
 Key: NIFI-4328
 URL: https://issues.apache.org/jira/browse/NIFI-4328
 Project: Apache NiFi
  Issue Type: Bug
Affects Versions: 1.3.0, 1.4.0
Reporter: Michael Werle


The referencingComponents field in 
{{nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-client-dto/src/main/java/org/apache/nifi/web/api/dto/ControllerServiceReferencingComponentDTO.java}}
generates the following lines in swagger.json:

{code:javascript}
"ControllerServiceReferencingComponentDTO" : {
  "properties" : {
//... (omitted for brevity)
"referencingComponents" : {
  "description" : "If the referencing component represents a controller 
service, these are the components that reference it.",
  "$ref" : "#/definitions/Set"
}
  }
}
{code}

Which causes this invalid object definition:

{code:javascript}
"Set" : {
  "properties" : {
"empty" : {
  "type" : "boolean",
  "default" : false
}
  }
}
{code}

It is not clear how to fix the annotation, but the generated swagger.json 
should be:
{code:javascript}
"ControllerServiceReferencingComponentDTO" : {
  "properties" : {
//... (omitted for brevity)
"referencingComponents" : {
  "type" : "array",
  "description" : "If the referencing component represents a controller 
service, these are the components that reference it.",
  "uniqueItems" : true,
  "items" : {
"$ref" : "#/definitions/ControllerServiceReferencingComponentEntity"
  }
}
{code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16144095#comment-16144095
 ] 

ASF GitHub Bot commented on NIFI-4327:
--

GitHub user scottyaslan opened a pull request:

https://github.com/apache/nifi/pull/2115

[NIFI-4327] Parameterize node and npm in poms

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/scottyaslan/nifi NIFI-4327

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2115.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2115


commit b66a270cccd531923bb727b22428d6f79979a8cc
Author: Scott Aslan 
Date:   2017-08-28T17:23:43Z

[NIFI-4327] Parameterize node and npm in poms




> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread Scott Aslan (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4327?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Scott Aslan updated NIFI-4327:
--
Status: Patch Available  (was: In Progress)

> Parameterize node and npm versions
> --
>
> Key: NIFI-4327
> URL: https://issues.apache.org/jira/browse/NIFI-4327
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Core UI
>Reporter: Scott Aslan
>Assignee: Scott Aslan
>Priority: Minor
>
> Some versions of NodeJS are not compatible with some operating systems, and 
> some versions of npm do not work with some versions of NodeJS. As a user, I 
> want to be able to set the node and npm version numbers as properties in the 
> Maven build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi pull request #2115: [NIFI-4327] Parameterize node and npm in poms

2017-08-28 Thread scottyaslan
GitHub user scottyaslan opened a pull request:

https://github.com/apache/nifi/pull/2115

[NIFI-4327] Parameterize node and npm in poms

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced 
 in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number 
you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target 
branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn 
-Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
- [ ] If applicable, have you updated the LICENSE file, including the main 
LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main 
NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to 
.name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in 
which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/scottyaslan/nifi NIFI-4327

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2115.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2115


commit b66a270cccd531923bb727b22428d6f79979a8cc
Author: Scott Aslan 
Date:   2017-08-28T17:23:43Z

[NIFI-4327] Parameterize node and npm in poms




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Created] (NIFI-4327) Parameterize node and npm versions

2017-08-28 Thread Scott Aslan (JIRA)
Scott Aslan created NIFI-4327:
-

 Summary: Parameterize node and npm versions
 Key: NIFI-4327
 URL: https://issues.apache.org/jira/browse/NIFI-4327
 Project: Apache NiFi
  Issue Type: Improvement
  Components: Core UI
Reporter: Scott Aslan
Assignee: Scott Aslan
Priority: Minor


Some versions of NodeJS are not compatible with some operating systems, and some 
versions of npm do not work with some versions of NodeJS. As a user, I want to 
be able to set the node and npm version numbers as properties in the Maven 
build command.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-4069) ListXXX processors can miss files those created while the processor is listing and filesystem does not provide timestamp milliseconds precision

2017-08-28 Thread Bryan Bende (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4069?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-4069:
--
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> ListXXX processors can miss files those created while the processor is 
> listing and filesystem does not provide timestamp milliseconds precision
> ---
>
> Key: NIFI-4069
> URL: https://issues.apache.org/jira/browse/NIFI-4069
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.0.0
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
> Fix For: 1.4.0
>
> Attachments: ListFilesWithoutMilliseconds.png
>
>
> Some filesystems, such as Mac OS X HFS (Hierarchical File System) or EXT3, are 
> known to support timestamps only with seconds precision, and some FTP servers 
> are reported to provide timestamp precision only in minutes.
> This can cause files to NOT be listed, because the ListXXX processors' logic 
> expects timestamps in milliseconds.
> Specifically, if several files are generated within one second, not all of 
> them will be listed.
> Steps to reproduce:
> 1. start the ListFile processor
> 2. generate 1 zero-size files with the following command:
> {code}
> for i in {1..1}; do touch ./test_$i; done
> {code}
> 3. see the processor stats: out 3952 (0 bytes)
> The current AbstractListProcessor logic adopts LISTING_LAG_NANOS (100 ms) and 
> postpones files that have the latest timestamp within a listing iteration to 
> the next iteration; however, on filesystems without milliseconds precision, 
> this logic does not work as expected.
> This issue was originally reported on the nifi-dev ML: 
> http://apache-nifi-developer-list.39713.n7.nabble.com/processors-ListFile-ListSFTP-do-not-store-milliseconds-in-timestamp-td16037.html



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-4069) ListXXX processors can miss files those created while the processor is listing and filesystem does not provide timestamp milliseconds precision

2017-08-28 Thread Bryan Bende (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4069?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-4069:
--
Fix Version/s: 1.4.0

> ListXXX processors can miss files those created while the processor is 
> listing and filesystem does not provide timestamp milliseconds precision
> ---
>
> Key: NIFI-4069
> URL: https://issues.apache.org/jira/browse/NIFI-4069
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.0.0
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
> Fix For: 1.4.0
>
> Attachments: ListFilesWithoutMilliseconds.png
>
>
> Some filesystems, such as Mac OS X HFS (Hierarchical File System) or EXT3, are 
> known to support timestamps only with seconds precision, and some FTP servers 
> are reported to provide timestamp precision only in minutes.
> This can cause files to NOT be listed, because the ListXXX processors' logic 
> expects timestamps in milliseconds.
> Specifically, if several files are generated within one second, not all of 
> them will be listed.
> Steps to reproduce:
> 1. start the ListFile processor
> 2. generate 1 zero-size files with the following command:
> {code}
> for i in {1..1}; do touch ./test_$i; done
> {code}
> 3. see the processor stats: out 3952 (0 bytes)
> The current AbstractListProcessor logic adopts LISTING_LAG_NANOS (100 ms) and 
> postpones files that have the latest timestamp within a listing iteration to 
> the next iteration; however, on filesystems without milliseconds precision, 
> this logic does not work as expected.
> This issue was originally reported on the nifi-dev ML: 
> http://apache-nifi-developer-list.39713.n7.nabble.com/processors-ListFile-ListSFTP-do-not-store-milliseconds-in-timestamp-td16037.html



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread Bryan Bende (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-3332:
--
Fix Version/s: 1.4.0

> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Fix For: 1.4.0
>
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run, since those files' timestamps are 
> not greater than the single timestamp stored in processor state.  With the 
> file tracking it was possible to process files that matched the timestamp 
> exactly and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes two batches of 3 files each, with a processor run after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread Bryan Bende (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bryan Bende updated NIFI-3332:
--
Resolution: Fixed
Status: Resolved  (was: Patch Available)

> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Fix For: 1.4.0
>
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run, since those files' timestamps are 
> not greater than the single timestamp stored in processor state.  With the 
> file tracking it was possible to process files that matched the timestamp 
> exactly and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes two batches of 3 files each, with a processor run after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143900#comment-16143900
 ] 

ASF subversion and git services commented on NIFI-3332:
---

Commit e68ff153e81ddb82d1136d44a96bdb7a70da86d1 in nifi's branch 
refs/heads/master from [~ijokarumawak]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=e68ff15 ]

NIFI-3332: ListXXX to not miss files with the latest processed timestamp

Before this fix, it was possible for ListXXX processors to miss files that have 
the same timestamp as the latest processed timestamp from the previous cycle. 
Since only timestamps were used, it was not possible to determine whether a 
file had already been processed or not.

However, storing every single processed identifier, as we used to, would not 
perform well.
Instead, this commit makes ListXXX store only the identifiers that have the 
latest timestamp in a cycle, to minimize the amount of state data to store.

NIFI-3332: ListXXX to not miss files with the latest processed timestamp

- Fixed TestAbstractListProcessor to use an appropriate time precision.
  Without this fix, an arbitrary test can fail if the generated timestamp does
  not have the desired time unit value, e.g. a generated '10:51:00' where
  second precision is tested.
- Fixed TestFTP.basicFileList to use millisecond time precision explicitly
  because FakeFtpServer's time precision is in minutes.
- Changed junit dependency scope to 'provided' as it is needed by
  ListProcessorTestWatcher which is shared among different modules.

This closes #1975.

Signed-off-by: Bryan Bende 
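
For illustration only (not the actual AbstractListProcessor code), the strategy 
described in the commit message above could look roughly like this; the class 
and method names are hypothetical:

{code:java}
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: remember only the identifiers seen at the latest
// processed timestamp, so entities sharing that timestamp are neither
// re-listed nor silently skipped on the next run.
public class LatestTimestampState {
    private long latestTimestamp = -1L;
    private final Set<String> idsAtLatestTimestamp = new HashSet<>();

    boolean isNew(final String id, final long timestamp) {
        if (timestamp > latestTimestamp) {
            return true;
        }
        return timestamp == latestTimestamp && !idsAtLatestTimestamp.contains(id);
    }

    void markProcessed(final String id, final long timestamp) {
        if (timestamp > latestTimestamp) {
            latestTimestamp = timestamp;
            idsAtLatestTimestamp.clear();   // only track ids at the newest timestamp
        }
        if (timestamp == latestTimestamp) {
            idsAtLatestTimestamp.add(id);
        }
    }
}
{code}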


> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run, since those files' timestamps are 
> not greater than the single timestamp stored in processor state.  With the 
> file tracking it was possible to process files that matched the timestamp 
> exactly and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes two batches of 3 files each, with a processor run after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143901#comment-16143901
 ] 

ASF subversion and git services commented on NIFI-3332:
---

Commit e68ff153e81ddb82d1136d44a96bdb7a70da86d1 in nifi's branch 
refs/heads/master from [~ijokarumawak]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=e68ff15 ]

NIFI-3332: ListXXX to not miss files with the latest processed timestamp

Before this fix, it was possible for ListXXX processors to miss files that have 
the same timestamp as the latest processed timestamp from the previous cycle. 
Since only timestamps were used, it was not possible to determine whether a 
file had already been processed or not.

However, storing every single processed identifier, as we used to, would not 
perform well.
Instead, this commit makes ListXXX store only the identifiers that have the 
latest timestamp in a cycle, to minimize the amount of state data to store.

NIFI-3332: ListXXX to not miss files with the latest processed timestamp

- Fixed TestAbstractListProcessor to use an appropriate time precision.
  Without this fix, an arbitrary test can fail if the generated timestamp does
  not have the desired time unit value, e.g. a generated '10:51:00' where
  second precision is tested.
- Fixed TestFTP.basicFileList to use millisecond time precision explicitly
  because FakeFtpServer's time precision is in minutes.
- Changed junit dependency scope to 'provided' as it is needed by
  ListProcessorTestWatcher which is shared among different modules.

This closes #1975.

Signed-off-by: Bryan Bende 


> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run, since those files' timestamps are 
> not greater than the single timestamp stored in processor state.  With the 
> file tracking it was possible to process files that matched the timestamp 
> exactly and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes two batches of 3 files each, with a processor run after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143903#comment-16143903
 ] 

ASF GitHub Bot commented on NIFI-3332:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1975


> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run, since those files' timestamps are 
> not greater than the single timestamp stored in processor state.  With the 
> file tracking it was possible to process files that matched the timestamp 
> exactly and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes two batches of 3 files each, with a processor run after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4069) ListXXX processors can miss files those created while the processor is listing and filesystem does not provide timestamp milliseconds precision

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143902#comment-16143902
 ] 

ASF GitHub Bot commented on NIFI-4069:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1915


> ListXXX processors can miss files those created while the processor is 
> listing and filesystem does not provide timestamp milliseconds precision
> ---
>
> Key: NIFI-4069
> URL: https://issues.apache.org/jira/browse/NIFI-4069
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.0.0
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
> Attachments: ListFilesWithoutMilliseconds.png
>
>
> Some filesystems, such as Mac OS X HFS (Hierarchical File System) or EXT3, 
> are known to support timestamps only with seconds precision. Some FTP 
> servers are also reported to provide timestamp precision only in minutes.
> This can cause files to NOT be listed, as the ListXXX processor logic expects 
> timestamps in milliseconds.
> Specifically, if several files are generated within one second, not all of 
> them will be listed.
> Steps to reproduce:
> 1. start the ListFile processor
> 2. generate 1 zero size files with the following command:
> {code}
> for i in {1..1}; do touch ./test_$i; done
> {code}
> 3. see processor stats: out 3952 (0 bytes)
> The current AbstractListProcessor logic adopts LISTING_LAG_NANOS (100ms) and 
> postpones files that have the latest timestamp within a listing iteration to 
> the next iteration; however, with filesystems that lack milliseconds 
> precision, this logic does not work as expected.
> This issue was originally reported on the nifi-dev ML. 
> http://apache-nifi-developer-list.39713.n7.nabble.com/processors-ListFile-ListSFTP-do-not-store-milliseconds-in-timestamp-td16037.html
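
For context, a minimal sketch of the lag idea under coarser precision, using hypothetical names (TimestampPrecision, listableCutoff): the listing is only trusted up to "now minus one precision unit", so files still being written within the same second or minute are picked up on a later run rather than lost.

```java
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: the lag must cover at least one precision unit, otherwise
// files written later within the same truncated second or minute are missed.
enum TimestampPrecision {
    MILLISECONDS(100L),
    SECONDS(TimeUnit.SECONDS.toMillis(1)),
    MINUTES(TimeUnit.MINUTES.toMillis(1));

    private final long lagMillis;

    TimestampPrecision(final long lagMillis) {
        this.lagMillis = lagMillis;
    }

    /** Only entries with a timestamp at or before this cutoff are safe to list. */
    long listableCutoff(final long nowMillis) {
        return nowMillis - lagMillis;
    }
}
```

For example, with SECONDS precision a file stamped 10:51:07 would only be committed once the clock passes roughly 10:51:08, so siblings created later within the same second are still listed on the next run.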



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4069) ListXXX processors can miss files those created while the processor is listing and filesystem does not provide timestamp milliseconds precision

2017-08-28 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143899#comment-16143899
 ] 

ASF subversion and git services commented on NIFI-4069:
---

Commit 28ee70222b892fb799f5f74a31a9de678d9fb629 in nifi's branch 
refs/heads/master from [~ijokarumawak]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=28ee702 ]

NIFI-4069: Make ListXXX work with timestamp precision in seconds or minutes

- Refactored variable names to better represent what they are meant for.
- Added deterministic logic which detects the target filesystem timestamp precision 
and adjusts the lag time based on it.
- Changed from using System.nanoTime() to System.currentTimeMillis in tests 
because the Java File API reports timestamps in milliseconds at best 
granularity. Also, System.nanoTime should not be mixed with epoch 
milliseconds because it uses an arbitrary origin and is measured differently.
- Changed TestListFile to use a longer interval between the file timestamps 
used by testFilterAge to provide more consistent test results, because sleep 
times can be longer with filesystems whose timestamps have seconds precision.
- Added logging to TestListFile.
- Added a TestWatcher to dump state in case an assertion fails, for further 
investigation.
- Added a Timestamp Precision property so that the user can set it if auto-detect 
is not enough.
- Adjusted timestamps for the ages test.

This closes #1915.

Signed-off-by: Bryan Bende 
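
For reference, a hedged sketch of what such a user-facing "Timestamp Precision" property could look like on a ListXXX processor; the actual property name, allowable values and default in the merged commit may differ.

```java
import org.apache.nifi.components.PropertyDescriptor;

// Hedged sketch only; names and values are assumptions, not the merged code.
public static final PropertyDescriptor TARGET_SYSTEM_TIMESTAMP_PRECISION = new PropertyDescriptor.Builder()
        .name("target-system-timestamp-precision")
        .displayName("Target System Timestamp Precision")
        .description("How precisely the target filesystem reports entry timestamps; "
                + "Auto Detect lets the processor infer it from the listed entries.")
        .required(true)
        .allowableValues("Auto Detect", "Milliseconds", "Seconds", "Minutes")
        .defaultValue("Auto Detect")
        .build();
```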


> ListXXX processors can miss files those created while the processor is 
> listing and filesystem does not provide timestamp milliseconds precision
> ---
>
> Key: NIFI-4069
> URL: https://issues.apache.org/jira/browse/NIFI-4069
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.0.0
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
> Attachments: ListFilesWithoutMilliseconds.png
>
>
> Some filesystems, such as Mac OS X HFS (Hierarchical File System) or EXT3, 
> are known to support timestamps only with seconds precision. Some FTP 
> servers are also reported to provide timestamp precision only in minutes.
> This can cause files to NOT be listed, as the ListXXX processor logic expects 
> timestamps in milliseconds.
> Specifically, if several files are generated within one second, not all of 
> them will be listed.
> Steps to reproduce:
> 1. start the ListFile processor
> 2. generate 1 zero size files with the following command:
> {code}
> for i in {1..1}; do touch ./test_$i; done
> {code}
> 3. see processor stats: out 3952 (0 bytes)
> The current AbstractListProcessor logic adopts LISTING_LAG_NANOS (100ms) and 
> postpones files that have the latest timestamp within a listing iteration to 
> the next iteration; however, with filesystems that lack milliseconds 
> precision, this logic does not work as expected.
> This issue was originally reported on the nifi-dev ML. 
> http://apache-nifi-developer-list.39713.n7.nabble.com/processors-ListFile-ListSFTP-do-not-store-milliseconds-in-timestamp-td16037.html



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi pull request #1915: NIFI-4069: Make ListXXX work with timestamp precisi...

2017-08-28 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1915


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request #1975: NIFI-3332: ListXXX to not miss files with the lates...

2017-08-28 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/1975


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143895#comment-16143895
 ] 

ASF GitHub Bot commented on NIFI-3332:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/1975
  
Thanks for updating the PR! This looks good now, going to merge to master


> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run since its timestamps are not 
> greater than the single timestamp stored in processor state.  With the file 
> tracking it was possible to process files that matched the timestamp exactly 
> and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes 3 files each in two batches with processor runs after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #1975: NIFI-3332: ListXXX to not miss files with the latest proce...

2017-08-28 Thread bbende
Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/1975
  
Thanks for updating the PR! This looks good now, going to merge to master


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2528) Update ListenHTTP to honor SSLContextService Protocols

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143865#comment-16143865
 ] 

ASF GitHub Bot commented on NIFI-2528:
--

Github user m-hogue commented on the issue:

https://github.com/apache/nifi/pull/1986
  
@alopresto : I've replied with a couple of questions on your comments.

Also, I'll create an issue to update the SSL Context Service interface for 
the listed processors (once confirmed).


> Update ListenHTTP to honor SSLContextService Protocols
> --
>
> Key: NIFI-2528
> URL: https://issues.apache.org/jira/browse/NIFI-2528
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.0.0, 0.8.0, 0.7.1
>Reporter: Joe Skora
>Assignee: Michael Hogue
>
> Update ListenHTTP to honor SSLContextService Protocols as [NIFI-1688] did for 
> PostHTTP.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #1986: NIFI-2528: added support for SSLContextService protocols i...

2017-08-28 Thread m-hogue
Github user m-hogue commented on the issue:

https://github.com/apache/nifi/pull/1986
  
@alopresto : I've replied with a couple of questions on your comments.

Also, I'll create an issue to update the SSL Context Service interface for 
the listed processors (once confirmed).


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2528) Update ListenHTTP to honor SSLContextService Protocols

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143864#comment-16143864
 ] 

ASF GitHub Bot commented on NIFI-2528:
--

Github user m-hogue commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1986#discussion_r135548481
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service/src/main/java/org/apache/nifi/ssl/StandardRestrictedSSLContextService.java
 ---
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.ssl;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.processor.util.StandardValidators;
+
+/**
+ * This class is functionally the same as {@link 
StandardSSLContextService}, but it restricts the allowable
+ * values that can be selected for SSL protocols.
+ */
+@Tags({"ssl", "secure", "certificate", "keystore", "truststore", "jks", 
"p12", "pkcs12", "pkcs"})
+@CapabilityDescription("Restricted implementation of the 
SSLContextService. Provides the ability to configure "
++ "keystore and/or truststore properties once and reuse that 
configuration throughout the application, "
++ "but only allows a restricted set of SSL protocols to be chosen. 
The set of protocols selectable will "
++ "evolve over time as new protocols emerge and older protocols 
are deprecated. This service is recommended "
++ "over StandardSSLContextService if a component doesn't expect to 
communicate with legacy systems since it's "
++ "unlikely that legacy systems will support these protocols.")
+public class StandardRestrictedSSLContextService extends 
StandardSSLContextService implements RestrictedSSLContextService {
+
+public static final PropertyDescriptor RESTRICTED_SSL_ALGORITHM = new 
PropertyDescriptor.Builder()
+.name("SSL Protocol")
+.defaultValue("TLSv1.2")
--- End diff --

Do you suggest that I change the `.name()` to "TLS Protocol" here as well?
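
If only the user-facing label needs to change, a hedged sketch of one option is to keep the stable `.name()` and add a `.displayName()`; whether that matches the reviewer's intent here is an assumption.

```java
// Hedged sketch: keep the original machine name for compatibility with existing
// flows and change only the label shown in the UI. Not necessarily what this PR
// ends up doing.
public static final PropertyDescriptor RESTRICTED_SSL_ALGORITHM = new PropertyDescriptor.Builder()
        .name("SSL Protocol")            // stable identifier, unchanged
        .displayName("TLS Protocol")     // user-facing label
        .defaultValue("TLSv1.2")
        .build();
```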


> Update ListenHTTP to honor SSLContextService Protocols
> --
>
> Key: NIFI-2528
> URL: https://issues.apache.org/jira/browse/NIFI-2528
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.0.0, 0.8.0, 0.7.1
>Reporter: Joe Skora
>Assignee: Michael Hogue
>
> Update ListenHTTP to honor SSLContextService Protocols as [NIFI-1688] did for 
> PostHTTP.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi pull request #1986: NIFI-2528: added support for SSLContextService prot...

2017-08-28 Thread m-hogue
Github user m-hogue commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1986#discussion_r135548481
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service/src/main/java/org/apache/nifi/ssl/StandardRestrictedSSLContextService.java
 ---
@@ -0,0 +1,81 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.ssl;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.processor.util.StandardValidators;
+
+/**
+ * This class is functionally the same as {@link 
StandardSSLContextService}, but it restricts the allowable
+ * values that can be selected for SSL protocols.
+ */
+@Tags({"ssl", "secure", "certificate", "keystore", "truststore", "jks", 
"p12", "pkcs12", "pkcs"})
+@CapabilityDescription("Restricted implementation of the 
SSLContextService. Provides the ability to configure "
++ "keystore and/or truststore properties once and reuse that 
configuration throughout the application, "
++ "but only allows a restricted set of SSL protocols to be chosen. 
The set of protocols selectable will "
++ "evolve over time as new protocols emerge and older protocols 
are deprecated. This service is recommended "
++ "over StandardSSLContextService if a component doesn't expect to 
communicate with legacy systems since it's "
++ "unlikely that legacy systems will support these protocols.")
+public class StandardRestrictedSSLContextService extends 
StandardSSLContextService implements RestrictedSSLContextService {
+
+public static final PropertyDescriptor RESTRICTED_SSL_ALGORITHM = new 
PropertyDescriptor.Builder()
+.name("SSL Protocol")
+.defaultValue("TLSv1.2")
--- End diff --

Do you suggest that I change the `.name()` to "TLS Protocol" here as well?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (NIFI-4326) ExtractEmailHeaders.java unhandled NullPointerException

2017-08-28 Thread Benjamin Wood (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Benjamin Wood updated NIFI-4326:

Description: 
The ExtractEmailHeaders processor throws a NullPointerException if there are no 
TO, CC, or BCC recipients.

If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
and not a 0-length array.

I've already written a patch and submitted it to github as pull request #2111

  was:
The ExtractEmailHeaders using  processor throws a NullPointerException if there 
is no TO, CC, and BCC recipients.

If there are no recipients "originalMessage.getAllRecipients()" returns NULL, 
and not a 0 length array.

I've already written a patch and submitted it to github as pull request #2111


> ExtractEmailHeaders.java unhandled NullPointerException
> ---
>
> Key: NIFI-4326
> URL: https://issues.apache.org/jira/browse/NIFI-4326
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.3.0
>Reporter: Benjamin Wood
>Priority: Minor
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> The ExtractEmailHeaders processor throws a NullPointerException if there 
> are no TO, CC, or BCC recipients.
> If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
> and not a 0-length array.
> I've already written a patch and submitted it to github as pull request #2111
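
A hedged sketch of the null guard the description calls for, assuming a hypothetical helper name (safeRecipients); it is not the actual patch in PR #2111.

```java
import javax.mail.Address;
import javax.mail.MessagingException;
import javax.mail.internet.MimeMessage;

// Hedged sketch only: normalize the null case described above.
static Address[] safeRecipients(final MimeMessage originalMessage) throws MessagingException {
    final Address[] recipients = originalMessage.getAllRecipients();
    // getAllRecipients() returns null rather than an empty array when the
    // message has no TO, CC or BCC headers, so normalize that here.
    return recipients == null ? new Address[0] : recipients;
}
```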



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-2528) Update ListenHTTP to honor SSLContextService Protocols

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143860#comment-16143860
 ] 

ASF GitHub Bot commented on NIFI-2528:
--

Github user m-hogue commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1986#discussion_r135547306
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service/src/main/java/org/apache/nifi/ssl/SSLContextServiceUtils.java
 ---
@@ -0,0 +1,77 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.ssl;
+
+import java.security.NoSuchAlgorithmException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+import javax.net.ssl.SSLContext;
+
+import org.apache.nifi.components.AllowableValue;
+
+public class SSLContextServiceUtils {
+
+/**
+ * Build a set of allowable SSL protocol algorithms based on JVM 
configuration and whether
+ * or not the list should be restricted to include only certain 
protocols.
+ *
+ * @param restricted whether the set of allowable protocol values 
should be restricted.
+ *
+ * @return the computed set of allowable values
+ */
+public static AllowableValue[] buildSSLAlgorithmAllowableValues(final 
boolean restricted) {
+final Set supportedProtocols = new HashSet<>();
+
+// if restricted, only allow the below set of protocols.
+if(restricted) {
+supportedProtocols.add("TLSv1.2");
--- End diff --

Good call. I'll add this.
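
A hedged sketch of the idea being discussed, written as a self-contained helper; the merged code may differ. When not restricted, the JVM's default SSLContext can report the full set of supported protocols.

```java
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import javax.net.ssl.SSLContext;

// Hedged sketch of the same idea as the diff above; not the actual PR code.
static Set<String> supportedSslProtocols(final boolean restricted) {
    final Set<String> protocols = new HashSet<>();
    if (restricted) {
        // only modern TLS when the restricted service is used
        protocols.add("TLSv1.2");
    } else {
        try {
            // everything the running JVM reports as supported
            protocols.addAll(Arrays.asList(
                    SSLContext.getDefault().getSupportedSSLParameters().getProtocols()));
        } catch (final NoSuchAlgorithmException e) {
            // leave the set empty if no default SSLContext is available
        }
    }
    return protocols;
}
```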


> Update ListenHTTP to honor SSLContextService Protocols
> --
>
> Key: NIFI-2528
> URL: https://issues.apache.org/jira/browse/NIFI-2528
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.0.0, 0.8.0, 0.7.1
>Reporter: Joe Skora
>Assignee: Michael Hogue
>
> Update ListenHTTP to honor SSLContextService Protocols as [NIFI-1688] did for 
> PostHTTP.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Commented] (NIFI-4318) PutHiveQL processor cannot be stopped

2017-08-28 Thread Matt Burgess (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143861#comment-16143861
 ] 

Matt Burgess commented on NIFI-4318:


Can you provide more details around the cookie/Kerberos issues you were having 
before you tried to shut down the processor?

> PutHiveQL processor cannot be stopped
> -
>
> Key: NIFI-4318
> URL: https://issues.apache.org/jira/browse/NIFI-4318
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: 3-nodes cluster
>Reporter: Pierre Villard
> Attachments: image001.png, thread-2.txt, thread.txt
>
>
> I tried to stop the PutHiveQL processor after experiencing some cookie/Kerberos 
> issues while sending requests to Hive, but the processor could not be stopped 
> and showed running threads (it remained in this situation for at least half 
> an hour). I had to restart NiFi to resolve the situation.
> Attached: a screenshot and two thread dumps at about a 5 minute interval.
> It looks like the Kerberos authentication mechanism is falling back to manual 
> user input and waits for some input (see promptForName below):
> {noformat}
> "Timer-Driven Process Thread-2" Id=139 RUNNABLE  (in native code)
>   at java.io.FileInputStream.readBytes(Native Method)
>   at java.io.FileInputStream.read(FileInputStream.java:255)
>   at java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
>   at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
>   - waiting on java.io.BufferedInputStream@2e2d3f92
>   at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
>   at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
>   at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
>   - waiting on java.io.InputStreamReader@64628fdf
>   at java.io.InputStreamReader.read(InputStreamReader.java:184)
>   at java.io.BufferedReader.fill(BufferedReader.java:161)
>   at java.io.BufferedReader.readLine(BufferedReader.java:324)
>   - waiting on java.io.InputStreamReader@64628fdf
>   at java.io.BufferedReader.readLine(BufferedReader.java:389)
>   at 
> com.sun.security.auth.callback.TextCallbackHandler.readLine(TextCallbackHandler.java:153)
>   at 
> com.sun.security.auth.callback.TextCallbackHandler.handle(TextCallbackHandler.java:120)
>   at 
> com.sun.security.auth.module.Krb5LoginModule.promptForName(Krb5LoginModule.java:858)
>   at 
> com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:704)
>   at 
> com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
>   at sun.reflect.GeneratedMethodAccessor597.invoke(Unknown Source)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
>   at 
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
>   at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
>   at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at 
> javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
>   at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
>   at sun.security.jgss.GSSUtil.login(GSSUtil.java:258)
>   at sun.security.jgss.krb5.Krb5Util.getTicket(Krb5Util.java:158)
>   at 
> sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:335)
>   at 
> sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:331)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at 
> sun.security.jgss.krb5.Krb5InitCredential.getTgt(Krb5InitCredential.java:330)
>   at 
> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:145)
>   at 
> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
>   at 
> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>   at 
> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
>   at 
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>   at 
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>   at 
> org.apache.hive.service.auth.HttpAuthUtils$HttpKerberosClientAction.run(HttpAuthUtils.java:183)
>   at 
> org.apache.hive.service.auth.HttpAuthUtils$HttpKerberosClientAction.run(HttpAuthUtils.java:151)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at 

[jira] [Commented] (NIFI-4326) ExtractEmailHeaders.java unhandled NullPointerException

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143859#comment-16143859
 ] 

ASF GitHub Bot commented on NIFI-4326:
--

Github user btwood commented on the issue:

https://github.com/apache/nifi/pull/2111
  
Opened a JIRA Issue NIFI-4326


> ExtractEmailHeaders.java unhandled NullPointerException
> ---
>
> Key: NIFI-4326
> URL: https://issues.apache.org/jira/browse/NIFI-4326
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.3.0
>Reporter: Benjamin Wood
>Priority: Minor
>   Original Estimate: 0.5h
>  Remaining Estimate: 0.5h
>
> The ExtractEmailHeaders processor throws a NullPointerException if 
> there are no TO, CC, or BCC recipients.
> If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
> and not a 0-length array.
> I've already written a patch and submitted it to github as pull request #2111



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi pull request #1986: NIFI-2528: added support for SSLContextService prot...

2017-08-28 Thread m-hogue
Github user m-hogue commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1986#discussion_r135547306
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service/src/main/java/org/apache/nifi/ssl/SSLContextServiceUtils.java
 ---
@@ -0,0 +1,77 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.ssl;
+
+import java.security.NoSuchAlgorithmException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+import javax.net.ssl.SSLContext;
+
+import org.apache.nifi.components.AllowableValue;
+
+public class SSLContextServiceUtils {
+
+/**
+ * Build a set of allowable SSL protocol algorithms based on JVM 
configuration and whether
+ * or not the list should be restricted to include only certain 
protocols.
+ *
+ * @param restricted whether the set of allowable protocol values 
should be restricted.
+ *
+ * @return the computed set of allowable values
+ */
+public static AllowableValue[] buildSSLAlgorithmAllowableValues(final 
boolean restricted) {
+final Set supportedProtocols = new HashSet<>();
+
+// if restricted, only allow the below set of protocols.
+if(restricted) {
+supportedProtocols.add("TLSv1.2");
--- End diff --

Good call. I'll add this.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Created] (NIFI-4326) ExtractEmailHeaders.java unhandled NullPointerException

2017-08-28 Thread Benjamin Wood (JIRA)
Benjamin Wood created NIFI-4326:
---

 Summary: ExtractEmailHeaders.java unhandled NullPointerException
 Key: NIFI-4326
 URL: https://issues.apache.org/jira/browse/NIFI-4326
 Project: Apache NiFi
  Issue Type: Bug
  Components: Core Framework
Affects Versions: 1.3.0
Reporter: Benjamin Wood
Priority: Minor


The ExtractEmailHeaders processor throws a NullPointerException if there 
are no TO, CC, or BCC recipients.

If there are no recipients, "originalMessage.getAllRecipients()" returns NULL 
and not a 0-length array.

I've already written a patch and submitted it to github as pull request #2111



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #2111: Update ExtractEmailHeaders.java

2017-08-28 Thread btwood
Github user btwood commented on the issue:

https://github.com/apache/nifi/pull/2111
  
Opened a JIRA Issue NIFI-4326


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-2528) Update ListenHTTP to honor SSLContextService Protocols

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-2528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143852#comment-16143852
 ] 

ASF GitHub Bot commented on NIFI-2528:
--

Github user m-hogue commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1986#discussion_r135545489
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service/src/main/java/org/apache/nifi/ssl/SSLContextServiceUtils.java
 ---
@@ -0,0 +1,77 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.ssl;
+
+import java.security.NoSuchAlgorithmException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+import javax.net.ssl.SSLContext;
+
+import org.apache.nifi.components.AllowableValue;
+
+public class SSLContextServiceUtils {
--- End diff --

@alopresto : I agree and I looked into doing this. However, the problem is 
that the allowable values for the SSL algorithm PropertyDescriptors are 
statically initialized with the appropriate values. Adding a method to the 
`SSLContextService` interface wouldn't enable you to build the values before 
the class is loaded, which is why I threw a static method into a utility class. 
This actually reminded me that I forgot to remove the original 
`buildAllowableValues()` method in `StandardSSLContextService`. I'll do that on 
my next push.

I can approach this in a different way if you have a recommendation. Thanks!


> Update ListenHTTP to honor SSLContextService Protocols
> --
>
> Key: NIFI-2528
> URL: https://issues.apache.org/jira/browse/NIFI-2528
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 1.0.0, 0.8.0, 0.7.1
>Reporter: Joe Skora
>Assignee: Michael Hogue
>
> Update ListenHTTP to honor SSLContextService Protocols as [NIFI-1688] did for 
> PostHTTP.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi pull request #1986: NIFI-2528: added support for SSLContextService prot...

2017-08-28 Thread m-hogue
Github user m-hogue commented on a diff in the pull request:

https://github.com/apache/nifi/pull/1986#discussion_r135545489
  
--- Diff: 
nifi-nar-bundles/nifi-standard-services/nifi-ssl-context-bundle/nifi-ssl-context-service/src/main/java/org/apache/nifi/ssl/SSLContextServiceUtils.java
 ---
@@ -0,0 +1,77 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.ssl;
+
+import java.security.NoSuchAlgorithmException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Set;
+
+import javax.net.ssl.SSLContext;
+
+import org.apache.nifi.components.AllowableValue;
+
+public class SSLContextServiceUtils {
--- End diff --

@alopresto : I agree and I looked into doing this. However, the problem is 
that the allowable values for the SSL algorithm PropertyDescriptors are 
statically initialized with the appropriate values. Adding a method to the 
`SSLContextService` interface wouldn't enable you to build the values before 
the class is loaded, which is why I threw a static method into a utility class. 
This actually reminded me that I forgot to remove the original 
`buildAllowableValues()` method in `StandardSSLContextService`. I'll do that on 
my next push.

I can approach this in a different way if you have a recommendation. Thanks!


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-4289) Implement put processor for InfluxDB

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143778#comment-16143778
 ] 

ASF GitHub Bot commented on NIFI-4289:
--

Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/2101
  
Hey Nifi Team:  Please let me know your feedback/comments on this 
processor.  Thanks.


> Implement put processor for InfluxDB
> 
>
> Key: NIFI-4289
> URL: https://issues.apache.org/jira/browse/NIFI-4289
> Project: Apache NiFi
>  Issue Type: New Feature
>  Components: Extensions
>Affects Versions: 1.3.0
> Environment: All
>Reporter: Mans Singh
>Assignee: Mans Singh
>Priority: Minor
>  Labels: insert, measurements,, put, timeseries
> Fix For: 1.4.0
>
>
> Support inserting time series measurements into InfluxDB.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #2101: NIFI-4289 - InfluxDB put processor

2017-08-28 Thread mans2singh
Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/2101
  
Hey Nifi Team:  Please let me know your feedback/comments on this 
processor.  Thanks.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (NIFI-3332) Bug in ListXXX causes matching timestamps to be ignored on later runs

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-3332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143719#comment-16143719
 ] 

ASF GitHub Bot commented on NIFI-3332:
--

Github user ijokarumawak commented on the issue:

https://github.com/apache/nifi/pull/1975
  
@bbende Thanks for reviewing this.

This PR is now rebased with the latest master. The last commit includes 
the following changes.

The failing TestFTP.basicFileList was added after I started working on this PR. 
It uses FakeFTPServer, which provides timestamp precision in minutes, and this 
PR adds timestamp precision auto-detection by default. The file which was expected 
to be picked up was not picked up because it hadn't passed the required amount of 
lag time for minute precision. The test has been updated to use millisecond 
precision explicitly, and a thread sleep has also been added. The same error was 
confirmed in my environment, but it's been addressed.

Similarly, I found that TestAbstractListProcessor tests can fail due to 
lack of a time unit setting, when the generated timestamp does not have the desired 
time unit value, e.g. '10:51:00' is generated where second precision is tested. 
This has been addressed, too.

Finally, the reason for changing the junit dependency is 
ListProcessorTestWatcher. It resides in `nifi-processor-utils` and is used by 
that project and also by the `nifi-standard-processors` project. In order to share 
ListProcessorTestWatcher via nifi-processor-utils, I changed the junit scope to 
'compile' because it needs to be accessible from the 'main' source, not the 
'test' source. But 'provided' is more reasonable in this case, so I've updated 
it to 'provided'. Without a mechanism like ListProcessorTestWatcher, 
debugging test failures would be very difficult, especially if they happen 
occasionally in a remote environment such as Travis CI.
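
A hedged sketch of a watcher in the spirit of the ListProcessorTestWatcher described above (hypothetical class name DumpStateOnFailure); it only illustrates the JUnit 4 TestWatcher hook used to dump state when an assertion fails.

```java
import org.junit.rules.TestWatcher;
import org.junit.runner.Description;

// Hedged sketch, not the actual ListProcessorTestWatcher: dump whatever state
// the test collected when it fails, so occasional CI-only failures leave
// something to investigate.
public class DumpStateOnFailure extends TestWatcher {

    private final Runnable stateDumper;

    public DumpStateOnFailure(final Runnable stateDumper) {
        this.stateDumper = stateDumper;
    }

    @Override
    protected void failed(final Throwable e, final Description description) {
        System.err.println("Test failed: " + description + ", dumping state:");
        stateDumper.run();
    }
}
```

It would be registered in a test class with `@Rule`, passing a lambda that prints whatever listing or processor state the test has accumulated.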


> Bug in ListXXX causes matching timestamps to be ignored on later runs
> -
>
> Key: NIFI-3332
> URL: https://issues.apache.org/jira/browse/NIFI-3332
> Project: Apache NiFi
>  Issue Type: Bug
>  Components: Core Framework
>Affects Versions: 0.7.1, 1.1.1
>Reporter: Joe Skora
>Assignee: Koji Kawamura
>Priority: Critical
> Attachments: listfiles.png, Test-showing-ListFile-timestamp-bug.log, 
> Test-showing-ListFile-timestamp-bug.patch
>
>
> The new state implementation for the ListXXX processors based on 
> AbstractListProcessor creates a race condition when processor runs occur 
> while a batch of files is being written with the same timestamp.
> The changes to state management dropped tracking of the files processed for a 
> given timestamp.  Without the record of files processed, the remainder of the 
> batch is ignored on the next processor run since its timestamps are not 
> greater than the single timestamp stored in processor state.  With the file 
> tracking it was possible to process files that matched the timestamp exactly 
> and exclude the previously processed files.
> A basic timeline goes as follows.
>   T0 - system creates or receives batch of files with Tx timestamp where Tx 
> is more than the current timestamp in processor state.
>   T1 - system writes 1st half of Tx batch to the ListFile source directory.
>   T2 - ListFile runs picking up 1st half of Tx batch and stores Tx timestamp 
> in processor state.
>   T3 - system writes 2nd half of Tx batch to ListFile source directory.
>   T4 - ListFile runs ignoring any files with T <= Tx, eliminating 2nd half Tx 
> timestamp batch.
> I've attached a patch[1] for TestListFile.java that adds an instrumented unit 
> test demonstrating the problem, and a log[2] of the output from one such run.  
> The test writes 3 files each in two batches with processor runs after each 
> batch.  Batch 2 writes files with timestamps older than, equal to, and newer 
> than the timestamp stored when batch 1 was processed, but only the newer file 
> is picked up.  The older file is correctly ignored, but the file with the 
> matching timestamp should have been processed.
> [1] Test-showing-ListFile-timestamp-bug.patch
> [2] Test-showing-ListFile-timestamp-bug.log



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #1975: NIFI-3332: ListXXX to not miss files with the latest proce...

2017-08-28 Thread ijokarumawak
Github user ijokarumawak commented on the issue:

https://github.com/apache/nifi/pull/1975
  
@bbende Thanks for reviewing this.

This PR is now rebased with the latest master. The last commit includes 
the following changes.

The failing TestFTP.basicFileList was added after I started working on this PR. 
It uses FakeFTPServer, which provides timestamp precision in minutes, and this 
PR adds timestamp precision auto-detection by default. The file which was expected 
to be picked up was not picked up because it hadn't passed the required amount of 
lag time for minute precision. The test has been updated to use millisecond 
precision explicitly, and a thread sleep has also been added. The same error was 
confirmed in my environment, but it's been addressed.

Similarly, I found that TestAbstractListProcessor tests can fail due to 
lack of a time unit setting, when the generated timestamp does not have the desired 
time unit value, e.g. '10:51:00' is generated where second precision is tested. 
This has been addressed, too.

Finally, the reason for changing the junit dependency is 
ListProcessorTestWatcher. It resides in `nifi-processor-utils` and is used by 
that project and also by the `nifi-standard-processors` project. In order to share 
ListProcessorTestWatcher via nifi-processor-utils, I changed the junit scope to 
'compile' because it needs to be accessible from the 'main' source, not the 
'test' source. But 'provided' is more reasonable in this case, so I've updated 
it to 'provided'. Without a mechanism like ListProcessorTestWatcher, 
debugging test failures would be very difficult, especially if they happen 
occasionally in a remote environment such as Travis CI.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (NIFI-4318) PutHiveQL processor cannot be stopped

2017-08-28 Thread Pierre Villard (JIRA)

 [ 
https://issues.apache.org/jira/browse/NIFI-4318?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Pierre Villard updated NIFI-4318:
-
Description: 
I tried to stop the PutHiveQL processor after experiencing some cookie/Kerberos 
issues while sending requests to Hive, but the processor could not be stopped 
and showed running threads (it remained in this situation for at least half an 
hour). I had to restart NiFi to resolve the situation.

Attached: a screenshot and two thread dumps at about a 5 minute interval.

It looks like the Kerberos authentication mechanism is falling back to manual 
user input and waits for some input (see promptForName below):

{noformat}
"Timer-Driven Process Thread-2" Id=139 RUNNABLE  (in native code)
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:255)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
- waiting on java.io.BufferedInputStream@2e2d3f92
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
- waiting on java.io.InputStreamReader@64628fdf
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
- waiting on java.io.InputStreamReader@64628fdf
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at 
com.sun.security.auth.callback.TextCallbackHandler.readLine(TextCallbackHandler.java:153)
at 
com.sun.security.auth.callback.TextCallbackHandler.handle(TextCallbackHandler.java:120)
at 
com.sun.security.auth.module.Krb5LoginModule.promptForName(Krb5LoginModule.java:858)
at 
com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:704)
at 
com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.GeneratedMethodAccessor597.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at 
javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at 
javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at sun.security.jgss.GSSUtil.login(GSSUtil.java:258)
at sun.security.jgss.krb5.Krb5Util.getTicket(Krb5Util.java:158)
at 
sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:335)
at 
sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:331)
at java.security.AccessController.doPrivileged(Native Method)
at 
sun.security.jgss.krb5.Krb5InitCredential.getTgt(Krb5InitCredential.java:330)
at 
sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:145)
at 
sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at 
sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at 
sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at 
sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at 
sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at 
org.apache.hive.service.auth.HttpAuthUtils$HttpKerberosClientAction.run(HttpAuthUtils.java:183)
at 
org.apache.hive.service.auth.HttpAuthUtils$HttpKerberosClientAction.run(HttpAuthUtils.java:151)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at 
org.apache.hive.service.auth.HttpAuthUtils.getKerberosServiceTicket(HttpAuthUtils.java:83)
at 
org.apache.hive.jdbc.HttpKerberosRequestInterceptor.addHttpAuthHeader(HttpKerberosRequestInterceptor.java:62)
at 
org.apache.hive.jdbc.HttpRequestInterceptorBase.process(HttpRequestInterceptorBase.java:74)
at 
org.apache.http.protocol.ImmutableHttpProcessor.process(ImmutableHttpProcessor.java:132)
at 
org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:183)
at 

[jira] [Commented] (NIFI-4004) Refactor RecordReaderFactory and SchemaAccessStrategy to be used without incoming FlowFile

2017-08-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4004?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16143456#comment-16143456
 ] 

ASF GitHub Bot commented on NIFI-4004:
--

Github user ijokarumawak commented on the issue:

https://github.com/apache/nifi/pull/1877
  
@markap14 Thanks for pointing that out. I think I got misled by the two 
getSchema methods in SchemaRegistryService. I accidentally removed the 2nd form 
during refactoring, and that caused the unnecessary InputStream to sneak into 
RecordSetWriterFactory.

```
// This one had been used by readers
public final RecordSchema getSchema(final FlowFile flowFile, final 
InputStream contentStream, final RecordSchema readSchema) throws 
SchemaNotFoundException, IOException {
// This one had been used by writers
public RecordSchema getSchema(final FlowFile flowFile, final 
RecordSchema readSchema) throws SchemaNotFoundException, IOException {
```

I pushed another commit to bring back the 2nd signature above and change 
FlowFile to Map. It compiled and contrib-check passed locally. 
I also tested different readers/writers with a live flow. Thanks again for taking 
the time to review!
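
A hedged sketch of the writer-side shape after that change, with the FlowFile argument replaced by a map of variables; the interface and method names here are assumptions, not the exact merged NiFi API.

```java
import java.io.IOException;
import java.util.Map;

import org.apache.nifi.schema.access.SchemaNotFoundException;
import org.apache.nifi.serialization.record.RecordSchema;

// Hedged sketch only: the 2nd signature with FlowFile swapped for a variables
// map, as the comment above describes. Not guaranteed to match the merged code.
interface WriterSchemaResolver {
    RecordSchema getSchema(Map<String, String> variables, RecordSchema readSchema)
            throws SchemaNotFoundException, IOException;
}
```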


> Refactor RecordReaderFactory and SchemaAccessStrategy to be used without 
> incoming FlowFile
> --
>
> Key: NIFI-4004
> URL: https://issues.apache.org/jira/browse/NIFI-4004
> Project: Apache NiFi
>  Issue Type: Improvement
>  Components: Extensions
>Affects Versions: 1.2.0
>Reporter: Koji Kawamura
>Assignee: Koji Kawamura
>
> The current RecordReaderFactory and SchemaAccessStrategy implementation assumes 
> there's always an incoming FlowFile available, and uses it to resolve the Record 
> Schema.
> That is fine for components that convert or update incoming FlowFiles; 
> however, there are other components that do not have any incoming 
> FlowFiles, for example ConsumeKafkaRecord_0_10. Typically, components that fetch 
> data from an external system do not have an incoming FlowFile, and the current API 
> doesn't fit well with these as it requires a FlowFile.
> In fact, [ConsumeKafkaRecord creates a temporary 
> FlowFile|https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-kafka-bundle/nifi-kafka-0-10-processors/src/main/java/org/apache/nifi/processors/kafka/pubsub/ConsumerLease.java#L426]
>  only to get a RecordSchema. This should be avoided as we expect more 
> components to start using the record reader mechanism.
> This JIRA proposes refactoring the current API to allow accessing RecordReaders 
> without needing an incoming FlowFile.
> Additionally, since there is a Schema Access Strategy that requires an incoming 
> FlowFile containing attribute values to access the schema registry, it'd be 
> useful if we could tell the user, when such a RecordReader is specified, that it 
> can't be used.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] nifi issue #1877: NIFI-4004: Use RecordReaderFactory without FlowFile.

2017-08-28 Thread ijokarumawak
Github user ijokarumawak commented on the issue:

https://github.com/apache/nifi/pull/1877
  
@markap14 Thanks for pointing that out. I think I got misled by the two 
getSchema methods in SchemaRegistryService. I accidentally removed the 2nd form 
during refactoring, and that caused the unnecessary InputStream to sneak into 
RecordSetWriterFactory.

```
// This one had been used by readers
public final RecordSchema getSchema(final FlowFile flowFile, final 
InputStream contentStream, final RecordSchema readSchema) throws 
SchemaNotFoundException, IOException {
// This one had been used by writers
public RecordSchema getSchema(final FlowFile flowFile, final 
RecordSchema readSchema) throws SchemaNotFoundException, IOException {
```

I pushed another commit to bring back the 2nd signature above and change 
FlowFile to Map. It compiled and contrib-check passed locally. 
I also tested different readers/writers with a live flow. Thanks again for taking 
the time to review!


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---