[GitHub] nifi issue #496: NIFI-1965 - Implement QueryDNS Processor

2016-06-11 Thread trixpan
Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/496
  
The last commit added a custom validator to ensure a parser regex is supplied whenever QUERY_PARSER != NONE.

With that commit, this PR should be ready for review.

Please let me know if you have any questions or suggestions.
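For readers following along, the rule the validator enforces can be sketched in plain Java, outside the NiFi API (the method shape and message text below are illustrative only, not the actual QueryDNS code):

```java
// Standalone sketch of the validation rule described above:
// a parser regex is required whenever the query parser is not NONE.
final class ParserConfigCheck {
    // Returns an error message when the configuration is invalid, null otherwise.
    static String validate(final String queryParser, final String parserRegex) {
        if (!"NONE".equals(queryParser) && (parserRegex == null || parserRegex.trim().isEmpty())) {
            return "Parser RegEx must be set when Query Parser is not NONE";
        }
        return null; // valid configuration
    }
}
```

In NiFi itself this kind of check would live in the processor's `customValidate` override, returning `ValidationResult` objects instead of plain strings.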


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi issue #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-06-11 Thread mans2singh
Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/502
  
@pvillard31 - I've added provenance reporting and resolved the merge conflict. Please let me know your thoughts.




[GitHub] nifi pull request #479: NIFI-1937 GetHTTP configurable redirect cookie polic...

2016-06-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/479




[GitHub] nifi pull request #479: NIFI-1937 GetHTTP configurable redirect cookie polic...

2016-06-11 Thread mosermw
Github user mosermw commented on a diff in the pull request:

https://github.com/apache/nifi/pull/479#discussion_r66706268
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetHTTP.java ---
@@ -197,6 +198,30 @@
             .addValidator(StandardValidators.PORT_VALIDATOR)
             .build();
 
+    public static final String DEFAULT_COOKIE_POLICY_STR = "default";
+    public static final String STANDARD_COOKIE_POLICY_STR = "standard";
+    public static final String STRICT_COOKIE_POLICY_STR = "strict";
+    public static final String NETSCAPE_COOKIE_POLICY_STR = "netscape";
+    public static final String IGNORE_COOKIE_POLICY_STR = "ignore";
+    public static final AllowableValue DEFAULT_COOKIE_POLICY = new AllowableValue(DEFAULT_COOKIE_POLICY_STR, DEFAULT_COOKIE_POLICY_STR,
+            "Default cookie policy that provides a higher degree of compatibility with common cookie management of popular HTTP agents for non-standard (Netscape style) cookies.");
+    public static final AllowableValue STANDARD_COOKIE_POLICY = new AllowableValue(STANDARD_COOKIE_POLICY_STR, STANDARD_COOKIE_POLICY_STR,
+            "RFC 6265 compliant cookie policy (interoperability profile).");
+    public static final AllowableValue STRICT_COOKIE_POLICY = new AllowableValue(STRICT_COOKIE_POLICY_STR, STRICT_COOKIE_POLICY_STR,
+            "RFC 6265 compliant cookie policy (strict profile).");
+    public static final AllowableValue NETSCAPE_COOKIE_POLICY = new AllowableValue(NETSCAPE_COOKIE_POLICY_STR, NETSCAPE_COOKIE_POLICY_STR,
+            "Netscape draft compliant cookie policy.");
+    public static final AllowableValue IGNORE_COOKIE_POLICY = new AllowableValue(IGNORE_COOKIE_POLICY_STR, IGNORE_COOKIE_POLICY_STR,
+            "A cookie policy that ignores cookies.");
+
+    public static final PropertyDescriptor REDIRECT_COOKIE_POLICY = new PropertyDescriptor.Builder()
+            .name("redirect-cookie-policy")
+            .displayName("Redirect Cookie Policy")
+            .description("When a HTTP server responds to a request with a redirect, this is the cookie policy used to copy cookies to the following request.")
+            .allowableValues(DEFAULT_COOKIE_POLICY, STANDARD_COOKIE_POLICY, STRICT_COOKIE_POLICY, NETSCAPE_COOKIE_POLICY, IGNORE_COOKIE_POLICY)
+            .defaultValue(DEFAULT_COOKIE_POLICY_STR)
--- End diff --

If you have the time, please go ahead.




[GitHub] nifi pull request #479: NIFI-1937 GetHTTP configurable redirect cookie polic...

2016-06-11 Thread trkurc
Github user trkurc commented on a diff in the pull request:

https://github.com/apache/nifi/pull/479#discussion_r66706187
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetHTTP.java ---
@@ -197,6 +198,30 @@
             .addValidator(StandardValidators.PORT_VALIDATOR)
             .build();
 
+    public static final String DEFAULT_COOKIE_POLICY_STR = "default";
+    public static final String STANDARD_COOKIE_POLICY_STR = "standard";
+    public static final String STRICT_COOKIE_POLICY_STR = "strict";
+    public static final String NETSCAPE_COOKIE_POLICY_STR = "netscape";
+    public static final String IGNORE_COOKIE_POLICY_STR = "ignore";
+    public static final AllowableValue DEFAULT_COOKIE_POLICY = new AllowableValue(DEFAULT_COOKIE_POLICY_STR, DEFAULT_COOKIE_POLICY_STR,
+            "Default cookie policy that provides a higher degree of compatibility with common cookie management of popular HTTP agents for non-standard (Netscape style) cookies.");
+    public static final AllowableValue STANDARD_COOKIE_POLICY = new AllowableValue(STANDARD_COOKIE_POLICY_STR, STANDARD_COOKIE_POLICY_STR,
+            "RFC 6265 compliant cookie policy (interoperability profile).");
+    public static final AllowableValue STRICT_COOKIE_POLICY = new AllowableValue(STRICT_COOKIE_POLICY_STR, STRICT_COOKIE_POLICY_STR,
+            "RFC 6265 compliant cookie policy (strict profile).");
+    public static final AllowableValue NETSCAPE_COOKIE_POLICY = new AllowableValue(NETSCAPE_COOKIE_POLICY_STR, NETSCAPE_COOKIE_POLICY_STR,
+            "Netscape draft compliant cookie policy.");
+    public static final AllowableValue IGNORE_COOKIE_POLICY = new AllowableValue(IGNORE_COOKIE_POLICY_STR, IGNORE_COOKIE_POLICY_STR,
+            "A cookie policy that ignores cookies.");
+
+    public static final PropertyDescriptor REDIRECT_COOKIE_POLICY = new PropertyDescriptor.Builder()
+            .name("redirect-cookie-policy")
+            .displayName("Redirect Cookie Policy")
+            .description("When a HTTP server responds to a request with a redirect, this is the cookie policy used to copy cookies to the following request.")
+            .allowableValues(DEFAULT_COOKIE_POLICY, STANDARD_COOKIE_POLICY, STRICT_COOKIE_POLICY, NETSCAPE_COOKIE_POLICY, IGNORE_COOKIE_POLICY)
+            .defaultValue(DEFAULT_COOKIE_POLICY_STR)
--- End diff --

Okay, based on that description I am +1 on the PR. Would you like to merge, or shall I?




[GitHub] nifi pull request #479: NIFI-1937 GetHTTP configurable redirect cookie polic...

2016-06-11 Thread mosermw
Github user mosermw commented on a diff in the pull request:

https://github.com/apache/nifi/pull/479#discussion_r66706057
  
--- Diff: nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetHTTP.java ---
@@ -197,6 +198,30 @@
             .addValidator(StandardValidators.PORT_VALIDATOR)
             .build();
 
+    public static final String DEFAULT_COOKIE_POLICY_STR = "default";
+    public static final String STANDARD_COOKIE_POLICY_STR = "standard";
+    public static final String STRICT_COOKIE_POLICY_STR = "strict";
+    public static final String NETSCAPE_COOKIE_POLICY_STR = "netscape";
+    public static final String IGNORE_COOKIE_POLICY_STR = "ignore";
+    public static final AllowableValue DEFAULT_COOKIE_POLICY = new AllowableValue(DEFAULT_COOKIE_POLICY_STR, DEFAULT_COOKIE_POLICY_STR,
+            "Default cookie policy that provides a higher degree of compatibility with common cookie management of popular HTTP agents for non-standard (Netscape style) cookies.");
+    public static final AllowableValue STANDARD_COOKIE_POLICY = new AllowableValue(STANDARD_COOKIE_POLICY_STR, STANDARD_COOKIE_POLICY_STR,
+            "RFC 6265 compliant cookie policy (interoperability profile).");
+    public static final AllowableValue STRICT_COOKIE_POLICY = new AllowableValue(STRICT_COOKIE_POLICY_STR, STRICT_COOKIE_POLICY_STR,
+            "RFC 6265 compliant cookie policy (strict profile).");
+    public static final AllowableValue NETSCAPE_COOKIE_POLICY = new AllowableValue(NETSCAPE_COOKIE_POLICY_STR, NETSCAPE_COOKIE_POLICY_STR,
+            "Netscape draft compliant cookie policy.");
+    public static final AllowableValue IGNORE_COOKIE_POLICY = new AllowableValue(IGNORE_COOKIE_POLICY_STR, IGNORE_COOKIE_POLICY_STR,
+            "A cookie policy that ignores cookies.");
+
+    public static final PropertyDescriptor REDIRECT_COOKIE_POLICY = new PropertyDescriptor.Builder()
+            .name("redirect-cookie-policy")
+            .displayName("Redirect Cookie Policy")
+            .description("When a HTTP server responds to a request with a redirect, this is the cookie policy used to copy cookies to the following request.")
+            .allowableValues(DEFAULT_COOKIE_POLICY, STANDARD_COOKIE_POLICY, STRICT_COOKIE_POLICY, NETSCAPE_COOKIE_POLICY, IGNORE_COOKIE_POLICY)
+            .defaultValue(DEFAULT_COOKIE_POLICY_STR)
--- End diff --

@trkurc thanks for reviewing. Before version 0.6.0 we didn't specify a cookie spec, so it was DEFAULT. In 0.6.0 I changed it to CookieSpecs.STANDARD, thinking this would increase compatibility with more web sites. When I found that was not true, I suggested via this PR that we make the cookie spec configurable in the processor. So using .defaultValue(DEFAULT_COOKIE_POLICY_STR) here takes us back to the default value pre 0.6.0.

My hope is that the cookie policy rarely matters to GetHTTP at all, so reverting to the pre-0.6.0 behavior by default should be acceptable, if not the desired behavior.
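For context, the five allowable values presumably select among Apache HttpClient's CookieSpecs constants. The sketch below illustrates that mapping using plain strings so it stands alone; the exact correspondence is an assumption on my part, not quoted from the PR:

```java
// Illustrative mapping from the processor's allowable values to the
// Apache HttpClient cookie spec names they presumably select
// (CookieSpecs.DEFAULT, STANDARD, STANDARD_STRICT, NETSCAPE, IGNORE_COOKIES).
// Plain strings are used here so the sketch compiles without HttpClient.
final class CookiePolicyMapping {
    static String toHttpClientSpec(final String policy) {
        switch (policy) {
            case "default":  return "default";          // CookieSpecs.DEFAULT
            case "standard": return "standard";         // CookieSpecs.STANDARD (RFC 6265, interoperability)
            case "strict":   return "standard-strict";  // CookieSpecs.STANDARD_STRICT (RFC 6265, strict)
            case "netscape": return "netscape";         // CookieSpecs.NETSCAPE
            case "ignore":   return "ignoreCookies";    // CookieSpecs.IGNORE_COOKIES
            default: throw new IllegalArgumentException("Unknown cookie policy: " + policy);
        }
    }
}
```

In the actual processor, the selected spec name would presumably be handed to HttpClient via `RequestConfig.Builder#setCookieSpec`.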




[GitHub] nifi issue #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-06-11 Thread mans2singh
Github user mans2singh commented on the issue:

https://github.com/apache/nifi/pull/502
  
@pvillard31 - I messed up the merge and may have to recreate the branch. Let me fix that, and then you can review it.
Thanks




[GitHub] nifi issue #502: Nifi-1972 Apache Ignite Put Cache Processor

2016-06-11 Thread pvillard31
Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/502
  
Thanks @mans2singh! I'll have a look ASAP (probably on Sunday or next week). Regarding the provenance event, I think we could have a SEND event for each flow file successfully sent to Ignite. Thoughts?

Note: it seems that your PR has conflicts against master.




[GitHub] nifi issue #493: NIFI-1037 Created processor that handles HDFS' inotify even...

2016-06-11 Thread pvillard31
Github user pvillard31 commented on the issue:

https://github.com/apache/nifi/pull/493
  
@jjmeyer0 Yes, you're absolutely right. I was not suggesting having the three options together, but rather thinking about what could be done to give more flexibility. I think there is a processor that combines both regular expressions and the Expression Language, but I cannot remember which one. Anyway, I'll have another look on Sunday or next week, but it looks good to me!




[GitHub] nifi issue #475: - Add Maven profile to compile nifi-hadoop-libraries-nar us...

2016-06-11 Thread trixpan
Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/475
  
@mattyb149

It should be working.

Let me try a build using a brand new MapR cluster + client environment and test. I suspect it may be a misconfiguration on your side, or something connected to customisations previously introduced on my system.




[GitHub] nifi issue #475: - Add Maven profile to compile nifi-hadoop-libraries-nar us...

2016-06-11 Thread trixpan
Github user trixpan commented on the issue:

https://github.com/apache/nifi/pull/475
  
@joewitt, as a user/proto-developer I would be happy with any approach that results in workable binaries covering a particular flavour / supported platform without having to commit a patch every time I pull from git.

I agree that perhaps settings.xml would be enough, but I generally think the less we fiddle with wider settings (settings.xml, for example) the better.

I also thought a profile would be better than writing documentation articles, because I personally believe code is generally more likely to be looked after than documentation articles.

Maybe it's just me as a total Java and Maven newbie, but nothing beats the simplicity of `-Phadoop_flavour=cdh5`.

I fully agree we must ensure licensing is properly taken care of. Having said that, I have always been under the impression that, unlike the GPL, the ASL does not impose restrictions around *linking* non-ASF code. So, hypothetically speaking, I suspect we could even go to the extreme length of releasing binaries linking to non-ASL code as long as the foreign code licenses are respected (e.g. ASL software does not exclude GPL licensed code; it is the [GPL - through its terms - that excludes linking by ASL licensed code](http://www.apache.org/licenses/GPL-compatibility.html)).

Having said that, given the [presence of MapR hadoop related code on github](https://github.com/mapr/hadoop-common/blob/release-2.7.0-mapr-1506/hadoop-hdfs-project/hadoop-hdfs/pom.xml), I suspect their hadoop artifacts are released under ASL 2.0, but perhaps one of theirs, like @tdunning, can help shed some light.

But back to the profile:

The reason I ended up trying the profile approach is that Spark refers directly to Cloudera's and MapR's repositories in its main [pom.xml](https://github.com/apache/spark/blob/branch-1.6/pom.xml#L285). This led me to conclude (perhaps incorrectly) that it would be OK to have a pom pointing to a particular set of artifacts, as long as the binary produced by the formal release does not break ASF or foreign licensing restrictions.

To be honest, Spark's approach is even simpler than using profiles: their pom.xml includes all repos enabled by default and [lets the user specify the hadoop version as](http://spark.apache.org/docs/latest/building-spark.html#specifying-the-hadoop-version):

```
# Cloudera CDH 4.2.0 with MapReduce v1
mvn -Dhadoop.version=2.0.0-mr1-cdh4.2.0 -Phadoop-1 -DskipTests clean package
```

This smartly plays on the fact that while vendors must respect the artifact ID, they tend to distinguish their supported code by embedding their names into the software version (e.g. 2.0.0-mr1-cdh4.2.0, hadoop-hdfs-2.x-mapr-1506, etc.).
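As a side note, Maven's `-P` flag selects profiles by id and cannot carry a `name=value` pair, so the `-Phadoop_flavour=cdh5` spelling would in practice become a `-D` property that activates a profile. A hypothetical sketch of what such a profile might look like (profile id, property name, version, and repository URL are all illustrative, not taken from this PR):

```xml
<!-- Hypothetical sketch: activate with `mvn -Dhadoop.flavour=cdh5 ...` -->
<profiles>
  <profile>
    <id>hadoop-cdh5</id>
    <activation>
      <property>
        <name>hadoop.flavour</name>
        <value>cdh5</value>
      </property>
    </activation>
    <properties>
      <!-- Vendor-specific artifact version; value is illustrative only -->
      <hadoop.version>2.6.0-cdh5.7.0</hadoop.version>
    </properties>
    <repositories>
      <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
      </repository>
    </repositories>
  </profile>
</profiles>
```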

Yet, given that supporting Spark's approach would require changes to the main pom.xml, I decided to keep the scope of changes minimal, changing only the `hadoop-libraries-nar` pom and hence reducing the potential for changes to spill beyond what is planned/needed.

I hope this makes my way of thinking a bit clearer.

Please let me know your preference and I will be happy to adjust.

