Re: Nifi language support

2016-01-27 Thread Andy LoPresto
Federico,

That would be awesome; thanks for volunteering. I think the first step would be 
to raise a Jira ticket for internationalization [1]. Right now there are a lot 
of error messages and log statements that are not set up for 
internationalization, but I think the UI would be the first area to tackle.

[1] https://issues.apache.org/jira/secure/CreateIssue!default.jspa
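As a concrete illustration of the gap described above, a hard-coded message can be externalized into per-locale patterns looked up by key. This is only a hypothetical sketch (the class name, message key, and wording are invented, not taken from the NiFi codebase), using plain java.text.MessageFormat rather than any particular framework:

```java
import java.text.MessageFormat;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;

// Hypothetical sketch: per-locale message patterns looked up by key,
// instead of a hard-coded English string at the log/error call site.
public class I18nSketch {
    private static final Map<String, String> EN = new HashMap<>();
    private static final Map<String, String> ES = new HashMap<>();
    static {
        // Invented key and wording, for illustration only.
        EN.put("flowfile.penalized", "FlowFile {0} was penalized for {1} ms");
        ES.put("flowfile.penalized", "El FlowFile {0} fue penalizado por {1} ms");
    }

    public static String message(Locale locale, String key, Object... args) {
        // Fall back to English when no bundle exists for the locale.
        Map<String, String> bundle = "es".equals(locale.getLanguage()) ? ES : EN;
        return MessageFormat.format(bundle.get(key), args);
    }

    public static void main(String[] args) {
        System.out.println(message(new Locale("es"), "flowfile.penalized", "ff-42", "30000"));
    }
}
```

A real patch would presumably use ResourceBundle properties files per language, so a Spanish translation becomes just another `.properties` file; but the lookup-by-key structure is the part the UI and log statements currently lack.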

Andy LoPresto
alopresto.apa...@gmail.com
PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69

> On Jan 27, 2016, at 3:30 PM, Federico Leven  wrote:
> 
> Hi all
> 
> 
> 
> I would like to know whether there is any support for other languages, or how
> I can contribute a Spanish translation.
> 
> 
> 
> Regards
> 
> 
> 
> Federico
> 





Nifi language support

2016-01-27 Thread Federico Leven
Hi all

 

I would like to know whether there is any support for other languages, or how
I can contribute a Spanish translation.

 

Regards

 

Federico



Re: Nifi language support

2016-01-27 Thread Thad Guidry
Given the direction of NiFi's milestones and future plans, and not just for
i18n support but also for the other needs raised in this thread (error and
validation messages, property editors, etc.) plus features such as Natural
Templates (which allow prototyping without breaking anything!), I would highly
recommend Thymeleaf 3 as a template engine. It already supports HTML5 and
i18n, and it is Spring-supported. We also like it at Ericsson. :)

http://www.thymeleaf.org/

http://docs.spring.io/spring/docs/current/spring-framework-reference/html/view.html#view-thymeleaf

The URL below is a step-by-step guide to a JSP conversion process (your Java
code, JavaScript, and CSS can all remain unchanged). Old JSPs can remain,
simply moved into an /old_viewlayer folder, as this demonstrates:

 http://www.thymeleaf.org/doc/articles/petclinic.html

Thad
+ThadGuidry 


[GitHub] nifi pull request: NIFI-1107 Multipart Uploads

2016-01-27 Thread jskora
GitHub user jskora opened a pull request:

https://github.com/apache/nifi/pull/192

NIFI-1107 Multipart Uploads

Re-integrate Multipart Upload changes into PutS3Object.
1. add Multipart upload logic to allow resuming an upload after 
process/instance restart,
2. add local state management to track the part uploaded for a flowfile,
3. add configurable AWS S3 state management to abort orphaned uploads, and
4. adapt to IT test naming.
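Item 2 above (local state tracking the uploaded part for a flowfile) can be sketched as a small persistent map. This is a hypothetical illustration with an invented key layout and class name, not the PR's actual state format:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

// Hypothetical sketch: persist "uploadId/lastPartNumber" per flowfile id in a
// local properties file, so an upload can resume after a process restart.
public class MultipartState {
    private final Path stateFile;

    public MultipartState(Path stateFile) {
        this.stateFile = stateFile;
    }

    // Record the multipart upload id and the last part uploaded so far.
    public void record(String flowFileId, String uploadId, int lastPart) throws IOException {
        Properties p = load();
        p.setProperty(flowFileId, uploadId + "/" + lastPart);
        try (OutputStream out = Files.newOutputStream(stateFile)) {
            p.store(out, "multipart upload state");
        }
    }

    // Return the last uploaded part for a flowfile, or 0 if none recorded.
    public int lastPart(String flowFileId) throws IOException {
        String v = load().getProperty(flowFileId);
        return v == null ? 0 : Integer.parseInt(v.substring(v.lastIndexOf('/') + 1));
    }

    private Properties load() throws IOException {
        Properties p = new Properties();
        if (Files.exists(stateFile)) {
            try (InputStream in = Files.newInputStream(stateFile)) {
                p.load(in);
            }
        }
        return p;
    }
}
```

On restart, a processor could consult `lastPart(...)` before re-uploading, continuing from the next part instead of starting over.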

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jskora/nifi NIFI-1107-tkurc-squash

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/192.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #192


commit 0fa0a31a8a40d4417dcb99359a6f0e9941db32ae
Author: Joe Skora 
Date:   2016-01-21T19:42:52Z

Re-integrate Multipart Upload changes into PutS3Object.
1. add Multipart upload logic to allow resuming an upload after 
process/instance restart,
2. add local state management to track the part uploaded for a flowfile,
3. add configurable AWS S3 state management to abort orphaned uploads, and
4. adapt to IT test naming.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request: Nifi-Camel Integration

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/186#discussion_r51000926
  
--- Diff: 
nifi-nar-bundles/nifi-camel-bundle/nifi-camel-processors/src/main/java/org/apache/nifi/processors/camel/CamelProcessor.java
 ---
@@ -0,0 +1,240 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.camel;
+
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.ImmutableSet;
--- End diff --

Just to eliminate an extra dependency, it is preferable to rely on native Java 
classes when they provide the needed functionality, and in this case they do for 
both List and Set (see ```Collections.unmodifiable...```)
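The suggested substitution can be sketched as plain-JDK unmodifiable wrappers in place of Guava's ImmutableList/ImmutableSet; the field names and values here are illustrative, not the processor's actual fields:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative sketch: JDK Collections wrappers in place of Guava immutables.
public class UnmodifiableSketch {
    public static final List<String> RELATIONSHIPS =
            Collections.unmodifiableList(Arrays.asList("success", "failure"));
    public static final Set<String> TAGS =
            Collections.unmodifiableSet(new HashSet<>(Arrays.asList("camel", "route", "put")));

    public static void main(String[] args) {
        try {
            RELATIONSHIPS.add("retry"); // the wrapper rejects mutation
        } catch (UnsupportedOperationException expected) {
            System.out.println("unmodifiable, as expected");
        }
    }
}
```

One behavioral difference worth noting: the JDK wrappers are views over the backing collection, whereas Guava's immutables are defensive copies; for private static final fields built once, as here, that distinction does not matter.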




[GitHub] nifi pull request: NIFI-1337: Add Riemann Reporting Task

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/188#discussion_r51007338
  
--- Diff: 
nifi-nar-bundles/nifi-riemann-bundle/nifi-riemann-reporting-task/src/main/java/org/apache/nifi/reporting/riemann/RiemannReportingTask.java
 ---
@@ -0,0 +1,244 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.reporting.riemann;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.AbstractReportingTask;
+import org.apache.nifi.reporting.Bulletin;
+import org.apache.nifi.reporting.BulletinQuery;
+import org.apache.nifi.reporting.ReportingContext;
+import org.apache.nifi.reporting.riemann.metrics.MetricsService;
+
+import com.aphyr.riemann.Proto;
+import com.aphyr.riemann.client.IPromise;
+import com.aphyr.riemann.client.RiemannClient;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+import com.yammer.metrics.core.VirtualMachineMetrics;
+
+@Tags({ "reporting", "riemann", "metrics" })
+@DynamicProperty(name = "Attribute Name", value = "Attribute Value", supportsExpressionLanguage = false,
+        description = "Additional attributes may be attached to the event by adding dynamic properties")
+@CapabilityDescription("Publish NiFi metrics to Riemann. These metrics include " + "JVM, Processor, and General Data Flow metrics. In addition, you " + "may also forward bulletin board messages.")
+public class RiemannReportingTask extends AbstractReportingTask {
+    public static final PropertyDescriptor RIEMANN_HOST = new PropertyDescriptor.Builder().name("Riemann Address").description("Hostname of Riemann server").required(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor RIEMANN_PORT = new PropertyDescriptor.Builder().name("Riemann Port").description("Port that Riemann is listening on").required(true).defaultValue("")
+            .addValidator(StandardValidators.PORT_VALIDATOR).build();
+    public static final PropertyDescriptor TRANSPORT_PROTOCOL = new PropertyDescriptor.Builder().name("Transport Protocol").description("Transport protocol to speak to Riemann in").required(true)
+            .allowableValues(new Transport[] { Transport.TCP, Transport.UDP }).defaultValue("TCP").build();
+    public static final PropertyDescriptor SERVICE_PREFIX = new PropertyDescriptor.Builder().name("Prefix for Service Name").description("Prefix to use when reporting to Riemann").defaultValue("nifi")
+            .required(true).addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor WRITE_TIMEOUT = new PropertyDescriptor.Builder().name("Timeout").description("Timeout in milliseconds when writing events to Riemann").required(true)
+            .defaultValue("500ms").addValidator(StandardValidators.TIME_PERIOD_VALIDATOR).build();
+    static final PropertyDescriptor HOSTNAME = new PropertyDescriptor.Builder().name("Hostname").description("The Hostname of this NiFi instance to report to Riemann").required(true)
+            .expressionLanguageSupported(true).defaultValue("${hostname(true)}").addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    static final PropertyDescriptor TAGS = new 

[GitHub] nifi pull request: NIFI-1337: Add Riemann Reporting Task

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/188#discussion_r51010766
  
--- Diff: 
nifi-nar-bundles/nifi-riemann-bundle/nifi-riemann-reporting-task/src/main/java/org/apache/nifi/reporting/riemann/RiemannReportingTask.java
 ---
@@ -0,0 +1,244 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.reporting.riemann;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.AbstractReportingTask;
+import org.apache.nifi.reporting.Bulletin;
+import org.apache.nifi.reporting.BulletinQuery;
+import org.apache.nifi.reporting.ReportingContext;
+import org.apache.nifi.reporting.riemann.metrics.MetricsService;
+
+import com.aphyr.riemann.Proto;
+import com.aphyr.riemann.client.IPromise;
+import com.aphyr.riemann.client.RiemannClient;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+import com.yammer.metrics.core.VirtualMachineMetrics;
+
+@Tags({ "reporting", "riemann", "metrics" })
+@DynamicProperty(name = "Attribute Name", value = "Attribute Value", supportsExpressionLanguage = false,
+        description = "Additional attributes may be attached to the event by adding dynamic properties")
+@CapabilityDescription("Publish NiFi metrics to Riemann. These metrics include " + "JVM, Processor, and General Data Flow metrics. In addition, you " + "may also forward bulletin board messages.")
+public class RiemannReportingTask extends AbstractReportingTask {
+    public static final PropertyDescriptor RIEMANN_HOST = new PropertyDescriptor.Builder().name("Riemann Address").description("Hostname of Riemann server").required(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor RIEMANN_PORT = new PropertyDescriptor.Builder().name("Riemann Port").description("Port that Riemann is listening on").required(true).defaultValue("")
+            .addValidator(StandardValidators.PORT_VALIDATOR).build();
+    public static final PropertyDescriptor TRANSPORT_PROTOCOL = new PropertyDescriptor.Builder().name("Transport Protocol").description("Transport protocol to speak to Riemann in").required(true)
+            .allowableValues(new Transport[] { Transport.TCP, Transport.UDP }).defaultValue("TCP").build();
+    public static final PropertyDescriptor SERVICE_PREFIX = new PropertyDescriptor.Builder().name("Prefix for Service Name").description("Prefix to use when reporting to Riemann").defaultValue("nifi")
+            .required(true).addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor WRITE_TIMEOUT = new PropertyDescriptor.Builder().name("Timeout").description("Timeout in milliseconds when writing events to Riemann").required(true)
+            .defaultValue("500ms").addValidator(StandardValidators.TIME_PERIOD_VALIDATOR).build();
+    static final PropertyDescriptor HOSTNAME = new PropertyDescriptor.Builder().name("Hostname").description("The Hostname of this NiFi instance to report to Riemann").required(true)
+            .expressionLanguageSupported(true).defaultValue("${hostname(true)}").addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    static final PropertyDescriptor TAGS = new 

[GitHub] nifi pull request: Nifi-Camel Integration

2016-01-27 Thread PuspenduBanerjee
Github user PuspenduBanerjee commented on the pull request:

https://github.com/apache/nifi/pull/186#issuecomment-175688480
  
Hello,
Now my build is failing in Travis CI, whereas it runs fine locally.
Can anyone please help?

[INFO] BUILD FAILURE
[INFO] 

[INFO] Total time: 01:07 min
[INFO] Finished at: 2016-01-27T06:50:28+00:00
[INFO] Final Memory: 150M/420M
[INFO] 

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project nifi-kite-processors: Failed to resolve dependencies for one or more projects in the reactor. Reason: Missing:
[ERROR] --
[ERROR] 1) org.apache.hive:hive-serde:jar:0.12.0-cdh5.0.0
[ERROR] 
[ERROR] Try downloading the file manually from the project website.
[ERROR] 
[ERROR] Then, install it using the command:
[ERROR] mvn install:install-file -DgroupId=org.apache.hive -DartifactId=hive-serde -Dversion=0.12.0-cdh5.0.0 -Dpackaging=jar -Dfile=/path/to/file
[ERROR] 
[ERROR] Alternatively, if you host your own repository you can deploy the file there:
[ERROR] mvn deploy:deploy-file -DgroupId=org.apache.hive -DartifactId=hive-serde -Dversion=0.12.0-cdh5.0.0 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR] 
[ERROR] Path to dependency:
[ERROR] 1) org.apache.nifi:nifi-kite-processors:jar:0.4.2-SNAPSHOT
[ERROR] 2) org.apache.hive.hcatalog:hive-hcatalog-core:jar:1.2.0
[ERROR] 3) org.apache.hive:hive-metastore:jar:1.2.0




[GitHub] nifi pull request: Nifi-Camel Integration

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/186#discussion_r51003529
  
--- Diff: 
nifi-nar-bundles/nifi-camel-bundle/nifi-camel-processors/src/main/java/org/apache/nifi/processors/camel/CamelProcessor.java
 ---
@@ -0,0 +1,240 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.camel;
+
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.ImmutableSet;
+import groovy.lang.GroovyClassLoader;
+
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.camel.CamelContext;
+import org.apache.camel.Exchange;
+import org.apache.camel.ProducerTemplate;
+import org.apache.camel.ServiceStatus;
+import org.apache.camel.impl.DefaultExchange;
+import org.apache.camel.impl.DefaultShutdownStrategy;
+import org.apache.camel.spi.ShutdownStrategy;
+import org.apache.camel.spring.SpringCamelContext;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.commons.lang3.math.NumberUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.springframework.context.ApplicationContext;
+import org.springframework.context.support.AbstractApplicationContext;
+import org.springframework.context.support.ClassPathXmlApplicationContext;
+import org.springframework.context.support.FileSystemXmlApplicationContext;
+import org.springframework.context.support.GenericXmlApplicationContext;
+import org.springframework.core.io.ByteArrayResource;
+
+/**
+ * This processor runs a Camel Route.
+ */
+@Tags({"camel", "route", "put"})
+@InputRequirement(Requirement.INPUT_ALLOWED)
+@CapabilityDescription("Runs a Camel Route. Each input FlowFile is converted into a Camel Exchange "
+        + "for processing by configured Route. It exports ProcessSession to camel exchange header 'nifiSession'")
+public class CamelProcessor extends AbstractProcessor {
+
+    protected static final Relationship SUCCESS = new Relationship.Builder().name("success")
+            .description("Camel Route has Executed Successfully").build();
+
+    protected static final Relationship FAILURE = new Relationship.Builder().name("failure")
+            .description("Camel Route has Failed to Execute").build();
+
+    public static final PropertyDescriptor CAMEL_SPRING_CONTEXT_FILE_PATH = new PropertyDescriptor.Builder()
+            .name("Camel Spring Config File Path")
+            .description("The Classpath where NiFi can find Spring Application context file"
+                    + " Ex: classpath:/META-INF/camel-application-context.xml")
+            .defaultValue("classpath:/META-INF/camel-application-context.xml").required(true).addValidator(Validator.VALID)
+            .build();
+
+    public static final PropertyDescriptor CAMEL_SPRING_CONTEXT_DEF = new PropertyDescriptor.Builder()
+            .name("Camel Spring Context Definition")
+            .description("Content of Spring Application context ")

[GitHub] nifi pull request: NIFI-1337: Add Riemann Reporting Task

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/188#discussion_r51005574
  
--- Diff: 
nifi-nar-bundles/nifi-riemann-bundle/nifi-riemann-reporting-task/src/main/java/org/apache/nifi/reporting/riemann/RiemannReportingTask.java
 ---
@@ -0,0 +1,244 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.reporting.riemann;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.AbstractReportingTask;
+import org.apache.nifi.reporting.Bulletin;
+import org.apache.nifi.reporting.BulletinQuery;
+import org.apache.nifi.reporting.ReportingContext;
+import org.apache.nifi.reporting.riemann.metrics.MetricsService;
+
+import com.aphyr.riemann.Proto;
+import com.aphyr.riemann.client.IPromise;
+import com.aphyr.riemann.client.RiemannClient;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+import com.yammer.metrics.core.VirtualMachineMetrics;
+
+@Tags({ "reporting", "riemann", "metrics" })
+@DynamicProperty(name = "Attribute Name", value = "Attribute Value", supportsExpressionLanguage = false,
+        description = "Additional attributes may be attached to the event by adding dynamic properties")
+@CapabilityDescription("Publish NiFi metrics to Riemann. These metrics include " + "JVM, Processor, and General Data Flow metrics. In addition, you " + "may also forward bulletin board messages.")
+public class RiemannReportingTask extends AbstractReportingTask {
+    public static final PropertyDescriptor RIEMANN_HOST = new PropertyDescriptor.Builder().name("Riemann Address").description("Hostname of Riemann server").required(true)
+            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor RIEMANN_PORT = new PropertyDescriptor.Builder().name("Riemann Port").description("Port that Riemann is listening on").required(true).defaultValue("")
+            .addValidator(StandardValidators.PORT_VALIDATOR).build();
+    public static final PropertyDescriptor TRANSPORT_PROTOCOL = new PropertyDescriptor.Builder().name("Transport Protocol").description("Transport protocol to speak to Riemann in").required(true)
+            .allowableValues(new Transport[] { Transport.TCP, Transport.UDP }).defaultValue("TCP").build();
+    public static final PropertyDescriptor SERVICE_PREFIX = new PropertyDescriptor.Builder().name("Prefix for Service Name").description("Prefix to use when reporting to Riemann").defaultValue("nifi")
+            .required(true).addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor WRITE_TIMEOUT = new PropertyDescriptor.Builder().name("Timeout").description("Timeout in milliseconds when writing events to Riemann").required(true)
+            .defaultValue("500ms").addValidator(StandardValidators.TIME_PERIOD_VALIDATOR).build();
+    static final PropertyDescriptor HOSTNAME = new PropertyDescriptor.Builder().name("Hostname").description("The Hostname of this NiFi instance to report to Riemann").required(true)
--- End diff --

Just curious if there is a reason for these properties not being public? 
More of a consistency question, no big deal.



[GitHub] nifi pull request: Nifi-Camel Integration

2016-01-27 Thread PuspenduBanerjee
Github user PuspenduBanerjee commented on a diff in the pull request:

https://github.com/apache/nifi/pull/186#discussion_r51009342
  
--- Diff: 
nifi-nar-bundles/nifi-camel-bundle/nifi-camel-processors/src/main/java/org/apache/nifi/processors/camel/CamelProcessor.java
 ---
@@ -0,0 +1,240 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.camel;
+
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.ImmutableSet;
+import groovy.lang.GroovyClassLoader;
+
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.camel.CamelContext;
+import org.apache.camel.Exchange;
+import org.apache.camel.ProducerTemplate;
+import org.apache.camel.ServiceStatus;
+import org.apache.camel.impl.DefaultExchange;
+import org.apache.camel.impl.DefaultShutdownStrategy;
+import org.apache.camel.spi.ShutdownStrategy;
+import org.apache.camel.spring.SpringCamelContext;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.commons.lang3.math.NumberUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.springframework.context.ApplicationContext;
+import org.springframework.context.support.AbstractApplicationContext;
+import org.springframework.context.support.ClassPathXmlApplicationContext;
+import org.springframework.context.support.FileSystemXmlApplicationContext;
+import org.springframework.context.support.GenericXmlApplicationContext;
+import org.springframework.core.io.ByteArrayResource;
+
+/**
+ * This processor runs a Camel Route.
+ */
+@Tags({"camel", "route", "put"})
+@InputRequirement(Requirement.INPUT_ALLOWED)
+@CapabilityDescription("Runs a Camel Route. Each input FlowFile is converted into a Camel Exchange "
+        + "for processing by configured Route. It exports ProcessSession to camel exchange header 'nifiSession'")
+public class CamelProcessor extends AbstractProcessor {
+
+    protected static final Relationship SUCCESS = new Relationship.Builder().name("success")
+            .description("Camel Route has Executed Successfully").build();
+
+    protected static final Relationship FAILURE = new Relationship.Builder().name("failure")
+            .description("Camel Route has Failed to Execute").build();
+
+    public static final PropertyDescriptor CAMEL_SPRING_CONTEXT_FILE_PATH = new PropertyDescriptor.Builder()
+            .name("Camel Spring Config File Path")
+            .description("The Classpath where NiFi can find Spring Application context file"
+                    + " Ex: classpath:/META-INF/camel-application-context.xml")
+            .defaultValue("classpath:/META-INF/camel-application-context.xml").required(true).addValidator(Validator.VALID)
+            .build();
+
+    public static final PropertyDescriptor CAMEL_SPRING_CONTEXT_DEF = new PropertyDescriptor.Builder()
+            .name("Camel Spring Context Definition")
+            .description("Content of Spring Application 

[GitHub] nifi pull request: Nifi-Camel Integration

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/186#discussion_r51004665
  
--- Diff: 
nifi-nar-bundles/nifi-camel-bundle/nifi-camel-processors/src/main/java/org/apache/nifi/processors/camel/CamelProcessor.java
 ---
@@ -0,0 +1,240 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.processors.camel;
+
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.ImmutableSet;
+import groovy.lang.GroovyClassLoader;
+
+import java.util.List;
+import java.util.Set;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.camel.CamelContext;
+import org.apache.camel.Exchange;
+import org.apache.camel.ProducerTemplate;
+import org.apache.camel.ServiceStatus;
+import org.apache.camel.impl.DefaultExchange;
+import org.apache.camel.impl.DefaultShutdownStrategy;
+import org.apache.camel.spi.ShutdownStrategy;
+import org.apache.camel.spring.SpringCamelContext;
+import org.apache.commons.lang3.StringUtils;
+import org.apache.commons.lang3.math.NumberUtils;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.components.Validator;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.processor.AbstractProcessor;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.springframework.context.ApplicationContext;
+import org.springframework.context.support.AbstractApplicationContext;
+import org.springframework.context.support.ClassPathXmlApplicationContext;
+import org.springframework.context.support.FileSystemXmlApplicationContext;
+import org.springframework.context.support.GenericXmlApplicationContext;
+import org.springframework.core.io.ByteArrayResource;
+
+/**
+ * This processor runs a Camel Route.
+ */
+@Tags({"camel", "route", "put"})
+@InputRequirement(Requirement.INPUT_ALLOWED)
+@CapabilityDescription("Runs a Camel Route. Each input FlowFile is converted into a Camel Exchange "
+    + "for processing by configured Route. It exports ProcessSession to camel exchange header 'nifiSession'")
+public class CamelProcessor extends AbstractProcessor {
+
+    protected static final Relationship SUCCESS = new Relationship.Builder().name("success")
+        .description("Camel Route has Executed Successfully").build();
+
+    protected static final Relationship FAILURE = new Relationship.Builder().name("failure")
+        .description("Camel Route has Failed to Execute").build();
+
+    public static final PropertyDescriptor CAMEL_SPRING_CONTEXT_FILE_PATH = new PropertyDescriptor.Builder()
+        .name("Camel Spring Config File Path")
+        .description("The Classpath where NiFi can find Spring Application context file"
+            + " Ex: classpath:/META-INF/camel-application-context.xml")
+        .defaultValue("classpath:/META-INF/camel-application-context.xml").required(true).addValidator(Validator.VALID)
+        .build();
+
+    public static final PropertyDescriptor CAMEL_SPRING_CONTEXT_DEF = new PropertyDescriptor.Builder()
+        .name("Camel Spring Context Definition")
+        .description("Content of Spring Application context ")

[GitHub] nifi pull request: NIFI-1337: Add Riemann Reporting Task

2016-01-27 Thread rickysaltzer
Github user rickysaltzer commented on a diff in the pull request:

https://github.com/apache/nifi/pull/188#discussion_r51010255
  
--- Diff: 
nifi-nar-bundles/nifi-riemann-bundle/nifi-riemann-reporting-task/src/main/java/org/apache/nifi/reporting/riemann/RiemannReportingTask.java
 ---
@@ -0,0 +1,244 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.reporting.riemann;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.annotation.lifecycle.OnScheduled;
+import org.apache.nifi.annotation.lifecycle.OnStopped;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.reporting.AbstractReportingTask;
+import org.apache.nifi.reporting.Bulletin;
+import org.apache.nifi.reporting.BulletinQuery;
+import org.apache.nifi.reporting.ReportingContext;
+import org.apache.nifi.reporting.riemann.metrics.MetricsService;
+
+import com.aphyr.riemann.Proto;
+import com.aphyr.riemann.client.IPromise;
+import com.aphyr.riemann.client.RiemannClient;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+import com.yammer.metrics.core.VirtualMachineMetrics;
+
+@Tags({ "reporting", "riemann", "metrics" })
+@DynamicProperty(name = "Attribute Name", value = "Attribute Value", supportsExpressionLanguage = false,
+    description = "Additional attributes may be attached to the event by adding dynamic properties")
+@CapabilityDescription("Publish NiFi metrics to Riemann. These metrics include " + "JVM, Processor, and General Data Flow metrics. In addition, you may also forward bulletin " + "board messages.")
+public class RiemannReportingTask extends AbstractReportingTask {
+    public static final PropertyDescriptor RIEMANN_HOST = new PropertyDescriptor.Builder().name("Riemann Address").description("Hostname of Riemann server").required(true)
+        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor RIEMANN_PORT = new PropertyDescriptor.Builder().name("Riemann Port").description("Port that Riemann is listening on").required(true).defaultValue("")
+        .addValidator(StandardValidators.PORT_VALIDATOR).build();
+    public static final PropertyDescriptor TRANSPORT_PROTOCOL = new PropertyDescriptor.Builder().name("Transport Protocol").description("Transport protocol to speak to Riemann in").required(true)
+        .allowableValues(new Transport[] { Transport.TCP, Transport.UDP }).defaultValue("TCP").build();
+    public static final PropertyDescriptor SERVICE_PREFIX = new PropertyDescriptor.Builder().name("Prefix for Service Name").description("Prefix to use when reporting to Riemann").defaultValue("nifi")
+        .required(true).addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    public static final PropertyDescriptor WRITE_TIMEOUT = new PropertyDescriptor.Builder().name("Timeout").description("Timeout in milliseconds when writing events to Riemann").required(true)
+        .defaultValue("500ms").addValidator(StandardValidators.TIME_PERIOD_VALIDATOR).build();
+    static final PropertyDescriptor HOSTNAME = new PropertyDescriptor.Builder().name("Hostname").description("The Hostname of this NiFi instance to report to Riemann").required(true)
+        .expressionLanguageSupported(true).defaultValue("${hostname(true)}").addValidator(StandardValidators.NON_EMPTY_VALIDATOR).build();
+    static final PropertyDescriptor TAGS = new

[GitHub] nifi pull request: NIFI-1337: Add Riemann Reporting Task

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/188#discussion_r51006207
  
--- Diff: 
nifi-nar-bundles/nifi-riemann-bundle/nifi-riemann-reporting-task/src/main/java/org/apache/nifi/reporting/riemann/RiemannReportingTask.java
 ---

[GitHub] nifi pull request: NIFI-1337: Add Riemann Reporting Task

2016-01-27 Thread olegz
Github user olegz commented on a diff in the pull request:

https://github.com/apache/nifi/pull/188#discussion_r51008302
  
--- Diff: 
nifi-nar-bundles/nifi-riemann-bundle/nifi-riemann-reporting-task/src/main/java/org/apache/nifi/reporting/riemann/RiemannReportingTask.java
 ---

[GitHub] nifi pull request: NIFI-1333 fixed FlowController shutdown deadloc...

2016-01-27 Thread mcgilman
Github user mcgilman commented on the pull request:

https://github.com/apache/nifi/pull/148#issuecomment-175755889
  
Would just like to confirm with @trkurc before merging. Sounds like he may still be working through some of the details. If he's content, I'm happy to merge this in.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] nifi pull request: NIFI-1193: Add support for storing data in Hive...

2016-01-27 Thread rdblue
Github user rdblue commented on the pull request:

https://github.com/apache/nifi/pull/147#issuecomment-175765398
  
@apiri I've reopened this and will fix it in this PR. Thanks for rolling 
back the changes in master.




[GitHub] nifi pull request: Nifi-Camel Integration

2016-01-27 Thread PuspenduBanerjee
Github user PuspenduBanerjee commented on the pull request:

https://github.com/apache/nifi/pull/186#issuecomment-175781324
  
@apiri Thanks. It built correctly, apart from an intermittent heap issue. :+1: 




Re: Configuration Service

2016-01-27 Thread Joe Witt
Matt,

That is basically what we're talking about providing but it would be
transparent through the expression language capability.  This approach
will make templates far more portable than they are today.

Thanks
Joe

On Wed, Jan 27, 2016 at 11:41 AM, Angry Duck Studio
 wrote:
> Why can't we just have a built-in persistent k-v store controller service
> that any processor can use? Could even modify the expression language to
> interact with it. Maybe something like ${config:<key>} where 'config' refers
> to the built-in store.
>
> -Matt R
>
> On Wed, Jan 27, 2016 at 7:16 AM, Matt Gilman 
> wrote:
>
>> Simon,
>>
>> One idea that's been thrown around is adding a 'variable registry' [1]
>> where you could define variables at a group level that could be referenced
>> by the encapsulated components. Additionally, this would help with the
>> portability of templates when needing to define different values for
>> different environments.
>>
>> Matt
>>
>> [1] https://cwiki.apache.org/confluence/display/NIFI/Variable+Registry
>>
>> On Wed, Jan 27, 2016 at 4:30 AM, Simon Ball  wrote:
>>
>> > Other thoughts:
>> >
>> > This will have implications for broadcasting OnPropertyChanged events,
>> and
>> > potentially locking processors around the changes in properties held by
>> the
>> > service. I’d love to hear if anyone can think of any other significant
>> land
>> > mines, or has had any thoughts on anything similar.
>> >
>> > Simon
>> >
>> > On 27 Jan 2016, at 09:26, Simon Ball  > sb...@hortonworks.com>> wrote:
>> >
>> > One of the problems with complex flows is repetition of common
>> > configuration. Many people also want to be able to configure things like
>> > connection strings into an environment specific location outside of the
>> > Flow, and parameterise the flow. Things like the Kerberos|SSL|etc Context
>> > service help in part with this, but the problem could be generalised.
>> >
>> > What do people think of the idea of a configuration provider controller
>> > service, providing essentially the same concept as environment variables,
>> > which holds essentially a key value store which processors can refer to?
>> A
>> > simple way to refer to contents could be an extension on expression
>> > language providing a config(‘key’) function.
>> >
>> > What are your thoughts on this? Worth knocking up a quick prototype
>> > implementation (I’m happy to do so if there is community interest)?
>> >
>> > Simon
>> >
>> >
>>
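The config('key') idea floated in this thread can be sketched minimally: resolve ${config('key')} tokens in a property value against an environment-specific key/value store kept outside the flow. The names below (ConfigRegistry, resolve) are purely illustrative, not NiFi API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: an external key/value store whose entries can be
// referenced from property values via ${config('key')} tokens.
public class ConfigRegistry {
    // matches ${config('some.key')}
    private static final Pattern TOKEN = Pattern.compile("\\$\\{config\\('([^']+)'\\)\\}");

    private final Map<String, String> store = new HashMap<>();

    public void put(String key, String value) {
        store.put(key, value);
    }

    // Replaces each ${config('key')} token with the stored value;
    // unknown keys are left untouched so a misconfiguration stays visible.
    public String resolve(String input) {
        Matcher m = TOKEN.matcher(input);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String value = store.get(m.group(1));
            m.appendReplacement(out, Matcher.quoteReplacement(value != null ? value : m.group(0)));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        ConfigRegistry registry = new ConfigRegistry();
        registry.put("db.url", "jdbc:postgresql://prod-db:5432/nifi");
        // prints url=jdbc:postgresql://prod-db:5432/nifi
        System.out.println(registry.resolve("url=${config('db.url')}"));
    }
}
```

Swapping the backing map for a file or service per environment would then make the same flow.xml portable, which is the portability benefit discussed above.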


Re: Configuration Service

2016-01-27 Thread Simon Ball
That certainly covers a part of it, but since that is read at start time, I take it you couldn't have a property change on a flow in progress. If we had something like the variable registry, we could fire property change events on updates and change centralised properties at runtime. For example, failover of non-load-balanced endpoints, or just adjustments of other global variables that might feed a flow.

Simon


> On 27 Jan 2016, at 17:16, Bryan Bende  wrote:
> 
> FWIW you can externalize connection strings and other similar properties
> into nifi.properties and have your processors reference these environment
> variables through expression language. This way you can move the
> flow.xml.gz between two environments without changing anything after you
> setup the nifi.properties for each environment. The variable registry would
> just make this a much better user experience.
> 
>> On Wed, Jan 27, 2016 at 12:07 PM, Joe Witt  wrote:
>> 
>> Matt,
>> 
>> That is basically what we're talking about providing but it would be
>> transparent through the expression language capability.  This approach
>> will make templates far more portable than they are today.
>> 
>> Thanks
>> Joe
>> 
>> On Wed, Jan 27, 2016 at 11:41 AM, Angry Duck Studio
>>  wrote:
>>> Why can't we just have a built-in persistent k-v store controller service
>>> that any processor can use? Could even modify the expression language to
>>> interact with it. Maybe something like ${config:<key>} where 'config'
>>> refers
>>> to the built-in store.
>>> 
>>> -Matt R
>>> 
>>> On Wed, Jan 27, 2016 at 7:16 AM, Matt Gilman 
>>> wrote:
>>> 
 Simon,
 
 One idea that's been thrown around is adding a 'variable registry' [1]
 where you could define variables at a group level that could be
>> referenced
 by the encapsulated components. Additionally, this would help with the
 portability of templates when needing to define different values for
 different environments.
 
 Matt
 
 [1] https://cwiki.apache.org/confluence/display/NIFI/Variable+Registry
 
 On Wed, Jan 27, 2016 at 4:30 AM, Simon Ball 
>> wrote:
 
> Other thoughts:
> 
> This will have implications for broadcasting OnPropertyChanged events,
 and
> potentially locking processors around the changes in properties held
>> by
 the
> service. I’d love to hear if anyone can think of any other significant
 land
> mines, or has had any thoughts on anything similar.
> 
> Simon
> 
> On 27 Jan 2016, at 09:26, Simon Ball > wrote:
> 
> One of the problems with complex flows is repetition of common
> configuration. Many people also want to be able to configure things
>> like
> connection strings into an environment specific location outside of
>> the
> Flow, and parameterise the flow. Things like the Kerberos|SSL|etc
>> Context
> service help in part with this, but the problem could be generalised.
> 
> What do people think of the idea of a configuration provider
>> controller
> service, providing essentially the same concept as environment
>> variables,
> which holds essentially a key value store which processors can refer
>> to?
 A
> simple way to refer to contents could be an extension on expression
> language providing a config(‘key’) function.
> 
> What are your thoughts on this? Worth knocking up a quick prototype
> implementation (I’m happy to do so if there is community interest)?
> 
> Simon
>> 
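The runtime-update concern raised above (firing property change events when a centralised value changes) amounts to a small observer pattern. MutablePropertyStore and its listener shape below are hypothetical stand-ins for NiFi's onPropertyModified mechanism, not actual API:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.BiConsumer;

// Hedged sketch: a centralised property store that notifies registered
// components when a value actually changes, so a running flow can react
// (e.g. fail over to a new endpoint) without a restart.
public class MutablePropertyStore {
    private final Map<String, String> values = new HashMap<>();
    // each listener receives (key, newValue) when a property changes
    private final List<BiConsumer<String, String>> listeners = new ArrayList<>();

    public void addListener(BiConsumer<String, String> listener) {
        listeners.add(listener);
    }

    // Updates the value and fires a change event only if it actually changed.
    public void set(String key, String newValue) {
        String old = values.put(key, newValue);
        if (old == null || !old.equals(newValue)) {
            for (BiConsumer<String, String> l : listeners) {
                l.accept(key, newValue);
            }
        }
    }

    public String get(String key) {
        return values.get(key);
    }

    public static void main(String[] args) {
        MutablePropertyStore store = new MutablePropertyStore();
        store.addListener((key, value) -> System.out.println("changed: " + key + " -> " + value));
        store.set("endpoint", "host-a:9090");
        store.set("endpoint", "host-b:9090"); // simulated failover, fires a second event
    }
}
```

The locking question from the thread still applies: a real implementation would need to coordinate these callbacks with processors mid-onTrigger.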


[GitHub] nifi pull request: NIFI-1423 Allow to penalize FlowFiles to No Ret...

2016-01-27 Thread olegz
Github user olegz commented on the pull request:

https://github.com/apache/nifi/pull/183#issuecomment-175748677
  
+1




[GitHub] nifi pull request: NIFI-1333 fixed FlowController shutdown deadloc...

2016-01-27 Thread trkurc
Github user trkurc commented on the pull request:

https://github.com/apache/nifi/pull/148#issuecomment-175758188
  
Still working on it - I'll be done tonight. I can take care of the merge.




[GitHub] nifi pull request: NIFI-1333 fixed FlowController shutdown deadloc...

2016-01-27 Thread mcgilman
Github user mcgilman commented on the pull request:

https://github.com/apache/nifi/pull/148#issuecomment-175758510
  
Sounds good. Thanks!




Configuration Service

2016-01-27 Thread Simon Ball
One of the problems with complex flows is repetition of common configuration. 
Many people also want to be able to configure things like connection strings 
into an environment specific location outside of the Flow, and parameterise the 
flow. Things like the Kerberos|SSL|etc Context service help in part with this, 
but the problem could be generalised.

What do people think of the idea of a configuration provider controller 
service, providing essentially the same concept as environment variables, which 
holds essentially a key value store which processors can refer to? A simple way 
to refer to contents could be an extension on expression language providing a 
config(‘key’) function. 

What are your thoughts on this? Worth knocking up a quick prototype 
implementation (I’m happy to do so if there is community interest)?

Simon

Re: Configuration Service

2016-01-27 Thread Simon Ball
Other thoughts:

This will have implications for broadcasting OnPropertyChanged events, and 
potentially locking processors around the changes in properties held by the 
service. I’d love to hear if anyone can think of any other significant land 
mines, or has had any thoughts on anything similar.

Simon

On 27 Jan 2016, at 09:26, Simon Ball 
> wrote:

One of the problems with complex flows is repetition of common configuration. 
Many people also want to be able to configure things like connection strings 
into an environment specific location outside of the Flow, and parameterise the 
flow. Things like the Kerberos|SSL|etc Context service help in part with this, 
but the problem could be generalised.

What do people think of the idea of a configuration provider controller 
service, providing essentially the same concept as environment variables, which 
holds essentially a key value store which processors can refer to? A simple way 
to refer to contents could be an extension on expression language providing a 
config(‘key’) function.

What are your thoughts on this? Worth knocking up a quick prototype 
implementation (I’m happy to do so if there is community interest)?

Simon



[GitHub] nifi pull request: NIFI-1118 Update SplitText Processor - add supp...

2016-01-27 Thread markobean
Github user markobean commented on a diff in the pull request:

https://github.com/apache/nifi/pull/135#discussion_r50990975
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/SplitText.java
 ---
@@ -143,26 +165,12 @@ protected void init(final ProcessorInitializationContext context) {
         return properties;
     }
 
-    private int readLines(final InputStream in, final int maxNumLines, final OutputStream out, final boolean keepAllNewLines) throws IOException {
-        int numLines = 0;
-        for (int i = 0; i < maxNumLines; i++) {
-            final long bytes = countBytesToSplitPoint(in, out, keepAllNewLines || (i != maxNumLines - 1));
-            if (bytes <= 0) {
-                return numLines;
-            }
-
-            numLines++;
-        }
-
-        return numLines;
-    }
-
-    private long countBytesToSplitPoint(final InputStream in, final OutputStream out, final boolean includeLineDelimiter) throws IOException {
+    private int readLine(final InputStream in, final OutputStream out,
+                         final boolean includeLineDelimiter) throws IOException {
         int lastByte = -1;
-        long bytesRead = 0L;
+        int bytesRead = 0;
 
         while (true) {
-            in.mark(1);
--- End diff --

This "in.mark(1)" was removed because marking and resetting the input stream has changed. Previously, this mark/reset was used to roll back the reading of the character after a \r. (More below.) Now, mark/reset is used at a higher level to potentially roll back an entire line read from the input flowfile if that line exceeds the size limit imposed by the FRAGMENT_MAX_SIZE property.
Removing this in.mark() does not cause an IOException at the in.reset() below (line 206), because an in.mark() was made prior to the call to readLine() (line 239 or line 323). Nonetheless, the overall logic is still incorrect: there are two logical in.reset() conditions, character-based and line-based. This must be corrected and made consistent (line-based).
The special consideration of the \r character confuses me (the 'if' block beginning on line 205). In the original code, the byte after the \r is rolled back. Why? If the byte after \r is \n, the special consideration for \r is not required, as the 'if' on line 194 captures the end of line. Is it valid and intended to have \r indicate an end of line even when a subsequent \n is not present? (Note: testing of the present SplitText processor shows incorrect behavior for \r without \n; it duplicates the first character of the next "line".)
If special consideration for \r is not required and the 'if' block beginning on line 205 is removed, both the in.mark and in.reset within the readLine() method go away, and I believe all will be correct.
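The \r / \r\n / bare-\r cases under discussion can be demonstrated in isolation. The following is a hedged standalone sketch (not the SplitText code itself) that uses a one-byte pushback in place of in.mark(1)/in.reset(), so the byte read after a \r is not swallowed when it turns out to start the next line:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.PushbackInputStream;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;

// Illustrative only: splits in-memory bytes into lines, treating \n,
// \r\n, and a bare \r all as line terminators.
public class LineEndings {
    public static List<String> readLines(byte[] data) {
        try {
            PushbackInputStream in = new PushbackInputStream(new ByteArrayInputStream(data));
            List<String> lines = new ArrayList<>();
            StringBuilder line = new StringBuilder();
            int b;
            while ((b = in.read()) != -1) {
                if (b == '\n') {                    // plain \n ends the line
                    lines.add(line.toString());
                    line.setLength(0);
                } else if (b == '\r') {             // \r also ends the line...
                    int next = in.read();
                    if (next != '\n' && next != -1) {
                        in.unread(next);            // ...but push back a non-\n byte
                    }
                    lines.add(line.toString());
                    line.setLength(0);
                } else {
                    line.append((char) b);
                }
            }
            if (line.length() > 0) {
                lines.add(line.toString());         // trailing data without a delimiter
            }
            return lines;
        } catch (IOException e) {
            throw new UncheckedIOException(e);      // cannot happen for in-memory input
        }
    }

    public static void main(String[] args) {
        // prints [a, b, c, d]
        System.out.println(readLines("a\nb\r\nc\rd".getBytes()));
    }
}
```

Without the pushback (or an equivalent mark/reset), the byte after a bare \r would be consumed, which matches the first-character-duplication symptom described above.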




[GitHub] nifi pull request: NIFI-1118 Update SplitText Processor - add supp...

2016-01-27 Thread markobean
Github user markobean commented on a diff in the pull request:

https://github.com/apache/nifi/pull/135#discussion_r50989494
  
--- Diff: 
nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/SplitText.java
 ---
@@ -230,100 +270,112 @@ public void onTrigger(final ProcessContext context, final ProcessSession session
 final ProcessorLog logger = getLogger();
 final int headerCount = context.getProperty(HEADER_LINE_COUNT).asInteger();
 final int splitCount = context.getProperty(LINE_SPLIT_COUNT).asInteger();
+final double maxFragmentSize;
+if (context.getProperty(FRAGMENT_MAX_SIZE).isSet()) {
+maxFragmentSize = context.getProperty(FRAGMENT_MAX_SIZE).asDataSize(DataUnit.B);
+} else {
+maxFragmentSize = Integer.MAX_VALUE;
+}
+final String headerMarker = context.getProperty(HEADER_MARKER).getValue();
 final boolean removeTrailingNewlines = context.getProperty(REMOVE_TRAILING_NEWLINES).asBoolean();
-
 final ObjectHolder errorMessage = new ObjectHolder<>(null);
-final ArrayList splitInfos = new ArrayList<>();
-
 final long startNanos = System.nanoTime();
 final List splits = new ArrayList<>();
+
 session.read(flowFile, new InputStreamCallback() {
 @Override
 public void process(final InputStream rawIn) throws IOException {
 try (final BufferedInputStream bufferedIn = new BufferedInputStream(rawIn);
 final ByteCountingInputStream in = new ByteCountingInputStream(bufferedIn)) {
 
-// if we have header lines, copy them into a ByteArrayOutputStream
+// Identify header, if any
 final ByteArrayOutputStream headerStream = new ByteArrayOutputStream();
-final int headerLinesCopied = readLines(in, headerCount, headerStream, true);
-if (headerLinesCopied < headerCount) {
-errorMessage.set("Header Line Count is set to " + headerCount + " but file had only " + headerLinesCopied + " lines");
+final SplitInfo headerInfo = readHeader(headerCount, headerMarker, in, headerStream, true);
+if (headerInfo.lengthLines < headerCount) {
+errorMessage.set("Header Line Count is set to " + headerCount + " but file had only "
++ headerInfo.lengthLines + " lines");
 return;
 }
 
 while (true) {
-if (headerCount > 0) {
-// if we have header lines, create a new FlowFile, copy the header lines to that file,
-// and then start copying lines
-final IntegerHolder linesCopied = new IntegerHolder(0);
-FlowFile splitFile = session.create(flowFile);
-try {
-splitFile = session.write(splitFile, new OutputStreamCallback() {
-@Override
-public void process(final OutputStream rawOut) throws IOException {
-try (final BufferedOutputStream out = new BufferedOutputStream(rawOut)) {
+FlowFile splitFile = session.create(flowFile);
+final SplitInfo flowFileInfo = new SplitInfo();
+
+// if we have header lines, write them out
+// and then start copying lines
+try {
+splitFile = session.write(splitFile, new OutputStreamCallback() {
+@Override
+public void process(final OutputStream rawOut) throws IOException {
+try (final BufferedOutputStream out = new BufferedOutputStream(rawOut)) {
+long lineCount = 0;
+long byteCount = 0;
+// Process header
+if (headerInfo.lengthLines > 0) {
+flowFileInfo.lengthBytes = headerInfo.lengthBytes;
+byteCount = headerInfo.lengthBytes;
 headerStream.writeTo(out);
-linesCopied.set(readLines(in, splitCount, out, 

Re: Configuration Service

2016-01-27 Thread Matt Gilman
Simon,

One idea that's been thrown around is adding a 'variable registry' [1]
where you could define variables at a group level that could be referenced
by the encapsulated components. Additionally, this would help with the
portability of templates when needing to define different values for
different environments.

Matt

[1] https://cwiki.apache.org/confluence/display/NIFI/Variable+Registry

On Wed, Jan 27, 2016 at 4:30 AM, Simon Ball  wrote:

> Other thoughts:
>
> This will have implications for broadcasting OnPropertyChanged events, and
> potentially locking processors around the changes in properties held by the
> service. I’d love to hear if anyone can think of any other significant land
> mines, or has had any thoughts on anything similar.
>
> Simon
>
> On 27 Jan 2016, at 09:26, Simon Ball wrote:
>
> One of the problems with complex flows is repetition of common
> configuration. Many people also want to be able to keep things like
> connection strings in an environment-specific location outside of the
> flow, and parameterise the flow. Things like the Kerberos/SSL/etc. Context
> services help in part with this, but the problem could be generalised.
>
> What do people think of the idea of a configuration provider controller
> service, providing essentially the same concept as environment variables: a
> key/value store which processors can refer to? A simple way to refer to its
> contents could be an extension to the expression language providing a
> config('key') function.
>
> What are your thoughts on this? Worth knocking up a quick prototype
> implementation (I’m happy to do so if there is community interest)?
>
> Simon
>
>
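To make the proposal above concrete, here is a plain-Java sketch of the lookup semantics a config('key') expression-language function might have. The names (ConfigRegistry, lookup) are hypothetical and this is not NiFi API; it only illustrates registry entries taking precedence over real environment variables:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch of a flow-level key/value registry with fallback to
// the process environment. Illustrative only; not part of the NiFi API.
public class ConfigRegistry {
    private final Map<String, String> entries = new HashMap<>();

    public void put(final String key, final String value) {
        entries.put(key, value);
    }

    // What an expression like ${config('db.url')} might resolve to:
    // registry entry first, then the process environment, else empty.
    public Optional<String> lookup(final String key) {
        String value = entries.get(key);
        if (value == null) {
            value = System.getenv(key);
        }
        return Optional.ofNullable(value);
    }

    public static void main(String[] args) {
        final ConfigRegistry registry = new ConfigRegistry();
        registry.put("db.url", "jdbc:postgresql://prod-db:5432/nifi");
        System.out.println(registry.lookup("db.url").orElse("<unset>"));
        // prints jdbc:postgresql://prod-db:5432/nifi
    }
}
```

A per-group variant of this (as in the variable registry wiki page Matt linked) would also address template portability, since the same flow could resolve different values per environment.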