[jira] [Commented] (NIFI-1994) Templates do not handle Controller Services correctly
[ https://issues.apache.org/jira/browse/NIFI-1994?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323659#comment-15323659 ]

ASF GitHub Bot commented on NIFI-1994:
--------------------------------------

GitHub user markap14 opened a pull request:

    https://github.com/apache/nifi/pull/517

    NIFI-1994: Fixed issues with controller services and templates

    Fixed issue with Controller Service Fully Qualified Class Names and ensured that services are added to the process groups as appropriate when instantiating templates.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/markap14/nifi NIFI-1994

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/nifi/pull/517.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #517

commit a5fecda5a2ffb35e21d950aa19a07127e19a419e
Author: Bryan Rosander
Date:   2016-05-27T14:56:02Z

    NIFI-1975 - Processor for parsing evtx files

    Signed-off-by: Matt Burgess

    This closes #492

commit c120c4982d4fc811b06b672e3983b8ca5fb8ae64
Author: Koji Kawamura
Date:   2016-06-06T13:19:26Z

    NIFI-1857: HTTPS Site-to-Site
    - Enable HTTP(S) for Site-to-Site communication
    - Support HTTP Proxy in the middle of local and remote NiFi
    - Support BASIC and DIGEST auth with Proxy Server
    - Provide 2-phase style commit same as existing socket version
    - [WIP] Test with the latest cluster env (without NCM) hasn't been run yet
    - Fixed buffer handling issues at async HTTP client POST
    - Fixed JS error when applying Remote Process Group Port setting from UI
    - Use compression setting from UI
    - Removed already finished TODO comments
    - Added additional buffer draining code after receiving EOF
    - Added inspection and assert code to make sure Site-to-Site client has written data fully to output stream
    - Changed default nifi.remote.input.secure from true to false

    This closes #497.

commit bfebe76d17b2024c8ae90fd3837df71ba77d
Author: Mark Payne
Date:   2016-06-10T00:39:29Z

    NIFI-1994: Fixed issue with Controller Service Fully Qualified Class Names and ensured that services are added to the process groups as appropriate when instantiating templates

> Templates do not handle Controller Services correctly
> -----------------------------------------------------
>
>                 Key: NIFI-1994
>                 URL: https://issues.apache.org/jira/browse/NIFI-1994
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Core Framework
>    Affects Versions: 1.0.0
>            Reporter: Mark Payne
>            Assignee: Mark Payne
>            Priority: Blocker
>             Fix For: 1.0.0
>
> If we create a template that includes a Controller Service and then try to
> instantiate it, we get an error indicating that the type does not exist,
> because the template references the Controller Service's "simple class name"
> instead of the fully qualified class name.
> Additionally, if there is already a template with a controller service,
> attempting to instantiate it results in the controller services not being
> added to the flow.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Created] (NIFI-1994) Templates do not handle Controller Services correctly
Mark Payne created NIFI-1994:
--------------------------------

             Summary: Templates do not handle Controller Services correctly
                 Key: NIFI-1994
                 URL: https://issues.apache.org/jira/browse/NIFI-1994
             Project: Apache NiFi
          Issue Type: Bug
          Components: Core Framework
    Affects Versions: 1.0.0
            Reporter: Mark Payne
            Assignee: Mark Payne
            Priority: Blocker
             Fix For: 1.0.0

If we create a template that includes a Controller Service and then try to instantiate it, we get an error indicating that the type does not exist, because the template references the Controller Service's "simple class name" instead of the fully qualified class name.

Additionally, if there is already a template with a controller service, attempting to instantiate it results in the controller services not being added to the flow.
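The bug above hinges on the difference between a simple class name and a fully qualified class name: a classloader can only resolve the latter. A minimal, self-contained sketch of the symptom, using java.util.ArrayList purely as a stand-in for a controller service class:

```java
public class FqcnDemo {
    public static void main(String[] args) {
        // Resolving a class by its fully qualified name succeeds.
        try {
            Class<?> ok = Class.forName("java.util.ArrayList");
            System.out.println("loaded: " + ok.getName());
        } catch (ClassNotFoundException e) {
            System.out.println("unexpected: " + e);
        }
        // Resolving by the simple name alone fails: the loader has no
        // package context. This is the "type does not exist" symptom the
        // template instantiation hit.
        try {
            Class.forName("ArrayList");
            System.out.println("unexpected success");
        } catch (ClassNotFoundException e) {
            System.out.println("simple name not found");
        }
    }
}
```

This is why the fix stores the fully qualified class name in the template rather than the display name.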
[jira] [Commented] (NIFI-1993) Upgrade CGLIB to the latest 3.2
[ https://issues.apache.org/jira/browse/NIFI-1993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323635#comment-15323635 ]

ASF GitHub Bot commented on NIFI-1993:
--------------------------------------

GitHub user olegz opened a pull request:

    https://github.com/apache/nifi/pull/516

    NIFI-1993 upgraded CGLIB to 3.2.2

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/olegz/nifi NIFI-1993

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/nifi/pull/516.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #516

commit ce60b68a53df5127d3a6500ff31894209277ed30
Author: Oleg Zhurakousky
Date:   2016-06-10T00:08:00Z

    NIFI-1993 upgraded CGLIB to 3.2.2

> Upgrade CGLIB to the latest 3.2
> -------------------------------
>
>                 Key: NIFI-1993
>                 URL: https://issues.apache.org/jira/browse/NIFI-1993
>             Project: Apache NiFi
>          Issue Type: Improvement
>            Reporter: Oleg Zhurakousky
>            Assignee: Oleg Zhurakousky
>            Priority: Minor
>             Fix For: 1.0.0
>
> While working on NIFI-826 I encountered a problem related to Groovy tests
> (Spock) and Java 1.8, which is essentially described here:
> https://groups.google.com/forum/#!topic/spockframework/59WIHGgcSNE
> The stack trace from the failing Spock test:
> {code}
> test InstantiateTemplate moves and scales templates[0](org.apache.nifi.web.dao.impl.StandardTemplateDAOSpec)  Time elapsed: 0.46 sec  <<< ERROR!
> java.lang.IllegalArgumentException: null
> 	at net.sf.cglib.proxy.BridgeMethodResolver.resolveAll(BridgeMethodResolver.java:61)
> 	at net.sf.cglib.proxy.Enhancer.emitMethods(Enhancer.java:911)
> 	at net.sf.cglib.proxy.Enhancer.generateClass(Enhancer.java:498)
> 	at net.sf.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
> 	at net.sf.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:216)
> 	at net.sf.cglib.proxy.Enhancer.createHelper(Enhancer.java:377)
> 	at net.sf.cglib.proxy.Enhancer.createClass(Enhancer.java:317)
> 	at org.spockframework.mock.runtime.ProxyBasedMockFactory$CglibMockFactory.createMock(ProxyBasedMockFactory.java:91)
> 	at org.spockframework.mock.runtime.ProxyBasedMockFactory.create(ProxyBasedMockFactory.java:49)
> 	at org.spockframework.mock.runtime.JavaMockFactory.create(JavaMockFactory.java:51)
> 	at org.spockframework.mock.runtime.CompositeMockFactory.create(CompositeMockFactory.java:44)
> 	at org.spockframework.lang.SpecInternals.createMock(SpecInternals.java:45)
> 	at org.spockframework.lang.SpecInternals.createMockImpl(SpecInternals.java:281)
> 	at org.spockframework.lang.SpecInternals.MockImpl(SpecInternals.java:99)
> 	at groovy.lang.GroovyObjectSupport.invokeMethod(GroovyObjectSupport.java:46)
> 	at groovy.lang.GroovyObjectSupport.invokeMethod(GroovyObjectSupport.java:46)
> 	at org.apache.nifi.web.dao.impl.StandardTemplateDAOSpec$__spock_feature_0_0_closure2.closure7$_closure8(StandardTemplateDAOSpec.groovy:71)
> 	at groovy.lang.Closure.call(Closure.java:426)
> 	at org.spockframework.mock.response.CodeResponseGenerator.invokeClosure(CodeResponseGenerator.java:53)
> 	at org.spockframework.mock.response.CodeResponseGenerator.doRespond(CodeResponseGenerator.java:36)
> 	at org.spockframework.mock.response.SingleResponseGenerator.respond(SingleResponseGenerator.java:31)
> 	at org.spockframework.mock.response.ResponseGeneratorChain.respond(ResponseGeneratorChain.java:45)
> 	at org.spockframework.mock.runtime.MockInteraction.accept(MockInteraction.java:76)
> 	at org.spockframework.mock.runtime.MockInteractionDecorator.accept(MockInteractionDecorator.java:46)
> 	at org.spockframework.mock.runtime.InteractionScope$1.accept(InteractionScope.java:41)
> 	at org.spockframework.mock.runtime.MockController.handle(MockController.java:39)
> 	at org.spockframework.mock.runtime.JavaMockInterceptor.intercept(JavaMockInterceptor.java:72)
> 	at org.spockframework.mock.runtime.CglibMockInterceptorAdapter.intercept(CglibMockInterceptorAdapter.java:30)
> 	at org.apache.nifi.web.dao.impl.StandardTemplateDAO.instantiateTemplate(StandardTemplateDAO.java:91)
> 	at org.apache.nifi.web.dao.impl.StandardTemplateDAOSpec.test InstantiateTemplate moves and scales templates(StandardTemplateDAOSpec.groovy:62)
> {code}
> Upgrading to CGLIB 3.2 resolves the issue.
[jira] [Commented] (NIFI-1993) Upgrade CGLIB to the latest 3.2
[ https://issues.apache.org/jira/browse/NIFI-1993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323641#comment-15323641 ]

ASF GitHub Bot commented on NIFI-1993:
--------------------------------------

Github user olegz commented on the issue:

    https://github.com/apache/nifi/pull/516

    @mattyb149 I got one better ;) Try PR https://github.com/apache/nifi/pull/515 as it stands now, and then with this change. That is how it was discovered. I've attached some notes in JIRA. Let me know if you need more details.

> Upgrade CGLIB to the latest 3.2
> -------------------------------
>
>                 Key: NIFI-1993
>                 URL: https://issues.apache.org/jira/browse/NIFI-1993
>             Project: Apache NiFi
>          Issue Type: Improvement
>            Reporter: Oleg Zhurakousky
>            Assignee: Oleg Zhurakousky
>            Priority: Minor
>             Fix For: 1.0.0
[jira] [Commented] (NIFI-1993) Upgrade CGLIB to the latest 3.2
[ https://issues.apache.org/jira/browse/NIFI-1993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323637#comment-15323637 ]

ASF GitHub Bot commented on NIFI-1993:
--------------------------------------

Github user olegz commented on the issue:

    https://github.com/apache/nifi/pull/516

    NOTE to merger: this is 1.0.0 only, since the reasons for the upgrade are related to Java 1.8.

> Upgrade CGLIB to the latest 3.2
> -------------------------------
>
>                 Key: NIFI-1993
>                 URL: https://issues.apache.org/jira/browse/NIFI-1993
>             Project: Apache NiFi
>          Issue Type: Improvement
>            Reporter: Oleg Zhurakousky
>            Assignee: Oleg Zhurakousky
>            Priority: Minor
>             Fix For: 1.0.0
[jira] [Commented] (NIFI-1987) Nifi-Spark-Receiver build failure due to orgspark repository change
[ https://issues.apache.org/jira/browse/NIFI-1987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323475#comment-15323475 ]

Daniel Cave commented on NIFI-1987:
-----------------------------------

I've tested on both Linux and Windows machines today with fresh 0.x pull-downs and non-NiFi repos, and I can't replicate it either. I don't have any CDH customizations on my new build machines that would have created the persistent error yesterday. For now I'm going to close this ticket pending more testing to track down the exact sub-dependency reference and/or being able to recreate it on fresh code. Thanks for your testing help as well.

> Nifi-Spark-Receiver build failure due to orgspark repository change
> -------------------------------------------------------------------
>
>                 Key: NIFI-1987
>                 URL: https://issues.apache.org/jira/browse/NIFI-1987
>             Project: Apache NiFi
>          Issue Type: Bug
>    Affects Versions: 0.6.1
>            Reporter: Daniel Cave
>            Assignee: Oleg Zhurakousky
>            Priority: Critical
>             Fix For: 1.0.0, 0.7.0
>
> Builds on machines with no or incomplete local Maven repos are failing due to a nifi-spark-receiver sub-dependency on oss.sonatype.org/content/repositories/orgspark-project-1113 related dependencies. On 6/7/16 it appears that orgspark-project-1113 was removed from the repo and replaced with orgspark-project-1123. This blocks complete NiFi builds where a complete local repo with all 1113 sub-dependencies was not already in place. See below:
> {code}
> [INFO] Scanning for projects...
> [INFO] Inspecting build with total of 1 modules...
> [INFO] Installing Nexus Staging features:
> [INFO]   ... total of 1 executions of maven-deploy-plugin replaced with nexus-staging-maven-plugin
> [INFO]
> [INFO] ------------------------------------------------------------------------
> [INFO] Building nifi-spark-receiver 0.7.0-SNAPSHOT
> [INFO] ------------------------------------------------------------------------
> Downloading: https://oss.sonatype.org/content/repositories/orgspark-project-1113/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT.pom
> Downloading: https://oss.sonatype.org/content/repositories/orgspark-project-1113/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT.pom
> [WARNING] The POM for org.apache.avro:avro-mapred:jar:hadoop2:1.7.6-cdh5.7.0-SNAPSHOT is missing, no dependency information available
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-site-to-site-client/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-commons/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-api/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-utils/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-security-utils/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-client-dto/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-framework/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-framework-bundle/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-nar-bundles/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT-hadoop2.jar
> Downloading: https://oss.sonatype.org/content/repositories/orgspark-project-1113/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT-hadoop2.jar
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 6.771 s
> [INFO] Finished at: 2016-06-08T13:13:41-05:00
> [INFO] Final Memory: 28M/381M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project nifi-spark-receiver: Could not resolve dependencies for project org.apache.nifi:nifi-spark-receiver:jar:0.7.0-SNAPSHOT:
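The failure above is characteristic of a build that resolves a transitive dependency from an external staging repository that has since been renamed or deleted. A hypothetical pom.xml fragment of the kind that triggers this (the repository id and exact declaration are illustrative assumptions, not the actual nifi-spark-receiver pom):

```xml
<!-- Hypothetical repository declaration. If the staging repo is renamed
     (orgspark-project-1113 -> orgspark-project-1123), any dependency not
     already cached in the local ~/.m2 repository fails to resolve, which is
     why machines with complete local repos built fine while fresh machines
     did not. -->
<repositories>
  <repository>
    <id>orgspark-project-staging</id>
    <url>https://oss.sonatype.org/content/repositories/orgspark-project-1123</url>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>
```

Locating and updating (or removing) such a stale repository reference is the usual remedy when a staged artifact set is rotated.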
[jira] [Commented] (NIFI-1857) Support HTTP(S) as a transport mechanism for Site-to-Site
[ https://issues.apache.org/jira/browse/NIFI-1857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323434#comment-15323434 ]

ASF GitHub Bot commented on NIFI-1857:
--------------------------------------

Github user ijokarumawak commented on the issue:

    https://github.com/apache/nifi/pull/497

    @markap14 Thanks for taking the time to review and test; I'm so glad to hear that it worked! The CI tests haven't finished; maybe that's why this isn't closed yet. I'll check the status again later.

> Support HTTP(S) as a transport mechanism for Site-to-Site
> ---------------------------------------------------------
>
>                 Key: NIFI-1857
>                 URL: https://issues.apache.org/jira/browse/NIFI-1857
>             Project: Apache NiFi
>          Issue Type: Improvement
>          Components: Core Framework
>            Reporter: Koji Kawamura
>            Assignee: Koji Kawamura
>   Original Estimate: 480h
>  Remaining Estimate: 480h
>
> We should add support for using HTTP(S) for site-to-site as an alternative to the current socket-based approach.
> This would support the same push-based or pull-based approach site-to-site offers now, but it would use HTTP(S) for all interactions, including learning about ports, learning about NCM topology, and actually exchanging data. This mechanism should also support interaction via an HTTP proxy.
> This would also require some UI work to allow the user to specify which protocol site-to-site should use, such as 'raw' vs 'http'. We also need to document any limitations with regard to SSL support for this mode, and we'd need to provide a 'how-to' for using proxies like http_proxy or something else.
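The proposal above amounts to making the transport a pluggable strategy selected per remote group ('raw' vs 'http'). A minimal, self-contained sketch of that selection; all names here are illustrative assumptions, not NiFi's actual Site-to-Site client API:

```java
// Hypothetical sketch of per-protocol transport selection for Site-to-Site.
// HTTP is the proxy-friendly alternative to the raw socket protocol.
enum TransportProtocol { RAW, HTTP }

interface SiteToSiteTransport {
    String open(String host); // returns a description of the connection made
}

class RawSocketTransport implements SiteToSiteTransport {
    public String open(String host) {
        // Direct TCP connection; cannot traverse an HTTP proxy.
        return "raw socket to " + host + ":8081";
    }
}

class HttpTransport implements SiteToSiteTransport {
    public String open(String host) {
        // HTTP(S) requests can be routed through an intermediate proxy.
        return "HTTP POST to https://" + host + "/nifi-api (proxy-capable)";
    }
}

public class TransportDemo {
    static SiteToSiteTransport forProtocol(TransportProtocol p) {
        return p == TransportProtocol.HTTP ? new HttpTransport() : new RawSocketTransport();
    }

    public static void main(String[] args) {
        System.out.println(forProtocol(TransportProtocol.RAW).open("nifi.example.com"));
        System.out.println(forProtocol(TransportProtocol.HTTP).open("nifi.example.com"));
    }
}
```

The strategy split is what lets the rest of the client (two-phase commit, peer selection) stay identical across both transports.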
[jira] [Updated] (NIFI-1993) Upgrade CGLIB to the latest 3.2
[ https://issues.apache.org/jira/browse/NIFI-1993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Oleg Zhurakousky updated NIFI-1993:
-----------------------------------
    Description: 
While working on NIFI-826 I encountered a problem related to Groovy tests (Spock) and Java 1.8, which is essentially described here: https://groups.google.com/forum/#!topic/spockframework/59WIHGgcSNE

The stack trace from the failing Spock test:
{code}
test InstantiateTemplate moves and scales templates[0](org.apache.nifi.web.dao.impl.StandardTemplateDAOSpec)  Time elapsed: 0.46 sec  <<< ERROR!
java.lang.IllegalArgumentException: null
	at net.sf.cglib.proxy.BridgeMethodResolver.resolveAll(BridgeMethodResolver.java:61)
	[... same stack trace as quoted earlier in this thread ...]
	at org.apache.nifi.web.dao.impl.StandardTemplateDAO.instantiateTemplate(StandardTemplateDAO.java:91)
	at org.apache.nifi.web.dao.impl.StandardTemplateDAOSpec.test InstantiateTemplate moves and scales templates(StandardTemplateDAOSpec.groovy:62)
{code}

Upgrading to CGLIB 3.2 resolves the issue.

  was:
While working on NIFI-826 I encountered a problem related to Groovy tests (Spock) and Java 1.8, which is essentially described here: https://groups.google.com/forum/#!topic/spockframework/59WIHGgcSNE

Upgrading to CGLIB 3.2 resolves the issue.

> Upgrade CGLIB to the latest 3.2
> -------------------------------
>
>                 Key: NIFI-1993
>                 URL: https://issues.apache.org/jira/browse/NIFI-1993
>             Project: Apache NiFi
>          Issue Type: Improvement
>            Reporter: Oleg Zhurakousky
>            Assignee: Oleg Zhurakousky
>            Priority: Minor
>             Fix For: 1.0.0
[jira] [Created] (NIFI-1993) Upgrade CGLIB to the latest 3.2
Oleg Zhurakousky created NIFI-1993:
--------------------------------------

             Summary: Upgrade CGLIB to the latest 3.2
                 Key: NIFI-1993
                 URL: https://issues.apache.org/jira/browse/NIFI-1993
             Project: Apache NiFi
          Issue Type: Improvement
            Reporter: Oleg Zhurakousky
            Assignee: Oleg Zhurakousky
            Priority: Minor
             Fix For: 1.0.0

While working on NIFI-826 I encountered a problem related to Groovy tests (Spock) and Java 1.8, which is essentially described here: https://groups.google.com/forum/#!topic/spockframework/59WIHGgcSNE

Upgrading to CGLIB 3.2 resolves the issue.
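The failing frames sit inside CGLIB's Enhancer while Spock generates a mock subclass. CGLIB does at the bytecode level what the JDK's dynamic proxies do for interfaces; a self-contained JDK-proxy analogue of the interception that was failing (illustrative only, not Spock's internals):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.concurrent.Callable;

public class MockSketch {
    public static void main(String[] args) throws Exception {
        // Every call on the proxy is routed to this handler -- the same
        // interception idea Spock implements via CGLIB for concrete classes.
        InvocationHandler handler =
                (proxy, method, methodArgs) -> "stubbed:" + method.getName();

        Callable<?> mock = (Callable<?>) Proxy.newProxyInstance(
                MockSketch.class.getClassLoader(),
                new Class<?>[] { Callable.class },
                handler);

        System.out.println(mock.call());
    }
}
```

JDK proxies only work for interfaces; mocking a concrete class like StandardTemplateDAO requires runtime subclass generation, which is why Spock depends on CGLIB and why a CGLIB incompatibility with Java 1.8 bridge methods broke the test.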
[jira] [Commented] (NIFI-1982) Compressed check box in Configure Remote Port UI is not used
[ https://issues.apache.org/jira/browse/NIFI-1982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323374#comment-15323374 ]

ASF GitHub Bot commented on NIFI-1982:
--------------------------------------

Github user ijokarumawak closed the pull request at:

    https://github.com/apache/nifi/pull/509

> Compressed check box in Configure Remote Port UI is not used
> ------------------------------------------------------------
>
>                 Key: NIFI-1982
>                 URL: https://issues.apache.org/jira/browse/NIFI-1982
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Core Framework, Core UI
>    Affects Versions: 0.6.1
>            Reporter: Koji Kawamura
>            Assignee: Koji Kawamura
>             Fix For: 0.7.0
>
>         Attachments: configure-remote-port.png
>
> Right-click Remote Process Group -> Remote Ports -> (Edit icon) -> Configure Remote Port setting window.
> It has a "Compressed" checkbox. However, if the user checks it, RemoteGroupPort doesn't use it to enable compression on the Site-to-Site client.
[jira] [Created] (NIFI-1992) Update Site-to-Site Client to support Zero-Master Clustering Paradigm
Mark Payne created NIFI-1992:
--------------------------------

             Summary: Update Site-to-Site Client to support Zero-Master Clustering Paradigm
                 Key: NIFI-1992
                 URL: https://issues.apache.org/jira/browse/NIFI-1992
             Project: Apache NiFi
          Issue Type: Task
          Components: Core Framework, Tools and Build
    Affects Versions: 1.0.0
            Reporter: Mark Payne
            Assignee: Mark Payne
            Priority: Blocker
             Fix For: 1.0.0
[jira] [Commented] (NIFI-1857) Support HTTP(S) as a transport mechanism for Site-to-Site
[ https://issues.apache.org/jira/browse/NIFI-1857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15323225#comment-15323225 ]

ASF GitHub Bot commented on NIFI-1857:
--------------------------------------

Github user markap14 commented on the issue:

    https://github.com/apache/nifi/pull/497

    @ijokarumawak I added the "This closes #497" message to the commit that I pushed, but it doesn't seem to have worked... can you close the PR?

> Support HTTP(S) as a transport mechanism for Site-to-Site
> ---------------------------------------------------------
>
>                 Key: NIFI-1857
>                 URL: https://issues.apache.org/jira/browse/NIFI-1857
>             Project: Apache NiFi
>          Issue Type: Improvement
>          Components: Core Framework
>            Reporter: Koji Kawamura
>            Assignee: Koji Kawamura
[jira] [Resolved] (NIFI-1857) Support HTTP(S) as a transport mechanism for Site-to-Site
[ https://issues.apache.org/jira/browse/NIFI-1857?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Mark Payne resolved NIFI-1857.
------------------------------
    Resolution: Fixed

> Support HTTP(S) as a transport mechanism for Site-to-Site
> ---------------------------------------------------------
>
>                 Key: NIFI-1857
>                 URL: https://issues.apache.org/jira/browse/NIFI-1857
>             Project: Apache NiFi
>          Issue Type: Improvement
>          Components: Core Framework
>            Reporter: Koji Kawamura
>            Assignee: Koji Kawamura
[jira] [Commented] (NIFI-1857) Support HTTP(S) as a transport mechanism for Site-to-Site
[ https://issues.apache.org/jira/browse/NIFI-1857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15323193#comment-15323193 ]

ASF GitHub Bot commented on NIFI-1857:
--

GitHub user markap14 commented on the issue:

    https://github.com/apache/nifi/pull/497

    @ijokarumawak this is looking good now! I have pulled down the latest PR, rebased against master, and have been able to test this running directly against my NiFi instance and while using nginx as a proxy. Nicely done! I have pushed this to master.
[10/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
NIFI-1857: HTTPS Site-to-Site

- Enable HTTP(S) for Site-to-Site communication
- Support HTTP Proxy in the middle of local and remote NiFi
- Support BASIC and DIGEST auth with Proxy Server
- Provide 2-phase style commit same as existing socket version
- [WIP] Test with the latest cluster env (without NCM) hasn't been done yet
- Fixed buffer handling issues in async HTTP client POST
- Fixed JS error when applying Remote Process Group Port setting from UI
- Use compression setting from UI
- Removed already-finished TODO comments
- Added additional buffer-draining code after receiving EOF
- Added inspection and assert code to make sure the Site-to-Site client has written data fully to the output stream
- Changed default nifi.remote.input.secure from true to false

This closes #497.

Project: http://git-wip-us.apache.org/repos/asf/nifi/repo
Commit: http://git-wip-us.apache.org/repos/asf/nifi/commit/c120c498
Tree: http://git-wip-us.apache.org/repos/asf/nifi/tree/c120c498
Diff: http://git-wip-us.apache.org/repos/asf/nifi/diff/c120c498
Branch: refs/heads/master
Commit: c120c4982d4fc811b06b672e3983b8ca5fb8ae64
Parents: a5fecda
Author: Koji Kawamura
Authored: Mon Jun 6 22:19:26 2016 +0900
Committer: Mark Payne
Committed: Thu Jun 9 15:09:57 2016 -0400
--
 .../org/apache/nifi/util/NiFiProperties.java| 39 +-
 nifi-commons/nifi-site-to-site-client/pom.xml | 19 +
 .../apache/nifi/remote/AbstractTransaction.java | 389 +++
 .../remote/ClientTransactionCompletion.java | 57 +
 .../remote/client/AbstractSiteToSiteClient.java | 54 +
 .../apache/nifi/remote/client/PeerSelector.java | 341 ++
 .../nifi/remote/client/PeerStatusProvider.java | 27 +
 .../nifi/remote/client/SiteInfoProvider.java| 231
 .../nifi/remote/client/SiteToSiteClient.java| 95 +-
 .../remote/client/SiteToSiteClientConfig.java | 13 +
 .../nifi/remote/client/http/HttpClient.java | 200
 .../TransportProtocolVersionNegotiator.java | 42 +
 .../client/socket/EndpointConnectionPool.java | 559 +-
 .../nifi/remote/client/socket/SocketClient.java | 40 +-
 .../remote/cluster/AdaptedNodeInformation.java | 11 +
 .../nifi/remote/cluster/NodeInformation.java| 20 +-
 .../remote/cluster/NodeInformationAdapter.java |4 +-
 .../remote/exception/HandshakeException.java| 15 +
 .../io/http/HttpCommunicationsSession.java | 97 ++
 .../apache/nifi/remote/io/http/HttpInput.java | 58 +
 .../apache/nifi/remote/io/http/HttpOutput.java | 45 +
 .../http/HttpServerCommunicationsSession.java | 72 ++
 .../nifi/remote/protocol/ClientProtocol.java| 14 +-
 .../nifi/remote/protocol/HandshakeProperty.java | 59 +
 .../apache/nifi/remote/protocol/Response.java | 52 +
 .../nifi/remote/protocol/ResponseCode.java | 152 +++
 .../protocol/SiteToSiteTransportProtocol.java | 22 +
 .../protocol/http/HttpClientTransaction.java| 187
 .../nifi/remote/protocol/http/HttpHeaders.java | 35 +
 .../nifi/remote/protocol/http/HttpProxy.java| 59 +
 .../protocol/socket/HandshakeProperty.java | 59 -
 .../nifi/remote/protocol/socket/Response.java | 52 -
 .../remote/protocol/socket/ResponseCode.java| 148 ---
 .../protocol/socket/SocketClientProtocol.java | 173 +--
 .../socket/SocketClientTransaction.java | 345 +-
 .../SocketClientTransactionCompletion.java | 57 -
 .../nifi/remote/util/EventReportUtil.java | 50 +
 .../nifi/remote/util/NiFiRestApiUtil.java | 100 --
 .../remote/util/SiteToSiteRestApiClient.java| 992 +
 .../nifi/remote/client/TestPeerSelector.java| 125 +++
 .../nifi/remote/client/http/TestHttpClient.java | 950
 .../socket/TestEndpointConnectionStatePool.java | 92 --
 .../remote/protocol/SiteToSiteTestUtils.java| 237
 .../http/TestHttpClientTransaction.java | 346 ++
 .../socket/TestSocketClientTransaction.java | 334 ++
 .../src/main/asciidoc/administration-guide.adoc | 12 +-
 .../src/main/asciidoc/getting-started.adoc |1 +
 .../images/configure-remote-process-group.png | Bin 0 -> 36406 bytes
 nifi-docs/src/main/asciidoc/user-guide.adoc | 37 +-
 .../apache/nifi/web/api/dto/ControllerDTO.java | 19 +
 .../nifi/web/api/dto/RemoteProcessGroupDTO.java | 45 +
 .../apache/nifi/web/api/dto/remote/PeerDTO.java | 78 ++
 .../apache/nifi/web/api/entity/PeersEntity.java | 46 +
 .../web/api/entity/TransactionResultEntity.java | 53 +
 .../cluster/protocol/ConnectionResponse.java| 10 +-
 .../nifi/cluster/protocol/NodeIdentifier.java | 26 +-
 .../jaxb/message/AdaptedConnectionResponse.java |9 +
 .../jaxb/message/AdaptedNodeIdentifier.java | 11 +
 .../jaxb/message/ConnectionResponseAdapter.java |3 +-
 .../jaxb/message/NodeIdentifierAdapter.java |3 +-
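The "2-phase style commit" mentioned in the commit message can be illustrated with a small, dependency-free sketch: phase one transfers the payload and exchanges a checksum, and phase two commits only if both sides computed the same checksum. All class and method names here are hypothetical illustrations, not NiFi's actual API; NiFi's socket protocol appears to use a CRC32-based confirmation in a similar spirit.

```java
import java.util.zip.CRC32;

// Hypothetical sketch of a two-phase (confirm-then-commit) transfer.
class TwoPhaseSketch {

    // Phase 1: compute a CRC32 over the transferred payload; the sender
    // and receiver each compute this and exchange the values.
    public static long phaseOneChecksum(final byte[] payload) {
        final CRC32 crc = new CRC32();
        crc.update(payload, 0, payload.length);
        return crc.getValue();
    }

    // Phase 2: commit only when both sides agree on the checksum;
    // otherwise the transaction can be rolled back and retried.
    public static boolean phaseTwoCommit(final long localCrc, final long peerCrc) {
        return localCrc == peerCrc;
    }
}
```

The point of splitting the exchange this way is that the HTTP transport can spread the two phases across separate requests while keeping the same rollback guarantee as the socket version.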
[04/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java
--
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java
index 809147e..a5d4bbe 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/SocketRemoteSiteListener.java
@@ -16,6 +16,19 @@
  */
 package org.apache.nifi.remote;
 
+import org.apache.nifi.groups.ProcessGroup;
+import org.apache.nifi.remote.cluster.NodeInformant;
+import org.apache.nifi.remote.exception.HandshakeException;
+import org.apache.nifi.remote.io.socket.SocketChannelCommunicationsSession;
+import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannel;
+import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannelCommunicationsSession;
+import org.apache.nifi.remote.protocol.CommunicationsSession;
+import org.apache.nifi.remote.protocol.RequestType;
+import org.apache.nifi.remote.protocol.ServerProtocol;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import javax.net.ssl.SSLContext;
 import java.io.DataInputStream;
 import java.io.DataOutputStream;
 import java.io.EOFException;
@@ -30,24 +43,9 @@ import java.net.SocketTimeoutException;
 import java.nio.channels.ServerSocketChannel;
 import java.nio.channels.SocketChannel;
 import java.util.Arrays;
-import java.util.HashMap;
 import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.concurrent.atomic.AtomicReference;
-import javax.net.ssl.SSLContext;
-
-import org.apache.nifi.groups.ProcessGroup;
-import org.apache.nifi.remote.cluster.NodeInformant;
-import org.apache.nifi.remote.exception.HandshakeException;
-import org.apache.nifi.remote.io.socket.SocketChannelCommunicationsSession;
-import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannel;
-import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannelCommunicationsSession;
-import org.apache.nifi.remote.protocol.CommunicationsSession;
-import org.apache.nifi.remote.protocol.RequestType;
-import org.apache.nifi.remote.protocol.ServerProtocol;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
 public class SocketRemoteSiteListener implements RemoteSiteListener {
 
     public static final String DEFAULT_FLOWFILE_PATH = "./";
@@ -261,11 +259,11 @@ public class SocketRemoteSiteListener implements RemoteSiteListener {
                         break;
                     case RECEIVE_FLOWFILES:
                         // peer wants to receive FlowFiles, so we will transfer FlowFiles.
-                        protocol.getPort().transferFlowFiles(peer, protocol, new HashMap());
+                        protocol.getPort().transferFlowFiles(peer, protocol);
                         break;
                     case SEND_FLOWFILES:
                         // Peer wants to send FlowFiles, so we will receive.
-                        protocol.getPort().receiveFlowFiles(peer, protocol, new HashMap());
+                        protocol.getPort().receiveFlowFiles(peer, protocol);
                         break;
                     case REQUEST_PEER_LIST:
                         protocol.sendPeerList(peer);

http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java
--
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java
index 02a44b7..3f59b50 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java
@@ -47,6 +47,7 @@ import org.apache.nifi.remote.exception.PortNotRunningException;
[08/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/http/HttpProxy.java
--
diff --git a/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/http/HttpProxy.java b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/http/HttpProxy.java
new file mode 100644
index 000..4c0ebe7
--- /dev/null
+++ b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/http/HttpProxy.java
@@ -0,0 +1,59 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.remote.protocol.http;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.http.HttpHost;
+
+public class HttpProxy {
+
+    private final String host;
+    private final Integer port;
+    private final String username;
+    private final String password;
+
+    public HttpProxy(final String host, final Integer port, final String username, final String password) {
+        this.host = host;
+        this.port = port;
+        this.username = username;
+        this.password = password;
+    }
+
+    public String getHost() {
+        return host;
+    }
+
+    public Integer getPort() {
+        return port;
+    }
+
+    public String getUsername() {
+        return username;
+    }
+
+    public String getPassword() {
+        return password;
+    }
+
+    public HttpHost getHttpHost() {
+        if (StringUtils.isEmpty(host)) {
+            return null;
+        }
+        return new HttpHost(host, port == null ? 80 : port);
+    }
+
+}

http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/socket/HandshakeProperty.java
--
diff --git a/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/socket/HandshakeProperty.java b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/socket/HandshakeProperty.java
deleted file mode 100644
index 016690c..000
--- a/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/protocol/socket/HandshakeProperty.java
+++ /dev/null
@@ -1,59 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.nifi.remote.protocol.socket;
-
-/**
- * Enumeration of Properties that can be used for the Site-to-Site Socket
- * Protocol.
- */
-public enum HandshakeProperty {
-
-    /**
-     * Boolean value indicating whether or not the contents of a FlowFile should
-     * be GZipped when transferred.
-     */
-    GZIP,
-
-    /**
-     * The unique identifier of the port to communicate with
-     */
-    PORT_IDENTIFIER,
-
-    /**
-     * Indicates the number of milliseconds after the request was made that the
-     * client will wait for a response. If no response has been received by the
-     * time this value expires, the server can move on without attempting to
-     * service the request because the client will have already disconnected.
-     */
-    REQUEST_EXPIRATION_MILLIS,
-
-    /**
-     * The preferred number of FlowFiles that the server should send to the
-     * client when pulling data. This property was introduced in version 5 of
-     * the protocol.
-     */
-    BATCH_COUNT,
-
-    /**
-
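The new HttpProxy class added in this commit treats an empty host as "no proxy" and falls back to port 80 when no port is given. That host-resolution logic can be checked with a dependency-free sketch; the class name below is hypothetical, and it returns a plain host string instead of Apache HttpClient's HttpHost:

```java
// Simplified, standalone sketch of HttpProxy's host resolution
// (hypothetical class; the real class also carries username/password).
class HttpProxySketch {

    private final String host;
    private final Integer port;

    public HttpProxySketch(final String host, final Integer port) {
        this.host = host;
        this.port = port;
    }

    // Mirrors the behavior of HttpProxy#getHttpHost(): null when no host
    // is configured, otherwise "host:port" with 80 as the default port.
    public String toHostString() {
        if (host == null || host.isEmpty()) {
            return null;
        }
        return host + ":" + (port == null ? 80 : port);
    }
}
```

Returning null for an unset host lets callers use a single code path: a null proxy host simply means connections go direct.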
[07/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/util/SiteToSiteRestApiClient.java
--
diff --git a/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/util/SiteToSiteRestApiClient.java b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/util/SiteToSiteRestApiClient.java
new file mode 100644
index 000..4195ae9
--- /dev/null
+++ b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/util/SiteToSiteRestApiClient.java
@@ -0,0 +1,992 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.remote.util;
+
+import org.apache.commons.lang3.StringUtils;
+import org.apache.http.Header;
+import org.apache.http.HttpEntity;
+import org.apache.http.HttpException;
+import org.apache.http.HttpHost;
+import org.apache.http.HttpInetConnection;
+import org.apache.http.HttpRequest;
+import org.apache.http.HttpResponse;
+import org.apache.http.HttpResponseInterceptor;
+import org.apache.http.StatusLine;
+import org.apache.http.auth.AuthScope;
+import org.apache.http.auth.UsernamePasswordCredentials;
+import org.apache.http.client.CredentialsProvider;
+import org.apache.http.client.config.RequestConfig;
+import org.apache.http.client.methods.CloseableHttpResponse;
+import org.apache.http.client.methods.HttpDelete;
+import org.apache.http.client.methods.HttpGet;
+import org.apache.http.client.methods.HttpPost;
+import org.apache.http.client.methods.HttpPut;
+import org.apache.http.client.methods.HttpRequestBase;
+import org.apache.http.client.utils.URIUtils;
+import org.apache.http.conn.ManagedHttpClientConnection;
+import org.apache.http.entity.BasicHttpEntity;
+import org.apache.http.impl.client.BasicCredentialsProvider;
+import org.apache.http.impl.client.CloseableHttpClient;
+import org.apache.http.impl.client.HttpClientBuilder;
+import org.apache.http.impl.client.HttpClients;
+import org.apache.http.impl.nio.client.CloseableHttpAsyncClient;
+import org.apache.http.impl.nio.client.HttpAsyncClientBuilder;
+import org.apache.http.impl.nio.client.HttpAsyncClients;
+import org.apache.http.nio.ContentEncoder;
+import org.apache.http.nio.IOControl;
+import org.apache.http.nio.conn.ManagedNHttpClientConnection;
+import org.apache.http.nio.protocol.BasicAsyncResponseConsumer;
+import org.apache.http.nio.protocol.HttpAsyncRequestProducer;
+import org.apache.http.protocol.HttpContext;
+import org.apache.http.protocol.HttpCoreContext;
+import org.apache.http.util.EntityUtils;
+import org.apache.nifi.remote.TransferDirection;
+import org.apache.nifi.remote.client.http.TransportProtocolVersionNegotiator;
+import org.apache.nifi.remote.exception.PortNotRunningException;
+import org.apache.nifi.remote.exception.ProtocolException;
+import org.apache.nifi.remote.exception.UnknownPortException;
+import org.apache.nifi.remote.io.http.HttpCommunicationsSession;
+import org.apache.nifi.remote.io.http.HttpInput;
+import org.apache.nifi.remote.io.http.HttpOutput;
+import org.apache.nifi.remote.protocol.CommunicationsSession;
+import org.apache.nifi.remote.protocol.ResponseCode;
+import org.apache.nifi.remote.protocol.http.HttpHeaders;
+import org.apache.nifi.remote.protocol.http.HttpProxy;
+import org.apache.nifi.security.util.CertificateUtils;
+import org.apache.nifi.stream.io.ByteArrayInputStream;
+import org.apache.nifi.stream.io.ByteArrayOutputStream;
+import org.apache.nifi.stream.io.StreamUtils;
+import org.apache.nifi.web.api.dto.ControllerDTO;
+import org.apache.nifi.web.api.dto.remote.PeerDTO;
+import org.apache.nifi.web.api.entity.ControllerEntity;
+import org.apache.nifi.web.api.entity.PeersEntity;
+import org.apache.nifi.web.api.entity.TransactionResultEntity;
+import org.codehaus.jackson.JsonParseException;
+import org.codehaus.jackson.map.DeserializationConfig;
+import org.codehaus.jackson.map.JsonMappingException;
+import org.codehaus.jackson.map.ObjectMapper;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import javax.net.ssl.SSLContext;
+import javax.net.ssl.SSLPeerUnverifiedException;
+import
[06/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-commons/nifi-site-to-site-client/src/test/java/org/apache/nifi/remote/client/socket/TestEndpointConnectionStatePool.java
--
diff --git a/nifi-commons/nifi-site-to-site-client/src/test/java/org/apache/nifi/remote/client/socket/TestEndpointConnectionStatePool.java b/nifi-commons/nifi-site-to-site-client/src/test/java/org/apache/nifi/remote/client/socket/TestEndpointConnectionStatePool.java
deleted file mode 100644
index 8336745..000
--- a/nifi-commons/nifi-site-to-site-client/src/test/java/org/apache/nifi/remote/client/socket/TestEndpointConnectionStatePool.java
+++ /dev/null
@@ -1,92 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.nifi.remote.client.socket;
-
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.List;
-
-import org.apache.nifi.remote.PeerStatus;
-import org.apache.nifi.remote.TransferDirection;
-import org.apache.nifi.remote.cluster.ClusterNodeInformation;
-import org.apache.nifi.remote.cluster.NodeInformation;
-import org.junit.Test;
-
-public class TestEndpointConnectionStatePool {
-
-    @Test
-    public void testFormulateDestinationListForOutput() throws IOException {
-        final ClusterNodeInformation clusterNodeInfo = new ClusterNodeInformation();
-        final List collection = new ArrayList<>();
-        collection.add(new NodeInformation("ShouldGetMedium", 1, , true, 4096));
-        collection.add(new NodeInformation("ShouldGetLots", 2, , true, 10240));
-        collection.add(new NodeInformation("ShouldGetLittle", 3, , true, 1024));
-        collection.add(new NodeInformation("ShouldGetMedium", 4, , true, 4096));
-        collection.add(new NodeInformation("ShouldGetMedium", 5, , true, 4096));
-
-        clusterNodeInfo.setNodeInformation(collection);
-        final List destinations = EndpointConnectionPool.formulateDestinationList(clusterNodeInfo, TransferDirection.RECEIVE);
-        for (final PeerStatus peerStatus : destinations) {
-            System.out.println(peerStatus.getPeerDescription());
-        }
-    }
-
-    @Test
-    public void testFormulateDestinationListForOutputHugeDifference() throws IOException {
-        final ClusterNodeInformation clusterNodeInfo = new ClusterNodeInformation();
-        final List collection = new ArrayList<>();
-        collection.add(new NodeInformation("ShouldGetLittle", 1, , true, 500));
-        collection.add(new NodeInformation("ShouldGetLots", 2, , true, 5));
-
-        clusterNodeInfo.setNodeInformation(collection);
-        final List destinations = EndpointConnectionPool.formulateDestinationList(clusterNodeInfo, TransferDirection.RECEIVE);
-        for (final PeerStatus peerStatus : destinations) {
-            System.out.println(peerStatus.getPeerDescription());
-        }
-    }
-
-    @Test
-    public void testFormulateDestinationListForInputPorts() throws IOException {
-        final ClusterNodeInformation clusterNodeInfo = new ClusterNodeInformation();
-        final List collection = new ArrayList<>();
-        collection.add(new NodeInformation("ShouldGetMedium", 1, , true, 4096));
-        collection.add(new NodeInformation("ShouldGetLittle", 2, , true, 10240));
-        collection.add(new NodeInformation("ShouldGetLots", 3, , true, 1024));
-        collection.add(new NodeInformation("ShouldGetMedium", 4, , true, 4096));
-        collection.add(new NodeInformation("ShouldGetMedium", 5, , true, 4096));
-
-        clusterNodeInfo.setNodeInformation(collection);
-        final List destinations = EndpointConnectionPool.formulateDestinationList(clusterNodeInfo, TransferDirection.SEND);
-        for (final PeerStatus peerStatus : destinations) {
-            System.out.println(peerStatus.getPeerDescription());
-        }
-    }
-
-    @Test
-    public void testFormulateDestinationListForInputPortsHugeDifference() throws IOException {
-        final ClusterNodeInformation clusterNodeInfo = new ClusterNodeInformation();
-        final List collection = new ArrayList<>();
-        collection.add(new
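The deleted tests above encode the load-balancing idea that moved into the new PeerSelector: when pulling data (TransferDirection.RECEIVE), peers with more queued FlowFiles should appear more often in the destination list, and when pushing (SEND), less-loaded peers should be favored. A standalone sketch of that weighting idea follows; the class and method names are hypothetical and inferred from the test names, not NiFi's actual implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: distribute 100 "slots" across peers in proportion
// (or inverse proportion) to how many FlowFiles each has queued.
class PeerWeighting {

    public static Map<String, Integer> weigh(final Map<String, Integer> queued, final boolean pulling) {
        double total = 0;
        final Map<String, Double> raw = new LinkedHashMap<>();
        for (final Map.Entry<String, Integer> e : queued.entrySet()) {
            // pulling: weight by queue size (more data available to pull);
            // pushing: weight by the inverse (send more to less-loaded peers)
            final double w = pulling ? e.getValue() : 1.0 / Math.max(1, e.getValue());
            raw.put(e.getKey(), w);
            total += w;
        }
        final Map<String, Integer> slots = new LinkedHashMap<>();
        for (final Map.Entry<String, Double> e : raw.entrySet()) {
            slots.put(e.getKey(), (int) Math.round(100 * e.getValue() / total));
        }
        return slots;
    }
}
```

With the 500-vs-5 "huge difference" inputs from the deleted test, almost all pulls go to the heavily loaded peer, while almost all pushes go to the lightly loaded one.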
[01/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
Repository: nifi
Updated Branches:
  refs/heads/master a5fecda5a -> c120c4982

http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/test/java/org/apache/nifi/web/api/TestSiteToSiteResource.java
--
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/test/java/org/apache/nifi/web/api/TestSiteToSiteResource.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/test/java/org/apache/nifi/web/api/TestSiteToSiteResource.java
new file mode 100644
index 000..6457067
--- /dev/null
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-web/nifi-web-api/src/test/java/org/apache/nifi/web/api/TestSiteToSiteResource.java
@@ -0,0 +1,516 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.web.api;
+
+import org.apache.nifi.remote.HttpRemoteSiteListener;
+import org.apache.nifi.remote.Peer;
+import org.apache.nifi.remote.RootGroupPort;
+import org.apache.nifi.remote.VersionNegotiator;
+import org.apache.nifi.remote.exception.HandshakeException;
+import org.apache.nifi.remote.io.http.HttpServerCommunicationsSession;
+import org.apache.nifi.remote.protocol.ResponseCode;
+import org.apache.nifi.remote.protocol.http.HttpFlowFileServerProtocol;
+import org.apache.nifi.remote.protocol.http.HttpHeaders;
+import org.apache.nifi.util.NiFiProperties;
+import org.apache.nifi.web.NiFiServiceFacade;
+import org.apache.nifi.web.api.dto.ControllerDTO;
+import org.apache.nifi.web.api.entity.ControllerEntity;
+import org.apache.nifi.web.api.entity.PeersEntity;
+import org.apache.nifi.web.api.entity.TransactionResultEntity;
+import org.apache.nifi.web.api.request.ClientIdParameter;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+import javax.servlet.ServletContext;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+import javax.ws.rs.core.Response;
+import javax.ws.rs.core.StreamingOutput;
+import javax.ws.rs.core.UriBuilder;
+import javax.ws.rs.core.UriInfo;
+import java.io.InputStream;
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.net.URL;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNull;
+import static org.junit.Assert.assertTrue;
+import static org.mockito.Matchers.any;
+import static org.mockito.Matchers.eq;
+import static org.mockito.Mockito.doAnswer;
+import static org.mockito.Mockito.doReturn;
+import static org.mockito.Mockito.doThrow;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.spy;
+
+public class TestSiteToSiteResource {
+
+    @BeforeClass
+    public static void setup() throws Exception {
+        final URL resource = TestSiteToSiteResource.class.getResource("/site-to-site/nifi.properties");
+        final String propertiesFile = resource.toURI().getPath();
+        System.setProperty(NiFiProperties.PROPERTIES_FILE_PATH, propertiesFile);
+    }
+
+    @Test
+    public void testGetControllerForOlderVersion() throws Exception {
+        final HttpServletRequest req = mock(HttpServletRequest.class);
+        final NiFiServiceFacade serviceFacade = mock(NiFiServiceFacade.class);
+        final ControllerEntity controllerEntity = new ControllerEntity();
+        final ControllerDTO controller = new ControllerDTO();
+        controllerEntity.setController(controller);
+
+        controller.setRemoteSiteHttpListeningPort(8080);
+        controller.setRemoteSiteListeningPort(9990);
+
+        doReturn(controller).when(serviceFacade).getController();
+
+        final SiteToSiteResource resource = new SiteToSiteResource();
+        resource.setProperties(NiFiProperties.getInstance());
+        resource.setServiceFacade(serviceFacade);
+        final Response response = resource.getController(req);
+
+        ControllerEntity resultEntity = (ControllerEntity) response.getEntity();
+
+        assertEquals(200, response.getStatus());
+        assertNull("remoteSiteHttpListeningPort should be null since older version doesn't recognize this field" +
[03/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/protocol/socket/SocketFlowFileServerProtocol.java
--
diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/protocol/socket/SocketFlowFileServerProtocol.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/protocol/socket/SocketFlowFileServerProtocol.java
index 22ca29f..a2a7223 100644
--- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/protocol/socket/SocketFlowFileServerProtocol.java
+++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/protocol/socket/SocketFlowFileServerProtocol.java
@@ -16,106 +16,50 @@
  */
 package org.apache.nifi.remote.protocol.socket;
 
-import java.io.DataInputStream;
-import java.io.DataOutputStream;
-import java.io.IOException;
-import java.io.InputStream;
-import java.io.OutputStream;
-import java.net.InetAddress;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.Map;
-import java.util.Set;
-import java.util.UUID;
-import java.util.concurrent.TimeUnit;
-import java.util.zip.CRC32;
-import java.util.zip.CheckedInputStream;
-import java.util.zip.CheckedOutputStream;
-
-import org.apache.nifi.connectable.Connection;
-import org.apache.nifi.connectable.Port;
-import org.apache.nifi.flowfile.FlowFile;
-import org.apache.nifi.flowfile.attributes.CoreAttributes;
-import org.apache.nifi.groups.ProcessGroup;
-import org.apache.nifi.processor.ProcessContext;
-import org.apache.nifi.processor.ProcessSession;
-import org.apache.nifi.processor.Relationship;
-import org.apache.nifi.processor.io.InputStreamCallback;
 import org.apache.nifi.remote.Peer;
-import org.apache.nifi.remote.PortAuthorizationResult;
 import org.apache.nifi.remote.RemoteResourceFactory;
-import org.apache.nifi.remote.RootGroupPort;
 import org.apache.nifi.remote.StandardVersionNegotiator;
 import org.apache.nifi.remote.VersionNegotiator;
-import org.apache.nifi.remote.cluster.NodeInformant;
 import org.apache.nifi.remote.codec.FlowFileCodec;
 import org.apache.nifi.remote.exception.HandshakeException;
 import org.apache.nifi.remote.exception.ProtocolException;
-import org.apache.nifi.remote.io.CompressionInputStream;
-import org.apache.nifi.remote.io.CompressionOutputStream;
+import org.apache.nifi.remote.protocol.AbstractFlowFileServerProtocol;
 import org.apache.nifi.remote.protocol.CommunicationsSession;
-import org.apache.nifi.remote.protocol.DataPacket;
+import org.apache.nifi.remote.protocol.HandshakenProperties;
 import org.apache.nifi.remote.protocol.RequestType;
-import org.apache.nifi.remote.protocol.ServerProtocol;
-import org.apache.nifi.remote.util.StandardDataPacket;
-import org.apache.nifi.util.FormatUtils;
+import org.apache.nifi.remote.protocol.ResponseCode;
 import org.apache.nifi.util.NiFiProperties;
-import org.apache.nifi.util.StopWatch;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
-public class SocketFlowFileServerProtocol implements ServerProtocol {
-
-    public static final String RESOURCE_NAME = "SocketFlowFileProtocol";
-    private ProcessGroup rootGroup;
-    private String commsIdentifier;
-    private boolean handshakeCompleted;
+import java.io.DataInputStream;
+import java.io.DataOutputStream;
+import java.io.IOException;
+import java.net.InetAddress;
+import java.util.HashMap;
+import java.util.Map;
-    private Boolean useGzip;
-    private long requestExpirationMillis;
-    private RootGroupPort port;
-    private boolean shutdown = false;
-    private FlowFileCodec negotiatedFlowFileCodec = null;
-    private String transitUriPrefix = null;
+public class SocketFlowFileServerProtocol extends AbstractFlowFileServerProtocol {
-    private int requestedBatchCount = 0;
-    private long requestedBatchBytes = 0L;
-    private long requestedBatchNanos = 0L;
-    private static final long DEFAULT_BATCH_NANOS = TimeUnit.SECONDS.toNanos(5L);
+    public static final String RESOURCE_NAME = "SocketFlowFileProtocol";
     private final VersionNegotiator versionNegotiator = new StandardVersionNegotiator(5, 4, 3, 2, 1);
-    private final Logger logger = LoggerFactory.getLogger(SocketFlowFileServerProtocol.class);
 
     @Override
-    public void setRootProcessGroup(final ProcessGroup group) {
-        if (!group.isRootGroup()) {
-            throw new IllegalArgumentException();
-        }
-        this.rootGroup = group;
-    }
+    protected HandshakenProperties doHandshake(Peer peer) throws IOException, HandshakeException {
-
-    @Override
-    public void handshake(final Peer peer) throws IOException, HandshakeException {
-        if (handshakeCompleted)
[09/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/client/socket/EndpointConnectionPool.java -- diff --git a/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/client/socket/EndpointConnectionPool.java b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/client/socket/EndpointConnectionPool.java index fa35f28..8a6a91f 100644 --- a/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/client/socket/EndpointConnectionPool.java +++ b/nifi-commons/nifi-site-to-site-client/src/main/java/org/apache/nifi/remote/client/socket/EndpointConnectionPool.java @@ -16,29 +16,43 @@ */ package org.apache.nifi.remote.client.socket; -import java.io.BufferedReader; +import org.apache.nifi.events.EventReporter; +import org.apache.nifi.remote.Peer; +import org.apache.nifi.remote.PeerDescription; +import org.apache.nifi.remote.PeerStatus; +import org.apache.nifi.remote.RemoteDestination; +import org.apache.nifi.remote.RemoteResourceInitiator; +import org.apache.nifi.remote.TransferDirection; +import org.apache.nifi.remote.client.PeerSelector; +import org.apache.nifi.remote.client.PeerStatusProvider; +import org.apache.nifi.remote.client.SiteInfoProvider; +import org.apache.nifi.remote.client.SiteToSiteClientConfig; +import org.apache.nifi.remote.codec.FlowFileCodec; +import org.apache.nifi.remote.exception.HandshakeException; +import org.apache.nifi.remote.exception.PortNotRunningException; +import org.apache.nifi.remote.exception.TransmissionDisabledException; +import org.apache.nifi.remote.exception.UnknownPortException; +import org.apache.nifi.remote.io.socket.SocketChannelCommunicationsSession; +import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannel; +import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannelCommunicationsSession; +import org.apache.nifi.remote.protocol.CommunicationsSession; +import 
org.apache.nifi.remote.protocol.socket.SocketClientProtocol; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import javax.net.ssl.SSLContext; import java.io.DataInputStream; import java.io.DataOutputStream; import java.io.File; -import java.io.FileInputStream; -import java.io.FileOutputStream; import java.io.IOException; -import java.io.InputStream; -import java.io.InputStreamReader; -import java.io.OutputStream; import java.net.InetSocketAddress; import java.net.URI; -import java.net.URISyntaxException; import java.nio.channels.SocketChannel; -import java.nio.charset.StandardCharsets; import java.security.cert.CertificateException; import java.util.ArrayList; -import java.util.Collection; import java.util.Collections; -import java.util.HashMap; import java.util.HashSet; import java.util.List; -import java.util.Map; import java.util.Objects; import java.util.Set; import java.util.concurrent.BlockingQueue; @@ -49,125 +63,47 @@ import java.util.concurrent.LinkedBlockingQueue; import java.util.concurrent.ScheduledExecutorService; import java.util.concurrent.ThreadFactory; import java.util.concurrent.TimeUnit; -import java.util.concurrent.atomic.AtomicLong; -import java.util.concurrent.locks.Lock; -import java.util.concurrent.locks.ReadWriteLock; -import java.util.concurrent.locks.ReentrantLock; -import java.util.concurrent.locks.ReentrantReadWriteLock; -import java.util.regex.Pattern; -import javax.net.ssl.SSLContext; -import org.apache.nifi.events.EventReporter; -import org.apache.nifi.remote.Peer; -import org.apache.nifi.remote.PeerDescription; -import org.apache.nifi.remote.PeerStatus; -import org.apache.nifi.remote.RemoteDestination; -import org.apache.nifi.remote.RemoteResourceInitiator; -import org.apache.nifi.remote.TransferDirection; -import org.apache.nifi.remote.client.SiteToSiteClientConfig; -import org.apache.nifi.remote.cluster.ClusterNodeInformation; -import org.apache.nifi.remote.cluster.NodeInformation; -import 
org.apache.nifi.remote.codec.FlowFileCodec; -import org.apache.nifi.remote.exception.HandshakeException; -import org.apache.nifi.remote.exception.PortNotRunningException; -import org.apache.nifi.remote.exception.TransmissionDisabledException; -import org.apache.nifi.remote.exception.UnknownPortException; -import org.apache.nifi.remote.io.socket.SocketChannelCommunicationsSession; -import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannel; -import org.apache.nifi.remote.io.socket.ssl.SSLSocketChannelCommunicationsSession; -import org.apache.nifi.remote.protocol.CommunicationsSession; -import org.apache.nifi.remote.protocol.socket.SocketClientProtocol; -import org.apache.nifi.remote.util.NiFiRestApiUtil; -import org.apache.nifi.remote.util.PeerStatusCache; -import org.apache.nifi.reporting.Severity; -import org.apache.nifi.stream.io.BufferedOutputStream; -import org.apache.nifi.web.api.dto.ControllerDTO; -import
[05/10] nifi git commit: NIFI-1857: HTTPS Site-to-Site
http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedConnectionResponse.java -- diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedConnectionResponse.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedConnectionResponse.java index a4eb46e..9a53a72 100644 --- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedConnectionResponse.java +++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedConnectionResponse.java @@ -34,6 +34,7 @@ public class AdaptedConnectionResponse { private String rejectionReason; private int tryLaterSeconds; private Integer managerRemoteInputPort; +private Integer managerRemoteInputHttpPort; private Boolean managerRemoteCommsSecure; private String instanceId; private List nodeStatuses; @@ -88,6 +89,14 @@ public class AdaptedConnectionResponse { return managerRemoteInputPort; } +public void setManagerRemoteInputHttpPort(Integer managerRemoteInputHttpPort) { +this.managerRemoteInputHttpPort = managerRemoteInputHttpPort; +} + +public Integer getManagerRemoteInputHttpPort() { +return managerRemoteInputHttpPort; +} + public void setManagerRemoteCommsSecure(Boolean secure) { this.managerRemoteCommsSecure = secure; } http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedNodeIdentifier.java -- diff --git 
a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedNodeIdentifier.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedNodeIdentifier.java index beca014..a2d9968 100644 --- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedNodeIdentifier.java +++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/AdaptedNodeIdentifier.java @@ -27,6 +27,8 @@ public class AdaptedNodeIdentifier { private int socketPort; private String siteToSiteAddress; private Integer siteToSitePort; +private Integer siteToSiteHttpApiPort; + private boolean siteToSiteSecure; public AdaptedNodeIdentifier() { @@ -96,4 +98,13 @@ public class AdaptedNodeIdentifier { public void setSiteToSiteSecure(boolean siteToSiteSecure) { this.siteToSiteSecure = siteToSiteSecure; } + +public Integer getSiteToSiteHttpApiPort() { +return siteToSiteHttpApiPort; +} + +public void setSiteToSiteHttpApiPort(Integer siteToSiteHttpApiPort) { +this.siteToSiteHttpApiPort = siteToSiteHttpApiPort; +} + } http://git-wip-us.apache.org/repos/asf/nifi/blob/c120c498/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/ConnectionResponseAdapter.java -- diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/ConnectionResponseAdapter.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/ConnectionResponseAdapter.java index ca98a86..cf64e71 
100644 --- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/ConnectionResponseAdapter.java +++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-framework-cluster-protocol/src/main/java/org/apache/nifi/cluster/protocol/jaxb/message/ConnectionResponseAdapter.java @@ -32,6 +32,7 @@ public class ConnectionResponseAdapter extends
[jira] [Created] (NIFI-1991) Zero Master Cluster: Modifying a disconnected node causes other nodes to not be able to join cluster
Andrew Lim created NIFI-1991: Summary: Zero Master Cluster: Modifying a disconnected node causes other nodes to not be able to join cluster Key: NIFI-1991 URL: https://issues.apache.org/jira/browse/NIFI-1991 Project: Apache NiFi Issue Type: Bug Components: Core Framework Reporter: Andrew Lim Priority: Blocker Fix For: 1.0.0 I created a zero master cluster with 3 nodes. All three were up and running. I stopped all 3 instances and then brought up just Node1. In the Node1 UI, I made some edits to the flow. I attempted but was not able to bring up Node2 and Node3 to join the cluster. Here is what I saw in the logs: 2016-06-09 14:35:47,051 WARN [main] org.apache.nifi.web.server.JettyServer Failed to start web server... shutting down. java.lang.Exception: Unable to load flow due to: java.io.IOException: org.apache.nifi.controller.UninheritableFlowException: Failed to connect node to cluster because local flow is different than cluster flow. at org.apache.nifi.web.server.JettyServer.start(JettyServer.java:753) ~[nifi-jetty-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.NiFi.(NiFi.java:137) [nifi-runtime-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.NiFi.main(NiFi.java:227) [nifi-runtime-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] Caused by: java.io.IOException: org.apache.nifi.controller.UninheritableFlowException: Failed to connect node to cluster because local flow is different than cluster flow. at org.apache.nifi.controller.StandardFlowService.load(StandardFlowService.java:501) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.web.server.JettyServer.start(JettyServer.java:744) ~[nifi-jetty-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] ... 2 common frames omitted Caused by: org.apache.nifi.controller.UninheritableFlowException: Failed to connect node to cluster because local flow is different than cluster flow. 
at org.apache.nifi.controller.StandardFlowService.loadFromConnectionResponse(StandardFlowService.java:862) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.controller.StandardFlowService.load(StandardFlowService.java:497) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] ... 3 common frames omitted Caused by: org.apache.nifi.controller.UninheritableFlowException: Proposed configuration is not inheritable by the flow controller because of flow differences: Found difference in Flows: Local Fingerprint contains additional configuration from Cluster Fingerprint: eba77dac-32ad-356b-8a82-f669d046aa21eba77dac-32ad-356b-8a82-f669d046aa21org.apache.nifi.processors.standard.LogAttributeNO_VALUEAttributes to IgnoreNO_VALUEAttributes to LogNO_VALUELog prefixNO_VALUEs at org.apache.nifi.controller.StandardFlowSynchronizer.sync(StandardFlowSynchronizer.java:219) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.controller.FlowController.synchronize(FlowController.java:1329) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.persistence.StandardXMLFlowConfigurationDAO.load(StandardXMLFlowConfigurationDAO.java:75) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.controller.StandardFlowService.loadFromBytes(StandardFlowService.java:668) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] at org.apache.nifi.controller.StandardFlowService.loadFromConnectionResponse(StandardFlowService.java:839) ~[nifi-framework-core-1.0.0-SNAPSHOT.jar:1.0.0-SNAPSHOT] ... 4 common frames omitted 2016-06-09 14:35:47,052 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server... -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-826) Export templates in a deterministic way
[ https://issues.apache.org/jira/browse/NIFI-826?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15323117#comment-15323117 ] ASF GitHub Bot commented on NIFI-826: - GitHub user olegz commented on the issue: https://github.com/apache/nifi/pull/515 @mcgilman so, this is the initial commit that essentially demonstrates the approach discussed in the JIRA. Basically, the new ID _inceptionId_ is generated once and is immutable and perpetual. With such a contract, the serialization of the components is now deterministic. There is an initial test that demonstrates this, but I'll be adding more. Let's find time to discuss it and also see whether we need to address purging of the elements that do not belong there as part of this effort or a separate one. > Export templates in a deterministic way > --- > > Key: NIFI-826 > URL: https://issues.apache.org/jira/browse/NIFI-826 > Project: Apache NiFi > Issue Type: Improvement > Components: Core Framework >Reporter: Matt Gilman >Assignee: Oleg Zhurakousky > Fix For: 1.0.0 > > > Templates should be exported in a deterministic way so that they can be > compared or diff'ed with another. Items to consider... > - The ordering of components > - The id's used to identify the components > - Consider excluding irrelevant items. When components are imported some > settings are ignored (run state). -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-826) Export templates in a deterministic way
[ https://issues.apache.org/jira/browse/NIFI-826?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15323108#comment-15323108 ] ASF GitHub Bot commented on NIFI-826: - GitHub user olegz opened a pull request: https://github.com/apache/nifi/pull/515 NIFI-826 [REVIEW ONLY] Initial commit for deterministic templates - Added _inceptionId_ via InceptionAware class which is the super class of ComponentDTO and TemplateDTO and is also comparable - The 'inceptionId' is also a Type 1 UUID which is comparable You can merge this pull request into a Git repository by running: $ git pull https://github.com/olegz/nifi NIFI-826 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/515.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #515 commit 79421a081234695f576e3a388365f1505aa1d2c7 Author: Oleg Zhurakousky Date: 2016-06-09T18:42:18Z NIFI-826 [REVIEW ONLY] Initial commit for deterministic templates - Added _inceptionId_ via InceptionAware class which is the super class of ComponentDTO and TemplateDTO and is also comparable - The 'inceptionId' is also a Type 1 UUID which is comparable > Export templates in a deterministic way > --- > > Key: NIFI-826 > URL: https://issues.apache.org/jira/browse/NIFI-826 > Project: Apache NiFi > Issue Type: Improvement > Components: Core Framework >Reporter: Matt Gilman >Assignee: Oleg Zhurakousky > Fix For: 1.0.0 > > > Templates should be exported in a deterministic way so that they can be > compared or diff'ed with another. Items to consider... > - The ordering of components > - The id's used to identify the components > - Consider excluding irrelevant items. When components are imported some > settings are ignored (run state). -- This message was sent by Atlassian JIRA (v6.3.4#6332)
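The deterministic-ordering idea in the PR above rests on a property of Type 1 UUIDs: they embed a 60-bit timestamp, so two of them can be compared by creation time. The sketch below illustrates only that property; the class and helper names are ours, not from the PR, and the `v1FromTimestamp` builder exists purely so `UUID.timestamp()` is usable on a hand-made value.

```java
import java.util.Comparator;
import java.util.UUID;

// Illustrative sketch (hypothetical names, not from PR #515): a Type 1 UUID
// embeds a 60-bit timestamp, so inceptionIds can be ordered deterministically.
public class InceptionIdOrdering {

    // Build a version-1 UUID from a 60-bit timestamp. Clock-sequence and node
    // fields are zeroed; the variant bits are set so the UUID is well-formed.
    public static UUID v1FromTimestamp(long ts60) {
        long timeLow = ts60 & 0xFFFFFFFFL;
        long timeMid = (ts60 >>> 32) & 0xFFFFL;
        long timeHi  = (ts60 >>> 48) & 0x0FFFL;
        long msb = (timeLow << 32) | (timeMid << 16) | 0x1000L | timeHi;
        return new UUID(msb, 0x8000000000000000L);
    }

    // Orders time-based UUIDs by their embedded timestamp. Note that
    // UUID.timestamp() throws UnsupportedOperationException for any UUID
    // that is not version 1.
    public static final Comparator<UUID> BY_TIMESTAMP =
            Comparator.comparingLong(UUID::timestamp);
}
```

Because the timestamp is monotone per generator, sorting components by such an ID yields a stable serialization order regardless of which node exported the template.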
[jira] [Commented] (NIFI-1982) Compressed check box in Configure Remote Port UI is not used
[ https://issues.apache.org/jira/browse/NIFI-1982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15323049#comment-15323049 ] ASF GitHub Bot commented on NIFI-1982: -- Github user bbende commented on the issue: https://github.com/apache/nifi/pull/509 @ijokarumawak you can close this when you get a chance since it has been merged, thanks! > Compressed check box in Configure Remote Port UI is not used > > > Key: NIFI-1982 > URL: https://issues.apache.org/jira/browse/NIFI-1982 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework, Core UI >Affects Versions: 0.6.1 >Reporter: Koji Kawamura >Assignee: Koji Kawamura > Fix For: 0.7.0 > > Attachments: configure-remote-port.png > > > RIght click Remote Process Group -> Remote Ports -> (Edit icon) -> Configure > Remote Port setting window > It has "Compressed" checkbox. However if user checked the UI component, > RemoteGroupPort doesn't use it to enable compression of Site-to-Site client. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Created] (NIFI-1990) Implement consistent security controls for cluster, site-to-site, and API communications
Andy LoPresto created NIFI-1990: --- Summary: Implement consistent security controls for cluster, site-to-site, and API communications Key: NIFI-1990 URL: https://issues.apache.org/jira/browse/NIFI-1990 Project: Apache NiFi Issue Type: Bug Components: Core Framework Affects Versions: 0.6.1 Reporter: Andy LoPresto Assignee: Andy LoPresto Priority: Critical Fix For: 1.0.0 As discovered in [NIFI-1981], edge cases in configuration of cluster communications over TLS without client authentication caused errors in the application. We should provide a consistent experience, from documentation to configuration to execution: * Machine-to-machine communication should have two settings -- plaintext or TLS with mutual authentication. ** Cluster ** Site to Site * The API / UI should allow more granular control -- plaintext, TLS with server authentication only, or TLS with mutual authentication. Some clients (API consumers, users in an enterprise environment) may have client certificates, but the majority will not; TLS authentication of the server, along with data integrity and confidentiality assurances, should still be available. ** Site to site over the API (see [NIFI-1857]) will respect this setting for the TLS handshake negotiation, but will manually enforce the presence of a client certificate in an HTTP header on any request arriving over HTTPS. The {{nifi.security.needClientAuth}} setting should be removed from nifi.properties. A new setting {{nifi.security.api.needClientAuth}} will be added, and documented to explicitly apply only to the API (and, by extension, Web UI). -- This message was sent by Atlassian JIRA (v6.3.4#6332)
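The manual enforcement described above -- a client-certificate header required only when the request arrived over HTTPS -- can be reduced to a small predicate. This is a hedged sketch with hypothetical names, not NiFi's actual implementation; the header name and the decision shape are assumptions for illustration only.

```java
// Hedged sketch (hypothetical names, not NiFi's implementation): site-to-site
// over the API accepts a plaintext request as-is, but a request arriving over
// HTTPS must carry a non-empty client-certificate DN header.
public class ClientCertEnforcement {

    // Assumed header name, for illustration only.
    public static final String CLIENT_DN_HEADER = "X-ProxiedEntitiesChain";

    public static boolean isAcceptable(boolean arrivedOverHttps, String clientDnHeaderValue) {
        if (!arrivedOverHttps) {
            return true; // plaintext path: nothing to enforce at this layer
        }
        // HTTPS path: the TLS handshake may not have required a client cert,
        // so presence of the DN header is enforced manually per request.
        return clientDnHeaderValue != null && !clientDnHeaderValue.trim().isEmpty();
    }
}
```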
[jira] [Commented] (NIFI-1989) Need to align commons-io artifact group as per MVNCENTRAL-244
[ https://issues.apache.org/jira/browse/NIFI-1989?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15323002#comment-15323002 ] ASF GitHub Bot commented on NIFI-1989: -- GitHub user PuspenduBanerjee opened a pull request: https://github.com/apache/nifi/pull/514 fixes NIFI-1989 Fixes : https://issues.sonatype.org/browse/MVNCENTRAL-244 > Taking a look at the POM at http://repo1.maven.org/maven2/org/apache/commons/commons-io/1.3.2/commons-io-1.3.2.pom, the groupId and artifactId are different from the deploy path and it seems the identical artifacts are available at http://repo1.maven.org/maven2/commons-io/commons-io/1.3.2/ Having the same classes available under two different GAV coordinates may lead to a lot of problems should one decide to update the component to a new version. I see a lot of old builds could break when we delete from the old coordinates, but what about a (correct) relocation POM? Builds would continue to work and the mishap would appear on the screen. You can merge this pull request into a Git repository by running: $ git pull https://github.com/PuspenduBanerjee/nifi NIFI-1989 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/nifi/pull/514.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #514 commit e25bdd7b2b455728fb14ab7d0cd9a2c76c271aad Author: Puspendu Banerjee Date: 2016-06-09T18:00:57Z fixes NIFI-1989 > Need to align commons-io artifact group as per MVNCENTRAL-244 > - > > Key: NIFI-1989 > URL: https://issues.apache.org/jira/browse/NIFI-1989 > Project: Apache NiFi > Issue Type: Improvement > Components: Tools and Build >Affects Versions: 1.0.0 >Reporter: Puspendu Banerjee >Priority: Minor > Labels: artifact, dependency, maven > > [WARNING] While downloading org.apache.commons:commons-io:1.3.2 > This artifact has been relocated to commons-io:commons-io:1.3.2. 
> https://issues.sonatype.org/browse/MVNCENTRAL-244 -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Created] (NIFI-1989) Need to align commons-io artifact group as per MVNCENTRAL-244
Puspendu Banerjee created NIFI-1989: --- Summary: Need to align commons-io artifact group as per MVNCENTRAL-244 Key: NIFI-1989 URL: https://issues.apache.org/jira/browse/NIFI-1989 Project: Apache NiFi Issue Type: Improvement Components: Tools and Build Affects Versions: 1.0.0 Reporter: Puspendu Banerjee Priority: Minor [WARNING] While downloading org.apache.commons:commons-io:1.3.2 This artifact has been relocated to commons-io:commons-io:1.3.2. https://issues.sonatype.org/browse/MVNCENTRAL-244 -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1895) PutHBaseJSON processor treats all Values as Strings
[ https://issues.apache.org/jira/browse/NIFI-1895?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322939#comment-15322939 ] Ryan Templeton commented on NIFI-1895: -- Thanks Pierre, I wasn't sure what to set the status to. > PutHBaseJSON processor treats all Values as Strings > --- > > Key: NIFI-1895 > URL: https://issues.apache.org/jira/browse/NIFI-1895 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Affects Versions: 0.6.1 >Reporter: Ryan Templeton > > line 184 of PutHBaseJSON.java treats all JsonNode values as strings by > calling the .asText() method. We are working with using this processor to > load IoT time series data and this causes issues in HBase with > timestamps/numerics not getting sorted correctly. > The operator should inspect the node value to determine type and convert as > such. > Numeric integral - Long (assumes widest type) > Numeric not integral - Double (assumes widest type) > Logical - Boolean > everything else (including current Complex Type logic) - String -- This message was sent by Atlassian JIRA (v6.3.4#6332)
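The type-widening rules listed in the issue body above (integral to Long, non-integral to Double, logical to Boolean, everything else String) can be sketched without pulling in the Jackson dependency. The helper below is hypothetical and is not the PutHBaseJSON patch itself; it only demonstrates the proposed widening logic on raw token text.

```java
// Hedged sketch of the widening rules proposed in NIFI-1895 (hypothetical
// helper, not the actual PutHBaseJSON change): integral numbers widen to Long,
// non-integral numbers to Double, logical values to Boolean, and everything
// else (including complex types) falls back to String.
public class JsonValueWidening {

    public static Object widen(String raw, boolean isJsonString) {
        if (isJsonString) {
            return raw; // quoted JSON values stay Strings
        }
        if ("true".equals(raw) || "false".equals(raw)) {
            return Boolean.valueOf(raw); // logical -> Boolean
        }
        try {
            return Long.valueOf(raw);    // numeric integral -> widest integral type
        } catch (NumberFormatException ignored) {
        }
        try {
            return Double.valueOf(raw);  // numeric non-integral -> widest floating type
        } catch (NumberFormatException ignored) {
        }
        return raw; // complex types and anything unparseable remain Strings
    }
}
```

Widening to the widest type matters for HBase because its byte-wise row/value ordering only sorts numerics correctly when every cell of a column is serialized from the same fixed-width type.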
[jira] [Reopened] (NIFI-1895) PutHBaseJSON processor treats all Values as Strings
[ https://issues.apache.org/jira/browse/NIFI-1895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Pierre Villard reopened NIFI-1895: -- > PutHBaseJSON processor treats all Values as Strings > --- > > Key: NIFI-1895 > URL: https://issues.apache.org/jira/browse/NIFI-1895 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Affects Versions: 0.6.1 >Reporter: Ryan Templeton > > line 184 of PutHBaseJSON.java treats all JsonNode values as strings by > calling the .asText() method. We are working with using this processor to > load IoT time series data and this causes issues in HBase with > timestamps/numerics not getting sorted correctly. > The operator should inspect the node value to determine type and convert as > such. > Numeric integral - Long (assumes widest type) > Numeric not integral - Double (assumes widest type) > Logical - Boolean > everything else (including current Complex Type logic) - String -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1975) Processor to Parse .evtx files
[ https://issues.apache.org/jira/browse/NIFI-1975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322860#comment-15322860 ] ASF GitHub Bot commented on NIFI-1975: -- Github user brosander closed the pull request at: https://github.com/apache/nifi/pull/492 > Processor to Parse .evtx files > -- > > Key: NIFI-1975 > URL: https://issues.apache.org/jira/browse/NIFI-1975 > Project: Apache NiFi > Issue Type: Sub-task >Reporter: Bryan Rosander > > Windows event logs are stored in .evtx format as-of Windows Vista. If we > port the pure python implementation of an evtx parser at > https://github.com/williballenthin/python-evtx to Java, we should be able to > ingest those files in NiFi on any operating system > These files are located in C:\Windows\System32\winevt\Logs unless exported > elsewhere. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1975) Processor to Parse .evtx files
[ https://issues.apache.org/jira/browse/NIFI-1975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322859#comment-15322859 ] ASF GitHub Bot commented on NIFI-1975: -- Github user brosander commented on the issue: https://github.com/apache/nifi/pull/492 changes merged upstream > Processor to Parse .evtx files > -- > > Key: NIFI-1975 > URL: https://issues.apache.org/jira/browse/NIFI-1975 > Project: Apache NiFi > Issue Type: Sub-task >Reporter: Bryan Rosander > > Windows event logs are stored in .evtx format as-of Windows Vista. If we > port the pure python implementation of an evtx parser at > https://github.com/williballenthin/python-evtx to Java, we should be able to > ingest those files in NiFi on any operating system > These files are located in C:\Windows\System32\winevt\Logs unless exported > elsewhere. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Resolved] (NIFI-1895) PutHBaseJSON processor treats all Values as Strings
[ https://issues.apache.org/jira/browse/NIFI-1895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Ryan Templeton resolved NIFI-1895. -- Resolution: Fixed Pull request issued from rtempleton/nifi NIFI-1895 branch > PutHBaseJSON processor treats all Values as Strings > --- > > Key: NIFI-1895 > URL: https://issues.apache.org/jira/browse/NIFI-1895 > Project: Apache NiFi > Issue Type: Improvement > Components: Extensions >Affects Versions: 0.6.1 >Reporter: Ryan Templeton > > line 184 of PutHBaseJSON.java treats all JsonNode values as strings by > calling the .asText() method. We are working with using this processor to > load IoT time series data and this causes issues in HBase with > timestamps/numerics not getting sorted correctly. > The operator should inspect the node value to determine type and convert as > such. > Numeric integral - Long (assumes widest type) > Numeric not integral - Double (assumes widest type) > Logical - Boolean > everything else (including current Complex Type logic) - String -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322808#comment-15322808 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66475348 --- Diff: nifi-api/src/main/java/org/apache/nifi/registry/FileVariableRegistry.java --- @@ -0,0 +1,70 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one or more + * contributor license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright ownership. + * The ASF licenses this file to You under the Apache License, Version 2.0 + * (the "License"); you may not use this file except in compliance with + * the License. You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package org.apache.nifi.registry; + +import java.io.File; +import java.io.IOException; +import java.nio.file.Path; +import java.util.Map; + + +public abstract class FileVariableRegistry extends MultiMapVariableRegistry { + +public FileVariableRegistry() { +super(); +} + +public FileVariableRegistry(File... files){ +super(); +addVariables(files); +} + +public FileVariableRegistry(Path... 
paths){ +super(); +addVariables(paths); +} + +@SuppressWarnings({"unchecked", "rawtypes"}) +public void addVariables(File ...files){ +if(files != null) { +for (final File file : files) { +try { +registry.addMap(convertFile(file)); +} catch (IOException iex) { +throw new IllegalArgumentException("A file provided was invalid.", iex); --- End diff -- After reviewing, I made the suggested change; removing `public` from the constructor to ensure factory-only creation made throwing the exception from the constructor much more palatable :). > Support Custom Properties in Expression Language > > > Key: NIFI-1974 > URL: https://issues.apache.org/jira/browse/NIFI-1974 > Project: Apache NiFi > Issue Type: New Feature >Reporter: Yolanda M. Davis >Assignee: Yolanda M. Davis > Fix For: 1.0.0 > > > Add a property in "nifi.properties" config file that allows users to specify a > list of custom properties files (containing data such as environment-specific > values, or sensitive values, etc.). The key/value pairs should be > loaded upon NiFi startup and available to processors for use in Expression > Language. > Optimally this will lay the groundwork for a UI-driven Variable Registry. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
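The factory-only creation pattern discussed in the review comment above can be sketched in isolation: keep the constructor private so file I/O failures surface from a static factory method rather than from a constructor. The class below is hypothetical (it is not NiFi's FileVariableRegistry) and uses plain java.util.Properties in place of the registry's map-conversion machinery.

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Hedged sketch of factory-only creation (hypothetical class, not NiFi's
// FileVariableRegistry): the private constructor cannot fail, and any I/O
// error is translated by the factory into an unchecked exception.
public final class PropertiesRegistry {

    private final Properties props = new Properties();

    private PropertiesRegistry() {
        // no I/O here; construction itself cannot fail
    }

    public static PropertiesRegistry fromFiles(File... files) {
        PropertiesRegistry registry = new PropertiesRegistry();
        if (files != null) {
            for (File file : files) {
                try (FileInputStream in = new FileInputStream(file)) {
                    registry.props.load(in);
                } catch (IOException iex) {
                    throw new IllegalArgumentException("A file provided was invalid: " + file, iex);
                }
            }
        }
        return registry;
    }

    public String getVariable(String name) {
        return props.getProperty(name);
    }
}
```

With creation funneled through `fromFiles`, callers never see a half-initialized instance, which is why the reviewer found the exception "much more palatable" once the constructor stopped being public.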
[5/6] nifi git commit: NIFI-1975 - Processor for parsing evtx files
http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/ChunkHeader.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/ChunkHeader.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/ChunkHeader.java
new file mode 100644
index 000..7f01adf
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/ChunkHeader.java
@@ -0,0 +1,199 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx.parser;
+
+import com.google.common.annotations.VisibleForTesting;
+import com.google.common.primitives.UnsignedInteger;
+import com.google.common.primitives.UnsignedLong;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processors.evtx.parser.bxml.NameStringNode;
+import org.apache.nifi.processors.evtx.parser.bxml.TemplateNode;
+
+import java.io.IOException;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.zip.CRC32;
+
+/**
+ * A Chunk is a self-contained group of templates, strings, and nodes
+ */
+public class ChunkHeader extends Block {
+    public static final String ELF_CHNK = "ElfChnk";
+    private final String magicString;
+    private final UnsignedLong fileFirstRecordNumber;
+    private final UnsignedLong fileLastRecordNumber;
+    private final UnsignedLong logFirstRecordNumber;
+    private final UnsignedLong logLastRecordNumber;
+    private final UnsignedInteger headerSize;
+    private final UnsignedInteger lastRecordOffset;
+    private final int nextRecordOffset;
+    private final UnsignedInteger dataChecksum;
+    private final String unused;
+    private final UnsignedInteger headerChecksum;
+    private final Map<Integer, NameStringNode> nameStrings;
+    private final Map<Integer, TemplateNode> templateNodes;
+    private final int chunkNumber;
+    private final ComponentLog log;
+    private UnsignedLong recordNumber;
+
+    public ChunkHeader(BinaryReader binaryReader, ComponentLog log, long headerOffset, int chunkNumber) throws IOException {
+        super(binaryReader, headerOffset);
+        this.log = log;
+        this.chunkNumber = chunkNumber;
+        CRC32 crc32 = new CRC32();
+        crc32.update(binaryReader.peekBytes(120));
+
+        magicString = binaryReader.readString(8);
+        fileFirstRecordNumber = binaryReader.readQWord();
+        fileLastRecordNumber = binaryReader.readQWord();
+        logFirstRecordNumber = binaryReader.readQWord();
+        logLastRecordNumber = binaryReader.readQWord();
+        headerSize = binaryReader.readDWord();
+        lastRecordOffset = binaryReader.readDWord();
+        nextRecordOffset = NumberUtil.intValueMax(binaryReader.readDWord(), Integer.MAX_VALUE, "Invalid next record offset.");
+        dataChecksum = binaryReader.readDWord();
+        unused = binaryReader.readString(68);
+
+        if (!ELF_CHNK.equals(magicString)) {
+            throw new IOException("Invalid magic string " + this);
+        }
+
+        headerChecksum = binaryReader.readDWord();
+
+        // These are included into the checksum
+        crc32.update(binaryReader.peekBytes(384));
+
+        if (crc32.getValue() != headerChecksum.longValue()) {
+            throw new IOException("Invalid checksum " + this);
+        }
+        if (lastRecordOffset.compareTo(UnsignedInteger.valueOf(Integer.MAX_VALUE)) > 0) {
+            throw new IOException("Last record offset too big to fit into signed integer");
+        }
+
+        nameStrings = new HashMap<>();
+        for (int i = 0; i < 64; i++) {
+            int offset = NumberUtil.intValueMax(binaryReader.readDWord(), Integer.MAX_VALUE, "Invalid offset.");
+            while (offset > 0) {
+                NameStringNode nameStringNode = new NameStringNode(new BinaryReader(binaryReader, offset), this);
+                nameStrings.put(offset, nameStringNode);
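The checksum handling above can be illustrated with a stdlib-only sketch: `java.util.zip.CRC32` is run over the two protected regions of the chunk header (the 120 bytes before the checksum field and the 384 bytes of offset tables after it, per the offsets in the code above) and compared with the stored value. The byte layout and class below are simplified stand-ins, not the bundle's actual `BinaryReader`-based implementation.

```java
import java.util.zip.CRC32;

// Simplified illustration of the ChunkHeader checksum check: CRC32 over the
// header bytes before the 4-byte checksum slot (offsets 0..119, slot at
// 124..127 per the field sizes read above) plus the 384 bytes after it.
class ChunkChecksum {
    static boolean headerChecksumValid(byte[] chunk, long storedChecksum) {
        CRC32 crc32 = new CRC32();
        crc32.update(chunk, 0, 120);   // fields before the checksum slot
        crc32.update(chunk, 128, 384); // string/template offset tables after it
        return crc32.getValue() == storedChecksum;
    }

    public static void main(String[] args) {
        byte[] chunk = new byte[1024]; // all-zero stand-in chunk
        CRC32 expected = new CRC32();
        expected.update(chunk, 0, 120);
        expected.update(chunk, 128, 384);
        System.out.println(headerChecksumValid(chunk, expected.getValue())); // prints "true"
    }
}
```

A mismatch is what makes the constructor above throw `IOException("Invalid checksum ...")` rather than parse a corrupted chunk.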
[2/6] nifi git commit: NIFI-1975 - Processor for parsing evtx files
http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CDataSectionNodeTest.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CDataSectionNodeTest.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CDataSectionNodeTest.java
new file mode 100644
index 000..118bdcc
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CDataSectionNodeTest.java
@@ -0,0 +1,60 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx.parser.bxml;
+
+import org.apache.nifi.processors.evtx.parser.BxmlNodeVisitor;
+import org.junit.Test;
+
+import java.io.IOException;
+
+import static org.junit.Assert.assertEquals;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.verifyNoMoreInteractions;
+
+public class CDataSectionNodeTest extends BxmlNodeWithTokenTestBase {
+    private String content;
+    private CDataSectionNode cDataSectionNode;
+
+    @Override
+    public void setup() throws IOException {
+        super.setup();
+        content = "cdata content";
+        testBinaryReaderBuilder.putWord(content.length() + 2);
+        testBinaryReaderBuilder.putWString(content);
+        cDataSectionNode = new CDataSectionNode(testBinaryReaderBuilder.build(), chunkHeader, parent);
+    }
+
+    @Override
+    protected byte getToken() {
+        return BxmlNode.C_DATA_SECTION_TOKEN;
+    }
+
+    @Test
+    public void testInit() {
+        assertEquals(content, cDataSectionNode.getCdata());
+    }
+
+    @Test
+    public void testVisitor() throws IOException {
+        BxmlNodeVisitor mock = mock(BxmlNodeVisitor.class);
+        cDataSectionNode.accept(mock);
+        verify(mock).visit(cDataSectionNode);
+        verifyNoMoreInteractions(mock);
+    }
+}

http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CloseElementNodeTest.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CloseElementNodeTest.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CloseElementNodeTest.java
new file mode 100644
index 000..f057ced
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/CloseElementNodeTest.java
@@ -0,0 +1,50 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx.parser.bxml;
+
+import org.apache.nifi.processors.evtx.parser.BxmlNodeVisitor;
+import org.junit.Test;
+
+import java.io.IOException;
+
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.verifyNoMoreInteractions;
+
+public class CloseElementNodeTest extends BxmlNodeWithTokenTestBase {
+    private CloseElementNode
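The `testVisitor()` tests above use Mockito to verify that `accept()` dispatches back to the visitor exactly once with the node itself. The same double-dispatch check can be sketched without any mocking library, using a hand-rolled recording visitor; the `Node`/`Visitor` types here are simplified stand-ins for `BxmlNode`/`BxmlNodeVisitor`.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for BxmlNodeVisitor and BxmlNode.
interface Visitor {
    void visit(Node node);
}

class Node {
    void accept(Visitor visitor) {
        visitor.visit(this); // double dispatch: the node hands itself back
    }
}

// Records every visit, so the test can assert on interaction count --
// the dependency-free equivalent of Mockito's verify/verifyNoMoreInteractions.
class RecordingVisitor implements Visitor {
    final List<Node> visited = new ArrayList<>();

    public void visit(Node node) {
        visited.add(node);
    }
}

class VisitorCheck {
    public static void main(String[] args) {
        Node node = new Node();
        RecordingVisitor visitor = new RecordingVisitor();
        node.accept(visitor);
        // exactly one interaction, with the node itself
        System.out.println(visitor.visited.size() == 1 && visitor.visited.get(0) == node); // prints "true"
    }
}
```

Mockito's `mock`/`verify` is more convenient in a real suite, but the recorded list makes the contract being tested explicit.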
[6/6] nifi git commit: NIFI-1975 - Processor for parsing evtx files
NIFI-1975 - Processor for parsing evtx files

Signed-off-by: Matt Burgess

This closes #492

Project: http://git-wip-us.apache.org/repos/asf/nifi/repo
Commit: http://git-wip-us.apache.org/repos/asf/nifi/commit/a5fecda5
Tree: http://git-wip-us.apache.org/repos/asf/nifi/tree/a5fecda5
Diff: http://git-wip-us.apache.org/repos/asf/nifi/diff/a5fecda5

Branch: refs/heads/master
Commit: a5fecda5a2ffb35e21d950aa19a07127e19a419e
Parents: 1c22bc0
Author: Bryan Rosander
Authored: Fri May 27 10:56:02 2016 -0400
Committer: Matt Burgess
Committed: Thu Jun 9 12:07:00 2016 -0400

--
 NOTICE                                          |   5 +
 nifi-assembly/NOTICE                            |   6 +
 nifi-assembly/pom.xml                           |   5 +
 .../nifi-evtx-bundle/nifi-evtx-nar/pom.xml      |  40 ++
 .../src/main/resources/META-INF/LICENSE         | 209 +++++
 .../src/main/resources/META-INF/NOTICE          |  36 ++
 .../nifi-evtx-processors/pom.xml                |  68 +++
 .../processors/evtx/MalformedChunkHandler.java  |  43 ++
 .../apache/nifi/processors/evtx/ParseEvtx.java  | 290 +++++++
 .../nifi/processors/evtx/ResultProcessor.java   |  47 ++
 .../nifi/processors/evtx/RootNodeHandler.java   |  27 ++
 .../processors/evtx/RootNodeHandlerFactory.java |  25 +
 .../processors/evtx/XmlBxmlNodeVisitor.java     | 190 +++++
 .../evtx/XmlBxmlNodeVisitorFactory.java         |  27 ++
 .../processors/evtx/XmlRootNodeHandler.java     |  83 +++
 .../processors/evtx/parser/BinaryReader.java    | 294 +++++++
 .../nifi/processors/evtx/parser/Block.java      |  84 +++
 .../processors/evtx/parser/BxmlNodeVisitor.java | 121 +++
 .../processors/evtx/parser/ChunkHeader.java     | 199 +++++
 .../nifi/processors/evtx/parser/FileHeader.java | 171 +++
 .../evtx/parser/FileHeaderFactory.java          |  27 ++
 .../evtx/parser/MalformedChunkException.java    |  43 ++
 .../nifi/processors/evtx/parser/NumberUtil.java |  68 +++
 .../nifi/processors/evtx/parser/Record.java     |  71 +++
 .../evtx/parser/bxml/AttributeNode.java         |  52 ++
 .../processors/evtx/parser/bxml/BxmlNode.java   | 134 ++++
 .../evtx/parser/bxml/BxmlNodeFactory.java       |  27 ++
 .../evtx/parser/bxml/BxmlNodeWithToken.java     |  43 ++
 .../parser/bxml/BxmlNodeWithTokenAndString.java |  65 +++
 .../evtx/parser/bxml/CDataSectionNode.java      |  61 +++
 .../evtx/parser/bxml/CloseElementNode.java      |  49 ++
 .../evtx/parser/bxml/CloseEmptyElementNode.java |  46 ++
 .../evtx/parser/bxml/CloseStartElementNode.java |  49 ++
 .../bxml/ConditionalSubstitutionNode.java       |  61 +++
 .../evtx/parser/bxml/EndOfStreamNode.java       |  46 ++
 .../evtx/parser/bxml/EntityReferenceNode.java   |  50 ++
 .../evtx/parser/bxml/NameStringNode.java        |  69 +++
 .../parser/bxml/NormalSubstitutionNode.java     |  61 +++
 .../evtx/parser/bxml/OpenStartElementNode.java  |  81 +++
 .../bxml/ProcessingInstructionDataNode.java     |  58 +++
 .../bxml/ProcessingInstructionTargetNode.java   |  47 ++
 .../processors/evtx/parser/bxml/RootNode.java   |  83 +++
 .../evtx/parser/bxml/StreamStartNode.java       |  58 +++
 .../evtx/parser/bxml/TemplateInstanceNode.java  |  87 +++
 .../evtx/parser/bxml/TemplateNode.java          |  78 +++
 .../processors/evtx/parser/bxml/ValueNode.java  | 111 +++
 .../evtx/parser/bxml/value/BXmlTypeNode.java    |  46 ++
 .../evtx/parser/bxml/value/BinaryTypeNode.java  |  46 ++
 .../evtx/parser/bxml/value/BooleanTypeNode.java |  43 ++
 .../evtx/parser/bxml/value/DoubleTypeNode.java  |  43 ++
 .../parser/bxml/value/FiletimeTypeNode.java     |  49 ++
 .../evtx/parser/bxml/value/FloatTypeNode.java   |  43 ++
 .../evtx/parser/bxml/value/GuidTypeNode.java    |  41 ++
 .../evtx/parser/bxml/value/Hex32TypeNode.java   |  41 ++
 .../evtx/parser/bxml/value/Hex64TypeNode.java   |  41 ++
 .../evtx/parser/bxml/value/NullTypeNode.java    |  38 ++
 .../evtx/parser/bxml/value/SIDTypeNode.java     |  54 +++
 .../parser/bxml/value/SignedByteTypeNode.java   |  41 ++
 .../parser/bxml/value/SignedDWordTypeNode.java  |  42 ++
 .../parser/bxml/value/SignedQWordTypeNode.java  |  42 ++
 .../parser/bxml/value/SignedWordTypeNode.java   |  41 ++
 .../evtx/parser/bxml/value/SizeTypeNode.java    |  45 ++
 .../evtx/parser/bxml/value/StringTypeNode.java  |  45 ++
 .../parser/bxml/value/SystemtimeTypeNode.java   |  61 +++
 .../parser/bxml/value/UnsignedByteTypeNode.java |  41 ++
 .../bxml/value/UnsignedDWordTypeNode.java       |  42 ++
 .../bxml/value/UnsignedQWordTypeNode.java       |  42 ++
 .../parser/bxml/value/UnsignedWordTypeNode.java |  41 ++
 .../evtx/parser/bxml/value/VariantTypeNode.java |  52 ++
 .../bxml/value/VariantTypeNodeFactory.java      |  28 ++
 .../parser/bxml/value/WStringArrayTypeNode.java |  68 +++
 .../evtx/parser/bxml/value/WStringTypeNode.java |  46 ++
[4/6] nifi git commit: NIFI-1975 - Processor for parsing evtx files
http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/TemplateNode.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/TemplateNode.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/TemplateNode.java
new file mode 100644
index 000..0b21c38
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/TemplateNode.java
@@ -0,0 +1,78 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx.parser.bxml;
+
+import com.google.common.primitives.UnsignedInteger;
+import org.apache.nifi.processors.evtx.parser.BinaryReader;
+import org.apache.nifi.processors.evtx.parser.BxmlNodeVisitor;
+import org.apache.nifi.processors.evtx.parser.ChunkHeader;
+import org.apache.nifi.processors.evtx.parser.NumberUtil;
+
+import java.io.IOException;
+
+/**
+ * Template node describing structure of xml to be rendered into
+ */
+public class TemplateNode extends BxmlNode {
+    private final int nextOffset;
+    private final UnsignedInteger templateId;
+    private final String guid;
+    private final int dataLength;
+
+    public TemplateNode(BinaryReader binaryReader, ChunkHeader chunkHeader) throws IOException {
+        super(binaryReader, chunkHeader, null);
+        nextOffset = NumberUtil.intValueMax(binaryReader.readDWord(), Integer.MAX_VALUE, "Invalid offset.");
+
+        // TemplateId and Guid overlap
+        templateId = new BinaryReader(binaryReader, binaryReader.getPosition()).readDWord();
+        guid = binaryReader.readGuid();
+        dataLength = NumberUtil.intValueMax(binaryReader.readDWord(), Integer.MAX_VALUE - 0x18, "Data length too large.");
+        init();
+    }
+
+    @Override
+    public String toString() {
+        return "TemplateNode{" +
+                "nextOffset=" + nextOffset +
+                ", templateId=" + templateId +
+                ", guid='" + guid + '\'' +
+                ", dataLength=" + dataLength +
+                '}';
+    }
+
+    public int getNextOffset() {
+        return nextOffset;
+    }
+
+    public UnsignedInteger getTemplateId() {
+        return templateId;
+    }
+
+    public String getGuid() {
+        return guid;
+    }
+
+    public int getDataLength() {
+        return dataLength;
+    }
+
+    @Override
+    public void accept(BxmlNodeVisitor bxmlNodeVisitor) throws IOException {
+        bxmlNodeVisitor.visit(this);
+    }
+}

http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/ValueNode.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/ValueNode.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/ValueNode.java
new file mode 100644
index 000..013ffb7
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/main/java/org/apache/nifi/processors/evtx/parser/bxml/ValueNode.java
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ *
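The "TemplateId and Guid overlap" trick in `TemplateNode` above — the 4-byte template id is the first DWORD of the 16-byte GUID, so the id is read through a second reader positioned at the same offset while the primary reader goes on to consume the full GUID — can be sketched with a plain `ByteBuffer` standing in for the bundle's `BinaryReader`:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of reading overlapping fields: peek the first DWORD of the GUID
// through an independently-positioned view, then let the primary reader
// consume all 16 GUID bytes. ByteBuffer is a stand-in for BinaryReader.
class OverlappingFields {
    static long readTemplateId(ByteBuffer buf) {
        // duplicate() shares the bytes but has its own position, like
        // constructing "new BinaryReader(binaryReader, getPosition())".
        // Note: a duplicate's byte order resets, so set it explicitly.
        ByteBuffer peek = buf.duplicate().order(ByteOrder.LITTLE_ENDIAN);
        return Integer.toUnsignedLong(peek.getInt());
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(16).order(ByteOrder.LITTLE_ENDIAN);
        buf.putInt(42).rewind(); // stand-in GUID whose first DWORD is 42

        long templateId = readTemplateId(buf); // primary position untouched
        byte[] guid = new byte[16];
        buf.get(guid); // primary reader still consumes the full 16 bytes

        System.out.println(templateId); // prints "42"
    }
}
```

Reading the id through a view rather than the primary cursor is what lets both fields come from the same bytes without any seek-back logic.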
[3/6] nifi git commit: NIFI-1975 - Processor for parsing evtx files
http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/ParseEvtxTest.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/ParseEvtxTest.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/ParseEvtxTest.java
new file mode 100644
index 000..2e5e90d
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/ParseEvtxTest.java
@@ -0,0 +1,481 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx;
+
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.flowfile.attributes.CoreAttributes;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.io.OutputStreamCallback;
+import org.apache.nifi.processors.evtx.parser.ChunkHeader;
+import org.apache.nifi.processors.evtx.parser.FileHeader;
+import org.apache.nifi.processors.evtx.parser.FileHeaderFactory;
+import org.apache.nifi.processors.evtx.parser.MalformedChunkException;
+import org.apache.nifi.processors.evtx.parser.Record;
+import org.apache.nifi.processors.evtx.parser.bxml.RootNode;
+import org.apache.nifi.util.MockFlowFile;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.junit.Before;
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.mockito.Mock;
+import org.mockito.runners.MockitoJUnitRunner;
+import org.w3c.dom.Document;
+import org.w3c.dom.Element;
+import org.w3c.dom.Node;
+import org.w3c.dom.NodeList;
+import org.xml.sax.SAXException;
+
+import javax.xml.parsers.DocumentBuilderFactory;
+import javax.xml.parsers.ParserConfigurationException;
+import javax.xml.stream.XMLStreamException;
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertTrue;
+import static org.mockito.Mockito.any;
+import static org.mockito.Mockito.anyString;
+import static org.mockito.Mockito.eq;
+import static org.mockito.Mockito.isA;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.verifyNoMoreInteractions;
+import static org.mockito.Mockito.when;
+
+@RunWith(MockitoJUnitRunner.class)
+public class ParseEvtxTest {
+    public static final DocumentBuilderFactory DOCUMENT_BUILDER_FACTORY = DocumentBuilderFactory.newInstance();
+    public static final String USER_DATA = "UserData";
+    public static final String EVENT_DATA = "EventData";
+    public static final Set<String> DATA_TAGS = new HashSet<>(Arrays.asList(EVENT_DATA, USER_DATA));
+
+    @Mock
+    FileHeaderFactory fileHeaderFactory;
+
+    @Mock
+    MalformedChunkHandler malformedChunkHandler;
+
+    @Mock
+    RootNodeHandlerFactory rootNodeHandlerFactory;
+
+    @Mock
+    ResultProcessor resultProcessor;
+
+    @Mock
+    ComponentLog componentLog;
+
+    @Mock
+    InputStream in;
+
+    @Mock
+    OutputStream out;
+
+    @Mock
+    FileHeader fileHeader;
+
+    ParseEvtx parseEvtx;
+
+    @Before
+    public void setup() throws XMLStreamException, IOException {
+        parseEvtx = new ParseEvtx(fileHeaderFactory, malformedChunkHandler, rootNodeHandlerFactory, resultProcessor);
+        when(fileHeaderFactory.create(in, componentLog)).thenReturn(fileHeader);
+    }
+
+    @Test
+    public void testGetNameFile() {
+        String basename = "basename";
+        assertEquals(basename + ".xml", parseEvtx.getName(basename, null, null, ParseEvtx.XML_EXTENSION));
+    }
+
+    @Test
+    public void testGetNameFileChunk() {
+        String basename = "basename";
+        assertEquals(basename +
[1/6] nifi git commit: NIFI-1975 - Processor for parsing evtx files
Repository: nifi
Updated Branches:
  refs/heads/master 1c22bc015 -> a5fecda5a

http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SignedWordTypeNodeTest.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SignedWordTypeNodeTest.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SignedWordTypeNodeTest.java
new file mode 100644
index 000..3a880af
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SignedWordTypeNodeTest.java
@@ -0,0 +1,34 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx.parser.bxml.value;
+
+import org.apache.nifi.processors.evtx.parser.bxml.BxmlNodeTestBase;
+import org.junit.Test;
+
+import java.io.IOException;
+
+import static org.junit.Assert.assertEquals;
+
+public class SignedWordTypeNodeTest extends BxmlNodeTestBase {
+    @Test
+    public void testSignedWordTypeNode() throws IOException {
+        short value = -5;
+        assertEquals(Short.toString(value),
+                new SignedWordTypeNode(testBinaryReaderBuilder.putWord(value).build(), chunkHeader, parent, -1).getValue());
+    }
+}

http://git-wip-us.apache.org/repos/asf/nifi/blob/a5fecda5/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SizeTypeNodeTest.java
--
diff --git a/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SizeTypeNodeTest.java b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SizeTypeNodeTest.java
new file mode 100644
index 000..c654ce0
--- /dev/null
+++ b/nifi-nar-bundles/nifi-evtx-bundle/nifi-evtx-processors/src/test/java/org/apache/nifi/processors/evtx/parser/bxml/value/SizeTypeNodeTest.java
@@ -0,0 +1,43 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.processors.evtx.parser.bxml.value;
+
+import com.google.common.primitives.UnsignedInteger;
+import com.google.common.primitives.UnsignedLong;
+import org.apache.nifi.processors.evtx.parser.bxml.BxmlNodeTestBase;
+import org.junit.Test;
+
+import java.io.IOException;
+
+import static org.junit.Assert.assertEquals;
+
+public class SizeTypeNodeTest extends BxmlNodeTestBase {
+    @Test
+    public void testSizeTypeNodeDWord() throws IOException {
+        UnsignedInteger value = UnsignedInteger.fromIntBits(Integer.MAX_VALUE + 132);
+        assertEquals(value.toString(),
+                new SizeTypeNode(testBinaryReaderBuilder.putDWord(value).build(), chunkHeader, parent, 4).getValue());
+    }
+
+    @Test
+    public void testSizeTypeNodeQWord() throws IOException {
+        UnsignedLong value = UnsignedLong.fromLongBits(Long.MAX_VALUE + 132);
+        assertEquals(value.toString(),
+                new SizeTypeNode(testBinaryReaderBuilder.putQWord(value).build(), chunkHeader, parent, -1).getValue());
+    }
+}
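The `SizeTypeNodeTest` above deliberately builds out-of-range values with Guava's `UnsignedInteger.fromIntBits(Integer.MAX_VALUE + 132)`: the Java addition overflows into a negative `int`, and the unsigned wrapper reinterprets those same 32 bits as a value above 2^31. The stdlib equivalent of `fromIntBits(...).toString()` is `Integer.toUnsignedString`:

```java
// Demonstrates the unsigned-reinterpretation trick used by the test above,
// without the Guava dependency.
class UnsignedBits {
    public static void main(String[] args) {
        int bits = Integer.MAX_VALUE + 132; // signed addition wraps around
        System.out.println(bits);                           // prints "-2147483517"
        System.out.println(Integer.toUnsignedString(bits)); // prints "2147483779"
    }
}
```

This is why the test compares against `value.toString()` rather than the raw `int`: the node under test is expected to render the bits unsigned, exactly as an evtx size field stores them.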
[jira] [Commented] (NIFI-1975) Processor to Parse .evtx files
[ https://issues.apache.org/jira/browse/NIFI-1975?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322778#comment-15322778 ]

ASF GitHub Bot commented on NIFI-1975:
--------------------------------------

Github user mattyb149 commented on the issue:

    https://github.com/apache/nifi/pull/492

    +1 LGTM. Built and ran tests and contrib-check, and ran a NiFi flow with multiple EVTX files exercising all relationships and granularities. Great contribution, thanks much! Merging to master.

> Processor to Parse .evtx files
> ------------------------------
>
>                 Key: NIFI-1975
>                 URL: https://issues.apache.org/jira/browse/NIFI-1975
>             Project: Apache NiFi
>          Issue Type: Sub-task
>            Reporter: Bryan Rosander
>
> Windows event logs have been stored in the .evtx format since Windows Vista.
> If we port the pure-Python implementation of an evtx parser at
> https://github.com/williballenthin/python-evtx to Java, we should be able to
> ingest those files in NiFi on any operating system.
> These files are located in C:\Windows\System32\winevt\Logs unless exported
> elsewhere.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (NIFI-401) New Scheduling strategy (On primary node - CRON )
[ https://issues.apache.org/jira/browse/NIFI-401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322731#comment-15322731 ]

ASF GitHub Bot commented on NIFI-401:
-------------------------------------

GitHub user beugley opened a pull request:

    https://github.com/apache/nifi/pull/512

    NIFI-401

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/beugley/nifi NIFI-401

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/nifi/pull/512.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #512

commit 011b5e724648d3f72746b18e17e79be0ebee200d
Author: Brian Eugley
Date:   2016-06-09T15:32:15Z

    NIFI-401

> New Scheduling strategy (On primary node - CRON )
> -------------------------------------------------
>
>                 Key: NIFI-401
>                 URL: https://issues.apache.org/jira/browse/NIFI-401
>             Project: Apache NiFi
>          Issue Type: Improvement
>          Components: Core Framework
>            Reporter: Matthew Clarke
>            Priority: Minor
>         Attachments: initial prototype .png
>
>
> Currently, the only scheduling strategy supported when a processor is set to
> run "On primary node" is Timer Driven. There should be a second option that
> allows a cron-driven "On primary node" scheduling strategy. This would give
> users more control over when a given primary-node-only processor runs, and
> would prevent these processors from running when configuration changes or
> instance restarts occur.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
[jira] [Commented] (NIFI-1987) Nifi-Spark-Receiver build failure due to orgspark repository change
[ https://issues.apache.org/jira/browse/NIFI-1987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322708#comment-15322708 ]

Oleg Zhurakousky commented on NIFI-1987:
----------------------------------------

Looking forward to your findings. I completely wiped my local maven repo, did a clean build, and everything went fine. Furthermore, I've looked at all of the dependencies that were pulled in (both transitive and immediate) and none were linked to anything CDH.

> Nifi-Spark-Receiver build failure due to orgspark repository change
> -------------------------------------------------------------------
>
>                 Key: NIFI-1987
>                 URL: https://issues.apache.org/jira/browse/NIFI-1987
>             Project: Apache NiFi
>          Issue Type: Bug
>    Affects Versions: 0.6.1
>            Reporter: Daniel Cave
>            Assignee: Oleg Zhurakousky
>            Priority: Critical
>             Fix For: 1.0.0, 0.7.0
>
>
> Builds on machines with no (or incomplete) local maven repos are failing due to
> a nifi-spark-receiver sub-dependency on
> oss.sonatype.org/content/repositories/orgspark-project-1113 related
> dependencies. On 6/7/16 it appears that orgspark-project-1113 was removed
> from the repo and replaced with orgspark-project-1123. This blocks complete
> NiFi builds wherever a complete local repo, with all 1113 sub-dependencies
> already in place, is not available. See below:
> [INFO] Scanning for projects...
> [INFO] Inspecting build with total of 1 modules...
> [INFO] Installing Nexus Staging features:
> [INFO]   ... total of 1 executions of maven-deploy-plugin replaced with nexus-staging-maven-plugin
> [INFO]
> [INFO] ------------------------------------------------------------------------
> [INFO] Building nifi-spark-receiver 0.7.0-SNAPSHOT
> [INFO] ------------------------------------------------------------------------
> Downloading: https://oss.sonatype.org/content/repositories/orgspark-project-1113/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT.pom
> Downloading: https://oss.sonatype.org/content/repositories/orgspark-project-1113/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT.pom
> [WARNING] The POM for org.apache.avro:avro-mapred:jar:hadoop2:1.7.6-cdh5.7.0-SNAPSHOT is missing, no dependency information available
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-site-to-site-client/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-commons/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-api/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-utils/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-security-utils/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-client-dto/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-framework/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-framework-bundle/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/nifi/nifi-nar-bundles/0.7.0-SNAPSHOT/maven-metadata.xml
> Downloading: http://repository.apache.org/snapshots/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT-hadoop2.jar
> Downloading: https://oss.sonatype.org/content/repositories/orgspark-project-1113/org/apache/avro/avro-mapred/1.7.6-cdh5.7.0-SNAPSHOT/avro-mapred-1.7.6-cdh5.7.0-SNAPSHOT-hadoop2.jar
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 6.771 s
> [INFO] Finished at: 2016-06-08T13:13:41-05:00
> [INFO] Final Memory: 28M/381M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project nifi-spark-receiver: Could not
> resolve dependencies for project
> org.apache.nifi:nifi-spark-receiver:jar:0.7.0-SNAPSHOT: Could not find
> artifact org.apache.avro:avro-mapred:jar:hadoop2:1.7.6-cdh5.7.0-SNAPSHOT in
> apache.snapshots (http://repository.apache.org/snapshots) -> [Help 1]
> [ERROR]
> [ERROR] To
[jira] [Commented] (NIFI-1987) Nifi-Spark-Receiver build failure due to orgspark repository change
[ https://issues.apache.org/jira/browse/NIFI-1987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322693#comment-15322693 ] Daniel Cave commented on NIFI-1987: --- No local changes in those packages; in fact, I'd never looked in nifi-external before. The issue is directly in nifi-spark-receiver, seemingly as a sub-dependency of the nifi-spark-streaming dependency. I was able to find the dependency by running mvn (or mvn.cmd) dependency:tree on that package on a server where I had a pre-existing repo (and hence no build issue). I am installing NiFi on a completely clean server and will retest, then give you feedback this afternoon. That should give you a better error message, since it will be from a clean repo instead of a partial repo, and should make things easier for you to reproduce.
[jira] [Commented] (NIFI-1982) Compressed check box in Configure Remote Port UI is not used
[ https://issues.apache.org/jira/browse/NIFI-1982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322682#comment-15322682 ] ASF GitHub Bot commented on NIFI-1982: -- Github user ijokarumawak commented on the issue: https://github.com/apache/nifi/pull/509 @bbende Yes, the same change is in PR #497. This is only for 0.x. Thanks! > Compressed check box in Configure Remote Port UI is not used > > > Key: NIFI-1982 > URL: https://issues.apache.org/jira/browse/NIFI-1982 > Project: Apache NiFi > Issue Type: Bug > Components: Core Framework, Core UI > Affects Versions: 0.6.1 > Reporter: Koji Kawamura > Assignee: Koji Kawamura > Fix For: 0.7.0 > > Attachments: configure-remote-port.png > > > Right-click Remote Process Group -> Remote Ports -> (Edit icon) -> Configure > Remote Port setting window. > It has a "Compressed" checkbox; however, if the user checks it, > RemoteGroupPort doesn't use it to enable compression in the Site-to-Site client. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1935) Added ConvertDynamicJsonToAvro processor
[ https://issues.apache.org/jira/browse/NIFI-1935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322615#comment-15322615 ] Daniel Cave commented on NIFI-1935: --- Let me revisit the original version you referenced and I will get back to you. > Added ConvertDynamicJsonToAvro processor > > > Key: NIFI-1935 > URL: https://issues.apache.org/jira/browse/NIFI-1935 > Project: Apache NiFi > Issue Type: New Feature > Components: Extensions > Affects Versions: 1.0.0, 0.7.0 > Reporter: Daniel Cave > Assignee: Alex Halldin > Priority: Minor > Fix For: 1.0.0, 0.7.0 > > Attachments: > 0001-NIFI-1935-Added-ConvertDynamicJSONToAvro.java.-Added.patch > > > ConvertJsonToAvro required a predefined Avro schema to convert JSON and > required the presence of all fields in the incoming JSON. > ConvertDynamicJsonToAvro functions similarly; however, it now accepts the JSON > and schema as incoming flowfiles and creates the Avro dynamically. > This processor requires the InferAvroSchema processor in its upstream flow so > that it can use the original and schema flowfiles as input. These two > flowfiles will have the unique attribute inferredAvroId set on them by > InferAvroSchema so that they can be properly matched in > ConvertDynamicJsonToAvro. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Created] (NIFI-1988) File filter options
Raymond created NIFI-1988: - Summary: File filter options Key: NIFI-1988 URL: https://issues.apache.org/jira/browse/NIFI-1988 Project: Apache NiFi Issue Type: Wish Components: Extensions Affects Versions: 0.6.1 Environment: Windows Reporter: Raymond Priority: Minor File Filter on the GetFile processor accepts a regular expression (Java regex). As not all developers use Java/regex, it would be nice to offer other file filter options as well, like filename exact, contains, prefix and suffix. For example, the file "helloworld.txt" would be picked up if: filename_exact --> helloworld.txt filename_contains --> helloworld.txt, ello, d.txt filename_prefix --> hello, helloworld filename_suffix --> .txt, txt, world.txt I don't know if this is currently possible, but it would be nice to add this as a combobox on top of the value field (regex, exact, contains, prefix, suffix). Underneath, the filter could still be a regular expression (if other filters would slow down the pick-up process). -- This message was sent by Atlassian JIRA (v6.3.4#6332)
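Each of the proposed filter modes can already be expressed as a regular expression that GetFile's File Filter understands, so a combobox could translate them on the fly. A minimal sketch of that translation (the mode names and helper class are illustrative, not NiFi API):

```java
import java.util.regex.Pattern;

public class FilterModes {
    // Hypothetical helper: translate the proposed filter modes into the
    // regular expressions GetFile already accepts. Pattern.quote escapes
    // literal characters like the '.' in ".txt".
    static String toRegex(String mode, String value) {
        String quoted = Pattern.quote(value);
        switch (mode) {
            case "exact":    return quoted;
            case "contains": return ".*" + quoted + ".*";
            case "prefix":   return quoted + ".*";
            case "suffix":   return ".*" + quoted;
            default: throw new IllegalArgumentException("unknown mode: " + mode);
        }
    }

    public static void main(String[] args) {
        String file = "helloworld.txt";
        System.out.println(file.matches(toRegex("exact", "helloworld.txt"))); // true
        System.out.println(file.matches(toRegex("contains", "ello")));        // true
        System.out.println(file.matches(toRegex("prefix", "hello")));         // true
        System.out.println(file.matches(toRegex("suffix", ".txt")));          // true
    }
}
```

This also matches Raymond's closing note: the UI could expose the friendly modes while the processor keeps evaluating a single regex underneath.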
[jira] [Comment Edited] (NIFI-1935) Added ConvertDynamicJsonToAvro processor
[ https://issues.apache.org/jira/browse/NIFI-1935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322550#comment-15322550 ] Bryan Bende edited comment on NIFI-1935 at 6/9/16 2:06 PM: --- [~Daniel Cave] [~ahalldin] Was looking this over, and admittedly haven't done a full review, but just wanted to mention something... the description says that part of the reason for creating this processor was because ConvertJsonToAvro requires a predefined schema, but I'm not sure that is the case... The schema property in ConvertJsonToAvro supports expression language: {code} static final PropertyDescriptor SCHEMA = new PropertyDescriptor.Builder() .name("Record schema") .description("Outgoing Avro schema for each record created from a JSON object") .addValidator(SCHEMA_VALIDATOR) .expressionLanguageSupported(true) .required(true) .build(); {code} And InferAvroSchema has a property to select where to store the schema - attributes vs. content. So if you choose attributes, it creates an attribute on the FlowFile called "inferred.avro.schema" which contains the schema as the value, so now you have a single flow file with the JSON as the content and the schema in the attributes. You could then set the SCHEMA property in ConvertJsonToAvro to {code} ${inferred.avro.schema} {code} to reference the schema, so each incoming flow file could have a different schema. Given the above, do you think ConvertDynamicJsonToAvro still provides additional benefits over ConvertJsonToAvro? was (Author: bende): [~Daniel Cave] [~ahalldin] Was looking this over, and admittedly haven't done a full review, but just wanted to mention something... the description says that part of the reason for creating this processor was because ConvertJsonToAvro requires a predefined schema, but I'm not sure that is the case... The schema property in ConvertJsonToAvro supports expression language: {code} static final PropertyDescriptor SCHEMA = new PropertyDescriptor.Builder() .name("Record schema") .description("Outgoing Avro schema for each record created from a JSON object") .addValidator(SCHEMA_VALIDATOR) .expressionLanguageSupported(true) .required(true) .build(); {code} And InferAvroSchema has a property to select where to store the schema - attributes vs. content. So if you choose attributes, it creates an attribute on the FlowFile called "inferred.avro.schema" which contains the schema as the value, so now you have a single flow file with the JSON as the content and the schema in the attributes. You could then set the SCHEMA property in ConvertJsonToAvro to ${inferred.avro.schema} to reference the schema, so each incoming flow file could have a different schema. Given the above, do you think ConvertDynamicJsonToAvro still provides additional benefits over ConvertJsonToAvro?
-- This message was sent by Atlassian JIRA (v6.3.4#6332)
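The attribute-plus-expression-language pattern Bryan describes can be illustrated with a toy resolver. This is a deliberately simplified stand-in for NiFi's expression language (the real EL supports functions, nesting, and escaping), and the attribute map mimics what InferAvroSchema writes when its destination is set to flowfile attributes:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AttributeExpression {
    // Simplified stand-in for NiFi's expression language: resolve ${name}
    // references against a FlowFile's attribute map. Unknown names resolve
    // to the empty string here; real EL semantics are richer.
    static String evaluate(String expression, Map<String, String> attributes) {
        Matcher m = Pattern.compile("\\$\\{([^}]+)}").matcher(expression);
        StringBuilder out = new StringBuilder();
        int last = 0;
        while (m.find()) {
            out.append(expression, last, m.start());
            out.append(attributes.getOrDefault(m.group(1), ""));
            last = m.end();
        }
        out.append(expression.substring(last));
        return out.toString();
    }

    public static void main(String[] args) {
        // InferAvroSchema (destination = attributes) would set this attribute:
        Map<String, String> attrs = Map.of(
            "inferred.avro.schema",
            "{\"type\":\"record\",\"name\":\"r\",\"fields\":[]}");
        // ConvertJsonToAvro's "Record schema" property set to the EL reference:
        System.out.println(evaluate("${inferred.avro.schema}", attrs));
    }
}
```

With this wiring, each incoming flow file carries its own schema in an attribute, which is the crux of Bryan's question about whether a separate dynamic processor is still needed.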
[jira] [Commented] (NIFI-1935) Added ConvertDynamicJsonToAvro processor
[ https://issues.apache.org/jira/browse/NIFI-1935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322550#comment-15322550 ] Bryan Bende commented on NIFI-1935: --- [~Daniel Cave] [~ahalldin] Was looking this over, and admittedly haven't done a full review, but just wanted to mention something... the description says that part of the reason for creating this processor was because ConvertJsonToAvro requires a predefined schema, but I'm not sure that is the case... The schema property in ConvertJsonToAvro supports expression language: {code} static final PropertyDescriptor SCHEMA = new PropertyDescriptor.Builder() .name("Record schema") .description("Outgoing Avro schema for each record created from a JSON object") .addValidator(SCHEMA_VALIDATOR) .expressionLanguageSupported(true) .required(true) .build(); {code} And InferAvroSchema has a property to select where to store the schema - attributes vs. content. So if you choose attributes, it creates an attribute on the FlowFile called "inferred.avro.schema" which contains the schema as the value, so now you have a single flow file with the JSON as the content and the schema in the attributes. You could then set the SCHEMA property in ConvertJsonToAvro to ${inferred.avro.schema} to reference the schema, so each incoming flow file could have a different schema. Given the above, do you think ConvertDynamicJsonToAvro still provides additional benefits over ConvertJsonToAvro? -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1982) Compressed check box in Configure Remote Port UI is not used
[ https://issues.apache.org/jira/browse/NIFI-1982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322506#comment-15322506 ] ASF subversion and git services commented on NIFI-1982: --- Commit 86c96d9304fea850a0f0947ec40254a3054704bd in nifi's branch refs/heads/0.x from [~ijokarumawak] [ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=86c96d9 ] NIFI-1982: Use Compressed check box value. - The Compressed check box UI input was not used - This commit enables Site-to-Site compression configuration from the UI This closes #509. Signed-off-by: Bryan Bende -- This message was sent by Atlassian JIRA (v6.3.4#6332)
nifi git commit: NIFI-1982: Use Compressed check box value.
Repository: nifi Updated Branches: refs/heads/0.x c4ddb5212 -> 86c96d930 NIFI-1982: Use Compressed check box value. - The Compressed check box UI input was not used - This commit enables Site-to-Site compression configuration from UI This closes #509. Signed-off-by: Bryan Bende Project: http://git-wip-us.apache.org/repos/asf/nifi/repo Commit: http://git-wip-us.apache.org/repos/asf/nifi/commit/86c96d93 Tree: http://git-wip-us.apache.org/repos/asf/nifi/tree/86c96d93 Diff: http://git-wip-us.apache.org/repos/asf/nifi/diff/86c96d93 Branch: refs/heads/0.x Commit: 86c96d9304fea850a0f0947ec40254a3054704bd Parents: c4ddb52 Author: Koji Kawamura Authored: Wed Jun 8 20:54:40 2016 +0900 Committer: Bryan Bende Committed: Thu Jun 9 09:37:24 2016 -0400 -- .../main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java | 1 + 1 file changed, 1 insertion(+) -- http://git-wip-us.apache.org/repos/asf/nifi/blob/86c96d93/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java -- diff --git a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java index 552c0c6..9f6f783 100644 --- a/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java +++ b/nifi-nar-bundles/nifi-framework-bundle/nifi-framework/nifi-site-to-site/src/main/java/org/apache/nifi/remote/StandardRemoteGroupPort.java @@ -131,6 +131,7 @@ public class StandardRemoteGroupPort extends RemoteGroupPort { .url(remoteGroup.getTargetUri().toString()) .portIdentifier(getIdentifier()) .sslContext(sslContext) +.useCompression(isUseCompression()) .eventReporter(remoteGroup.getEventReporter()) .peerPersistenceFile(getPeerPersistenceFile(getIdentifier())) 
.nodePenalizationPeriod(penalizationMillis, TimeUnit.MILLISECONDS)
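The one-line fix above threads the checkbox value into the Site-to-Site client builder via useCompression(...); what that flag buys is payload compression on the wire. As a self-contained illustration of the compress/decompress round-trip, here is a sketch using java.util.zip's GZIP streams as a stand-in (NiFi's transport uses its own compression codec, not GZIP; class and method names below are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class CompressionRoundTrip {
    // Compress a payload, as a sender would before writing to the socket.
    static byte[] compress(byte[] data) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bos.toByteArray();
    }

    // Decompress on the receiving side.
    static byte[] decompress(byte[] data) {
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(data))) {
            return gz.readAllBytes();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        byte[] payload = "flowfile content".getBytes(StandardCharsets.UTF_8);
        byte[] roundTripped = decompress(compress(payload));
        System.out.println(new String(roundTripped, StandardCharsets.UTF_8)); // prints "flowfile content"
    }
}
```

The bug being fixed was simply that the builder call was never made, so the checkbox had no effect regardless of its state.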
[jira] [Commented] (NIFI-1982) Compressed check box in Configure Remote Port UI is not used
[ https://issues.apache.org/jira/browse/NIFI-1982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322504#comment-15322504 ] ASF GitHub Bot commented on NIFI-1982: -- Github user bbende commented on the issue: https://github.com/apache/nifi/pull/509 +1, looks good, will merge to 0.x... I am assuming this was only meant for 0.x and the same fix is in your other PR for HTTP site-to-site. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Updated] (NIFI-1975) Processor to Parse .evtx files
[ https://issues.apache.org/jira/browse/NIFI-1975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bryan Rosander updated NIFI-1975: - Description: Windows event logs are stored in .evtx format as-of Windows Vista. If we port the pure python implementation of an evtx parser at https://github.com/williballenthin/python-evtx to Java, we should be able to ingest those files in NiFi on any operating system These files are located in C:\Windows\System32\winevt\Logs unless exported elsewhere. was: Windows event logs are stored in .evtx format as-of Windows Vista. If we port the pure python implementation of an evtx parser at https://github.com/williballenthin/python-evtx to Java, we should be able to ingest those files in NiFi on any operating system These files are located in C:\Windows\System32\winevt\Logs unless moved elsewhere. > Processor to Parse .evtx files > -- > > Key: NIFI-1975 > URL: https://issues.apache.org/jira/browse/NIFI-1975 > Project: Apache NiFi > Issue Type: Sub-task >Reporter: Bryan Rosander > > Windows event logs are stored in .evtx format as-of Windows Vista. If we > port the pure python implementation of an evtx parser at > https://github.com/williballenthin/python-evtx to Java, we should be able to > ingest those files in NiFi on any operating system > These files are located in C:\Windows\System32\winevt\Logs unless exported > elsewhere. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Updated] (NIFI-1975) Processor to Parse .evtx files
[ https://issues.apache.org/jira/browse/NIFI-1975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Bryan Rosander updated NIFI-1975: - Description: Windows event logs are stored in .evtx format as-of Windows Vista. If we port the pure python implementation of an evtx parser at https://github.com/williballenthin/python-evtx to Java, we should be able to ingest those files in NiFi on any operating system These files are located in C:\Windows\System32\winevt\Logs unless moved elsewhere. was:Windows event logs are stored in .evtx format as-of Windows Vista. If we port the pure python implementation of an evtx parser at https://github.com/williballenthin/python-evtx to Java, we should be able to ingest those files in NiFi on any operating system > Processor to Parse .evtx files > -- > > Key: NIFI-1975 > URL: https://issues.apache.org/jira/browse/NIFI-1975 > Project: Apache NiFi > Issue Type: Sub-task >Reporter: Bryan Rosander > > Windows event logs are stored in .evtx format as-of Windows Vista. If we > port the pure python implementation of an evtx parser at > https://github.com/williballenthin/python-evtx to Java, we should be able to > ingest those files in NiFi on any operating system > These files are located in C:\Windows\System32\winevt\Logs unless moved > elsewhere. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
[jira] [Commented] (NIFI-1987) Nifi-Spark-Receiver build failure due to orgspark repository change
[ https://issues.apache.org/jira/browse/NIFI-1987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322386#comment-15322386 ] Oleg Zhurakousky commented on NIFI-1987: [~Daniel Cave] I've looked and can't find any dependency on CDH, and in fact can't reproduce the same Maven output. Is it possible that you have some local modifications to POMs (perhaps to accommodate some API dependency on CDH custom functionality)? The only CDH dependencies I see (in the entire NiFi distribution) are in the Kite module, which is expected.
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322323#comment-15322323 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on the issue: https://github.com/apache/nifi/pull/501 @markap14 Thanks for reviewing! Concerning the extension of VariableRegistryProvider by the ControllerServiceLookup, it came from the need to populate the VariableRegistry from NiFiProperties, if available (which was provided by implementations of ControllerServiceLookup, including FlowController and WebClusterManager), and provide it to StatePropertyValue (which received a ControllerServiceLookup object in its constructor). I thought about having a third variable in the constructor for StatePropertyValue for the variableRegistry; however, when attempting to implement it I needed to interrogate the ControllerServiceLookup anyway in many cases. I definitely understand the weirdness, which is why, at the least, I had it extend the interface as opposed to adding the getVariableRegistry method to that interface directly. > Support Custom Properties in Expression Language > > > Key: NIFI-1974 > URL: https://issues.apache.org/jira/browse/NIFI-1974 > Project: Apache NiFi > Issue Type: New Feature > Reporter: Yolanda M. Davis > Assignee: Yolanda M. Davis > Fix For: 1.0.0 > > > Add a property in the "nifi.properties" config file to allow users to specify a > list of custom properties files (containing data such as environment-specific > values, or sensitive values, etc.). The key/value pairs should be > loaded upon NiFi startup and available to processors for use in expression > language. > Optimally this will lay the groundwork for a UI-driven Variable Registry. -- This message was sent by Atlassian JIRA (v6.3.4#6332)
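As described in the issue, the feature would hang off an entry in nifi.properties pointing at one or more custom properties files whose keys become resolvable in expression language. A hypothetical configuration sketch follows; the property key and file paths are illustrative only, since the naming was still under review in this PR (check the released admin guide for the final key):

```
# conf/nifi.properties (key name illustrative)
nifi.variable.registry.properties=./conf/custom-env.properties

# ./conf/custom-env.properties — values then usable in EL as ${db.host} etc.
db.host=prod-db-01.example.com
db.port=5432
```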
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322287#comment-15322287 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66415203

--- Diff: nifi-api/src/test/java/org/apache/nifi/registry/TestVariableRegistry.java ---
@@ -0,0 +1,126 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.registry;
+
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Properties;
+
+import org.junit.Test;
+
+import static org.junit.Assert.assertTrue;
+
+public class TestVariableRegistry {
+
+    @Test
+    public void testReadMap(){
+        Map<String, String> variables1 = new HashMap<>();
+        variables1.put("fake.property.1", "fake test value");
+
+        Map<String, String> variables2 = new HashMap<>();
+        variables2.put("fake.property.2", "fake test value");
+
+        VariableRegistry registry = VariableRegistryFactory.getInstance(variables1, variables2);
+
+        Map<String, String> variables = registry.getVariables();
+        assertTrue(variables.size() == 2);
+        assertTrue(variables.get("fake.property.1").equals("fake test value"));
+        assertTrue(registry.getVariableValue("fake.property.2").equals("fake test value"));
+    }
+
+    @Test
+    public void testReadProperties(){
+        Properties properties = new Properties();
+        properties.setProperty("fake.property.1", "fake test value");
+        VariableRegistry registry = VariableRegistryFactory.getInstance(properties);
+        Map<String, String> variables = registry.getVariables();
+        assertTrue(variables.get("fake.property.1").equals("fake test value"));
+    }
+
+    @Test
+    public void testReadFiles(){
+        final Path fooPath = Paths.get("src/test/resources/TestVariableRegistry/foobar.properties");
+        final Path testPath = Paths.get("src/test/resources/TestVariableRegistry/test.properties");
+        VariableRegistry registry = VariableRegistryFactory.getInstance(fooPath.toFile(), testPath.toFile());
+        Map<String, String> variables = registry.getVariables();
+        assertTrue(variables.size() == 3);
+        assertTrue(variables.get("fake.property.1").equals("test me out 1"));
+        assertTrue(variables.get("fake.property.3").equals("test me out 3, test me out 4"));
+    }
+
+    @Test
+    public void testReadPaths(){
+        final Path fooPath = Paths.get("src/test/resources/TestVariableRegistry/foobar.properties");
+        final Path testPath = Paths.get("src/test/resources/TestVariableRegistry/test.properties");
+        VariableRegistry registry = VariableRegistryFactory.getInstance(fooPath, testPath);
+        Map<String, String> variables = registry.getVariables();
+        assertTrue(variables.size() == 3);
+        assertTrue(variables.get("fake.property.1").equals("test me out 1"));
+        assertTrue(variables.get("fake.property.3").equals("test me out 3, test me out 4"));
+    }
+
+    @Test
+    public void testAddRegistry(){
+
+        final Map<String, String> variables1 = new HashMap<>();
+        variables1.put("fake.property.1", "fake test value");
+
+        final Path fooPath = Paths.get("src/test/resources/TestVariableRegistry/foobar.properties");
+        VariableRegistry pathRegistry = VariableRegistryFactory.getInstance(fooPath);
+
+        final Path testPath = Paths.get("src/test/resources/TestVariableRegistry/test.properties");
+        VariableRegistry fileRegistry = VariableRegistryFactory.getInstance(testPath.toFile());
+
+        Properties properties = new Properties();
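The tests above exercise several `VariableRegistryFactory.getInstance(...)` overloads (maps, `Properties`, `File`, `Path`). As a rough sketch of what such overloads might do, assuming the factory simply normalizes each input into one merged map where the first source wins on duplicate keys (the class and method names below are illustrative, not the actual NiFi API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// Hypothetical sketch of factory overloads like those exercised by the tests.
// Each getInstance(...) variant normalizes its input into a single map.
class RegistryFactorySketch {

    // Merge several maps; an earlier map's value wins on duplicate keys.
    @SafeVarargs
    public static Map<String, String> getInstance(Map<String, String>... sources) {
        Map<String, String> merged = new HashMap<>();
        for (Map<String, String> source : sources) {
            for (Map.Entry<String, String> e : source.entrySet()) {
                merged.putIfAbsent(e.getKey(), e.getValue()); // first source wins
            }
        }
        return merged;
    }

    // Convert a java.util.Properties into a plain String map.
    public static Map<String, String> getInstance(Properties properties) {
        Map<String, String> map = new HashMap<>();
        for (String name : properties.stringPropertyNames()) {
            map.put(name, properties.getProperty(name));
        }
        return map;
    }
}
```

The real factory presumably also accepts `File` and `Path` arguments and parses them as properties files; that part is covered by the file-loading sketch further below.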
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322271#comment-15322271 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66412944

--- Diff: nifi-api/src/main/java/org/apache/nifi/registry/FileVariableRegistry.java ---
@@ -0,0 +1,70 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.nifi.registry;
+
+import java.io.File;
+import java.io.IOException;
+import java.nio.file.Path;
+import java.util.Map;
+
+
+public abstract class FileVariableRegistry extends MultiMapVariableRegistry {
+
+    public FileVariableRegistry() {
+        super();
+    }
+
+    public FileVariableRegistry(File... files){
+        super();
+        addVariables(files);
+    }
+
+    public FileVariableRegistry(Path... paths){
+        super();
+        addVariables(paths);
+    }
+
+    @SuppressWarnings({"unchecked", "rawtypes"})
+    public void addVariables(File... files){
+        if(files != null) {
+            for (final File file : files) {
+                try {
+                    registry.addMap(convertFile(file));
+                } catch (IOException iex) {
+                    throw new IllegalArgumentException("A file provided was invalid.", iex);
--- End diff --

Wrapped it because of its use in the constructor; I wasn't too keen on throwing IOException from the constructor as well. Also, as a side note, I think I can remove references to public.

> Support Custom Properties in Expression Language
>
> Key: NIFI-1974
> URL: https://issues.apache.org/jira/browse/NIFI-1974
> Project: Apache NiFi
> Issue Type: New Feature
> Reporter: Yolanda M. Davis
> Assignee: Yolanda M. Davis
> Fix For: 1.0.0
>
> Add a property in the "nifi.properties" config file to allow users to specify a
> list of custom properties files (containing data such as environment-specific
> values, sensitive values, etc.). The key/value pairs should be loaded upon NiFi
> startup and available to processors for use in expression language.
> Optimally this will lay the groundwork for a UI-driven Variable Registry.

-- This message was sent by Atlassian JIRA (v6.3.4#6332)
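The exception-wrapping choice discussed in the comment above, rethrowing a constructor's checked `IOException` as an unchecked `IllegalArgumentException` so callers are not forced to declare it, can be sketched as follows. `PropertiesLoader` and its members are hypothetical names for illustration, not NiFi classes:

```java
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;
import java.util.Properties;

// Hypothetical illustration of the pattern under review: a constructor that
// loads properties files and wraps the checked IOException in an unchecked
// IllegalArgumentException, since constructors that throw checked exceptions
// are awkward for callers.
class PropertiesLoader {
    private final Properties props = new Properties();

    public PropertiesLoader(File... files) {
        for (File file : files) {
            try (Reader reader = new FileReader(file)) {
                props.load(reader);
            } catch (IOException ioe) {
                // Wrap rather than declare: mirrors the choice discussed above.
                throw new IllegalArgumentException("A file provided was invalid: " + file, ioe);
            }
        }
    }

    public String get(String key) {
        return props.getProperty(key);
    }
}
```

The trade-off is that callers lose the compile-time reminder to handle I/O failure, so the wrapping exception should carry the cause, as both the diff and this sketch do.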
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322262#comment-15322262 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66411796

--- Diff: nifi-api/src/main/java/org/apache/nifi/registry/VariableRegistryUtils.java ---
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.registry;
+
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.nifi.flowfile.FlowFile;
+
+public class VariableRegistryUtils {
+
+    public static VariableRegistry createVariableRegistry(){
+        VariableRegistry variableRegistry = VariableRegistryFactory.getInstance();
+        VariableRegistry envRegistry = VariableRegistryFactory.getInstance(System.getenv());
+        VariableRegistry propRegistry = VariableRegistryFactory.getInstance(System.getProperties());
+        variableRegistry.addRegistry(envRegistry);
+        variableRegistry.addRegistry(propRegistry);
--- End diff --

Yes, that is correct, since the first found would be the first matched.
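The ordering point in the review exchange above (the first registry that contains a key supplies the match) can be sketched with a composite registry. `CompositeRegistry` and its methods are illustrative assumptions mirroring the shape of `createVariableRegistry()`, not the NiFi API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical composite registry: sub-registries are consulted in the order
// they were added, so a registry added first (e.g. the environment registry
// in createVariableRegistry above) shadows later ones for duplicate keys.
class CompositeRegistry {
    private final List<Map<String, String>> registries = new ArrayList<>();

    public void addRegistry(Map<String, String> registry) {
        registries.add(registry);
    }

    public String getVariableValue(String name) {
        for (Map<String, String> registry : registries) {
            String value = registry.get(name);
            if (value != null) {
                return value; // first registry containing the key wins
            }
        }
        return null;
    }
}
```

Under this scheme, adding the environment registry before the system-properties registry means an environment variable wins over a system property of the same name, which is exactly the behavior the reviewer was confirming.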
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322237#comment-15322237 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66409919

--- Diff: nifi-api/src/main/java/org/apache/nifi/registry/MultiMap.java ---
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.registry;
+
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+
+public class MultiMap<K, V> implements Map<K, V> {
+
+    private final List
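The quoted MultiMap diff is truncated, but the idea it names, a single map view over several backing maps, can be sketched minimally. This is an assumption about the intended semantics inferred from the tests elsewhere in this thread (`get` returns the first match in insertion order; `size` counts distinct keys across all backing maps); `MultiMapView` is a hypothetical name:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical read-only view over several backing maps. get() searches the
// maps in insertion order; size() counts distinct keys, which is consistent
// with the test above expecting 3 variables from two properties files.
class MultiMapView<K, V> {
    private final List<Map<K, V>> maps = new ArrayList<>();

    public void addMap(Map<K, V> map) {
        maps.add(map);
    }

    public V get(K key) {
        for (Map<K, V> map : maps) {
            if (map.containsKey(key)) {
                return map.get(key); // earliest map wins
            }
        }
        return null;
    }

    public int size() {
        Set<K> keys = new HashSet<>();
        for (Map<K, V> map : maps) {
            keys.addAll(map.keySet());
        }
        return keys.size(); // distinct keys, not sum of map sizes
    }
}
```

The actual class implements the full `java.util.Map` interface; this sketch keeps only the two operations whose semantics the review discusses.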
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322213#comment-15322213 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66407532

--- Diff: nifi-api/src/test/java/org/apache/nifi/registry/TestVariableRegistry.java --- (same diff as quoted earlier in this thread)
[jira] [Commented] (NIFI-1974) Support Custom Properties in Expression Language
[ https://issues.apache.org/jira/browse/NIFI-1974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15322206#comment-15322206 ] ASF GitHub Bot commented on NIFI-1974: -- Github user YolandaMDavis commented on a diff in the pull request: https://github.com/apache/nifi/pull/501#discussion_r66406906

--- Diff: nifi-api/src/test/resources/TestVariableRegistry/foobar.properties ---
@@ -0,0 +1 @@
+fake.property.3=test me out 3, test me out 4
--- End diff --

I already added the rat check, but I can add the ASF license; that's not a problem.
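One detail worth noting about the test resource above: `java.util.Properties` does not split values on commas, so `fake.property.3=test me out 3, test me out 4` loads as a single value containing the comma, which is exactly what `testReadFiles` asserts. A minimal check (`CommaValueDemo` is an illustrative name):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Demonstrates that Properties keeps a comma-containing value intact:
// the key ends at the first '=', and the rest of the line is the value.
class CommaValueDemo {
    public static String load() {
        Properties p = new Properties();
        try {
            p.load(new StringReader("fake.property.3=test me out 3, test me out 4\n"));
        } catch (IOException e) {
            throw new RuntimeException(e); // cannot happen for a StringReader
        }
        return p.getProperty("fake.property.3");
    }
}
```

So any comma-splitting seen in variable values would have to come from the registry layer, not from the properties parser.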