INT96 Error while converting Parquet File

2021-03-25 Thread Bimal Mehta
Hi,

I am trying to convert a Parquet file into CSV using the ConvertRecord
(1.13.0) processor.
I have created a controller service of type ParquetReader 1.13.0.
I set the Avro Read Compatibility property to false, as I understand this
error has its roots in an Avro dependency. I also tried setting the
property to true, but it still fails with the same error.
I am reading the file from an S3 bucket and then using the ConvertRecord
processor to convert it into CSV format for further processing. Any help
on how to handle this scenario would be appreciated.
I am using NiFi 1.13.0.
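For context on why this conversion is hard: INT96 is a deprecated 12-byte Parquet timestamp encoding (nanoseconds-of-day plus a Julian day number) with no direct Avro equivalent, which is why Avro-based readers tend to choke on it. A minimal, standard-library-only sketch of the decoding, purely as an illustration of the layout (this is not NiFi's code):

```python
import struct
from datetime import datetime, timedelta, timezone

JULIAN_EPOCH = 2440588  # Julian day number of 1970-01-01

def int96_to_datetime(raw):
    """Decode one 12-byte Parquet INT96 timestamp.

    Layout: 8 little-endian bytes of nanoseconds since midnight,
    then 4 little-endian bytes of Julian day number.
    """
    nanos, julian_day = struct.unpack("<qi", raw)
    return (datetime(1970, 1, 1, tzinfo=timezone.utc)
            + timedelta(days=julian_day - JULIAN_EPOCH,
                        microseconds=nanos // 1000))

# Julian day 2457755, zero nanoseconds into the day -> 2017-01-01 00:00:00 UTC
print(int96_to_datetime(struct.pack("<qi", 0, 2457755)))
# -> 2017-01-01 00:00:00+00:00
```

Note the decode truncates to microseconds, since Python datetimes cannot hold nanosecond precision.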


Thanks
Bimal Mehta


NiFi Single Instance Mode to Cluster Mode

2019-10-28 Thread Bimal Mehta
Hi,

We are doing a migration from a single-instance NiFi to a NiFi cluster
(with 3 nodes) using Cloudera Manager.
After starting all 3 nodes from Cloudera Manager, only 1 shows as connected
and the others appear disconnected.
In the log files for the 2 nodes that show as disconnected, we get the
following error:

455 ERROR org.apache.nifi.controller.serialization.FlowFromDOMFactory:
There was a problem decrypting a sensitive flow configuration value. Check
that the nifi.sensitive.props.key value in nifi.properties matches the
value used to encrypt the flow.xml.gz file
org.apache.nifi.encrypt.EncryptionException:
org.apache.nifi.encrypt.EncryptionException: Could not decrypt sensitive
value

I am unable to update the nifi.sensitive.props.key as it is generated by
Cloudera Manager when we start the nodes.
Any suggestions?

Thanks
Bimal Mehta


Re: ExecuteSQLRecord Bug

2019-10-08 Thread Bimal Mehta
Hi,

Just following up on the email below.
Is this a known issue? I am happy to raise a JIRA to have it fixed in the
next release, or has it already been fixed in a later version?

Thanks

On Mon, Sep 16, 2019 at 1:23 PM Bimal Mehta  wrote:

> Hi,
>
> We are using the ExecuteSQLRecord processor, which reads the query and
> certain attributes from the previous flow file.
> We have the following properties set:
> Max Rows Per Flow File: 5
> Output Batch Size: 5
>
> In our case we have 1 million records coming from the source, and 20
> flow files get created and passed downstream.
> However, the inherited attributes are only passed to the first 5 flow
> files; they are not passed to the other 15.
> Is this a bug? How do we overcome this?
>
> We are on NiFi 1.9.0
>
> Thanks
> Bimal Mehta
>


ExecuteSQLRecord Bug

2019-09-16 Thread Bimal Mehta
Hi,

We are using the ExecuteSQLRecord processor, which reads the query and
certain attributes from the previous flow file.
We have the following properties set:
Max Rows Per Flow File: 5
Output Batch Size: 5

In our case we have 1 million records coming from the source, and 20
flow files get created and passed downstream.
However, the inherited attributes are only passed to the first 5 flow
files; they are not passed to the other 15.
Is this a bug? How do we overcome this?

We are on NiFi 1.9.0

Thanks
Bimal Mehta


Re: Variables to start a NiFi flow

2019-08-27 Thread Bimal Mehta
Thanks Peter.
The GenerateFlowFile option was a good fit for our case. We are also able
to trigger it from a shell script using curl.
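For anyone trying the same curl-based trigger: starting a processor goes through a PUT to its run-status endpoint in the NiFi REST API. A minimal sketch that only builds the URL and JSON body (host, port, and processor id are placeholders; TLS and authentication are omitted):

```python
import json

NIFI_API = "http://localhost:8080/nifi-api"  # placeholder NiFi host/port

def run_status_request(proc_id, client_id, version):
    """Build the URL and JSON body to start a processor.

    Sent as: curl -X PUT -H 'Content-Type: application/json' -d "$body" "$url"
    The revision (clientId/version) must match what GET /processors/{id} returns.
    """
    url = "%s/processors/%s/run-status" % (NIFI_API, proc_id)
    body = json.dumps({
        "revision": {"clientId": client_id, "version": version},
        "state": "RUNNING",
    })
    return url, body

# Placeholder processor id; the real one comes from the NiFi UI or API.
url, body = run_status_request("0163f0a1-0000-1000-abcd-000000000000", "shell-trigger", 0)
print(url)
```

Sending `"state": "STOPPED"` through the same endpoint stops the processor again after the run.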


On Tue, Aug 27, 2019, 1:45 AM Peter Turcsanyi 
wrote:

> Hi Bimal,
>
> With Variable Registry, you can implement it in the following way:
> Put your flow into a Process Group. Use variable references in your
> processors (eg. ${db.table}) and define the variables at the process group
> level. Then copy the process group (by simply copying it or creating a
> template from it first) and set the variables to the proper values in each
> process group. You can also configure separate scheduling in each process
> group.
> The drawback is that you need to duplicate your flow.
>
> Another approach:
> Define your flow only once and use FlowFile attributes instead of
> variables in variable registry.
> Use GenerateFlowFile and add the FlowFile attributes via the dynamic
> properties of this processor. Configure a separate GenerateFlowFile for
> each of your source tables and connect them to the same "SQL" processor
> (which was the entry point earlier). Configure the scheduling on these
> GenerateFlowFile-s.
> The problem is that not all "SQL" processors support flowfile input. You
> can use ExecuteSQL(Record) or GenerateTableFetch in this way, but not
> QueryDatabaseTable.
>
> Regards,
> Peter
>
> On Mon, Aug 26, 2019 at 6:30 PM Bimal Mehta  wrote:
>
>> Hi,
>>
>> We have a data flow which extracts data from a source database table and
>> loads it into a target Hive table. This flow needs to run several times a
>> day to get delta records from the source table, and also for multiple
>> tables. Now we need to replicate this same process for all the different
>> source tables. So rather than creating multiple data flows, one per table,
>> can I use the existing flow and pass parameters, like the source table
>> name, to that flow when it starts? Basically, I am looking for an interface
>> where the user can pass the table names that we want to load at a given
>> point in time and the flow is triggered for that table. Variable Registry
>> comes to mind, but I am not sure how to make it work for this use case. We
>> are using NiFi 1.9.0 as part of the CDF bundle.
>>
>> Thanks
>> Bimal Mehta
>>
>


Variables to start a NiFi flow

2019-08-26 Thread Bimal Mehta
Hi,

We have a data flow which extracts data from a source database table and
loads it into a target Hive table. This flow needs to run several times a
day to get delta records from the source table, and also for multiple
tables. Now we need to replicate this same process for all the different
source tables. So rather than creating multiple data flows, one per table,
can I use the existing flow and pass parameters, like the source table
name, to that flow when it starts? Basically, I am looking for an interface
where the user can pass the table names that we want to load at a given
point in time and the flow is triggered for that table. Variable Registry
comes to mind, but I am not sure how to make it work for this use case. We
are using NiFi 1.9.0 as part of the CDF bundle.

Thanks
Bimal Mehta


Re: Custom Processor Upgrade

2019-08-20 Thread Bimal Mehta
Thanks all for the support.
We resolved the issue by creating a new NAR file. It turned out the old
version of the code was using some outdated dependencies, which were not
getting fixed by upgrading the processor alone.


On Thu, Aug 15, 2019 at 2:06 PM James Srinivasan 
wrote:

> I find strace (or procmon for Windows) very handy to debug such resource
> loading issues.
>
> On Thu, 15 Aug 2019, 19:02 Bryan Bende,  wrote:
>
>> I was making sure you didn't have any code that was dependent on the
>> internal structure of how the NARs are unpacked.
>>
>> I can't explain why it can't find the application-context.xml since I
>> don't have access to your code, but I don't see why that would be
>> related to moving to NAR_INF from META_INF, nothing should really be
>> relying on that structure.
>>
>> On Thu, Aug 15, 2019 at 1:27 PM Bimal Mehta  wrote:
>> >
>> > It's inside one of the jars within the NAR_INF folder. Does it need to
>> be somewhere else?
>> > Also, I think we extended the AbstractNiFiProcessor from the custom Kylo
>> processor while migrating it, as the Kylo processor was not working as is in our
>> environment. Will check that and have it packaged in the processors module.
>> >
>> > On Thu, Aug 15, 2019 at 1:50 AM Bryan Bende  wrote:
>> >>
>> >> Where is application-context.xml in your NAR?
>> >>
>> >> And how are you trying to load it in
>> com.thinkbiganalytics.nifi.processor.AbstractNiFiProcessor ?
>> >>
>> >> I would expect it to be packaged into the jar that contains your
>> processors, most likely in src/main/resources of the processors module
>> which then ends up at the root of the jar.
>> >>
>> >> On Wed, Aug 14, 2019 at 5:36 PM Bimal Mehta 
>> wrote:
>> >>>
>> >>> Ahh, seems like a Spring error.
>> >>> Is it to do with the upgraded Jetty server?
>> >>>
>> >>> Caused by:
>> org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected
>> exception parsing XML document from class path resource
>> [application-context.xml]; nested exception is
>> org.springframework.beans.FatalBeanException: Class
>> [org.springframework.context.config.ContextNamespaceHandler] for namespace [
>> http://www.springframework.org/schema/context] does not implement the
>> [org.springframework.beans.factory.xml.NamespaceHandler] interface

Creating External Hive Table on Parquet Files

2019-08-20 Thread Bimal Mehta
Hi,
We have a sub-flow:
QueryDatabaseTable->SplitAvro->ConvertAvroToORC->PutHDFS->ReplaceText->PutHQL.

We are using ConvertAvroToORC for generating the DDL.
The flow works fine. However, in our Cloudera distribution we are not able
to query the table that has been created, as the DDL creates the table as
ORC, and the Cloudera distribution we have does not support ORC.
We now want to create the external table as Parquet.
Let's say that instead of ConvertAvroToORC, I use ConvertAvroToParquet. How
do we create the external table as Parquet and point it to the HDFS Parquet
file? There is no hive.ddl attribute now, as it only gets generated when
using ConvertAvroToORC.
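One option when ConvertAvroToORC's generated DDL is unavailable is to write the DDL yourself in the ReplaceText processor and send it to PutHiveQL as before. A hypothetical example, where the database, columns, and HDFS location are placeholders to be adapted to the actual Avro schema:

```sql
-- Placeholder database, columns, and location; match them to your Avro schema.
CREATE EXTERNAL TABLE IF NOT EXISTS mydb.my_table (
  id BIGINT,
  name STRING,
  updated_at TIMESTAMP
)
STORED AS PARQUET
LOCATION '/data/landing/my_table';
```

The LOCATION should be the directory PutHDFS writes the Parquet files into, so Hive picks up new files as they land.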


Re: Custom Processor Upgrade

2019-08-15 Thread Bimal Mehta
It's inside one of the jars within the NAR_INF folder. Does it need to be
somewhere else?
Also, I think we extended the AbstractNiFiProcessor from the custom Kylo
processor while migrating it, as the Kylo processor was not working as is
in our environment. Will check that and have it packaged in the processors
module.

On Thu, Aug 15, 2019 at 1:50 AM Bryan Bende  wrote:

> Where is application-context.xml in your NAR?
>
> And how are you trying to load it in 
> com.thinkbiganalytics.nifi.processor.AbstractNiFiProcessor
> ?
>
> I would expect it to be packaged into the jar that contains your
> processors, most likely in src/main/resources of the processors module
>  which then ends up at the root of the jar.
>
> On Wed, Aug 14, 2019 at 5:36 PM Bimal Mehta  wrote:
>
>> Ahh, seems like a Spring error.
>> Is it to do with the upgraded Jetty server?
>>
>> Caused by:
>> org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected
>> exception parsing XML document from class path resource
>> [application-context.xml]; nested exception is
>> org.springframework.beans.FatalBeanException: Class
>> [org.springframework.context.config.ContextNamespaceHandler] for namespace [
>> http://www.springframework.org/schema/context] does not implement the
>> [org.springframework.beans.factory.xml.NamespaceHandler] interface
>> Caused by: org.springframework.beans.FatalBeanException: Class
>> [org.springframework.context.config.ContextNamespaceHandler] for namespace [
>> http://www.springframework.org/schema/context] does not implement the
>> [org.springframework.beans.factory.xml.NamespaceHandler] interface

Re: Custom Processor Upgrade

2019-08-14 Thread Bimal Mehta
Ahh, seems like a Spring error.
Is it to do with the upgraded Jetty server?

Caused by: org.springframework.beans.factory.BeanDefinitionStoreException:
Unexpected exception parsing XML document from class path resource
[application-context.xml]; nested exception is
org.springframework.beans.FatalBeanException: Class
[org.springframework.context.config.ContextNamespaceHandler] for namespace [
http://www.springframework.org/schema/context] does not implement the
[org.springframework.beans.factory.xml.NamespaceHandler] interface
at
org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:414)
at
org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:336)
at
org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:304)
at
org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:181)
at
org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:217)
at
org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:188)
at
org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:252)
at
org.springframework.context.support.AbstractXmlApplicationContext.loadBeanDefinitions(AbstractXmlApplicationContext.java:127)
at
org.springframework.context.support.AbstractXmlApplicationContext.loadBeanDefinitions(AbstractXmlApplicationContext.java:93)
at
org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:129)
at
org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:609)
at
org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:510)
at
org.springframework.context.support.ClassPathXmlApplicationContext.(ClassPathXmlApplicationContext.java:139)
at
org.springframework.context.support.ClassPathXmlApplicationContext.(ClassPathXmlApplicationContext.java:83)
at
com.thinkbiganalytics.nifi.processor.AbstractNiFiProcessor.init(AbstractNiFiProcessor.java:48)
at
org.apache.nifi.processor.AbstractSessionFactoryProcessor.initialize(AbstractSessionFactoryProcessor.java:63)
at
org.apache.nifi.controller.ExtensionBuilder.createLoggableProcessor(ExtensionBuilder.java:421)
... 50 common frames omitted
Caused by: org.springframework.beans.FatalBeanException: Class
[org.springframework.context.config.ContextNamespaceHandler] for namespace [
http://www.springframework.org/schema/context] does not implement the
[org.springframework.beans.factory.xml.NamespaceHandler] interface
at
org.springframework.beans.factory.xml.DefaultNamespaceHandlerResolver.resolve(DefaultNamespaceHandlerResolver.java:128)
at
org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1406)
at
org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1401)
at
org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:168)
at
org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.doRegisterBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:138)
at
org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:94)
at
org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:508)
at
org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:392)
... 66 common frames omitted

On Wed, Aug 14, 2019 at 4:44 PM Bryan Bende  wrote:

> You have to add another instance of the processor, which should
> generate the same stacktrace you sent earlier, except this time there
> should be a second part to it with "Caused by " and then more of
> the stacktrace that wasn't there before.
>
> On Wed, Aug 14, 2019 at 4:41 PM Bimal Mehta  wrote:
> >
> > Hi Bryan,
> >
> > I did what you said.
> > This is what I got
> >
> > 2019-08-14 20:16:18,948 DEBUG [Validate Components Thread-3]
> o.a.n.controller.AbstractComponentNode Computed validation errors with
> Validation Context StandardValidationContext[componentId=
> 6fbe2407-7799-3908-f4c4-bf2f8940bf1e ,
> properties={PropertyDescriptor[Header Line Count]=1,
> PropertyDescriptor[Enable processing]=${searchTerm}}]; results = ['Missing
> Processor' validated against 'Any Property' is invalid because Processor is

Re: Custom Processor Upgrade

2019-08-14 Thread Bimal Mehta
Hi Bryan,

I did what you said.
This is what I got

2019-08-14 20:16:18,948 DEBUG [Validate Components Thread-3]
o.a.n.controller.AbstractComponentNode Computed validation errors with
Validation Context StandardValidationContext[componentId=
6fbe2407-7799-3908-f4c4-bf2f8940bf1e ,
properties={PropertyDescriptor[Header Line Count]=1,
PropertyDescriptor[Enable processing]=${searchTerm}}]; results = ['Missing
Processor' validated against 'Any Property' is invalid because Processor is
of type  org.apache.nifi.init.InitiateScan, but this is not a valid
Processor type]
2019-08-14 20:16:18,946 DEBUG [Timer-Driven Process Thread-6]
o.a.n.c.r.m.SecondPrecisionEventContainer Updated bin 39. Did NOT replace.

On Wed, Aug 14, 2019 at 1:42 PM Bryan Bende  wrote:

> Can you edit logback.xml and add the following, then get the stacktrace
> again?
>
> 
>
> This should include a root cause exception which we are missing right now.
>
> I think it takes about 20-30 seconds for logback to pick up the edits
> to logback.xml.
>
> On Wed, Aug 14, 2019 at 12:53 PM Bimal Mehta  wrote:
> >
> > For the custom processor we have, we are extending
> AbstractNiFiProcessor.java.
> > The processor is used to scan the metadata of an incoming flow file.
> > The error we get in the logs is as below.
> > 2019-08-13 23:21:21,529 ERROR [main]
> o.a.nifi.controller.ExtensionBuilder Could not create Processor of type
> org.apache.nifi.init.InitiateScan for ID
> 6fbe2407-7799-3908-f4c4-bf2f8940bf1e; creating "Ghost" implementation
> > org.apache.nifi.controller.exception.ProcessorInstantiationException:
> org.apache.nifi.init.InitiateScan
> > at
> org.apache.nifi.controller.ExtensionBuilder.createLoggableProcessor(ExtensionBuilder.java:425)
> > at
> org.apache.nifi.controller.ExtensionBuilder.buildProcessor(ExtensionBuilder.java:191)
> > at
> org.apache.nifi.controller.flow.StandardFlowManager.createProcessor(StandardFlowManager.java:298)
> > at
> org.apache.nifi.controller.flow.StandardFlowManager.createProcessor(StandardFlowManager.java:274)
> > at
> org.apache.nifi.controller.StandardFlowSynchronizer.addProcessGroup(StandardFlowSynchronizer.java:1262)
> > at
> org.apache.nifi.controller.StandardFlowSynchronizer.addProcessGroup(StandardFlowSynchronizer.java:1389)
> > at
> org.apache.nifi.controller.StandardFlowSynchronizer.addProcessGroup(StandardFlowSynchronizer.java:1389)
> > at
> org.apache.nifi.controller.StandardFlowSynchronizer.sync(StandardFlowSynchronizer.java:362)
> > at
> org.apache.nifi.controller.FlowController.synchronize(FlowController.java:1296)
> > at
> org.apache.nifi.persistence.StandardXMLFlowConfigurationDAO.load(StandardXMLFlowConfigurationDAO.java:88)
> > at
> org.apache.nifi.controller.StandardFlowService.loadFromBytes(StandardFlowService.java:812)
> > at
> org.apache.nifi.controller.StandardFlowService.load(StandardFlowService.java:557)
> > at
> org.apache.nifi.web.contextlistener.ApplicationStartupContextListener.contextInitialized(ApplicationStartupContextListener.java:72)
> > at
> org.eclipse.jetty.server.handler.ContextHandler.callContextInitialized(ContextHandler.java:953)
> > at
> org.eclipse.jetty.servlet.ServletContextHandler.callContextInitialized(ServletContextHandler.java:558)
> > at
> org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:918)
> > at
> org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:370)
> > at
> org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1497)
> > at
> org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1459)
> > at
> org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:848)
> > at
> org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:287)
> > at
> org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:545)
> > at
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> > at
> org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:138)
> > at
> org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
> > at
> org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:113)
> > at
> org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> > at
> org.eclipse.jetty.util.component.Conta

Re: Custom Processor Upgrade

2019-08-14 Thread Bimal Mehta
r.java:113)
at org.eclipse.jetty.server.Server.doStart(Server.java:386)
at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at
org.apache.nifi.web.server.JettyServer.start(JettyServer.java:935)
at org.apache.nifi.NiFi.(NiFi.java:158)
at org.apache.nifi.NiFi.(NiFi.java:72)
at org.apache.nifi.NiFi.main(NiFi.java:297)

On Wed, Aug 14, 2019 at 7:42 AM Bryan Bende  wrote:

> Without access to the code for your NAR I can only really guess, but it
> sounds like an exception is happening when trying to call the constructor
> of your processor and then it bounces into creating a ghost processor.
>
> What is in the logs at the time you get the ghost processor?
>
> On Tue, Aug 13, 2019 at 10:54 PM Bimal Mehta  wrote:
>
>> Does that mean I need to recreate the processor? Or is there some
>> workaround?
>>
>> The processor gets unpacked and its bundled dependencies go in NAR_INF.
>> However when I drag the processor on the canvas, it comes with a yellow
>> triangle (and gives the error message I stated above) and properties are
>> missing as well.
>>
>>
>> On Tue, Aug 13, 2019 at 10:47 PM Bryan Bende  wrote:
>>
>>> I don’t remember all the reasoning behind the change, but it had to do
>>> with an issue when we upgraded Jetty...
>>>
>>> https://issues.apache.org/jira/browse/NIFI-5479
>>>
>>> On Tue, Aug 13, 2019 at 9:47 PM Bimal Mehta  wrote:
>>>
>>>> Yes it does show as an option.
>>>> One thing I noticed is that when the NAR is unpacked, the bundled
>>>> dependencies are inside META_INF in the work folder in NiFi 1.6.0;
>>>> however, in NiFi 1.9.0 they go inside NAR_INF.
>>>> Why does this happen?
>>>> It seems the custom processor that we have uses Spring, and
>>>> references an application-context.xml file which was inside META_INF
>>>> when it was built. However, I can't see that file anymore in the unpacked NAR.
>>>>
>>>> On Tue, Aug 13, 2019 at 8:57 PM Bryan Bende  wrote:
>>>>
>>>>> Does that custom processor type show as an option if you try to add a
>>>>> new processor to the canvas?
>>>>>
>>>>> On Tue, Aug 13, 2019 at 4:54 PM Bimal Mehta 
>>>>> wrote:
>>>>>
>>>>>> Hi Mike and Bryan,
>>>>>>
>>>>>> One of my custom processors appears as inactive in NiFi with a yellow
>>>>>> triangle error.
>>>>>> When I hover over it I see a message saying 'Missing Processor'
>>>>>> validated against 'Any Property' is invalid. This is not a valid 
>>>>>> processor.
>>>>>> In the log it seems to invoke GhostProcessor.java, which is giving the
>>>>>> above error when restarting NiFi.
>>>>>> This custom processor sits (with my other processors) in my
>>>>>> custom_lib  folder and I have provided that path in the nifi properties
>>>>>> file as
>>>>>>
>>>>>> *nifi.nar.library.directory.custom=/opt/nifi/custom_lib*
>>>>>>
>>>>>>
>>>>>> Not sure what I missed?
>>>>>>
>>>>>> Do I need to make entry of this custom processor somewhere?
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 8, 2019 at 9:14 AM Bimal Mehta 
>>>>>> wrote:
>>>>>>
>>>>>>> Thanks Mike and Bryan.
>>>>>>> Yes, it seems my template was still referring to the old version.
>>>>>>> I will have it updated now and will reimport.
>>>>>>> Also the version of NiFi we are using is the one that comes with
>>>>>>> CDF. I am not sure if CDF supports 1.9.2 yet or not. I will reach out to
>>>>>>> Cloudera and see if we can get it upgraded.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 8, 2019, 8:51 AM Bryan Bende  wrote:
>>>>>>>
>>>>>>>> What is in the template for the bundle coordinates of your
>>>>>>>> processor?
>>>>>>>> and does that match the coordinates of the NAR that is deployed?
>>>>>>>>
>>>>>>>> Example:
>>>>>>>>
>>>>>>>>
>>>>>>>>   org.apache.nifi
>>>>>>>

Re: Custom Processor Upgrade

2019-08-13 Thread Bimal Mehta
Does that mean I need to recreate the processor? Or is there some
workaround?

The processor gets unpacked and its bundled dependencies go in NAR_INF.
However when I drag the processor on the canvas, it comes with a yellow
triangle (and gives the error message I stated above) and properties are
missing as well.


On Tue, Aug 13, 2019 at 10:47 PM Bryan Bende  wrote:

> I don’t remember all the reasoning behind the change, but it had to do
> with an issue when we upgraded Jetty...
>
> https://issues.apache.org/jira/browse/NIFI-5479
>
> On Tue, Aug 13, 2019 at 9:47 PM Bimal Mehta  wrote:
>
>> Yes it does show as an option.
>> One thing I noticed is that when the NAR is unpacked, the bundled
>> dependencies are inside META_INF in the work folder in NiFi 1.6.0;
>> however, in NiFi 1.9.0 they go inside NAR_INF.
>> Why does this happen?
>> It seems the custom processor that we have uses Spring, and
>> references an application-context.xml file which was inside META_INF
>> when it was built. However, I can't see that file anymore in the unpacked NAR.
>>
>> On Tue, Aug 13, 2019 at 8:57 PM Bryan Bende  wrote:
>>
>>> Does that custom processor type show as an option if you try to add a
>>> new processor to the canvas?
>>>
>>> On Tue, Aug 13, 2019 at 4:54 PM Bimal Mehta  wrote:
>>>
>>>> Hi Mike and Bryan,
>>>>
>>>> One of my custom processors appears as inactive in NiFi with a yellow
>>>> triangle error.
>>>> When I hover over it I see a message saying 'Missing Processor'
>>>> validated against 'Any Property' is invalid. This is not a valid processor.
>>>> In the log it seems to invoke GhostProcessor.java, which is giving the
>>>> above error when restarting NiFi.
>>>> This custom processor sits (with my other processors) in my custom_lib
>>>> folder and I have provided that path in the nifi properties file as
>>>>
>>>> *nifi.nar.library.directory.custom=/opt/nifi/custom_lib*
>>>>
>>>>
>>>> Not sure what I missed?
>>>>
>>>> Do I need to make entry of this custom processor somewhere?
>>>>
>>>>
>>>> On Thu, Aug 8, 2019 at 9:14 AM Bimal Mehta  wrote:
>>>>
>>>>> Thanks Mike and Bryan.
>>>>> Yes, it seems my template was still referring to the old version.
>>>>> I will have it updated now and will reimport.
>>>>> Also the version of NiFi we are using is the one that comes with CDF.
>>>>> I am not sure if CDF supports 1.9.2 yet or not. I will reach out to
>>>>> Cloudera and see if we can get it upgraded.
>>>>>
>>>>>
>>>>>
>>>>> On Thu, Aug 8, 2019, 8:51 AM Bryan Bende  wrote:
>>>>>
>>>>>> What is in the template for the bundle coordinates of your processor?
>>>>>> and does that match the coordinates of the NAR that is deployed?
>>>>>>
>>>>>> Example:
>>>>>>
>>>>>> <bundle>
>>>>>>   <group>org.apache.nifi</group>
>>>>>>   <artifact>nifi-update-attribute-nar</artifact>
>>>>>>   <version>1.10.0-SNAPSHOT</version>
>>>>>> </bundle>
>>>>>>
>>>>>> If you made a new version of your NAR, say 2.0.0 and your template
>>>>>> references 1.0.0, then you'll need to update your template.
>>>>>>
>>>>>> On Wed, Aug 7, 2019 at 10:05 PM Mike Thomsen 
>>>>>> wrote:
>>>>>> >
>>>>>> > If it's happening immediately upon trying to import the template, I
>>>>>> believe that's the error message saying that the 1.9 instance cannot find
>>>>>> the NAR file which provided the processor. Also, if you're referring to
>>>>>> 1.9.0 and not 1.9.2 you're going to want to upgrade to the latter because
>>>>>> there are a few critical bugs fixed in 1.9.2.
>>>>>> >
>>>>>> > On Wed, Aug 7, 2019 at 9:19 PM Bimal Mehta 
>>>>>> wrote:
>>>>>> >>
>>>>>> >> Thanks Bryan.
>>>>>> >> My custom processors are part of a template. However when I try to
>>>>>> import my template in NiFi 1.9, I get an error message saying
>>>>>> >> PutFeedMetadata is not known to this NiFi instance. I did update
>>>>>> all the dependencies to NiFi 1.9 and even the p

Re: Custom Processor Upgrade

2019-08-13 Thread Bimal Mehta
Yes, it does show as an option.
One thing I noticed is that when the NAR is unpacked, the bundled
dependencies are inside META-INF in the work folder in NiFi 1.6.0; however,
in NiFi 1.9.0 they go inside NAR-INF.
Why does this happen?
It seems the custom processor that we have uses Spring Boot and references
an applicationcontext file, which was inside META-INF when it was built.
However, I can't see that file anymore in the unpacked NAR.
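A quick way to see which layout a given NAR actually uses is to list its entries. This is a minimal sketch, assuming a local `.nar` file path (the path and filenames below are illustrative, not taken from the thread):

```python
# Sketch: list a NAR's bundled-dependency entries to see which layout it uses.
# NiFi 1.9 places them under NAR-INF/bundled-dependencies/, while older
# builds used META-INF/bundled-dependencies/.
import zipfile

def bundled_dependencies(nar_path):
    """Return jar entries under either known bundled-dependencies prefix."""
    prefixes = ("NAR-INF/bundled-dependencies/", "META-INF/bundled-dependencies/")
    with zipfile.ZipFile(nar_path) as nar:
        return [name for name in nar.namelist()
                if name.startswith(prefixes) and name.endswith(".jar")]
```

Running this against the NAR built for 1.6 and the one built for 1.9 should show the prefix difference directly.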

On Tue, Aug 13, 2019 at 8:57 PM Bryan Bende  wrote:

> Does that custom processor type show as an option if you try to add a new
> processor to the canvas?


Re: Custom Processor Upgrade

2019-08-13 Thread Bimal Mehta
Hi Mike and Bryan,

One of my custom processors appears as inactive in NiFi with a yellow
triangle error.
When I hover over it I see a message saying 'Missing Processor' validated
against 'Any Property' is invalid: "This is not a valid processor."
In the log it seems to invoke GhostProcessor.java, which gives the above
error when restarting NiFi.
This custom processor sits (with my other processors) in my custom_lib
folder, and I have provided that path in the nifi.properties file as

*nifi.nar.library.directory.custom=/opt/nifi/custom_lib*

Not sure what I missed.

Do I need to make an entry for this custom processor somewhere?
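For reference, this is a minimal sketch of the configuration and the directory layout it expects (the NAR filename is only illustrative):

```properties
# nifi.properties: extra NAR library directory (NiFi must be restarted
# after adding or replacing NARs here)
nifi.nar.library.directory.custom=/opt/nifi/custom_lib

# Expected directory contents, e.g.:
#   /opt/nifi/custom_lib/my-custom-processors-nar-1.0.0.nar
```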




Re: Data Ingestion using NiFi

2019-08-13 Thread Bimal Mehta
Thanks Mike.
ExecuteSQL looks good, and I am trying it.

I also wanted to understand how we can control triggering the NiFi jobs
from DevOps tools like CloudBees/ElectricFlow.
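On the triggering question, one common approach is for the pipeline step to call NiFi's REST API and schedule a process group. A minimal sketch, assuming NiFi 1.x's `/nifi-api/flow/process-groups/{id}` endpoint; the host and process-group id are placeholders, and a secured instance would also need an Authorization header:

```python
# Sketch: have a CI/CD step (e.g. a CloudBees/ElectricFlow job) start a NiFi
# process group through the REST API. Host, port, and process-group id are
# placeholders.
import json
import urllib.request

def start_process_group_request(base_url, pg_id):
    """Build the PUT request that schedules all components in a process group."""
    body = json.dumps({"id": pg_id, "state": "RUNNING"}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/nifi-api/flow/process-groups/{pg_id}",
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )

# To fire it from the pipeline step:
#   urllib.request.urlopen(start_process_group_request("http://nifi-host:8080", "<pg-id>"))
```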

On Tue, Aug 13, 2019 at 7:35 AM Mike Thomsen  wrote:

> Bimal,
>
> 1. Take a look at ExecuteSQLRecord and see if that works for you. I don't
> use SQL databases that much, but it works like a charm for me and others
> for querying and getting an inferred avro schema based on the schema of the
> database table (you can massage it into another format with ConvertRecord).
> 2. Take a look at QueryRecord and PartitionRecord with them configured to
> use Avro readers and writers.
>
> Mike
>


Data Ingestion using NiFi

2019-08-12 Thread Bimal Mehta
Hi NiFi users,

We had been using the Kylo data ingest template to read data from our
Oracle and DB2 databases and move it into HDFS and Hive.
The Kylo data ingest template also provided some features to validate,
profile, and split the data based on validation rules. We also built some
custom processors and added them to the template.
We recently migrated to NiFi 1.9.0 (CDF), and a lot of Kylo processors
don't work there. We were able to make our custom processors work in 1.9.0,
but the Kylo NAR files don't work. I don't know if any workaround exists
for that.

However, given that the Kylo project is dead, I don't want to depend on
those Kylo NAR files and processors. What I wanted to understand is how I
can replicate that functionality using the standard processors available in
NiFi.

Essentially, are there processors that allow me to do the below:
1. Read data from a database - I know QueryDatabaseTable. Any other? How do
I parameterize it so that I don't need to create one flow per table? How
can we pass the table name while running the job?
2. Partition and convert to Avro - I know SplitAvro, but does it partition
as well, and how do I pass the partition parameters?
3. Write data to HDFS and Hive - I know PutHDFS works for writing to HDFS,
but should I use PutSQL for Hive by converting the Avro in step 2 to SQL?
Or is there a better option? Does this support upserts as well?
4. Apply validation rules to the data before it is written into Hive, like
calling a custom Spark job that will execute the validation rules and split
the data. Is there any processor that can help achieve this?

I know a few users in this group have used Kylo on top of NiFi. It would be
great if some of you could provide your perspective as well.

Thanks in advance.

Bimal Mehta
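For what it's worth, points 1-3 can be sketched as a single parameterized flow along these lines. The processor choices and attribute/property names here are suggestions only, not a confirmed recipe from this thread:

```
ListDatabaseTables      -> one FlowFile per table, table name in an attribute
ExecuteSQLRecord        -> SQL select query: SELECT * FROM ${db.table.name}
                           (Avro writer; schema inferred from the table)
PartitionRecord         -> Avro reader/writer; partition on a record field
PutHDFS                 -> Directory: /data/raw/${db.table.name}
PutHiveQL / PutHive3QL  -> create/load the Hive table
```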


Re: Custom Processor Upgrade

2019-08-08 Thread Bimal Mehta
Thanks Mike and Bryan.
Yes, it seems my template was still referring to the old version.
I will have it updated now and will reimport it.
Also, the version of NiFi we are using is the one that comes with CDF. I am
not sure whether CDF supports 1.9.2 yet. I will reach out to Cloudera and
see if we can get it upgraded.



On Thu, Aug 8, 2019, 8:51 AM Bryan Bende  wrote:

> What is in the template for the bundle coordinates of your processor?
> and does that match the coordinates of the NAR that is deployed?
>
> Example:
>
> <bundle>
>   <group>org.apache.nifi</group>
>   <artifact>nifi-update-attribute-nar</artifact>
>   <version>1.10.0-SNAPSHOT</version>
> </bundle>
>
> If you made a new version of your NAR, say 2.0.0 and your template
> references 1.0.0, then you'll need to update your template.
>
> On Wed, Aug 7, 2019 at 10:05 PM Mike Thomsen 
> wrote:
> >
> > If it's happening immediately upon trying to import the template, I
> believe that's the error message saying that the 1.9 instance cannot find
> the NAR file which provided the processor. Also, if you're referring to
> 1.9.0 and not 1.9.2 you're going to want to upgrade to the latter because
> there are a few critical bugs fixed in 1.9.2.


Re: Custom Processor Upgrade

2019-08-07 Thread Bimal Mehta
Thanks Bryan.
My custom processors are part of a template. However, when I try to import
my template in NiFi 1.9, I get an error message saying
"PutFeedMetadata is not known to this NiFi instance." I did update all the
dependencies to NiFi 1.9, and even the plugins. We are using a
Cloudera-distributed version of NiFi 1.9.
Any idea why this is happening?

Thanks



On Wed, Aug 7, 2019 at 3:46 PM Bryan Bende  wrote:

> Hello,
>
> Most likely your processor built against 1.6 would run fine in 1.9,
> but to make sure you just need to update any nifi dependencies in your
> poms to 1.9.2.
>
> If you created your project from the archetype and didn't change
> anything, then this should just be changing the parent in the root pom
> to the new version of nifi-nar-bundles.
>
> If you set it up yourself, then anywhere you depend on nifi-api you
> need to change.
>
> -Bryan
>
> On Wed, Aug 7, 2019 at 3:18 PM Bimal Mehta  wrote:
> >
> > Hi,
> >
> > If we have a custom processor that was created with NiFi 1.6, what are
> the steps we need to follow to make it work in 1.9?
> > Is there some sort of steps that explains the jar and pom updates we
> need to do for making it work in 1.9?
>


Custom Processor Upgrade

2019-08-07 Thread Bimal Mehta
Hi,

If we have a custom processor that was created with NiFi 1.6, what are the
steps we need to follow to make it work in 1.9?
Is there some sort of guide that explains the jar and pom updates we need
to make for it to work in 1.9?
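A sketch of the kind of pom change involved, assuming an archetype-generated bundle (the parent artifact name is the archetype default, and the target version is illustrative):

```xml
<!-- Root pom of the bundle: bump the parent from the 1.6.0 line to 1.9.2 -->
<parent>
    <groupId>org.apache.nifi</groupId>
    <artifactId>nifi-nar-bundles</artifactId>
    <version>1.9.2</version>
</parent>
```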