Re: HDF NiFi - Does NiFi write provenance/data on HDP nodes?

2017-06-15 Thread Shashi Vishwakarma
Hi Koji

I am trying to evaluate HDF NiFi from a security perspective. I want to make
sure that when HDF NiFi talks to HDP, it does not leak/spill any kind of
information on the HDP data nodes (i.e. on their local disks). I am fine if it
writes to HDFS.




On Thu, Jun 15, 2017 at 2:35 AM, Koji Kawamura <ijokaruma...@gmail.com>
wrote:

> Hi Shashi,
>
> Sorry for the delayed response. I am not aware of NiFi writing any
> provenance information on HDP nodes. But if your goal is to expose
> NiFi provenance data to HDFS, Hive (or Spark) to analyze provenance
> data using those services, then SiteToSiteProvenanceReportingTask
> might be helpful.
>
> SiteToSiteProvenanceReportingTask can send provenance events in JSON
> format. You can send them to a NiFi input port and then pass them into
> HDFS with a PutHDFS processor.
>
> If not, would you elaborate what you are trying to accomplish?
>
> Thanks,
> Koji
>
> On Mon, Jun 12, 2017 at 6:25 AM, Shashi Vishwakarma
> <shashi.vish...@gmail.com> wrote:
> > Hi
> >
> > I have an HDF cluster with 3 NiFi instances which launches jobs (Hive/Spark)
> > on an HDP cluster. Usually NiFi writes all information to the different
> > repositories available on the local machine.
> >
> > My question is - does NiFi write any data or provenance information, or do
> > any spilling, on the HDP nodes (e.g. the data nodes in the HDP cluster)
> > while accessing the HDFS, Hive or Spark services?
> >
> > Thanks
> >
> > Shashi
>
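A minimal sketch of the flow Koji describes (the input port name here is a
placeholder, not from the thread):

    SiteToSiteProvenanceReportingTask (source NiFi)
        --> Input Port "provenance-in" (reached via Site-to-Site)
        --> PutHDFS (lands the JSON provenance events in HDFS)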


HDF NiFi - Does NiFi write provenance/data on HDP nodes?

2017-06-11 Thread Shashi Vishwakarma
Hi

I have an HDF cluster with 3 NiFi instances which launches jobs (Hive/Spark) on
an HDP cluster. Usually NiFi writes all information to the different
repositories available on the local machine.

My question is - does NiFi write any data or provenance information, or do any
spilling, on the HDP nodes (e.g. the data nodes in the HDP cluster) while
accessing the HDFS, Hive or Spark services?

Thanks

Shashi


Re: Keytab Configuration for Nifi processor

2017-06-09 Thread Shashi Vishwakarma
The PutHDFS processor does not resolve the hostname when I pass nifi/_HOST@REALM.
Is there any way to configure it?

On Fri, Jun 9, 2017 at 10:52 AM, Shashi Vishwakarma <
shashi.vish...@gmail.com> wrote:

> Hi
>
> The above solution did not work. In the log I can see the Kerberos error
> "Unable to obtain password". NiFi is not able to resolve the _HOST value.
>
> Thanks
> Shashi
>
> On Thu, Jun 8, 2017 at 9:10 PM, Pierre Villard <
> pierre.villard...@gmail.com> wrote:
>
>> Hi,
>>
>> Using nifi/_HOST@REALM
>> should resolve your problem.
>>
>> Hope this helps.
>>
>>
>> 2017-06-08 22:00 GMT+02:00 Shashi Vishwakarma <shashi.vish...@gmail.com>:
>>
>>> Hi
>>>
>>> I have a 3-node NiFi cluster (installed via Hortonworks Data Flow - HDF)
>>> in a Kerberized environment. As part of the installation, Ambari has
>>> created a nifi service keytab.
>>>
>>> Can I use this nifi.service.keytab for configuring processors like
>>> PutHDFS that talk to Hadoop services?
>>>
>>> The nifi.service.keytab is machine specific and always expects principal
>>> names with machine information, e.g. nifi/HOSTNAME@REALM.
>>>
>>> If I configure my processor with nifi/NODE1_Hostname@REALM, then I see a
>>> Kerberos authentication exception on the other two nodes.
>>>
>>> How do I dynamically resolve the hostname to use the nifi service keytab?
>>>
>>> Thanks
>>> Shashi
>>>
>>
>>
>


Re: Keytab Configuration for Nifi processor

2017-06-09 Thread Shashi Vishwakarma
Hi

The above solution did not work. In the log I can see the Kerberos error
"Unable to obtain password". NiFi is not able to resolve the _HOST value.

Thanks
Shashi

On Thu, Jun 8, 2017 at 9:10 PM, Pierre Villard <pierre.villard...@gmail.com>
wrote:

> Hi,
>
> Using nifi/_HOST@REALM
> should resolve your problem.
>
> Hope this helps.
>
>
> 2017-06-08 22:00 GMT+02:00 Shashi Vishwakarma <shashi.vish...@gmail.com>:
>
>> Hi
>>
>> I have a 3-node NiFi cluster (installed via Hortonworks Data Flow - HDF)
>> in a Kerberized environment. As part of the installation, Ambari has created
>> a nifi service keytab.
>>
>> Can I use this nifi.service.keytab for configuring processors like
>> PutHDFS that talk to Hadoop services?
>>
>> The nifi.service.keytab is machine specific and always expects principal
>> names with machine information, e.g. nifi/HOSTNAME@REALM.
>>
>> If I configure my processor with nifi/NODE1_Hostname@REALM, then I see a
>> Kerberos authentication exception on the other two nodes.
>>
>> How do I dynamically resolve the hostname to use the nifi service keytab?
>>
>> Thanks
>> Shashi
>>
>
>


Keytab Configuration for Nifi processor

2017-06-08 Thread Shashi Vishwakarma
Hi

I have a 3-node NiFi cluster (installed via Hortonworks Data Flow - HDF) in a
Kerberized environment. As part of the installation, Ambari has created a nifi
service keytab.

Can I use this nifi.service.keytab for configuring processors like PutHDFS
that talk to Hadoop services?

The nifi.service.keytab is machine specific and always expects principal
names with machine information, e.g. nifi/HOSTNAME@REALM.

If I configure my processor with nifi/NODE1_Hostname@REALM, then I see a
Kerberos authentication exception on the other two nodes.

How do I dynamically resolve the hostname to use the nifi service keytab?

Thanks
Shashi
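One commonly suggested way to get a per-node principal (worth verifying on
your NiFi/HDF version, since expression language support on the Kerberos
properties varies) is to let NiFi's expression language fill in the local
hostname, for example:

    Kerberos Principal : nifi/${hostname(true)}@YOUR_REALM
    Kerberos Keytab    : /etc/security/keytabs/nifi.service.keytab

The keytab path shown is only an assumption based on typical Ambari-managed
installs; with this, each node resolves its own fully qualified hostname
instead of a hard-coded NODE1 value.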


Re: Spark Streaming with Nifi

2017-06-05 Thread Shashi Vishwakarma
Hi Andrew,

I am trying to understand this in a bit more detail. Essentially, I will have
to write some custom code in my Spark streaming job to construct a provenance
event and send it to some store like HBase or a pub/sub system to be consumed
by others.

Is that correct?

If yes, how do I execute the other processors which are present in the pipeline?

Ex

NiFi --> Kafka --> Spark Streaming --> Processor 1 --> Processor 2

Thanks
Shashi








On Mon, Jun 5, 2017 at 12:36 AM, Andrew Psaltis <psaltis.and...@gmail.com>
wrote:

> Hi Shashi,
> Thanks for the explanation.  I have a better understanding of what you are
> trying to accomplish. Although Spark streaming is micro-batch, you would
> not want to keep launching jobs for each batch.   Think of it as the Spark
> scheduler having a while loop in which it executes your job then sleeps for
> X amount of time based on the interval you configure.
>
> Perhaps a better way would be to do the following:
> 1. Use the S2S ProvenanceReportingTask to send provenance information from
> your NiFi instance to a second instance or cluster.
> 2. In the second NiFi instance/cluster ( the one receiving the provenance
> data) you write the data into say HBase or Solr or system X.
> 3. In your Spark streaming job you write into the same data store a
> "provenance" event -- obviously this will not have all the fields that a
> true NiFi provenance record does, but you can come close.
>
> With this you would then have all provenance data in an external
> system that you can query to understand the whole system.
>
> Thanks,
> Andrew
>
> P.S. sorry if this is choppy or not well formed, on mobile.
>
> On Sun, Jun 4, 2017 at 17:46 Shashi Vishwakarma <shashi.vish...@gmail.com>
> wrote:
>
>> Thanks Andrew.
>>
>> I agree that decoupling components is a good solution from a long-term
>> perspective. My current data pipeline in NiFi is designed for batch
>> processing, which I am trying to convert into a streaming model.
>>
>> One of the processors in the data pipeline invokes a Spark job; once the job
>> finishes, control is returned to the NiFi processor, which in turn generates
>> a provenance event for the job. This provenance event is important for us.
>>
>> Keeping the batch model architecture in mind, I want to design a Spark
>> streaming based model in which a NiFi Spark streaming processor will process
>> each micro-batch and the job status will be returned to NiFi with a
>> provenance event. Then I can capture that provenance data for my reports.
>>
>> Essentially, I will be using NiFi for capturing provenance events while the
>> actual processing will be done by the Spark streaming job.
>>
>> Does this approach seem logical?
>>
>> Thanks
>> Shashi
>>
>>
>> On Sun, Jun 4, 2017 at 3:10 PM, Andrew Psaltis <psaltis.and...@gmail.com>
>> wrote:
>>
>>> Hi Shashi,
>>> I'm sure there is a way to make this work. However, my first question is
>>> why you would want to? By design a Spark Streaming application should
>>> always be running and consuming data from some source, hence the notion of
>>> streaming. Tying Spark Streaming to NiFi would ultimately result in a more
>>> coupled and fragile architecture. Perhaps a different way to think about it
>>> would be to set things up like this:
>>>
>>> NiFi --> Kafka <-- Spark Streaming
>>>
>>> With this you can do what you are doing today -- using NiFi to ingest,
>>> transform, make routing decisions, and feed data into Kafka. In essence you
>>> would be using NiFi to do all the preparation of the data for Spark
>>> Streaming. Kafka would serve the purpose of a buffer between NiFi and Spark
>>> Streaming. Finally, Spark Streaming would ingest data from Kafka and do
>>> what it is designed for -- stream processing. Having a decoupled
>>> architecture like this also allows you to manage each tier separately, thus
>>> you can tune, scale, develop, and deploy all separately.
>>>
>>> I know I did not directly answer your question on how to make it work.
>>> But, hopefully this helps provide an approach that will be a better long
>>> term solution. There may be something I am missing in your initial
>>> questions.
>>>
>>> Thanks,
>>> Andrew
>>>
>>>
>>>
>>> On Sat, Jun 3, 2017 at 10:43 PM, Shashi Vishwakarma <
>>> shashi.vish...@gmail.com> wrote:
>>>
>>>> Hi
>>>>
>>>> I am looking for way where I can make use of spark streaming in Nifi. I
>>>> see couple of post where SiteToSite tc

Re: Spark Streaming with Nifi

2017-06-04 Thread Shashi Vishwakarma
Thanks Andrew.

I agree that decoupling components is a good solution from a long-term
perspective. My current data pipeline in NiFi is designed for batch
processing, which I am trying to convert into a streaming model.

One of the processors in the data pipeline invokes a Spark job; once the job
finishes, control is returned to the NiFi processor, which in turn generates a
provenance event for the job. This provenance event is important for us.

Keeping the batch model architecture in mind, I want to design a Spark
streaming based model in which a NiFi Spark streaming processor will process
each micro-batch and the job status will be returned to NiFi with a provenance
event. Then I can capture that provenance data for my reports.

Essentially, I will be using NiFi for capturing provenance events while the
actual processing will be done by the Spark streaming job.

Does this approach seem logical?

Thanks
Shashi


On Sun, Jun 4, 2017 at 3:10 PM, Andrew Psaltis <psaltis.and...@gmail.com>
wrote:

> Hi Shashi,
> I'm sure there is a way to make this work. However, my first question is
> why you would want to? By design a Spark Streaming application should
> always be running and consuming data from some source, hence the notion of
> streaming. Tying Spark Streaming to NiFi would ultimately result in a more
> coupled and fragile architecture. Perhaps a different way to think about it
> would be to set things up like this:
>
> NiFi --> Kafka <-- Spark Streaming
>
> With this you can do what you are doing today -- using NiFi to ingest,
> transform, make routing decisions, and feed data into Kafka. In essence you
> would be using NiFi to do all the preparation of the data for Spark
> Streaming. Kafka would serve the purpose of a buffer between NiFi and Spark
> Streaming. Finally, Spark Streaming would ingest data from Kafka and do
> what it is designed for -- stream processing. Having a decoupled
> architecture like this also allows you to manage each tier separately, thus
> you can tune, scale, develop, and deploy all separately.
>
> I know I did not directly answer your question on how to make it work.
> But, hopefully this helps provide an approach that will be a better long
> term solution. There may be something I am missing in your initial
> questions.
>
> Thanks,
> Andrew
>
>
>
> On Sat, Jun 3, 2017 at 10:43 PM, Shashi Vishwakarma <
> shashi.vish...@gmail.com> wrote:
>
>> Hi
>>
>> I am looking for a way to make use of Spark streaming with NiFi. I see a
>> couple of posts where a SiteToSite TCP connection is used for the Spark
>> streaming application, but I think it would be good if I could launch Spark
>> streaming from a NiFi custom processor.
>>
>> PublishKafka will publish messages into Kafka, and then a NiFi Spark
>> streaming processor will read from the Kafka topic.
>>
>> I can launch a Spark streaming application from a custom NiFi processor
>> using the Spark Streaming launcher API, but the biggest challenge is that it
>> will create a Spark streaming context for each flow file, which can be a
>> costly operation.
>>
>> Does anyone suggest storing the Spark streaming context in a controller
>> service? Or is there any better approach for running a Spark streaming
>> application with NiFi?
>>
>> Thanks and Regards,
>> Shashi
>>
>>
>>
>
>
> --
> Thanks,
> Andrew
>
> Subscribe to my book: Streaming Data <http://manning.com/psaltis>
> <https://www.linkedin.com/pub/andrew-psaltis/1/17b/306>
> twitter: @itmdata <http://twitter.com/intent/user?screen_name=itmdata>
>


Spark Streaming with Nifi

2017-06-03 Thread Shashi Vishwakarma
Hi

I am looking for a way to make use of Spark streaming with NiFi. I see a
couple of posts where a SiteToSite TCP connection is used for the Spark
streaming application, but I think it would be good if I could launch Spark
streaming from a NiFi custom processor.

PublishKafka will publish messages into Kafka, and then a NiFi Spark streaming
processor will read from the Kafka topic.

I can launch a Spark streaming application from a custom NiFi processor using
the Spark Streaming launcher API, but the biggest challenge is that it will
create a Spark streaming context for each flow file, which can be a costly
operation.

Does anyone suggest storing the Spark streaming context in a controller
service? Or is there any better approach for running a Spark streaming
application with NiFi?

Thanks and Regards,
Shashi
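A minimal Java sketch of the decoupled NiFi --> Kafka <-- Spark Streaming
pattern suggested in the replies above, using the Kafka 0.10 direct stream API.
The broker address, topic name, and group id are assumptions for illustration,
and the per-batch processing is only a placeholder:

import java.util.Arrays;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class NifiKafkaStreamingJob {

    public static void main(String[] args) throws InterruptedException {
        // One long-running streaming context; NiFi only feeds Kafka and never launches this job.
        SparkConf conf = new SparkConf().setAppName("nifi-kafka-streaming");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "broker1:9092");           // assumption
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "nifi-spark-demo");                 // assumption
        kafkaParams.put("auto.offset.reset", "latest");

        // Topic written to by the PublishKafka processor in NiFi (name is an assumption).
        Collection<String> topics = Arrays.asList("nifi-output");

        JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                        jssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));

        // Placeholder processing: count records per micro-batch.
        stream.foreachRDD(rdd -> System.out.println("records in batch: " + rdd.count()));

        jssc.start();
        jssc.awaitTermination();
    }
}

Submitted once with spark-submit, a job like this keeps consuming whatever the
NiFi flow publishes, so no streaming context has to be created per flow file.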


Unable to Start NiFi | Database may be already in use: “Locked by another process”

2017-02-19 Thread Shashi Vishwakarma
Hi

I am trying to start NiFi but am facing an org.h2.jdbc.JdbcSQLException:
"Database may be already in use" exception.

2017-02-20 16:09:04,189 INFO [main] /nifi-api No Spring
WebApplicationInitializer types detected on classpath
2017-02-20 16:09:04,218 INFO [main] /nifi-api Initializing Spring root
WebApplicationContext
2017-02-20 16:09:05,791 INFO [main]
o.a.nifi.properties.NiFiPropertiesLoader Determined default
nifi.properties path to be '/opt/nifi/current/./conf/nifi.properties'
2017-02-20 16:09:05,793 INFO [main]
o.a.nifi.properties.NiFiPropertiesLoader Determined default
nifi.properties path to be '/opt/nifi/current/./conf/nifi.properties'
2017-02-20 16:09:05,794 INFO [main]
o.a.nifi.properties.NiFiPropertiesLoader Loaded 115 properties from
/opt/nifi/current/./conf/nifi.properties
2017-02-20 16:09:07,878 ERROR [main] o.s.web.context.ContextLoader
Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'niFiWebApiSecurityConfiguration': Injection
of autowired dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not
autowire method: public void
org.apache.nifi.web.NiFiWebApiSecurityConfiguration.setJwtAuthenticationProvider(org.apache.nifi.web.security.jwt.JwtAuthenticationProvider);
nested exception is
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'jwtAuthenticationProvider' defined in class
path resource [nifi-web-security-context.xml]: Cannot resolve
reference to bean 'jwtService' while setting constructor argument;
nested exception is
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'jwtService' defined in class path resource
[nifi-web-security-context.xml]: Cannot resolve reference to bean
'keyService' while setting constructor argument; nested exception is
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'keyService' defined in class path resource
[nifi-administration-context.xml]: Cannot resolve reference to bean
'keyTransactionBuilder' while setting bean property
'transactionBuilder'; nested exception is
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'keyTransactionBuilder' defined in class path
resource [nifi-administration-context.xml]: Cannot resolve reference
to bean 'keyDataSource' while setting bean property 'dataSource';
nested exception is
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'keyDataSource': FactoryBean threw exception
on object creation; nested exception is org.h2.jdbc.JdbcSQLException:
Database may be already in use: "Locked by another process". Possible
solutions: close all other connection(s); use the server mode
[90020-176]
at 
org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:334)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1214)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:772)
~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:839)
~[spring-context-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:538)
~[spring-context-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at 
org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:446)
~[spring-web-4.2.4.RELEASE.jar:4.2.4.RELEASE]
 

Re: Nifi | Multiple nar dependency in pom

2016-09-10 Thread Shashi Vishwakarma
Thanks. The above information was very useful.

Thanks
Shashi

On Tue, Sep 6, 2016 at 6:16 PM, Matt Gilman <matt.c.gil...@gmail.com> wrote:

> That is correct. Currently, each NAR can only have a single NAR
> dependency. Typically we package the Controller Service APIs together or
> establish a chain. By establishing a chain this is building a transitive
> NAR dependency. Any Controller Service APIs bundled in ancestor NARs will
> be available.
>
> Note, I'm specifically calling out Controller Service APIs, as the
> implementations of the Controller Services do not need to be in this NAR
> dependency chain I'm describing. They can be bundled in separate adjacent
> NARs that share the same Controller Service API NAR dependency.
>
> Thanks
>
> Matt
>
> On Tue, Sep 6, 2016 at 6:27 AM, Shashi Vishwakarma <
> shashi.vish...@gmail.com> wrote:
>
>> Hi
>>
>> I am developing two custom processors, one having a dependency on
>> controller service 1 and another having a dependency on controller service 2.
>>
>> In the processor NAR pom, I tried to include both dependencies as below.
>>
>> <dependency>
>>     <groupId>com.abc.nifi</groupId>
>>     <artifactId>nifi-custom1-service-api-nar</artifactId>
>>     <version>0.3.0-SNAPSHOT</version>
>>     <type>nar</type>
>> </dependency>
>>
>> <dependency>
>>     <groupId>com.abc.nifi.services</groupId>
>>     <artifactId>nifi-custom2-services-nar</artifactId>
>>     <version>0.3.0-SNAPSHOT</version>
>>     <type>nar</type>
>> </dependency>
>>
>> After compiling, it gives the following error:
>>
>>  Failed to execute goal org.apache.nifi:nifi-nar-maven-plugin:1.1.0:nar
>> (default-nar) on project nifi-custom-nar: Error assembling NAR: Each NAR
>> represents a ClassLoader. A NAR dependency allows that NAR's ClassLoader to
>> be used as the parent of this NAR's ClassLoader. As a result, only a single
>> NAR dependency is allowed.
>>
>> Does that mean I cannot include two NAR dependencies? Is there any
>> way/workaround for this?
>>
>> Thanks
>> Shashi
>>
>
>
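One way to picture the chain Matt describes (the artifact names are
illustrative, reusing the ones from the pom above): each NAR keeps exactly one
parent NAR, and a processor NAR can use any Controller Service API bundled
anywhere up its ancestor chain.

    nifi-custom-processors-nar
        --> (single NAR dependency) nifi-custom1-service-api-nar
            --> (single NAR dependency) nifi-custom2-service-api-nar

The alternative he mentions is to bundle both Controller Service APIs into one
service-api NAR and have the processor NAR depend on that single NAR; the
service implementation NARs then sit beside it, sharing the same API NAR
dependency.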


Re: Controller Service Implementation Help

2016-09-05 Thread Shashi Vishwakarma
Thanks a lot. It worked. There were two dependencies; I was missing the NAR
dependency in the pom.

It was a great help.

On Tue, Sep 6, 2016 at 12:16 AM, Bryan Bende <bbe...@gmail.com> wrote:

> Shashi,
>
> What does the structure of your controller service and processor projects
> look like? And what are the dependencies between them?
>
> I feel like this might be related to something not being correct in the
> way the projects depend on each other.
>
> Take a look at this page [1] which shows the recommended project
> structures, and specifically the section about linking a processor and
> controller service. There is also this GitHub example project [2] that
> shows a custom processor using a custom controller service.
>
> Thanks,
>
> Bryan
>
> [1] https://cwiki.apache.org/confluence/display/NIFI/Maven+
> Projects+for+Extensions
> [2] https://github.com/bbende/nifi-dependency-example
>
>
> On Mon, Sep 5, 2016 at 3:03 AM, Shashi Vishwakarma <
> shashi.vish...@gmail.com> wrote:
>
>> Hi Joe,
>>
>> Thanks for your reply. I have set
>> .identifiesControllerService(MyService.class).
>> Any other option that I might have missed? Is there any way to debug it?
>>
>> Thanks
>> Shashi
>>
>> On Mon, Sep 5, 2016 at 8:12 AM, Joe Witt <joe.w...@gmail.com> wrote:
>>
>>> Shashi
>>>
>>> You should not have to enter the controller service identifier. In your
>>> processor's property descriptor, have you used 'identifiesControllerService'?
>>> Such as in this example:
>>>
>>>
>>> static final PropertyDescriptor SSL_CONTEXT_SERVICE = new
>>> PropertyDescriptor.Builder()
>>>
>>> .name("ssl.context.service")
>>>
>>> .displayName("SSL Context Service")
>>>
>>> .description("Specifies the SSL Context Service to use")
>>>
>>> .required(false)
>>>
>>> .identifiesControllerService(SSLContextService.class)
>>>
>>> .build();
>>>
>>>
>>> Thanks
>>>
>>> Joe
>>>
>>> On Sep 4, 2016 3:02 PM, "Shashi Vishwakarma" <shashi.vish...@gmail.com>
>>> wrote:
>>>
>>>> Hi
>>>>
>>>> I need some help in designing a controller service. I am developing
>>>> a controller service which returns an object of a class, and then that
>>>> object can be referenced by multiple processors. For example:
>>>>
>>>> Employee emp = new Employee();
>>>> emp.setName("Jack");
>>>>
>>>> return emp;
>>>>
>>>> The emp object will be returned from the controller service and will be
>>>> used by a custom processor.
>>>>
>>>> Let me know if my design is correct. I tried implementing this
>>>> approach but it does not seem to be working. While referencing the
>>>> controller service I am getting the below value in the processor:
>>>>
>>>>
>>>>
>>>>
>>>> My processor is showing some warning.
>>>>
>>>>
>>>> Any pointers on this?
>>>>
>>>> Highly appreciate your help..
>>>>
>>>> Thanks
>>>> Shashi
>>>>
>>>>
>>
>


Re: Controller Service Implementation Help

2016-09-05 Thread Shashi Vishwakarma
Hi Joe,

Thanks for your reply. I have
set .identifiesControllerService(MyService.class). Any other option that I
might have missed? Is there any way to debug it?

Thanks
Shashi

On Mon, Sep 5, 2016 at 8:12 AM, Joe Witt <joe.w...@gmail.com> wrote:

> Shashi
>
> You should not have to enter the controller service identifier. In your
> processor's property descriptor, have you used 'identifiesControllerService'?
> Such as in this example:
>
>
> static final PropertyDescriptor SSL_CONTEXT_SERVICE = new
> PropertyDescriptor.Builder()
>
> .name("ssl.context.service")
>
> .displayName("SSL Context Service")
>
> .description("Specifies the SSL Context Service to use")
>
> .required(false)
>
> .identifiesControllerService(SSLContextService.class)
>
> .build();
>
>
> Thanks
>
> Joe
>
> On Sep 4, 2016 3:02 PM, "Shashi Vishwakarma" <shashi.vish...@gmail.com>
> wrote:
>
>> Hi
>>
>> I need some help in designing a controller service. I am developing a
>> controller service which returns an object of a class, and then that object
>> can be referenced by multiple processors. For example:
>>
>> Employee emp = new Employee();
>> emp.setName("Jack");
>>
>> return emp;
>>
>> The emp object will be returned from the controller service and will be
>> used by a custom processor.
>>
>> Let me know if my design is correct. I tried implementing this approach
>> but it does not seem to be working. While referencing the controller service
>> I am getting the below value in the processor:
>>
>>
>>
>>
>> My processor is showing some warning.
>>
>>
>> Any pointers on this?
>>
>> Highly appreciate your help..
>>
>> Thanks
>> Shashi
>>
>>
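For reference, a minimal sketch of how a processor that declares such a
property then obtains the enabled service at runtime (the class name is made
up; the property descriptor repeats Joe's example):

import java.util.Collections;
import java.util.List;

import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.exception.ProcessException;
import org.apache.nifi.ssl.SSLContextService;

public class MySslAwareProcessor extends AbstractProcessor {

    static final PropertyDescriptor SSL_CONTEXT_SERVICE = new PropertyDescriptor.Builder()
            .name("ssl.context.service")
            .displayName("SSL Context Service")
            .description("Specifies the SSL Context Service to use")
            .required(false)
            .identifiesControllerService(SSLContextService.class)
            .build();

    @Override
    protected List<PropertyDescriptor> getSupportedPropertyDescriptors() {
        // Exposing the descriptor is what makes the service drop-down appear in the UI.
        return Collections.singletonList(SSL_CONTEXT_SERVICE);
    }

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        // The framework hands back the controller service instance selected for this property.
        SSLContextService ssl =
                context.getProperty(SSL_CONTEXT_SERVICE).asControllerService(SSLContextService.class);
        // ... use ssl here
    }
}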


Controller Service Implementation Help

2016-09-04 Thread Shashi Vishwakarma
Hi

I need some help in designing a controller service. I am developing a
controller service which returns an object of a class, and then that object
can be referenced by multiple processors. For example:

Employee emp = new Employee();
emp.setName("Jack");

return emp;

The emp object will be returned from the controller service and will be used
by a custom processor.

Let me know if my design is correct. I tried implementing this approach
but it does not seem to be working. While referencing the controller service
I am getting the below value in the processor:




My processor is showing some warning.


Any pointers on this?

Highly appreciate your help..

Thanks
Shashi
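A hedged sketch of one way to structure the Employee idea as a controller
service, following the API/implementation split from the Maven project layout
Bryan links to earlier in the thread (all type names other than Employee are
made up for illustration):

// EmployeeService.java -- the service API that processors compile against
// (service-api NAR). Employee is assumed to be the simple POJO from the
// example above (setName/getName).
import org.apache.nifi.controller.ControllerService;

public interface EmployeeService extends ControllerService {
    Employee getEmployee();
}

// StandardEmployeeService.java -- the implementation, packaged in a separate NAR.
import org.apache.nifi.annotation.lifecycle.OnEnabled;
import org.apache.nifi.controller.AbstractControllerService;
import org.apache.nifi.controller.ConfigurationContext;

public class StandardEmployeeService extends AbstractControllerService implements EmployeeService {

    private volatile Employee employee;

    @OnEnabled
    public void onEnabled(ConfigurationContext context) {
        // Build the shared object once, when the service is enabled.
        Employee emp = new Employee();
        emp.setName("Jack");
        this.employee = emp;
    }

    @Override
    public Employee getEmployee() {
        return employee;
    }
}

A processor would then declare a property with
.identifiesControllerService(EmployeeService.class) and call
context.getProperty(...).asControllerService(EmployeeService.class).getEmployee(),
as in the SSL example quoted earlier in the thread.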


Re: NiFi | Controller Service is not getting updated

2016-09-04 Thread Shashi Vishwakarma
Thanks. It worked. Do I need to remove the work directory every time I change
the controller service?

On Sun, Sep 4, 2016 at 7:03 PM, Matthew Clarke <matt.clarke@gmail.com>
wrote:

> When NiFi starts it unpacks the NARs into a work directory. Try deleting
> that NiFi work directory before restarting to see if your changes are picked up.
>
> On Sep 4, 2016 9:06 AM, "Shashi Vishwakarma" <shashi.vish...@gmail.com>
> wrote:
>
>> Hi
>>
>> I created a sample controller service - 'MyControllerService' - packaged it
>> into a NAR and pasted it into the NiFi lib directory. I restarted the NiFi
>> service to see the changes, and I was able to see MyControllerService in the
>> Controller Settings. After that I made a small label change to the
>> controller service and followed the same process, but the changes are not
>> taking effect.
>>
>> I even removed the NAR files from nifi/lib just to check whether it gets
>> removed from the list. That is not happening either.
>>
>> I also don't see any exception in the NiFi logs.
>>
>> Any pointers on this issue?
>>
>> Thanks
>> Shashi
>>
>


NiFi | Controller Service is not getting updated

2016-09-04 Thread Shashi Vishwakarma
Hi

I created a sample controller service - 'MyControllerService' - packaged it
into a NAR and pasted it into the NiFi lib directory. I restarted the NiFi
service to see the changes, and I was able to see MyControllerService in the
Controller Settings. After that I made a small label change to the controller
service and followed the same process, but the changes are not taking effect.

I even removed the NAR files from nifi/lib just to check whether it gets
removed from the list. That is not happening either.

I also don't see any exception in the NiFi logs.

Any pointers on this issue?

Thanks
Shashi


run.as option does not work with a user other than the nifi user

2016-06-03 Thread Shashi Vishwakarma
Hi

I want to run my NiFi application as ec2-user rather than the default nifi
user. I changed run.as=ec2-user in bootstrap.conf, but it did not work. It is
not allowing me to start the NiFi application; I get the following error while
starting the NiFi service.



./nifi.sh start
nifi.sh: JAVA_HOME not set; results may vary

Java home:
NiFi home: /opt/nifi/current

Bootstrap Config File: /opt/nifi/current/conf/bootstrap.conf

Error: Could not find or load main class org.apache.nifi.bootstrap.RunNiFi

Any pointers on this issue?

Thanks

Shashi