ListenFTP connection issue

2022-09-17 Thread Mike Harding
Hi all,

I have a ListenFTP processor running in a NiFi Docker container that runs
within a bridge network (so it doesn't bind to the host IP). This causes an
issue (unreachable host) for FTP clients connecting in passive mode, because
the FTP server advertises the internal bridge-network IP to the client rather
than the host IP address.

I need the NiFi container to stay on the bridge network, but I don't see any
way to tell the ListenFTP processor to advertise the host IP address to the
client. Changing the ListenFTP bind address to the host IP/hostname throws an
error.

Anyone experienced this issue or could suggest a workaround?

Thanks,
Mike

p.s. - an active FTP connection from the client is also problematic where the
client doesn't have a static IP or sits behind a NAT.
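
A quick way to see which address the server is advertising is to ask for the
PASV reply from a client. This is only a minimal diagnostic sketch; the host,
port and credentials below are assumptions, not values from this setup:

from ftplib import FTP

# Connect to the ListenFTP endpoint exposed by the container
# (host, port and credentials are placeholders).
ftp = FTP()
ftp.connect('nifi-host.example.com', 2121)
ftp.login('user', 'password')

# The PASV reply contains the address the server tells the client to use for
# the data connection; on a bridge network this comes back as the internal
# container IP rather than the host IP.
print(ftp.sendcmd('PASV'))
ftp.quit()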


Re: Is it possible to reference python requests module in ExecuteScript?

2017-05-06 Thread Mike Harding
...just to follow up @Andre - your solution worked for me - thank you.

Mike

On 6 May 2017 at 11:22, Mike Harding <mikeyhard...@gmail.com> wrote:

> Thanks for all the suggestions. Regards using Groovy I did try and use it
> to solve my problem but just couldn't get it to work correctly. I tried to
> implement something similar to the following solution given here (
> http://stackoverflow.com/questions/24827855/groovy-
> httpbuilder-issues-with-cookies) to address a cookie authentication issue
> but the cookie didn't seem to attach/work with follow-on requests and still
> gave me 401 but I managed to get something working in python hence why I
> was trying to use that.
>
> I'm wondering whether I should just use ExecuteProcess/ExecuteStreamCommand
> as a quick fix?
>
> Mike
>
> On 6 May 2017 at 09:29, Giovanni Lanzani <giovannilanz...@godatadriven.com
> > wrote:
>
>> Please do not remove the Python scripting facilities.
>>
>> I believe most people experience it as very slow because (ok, Python is
>> slow) they only get a flow file at a time.
>>
>> I think Matt pointed out in this ML once that you can use
>>
>> flowfiles = session.get(1000)
>> for flowfile in filter(None, flowfiles):
>>     # do things
>>
>> In that case Jython will be kept alive much longer. Or am I missing
>> something?
>>
>> Giovanni
>>
>> > -Original Message-
>> > From: Joe Witt [mailto:joe.w...@gmail.com]
>> > Sent: Saturday, May 6, 2017 1:12 AM
>> > To: users@nifi.apache.org
>> > Subject: Re: Is it possible to reference python requests module in
>> > ExecuteScript?
>> >
>> > It is worth discussing whether there is sufficient value to warrant
>> keeping
>> > jython/python support in the processors or whether we should pull it.
>> It is
>> > certainly something we can document as being highly limited but we don't
>> > really know how limited.  Frankly given the performance I've seen with
>> it I'd be
>> > ok removing it entirely.  One is better off calling the script via a
>> system call.
>> > Groovy is one that I've seen perform very well and be fully featured.
>> >
>> > On Fri, May 5, 2017 at 6:38 PM, Russell Bateman <r...@windofkeltia.com>
>> > wrote:
>> > > We really want to use ExecuteScript because our end users are
>> Pythonistas.
>> > > They tend to punctuate their flows with the equivalent of PutFile and
>> > > GetFile with Python scripts doing stuff on flowfiles that pass out of
>> > > NiFi before returning into NiFi.
>> > >
>> > > However, we find it nearly impossible to replace even the tamest of
>> > > undertakings. If there were a good set of NiFi/Python shims that, from
>> > > PyCharm, etc., gave us the ability to prototype, test and debug before
>> > > copying and pasting into ExecuteScript, that would be wonderful. It
>> > > hasn't worked out that way. Most of our experience is copying, pasting
>> > > into the processor property, only to find something wrong, sometimes
>> > > syntax, sometimes something runtime.
>> > >
>> > > On their behalf, I played with this processor a few hours a while
>> back.
>> > > Another colleague too. Googling this underused tool hasn't been
>> > > helpful, so the overall experience is negative so far. I can get most
>> > > of the examples out there to work, but as soon as I try to do "real"
>> > > work from my point of view, my plans sort of cave in.
>> > >
>> > > Likely the Groovy and/or Ruby options are better? But, we're not
>> > > Groovy or Ruby guys here. I understand the problems with this tool and
>> > > so I understand what the obstacles are to it growing stronger. The
>> > > problems won't yield to a few hours one Saturday afternoon. Better
>> > > problem-logging underneath and
>> > > better- and more lenient Python support on top. The second one is
>> > > tough, though.
>> > >
>> > > My approach is to minimize those black holes these guys put into their
>> > > flows by creating custom processors for what I can't solve using
>> > > standard processors.
>> > >
>> > > Trying not to be too negative here...
>> > >
>> > >
>> > > On 05/05/2017 04:09 PM, Andre wrote:
>> > >
>> > > Mike,
>> > >
>> > > I believe it is possible to use requests under jython, however the
>> > > process isn't very intuitive.
>> > >
>> > > I know one folk that if I recall correctly has used it. Happy to try
>> > > to find out how it is done.
>> > >
>> > > Cheers
>> > >
>> > > On Sat, May 6, 2017 at 4:57 AM, Mike Harding <mikeyhard...@gmail.com>
>> > wrote:
>> > >>
>> > >> Hi All, I'm now looking at using ExecuteScript and python engine to
>> > >> execute HTTP requests using the requests module. I've tried
>> > >> referencing requests the module but when I try to import requests I
>> > >> get a module reference error.
>> > >> I downloaded the module from here >
>> > >> https://pypi.python.org/pypi/requests
>> > >> Not sure why it isnt picking it up. Ive tried referencing the
>> > >> directory and the .py directly with no success.
>> > >> Any ideas where im going wrong?
>> > >> Cheers,
>> > >> Mike
>>
>
>
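
For reference, a fleshed-out version of the batched pattern Giovanni quotes
above (session.get with a batch size), as a minimal Jython sketch for
ExecuteScript; the batch size and the attribute it writes are illustrative
assumptions:

flowfiles = session.get(1000)  # pull up to 1000 flow files per invocation
for flowfile in filter(None, flowfiles):
    # per-flow-file work goes here; tagging an attribute is just a stand-in
    flowfile = session.putAttribute(flowfile, 'batch.processed', 'true')
    session.transfer(flowfile, REL_SUCCESS)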


Re: Is it possible to reference python requests module in ExecuteScript?

2017-05-06 Thread Mike Harding
Thanks for all the suggestions. Regarding Groovy, I did try to use it to
solve my problem but just couldn't get it to work correctly. I tried to
implement something similar to the solution given here (
http://stackoverflow.com/questions/24827855/groovy-httpbuilder-issues-with-cookies)
to address a cookie authentication issue, but the cookie didn't seem to
attach/work with follow-on requests and still gave me a 401. I managed to get
something working in Python, hence why I was trying to use that.

I'm wondering whether I should just use ExecuteProcess/ExecuteStreamCommand
as a quick fix?

Mike
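
If the ExecuteStreamCommand route is taken, the script it calls runs under
ordinary CPython rather than Jython, so requests behaves as usual. A minimal
sketch (the URL is a placeholder assumption): ExecuteStreamCommand pipes the
flow file content to stdin, and whatever the script writes to stdout becomes
the outgoing flow file content.

#!/usr/bin/env python
# post_content.py - called from ExecuteStreamCommand
import sys
import requests

payload = sys.stdin.read()
resp = requests.post('https://example.com/api', data=payload, timeout=30)
resp.raise_for_status()
sys.stdout.write(resp.text)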

On 6 May 2017 at 09:29, Giovanni Lanzani <giovannilanz...@godatadriven.com>
wrote:

> Please do not remove the Python scripting facilities.
>
> I believe most people experience it as very slow because (ok, Python is
> slow) they only get a flow file at a time.
>
> I think Matt pointed out in this ML once that you can use
>
> flowfiles = session.get(1000)
> for flowfile in filter(None, flowfiles):
>     # do things
>
> In that case Jython will be kept alive much longer. Or am I missing
> something?
>
> Giovanni
>
> > -Original Message-
> > From: Joe Witt [mailto:joe.w...@gmail.com]
> > Sent: Saturday, May 6, 2017 1:12 AM
> > To: users@nifi.apache.org
> > Subject: Re: Is it possible to reference python requests module in
> > ExecuteScript?
> >
> > It is worth discussing whether there is sufficient value to warrant
> keeping
> > jython/python support in the processors or whether we should pull it.
> It is
> > certainly something we can document as being highly limited but we don't
> > really know how limited.  Frankly given the performance I've seen with
> it I'd be
> > ok removing it entirely.  One is better off calling the script via a
> system call.
> > Groovy is one that I've seen perform very well and be fully featured.
> >
> > On Fri, May 5, 2017 at 6:38 PM, Russell Bateman <r...@windofkeltia.com>
> > wrote:
> > > We really want to use ExecuteScript because our end users are
> Pythonistas.
> > > They tend to punctuate their flows with the equivalent of PutFile and
> > > GetFile with Python scripts doing stuff on flowfiles that pass out of
> > > NiFi before returning into NiFi.
> > >
> > > However, we find it nearly impossible to replace even the tamest of
> > > undertakings. If there were a good set of NiFi/Python shims that, from
> > > PyCharm, etc., gave us the ability to prototype, test and debug before
> > > copying and pasting into ExecuteScript, that would be wonderful. It
> > > hasn't worked out that way. Most of our experience is copying, pasting
> > > into the processor property, only to find something wrong, sometimes
> > > syntax, sometimes something runtime.
> > >
> > > On their behalf, I played with this processor a few hours a while back.
> > > Another colleague too. Googling this underused tool hasn't been
> > > helpful, so the overall experience is negative so far. I can get most
> > > of the examples out there to work, but as soon as I try to do "real"
> > > work from my point of view, my plans sort of cave in.
> > >
> > > Likely the Groovy and/or Ruby options are better? But, we're not
> > > Groovy or Ruby guys here. I understand the problems with this tool and
> > > so I understand what the obstacles are to it growing stronger. The
> > > problems won't yield to a few hours one Saturday afternoon. Better
> > > problem-logging underneath and
> > > better- and more lenient Python support on top. The second one is
> > > tough, though.
> > >
> > > My approach is to minimize those black holes these guys put into their
> > > flows by creating custom processors for what I can't solve using
> > > standard processors.
> > >
> > > Trying not to be too negative here...
> > >
> > >
> > > On 05/05/2017 04:09 PM, Andre wrote:
> > >
> > > Mike,
> > >
> > > I believe it is possible to use requests under jython, however the
> > > process isn't very intuitive.
> > >
> > > I know one folk that if I recall correctly has used it. Happy to try
> > > to find out how it is done.
> > >
> > > Cheers
> > >
> > > On Sat, May 6, 2017 at 4:57 AM, Mike Harding <mikeyhard...@gmail.com>
> > wrote:
> > >>
> > >> Hi All, I'm now looking at using ExecuteScript and python engine to
> > >> execute HTTP requests using the requests module. I've tried
> > >> referencing requests the module but when I try to import requests I
> > >> get a module reference error.
> > >> I downloaded the module from here >
> > >> https://pypi.python.org/pypi/requests
> > >> Not sure why it isnt picking it up. Ive tried referencing the
> > >> directory and the .py directly with no success.
> > >> Any ideas where im going wrong?
> > >> Cheers,
> > >> Mike
>


Is it possible to reference python requests module in ExecuteScript?

2017-05-05 Thread Mike Harding
Hi All,

I'm now looking at using ExecuteScript with the Python engine to execute HTTP
requests using the requests module.

I've tried referencing the requests module, but when I try to import requests
I get a module reference error.

I downloaded the module from here > https://pypi.python.org/pypi/requests

Not sure why it isn't picking it up. I've tried referencing the directory and
the .py directly with no success.

Any ideas where I'm going wrong?

Cheers,
Mike


Re: Issue with Groovy script

2017-05-05 Thread Mike Harding
Thanks both - I assumed that including the root directory would pick up not
only the http-builder.jar but also its dependencies. Including the
dependencies directory fixed the issue.

Much appreciated,
Mike

On 4 May 2017 at 20:09, Matt Burgess <mattyb...@apache.org> wrote:

> Mike,
>
> To follow up on Andy's question, you will likely need more than just
> the http-builder JAR, I don't believe it is shaded (aka "fat JAR"). I
> have the "http-builder-0.7-all.zip" unzipped to a folder, and it has
> the http-builder-0.7.jar at the root level, but then a "dependencies"
> folder as well. If you have something similar, you will want to add
> the JAR and the dependencies folder to the Module Directory property.
>
> Regards,
> Matt
>
> On Thu, May 4, 2017 at 3:04 PM, Andy LoPresto <alopre...@apache.org>
> wrote:
> > Mike,
> >
> > When you say you’ve “included the http-builder jar as a dependency” do
> you
> > mean you provided the location of the directory containing that JAR as
> the
> > Module Path in the ExecuteScript processor?
> >
> > Andy LoPresto
> > alopre...@apache.org
> > alopresto.apa...@gmail.com
> > PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69
> >
> > On May 4, 2017, at 1:58 PM, Mike Harding <mikeyhard...@gmail.com> wrote:
> >
> > Hi all,
> >
> > I'm trying to run a simple groovy script in ExecuteScript processor to
> make
> > a HTTP GET request (I understand their are processors get this but I'm
> just
> > exploring Groovy at the minute).
> >
> >> import groovyx.net.http.HTTPBuilder
> >> flowFile = session.get()
> >> def http = new HTTPBuilder('https://google.com')
> >> def html = http.get(path : '/search', query : [q:'waffles'])
> >> log.warn(html)
> >> session.transfer(flowFile, REL_SUCCESS)
> >
> >
> > Ive included the http-builder jar as a dependency but I'm getting the
> error:
> >
> > 
> >
> > I'm not new to NiFi but new to using Groovy. I've tried import
> > org.apache.http.* but that doesn't help. I'm assuming that the missing
> class
> > library is a default library in Groovy?
> >
> > Any help much appreciated,
> > Mike
> >
> >
>


Issue with Groovy script

2017-05-04 Thread Mike Harding
Hi all,

I'm trying to run a simple Groovy script in an ExecuteScript processor to make
an HTTP GET request (I understand there are processors that do this, but I'm
just exploring Groovy at the minute).

> import groovyx.net.http.HTTPBuilder
> flowFile = session.get()
> def http = new HTTPBuilder('https://google.com')
> def html = http.get(path : '/search', query : [q:'waffles'])
> log.warn(html)
> session.transfer(flowFile, REL_SUCCESS)


I've included the http-builder JAR as a dependency but I'm getting the error:

[image: Inline images 1]

I'm not new to NiFi, but I am new to Groovy. I've tried import
org.apache.http.* but that doesn't help. I'm assuming the missing class comes
from a library that ships with Groovy by default?

Any help much appreciated,
Mike


Handling HTTP Cookies in NiFi

2017-05-04 Thread Mike Harding
Hi All,

There is an external web service that I need to authenticate with through
NiFi. The service provides a POST endpoint (only) to pass my credentials to
authenticate, which then responds with a cookie carrying an auth token to be
used in subsequent requests.

I can perform a POST using InvokeHTTP which returns successfully with a
'Set-Cookie' attribute in the flowfile but I don't understand how I can set
this cookie somewhere in NiFi to then be used in subsequent requests.

For example after I have authenticated I want to perform a GET request to
access some data.

Any help much appreciated,

Mike
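
No resolution appears in this thread, but one possible approach, sketched in
Jython for ExecuteScript, is to copy the returned cookie into an attribute
that a later request can reuse. The 'Set-Cookie' attribute name follows the
description above, and 'auth.cookie' is an invented name:

flowFile = session.get()
if flowFile is not None:
    setCookie = flowFile.getAttribute('Set-Cookie')
    if setCookie:
        # keep only the name=value pair, dropping Path/Expires/etc.
        cookiePair = setCookie.split(';')[0]
        flowFile = session.putAttribute(flowFile, 'auth.cookie', cookiePair)
    session.transfer(flowFile, REL_SUCCESS)

A downstream InvokeHTTP could then send the token back by adding a dynamic
property named Cookie with the value ${auth.cookie}, since dynamic properties
on InvokeHTTP are sent as request headers.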


Re: ExecuteSQL with LIMIT

2017-01-27 Thread Mike Harding
Please ignore this - I had the processor on the wrong scheduler, which was
causing the behaviour I was seeing.

Mike

On 27 January 2017 at 13:00, Mike Harding <mikeyhard...@gmail.com> wrote:

> Hi Mike,
>
> I want to run a select command with ExecuteSQL but it doesn't seem to work
> and the query pulls all of the records. Is it supported or will another
> processor type allow me to LIMIT the result set returned?
>
> Thanks,
> Mike
>


ExecuteSQL with LIMIT

2017-01-27 Thread Mike Harding
Hi Mike,

I want to run a SELECT with a LIMIT clause in ExecuteSQL, but it doesn't seem
to work and the query pulls all of the records. Is LIMIT supported, or will
another processor type allow me to limit the result set returned?

Thanks,
Mike


Re: SplitJson:GC Overhead Limit Exceeded

2016-11-17 Thread Mike Harding
...just for info, in bootstrap.conf my heap size is as follows:

java.arg.2=-Xms512m

java.arg.3=-Xmx512m

Would it be a simple case of increasing this? The size of the flowfile JSON
array is 35 MB.

Mike
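
If raising the heap is the route taken, the change in bootstrap.conf would
look something like the following; the 1 GB figures are an illustrative
assumption, not a recommendation from this thread:

# bootstrap.conf
java.arg.2=-Xms1024m
java.arg.3=-Xmx1024m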



On 17 November 2016 at 15:47, Mike Harding <mikeyhard...@gmail.com> wrote:

> Hi All,
>
> I have a flowfile containing a JSON array with 30k objects that I am
> trying to split into separate flowfiles for down stream processing.
>
> The problem is the processor reports a GC Overhead Limit Exceeded warning
> and administratively yields.
>
> Is there anyway of setting up a back pressure option or some changes to
> the nifi config to best address this.
>
> Thanks,
> Mike
>


SplitJson:GC Overhead Limit Exceeded

2016-11-17 Thread Mike Harding
Hi All,

I have a flowfile containing a JSON array with 30k objects that I am trying
to split into separate flowfiles for downstream processing.

The problem is that the processor reports a GC Overhead Limit Exceeded warning
and administratively yields.

Is there any way of setting up a back-pressure option, or some changes to the
NiFi config, to best address this?

Thanks,
Mike


Best practice for querying table mid-flow

2016-09-30 Thread Mike Harding
Hi All,

I have a NiFi data flow that receives flowfiles, each containing a JSON
object. As part of the transformation of each flowfile I want to query a Hive
table, using a property in the flowfile's JSON content, to retrieve additional
information that I then want to inject into the flowfile. The updated flowfile
is then passed on to the next processor downstream.

Currently the only way I can think of to do this is to:

1 - Put the Flowfile's JSON object into attributes using EvaluateJsonPath
processor.

2 - Pass the Flowfile to a SelectHiveQL processor which runs the query
(using the property from the attribute) and returns the result.

3 - I then pass this to an ExecuteScript processor, where I extract the query
result from the flowfile content and write the original JSON object (stored in
the attribute) back out as the new flowfile content, using the query result to
update properties in the JSON object.

Does this make sense? It feels like there must be a simpler way.

Mike
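
A minimal Jython sketch of what step 3 above could look like in
ExecuteScript. The attribute name, the JSON field being injected, and the
assumption that SelectHiveQL was configured to return a single CSV value are
all illustrative:

import json
from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import StreamCallback

class MergeCallback(StreamCallback):
    def __init__(self, flowFile):
        self.flowFile = flowFile
    def process(self, inputStream, outputStream):
        # content at this point is the SelectHiveQL result (assumed: one CSV value)
        queryResult = IOUtils.toString(inputStream, StandardCharsets.UTF_8).strip()
        # the original JSON object is assumed to have been stashed in an attribute
        original = json.loads(self.flowFile.getAttribute('original.json'))
        original['readableName'] = queryResult
        outputStream.write(bytearray(json.dumps(original).encode('utf-8')))

flowFile = session.get()
if flowFile is not None:
    flowFile = session.write(flowFile, MergeCallback(flowFile))
    session.transfer(flowFile, REL_SUCCESS)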


Re: Drop FlowFile in ExecuteScript

2016-08-31 Thread Mike Harding
Cheers Matt - is there any API documentation for the Session object online?

On 31 August 2016 at 16:03, Matt Burgess <mattyb...@gmail.com> wrote:

> Mike,
>
> You can explicitly drop the flow file using session.remove(flowFile).
> I believe for auto-terminating connections that is what is happening
> under the hood.
>
> Regards,
> Matt
>
> On Wed, Aug 31, 2016 at 11:01 AM, Mike Harding <mikeyhard...@gmail.com>
> wrote:
> > Hi all,
> >
> > I have an ExecuteScript processor that creates new flow files to pass on
> to
> > downstream processors from an incoming flowfile.
> >
> > Once I have generated and transferred the newly created flowfiles to a
> > "SUCCESS" relationship I then transfer the original flow file to an
> > auto-terminating failure relationship and that kind of works. But I'm
> just
> > wondering in reality is this flowfile actually being purged from nifi
> (after
> > some default expiration?) or is there some way I can explicitly drop the
> > flowfile in my ExecuteScript processor (javascript) code? Whats the
> default
> > behaviour here?
> >
> > Cheers,
> > Mike
>


Drop FlowFile in ExecuteScript

2016-08-31 Thread Mike Harding
Hi all,

I have an ExecuteScript processor that creates new flow files to pass on to
downstream processors from an incoming flowfile.

Once I have generated and transferred the newly created flowfiles to a
"SUCCESS" relationship, I then transfer the original flow file to an
auto-terminating failure relationship, and that kind of works. But I'm just
wondering: in reality, is this flowfile actually being purged from NiFi (after
some default expiration?), or is there some way I can explicitly drop the
flowfile in my ExecuteScript processor (JavaScript) code? What's the default
behaviour here?

Cheers,
Mike
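
Matt's answer in the reply above is session.remove(flowFile). A minimal
sketch of the create-and-drop pattern described here, written in Jython since
the session API is the same across the script engines (the original was
JavaScript); the attribute written to the child flow file is an illustrative
assumption:

original = session.get()
if original is not None:
    child = session.create(original)  # new flow file derived from the incoming one
    child = session.putAttribute(child, 'generated', 'true')
    session.transfer(child, REL_SUCCESS)
    session.remove(original)  # explicitly drop the original instead of routing it away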


Re: NiFi global variables / persisting state outside of a pipeline

2016-08-25 Thread Mike Harding
Thanks Bryan - I was unaware of the MapCacheServer functionality - I've now
implemented the approach suggested and it works perfectly.

Mike

On 25 August 2016 at 15:05, Joe Witt <joe.w...@gmail.com> wrote:

> also this is a great use case which has been done quite a bit in the
> past using exactly the sort of logic Bryan calls out.  We've also done
> things like written custom controller services specific to the type of
> data and data structures needed for the job.  But the
> plumbing/infrastructure for it is well supported to avoid the RPC
> calls you mention, ensure the cache gets frequently updated live, and
> that the cache can be used by numerous components at once.
>
> Thanks
> Joe
>
> On Thu, Aug 25, 2016 at 9:57 AM, Bryan Bende <bbe...@gmail.com> wrote:
> > Hi Mike,
> >
> > I think one approach might the following...
> >
> > Setup controller services for DistributedMapCacheServer and
> > DistributedMapCacheClient, then have part of your flow that is triggered
> > periodically and queries your Hive table, probably need to split/parse
> the
> > results, and then use PutDistributedMapCache processor to store them in
> the
> > cache.
> >
> > In the other part of your flow use FetchDistributedMapCache to do a look
> up
> > against the cache.
> >
> > I haven't worked through all of the exact steps, but I think something
> like
> > that should work.
> >
> > Thanks,
> >
> > Bryan
> >
> > On Thu, Aug 25, 2016 at 6:38 AM, Mike Harding <mikeyhard...@gmail.com>
> > wrote:
> >>
> >> Hi All,
> >>
> >> I have a mapping table stored in hive that maps an ID to a readable name
> >> string. When a JSON object enters my nifi pipeline as a flowfile I want
> to
> >> be able to inject the readable name string into the JSON object. The
> problem
> >> is currently as each flowfile enters the pipe I have to make a
> SelectHiveQL
> >> call tofirst get the lookup table data and store as attributes.
> >>
> >> Is there a way I can load the lookup table data once or on a periodic
> >> basis into nifi (as a global variable/attribute) to save having to make
> the
> >> select call for each flowfile which translates to 1000's of calls a
> minute?
> >>
> >> Thanks,
> >> Mike
> >
> >
>


NiFi global variables / persisting state outside of a pipeline

2016-08-25 Thread Mike Harding
Hi All,

I have a mapping table stored in Hive that maps an ID to a readable name
string. When a JSON object enters my NiFi pipeline as a flowfile, I want to be
able to inject the readable name string into the JSON object. The problem is
that currently, as each flowfile enters the pipe, I have to make a
SelectHiveQL call to first get the lookup table data and store it as
attributes.

Is there a way I can load the lookup table data once, or on a periodic basis,
into NiFi (as a global variable/attribute) to save having to make the select
call for each flowfile, which translates to thousands of calls a minute?

Thanks,
Mike


Re: nifi process running at 114% of node CPU !!?

2016-07-20 Thread Mike Harding
Hi Mike,

Thanks for creating the JIRA - I'm unable to create a JIRA as I'm away on
holiday at the minute.

Mike
On Fri, 15 Jul 2016 at 14:30, Michael Moser <moser...@gmail.com> wrote:

> I went ahead and created NIFI-2268 for this, since it was fresh in my
> mind.  ListenHTTP calls ProcessContext.yield() whenever it doesn't have
> work to do, so HandleHttpRequest could do the same.
>
> -- Mike
>
>
> On Thu, Jul 14, 2016 at 7:00 PM, Joe Witt <joe.w...@gmail.com> wrote:
>
>> Mike,
>>
>> If you don't mind could you file a JIRA for this.  Frankly it sounds
>> like a bug to me.  We should consider making a default scheduling
>> period of something a bit slower.  Frankly just dialing back to 100 ms
>> would be sufficient most likely.  If you agree this is a bug please
>> file one here: https://issues.apache.org/jira/browse/NIFI
>>
>> If you could attach a template of the flow that behaves badly and the
>> one that behaves better that would be ideal but if not just a good
>> description should do.
>>
>> Thanks
>> Joe
>>
>> On Thu, Jul 14, 2016 at 6:57 PM, Mike Harding <mikeyhard...@gmail.com>
>> wrote:
>> > Thanks all - I checked the logs and there is nothing I can see thats
>> seems
>> > erroneous. I increased the number of threads for the processor and
>> added the
>> > 10 second scheduling and it has dropped dramatically from 2.5M tasks to
>> 300
>> > over 5 minute period. CPU for the nifi java process is now running at
>> 8-10%
>> > CPU.
>> >
>> > I don't think I saw this issue when using the ListenHTTP processor, from which
>> > I recently switched to HandleHttpRequest.
>> >
>> > Cheers,
>> > Mike
>> >
>> > On 14 July 2016 at 16:41, Aldrin Piri <aldrinp...@gmail.com> wrote:
>> >>
>> >> Mike,
>> >>
>> >> To add some context, while NiFi will intelligently schedule processors
>> to
>> >> execute, given HandleHTTPRequest's function as a listener, it is
>> constantly
>> >> scheduled to run, checking for a request to handle.  I assume by
>> number of
>> >> tasks, you mean the rolling count over the last 5 minutes.  As
>> mentioned by
>> >> Andy, you can tamper this rate by increasing the run scheduld if the
>> >> handling of the HTTP requests with a slight latency is acceptable to
>> you and
>> >> your needs.
>> >>
>> >>
>> >> On Thu, Jul 14, 2016 at 11:05 AM Andy LoPresto <alopre...@apache.org>
>> >> wrote:
>> >>>
>> >>> Mike,
>> >>>
>> >>> You can adjust the processor properties for the HandleHTTPRequest
>> >>> processor in the scheduling tab.
>> >>>
>> >>> “Concurrent tasks” limits the number of threads this processor will
>> use
>> >>> (default is 1)
>> >>> “Run schedule” determines the frequency that this processor will be
>> run
>> >>> (default is ‘0 sec’ which means continuously)
>> >>>
>> >>> If you are only getting requests on a much slower schedule, you could
>> >>> reduce the run schedule to ~10 seconds and see if this is better for
>> you. I
>> >>> have not encountered NiFi running at such high CPU percentage with
>> that
>> >>> little data.
>> >>>
>> >>> As for the high number of tasks, that is definitely an anomaly.
>> >>> Configuration best practices [1] currently recommend increasing the
>> limit to
>> >>> the 10k range, but 2.5M for a single processor is unusual. Can you
>> inspect
>> >>> the logs (located in $NIFI_HOME/logs) to see if there are errors or
>> more
>> >>> insight there?
>> >>>
>> >>> [1]
>> >>>
>> https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#configuration-best-practices
>> >>>
>> >>>
>> >>>
>> >>> Andy LoPresto
>> >>> alopre...@apache.org
>> >>> alopresto.apa...@gmail.com
>> >>> PGP Fingerprint: 70EC B3E5 98A6 5A3F D3C4  BACE 3C6E F65B 2F7D EF69
>> >>>
>> >>> On Jul 14, 2016, at 10:36 AM, Mike Harding <mikeyhard...@gmail.com>
>> >>> wrote:
>> >>>
>> >>> ps - also noticed it seems to generate a lot of tasks, currently 2.5M
>> >>> compared to other processes in the pipeline wh

Re: SelectHiveQL HiveConnectionPool issues

2016-05-09 Thread Mike Harding
Query is "select * from  limit 1"

The table schema has a map<string, string> column type, which is the cause;
the rest are strings.

Cheers,
Mike



On Mon, 9 May 2016 at 17:56, Matt Burgess <mattyb...@gmail.com> wrote:

> Mike,
>
> It shouldn't matter what the underlying format is, as the Hive driver
> should take care of the type coercion. Your error refers to a column that
> is of type JAVA_OBJECT, which in Hive usually happens when you have an
> "interval" type (Added in Hive 1.2.0 [1] but apparently not yet
> documented). Does your select query do things like date arithmetic? If so,
> the SelectHiveQL processor does not currently support interval types, but I
> can take a look. If not, then perhaps one or more of your columns needs
> explicit type coercion in the SELECT query, such that it is recognized as a
> more "conventional" SQL type.
>
> Regards,
> Matt
>
> [1] https://issues.apache.org/jira/browse/HIVE-9792
>
>
> On Mon, May 9, 2016 at 12:34 PM, Mike Harding <mikeyhard...@gmail.com>
> wrote:
>
>> aaah of course! Thanks Matt that fixed it.
>> When I run my select query I can now receive the results in CSV but when
>> I select to export it in Avro I get the following exception:
>>
>> [image: Inline images 1]
>>
>> I'm assuming this is happening because the underlying data on HDFS my
>> hive table is reading from is not Avro? its currently standard JSON.
>>
>> Thanks,
>> Mike
>>
>>
>>
>>
>>
>>
>> On 9 May 2016 at 17:09, Matt Burgess <mattyb...@gmail.com> wrote:
>>
>>> Your URL has a scheme of "mysql", try replacing with "hive2", and also
>>> maybe explicitly setting the port:
>>>
>>> jdbc:hive2://:1/default
>>>
>>> If that doesn't work, can you see if there is an error/stack trace in
>>> logs/nifi-app.log?
>>>
>>> Regards,
>>> Matt
>>>
>>> On Mon, May 9, 2016 at 12:04 PM, Mike Harding <mikeyhard...@gmail.com>
>>> wrote:
>>> > Hi All,
>>> >
>>> > I'm trying to test out the new SelectHiveQL processor but I'm
>>> struggling to
>>> > get the HiveConnectionPool configured correctly as I keep getting
>>> 'error
>>> > getting hive connection'.
>>> >
>>> > I'm setting the database URL to my db 'default' as
>>> > jdbc:mysql:///default
>>> >
>>> > Nifi is installed on a different node in my cluster so I have set the
>>> > hive-site.xml to point to /etc/spark/2.4.0.0-169/0/hive-site.xml
>>> >
>>> > I currently have Hive Authorization = None and HIveServer2
>>> authentication =
>>> > none but I still specify a user name used to create the db without a
>>> > password.
>>> >
>>> > Would appreciate it if someone could share how they have things
>>> configured.
>>> >
>>> > Thanks,
>>> > Mike
>>>
>>
>>
>


Re: SelectHiveQL HiveConnectionPool issues

2016-05-09 Thread Mike Harding
aaah of course! Thanks Matt that fixed it.
When I run my select query I can now receive the results in CSV but when I
select to export it in Avro I get the following exception:

[image: Inline images 1]

I'm assuming this is happening because the underlying data on HDFS that my
Hive table is reading from is not Avro? It's currently standard JSON.

Thanks,
Mike






On 9 May 2016 at 17:09, Matt Burgess <mattyb...@gmail.com> wrote:

> Your URL has a scheme of "mysql", try replacing with "hive2", and also
> maybe explicitly setting the port:
>
> jdbc:hive2://:1/default
>
> If that doesn't work, can you see if there is an error/stack trace in
> logs/nifi-app.log?
>
> Regards,
> Matt
>
> On Mon, May 9, 2016 at 12:04 PM, Mike Harding <mikeyhard...@gmail.com>
> wrote:
> > Hi All,
> >
> > I'm trying to test out the new SelectHiveQL processor but I'm struggling
> to
> > get the HiveConnectionPool configured correctly as I keep getting 'error
> > getting hive connection'.
> >
> > I'm setting the database URL to my db 'default' as
> > jdbc:mysql:///default
> >
> > Nifi is installed on a different node in my cluster so I have set the
> > hive-site.xml to point to /etc/spark/2.4.0.0-169/0/hive-site.xml
> >
> > I currently have Hive Authorization = None and HIveServer2
> authentication =
> > none but I still specify a user name used to create the db without a
> > password.
> >
> > Would appreciate it if someone could share how they have things
> configured.
> >
> > Thanks,
> > Mike
>


Is it possible to call a HIVE table from a ExecuteScript Processor?

2016-04-26 Thread Mike Harding
Hi All,

I have a requirement to access a lookup Hive table to translate a code number
in a FlowFile to a readable name. I'm just unsure how easy it is to connect to
the DB from an ExecuteScript processor.

NiFi and HiveServer2 sit on the same node, so I'm wondering if it's possible
to use HiveServer2's JDBC client (
https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-JDBC)
without any issues?

Thanks in advance,
Mike
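
No answer appears in this thread, but a hedged sketch of what the JDBC route
could look like from ExecuteScript with the Python engine is below. It
assumes the Hive JDBC driver JARs have been added to the processor's Module
Directory; the connection URL, credentials, table and column names are all
placeholders:

from java.lang import Class
from java.sql import DriverManager

# Register the HiveServer2 JDBC driver (its JARs must be on the Module Directory path)
Class.forName('org.apache.hive.jdbc.HiveDriver')

conn = DriverManager.getConnection('jdbc:hive2://localhost:10000/default', 'user', '')
try:
    stmt = conn.createStatement()
    rs = stmt.executeQuery("SELECT name FROM code_lookup WHERE code = '42'")
    readable_name = rs.getString(1) if rs.next() else None
    rs.close()
    stmt.close()
finally:
    conn.close()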


Re: howto dynamically change the PutHDFS target directory

2016-04-18 Thread Mike Harding
Awesome! Thanks for the heads-up... I'll give that a try.

Mike

On 18 April 2016 at 15:02, Bryan Bende <bbe...@gmail.com> wrote:

> Mike,
>
> If I am understanding correctly I think this can be done today... The
> Directory property on PutHDFS supports expression language, so you could
> set it to a value like:
>
> /data/${now():format('dd-MM-yy')}/
>
> This could be set directly in PutHDFS, although it is also a common
> pattern to stick an UpdateAttribute processor in front of PutHDFS and set
> filename and hadoop.dir attributes, and then in PutHDFS reference those as
> ${filename} and ${hadoop.dir}
>
> The advantage to the UpdateAttribute approach is that you can have a
> single PutHDFS processor that actually writes to many different locations.
>
> Hope that helps.
>
> -Bryan
>
>
> On Mon, Apr 18, 2016 at 2:53 PM, Oleg Zhurakousky <
> ozhurakou...@hortonworks.com> wrote:
>
>> Mike
>>
>> Indeed a very common requirement and we should support it.
>> Would you mind raising a JIRA for it?
>> https://issues.apache.org/jira/browse/NIFI
>>
>> Cheers
>> Oleg
>>
>> On Apr 18, 2016, at 9:50 AM, Mike Harding <mikeyhard...@gmail.com> wrote:
>>
>> Hi All,
>>
>> I have a requirement to write a data stream into HDFS, where the
>> flowfiles received per day are group into a directory. e.g. so I would end
>> up with a folder structure as follows:
>>
>> data/18-04-16
>> data/19-04-16
>> data/20-04-16 ... etc
>>
>> Currently I can specify in the config for the putHDFS processor a target
>> directory but I want this to change and point to a new directory as each
>> day ends.
>>
>> So using nifi id like to 1) be able to create new directories in HDFS
>> (although I could potentially write a bash script to do the directory
>> creation) and 2) change the target directory as the day changes.
>>
>> Any help much appreciated,
>>
>> Mike
>>
>>
>>
>


howto dynamically change the PutHDFS target directory

2016-04-18 Thread Mike Harding
Hi All,

I have a requirement to write a data stream into HDFS, where the flowfiles
received per day are grouped into a directory, e.g. so I would end up with a
folder structure as follows:

data/18-04-16
data/19-04-16
data/20-04-16 ... etc

Currently I can specify a target directory in the PutHDFS processor config,
but I want this to change and point to a new directory as each day ends.

So using NiFi I'd like to 1) be able to create new directories in HDFS
(although I could potentially write a bash script to do the directory
creation) and 2) change the target directory as the day changes.

Any help much appreciated,

Mike


Re: javascript executescript processor

2016-03-07 Thread Mike Harding
Thanks Matt! worked a treat.

Mike

On 7 March 2016 at 17:42, Matt Burgess <mattyb...@gmail.com> wrote:

> Looks like there are a few more differences between Sun Rhino (Java 7) and
> Nashorn (Java 8) than I thought.  For your importClass statements above,
> those shouldn't be in quotes. Also I couldn't get the one-method interface
> override to work even though the doc says it should, so I used an anonymous
> class with 'process' mapped to the function.
>
> I will update the post or add a new one soon; in the meantime, here's a
> Java 7 version of the script I got working:
>
> importClass(org.apache.commons.io.IOUtils)
> importClass(java.nio.charset.StandardCharsets)
> importClass(org.apache.nifi.processor.io.StreamCallback)
>
> var flowFile = session.get();
> if (flowFile != null) {
>
>     flowFile = session.write(flowFile,
>         new StreamCallback() { process : function(inputStream, outputStream) {
>             var text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
>             var obj = JSON.parse(text)
>             var newObj = {
>                 "Range": 5,
>                 "Rating": obj.rating.primary.value,
>                 "SecondaryRatings": {}
>             }
>             for (var key in obj.rating) {
>                 var attrName = key;
>                 var attrValue = obj.rating[key];
>                 if (attrName != "primary") {
>                     newObj.SecondaryRatings[attrName] = {"id": attrName, "Range": 5, "Value": attrValue.value};
>                 }
>             }
>             outputStream.write(new java.lang.String(JSON.stringify(newObj, null, '\t')).getBytes(StandardCharsets.UTF_8))
>         }})
>     flowFile = session.putAttribute(flowFile, "filename",
>         flowFile.getAttribute('filename').split('.')[0] + '_translated.json')
>     session.transfer(flowFile, REL_SUCCESS)
> }
>
> Regards,
> Matt
>
> On Mon, Mar 7, 2016 at 12:05 PM, Mike Harding <mikeyhard...@gmail.com>
> wrote:
>
>> Just wondering as importClass fails if I do the following.
>>
>> importClass("org.apache.nifi.processor.io.StreamCallback");
>> importClass("org.apache.commons.io.IOUtils");
>> importClass("java.nio.charset.StandardCharsets");
>>
>>
>> Apologies for all the questions - I've never tried to include Java
>> classes in a script before.
>>
>>
>>
>> On 7 March 2016 at 17:03, Mike Harding <mikeyhard...@gmail.com> wrote:
>>
>>> aaa ok cool. Given that org.apache.nifi.processor.io.StreamCallback is
>>> an interface do I need to include the underlying classes?
>>>
>>> On 7 March 2016 at 16:29, Matt Burgess <mattyb...@gmail.com> wrote:
>>>
>>>> Looks like on Rhino you need a different syntax to import stuff:
>>>> http://docs.oracle.com/javase/7/docs/technotes/guides/scripting/programmer_guide/#jsengine
>>>>
>>>> On Mon, Mar 7, 2016 at 11:26 AM, Matt Burgess <mattyb...@gmail.com>
>>>> wrote:
>>>>
>>>>> So that's weird since you're running NiFi on Java already and trying
>>>>> to use JavaScript. What version of Java are you using? I wonder if the
>>>>> Java.type() thing is new in Java 8's Nashorn.
>>>>>
>>>>> Sent from my iPhone
>>>>>
>>>>> On Mar 7, 2016, at 11:22 AM, Mike Harding <mikeyhard...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Hi Matt,
>>>>>
>>>>> Thanks for doing this - I've just tried to run the template and I get
>>>>> the reference error: "Java" is not defined. I have JAVA_HOME set on my
>>>>> ubuntu machine - just wondering if theres a new config setting I'm missing
>>>>> perhaps?
>>>>>
>>>>> Mike
>>>>>
>>>>> On 2 March 2016 at 21:40, Matt Burgess <mattyb...@gmail.com> wrote:
>>>>>
>>>>>> Ask and ye shall receive ;) I realize most of my examples are in
>>>>>> Groovy so it was a good idea to do some non-trivial stuff in another
>>>>>> language, thanks for the suggestion!
>>>>>>
>>>>>> I recreated the JSON-to-JSON template but with Javascript as the
>>>>>> language:
>>>>>> http://funnifi.blogspot.com/2016/03/executescript-json-to-json-revisited.html
>>>>>>
>>>>>> Regards,
>>>>>> Matt
>>>>>>
>>>>>> On Wed, Mar 2, 2016 at 10:52 AM, Mike Harding <mikeyhard...@gmail.com
>>>>>> > wrote:

Re: javascript executescript processor

2016-03-07 Thread Mike Harding
Just wondering, as importClass fails if I do the following:

importClass("org.apache.nifi.processor.io.StreamCallback");
importClass("org.apache.commons.io.IOUtils");
importClass("java.nio.charset.StandardCharsets");


Apologies for all the questions - I've never tried to include Java classes
in a script before.



On 7 March 2016 at 17:03, Mike Harding <mikeyhard...@gmail.com> wrote:

> aaa ok cool. Given that org.apache.nifi.processor.io.StreamCallback is an
> interface do I need to include the underlying classes?
>
> On 7 March 2016 at 16:29, Matt Burgess <mattyb...@gmail.com> wrote:
>
>> Looks like on Rhino you need a different syntax to import stuff:
>> http://docs.oracle.com/javase/7/docs/technotes/guides/scripting/programmer_guide/#jsengine
>>
>> On Mon, Mar 7, 2016 at 11:26 AM, Matt Burgess <mattyb...@gmail.com>
>> wrote:
>>
>>> So that's weird since you're running NiFi on Java already and trying to
>>> use JavaScript. What version of Java are you using? I wonder if the
>>> Java.type() thing is new in Java 8's Nashorn.
>>>
>>> Sent from my iPhone
>>>
>>> On Mar 7, 2016, at 11:22 AM, Mike Harding <mikeyhard...@gmail.com>
>>> wrote:
>>>
>>> Hi Matt,
>>>
>>> Thanks for doing this - I've just tried to run the template and I get
>>> the reference error: "Java" is not defined. I have JAVA_HOME set on my
>>> ubuntu machine - just wondering if theres a new config setting I'm missing
>>> perhaps?
>>>
>>> Mike
>>>
>>> On 2 March 2016 at 21:40, Matt Burgess <mattyb...@gmail.com> wrote:
>>>
>>>> Ask and ye shall receive ;) I realize most of my examples are in Groovy
>>>> so it was a good idea to do some non-trivial stuff in another language,
>>>> thanks for the suggestion!
>>>>
>>>> I recreated the JSON-to-JSON template but with Javascript as the
>>>> language:
>>>> http://funnifi.blogspot.com/2016/03/executescript-json-to-json-revisited.html
>>>>
>>>> Regards,
>>>> Matt
>>>>
>>>> On Wed, Mar 2, 2016 at 10:52 AM, Mike Harding <mikeyhard...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi Matt,
>>>>>
>>>>> Do you know if there is documentation that describes the ExecuteScript
>>>>> JavaScript API at the moment ? Just as a practical example how would I
>>>>> translate the Groovy code sample you walk through in this post >
>>>>> http://funnifi.blogspot.co.uk/2016/02/executescript-json-to-json-conversion.html
>>>>>
>>>>> Thanks,
>>>>> M
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On 1 March 2016 at 18:32, Mike Harding <mikeyhard...@gmail.com> wrote:
>>>>>
>>>>>> Hi Matt,
>>>>>>
>>>>>> That's exactly what I'm looking for - much appreciated !
>>>>>>
>>>>>> Thanks,
>>>>>> Mike
>>>>>>
>>>>>> On Tue, 1 Mar 2016 at 18:13, Matt Burgess <mattyb...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Mike,
>>>>>>>
>>>>>>> I have a blog containing a few posts on how to use ExecuteScript and
>>>>>>> InvokeScriptedProcessor: http://funnifi.blogspot.com
>>>>>>>
>>>>>>> One contains an example using Javascript to get data from Hazelcast
>>>>>>> and update flowfile attributes:
>>>>>>> http://funnifi.blogspot.com/2016/02/executescript-using-modules.html
>>>>>>>
>>>>>>> If you'd like to share what you'd like to do with ExecuteScript, I'd
>>>>>>> be happy to help you get going!
>>>>>>>
>>>>>>> Regards,
>>>>>>> Matt
>>>>>>>
>>>>>>> On Tue, Mar 1, 2016 at 11:53 AM, Mike Harding <
>>>>>>> mikeyhard...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I'd like to utilise the ExecuteScript processor but I understand
>>>>>>>> that its experimental. Can anyone point me in the direction of an 
>>>>>>>> example
>>>>>>>> or tutorial preferably using Javascript on how to get started with it?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Mike
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>
>>>>
>>>
>>
>


Re: javascript executescript processor

2016-03-07 Thread Mike Harding
aaa ok cool. Given that org.apache.nifi.processor.io.StreamCallback is an
interface do I need to include the underlying classes?

On 7 March 2016 at 16:29, Matt Burgess <mattyb...@gmail.com> wrote:

> Looks like on Rhino you need a different syntax to import stuff:
> http://docs.oracle.com/javase/7/docs/technotes/guides/scripting/programmer_guide/#jsengine
>
> On Mon, Mar 7, 2016 at 11:26 AM, Matt Burgess <mattyb...@gmail.com> wrote:
>
>> So that's weird since you're running NiFi on Java already and trying to
>> use JavaScript. What version of Java are you using? I wonder if the
>> Java.type() thing is new in Java 8's Nashorn.
>>
>> Sent from my iPhone
>>
>> On Mar 7, 2016, at 11:22 AM, Mike Harding <mikeyhard...@gmail.com> wrote:
>>
>> Hi Matt,
>>
>> Thanks for doing this - I've just tried to run the template and I get the
>> reference error: "Java" is not defined. I have JAVA_HOME set on my ubuntu
>> machine - just wondering if theres a new config setting I'm missing perhaps?
>>
>> Mike
>>
>> On 2 March 2016 at 21:40, Matt Burgess <mattyb...@gmail.com> wrote:
>>
>>> Ask and ye shall receive ;) I realize most of my examples are in Groovy
>>> so it was a good idea to do some non-trivial stuff in another language,
>>> thanks for the suggestion!
>>>
>>> I recreated the JSON-to-JSON template but with Javascript as the
>>> language:
>>> http://funnifi.blogspot.com/2016/03/executescript-json-to-json-revisited.html
>>>
>>> Regards,
>>> Matt
>>>
>>> On Wed, Mar 2, 2016 at 10:52 AM, Mike Harding <mikeyhard...@gmail.com>
>>> wrote:
>>>
>>>> Hi Matt,
>>>>
>>>> Do you know if there is documentation that describes the ExecuteScript
>>>> JavaScript API at the moment ? Just as a practical example how would I
>>>> translate the Groovy code sample you walk through in this post >
>>>> http://funnifi.blogspot.co.uk/2016/02/executescript-json-to-json-conversion.html
>>>>
>>>> Thanks,
>>>> M
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On 1 March 2016 at 18:32, Mike Harding <mikeyhard...@gmail.com> wrote:
>>>>
>>>>> Hi Matt,
>>>>>
>>>>> That's exactly what I'm looking for - much appreciated !
>>>>>
>>>>> Thanks,
>>>>> Mike
>>>>>
>>>>> On Tue, 1 Mar 2016 at 18:13, Matt Burgess <mattyb...@gmail.com> wrote:
>>>>>
>>>>>> Mike,
>>>>>>
>>>>>> I have a blog containing a few posts on how to use ExecuteScript and
>>>>>> InvokeScriptedProcessor: http://funnifi.blogspot.com
>>>>>>
>>>>>> One contains an example using Javascript to get data from Hazelcast
>>>>>> and update flowfile attributes:
>>>>>> http://funnifi.blogspot.com/2016/02/executescript-using-modules.html
>>>>>>
>>>>>> If you'd like to share what you'd like to do with ExecuteScript, I'd
>>>>>> be happy to help you get going!
>>>>>>
>>>>>> Regards,
>>>>>> Matt
>>>>>>
>>>>>> On Tue, Mar 1, 2016 at 11:53 AM, Mike Harding <mikeyhard...@gmail.com
>>>>>> > wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I'd like to utilise the ExecuteScript processor but I understand
>>>>>>> that its experimental. Can anyone point me in the direction of an 
>>>>>>> example
>>>>>>> or tutorial preferably using Javascript on how to get started with it?
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Mike
>>>>>>>
>>>>>>
>>>>>>
>>>>
>>>
>>
>


Re: javascript executescript processor

2016-03-02 Thread Mike Harding
Hi Matt,

Do you know if there is documentation that describes the ExecuteScript
JavaScript API at the moment? Just as a practical example, how would I
translate the Groovy code sample you walk through in this post >
http://funnifi.blogspot.co.uk/2016/02/executescript-json-to-json-conversion.html

Thanks,
M





On 1 March 2016 at 18:32, Mike Harding <mikeyhard...@gmail.com> wrote:

> Hi Matt,
>
> That's exactly what I'm looking for - much appreciated !
>
> Thanks,
> Mike
>
> On Tue, 1 Mar 2016 at 18:13, Matt Burgess <mattyb...@gmail.com> wrote:
>
>> Mike,
>>
>> I have a blog containing a few posts on how to use ExecuteScript and
>> InvokeScriptedProcessor: http://funnifi.blogspot.com
>>
>> One contains an example using Javascript to get data from Hazelcast and
>> update flowfile attributes:
>> http://funnifi.blogspot.com/2016/02/executescript-using-modules.html
>>
>> If you'd like to share what you'd like to do with ExecuteScript, I'd be
>> happy to help you get going!
>>
>> Regards,
>> Matt
>>
>> On Tue, Mar 1, 2016 at 11:53 AM, Mike Harding <mikeyhard...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I'd like to utilise the ExecuteScript processor but I understand that
>>> its experimental. Can anyone point me in the direction of an example or
>>> tutorial preferably using Javascript on how to get started with it?
>>>
>>> Thanks,
>>> Mike
>>>
>>
>>


Nifi JSON event storage in HDFS

2016-03-02 Thread Mike Harding
Hi All,

I currently have a small Hadoop cluster running with HDFS and Hive. My
ultimate goal is to leverage NiFi's ingestion and flow capabilities to store
real-time external JSON-formatted event data.

What I am unclear about is what the best strategy/design is for storing
FlowFile data (i.e. JSON events in my case) within HDFS that can then be
accessed and analysed in Hive tables.

Is much of the storage design handled in the NiFi flow, or do I need to set
something up external to NiFi to ensure I can query each JSON-formatted event
as a record in a Hive log table, for example?

Any examples or suggestions much appreciated,

Thanks,
M


Re: javascript executescript processor

2016-03-01 Thread Mike Harding
Hi Matt,

That's exactly what I'm looking for - much appreciated !

Thanks,
Mike

On Tue, 1 Mar 2016 at 18:13, Matt Burgess <mattyb...@gmail.com> wrote:

> Mike,
>
> I have a blog containing a few posts on how to use ExecuteScript and
> InvokeScriptedProcessor: http://funnifi.blogspot.com
>
> One contains an example using Javascript to get data from Hazelcast and
> update flowfile attributes:
> http://funnifi.blogspot.com/2016/02/executescript-using-modules.html
>
> If you'd like to share what you'd like to do with ExecuteScript, I'd be
> happy to help you get going!
>
> Regards,
> Matt
>
> On Tue, Mar 1, 2016 at 11:53 AM, Mike Harding <mikeyhard...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I'd like to utilise the ExecuteScript processor but I understand that its
>> experimental. Can anyone point me in the direction of an example or
>> tutorial preferably using Javascript on how to get started with it?
>>
>> Thanks,
>> Mike
>>
>
>


javascript executescript processor

2016-03-01 Thread Mike Harding
Hi,

I'd like to utilise the ExecuteScript processor, but I understand that it's
experimental. Can anyone point me in the direction of an example or tutorial,
preferably using JavaScript, on how to get started with it?

Thanks,
Mike