Re: GetSQS causes high CPU usage

2015-11-03 Thread Joe Witt
Adam,

Just wanted to follow up on this.  Have you had any better results and
should we put a JIRA in behind what you're seeing?

Thanks
Joe

On Tue, Oct 20, 2015 at 7:58 PM, Adam Lamar  wrote:
> Adam,
>
> Thanks for the reply!
>
> Amazon supports (and recommends) long polling on SQS queues[1]. The GetSQS
> code doesn't attempt long polling at all, but I wasn't sure if this was
> intentional or if the option had just never been added. With a 20 second
> long poll, the processor would make 3 requests per minute instead of 60,
> assuming the queue was empty during that time.
>
> Another data point - even during high CPU usage, the GetSQS processor was
> only making one request per second to SQS (verified via tcpdump). While not
> ideal from a billing perspective, doesn't it seem wrong that 1 request a
> second is causing such high CPU?
>
> Perhaps to muddy the waters a bit, I played with the run schedule yesterday,
> and even now that I've turned it back to 1 second, CPU usage is remaining
> low. Before I could start/stop GetSQS repeatedly and observe the high CPU
> usage, but now I can't reproduce it. If I'm able to consistently reproduce
> the issue in the future, I'll be sure to post again.
>
> Cheers,
> Adam
>
>
> [1]
> http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-long-polling.html
>
>
> On 10/20/15 4:37 AM, Adam Estrada wrote:
>>
>> Adam,
>>
>> I suspect that getSQS is polling Amazon to check for data. It's not
>> exactly like your standard message broker in that you have to force the
>> poll. Anyway, throw a wait time in there and see if that fixes it. This will
>> also help lower your monthly Amazon bill...
>>
>> Adam
>>
>>
>>> On Oct 19, 2015, at 11:41 PM, Adam Lamar  wrote:
>>>
>>> Hi everybody!
>>>
>>> I've been testing NiFi 0.3.0 with the GetSQS processor to fetch objects
>>> from an AWS bucket as they're created. My flow looks like this:
>>>
>>> GetSQS
>>> SplitJson
>>> ExtractText
>>> FetchS3Object
>>> PutFile
>>>
>>> I noticed that GetSQS causes a high amount of CPU usage - about 90% of
>>> one core. If I turn off GetSQS, CPU usage immediately drops to 2%. If I turn
>>> GetSQS back on with the run schedule at 10 seconds, it stays at 2%.
>>>
>>> Would it be worth using setWaitTimeSeconds [1] to make the SQS receive call
>>> a blocking one? Alternatively, should GetSQS default to a longer run
>>> schedule?
>>>
>>> Cheers,
>>> Adam
>>>
>>>
>>> [1]
>>> http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/sqs/model/ReceiveMessageRequest.html#setWaitTimeSeconds(java.lang.Integer)
>
>


Re: template repo for learning

2015-11-03 Thread Bryan Bende
If you can share a little more info about what the API that you're trying
to interact with looks like, we can likely provide more concrete guidance.

As a very basic test, to familiarize yourself with the Expression Language,
you could create a "dummy flow" such as:
- GenerateFlowFile to create a new FlowFile on a timer
- UpdateAttribute to set the attributes you want passed to your API; you can
use expression language here to create dynamic date/time values (see the
example below)
- InvokeHttp to call your API
- You could then route the "Response" relationship from InvokeHttp to some
other processor to possibly extract information from the response for
further use
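
As a concrete sketch (the attribute name, host, and query parameter below are
placeholders, not anything your API requires), the UpdateAttribute property
could be:

  request.date = ${now():format('yyyy-MM-dd HH:mm:ss')}

and InvokeHttp's Remote URL could then reference it, e.g.
http://your-api.example.com/query?since=${request.date}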

Let us know if we can help more.

On Tue, Nov 3, 2015 at 9:51 AM, Christopher Hamm  wrote:

> I am trying to query an API based on date/time, and possibly based on results
> from fields of another query.
> On Nov 3, 2015 9:41 AM, "Bryan Bende"  wrote:
>
>> Christopher,
>>
>> In terms of templates, the best resources we have right now are:
>>
>> https://cwiki.apache.org/confluence/display/NIFI/Example+Dataflow+Templates
>> https://github.com/xmlking/nifi-examples
>>
>> For expression language we have the EL guide:
>> https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html
>>
>> Is there a specific flow you are trying to tackle?
>>
>> -Bryan
>>
>>
>> On Tue, Nov 3, 2015 at 9:36 AM, Christopher Hamm <
>> em...@christopherhamm.com> wrote:
>>
>>> Is there a repo of nifi templates with advanced features that use lots
>>> of expression language, especially when used to make requests? I can't find
>>> enough docs or youtube videos that really dig into it.
>>>
>>
>>


Re: Nifi connection queue monitoring.

2015-11-03 Thread Chakrader Dewaragatla
Matt – How do I get the start time of a processor?
We have a GetSFTP processor with a timer-driven schedule (6h interval). I would 
like to monitor the start and finish time for that interval. Where do I see the 
captured stats?

Thanks,
-Chakri

From: Chakrader Dewaragatla
Date: Monday, November 2, 2015 at 4:12 PM
To: "users@nifi.apache.org"
Subject: Re: Nifi connection queue monitoring.

Thanks Matt, this should help us find the stats.

Thanks,
-Chakri

From: Matt Gilman
Reply-To: "users@nifi.apache.org"
Date: Monday, November 2, 2015 at 4:39 PM
To: "users@nifi.apache.org"
Subject: Re: Nifi connection queue monitoring.

Chakri,

The configuration and the status are decoupled from one another so we don't 
need to pull back the entire configuration when a user wants to refresh the 
statistics frequently. Currently, the only endpoint available for retrieving 
status is based on a Process Group. That end point is

https://{host}:{port}/nifi-api/controller/process-groups/{process-group-id}/status?recursive={true|false}

If you want to get the status of every component, you can use the alias "root" 
for the top most process group id and set recursive to TRUE. If you want a 
specific connection, you can use the process group id of the Process Group that 
connection resides in. The best way to see these requests in action is to open 
up your Developer Tools in your browser and watch the requests the UI makes.
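
For example, to pull the status of everything in the flow in one request
(substitute your own host and port):

  https://{host}:{port}/nifi-api/controller/process-groups/root/status?recursive=true

The connection status entries in that response should include the queued count
and size for each connection, which covers the backlog check you described.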

Let me know if you have any more questions. Thanks!

Matt

On Mon, Nov 2, 2015 at 6:11 PM, Chakrader Dewaragatla wrote:
Hi,
Does NiFi have any REST API to monitor the queue size for a connection?
I am trying to query a connection to see if it has any data queued/backlogged.
I will use the backlog queue size to determine whether a workflow is choking on
massive incoming data or one of the downstream processors has stopped processing.

The following API shows the connection status, but not the queue size:

/controller/process-groups/{process-group-id}/connections/{id}

Thanks,
-Chakri

The information contained in this transmission may contain privileged and 
confidential information. It is intended only for the use of the person(s) 
named above. If you are not the intended recipient, you are hereby notified 
that any review, dissemination, distribution or duplication of this 
communication is strictly prohibited. If you are not the intended recipient, 
please contact the sender by reply email and destroy all copies of the original 
message.






Re: Client Site to Site

2015-11-03 Thread Mark Payne
Naveen,

With the config provided below, you are setting the "nifi.remote.input.secure" 
flag to true.
This means that you will need to also set the keystore and truststore 
properties.
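
For reference, those entries in nifi.properties look like the following (the
paths, type, and passwords are placeholders for your environment):

nifi.security.keystore=./conf/keystore.jks
nifi.security.keystoreType=JKS
nifi.security.keystorePasswd=keystorePassword
nifi.security.keyPasswd=keyPassword
nifi.security.truststore=./conf/truststore.jks
nifi.security.truststoreType=JKS
nifi.security.truststorePasswd=truststorePassword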

Thanks
-Mark


> On Nov 3, 2015, at 12:36 PM, Madhire, Naveen  
> wrote:
> 
> Hi,
> 
> I am unable to connect to a NiFi instance using the site-to-site configuration. 
> I’ve set up NiFi to run locally and also configured site-to-site communication 
> using the properties below:
> 
> # Site to Site properties
> nifi.remote.input.socket.host=
> nifi.remote.input.socket.port=9870
> nifi.remote.input.secure=true
> 
> 
> I created a simple workflow to pull the data from Kafka and put into an 
> Output Port (“oput”).
> 
> My issue is that when I use the below SiteToSiteClientConfig in my 
> application, I get a “could not find port” error:
> 
> 
> SiteToSiteClientConfig clientConfig = new SiteToSiteClient.Builder()
>   .url("http://localhost:8080/nifi/;)
>   .portName(“oput")
>   .buildConfig();
> 
> 
> Do I need to configure any other property to enable remote site-to-site?
> 
> Please let me know.
> 
> Thanks,
> Naveen
> 
> The information contained in this e-mail is confidential and/or proprietary 
> to Capital One and/or its affiliates and may only be used solely in 
> performance of work or services for Capital One. The information transmitted 
> herewith is intended only for use by the individual or entity to which it is 
> addressed. If the reader of this message is not the intended recipient, you 
> are hereby notified that any review, retransmission, dissemination, 
> distribution, copying or other use of, or taking of any action in reliance 
> upon this information is strictly prohibited. If you have received this 
> communication in error, please contact the sender and delete the material 
> from your computer.



Re: Client Site to Site

2015-11-03 Thread Mark Payne
Naveen,

Currently, if you are using secure site-to-site, then it requires 2-way SSL. So 
you will want to use
needClientAuth = true, and you will need the keystore and truststore configured 
for both NiFi and
your client.

This is something that is being addressed for the 1.0.0 build, but as of right 
now it's two-way SSL or
not secure.
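
A sketch of what the client side would look like (builder method names from the
nifi-site-to-site-client module; the paths, passwords, and https port are
placeholders, so double-check against your NiFi version):

SiteToSiteClientConfig clientConfig = new SiteToSiteClient.Builder()
    .url("https://localhost:8443/nifi/")
    .portName("oput")
    .keystoreFilename("/path/to/client-keystore.jks")
    .keystorePass("keystorePassword")
    .keystoreType(KeystoreType.JKS)
    .truststoreFilename("/path/to/truststore.jks")
    .truststorePass("truststorePassword")
    .truststoreType(KeystoreType.JKS)
    .buildConfig();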

Thanks
-Mark


> On Nov 3, 2015, at 12:46 PM, Madhire, Naveen  
> wrote:
> 
> I did set the keystore properties and didn’t set the truststore properties 
> because I set “nifi.security.needClientAuth” to “false”
> 
> Do I still need truststore property?
> 
> 
> nifi.security.truststore=
> nifi.security.truststoreType=
> nifi.security.truststorePasswd=
> nifi.security.needClientAuth=false
> 
> 
> 
> 
> 
> 
> From: Mark Payne
> Reply-To: "users@nifi.apache.org"
> Date: Tuesday, November 3, 2015 at 11:40 AM
> To: "users@nifi.apache.org"
> Subject: Re: Client Site to Site
> 
> Naveen,
> 
> With the config provided below, you are setting the 
> "nifi.remote.input.secure" flag to true.
> This means that you will need to also set the keystore and truststore 
> properties.
> 
> Thanks
> -Mark
> 
> 
>> On Nov 3, 2015, at 12:36 PM, Madhire, Naveen wrote:
>> 
>> Hi,
>> 
>> I am unable to connect to a NiFi instance using the site-to-site configuration. 
>> I’ve set up NiFi to run locally and also configured site-to-site communication 
>> using the properties below:
>> 
>> # Site to Site properties
>> nifi.remote.input.socket.host=
>> nifi.remote.input.socket.port=9870
>> nifi.remote.input.secure=true
>> 
>> 
>> I created a simple workflow to pull the data from Kafka and put into an 
>> Output Port (“oput”).
>> 
>> My issue is that when I use the below SiteToSiteClientConfig in my 
>> application, I get a “could not find port” error:
>> 
>> 
>> SiteToSiteClientConfig clientConfig = new SiteToSiteClient.Builder()
>>   .url("http://localhost:8080/nifi/ ")
>>   .portName(“oput")
>>   .buildConfig();
>> 
>> 
>> Do I need to configure any other property to enable remote site-to-site?
>> 
>> Please let me know.
>> 
>> Thanks,
>> Naveen
>> 
>> The information contained in this e-mail is confidential and/or proprietary 
>> to Capital One and/or its affiliates and may only be used solely in 
>> performance of work or services for Capital One. The information transmitted 
>> herewith is intended only for use by the individual or entity to which it is 
>> addressed. If the reader of this message is not the intended recipient, you 
>> are hereby notified that any review, retransmission, dissemination, 
>> distribution, copying or other use of, or taking of any action in reliance 
>> upon this information is strictly prohibited. If you have received this 
>> communication in error, please contact the sender and delete the material 
>> from your computer.



Re: GetSQS causes high CPU usage

2015-11-03 Thread Adam Lamar

Hey Joe,

I think there are two possible JIRAs.

1) Add long polling support using setWaitTimeSeconds() - should be 
really easy. I can take a crack at a pull request. Here's a JIRA: 
https://issues.apache.org/jira/browse/NIFI-1103
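
For anyone curious, the change is roughly this against the AWS Java SDK (a
sketch only - the queue URL is a placeholder, and 20 seconds is just the SQS
maximum wait):

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClient;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import com.amazonaws.services.sqs.model.ReceiveMessageResult;

// placeholder queue URL - use your own
String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue";
AmazonSQS sqsClient = new AmazonSQSClient();
ReceiveMessageRequest request = new ReceiveMessageRequest()
    .withQueueUrl(queueUrl)
    .withMaxNumberOfMessages(10)
    .withWaitTimeSeconds(20); // long poll: the call blocks server-side for
                              // up to 20s until messages arrive
ReceiveMessageResult result = sqsClient.receiveMessage(request);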


2) Investigate the high CPU usage. I saw this initially for several 
days, but it went away after I adjusted the run schedule (from 1 second 
to 10 seconds back to 1 second). I have CPU charts showing the high 
usage and corresponding drop, but I need to reproduce the issue.


I'll circle back in a few days when I get some time to work on it.

Cheers,
Adam

On 11/3/15 2:41 AM, Joe Witt wrote:

Adam,

Just wanted to follow up on this.  Have you had any better results and
should we put a JIRA in behind what you're seeing?

Thanks
Joe

On Tue, Oct 20, 2015 at 7:58 PM, Adam Lamar  wrote:

Adam,

Thanks for the reply!

Amazon supports (and recommends) long polling on SQS queues[1]. The GetSQS
code doesn't attempt long polling at all, but I wasn't sure if this was
intentional or if the option had just never been added. With a 20 second
long poll, the processor would make 3 requests per minute instead of 60,
assuming the queue was empty during that time.

Another data point - even during high CPU usage, the GetSQS processor was
only making one request per second to SQS (verified via tcpdump). While not
ideal from a billing perspective, doesn't it seem wrong that 1 request a
second is causing such high CPU?

Perhaps to muddy the waters a bit, I played with the run schedule yesterday,
and even now that I've turned it back to 1 second, CPU usage is remaining
low. Before I could start/stop GetSQS repeatedly and observe the high CPU
usage, but now I can't reproduce it. If I'm able to consistently reproduce
the issue in the future, I'll be sure to post again.

Cheers,
Adam


[1]
http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-long-polling.html


On 10/20/15 4:37 AM, Adam Estrada wrote:

Adam,

I suspect that getSQS is polling Amazon to check for data. It's not
exactly like your standard message broker in that you have to force the
poll. Anyway, throw a wait time in there and see if that fixes it. This will
also help lower your monthly Amazon bill...

Adam



On Oct 19, 2015, at 11:41 PM, Adam Lamar  wrote:

Hi everybody!

I've been testing NiFi 0.3.0 with the GetSQS processor to fetch objects
from an AWS bucket as they're created. My flow looks like this:

GetSQS
SplitJson
ExtractText
FetchS3Object
PutFile

I noticed that GetSQS causes a high amount of CPU usage - about 90% of
one core. If I turn off GetSQS, CPU usage immediately drops to 2%. If I turn
GetSQS back on with the run schedule at 10 seconds, it stays at 2%.

Would it be worth using setWaitTimeSeconds [1] to make the SQS receive call
a blocking one? Alternatively, should GetSQS default to a longer run
schedule?

Cheers,
Adam


[1]
http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/sqs/model/ReceiveMessageRequest.html#setWaitTimeSeconds(java.lang.Integer)






Client Site to Site

2015-11-03 Thread Madhire, Naveen
Hi,

I am unable to connect to a NiFi instance using the site-to-site configuration. I’ve 
set up NiFi to run locally and also configured site-to-site communication using 
the properties below:


# Site to Site properties

nifi.remote.input.socket.host=

nifi.remote.input.socket.port=9870

nifi.remote.input.secure=true



I created a simple workflow to pull the data from Kafka and put into an Output 
Port (“oput”).


My issue is that when I use the below SiteToSiteClientConfig in my application, 
I get a “could not find port” error:



SiteToSiteClientConfig clientConfig = new SiteToSiteClient.Builder()
  .url("http://localhost:8080/nifi/;)
  .portName(“oput")
  .buildConfig();



Do I need to configure any other property to enable remote site-to-site?


Please let me know.


Thanks,

Naveen


The information contained in this e-mail is confidential and/or proprietary to 
Capital One and/or its affiliates and may only be used solely in performance of 
work or services for Capital One. The information transmitted herewith is 
intended only for use by the individual or entity to which it is addressed. If 
the reader of this message is not the intended recipient, you are hereby 
notified that any review, retransmission, dissemination, distribution, copying 
or other use of, or taking of any action in reliance upon this information is 
strictly prohibited. If you have received this communication in error, please 
contact the sender and delete the material from your computer.


Re: Client Site to Site

2015-11-03 Thread Madhire, Naveen
I did set the keystore properties but didn’t set the truststore properties, 
because I set “nifi.security.needClientAuth” to “false”.

Do I still need the truststore property?



nifi.security.truststore=

nifi.security.truststoreType=

nifi.security.truststorePasswd=

nifi.security.needClientAuth=false






From: Mark Payne
Reply-To: "users@nifi.apache.org"
Date: Tuesday, November 3, 2015 at 11:40 AM
To: "users@nifi.apache.org"
Subject: Re: Client Site to Site

Naveen,

With the config provided below, you are setting the "nifi.remote.input.secure" 
flag to true.
This means that you will need to also set the keystore and truststore 
properties.

Thanks
-Mark


On Nov 3, 2015, at 12:36 PM, Madhire, Naveen wrote:

Hi,

I am unable to connect to a NiFi instance using the site-to-site configuration. I’ve 
set up NiFi to run locally and also configured site-to-site communication using 
the properties below:

# Site to Site properties
nifi.remote.input.socket.host=
nifi.remote.input.socket.port=9870
nifi.remote.input.secure=true


I created a simple workflow to pull the data from Kafka and put into an Output 
Port (“oput”).

My issue is that when I use the below SiteToSiteClientConfig in my application, 
I get a “could not find port” error:



SiteToSiteClientConfig clientConfig = new SiteToSiteClient.Builder()
  .url("http://localhost:8080/nifi/;)
  .portName(“oput")
  .buildConfig();


Do I need to configure any other property to enable remote site-to-site?

Please let me know.

Thanks,
Naveen


The information contained in this e-mail is confidential and/or proprietary to 
Capital One and/or its affiliates and may only be used solely in performance of 
work or services for Capital One. The information transmitted herewith is 
intended only for use by the individual or entity to which it is addressed. If 
the reader of this message is not the intended recipient, you are hereby 
notified that any review, retransmission, dissemination, distribution, copying 
or other use of, or taking of any action in reliance upon this information is 
strictly prohibited. If you have received this communication in error, please 
contact the sender and delete the material from your computer.





Suggestion on how to parse field out of filename

2015-11-03 Thread Mark Petronic
Looking for some help on best way to extract a field from a filename. I
need to parse out the date from the core filename attribute set by the
UnpackContent processor. I am unzipping files that contain many CSV files
and these CSV file names vary in format but each has a timestamp included
in the filename. Example formats are:

Priority_002_20151104123456_00.csv (20151104123456 is yyyyMMddHHmmss)
ABC_02_1447586912344.csv (1447586912344 is Unix time in ms)
XYZ_20151104_1234.csv (20151104_1234 is yyyyMMdd_HHmm)

So, there are various forms to deal with. I need to normalize these into
yyyyMMddHHmmss. A regex with capture groups would be perfect, but I cannot
quite figure out how to do it. ExtractText does regex with capture groups,
but only against flowfile contents, and these are attributes.
UpdateAttribute only supports expression language, and that does not have
regex-based extraction of capture groups.

In Python, I would just do something like:

import re

date, time = re.search(r"XYZ_(\d+)_(\d+)\.csv",
                       "XYZ_20151104_1234.csv").groups()

Then I could use the expression language format or toDate functions to
normalize the dates.
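
(Maybe replaceAll could be abused for this, since it does honor $1-style
backreferences in the replacement - a sketch for the XYZ form, with one
UpdateAttribute property per filename pattern:

file.timestamp = ${filename:replaceAll('XYZ_(\d+)_(\d+)\.csv', '$1$2'):toDate('yyyyMMddHHmm'):format('yyyyMMddHHmmss')}

but I'm not sure that covers all the patterns cleanly.)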

I know I could use a utility script with ExecuteStreamCommand that I could
call with the filepath to get back the tokens, but I was looking for an
internal way to do it without forking a process, since there are a lot of
archives in each zip and that would add latency under heavy load.

Any thoughts?

Thanks!


Re: Need help in nifi- flume processor

2015-11-03 Thread Parul Agrawal
Hi Bryan,


I am trying to insert the following data into a database using the NiFi
processors ConvertJSONToSQL and PutSQL.

JSON object used:
{"index":"1", "num":"1", "len":"58", "caplen":"54", "timestamp":"Nov  4,
2015 00:42:15.0 CST"}

Kindly find the table description:

maddb=# \d gen_info;
                        Table "public.gen_info"
  Column   |           Type           |                 Modifiers
-----------+--------------------------+-------------------------------------------
 index     | bigint                   | not null default nextval('gen_info_index_seq'::regclass)
 num       | integer                  |
 len       | integer                  |
 caplen    | integer                  |
 timestamp | timestamp with time zone |

While inserting the timestamp 'Nov  4, 2015 00:42:15.0 CST', the following
exception is thrown by the NiFi processor:

2015-11-04 00:42:16,798 ERROR [Timer-Driven Process Thread-10]
o.apache.nifi.processors.standard.PutSQL
PutSQL[id=04197ba8-25a5-4301-ad00-2cc139972877] Cannot update database for
StandardFlowFileRecord[uuid=b32687d6-b814-4063-83e4-27af8f16be0b,claim=StandardContentClaim
[resourceClaim=StandardResourceClaim[id=1446619336301-1, container=default,
section=1], offset=8816,
length=70],offset=0,name=10468777551217095,size=70] due to
org.apache.nifi.processor.exception.ProcessException: The value of the
sql.args.1.value is 'Nov  4, 2015 00:42:15.0 CST', which cannot be
converted into the necessary data type; routing to failure:
org.apache.nifi.processor.exception.ProcessException: The value of the
sql.args.1.value is 'Nov  4, 2015 00:42:15.0 CST', which cannot be
converted into the necessary data type.

Manual insertion into the DB works fine, but with NiFi I am getting the above
exception. Also, if the timestamp is in string format, no error is thrown. Can
you please help me in this regard?
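
Would something like an UpdateAttribute between ConvertJSONToSQL and PutSQL,
rewriting the value into epoch milliseconds, be the right approach? A sketch of
what I mean (assuming the generated attribute is sql.args.1.value, and using
the expression language's toDate/toNumber):

  sql.args.1.value = ${'sql.args.1.value':toDate('MMM  d, yyyy HH:mm:ss.S zzz'):toNumber()}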

Thank you very much for all the guidance and support provided so far.

Thanks and Regards,
Parul



On Mon, Oct 26, 2015 at 5:14 PM, Bryan Bende  wrote:

> Hello,
>
> Can you tell us what you are trying to route on in the json? What regular
> expression did you try in RouteOnContent?
>
> -Bryan
>
>
> On Monday, October 26, 2015, Parul Agrawal 
> wrote:
>
>> Hi,
>>
>> Thank you very much for all the support.
>> I have written a custom processor to split json to multiple json.
>> Now I would like to route the flowfile based on the content of the
>> flowfile.
>> I tried using RouteOnContent. But it did not work.
>>
>> Can you please help me how can i route the flowfile based on the
>> content/data it contains.
>>
>> Thanks and Regards,
>> Parul
>>
>>
>>
>> On Tue, Oct 13, 2015 at 6:54 PM, Bryan Bende  wrote:
>>
>>> Parul,
>>>
>>> You can use SplitJson to take a large JSON document and split an array
>>> element into individual documents. I took the json you attached and created
>>> a flow like GetFile -> SplitJson -> SplitJson -> PutFile
>>>
>>> In the first SplitJson the path I used was $.packet.proto, and in the
>>> second I used $.field. This seemed to mostly work, except that some of the splits
>>> going into PutFile still have another level of "field" which needs to be
>>> split again, so you would possibly need some conditional logic to split certain
>>> documents again.
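>>>
>>> In other words, the structure that matters here is roughly this (a
>>> simplified sketch - the real new.json attachment is much larger):
>>>
>>> {
>>>   "packet": {
>>>     "proto": [
>>>       { "field": [ { "name": "a" },
>>>                    { "field": [ { "name": "b" } ] } ] }
>>>     ]
>>>   }
>>> }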
>>>
>>> Alternatively you could write a custom processor that restructures your
>>> JSON.
>>>
>>> -Bryan
>>>
>>>
>>>
>>> On Tue, Oct 13, 2015 at 8:36 AM, Parul Agrawal wrote:
>>>
 Hi,

 I tried with the above json element. But I am getting the below
 mentioned error:

 2015-10-12 23:53:39,209 ERROR [Timer-Driven Process Thread-9]
 o.a.n.p.standard.ConvertJSONToSQL
 ConvertJSONToSQL[id=0e964781-6914-486f-8bb7-214c6a1cd66e] Failed to parse
 StandardFlowFileRecord[uuid=dfc16db0-c7a6-4e9e-8b4d-8c5b4ec50742,claim=StandardContentClaim
 [resourceClaim=StandardResourceClaim[id=183036971-1, container=default,
 section=1], offset=132621, length=55],offset=0,name=json,size=55] as JSON
 due to org.apache.nifi.processor.exception.ProcessException: IOException
 thrown from ConvertJSONToSQL[id=0e964781-6914-486f-8bb7-214c6a1cd66e]:
 org.codehaus.jackson.JsonParseException: Unexpected character ('I' (code
 73)): expected a valid value (number, String, array, object, 'true',
 'false' or 'null')

 Also, I have a huge JSON object attached (new.json). Can you guide me on
 how to use the ConvertJSONToSQL processor?
 Should I use any other processor before the ConvertJSONToSQL processor
 so that new.json can be converted into a flat document of key/value pairs,
 or an array of flat documents?

 Any help/guidance would be really useful.

 Thanks and Regards,
 Parul

 On Mon, Oct 12, 2015 at 10:36 PM, Bryan Bende  wrote:

> I think 

template repo for learning

2015-11-03 Thread Christopher Hamm
Is there a repo of nifi templates with advanced features that use lots of
expression language, especially when used to make requests? I can't find
enough docs or youtube videos that really dig into it.


Re: template repo for learning

2015-11-03 Thread Christopher Hamm
Thanks Bryan. I will take a look. I might have a few questions to resolve
errors from the UI once I try again.

On Tue, Nov 3, 2015 at 10:14 AM, Bryan Bende  wrote:

> If you can share a little more info about what the API that you're trying
> to interact with looks like, we can likely provide more concrete guidance.
>
> As a very basic test, to familiarize yourself with the Expression
> Language, you could create a "dummy flow" such as:
> - GenerateFlowFile to create a new FlowFile on a timer
> - UpdateAttribute to set attributes you want to be passed to your API, you
> can use expression language here to create dynamic date/time values
> - InvokeHttp to call your API
> - You could then route the "Response" relationship from InvokeHttp to some
> other processor to possibly extract information from the response for
> further use
>
> Let us know if we can help more.
>
> On Tue, Nov 3, 2015 at 9:51 AM, Christopher Hamm 
> wrote:
>
>> I am trying to query api based on date/time and possibly based on results
>> from fields of another query.
>> On Nov 3, 2015 9:41 AM, "Bryan Bende"  wrote:
>>
>>> Christopher,
>>>
>>> In terms of templates, the best resources we have right now are:
>>>
>>> https://cwiki.apache.org/confluence/display/NIFI/Example+Dataflow+Templates
>>> https://github.com/xmlking/nifi-examples
>>>
>>> For expression language we have the EL guide:
>>>
>>> https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html
>>>
>>> Is there a specific flow you are trying to tackle?
>>>
>>> -Bryan
>>>
>>>
>>> On Tue, Nov 3, 2015 at 9:36 AM, Christopher Hamm <
>>> em...@christopherhamm.com> wrote:
>>>
 Is there a repo of nifi templates with advanced features that use lots
 of expression language, especially when used to make requests? I can't find
 enough docs or youtube videos that really dig into it.

>>>
>>>
>


-- 
Sincerely,
Chris Hamm
(E) ceham...@gmail.com
(Twitter) http://twitter.com/webhamm
(Linkedin) http://www.linkedin.com/in/chrishamm