Adam,
I'm not very familiar with that specific processor but I think you'll
find your case is probably far better handled using the Record
reader/writer processors anyway. There is a GrokReader which you can
use to apply grok expressions to each line of a given input and parse
out key fields agains
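For intuition only (this is not the GrokReader implementation, which is Java inside NiFi): a grok pattern is essentially a named-group regex applied to each line/record. A minimal Python sketch with a hypothetical access-log format:

```python
import re

# A grok pattern like "%{IP:client} %{WORD:method} %{NOTSPACE:path}"
# compiles down to a named-group regex; this hand-written equivalent
# is only illustrative, not NiFi's actual pattern library.
LINE_PATTERN = re.compile(
    r"(?P<client>\d{1,3}(?:\.\d{1,3}){3}) (?P<method>[A-Z]+) (?P<path>\S+)"
)

def parse_line(line):
    """Return a dict of extracted fields, or None if the line doesn't match."""
    m = LINE_PATTERN.match(line)
    return m.groupdict() if m else None
```

The Record approach means NiFi applies such a pattern per record and hands downstream processors structured fields instead of raw text.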
Hi Aruna,
The placeholders in your ReplaceText configuration, such as
'${city_name}', are NiFi Expression Language. If the incoming FlowFile
has such FlowFile Attributes, those can be replaced with FlowFile
Attribute values. But I suspect FlowFile doesn't have those attributes
since ReplaceText is
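A rough analogy of why the placeholders come out blank (this is not NiFi's actual EL engine, just an illustrative sketch assuming a simple `${name}` syntax): a reference to an attribute the FlowFile doesn't have evaluates to an empty string.

```python
import re

def resolve_el(text, attributes):
    """Replace ${name} references with attribute values; references to
    missing attributes become empty strings, mimicking the blank-value
    symptom described in this thread."""
    return re.sub(r"\$\{(\w+)\}",
                  lambda m: attributes.get(m.group(1), ""),
                  text)
```

So if `city_name` was never promoted to a FlowFile attribute (e.g. by the upstream CSV-handling step), `'${city_name}'` in the Replacement Value yields `''`.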
Hi there,
I've been playing with the ExtractGrok processor and noticed I was missing
some data that I expected to be extracted. After some investigation, it
seems that ExtractGrok extracts only the first line of the flowfile
content, and ignores the rest.
Is this expected behavior? I should be ab
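The observed difference can be sketched like this (illustrative Python, not ExtractGrok's code): applying the pattern only to the first line silently drops matches from the rest of the content.

```python
import re

# Hypothetical pattern standing in for a grok expression.
PATTERN = re.compile(r"(?P<level>ERROR|WARN)")

def extract_first_line_only(content):
    # Mimics the reported behavior: only line 1 is considered.
    m = PATTERN.search(content.splitlines()[0])
    return [m.group("level")] if m else []

def extract_all_lines(content):
    # What the poster expected: every line is scanned.
    return [m.group("level")
            for line in content.splitlines()
            for m in [PATTERN.search(line)] if m]
```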
Tian,
Ok - and was this with the 512MB heap again? Can you try with a 1GB
or 2GB heap and see if we're just looking at our minimum needs being
an issue or if we're looking at what sounds like a leak.
Thanks
On Mon, Sep 25, 2017 at 12:41 PM, Lou Tian wrote:
> Hi Joe,
>
> I tested with a simple
Hi Joe,
I tested with a simple flow file.
Only 4 processors: HandleHttpRequest, RouteOnContent, HandleHttpResponse
and DebugFlow.
I ran the test 3 times (10 minutes per run, at most 50 users).
It worked fine for the first 2 runs, but on the third run I got the error.
I copied part of the log file. Please
I updated the insert statement to be in a single line. Again it failed. I
checked the flow file.
INSERT INTO ADR_SUB_NIFI (enrlmt_id, city_name, zip_cd, state_cd) VALUES ('',
'', '', '')
What could be the reason for the values to be blank instead of actual values
from the CSV file?
From: kart
Hi Joe, Thanks for your reply.
I will try to do those tests. And update you with the results.
On Mon, Sep 25, 2017 at 3:56 PM, Joe Witt wrote:
> Tian
>
> The most common sources of memory leaks in custom processors
> 1) Loading large objects (contents of the flowfile, for example) into
> memory
Tian
The most common sources of memory leaks in custom processors
1) Loading large objects (contents of the flowfile, for example) into
memory through byte[] or doing so using libraries that do this and not
realizing it. Doing this in parallel makes the problem even more
obvious.
2) Caching objec
Hi Joe,
1. I will build a simple flow without our customised processor to test
again.
It is a good test idea. We saw the OOME was under HandleHttpRequest,
so we never thought about the others.
2. About our customised processor, we use lots of these customised
processors.
Properties are dynami
Tian,
Ok thanks. I'd try removing your customized processor from the
flow entirely and running your tests. This will give you a sense of
base nifi and the stock processors. Once you're comfortable with that
then add your processor in.
I say this because if your custom processor is using up
1. The HandleHttpRequest processor gets the message.
2. The message is routed to other processors based on the attribute.
3. Our customised processor processes the message.
4. The message is then redirected to the HandleHttpResponse.
On Mon, Sep 25, 2017 at 3:20 PM, Joe Witt wrote:
> What
What is the flow doing in between the request/response portion?
Please share more details about the configuration overall.
Thanks
On Mon, Sep 25, 2017 at 9:16 AM, Lou Tian wrote:
> Hi Joe,
>
> java version: 1.8.0_121
> heap size:
> # JVM memory settings
> java.arg.2=-Xms512m
> java.arg.3=-Xmx512
Hi Joe,
java version: 1.8.0_121
heap size:
# JVM memory settings
java.arg.2=-Xms512m
java.arg.3=-Xmx512m
nifi version: 1.3.0
Also, we run NiFi in Docker.
Kind Regards,
Tian
On Mon, Sep 25, 2017 at 2:39 PM, Joe Witt wrote:
> Tian,
>
> Please provide information on the JRE being used (java
Tian,
Please provide information on the JRE being used (java -version) and
the environment configuration. How large is your heap? This can be
found in conf/bootstrap.conf. What version of nifi are you using?
Thanks
On Mon, Sep 25, 2017 at 8:29 AM, Lou Tian wrote:
> Hi,
>
> We are doing perfo
Hi,
We are doing performance tests for our NiFi flow with Gatling. But after
several runs, NiFi always hits an OutOfMemory error. I did not find
similar questions in the mailing list; if you have already answered a
similar question, please let me know.
*Problem description:*
We have the Nifi flow. The
Aruna,
It seems there is a failure in your insert statement. Don't split the
Replacement Value (the query) in the ReplaceText processor across multiple
lines; try keeping it on a single line.
-Karthik
On Mon, Sep 25, 2017 at 4:20 PM, karthi keyan
wrote:
> Aruna,
>
> You can download the flow file to see whether your
Aruna,
You can download the flow file to see whether your query was passed
correctly, and try executing the same query directly against your datasource.
-Karthik
On Mon, Sep 25, 2017 at 4:04 PM, Aruna Sankaralingam <
aruna.sankaralin...@cormac-corp.com> wrote:
> I clicked on that as well but nothing seemed to happen.
>
I clicked on that as well but nothing seemed to happen.
Thanks
Aruna
On Sep 25, 2017, at 4:33 AM, Peter Wicks (pwicks)
<pwi...@micron.com> wrote:
Use the Download button right next to View, then open it in a text editor.
From: Aruna Sankaralingam [mailto:aruna.sankaralin...@cormac-corp.
Use the Download button right next to View, then open it in a text editor.
From: Aruna Sankaralingam [mailto:aruna.sankaralin...@cormac-corp.com]
Sent: Monday, September 25, 2017 9:54 AM
To: users@nifi.apache.org
Subject: Re: [EXT] New to Nifi - Failed to update database due to a failed
batch upd