Hi,
I have a few questions on NiFi as we are implementing it in our production
environment.
- How to perform version control among multiple developers
- What's the strategy for QA/Prod migration
- Job monitoring in NiFi
- Can NiFi flows be integrated with external schedulers (
> Although, your environment is likely different from ours. I hope these
> ideas
> help.
>
> On Wed, Aug 28, 2019 at 11:16 AM KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
>
>> Hi,
>>
>> I have few questions on nifi as we are implementing
Hi,
I am using the CSVReader to read the data from the CSV files.
[image: image.png]
Here is the sample data I have:
"A"|"""Teahous"" Beijing People's Art"|"""Teahous"""|"
With the above CSVReader, the " characters are completely removed. My expected
output should keep them: the doubled quotes should end up as "Teahous", i.e.
"Teahous" Beijing People's Art
>
> On Tue, Sep 17, 2019 at 5:35 PM KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
>
>> Hi,
>>
>> I am using the CSVReader to read the data from the csv files.
>>
Hi,
I have an existing flow and am trying to understand what sql.args.1.type is.
sql.args.1.type=11
sql.args.1.type=13
I understand that they correspond to the column data type for sql.args.1.value.
What are the data types for 11 and 13? May I know all the options
available for String, Integer,
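As far as I know, the sql.args.N.type values are the numeric constants from Java's java.sql.Types class (worth verifying against your NiFi version). A small lookup sketch of the common codes — note that 11 and 13 are not standard java.sql.Types values, so they may be driver-specific:

```python
# Common java.sql.Types constants (the standard JDBC type codes).
JDBC_TYPES = {
    -5: "BIGINT",
    1:  "CHAR",
    3:  "DECIMAL",
    4:  "INTEGER",
    6:  "FLOAT",
    8:  "DOUBLE",
    12: "VARCHAR",
    91: "DATE",
    92: "TIME",
    93: "TIMESTAMP",
}

def type_name(code: int) -> str:
    """Map a sql.args.N.type code to a JDBC type name."""
    return JDBC_TYPES.get(code, f"unknown ({code})")

print(type_name(12))  # VARCHAR
print(type_name(4))   # INTEGER
```

So a String argument would typically carry type 12 (VARCHAR) and an Integer type 4.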
Hi,
I am looking for an example where I need to push data to a Kafka topic and at
the same time use the Confluent schema registry to get the schema for this
topic.
Can anyone please share examples/templates if you have them? That would be
really helpful.
Thanks,
Asmath
>> the arguments with the query fields.
>>
>>> Em seg, 23 de set de 2019 às 20:02, KhajaAsmath Mohammed
>>> escreveu:
>>> Hi,
>>>
>>> I have existing flow and trying to understand what is sql.args.1.type ?
>>>
Hi,
I have a use case where I need to remove the files only after the database put
is successful.
Currently I have: GetFile (with Keep Source File = true) --> PutDatabase -->
ExecuteStreamCommand to remove the successfully processed file from the NiFi
file server.
May I know how to achieve this without using ExecuteStreamCommand?
Hi,
I am using the Jackson parser inside the CSVReader before loading the data
into the database. I see weird behavior where the double quotes are removed
for the string below, while other strings work fine. May I know what the
reason could be?
"2160 University Ave (""Chulo"")" --> converted to 2160 University Ave
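For comparison, here is how an RFC 4180 parser handles that line when doubled-quote escaping is enabled (Python's csv module used only as a reference parser; how NiFi's CSVReader is configured — escape character vs. double-quote mode — is the thing to check):

```python
import csv
import io

# RFC 4180: a literal double quote inside a quoted field is written as "".
# With doublequote handling enabled, the inner quotes are preserved:
line = '"2160 University Ave (""Chulo"")"'
row = next(csv.reader(io.StringIO(line), delimiter='|', quotechar='"',
                      doublequote=True))
print(row[0])  # 2160 University Ave ("Chulo")
```

If the reader is instead configured with an escape character equal to the quote character (rather than double-quote mode), different parsers disagree, which can make the inner quotes disappear.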
Hi,
I have an existing flow that replaces a value in the JSON document. I
understand what it does, but I would like to know what (?s) and .*? do and
how to learn more about these constructs in the search pattern. Any
guidance would really help me.
(?s)("eventTime"\s*:\s*)("(.*?)") with
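These are regular-expression constructs rather than Expression Language: (?s) turns on DOTALL mode so . also matches newlines, and .*? is a lazy quantifier that matches as little as possible, stopping at the first closing quote. A small demonstration with an illustrative JSON document:

```python
import re

# (?s) = DOTALL: '.' also matches newlines.
# .*?  = lazy: stops at the first '"' instead of the last one.
pattern = r'(?s)("eventTime"\s*:\s*)("(.*?)")'
doc = '{"eventTime" : "2019-01-01T00:00:00Z", "id": "x"}'

# Keep group 1 (the key and colon) and swap in a new quoted value:
updated = re.sub(pattern, r'\g<1>"1970-01-01T00:00:00Z"', doc)
print(updated)
```

Without the lazy ?, the greedy .* would run past the first closing quote and swallow the rest of the line.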
Thanks. You were right .
Sent from my iPhone
> On Nov 20, 2019, at 6:10 PM, Bimal Mehta wrote:
>
> Click on the Process Group and open Variable Registry. Most likely it's
> defined as Process Group variable inside Variable Registry
>
>> On Wed, Nov 20, 2019, 1:24 PM
Hi,
I have a requirement to read a file from a file server and delete all records
from the database if the filename is present in the database. The next step is
to load the data from this file.
This is more like delete/insert. I cannot do upserts because the new file
can have more/fewer records after correction from the
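The delete/insert pattern can be sketched like this (SQLite stands in for the real database; table and column names are hypothetical). Doing both statements in one transaction keeps the table consistent if the insert fails:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grades (filename TEXT, student TEXT, score REAL)")
conn.execute("INSERT INTO grades VALUES ('f1.txt', 'a', 1.0)")

def reload_file(conn, filename, records):
    # One transaction: delete + insert succeed or fail together.
    with conn:
        conn.execute("DELETE FROM grades WHERE filename = ?", (filename,))
        conn.executemany("INSERT INTO grades VALUES (?, ?, ?)", records)

# The corrected file replaces whatever was loaded for that filename before:
reload_file(conn, "f1.txt", [("f1.txt", "a", 2.0), ("f1.txt", "b", 3.0)])
print(conn.execute("SELECT COUNT(*) FROM grades").fetchone()[0])  # 2
```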
Hi,
Could anyone please share your experience reading a KSQL table in
NiFi? We use a KSQL table in the Confluent version of Kafka to manage updates,
and I would like to view its data using NiFi. Any suggestions?
Thanks,
Asmath
>>
>> https://crontab.guru/#1-59_*_*_*_*
>>
>> Sent from my iPhone
>>
>> On Jan 13, 2020, at 6:06 PM, KhajaAsmath Mohammed <
>> mdkhajaasm...@gmail.com> wrote:
>>
>>
>> Hi,
>>
>> I am trying to schedule job in NIFI through
doing any parsing and see what the message looks like.
>
> HTH
> Pierre
>
>> Le mer. 5 févr. 2020 à 16:27, KhajaAsmath Mohammed
>> a écrit :
>> Hi,
>>
>> I was able to consume messages from consumekafkarecord processor. Suddenly I
>> started
Hi,
I have a requirement where I need to add the filename to the attributes so
that it would be easy for us to load into a table by matching attribute names
to columns in the database.
In this case we have 70+ JSON files having different attributes.
"session": {
"filename":"course",
"id": "
I was able to get it through JOLT using the spec below. I still need to add
the attribute extracted from the JSON instead of the literal "test".
[
  {
    "operation": "shift",
    "spec": {
      "*": "test_&"
    }
  }
]
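In plain terms, that shift spec copies every top-level key to a new key with a prefix; the prefix would come from a flow-file attribute (e.g. the extracted filename) rather than the literal "test". A plain-Python sketch of the same transformation:

```python
# Mimics the JOLT shift spec {"*": "prefix_&"}: every top-level key is
# renamed with a prefix taken from an attribute (illustrative names only).
def prefix_keys(record: dict, prefix: str) -> dict:
    return {f"{prefix}_{k}": v for k, v in record.items()}

print(prefix_keys({"id": 1, "name": "x"}, "course"))
# {'course_id': 1, 'course_name': 'x'}
```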
On Thu, Jan 30, 2020 at 6:05 PM KhajaAsmath Mohamme
Hi,
I am using the text below to convert CSV to JSON using the ConvertRecord
processor. A value of 99.99 changed to 100.0; may I know how this is
possible?
OPERATING_UNIT|TEST_34_2|TEST_34_15|TEST_34_5
"141516"|"1.1"|"1.1"|"1.1"
"141517"|"1.11"|"1.11"|"1.11"
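One plausible cause (an assumption, not confirmed): the schema was inferred from sample rows like "1.1" as a decimal with scale 1, so 99.99 gets rounded to one fractional digit when written. That rounding looks like this:

```python
from decimal import Decimal, ROUND_HALF_UP

# A decimal type with scale 1 keeps one digit after the point,
# so 99.99 rounds up to 100.0:
value = Decimal("99.99").quantize(Decimal("0.1"), rounding=ROUND_HALF_UP)
print(value)  # 100.0
```

If that is what is happening, pinning an explicit schema (with enough precision/scale) instead of relying on inference should keep 99.99 intact.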
Hi,
I am looking for a solution to do an incremental load from the database once
the initial load is done. The increments should happen automatically within
the flow. Any suggestions?
Does CaptureChangeMySQL help in this case?
Thanks,
Asmath
Sent from my iPhone
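For reference, the usual incremental pattern (what QueryDatabaseTable's "Maximum-value Columns" property does) is to remember the highest value seen for a monotonic column and fetch only newer rows on each run. A sketch with SQLite and hypothetical names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "a"), (2, "b")])

last_max = 0  # persisted state between runs

def fetch_increment(conn):
    """Fetch only rows newer than the remembered maximum id."""
    global last_max
    rows = conn.execute(
        "SELECT id, payload FROM orders WHERE id > ? ORDER BY id", (last_max,)
    ).fetchall()
    if rows:
        last_max = rows[-1][0]
    return rows

print(len(fetch_increment(conn)))  # 2  (initial load)
conn.execute("INSERT INTO orders VALUES (3, 'c')")
print(len(fetch_increment(conn)))  # 1  (only the new row)
```

Note this catches inserts (and updates, if keyed on a last-modified column) but not deletes; deletes need a CDC-style approach.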
want to capture deletes and updates?
>
> Thanks,
> Pierre
>
>> Le ven. 21 févr. 2020 à 05:16, KhajaAsmath Mohammed
>> a écrit :
>> Hi,
>>
>> I am looking for solution to do incremental load from database once the
>> initial load is done. Inc
Hi Community,
I am looking for some information on how to access Kafka offsets, keys, and
the entire value of a Kafka message using ConsumeKafkaRecord.
Is there any other processor to consume messages from a particular offset?
Also, what happens if there is an error after consuming messages? Assume there
is
f a Float.
>
> Thanks
> -Mark
>
>
>> On Feb 19, 2020, at 4:42 PM, Shawn Weeks wrote:
>>
>> How are you defining the schema and what data type are setting for that
>> column?
>>
>> Thanks
>> Shawn
>>
>> From: KhajaAsmat
Hi,
How do I call a stored procedure that accepts parameters in NiFi? I have seen
solutions using ExecuteProcess or Python scripts, but is there a way to do it
with processors?
Thanks,
Asmath
Hi,
I have a requirement where I need to parse JSON based on an attribute value.
This will eventually end up as 150+ routes in RouteOnAttribute. Is there a
good approach to deal with it?
Thanks,
Asmath
Sent from my iPhone
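One common alternative to a wall of route conditions is a single lookup keyed by the attribute value (in NiFi terms, something like a LookupRecord/LookupAttribute driven flow). The idea in plain Python, with illustrative names only:

```python
# Instead of 150+ separate RouteOnAttribute conditions, one table maps the
# attribute value to its handler/target.
HANDLERS = {
    "session": lambda d: ("session_table", d),
    "course":  lambda d: ("course_table", d),
    # ... one entry per document type, data-driven rather than hard-coded
}

def route(attr_value: str, doc: dict):
    handler = HANDLERS.get(attr_value)
    if handler is None:
        return ("unmatched", doc)   # equivalent of the 'unmatched' relationship
    return handler(doc)

print(route("course", {"id": 1})[0])  # course_table
```

The mapping can then live in a file or database instead of the flow definition, so adding a new type does not mean editing the flow.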
d as a result determine that the “job is done” and
> send a FlowFile to Notify to unblock the waiting FlowFiles.
>
> Hope this helps
> -Mark
>
>
>> On Apr 28, 2020, at 3:07 AM, KhajaAsmath Mohammed
>> wrote:
>>
>> Hi,
>>
>> We are usin
Hi,
How do I connect two processors properly as shown below?
[image: image.png]
When I connect them, the connections don't look good.
Thanks,
Asmath
Hi,
Is there a way to get the key, timestamp, and offset in ConsumeKafkaRecord?
Offset: 380050 Key:
urn:uuid:eb3b516b-9a31-4365-8ccb-e8c009073f63|+|2020-05-05
Timestamp: 2020-05-05 03:23:54.210 Headers: empty
>
> On Sep 2, 2020, at 10:35 AM, KhajaAsmath Mohammed
> wrote:
>
> Here is the error that I am getting.
>
>
>
>
Hi,
I am having an issue using the tab-delimited CSV parser in NiFi. I need to
convert this CSV to JSON before loading it into the queue. I am getting an
exception on this data. Any help on how to resolve this?
*Sample Data:*
SUBMISSION_ID ASSIGNMENT_ID COURSE_ID ENROLLMENT_TERM_ID USER_ID GRADER_ID
On Wed, Sep 2, 2020 at 9:33 AM Mark Payne wrote:
> Asmanth,
>
> What is the exception that you’re seeing?
>
> On Sep 2, 2020, at 10:28 AM, KhajaAsmath Mohammed
> wrote:
>
> Hi,
>
> I am having an issue using the Tab d
Hi,
I am trying to insert JSON data using PutDatabaseRecord. All the
columns of the JSON are inserted as expected, but one column, which has
nested JSON data, is inserted as a CLOB. How do I resolve this and force
PutDatabaseRecord to insert only a string?
[image: image.png]
I need to insert and read
Hi,
I am seeing a lot of issues with our data when using PutDatabaseRecord with
the JsonTreeReader. The JSON is valid when I check the actual data, but NiFi
is complaining about a non-closeable input stream. May I know how to resolve
this?
Here is sample json and error message
error = 'Unrecognized character
Hi,
I am stuck with an implementation where I need to load the entire JSON data
into the database using NiFi. I was able to flatten the JSON and load it, but
at the same time I need to load the entire JSON text into another field.
Here is the approach I tried:
Load the entire JSON into an attribute using
Hi,
I am extracting text using the expression below in NiFi and it results in 3
attributes.
Is there a way to get only one attribute?
event_bag
*event_bag.0*
*event_bag.1*
[image: image.png]
Thanks,
Asmath
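ExtractText creates one attribute per capture group, plus <name>.0 for the whole match, which is why three attributes appear. Making the extra group non-capturing with (?:...) leaves a single group. Illustrated with a made-up pattern and input:

```python
import re

text = 'event_bag={"a":1}'

# Two capture groups -> two extracted values (plus the whole match):
m = re.search(r'(event_bag)=(\{.*\})', text)
print(m.groups())   # ('event_bag', '{"a":1}')

# (?:...) is non-capturing -> only one extracted value:
m2 = re.search(r'(?:event_bag)=(\{.*\})', text)
print(m2.groups())  # ('{"a":1}',)
```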
Hi,
I have seen an issue where Wait/Notify processor performance is very
slow when the number of files is greater than 1000. It runs faster if it is
fewer than 1000. Is there a setting on the map cache client service that can
speed this up?
[image: image.png]
Thanks,
Asmath
Hi,
I am looking for some information on how to check the data types of the data
and transform/load accordingly. I am okay with using any other processor too.
My requirements:
Check if the column is an integer; if it is, load it into the _INT column,
else a null value.
Check if the column length is > 256; if more than 256,
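The rules above can be sketched as a small classifier (column names here are placeholders; in NiFi this could become UpdateRecord/QueryRecord logic or Expression Language conditions):

```python
def classify(value: str) -> str:
    """Decide which target column a raw string value belongs in."""
    try:
        int(value)
        return "_INT"          # integers go to the _INT column
    except ValueError:
        pass
    if len(value) > 256:
        return "_LONG_TEXT"    # overlong strings go to a wide column
    return "_TEXT"

print(classify("42"))          # _INT
print(classify("hello"))       # _TEXT
print(classify("x" * 300))     # _LONG_TEXT
```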
Hi,
I have a folder where I need to pick up only the files that end with .txt
using GetFile. I can read everything and use RouteOnAttribute, but then we
will lose the files from the folder.
Thanks,
Asmath
Hi,
I have a use case where data is read using GetFile from a file location and
loaded into the database. I would like a trigger once the
database load is successful for all the files.
I tried the Wait/Notify approach, but it does not do what I want, as it works
on individual files. Let's
Hi,
I am looking for a way to hold flow files for 2 minutes
and later merge them using a merge processor. Is there a processor or
script to achieve this?
Thanks,
Asmath
Hi,
I am using PutDatabaseRecord with the JsonTreeReader to insert data into the
database. This works great, but is there a way to get the column name in
the error? I need to open the file and read the text to find the column.
putdatabaserecord.error
SAP DBTech JDBC: Value too large for column:
timed schedule.
>
> In my case, I follow this with a record based query that deduplicates the
> data.
>
> Dave
>
>> On Sun, Oct 25, 2020, 12:00 PM KhajaAsmath Mohammed
>> wrote:
>> Hi,
>>
>> I am looking for a use case to wait for processor/flow
r a specific amount
> of time after receiving the first flow file? Why would a scheduled run every
> 2 min not work? Is the issue that you need all related flowfiles merged
> together?
>
> Dave
>
>> On Mon, Oct 26, 2020, 9:25 PM KhajaAsmath Mohammed
>> wro
>> On Sun, Oct 25, 2020 at 1:58 PM KhajaAsmath Mohammed
>> wrote:
>>
>> Hi,
>>
>> I am using putddabase record with json tree reader to insert data into
>> database. This works great but is there a possibility to get column name in
>> the er
Hi,
I generally see errors for PutSQL at the top right of the processor. Is
there a way to get the errors into an attribute so that I can use them and
log to our audit table?
Any other alternative? PutDatabaseRecord provides such an attribute, but it
is not available with PutSQL.
Thanks,
Asmath
t to *${NIFI_ROOT}**/**nifi-app.log* too.
>
> Russ
>
> On 8/3/20 8:56 AM, KhajaAsmath Mohammed wrote:
>
> Hi,
>
> I generally get errors on putsql on the top right for this processor. Is
> there a way to get errors inside the attribute so that I can use it and log
> in our a
Hi,
I have a file that throws an error when I look for a particular string
and replace it with another string:
Requested array size exceeds VM limit
Any suggestions for this? The file is around 800 MB.
Thanks,
Asmath
be
> works very well, leverages appropriate RAID, and is proven highly reliable
> and durable.
>
> Thanks
>
> On Tue, Aug 11, 2020 at 7:26 AM KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
>
>> Hi,
>>
>> [image: image.png]
>>
>> we have 3
Hi,
[image: image.png]
We have a 3-node NiFi cluster, and for some reason NODE 2 and NODE 3 were
disconnected while the flow was running. ConsumeKafka was reading data
with all-node settings and loading the data into the database.
In the above scenario, is there a possibility of loss of
Hi,
I am looking at the approach below; any ideas on this, please?
GetFile --> count of files (GetFile folder) --> pass the count as an attribute to all
flow files.
Thanks,
Asmath
Hi,
How can I capture the execution error on ExecuteSQL and route it to a
different queue when there is a failure on the database?
executesql.error.message = 'com.sap.db.jdbc.exceptions.JDBCDriverException:
SAP DBTech JDBC: Cannot connect to jdbc:sap://hana-xx:30041 [Cannot
connect to host
standing your question. If you like please send more
> details.
>
> LC
>
>
>
>> El jue, 11-06-2020 a las 12:11 -0500, KhajaAsmath Mohammed escribió:
>> Hi,
>>
>> How can I capture the execution error on executesql and route it to
>> different
30, 2020 at 11:22:29, KhajaAsmath Mohammed (
> mdkhajaasm...@gmail.com) wrote:
>
> Hi,
>
> I am looking for some information on how to do retry logic on
> restapi until we get specific status code. Please let me know if you have
> any approach/templates for this
>
> Thanks,
> Asmath
>
>
l handle the delay for you (see the
> User's Guide section on penalized flowfiles).
>
> Regards,
> Matt
>
> On Thu, Jul 30, 2020 at 3:38 PM KhajaAsmath Mohammed
> wrote:
> >
> > can you please let me know how to use this in NIFI.
> >
>
>> On Thu, Oct 29, 2020 at 5:12 PM KhajaAsmath Mohammed
>> wrote:
>>
>> Hi,
>>
>> I have a requirement where I need to get the file count from the path using
>> the groovy script.
>>
>> I came up with the below but unable to fil
Hi,
I have a requirement where I need to get the file count from a path using
a Groovy script.
I came up with the script below but am unable to filter and count only .txt
files. Any suggestions please?
// ExecuteScript (Groovy); the directory path is a placeholder
def flowFile = session.get()
if (flowFile != null) {
    def count = new File('/data/in').listFiles().count { it.name.endsWith('.txt') }
    flowFile = session.putAttribute(flowFile, 'txt.count', count.toString())
    session.transfer(flowFile, REL_SUCCESS)
}
Hi,
I have a scenario where I need to get a value from the database and pass it
as an attribute to GetFile in subsequent processors.
GetFile >> ExecuteSQL/PutSQL >> get the value from the output of the SQL and
assign it to an attribute >> pass the attribute value to GetFile.
Any help please?
Thanks,
Hi,
I have ReplaceText processors with the settings below. I want to replace them
with only one processor and use an OR condition in the search pattern. Any
help on how to do this?
[image: image.png]
[image: image.png]
Thanks,
Asmath
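In a regular expression (which is what ReplaceText's search value is), several search terms can be combined with the alternation operator |, so one processor can cover what two did. A demonstration with placeholder search terms:

```python
import re

# 'foo', 'bar' and 'baz' stand in for the two (or more) original
# search patterns; '|' means "match any of these".
pattern = re.compile(r"foo|bar|baz")
print(pattern.sub("X", "foo and bar and baz"))  # X and X and X
```

If the original terms contain regex metacharacters, they need escaping before being joined with |.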
Hi,
Yes, it is case sensitive in SAP HANA. The table name is the same as shown in
the error message. There should be another alternative, but nothing works in
this case for me.
Thanks,
Asmath
On Wed, Jan 20, 2021 at 9:53 AM Peter Turcsanyi
wrote:
> Hi Asmath,
>
> Scenarios 2 and 3 should work.
>
Creating a synonym for the table name in the database and calling that with
the database put processor fixed it.
Thanks
Asmath
On Wed, Jan 20, 2021 at 1:38 PM KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:
> Hi,
>
> Yes, it is case sensitive in SAP HANA. Table name is the s
Hi,
I am having a hard time inserting data into a table that has special
characters in its name. I am able to insert over the same connection when the
table name has no special characters. Any suggestions on how to resolve
this?
I tried quoting the table name too, but it didn't help me. Any
Hi,
Is there a way to get the week of year from the QueryRecord processor? Any
syntax would really help.
Timestamp: 2021-02-01 12:12:12
Thanks,
Khaja
Sent from my iPhone
('-MM-dd HH:mm:ss'):format('w')}
>
> Thanks
> -Mark
>
> > On Feb 1, 2021, at 9:13 PM, KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
> >
> > Hi,
> >
> > Is there a way to get weekofyear from the query record processor. Any
> sy
Hi Vibhath,
After a lot of research, I did it this way.
Let's say you have 50 tables that need to be processed. I keep a status of
loading, success, or fail for each file being processed. If any one of the
flow files fails, then the whole process is considered failed. I keep checking
the status
Hi,
I have the flow below, but I will be losing the properties that I need. Any
help on how to keep them?
GenerateFlowFile -- has JSON data that I will be using as properties.
GenerateFlowFile --> ExecuteSQLRecord (JSON Record Writer) (the JSON from
GenerateFlowFile is lost) --> retrieve BatchNumber
Hi,
I have a requirement where I need to ignore the flow files that have zero
records coming out of the ExecuteSQL processor, and
process only the files which have results from ExecuteSQL. Any suggestions
please?
Thanks,
Khaja
fi-docs/components/org.apache.nifi/nifi-standard-nar/1.13.2/org.apache.nifi.processors.standard.RouteOnAttribute/index.html
>
>
> ---
> *Chris Sampson*
> IT Consultant
> chris.samp...@naimuri.com
> <https://www.naimuri.com/>
>
>
> On Fri, 23 Apr 2021 at 22:16, K
Hi,
I have a use case where I need to update the source system with a success or
fail status after processing in NiFi. Any suggestions? I have one approach in
mind: maintain an audit table of processed tables and continuously check
whether a table has an error or loading status. I don't like this
(${filename:length():equals(5)}):or(${filename:endsWith('.txt')})}"
>
> https://nifi.apache.org/docs/nifi-docs/html/expression-language-guide.html#or
>
> Cheers,
>
> Chris Sampson
>
>> On Sat, 24 Apr 2021, 17:12 KhajaAsmath Mohammed,
>> wrote:
>
Hi,
I have a use case where I need to route based on multiple OR conditions.
Instead of writing multiple separate conditions in RouteOnAttribute, I want to
combine all those attributes with OR conditions. Is this possible?
Thanks,
Khaja
Hi,
I am looking for some information on how to improve the performance of our
flows. Any suggestions?
1. How do I troubleshoot which processor in the flow is causing an issue?
2. What is the best way to increase the threads or tasks for a particular
processor?
3. How do I debug the flow when there is
Hi,
I am looking for some help on how to deal with upserts/updates on SQL
Server with NiFi.
I get a flow file where the records are updated. Primary keys are already
defined on the table.
I don't want to try the delete-and-insert logic; is there a way to
handle upserts automatically with this
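On SQL Server itself the usual tool is a MERGE statement keyed on the primary key. As a runnable sketch of the same upsert idea, here is SQLite's INSERT OR REPLACE (semantics differ from MERGE — REPLACE deletes and re-inserts the row — so this is only an illustration; table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'old')")

# Upsert: the row keyed by id=1 is replaced with the updated record.
conn.execute("INSERT OR REPLACE INTO t (id, val) VALUES (?, ?)", (1, "new"))
print(conn.execute("SELECT val FROM t WHERE id = 1").fetchone()[0])  # new
```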
Hi,
I am having an issue with the Avro writer, which gets automatically populated
with different data types.
In my case, a big decimal in Oracle is converted to a double by the Avro
writer, and the logical data type causes an issue when loading the data back
into the other database. Values get changed.
In order to
Hi,
Is there a way to automatically truncate a timestamp column to 21 characters
for any column? Columns can be dynamic; names can change frequently.
2020-05-29 23:08:44.541744 should be changed to 2020-05-29 23:08:44.5
for SQL Server. SQL Server can have a maximum of 23 characters. Any
|N|OT|N|" "|P|OT|0|"
"|" "|UCSD|" "|" "|OT|" "|2020-11-01 00:00:00.0|2020-11-01
00:00:00.0||2020-11-01 00:00:00.0|2020-11-01 00:00:00.0||N|||2021-10-31
00:00:00.0|N|*2021-02-02 09:01:24.847756*|UC_BATCH|2021-02-02
09:01:25.0|2625|2021-02-0
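Treated as a string, the truncation itself is just a slice of the fixed-width prefix (19 characters of date/time, then the fractional digits); whether the flow should do this with an Expression Language substring or a record-level format is a separate question:

```python
ts = "2020-05-29 23:08:44.541744"

# 19 chars of "YYYY-MM-DD HH:MM:SS", '.', then fractional digits:
print(ts[:21])  # 2020-05-29 23:08:44.5    (one fractional digit)
print(ts[:23])  # 2020-05-29 23:08:44.541  (SQL Server's 23-char maximum)
```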
"Timestamp Format" properties) can be applied to them in the
> NiFi flow.
>
> Hope it helps. Let us know if you are able to configure your flow to
> produce the date format you want.
>
> Best,
> Peter
>
> On Mon, Mar 1, 2021 at 7:35 PM KhajaAsmath Mohammed <
>
Hi,
I have a use case where I need to do an incremental fetch on Oracle
tables. Is there an easy way to do this? I saw some posts about
QueryDatabaseTable; I want to check if there is a more efficient way to do
this.
Khaja
Hi,
I have an issue with the Oracle driver where the ExecuteSQL processor is
converting all the data types to String. The Avro writer used internally by
ExecuteSQL is writing all the fields as string type.
I want to limit the timestamp field from Oracle to 23 characters:
2021-02-02 09:01:24.847756 to
ards,
> Peter
>
> On Mon, Mar 1, 2021 at 6:42 PM KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com> wrote:
>
>> Hi,
>>
>> I have an issue where the csvrecordwriter is automatically converting
>> data from date to number. how to resolve this?
>>
Hi,
I have an issue where the CSV record writer is automatically converting data
from a date to a number. How do I resolve this?
Any suggestions to change this?
[image: image.png]
Source : Oracle with Date format
[image: image.png]
Target: Sql-server into Date format
Thanks,
Asmath
Please ignore; I was able to figure it out. I was missing a . before the *.
On Tue, Feb 23, 2021 at 4:19 PM KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:
> Hi Joe,
>
> I have used the below but it is not reading data from topics. I want to
> read data from all the t
afka-2-6-nar/1.13.0/org.apache.nifi.processors.kafka.pubsub.ConsumeKafka_2_6/index.html
> Look at 'Topic Name(s)' and 'Topic Name Format' with a provided naming
> pattern.
>
> Thanks
>
> On Tue, Feb 23, 2021 at 1:55 PM KhajaAsmath Mohammed
> wrote:
> >
> > Hi,
Hi,
I am planning to consume multiple Kafka topics that start with a
particular word. May I know how to do this?
I don't want to enter all the names.
Asmath
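When the topic name property is set to pattern mode, the value is a regular expression matched against topic names, so a shared prefix plus .* covers all of them (the prefix below is a placeholder):

```python
import re

# "sales_" stands in for the shared prefix of the real topic names.
pattern = re.compile(r"sales_.*")
topics = ["sales_us", "sales_eu", "hr_payroll"]
print([t for t in topics if pattern.fullmatch(t)])  # ['sales_us', 'sales_eu']
```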