Hello,
I wanted to update my processor, but I can't seem to find the Maven archetype which provides the serialization classes.
Can anybody help, please?
rgds,
Uwe
se back to the main flow.
Regards,
Matt
[1] https://issues.apache.org/jira/browse/AVRO-2065
[2] https://issues.apache.org/jira/browse/AVRO-1891
On Thu, Oct 19, 2017 at 1:00 PM, Uwe Geercken wrote:
> Hello,
>
> I have a problem with a timestamp field and the avro schema. the field is
>
Hello,
I have a problem with a timestamp field and the Avro schema. The field is defined as:
{"name": "AAN", "type": [{"type": "long","logicalType": "timestamp-millis"},"null"]},
The Avro specs say:
"A timestamp-millis logical type annotates an Avro long, where the long store
pache.nifi/nifi-scripting-nar/1.4.0/org.apache.nifi.lookup.script.ScriptedLookupService/index.html
2. https://gist.github.com/jfrazee/ff9bcd859227d6939aca7faa9d54f2e2
On Oct 13, 2017, 2:20 PM -0500, Uwe Geercken , wrote:
Hello,
I am looking for some advice: I have Nifi sending flowfiles
Hello,
I am looking for some advice: I have Nifi sending flowfiles to Kafka. As we know for Kafka everything is an "insert". Messages are inserted into the Kafka log.
Now I wonder what is the best way to insert OR update a relational database table from Kafka messages using Nifi. What is th
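For what it's worth, the insert-or-update semantics being asked about can be sketched outside NiFi with a plain SQL upsert. The table and column names below are invented, and the exact syntax is database specific (SQLite shown; MySQL uses ON DUPLICATE KEY UPDATE, Oracle and SQL Server use MERGE):

```python
import sqlite3

# Invented example table standing in for the target relational table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

def upsert(row_id, name):
    # SQLite's upsert clause (requires SQLite >= 3.24): insert the row,
    # or update it in place if the primary key already exists.
    conn.execute(
        "INSERT INTO customer (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        (row_id, name),
    )

upsert(1, "alice")   # first Kafka message with this key: insert
upsert(1, "alicia")  # later message with the same key: update
assert conn.execute("SELECT name FROM customer WHERE id = 1").fetchone()[0] == "alicia"
```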
descendent components. To create a service that is available to all components in the flow, define it in the root Process Group.
>
> https://community.hortonworks.com/content/kbentry/90259/understanding-controller-service-availability-in-a.html
>
> Matt
>
> On Thu, Oct 12, 2017 at 2
Hello,
I have been using Nifi for quite a while, but I have never understood how the inheritance of the controller settings works.
If I have a blank flow and add controller settings - e.g. a DBCPConnectionPool - and then add a process group, then the controller settings are not visible. Why is th
t 4, 2017 at 11:22 AM, Uwe Geercken wrote:
>
> Mark,
>
> I stopped 1.4 and started 1.3 and created the same flow. And in 1.3 it works
> without a problem.
>
> And I looked into the 1.4 log (I should have looked before, but only looked
> at the processor's tooltip). Y
erties in bootstrap.conf.
>
> In the properties file approach you had to restart NiFi for any
> changes in the properties to be picked up.
>
> In the new approach you no longer need to restart anything yourself,
> it will be handled for you.
>
> -Bryan
>
> On Wed, Oct 4,
er need to restart anything yourself,
it will be handled for you.
-Bryan
On Wed, Oct 4, 2017 at 8:20 AM, Uwe Geercken wrote:
> Hello,
>
> in 1.4 when you right-click the processor group then there is an entry
> "variables". I have defined some, but I wonder how I ca
Record. Can you check your logs and see what other error may be present
in the logs?
Thanks
-Mark
On Oct 4, 2017, at 10:39 AM, dan young <danoyo...@gmail.com> wrote:
It might be...maybe others can share experience with 1.4...
On Wed, Oct 4, 2017, 8:37 AM Uwe Geercken
d with the variable value.
greetings,
Uwe
Sent: Wednesday, 04 October 2017 at 14:20
From: "Uwe Geercken"
To: nifi
Subject: nifi 1.4 - variables
Hello,
in 1.4 when you right-click the processor group then there is an entry
"variables". I have defined some, but I
On Wed, Oct 4, 2017, 8:13 AM Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello,
I have created a flow: GetFile >> QueryRecord >> PutFile. GetFile reads an avro file. QueryRecord has one property/sql and the result is routed to PutFile.
When I run the processor, I get follo
Hello,
I have created a flow: GetFile >> QueryRecord >> PutFile. GetFile reads an avro file. QueryRecord has one property/sql and the result is routed to PutFile.
When I run the processor, I get the following error:
failed to process session due to java.lang.IllegalStateException
al
Hello,
In 1.4, when you right-click the process group, there is an entry "variables". I have defined some, but I wonder how I can use them.
I have not found documentation on this. Does somebody have details or a link to documentation or an example?
Rgds,
Uwe
full table. Every row would then be hashed/digested or in any other way uniquely identified and 2 datasets would be crossed and compared to find inserts/deletes/updates. It was involved, but worked.
Andrew
On Sat, Sep 16, 2017, 2:38 AM Uwe Geercken <uwe.geerc...@web.de> wrote:
Bry
this
case, which unfortunately usually becomes database specific.
I believe we have a processor CaptureChangeMySQL that can process the
MySQL change log.
-Bryan
On Tue, Sep 12, 2017 at 1:39 PM, Uwe Geercken wrote:
> Hello,
>
> apparently the QueryDatabaseTable processor catches chang
Hello,
I have spent some time working with the QueryDatabaseTable. My Flow is very simple - I query Oracle and output to Avro files.
I want to retrieve all records from the Oracle table containing 307714 records. I had a look at the Oracle table and looked for the lowest auto-generated id (
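A toy sketch of the max-value-column strategy that QueryDatabaseTable uses: remember the highest id seen so far and fetch only rows above it on the next run. The table and data below are invented for illustration:

```python
import sqlite3

# Invented source table with an auto-generated id column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 6)])

last_seen = 0  # NiFi keeps the equivalent of this in processor state

def fetch_increment():
    # Fetch only rows beyond the highest id already processed.
    global last_seen
    rows = conn.execute(
        "SELECT id, payload FROM src WHERE id > ? ORDER BY id", (last_seen,)
    ).fetchall()
    if rows:
        last_seen = rows[-1][0]
    return rows

assert len(fetch_increment()) == 5         # first run: all 5 rows
conn.execute("INSERT INTO src VALUES (6, 'row6')")
assert fetch_increment() == [(6, "row6")]  # second run: only the new row
```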
"Inherit Record Schema"
- "Schema Write Strategy" = "Set 'avro.schema' Attribute"
This way, you don't have to have the schema in registry, and result
CSV FlowFile has 'avro.schema' attribute inheriting the one created by
QueryDatabaseTable.
Hello,
apparently the QueryDatabaseTable processor catches changes made to the data of the source database - updates and inserts.
Does anybody have a good idea or strategy for handling deletes in the source database? Of course one could flag a record as deleted instead of physically deleting it.
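The soft-delete idea mentioned above can be sketched like this (invented table; the deleted flag plus a bumped change-tracking column is what lets an incremental query pick the "delete" up as a change):

```python
import sqlite3

# Invented source table with a soft-delete flag and a change-tracking column.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE src (id INTEGER PRIMARY KEY, payload TEXT, "
    "deleted INTEGER DEFAULT 0, updated_at INTEGER)"
)
conn.execute("INSERT INTO src VALUES (1, 'a', 0, 100)")

# "Delete" row 1: set the flag and bump updated_at, so a max-value query
# on updated_at (as QueryDatabaseTable would run) sees the change.
conn.execute("UPDATE src SET deleted = 1, updated_at = 200 WHERE id = 1")

changed = conn.execute(
    "SELECT id, deleted FROM src WHERE updated_at > 100").fetchall()
assert changed == [(1, 1)]  # the deletion surfaces as a changed row
```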
Hello,
I was wondering: if the QueryDatabaseTable processor internally creates an Avro schema, why is this schema not available as an attribute or saved to the registry?
If it were, one could reuse the schema. E.g. if I use the ConvertRecord processor and I specify an AvroReader as
se value would be the topic name for the
current incoming message.
Regards,
Matt
On Thu, Jun 22, 2017 at 3:40 PM, Uwe Geercken wrote:
> Hello,
>
> besides my other problem with the ConsumeKafkaRecord_0_10 processor, I have another question.
>
> Using the AvroSchemaRegistry 1.3.0, I
eal" error
> message is getting lost or perhaps further down in your stack trace,
> but it looks like an error with finding or reading the schema.
>
> Regards,
> Matt
>
>
> On Thu, Jun 22, 2017 at 3:30 PM, Uwe Geercken wrote:
>> Hello everyone,
>>
>>
x27;s handling of the
Kafka input which might be empty? That's all the guessing I'll do on
the Kafka stuff, I'll leave it to the folks that know much more about
it :)
Regards,
Matt
On Thu, Jun 22, 2017 at 3:54 PM, Matt Burgess wrote:
> Uwe,
>
> It looks like this error is di
"real" error
> message is getting lost or perhaps further down in your stack trace,
> but it looks like an error with finding or reading the schema.
>
> Regards,
> Matt
>
>
> On Thu, Jun 22, 2017 at 3:30 PM, Uwe Geercken wrote:
>> Hello everyone,
>>
age is getting lost or perhaps further down in your stack trace,
but it looks like an error with finding or reading the schema.
Regards,
Matt
On Thu, Jun 22, 2017 at 3:30 PM, Uwe Geercken wrote:
> Hello everyone,
>
> I wanted to try the following
> - get messages from a kafka topic. these
Hello,
besides my other problem with the ConsumeKafkaRecord_0_10 processor, I have
another question.
Using the AvroSchemaRegistry 1.3.0, I can define a schema and reference it e.g.
in the CSVReader controller using the 'Schema Name' property and by setting
this property to ${schema.name}.
B
Hello everyone,
I wanted to try the following
- get messages from a kafka topic. these are simple messages in CSV format
- use the PartitionRecord processor to get familiar with the RecordPath concept
I started zookeeper and kafka on localhost and added some messages to a topic
using the kafka
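Conceptually, PartitionRecord groups records by the value a RecordPath points at. A rough pure-Python analogy, with an invented CSV column standing in for the partition field:

```python
import csv
import io
from collections import defaultdict

# Invented CSV messages; "country" plays the role of the field a RecordPath
# like /country would select.
data = "name,country\nuwe,de\nmatt,us\nbryan,us\n"

partitions = defaultdict(list)
for record in csv.DictReader(io.StringIO(data)):
    # Group each record into the partition for its /country value.
    partitions[record["country"]].append(record)

assert sorted(partitions) == ["de", "us"]
assert len(partitions["us"]) == 2  # two records share the "us" partition
```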
compatible fragment.* attributes like the older split processors. Partially for a consistent user experience, and partly for compatibility with MergeContent and other processors that read the fragment.* attributes.
Thanks,
James
On Fri, May 26, 2017 at 7:17 AM, Uwe Geercken <uwe.geerc...@
Hello,
I have used both the SplitRecord and SplitText processors. When using the SplitText processor, the flowfile gets various attributes for the fragment which in turn can be used to generate a unique filename for the output with e.g. PutFile. I was using the fragment.index attribute for this
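A rough sketch of the fragment-index idea: build a unique filename from a one-up index per split, the way ${fragment.index} can be referenced in expression language (e.g. via UpdateAttribute setting the filename attribute before PutFile). The flowfile dicts here are invented, and the one-based numbering is an assumption for illustration:

```python
# Invented stand-ins for split flowfiles carrying fragment.* attributes.
lines = ["line1", "line2", "line3"]

fragments = [
    {"fragment.index": i, "fragment.count": len(lines), "content": line}
    for i, line in enumerate(lines, start=1)
]

# Unique output filename per fragment, analogous to
# ${filename}_${fragment.index} in expression language.
filenames = [f"output_{f['fragment.index']}.txt" for f in fragments]
assert filenames == ["output_1.txt", "output_2.txt", "output_3.txt"]
```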
clear, please
let us know so that we can update the post to clarify.
Thanks!
-Mark
[1] https://blogs.apache.org/nifi/entry/record-oriented-data-with-nifi
On May 21, 2017, at 7:03 AM, Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello,
I wonder if somebody
Hello,
I wonder if somebody has a sample for using the CSVReader and CSVRecordSetWriter services in the SplitRecord processor for version 1.2.0
I want to e.g. read a CSV file then split it into chunks and output the individual flow files. From what I understand I need a reader and a writer serv
Sorry - was my mistake. All is ok.
Uwe
Sent: Friday, 14 April 2017 at 18:49
From: "Joe Witt"
To: users@nifi.apache.org
Subject: Re: Flow not loading in 1.2.0
You put the flow into the conf dir and not the root dir, right?
On Apr 14, 2017 12:29 PM,
not put them in lib dir?
On Fri, Apr 14, 2017 at 12:29 PM Uwe Geercken
<uwe.geerc...@web.de> wrote:
Hi,
I just downloaded 1.2.0 and unpacked it to the usual place. Then I simply
copied my flow.xml.gz from the previous version 1.1.1 to the Nifi 1.2.0 root
folder and started Nifi 1
Forget about it - I copied the wrong file.
All ok.
Uwe
Sent: Friday, 14 April 2017 at 18:29
From: "Uwe Geercken"
To: nifi
Subject: Flow not loading in 1.2.0
Hi,
I just downloaded 1.2.0 and unpacked it to the usual place. Then I simply
copied my flow.xml.gz from th
Hi,
I just downloaded 1.2.0 and unpacked it to the usual place. Then I simply
copied my flow.xml.gz from the previous version 1.1.1 to the Nifi 1.2.0 root
folder and started Nifi 1.2.0. But the flow does not show up. No error messages.
Am I missing something?
Rgds,
Uwe
Hello everybody,
I have released (Apache License) my NiFi processors at:
https://github.com/uwegeercken/nifi_processors
Further below is a summary for the processo
Hi,
you might want to use the ExecuteRuleEngine processor I wrote. It allows you to do very complex checks on the data and then also update it.
In other scenarios the MergeTemplate processor might work as well.
Have a look at: https://github.com/uwegeercken/nifi_processors
rgds,
Hello everybody,
I have released (Apache License) my NiFi processors at:
https://github.com/uwegeercken/nifi_processors
Further below is a summary for the processors. I would like to invite everybody to test, look at the source code and send me any feedback that you have.
I have do
he most recent events.
Thanks
-Mark
On Mar 30, 2017, at 3:06 AM, Uwe Geercken <uwe.geerc...@web.de> wrote:
James,
here is the screenshot of the data provenance:
http://imgur.com/a/29M8I
and here my flow:
http://imgur.com/a/nd6v7
If I clear the searc
events and then filtering down to the time and processor, does that change anything?
Thanks,
James
On Fri, Mar 24, 2017 at 8:39 AM, Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello,
I have a production flow in place (nifi-1.1.1-RC1). It runs on a virtual machine running redhat. Th
ou share a screenshot of the provenance search criteria? If you try the search the other way around, starting with all provenance events and then filtering down to the time and processor, does that change anything?
Thanks,
James
On Fri, Mar 24, 2017 at 8:39 AM, Uwe Geercken <uwe.geerc...@
Hello,
I have a production flow in place (nifi-1.1.1-RC1). It runs on a virtual machine running redhat. The flow reads multiple JSON files (small) from a windows share/mountpoint. Various shell scripts write single lines (json) into files. The files are read, split and put into MongoDB.
The flo
att
On Wed, Mar 15, 2017 at 6:54 PM, Uwe Geercken wrote:
> Thanks Matt,
>
> there is a little "but" though: The business rules are maintained outside of nifi in a web app, and that's good and the concept behind business rules - to not clutter up the flow and to divide the
aring this, hopefully others in the community will give it a go as
> well.
>
> Regards,
> Matt
>
>
> > On Mar 15, 2017, at 6:12 PM, Uwe Geercken wrote:
> >
> > Hello,
> >
> > I have worked quite a bit on my RuleEngine processor and would appr
Hello,
I have worked quite a bit on my RuleEngine processor and would appreciate it if
somebody took some time to try it, look at the source code, and give me
some feedback.
You can download it at: https://github.com/uwegeercken/nifi_processors
It is my first bundle of processors, so I gue
er/nifi-nar-bundles/nifi-ignite-bundle/nifi-ignite-nar/src/main/resources/META-INF
>
> Thanks
> Joe
>
> On Thu, Mar 2, 2017 at 3:56 PM, Uwe Geercken wrote:
> > Thanks for all your responses and help.
> >
> > One last question (at least for the moment ;-) ):
> &
and can
> instead just publish them outside the nifi community. We're happy to
> help either way.
>
> [1]
> https://github.com/apache/nifi/tree/master/nifi-nar-bundles/nifi-ignite-bundle/nifi-ignite-nar/src/main/resources/META-INF
>
> Thanks
> Joe
>
> On Thu,
ou can't sell a version of NiFi with
> your additional NAR and call it "NiFi++").
>
> Regards,
> Matt
>
> On Thu, Mar 2, 2017 at 7:10 AM, Uwe Geercken wrote:
> > Thanks Andrew,
> >
> > Matt also pointed me to the same direction.
> >
I have had experience with
these kinds of things, both for NiFi and other extensible open-source
projects.
Regards,
Matt
On Wed, Mar 1, 2017 at 7:01 AM, Uwe Geercken <uwe.geerc...@web.de> wrote:
> Matt,
>
> I did not know there is an official Apache Nifi repo. If you send me a link
a license, but your
> > business rule engine (jare) seems to be GPL 3.0 licensed. I'm not sure that
> > fits with most uses of NiFi.
> >
> > Can you please clarify?
> >
> > Thanks
> >
> > -Matt
> >
> > On Tue, Feb 28, 2017 at 4:47 PM, Uw
of NiFi.
Can you please clarify?
Thanks
-Matt
On Tue, Feb 28, 2017 at 4:47 PM, Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello everyone,
I just wanted to let you know, that I have created four processors for Nifi
1) GenerateData - generates random data (test data) ba
>
> Can you please clarify?
>
> Thanks
>
> -Matt
>
> On Tue, Feb 28, 2017 at 4:47 PM, Uwe Geercken <uwe.geerc...@web.de> wrote:
>>
>> Hello everyone,
>>
>> I just wanted to let you know, that I have created four processors for
>> Nifi
>
Hello everyone,
I just wanted to let you know, that I have created four processors for Nifi
1) GenerateData - generates random data (test data) based on word lists,
regular expressions or purely random
2) RuleEngine - a rule engine which allows you to process complex business logic.
But the logic is
Hello,
I would like to query a mariadb database for the latest updates based on a timestamp column and then put the retrieved record into kafka.
What would be the best practice to do that? Use a QueryDatabaseTable, convert the result (which is in Avro format) to Json and put that in Kafka?
Hello,
excuse my question, but I still have not fully understood how one would logically handle large flow graphs. If I have many systems involved and many different types of output, would I really put everything in one flow? Or is this a misunderstanding on my side? If you put everything in on
Matt,
I worked a while ago on a processor with apache velocity. I stopped work when
the packaging as nar did not work and
I was somewhat confused by the layout. You helped me at that time but there was
an error.
I would like to pickup the work again. But I need help with packaging. I am not
ve
somewhere but I am guessing we can do more to announce that in the relevant docs, thanks! Where do you think it would be helpful to add such reference(s)?
Thanks,
Matt
> On Mar 26, 2016, at 2:47 PM, Uwe Geercken wrote:
>
> Just a quick one: I can not
Just a quick one: I can not find any information on which port the REST API
of Nifi runs.
When I try:
http://localhost:8080/nifi
I see my Nifi web interface. If I try:
http://localhost:8080/nifi/controller/about
I get an error 404.
I would recommend putting something in the docs.
Greeting
:
Totally agree with you.
How about this for now
https://cwiki.apache.org/confluence/display/NIFI/Example+Dataflow+Templates
And we can come up with something better (like the registry) in the future.
Thanks
Joe
On Thu, Mar 24, 2016 at 11:00 AM, Uwe
I believe it would be cool to have a central place where we can store and download templates. Today so many people do cool stuff, but it's all decentralized.
maybe it is just a matter of deciding where
uwe
--
This message was sent from my Android mobile phone using WEB.DE Mail. On 23.03.2
Maybe I am too naive here, but formatting for text-based formats could be done using a template engine.
Matt is right about the user experience - but only if the complexity of this one processor is not too high - that is what I think. Personally I - with the user hat on - don't like dialogs wit
ient certificate or your LDAP entry.
Matt
[1] https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#user-authentication
On Thu, Mar 10, 2016 at 4:12 PM, Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello I would like to setup a simple username/password authe
Hello, I would like to set up a simple username/password authentication. A user has to specify the userid and a password to use the nifi web ui - that's all.
While there is a lot of information in the documentation, I am confused about what is required and what not.
in the file authority-provide
rking.
Thanks,
Bryan
[1] https://issues.apache.org/jira/browse/NIFI-1197
On Fri, Mar 4, 2016 at 6:43 PM, Uwe Geercken <uwe.geerc...@web.d
Hello,
I have tried to use the PutMongo processor. I read Json files from a folder and send them to a mongo database.
The insert of documents works seamlessly.
Next I tested updates. The problem here is that for the update a complete document is required. So if you insert a document wit
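The distinction being described (a plain update body replaces the whole document, while a $set body only touches the listed fields) can be mimicked with a toy applier in pure Python. This is not MongoDB's actual implementation, just an illustration of the two update semantics:

```python
# Toy applier mimicking MongoDB's two update styles: a "$set" body merges
# into the existing document; a plain body replaces it wholesale.
def apply_update(doc, update):
    if "$set" in update:
        merged = dict(doc)
        merged.update(update["$set"])  # partial update: other fields survive
        return merged
    return dict(update)                # plain body: full replacement

doc = {"_id": 1, "name": "uwe", "city": "basel"}

# Full replacement: the city field is lost.
assert apply_update(doc, {"name": "uwe2"}) == {"name": "uwe2"}
# $set update: only name changes, city is kept.
assert apply_update(doc, {"$set": {"name": "uwe2"}})["city"] == "basel"
```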
, 2016 at 1:22 PM, Uwe Geercken wrote:
> Matthew,
>
> does NiFi itself allow to define such things as constants? The idea would be
> to re-use these constants (e.g. a path) in different processors.
>
> Rgds,
>
> Uwe
>
> Sent: F
Matthew,
does NiFi itself allow you to define such things as constants? The idea would be to re-use these constants (e.g. a path) in different processors.
Rgds,
Uwe
Sent: Friday, 04 March 2016 at 19:04
From: "Matthew Clarke"
To: users@nifi.apache.org, "Joe Percivall"
Subject:
, and supports PCRE.
On Tue, Mar 1, 2016 at 4:30 PM, Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello,
I was wondering which tool people use to validate their regular expressions?
I was going through s
Hello,
I think working with the templates in nifi is quite good but I have some suggestions and would like to hear your comments
- there is no way of saving my flow somehow other than in templates. if I create a couple of them, I lose the overview of which template is which. so I would rec
Hello,
I was wondering which tool people use to validate their regular expressions?
I was going through some of the templates I found on the web and found one
with following regular expression:
(?s:^.*$)
When using http://www.regexr.com/ which I find very good and complete,
regexr.com tells
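The expression in question can be checked directly in Python, whose regex engine also supports scoped inline flags: (?s:...) turns on DOTALL inside the group, so . matches newlines and the whole multi-line content matches as one string (regexr.com's JavaScript engine historically did not accept this syntax):

```python
import re

# The expression from the template: the (?s: ... ) group enables DOTALL
# locally, so "." matches newline characters as well.
pattern = re.compile(r"(?s:^.*$)")

text = "first line\nsecond line"
match = pattern.match(text)
assert match is not None
assert match.group(0) == text  # the entire content, newlines included
```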
, Uwe Geercken <uwe.geerc...@web.de> wrote:
Aldrin,
I took a different approach. I store the info about whether a job is running externally, in MongoDB. So the master writes to Mongo when it starts and when it ends.
I then
t does not work, are there additional constraints/criteria that extend beyond this process? What is the relationship like between the follow-on processes?
On Tue, Feb 23, 2016 at 5:11 AM, Uwe Geercken <uwe.geerc...@
are looking for, could you please provide some additional details about your needs that are unmet by the ExecuteStreamCommand approach?
Thanks!
On Mon, Feb 22, 2016 at 2:47 PM, Uwe Geercken <uwe.geerc...@web.de> wrote:
Hello,
I am new to Nifi and have a general question for the Exe
Hello,
I am new to Nifi and have a general question for the ExecuteProcess processor.
Is it possible with Nifi to run a process and have a second process wait until the first one has finished? How would I do that? It looks, though, like this processor does not allow for an incoming connecti
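Outside of NiFi terms, the run-then-wait behaviour being asked about looks like this: subprocess.run blocks until the child process exits, so the second command only starts after the first has finished. The commands here are invented placeholders:

```python
import subprocess
import sys

# First process: subprocess.run waits for the child to terminate before
# returning, which is exactly the "wait until finished" semantics.
first = subprocess.run(
    [sys.executable, "-c", "print('step 1 done')"],
    capture_output=True, text=True,
)
assert first.returncode == 0

# This line is only reached after the first process has terminated.
second = subprocess.run(
    [sys.executable, "-c", "print('step 2 done')"],
    capture_output=True, text=True,
)
assert "step 2 done" in second.stdout
```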