Flink - Kafka Connector

2018-04-13 Thread Lehuede sebastien
Hi All, I'm very new to Flink (and to streaming applications in general), so sorry for my newbie question. I plan to do some tests with Kafka and Flink and to use the Kafka connector for that. I found information on this page :
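
For reference, the basic consumer wiring looks like the sketch below. This is a hedged example, assuming the Flink 1.4-era Kafka 0.11 connector (FlinkKafkaConsumer011); the topic name, broker address, and group id are placeholders.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class KafkaToFlink {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("group.id", "flink-test");              // placeholder group

        // Read each Kafka record as a plain String.
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer011<>("my-topic", new SimpleStringSchema(), props));

        stream.print();
        env.execute("Kafka to Flink test");
    }
}
```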

Flink Kafka connector not exist

2018-04-19 Thread Lehuede sebastien
Hi Guys, I have created a project with Maven to try to send data from Kafka to Flink. But when I try to build the project I get the following error : [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project processing-app:
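
The preview cuts off before the actual compiler message, so the root cause is not recoverable here; in similar threads the compile failure comes from the Kafka connector artifact missing from the POM, in which case the connector imports do not resolve. A hedged illustration (the artifact coordinates are an assumption about the version in use):

```java
// These imports resolve only when the matching connector artifact is a
// declared dependency, e.g. org.apache.flink:flink-connector-kafka-0.11_2.11
// (assumed version). Without it, maven-compiler-plugin fails with errors
// such as "package org.apache.flink.streaming.connectors.kafka does not exist".
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;
```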

Re: Kafka to Flink Avro Deserializer

2018-04-25 Thread Lehuede sebastien
…AvroRuntimeException) e.getCause() : new AvroRuntimeException(e); } } So I guess your schema is missing. I hope this helps. Regards, Timo. On 25.04.18 at 10:57, Lehuede sebastien wrote: java.lang.NullPointerException at org.apache.avro.specific.SpecificData.getSchema
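
Timo's diagnosis is that SpecificData.getSchema throws because Avro cannot find a schema for the class it is asked to deserialize. One way to avoid the reflective lookup entirely is to hand Avro the schema explicitly; a minimal sketch, assuming the records were written with a known schema:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;

public class AvroBytesDecoder {
    private final Schema schema;

    public AvroBytesDecoder(Schema schema) {
        // Schema is passed in explicitly, so Avro never has to look it up
        // by reflection (the lookup that NPEs in SpecificData.getSchema).
        this.schema = schema;
    }

    public GenericRecord decode(byte[] bytes) throws java.io.IOException {
        DatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        return reader.read(null, decoder);
    }
}
```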

Kafka to Flink Avro Deserializer

2018-04-25 Thread Lehuede sebastien
Hi Guys, I tried to implement my Avro Deserializer following these links : - https://github.com/okkam-it/flink-examples/blob/master/src/main/java/org/okkam/flink/avro/AvroDeserializationSchema.java -
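
The linked okkam-it example wraps Avro decoding in Flink's DeserializationSchema interface. A hedged skeleton of that idea is below; Event stands in for an Avro-generated class (a placeholder, not from the thread), and Flink 1.4-era interfaces are assumed:

```java
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.TypeExtractor;

// "Event" is a placeholder for a class generated from an .avsc schema
// by the Avro Maven plugin.
public class EventDeserializationSchema implements DeserializationSchema<Event> {

    private transient SpecificDatumReader<Event> reader;

    @Override
    public Event deserialize(byte[] message) throws java.io.IOException {
        if (reader == null) {
            // Lazily create the reader per task; the generated class carries
            // its own schema, so no reflective schema lookup is needed.
            reader = new SpecificDatumReader<>(Event.class);
        }
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(message, null);
        return reader.read(null, decoder);
    }

    @Override
    public boolean isEndOfStream(Event nextElement) {
        return false;
    }

    @Override
    public TypeInformation<Event> getProducedType() {
        return TypeExtractor.getForClass(Event.class);
    }
}
```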

KafkaJsonTableSource purpose

2018-04-23 Thread Lehuede sebastien
Hi Guys, I'm currently trying to understand the purpose of the Table API and in particular KafkaJsonTableSource, and to see whether it can be useful for my use case. Here is my context : I send logs to Logstash, where I add some information (Type, Tags); Logstash sends the logs to Kafka in JSON format, and finally I
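
For context, KafkaJsonTableSource maps JSON records from a Kafka topic onto a flat table schema so they can be queried with the Table API or SQL instead of hand-written DataStream parsing. A hedged sketch, assuming the Flink 1.4/1.5-era builder API; the field names are made up to match the Logstash output described above:

```java
import java.util.Properties;

import org.apache.flink.streaming.connectors.kafka.Kafka010JsonTableSource;
import org.apache.flink.streaming.connectors.kafka.KafkaTableSource;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.api.Types;

public class LogsTableSource {
    public static KafkaTableSource build() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder

        // Map the JSON fields produced by Logstash (names assumed) onto a
        // flat table schema usable from SQL / Table API queries.
        return Kafka010JsonTableSource.builder()
                .forTopic("logs")                 // placeholder topic
                .withKafkaProperties(props)
                .withSchema(TableSchema.builder()
                        .field("type", Types.STRING())
                        .field("tags", Types.STRING())
                        .field("message", Types.STRING())
                        .build())
                .build();
    }
}
```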

Re: Grok and Flink

2018-08-30 Thread Lehuede sebastien
Hi, To parse my logs and reuse all my Grok patterns, I use the Java Grok API directly in my DataStream. Please see: https://github.com/thekrakken/java-grok With that you should be able to get rid of the whole Logstash layer and use only the Grok part. Another solution, for example if you have
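
A sketch of that approach: compile the Grok pattern once per task in open() (Grok objects are not serializable) and run each log line through it in map(). This assumes the io.krakens API of the linked library; the pattern and output fields are examples. It would be applied as stream.map(new GrokParser()).

```java
import java.util.Map;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

import io.krakens.grok.api.Grok;
import io.krakens.grok.api.GrokCompiler;
import io.krakens.grok.api.Match;

// Parses raw log lines with a Grok pattern inside the stream itself,
// replacing the Logstash grok filter.
public class GrokParser extends RichMapFunction<String, Map<String, Object>> {

    private transient Grok grok;

    @Override
    public void open(Configuration parameters) {
        // Grok instances are not serializable, so compile per task in open().
        GrokCompiler compiler = GrokCompiler.newInstance();
        compiler.registerDefaultPatterns();
        grok = compiler.compile("%{COMBINEDAPACHELOG}"); // example pattern
    }

    @Override
    public Map<String, Object> map(String line) {
        Match match = grok.match(line);
        return match.capture(); // field name -> extracted value
    }
}
```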

Process stream multiple time with different KeyBy

2020-02-17 Thread Lehuede sebastien
Hi all, I'm currently working on a Flink application where I match events against a set of rules. At the beginning I wanted to dynamically create streams based on the category of the events (events are JSON formatted and I have a field like "category":"foo" in each event), but I'm stuck by the
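
One point worth noting before any dynamic-stream machinery: a single DataStream can feed several downstream operators, so the same stream can be keyed by different fields without re-reading the source. A small self-contained sketch with stand-in tuple events:

```java
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MultiKeyBy {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Stand-in events: (category, user, payload).
        DataStream<Tuple3<String, String, String>> events = env.fromElements(
                Tuple3.of("foo", "alice", "e1"),
                Tuple3.of("bar", "bob", "e2"),
                Tuple3.of("foo", "bob", "e3"));

        // The same stream can be consumed by several operators, each with
        // its own keyBy; the source is not duplicated or re-read.
        events.keyBy(e -> e.f0).map(e -> "by category: " + e).print();
        events.keyBy(e -> e.f1).map(e -> "by user: " + e).print();

        env.execute("multiple keyBy over one stream");
    }
}
```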

Re: Process stream multiple time with different KeyBy

2020-02-19 Thread Lehuede sebastien
…question was answered in quite some detail: https://flink.apache.org/news/2020/01/15/demo-fraud-detection.html Best regards, Theo. Original message: From: Eduardo Winpenny Tejedor, Date: Mon., 17 Feb. 2020, 21:07, To: Lehuede sebastien
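
The linked fraud-detection post handles dynamic rules by broadcasting them to a stream connected with the events, rather than building one topology per category. A hedged sketch of that broadcast-state pattern (Rule is a placeholder type, not from the thread; events are plain Strings for brevity):

```java
import java.util.Map;

import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
import org.apache.flink.util.Collector;

public class DynamicRules {

    // Placeholder rule type: an id plus a predicate over events.
    public interface Rule extends java.io.Serializable {
        String getId();
        boolean matches(String event);
    }

    public static DataStream<String> apply(DataStream<String> events, DataStream<Rule> rules) {
        final MapStateDescriptor<String, Rule> descriptor = new MapStateDescriptor<>(
                "rules", BasicTypeInfo.STRING_TYPE_INFO, TypeInformation.of(Rule.class));

        BroadcastStream<Rule> broadcastRules = rules.broadcast(descriptor);

        return events.connect(broadcastRules).process(
                new BroadcastProcessFunction<String, Rule, String>() {
                    @Override
                    public void processElement(String event, ReadOnlyContext ctx,
                            Collector<String> out) throws Exception {
                        // Evaluate the event against every rule currently broadcast.
                        for (Map.Entry<String, Rule> e :
                                ctx.getBroadcastState(descriptor).immutableEntries()) {
                            if (e.getValue().matches(event)) {
                                out.collect("rule " + e.getKey() + " matched: " + event);
                            }
                        }
                    }

                    @Override
                    public void processBroadcastElement(Rule rule, Context ctx,
                            Collector<String> out) throws Exception {
                        // New or updated rules land in broadcast state on every task.
                        ctx.getBroadcastState(descriptor).put(rule.getId(), rule);
                    }
                });
    }
}
```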

Scala : ClassCastException with Kafka Connector and ObjectNode

2021-03-30 Thread Lehuede sebastien
Hi all, I'm currently trying to use Scala to set up a simple Kafka consumer that receives JSON-formatted events and then just sends them to Elasticsearch. This is the first step, and afterwards I want to add some processing logic. My code works well, but interesting fields from my JSON-formatted events
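
For comparison, below is a Java version of the same consumer step (Java rather than Scala, to match the other sketches here). A frequent cause of this particular ClassCastException is an ObjectNode import that does not match the class JSONKeyValueDeserializationSchema actually returns, which in recent Flink distributions is the shaded org.apache.flink.shaded.jackson2 variant; that diagnosis is an assumption, not confirmed by the thread. Topic, broker, and field names are placeholders.

```java
import java.util.Properties;

import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema;

public class JsonKafkaConsumer {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "json-test");               // placeholder

        // The schema wraps each record into an ObjectNode with "key" and
        // "value" fields ("metadata" too if the constructor flag is true).
        DataStream<ObjectNode> stream = env.addSource(new FlinkKafkaConsumer<>(
                "my-topic", new JSONKeyValueDeserializationSchema(false), props));

        // The ObjectNode import above must match the class the schema really
        // returns; a mismatched (unshaded) import produces a ClassCastException
        // like the one in this thread.
        stream.map(node -> node.get("value").get("category").asText()).print();

        env.execute("JSON Kafka consumer");
    }
}
```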

Re: Scala : ClassCastException with Kafka Connector and ObjectNode

2021-03-30 Thread Lehuede sebastien
…<https://stackoverflow.com/a/18008591/4815083> Cheers, Till. On Tue, Mar 30, 2021 at 3:45 PM Lehuede sebastien <lehued...@gmail.com> wrote: Hi all, I'm currently trying to use Scala to set up a simple Kafka