Hi All,
I'm very new to Flink (and to streaming applications in general), so
sorry for my newbie question.
I plan to do some tests with Kafka and Flink, using the Kafka connector
for that.
I found information on this page :
Hi Guys,
I have created a project with Maven to try to send data from Kafka to
Flink, but when I try to build the project I get the following error :
*[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
(default-compile) on project processing-app:
> ... (AvroRuntimeException) e.getCause() : new
> AvroRuntimeException(e);
>     }
> }
>
> So I guess your schema is missing.
>
> I hope this helps.
>
> Regards,
> Timo
>
> On 25.04.18 at 10:57, Lehuede sebastien wrote:
>
>> java.lang.NullPointerException
>> at org.apache.avro.specific.SpecificData.getSchema
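Timo's diagnosis above points at the schema: a class generated by the Avro
compiler carries its schema in a static field, and SpecificData.getSchema
ends in exactly this NullPointerException when no schema can be found for
the type. A quick way to check (the class name Event below is a hypothetical
stand-in for your generated record, not from the original mail):

    import org.apache.avro.Schema;

    // "Event" stands in for your Avro-compiler-generated SpecificRecord class.
    // A properly generated class exposes its schema statically:
    Schema schema = Event.getClassSchema();      // equivalent to Event.SCHEMA$
    System.out.println(schema.toString(true));   // pretty-print to verify the fields

If getClassSchema() is not there, the class was not generated by the Avro
compiler and the reflection path in SpecificData will fail.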
Hi Guys,
I tried to implement my Avro deserializer following these links :
-
https://github.com/okkam-it/flink-examples/blob/master/src/main/java/org/okkam/flink/avro/AvroDeserializationSchema.java
-
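For reference, here is a minimal sketch of such a DeserializationSchema for
Avro SpecificRecords, along the lines of the okkam-it example above (the
class name and the lazy reader initialization are my own; treat it as a
starting point, not the canonical implementation):

    import java.io.IOException;
    import org.apache.avro.io.BinaryDecoder;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.avro.specific.SpecificDatumReader;
    import org.apache.avro.specific.SpecificRecordBase;
    import org.apache.flink.api.common.serialization.DeserializationSchema;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.java.typeutils.TypeExtractor;

    public class MyAvroDeserializationSchema<T extends SpecificRecordBase>
            implements DeserializationSchema<T> {

        private final Class<T> avroType;
        private transient SpecificDatumReader<T> reader;
        private transient BinaryDecoder decoder;

        public MyAvroDeserializationSchema(Class<T> avroType) {
            this.avroType = avroType;
        }

        @Override
        public T deserialize(byte[] message) throws IOException {
            if (reader == null) {
                // The reader pulls the schema from the generated class (SCHEMA$).
                reader = new SpecificDatumReader<>(avroType);
            }
            decoder = DecoderFactory.get().binaryDecoder(message, decoder);
            return reader.read(null, decoder);
        }

        @Override
        public boolean isEndOfStream(T nextElement) {
            return false;
        }

        @Override
        public TypeInformation<T> getProducedType() {
            return TypeExtractor.getForClass(avroType);
        }
    }

It can then be passed to the FlinkKafkaConsumer constructor in place of,
e.g., a SimpleStringSchema.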
Hi Guys,
I'm currently trying to understand the purpose of the Table API, and in
particular KafkaJsonTableSource, to see whether it can be useful for my use
case. Here is my context :
I send my logs to Logstash, where I add some information (type, tags);
Logstash then sends the logs to Kafka in JSON format, and finally I
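For context, a rough sketch of how such a table source is declared, using the
Kafka 0.10 JSON table source builder as documented around Flink 1.5 (the
topic name, properties, and fields below are illustrative assumptions):

    import java.util.Properties;
    import org.apache.flink.streaming.connectors.kafka.Kafka010JsonTableSource;
    import org.apache.flink.streaming.connectors.kafka.KafkaTableSource;
    import org.apache.flink.table.api.TableSchema;
    import org.apache.flink.table.api.Types;

    Properties kafkaProps = new Properties();
    kafkaProps.setProperty("bootstrap.servers", "localhost:9092");
    kafkaProps.setProperty("group.id", "log-table-consumer");

    // Declare the JSON fields produced by Logstash as a table schema.
    KafkaTableSource source = Kafka010JsonTableSource.builder()
        .forTopic("logs")                         // assumed topic name
        .withKafkaProperties(kafkaProps)
        .withSchema(TableSchema.builder()
            .field("type", Types.STRING())
            .field("tags", Types.STRING())
            .field("message", Types.STRING())
            .build())
        .build();

    // "tableEnv" is an assumed, already-created TableEnvironment.
    tableEnv.registerTableSource("logs", source); // then query with Table API or SQL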
Hi,
To parse my logs and reuse all my Grok patterns, I use the Java Grok API
directly in my DataStream. Please see :
https://github.com/thekrakken/java-grok
With that you should be able to get rid of the entire Logstash piece and
use only the Grok part, along the lines of the sketch below.
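A minimal sketch of what that can look like, using the GrokCompiler API of
recent java-grok versions (the pattern and class name are illustrative):

    import java.util.Map;
    import io.krakens.grok.api.Grok;
    import io.krakens.grok.api.GrokCompiler;
    import io.krakens.grok.api.Match;
    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;

    // Parse each raw log line with java-grok inside a Flink map operator.
    public class GrokParser extends RichMapFunction<String, Map<String, Object>> {

        private transient Grok grok;

        @Override
        public void open(Configuration parameters) {
            // Compile the pattern once per task; Grok objects are not serializable.
            GrokCompiler compiler = GrokCompiler.newInstance();
            compiler.registerDefaultPatterns();
            grok = compiler.compile("%{COMBINEDAPACHELOG}"); // assumed pattern
        }

        @Override
        public Map<String, Object> map(String line) {
            Match match = grok.match(line);
            return match.capture(); // empty map if the line does not match
        }
    }

Then simply: stream.map(new GrokParser()).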
Another solution, for example if you have
Hi all,
I'm currently working on a Flink application where I match events against a
set of rules. At the beginning I wanted to dynamically create streams
based on the category of the events (events are JSON formatted and each one
has a field like "category":"foo"), but I'm stuck by the
question was answered in quite some detail :
>> https://flink.apache.org/news/2020/01/15/demo-fraud-detection.html
>> Best regards,
>> Theo
>> ---- Original message ----
>> From: Eduardo Winpenny Tejedor
>> Date: Mon, 17 Feb 2020, 21:07
>> To: Lehuede sebastien
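The pattern that blog post builds on keeps a single stream and keys it by the
category rather than creating one stream per category. A minimal sketch,
assuming the events arrive as JSON strings in a DataStream named rawStream:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.datastream.KeyedStream;

    // Parse each raw JSON event, then partition the stream by its "category" field.
    DataStream<JsonNode> events = rawStream.map(new MapFunction<String, JsonNode>() {
        private final ObjectMapper mapper = new ObjectMapper(); // Serializable since Jackson 2.1
        @Override
        public JsonNode map(String value) throws Exception {
            return mapper.readTree(value);
        }
    });

    KeyedStream<JsonNode, String> byCategory =
        events.keyBy(event -> event.get("category").asText());

All events of one category then land in the same key group, where the
matching rules can be applied.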
Hi all,
I’m currently trying to use Scala to set up a simple Kafka consumer that
receives JSON formatted events and then just sends them to Elasticsearch. This
is the first step; afterwards I want to add some processing logic.
My code works well, but interesting fields from my JSON formatted events
> <https://stackoverflow.com/a/18008591/4815083>
>
> Cheers,
> Till
>
> On Tue, Mar 30, 2021 at 3:45 PM Lehuede sebastien <mailto:lehued...@gmail.com> wrote:
> Hi all,
>
> I’m currently trying to use Scala to set up a simple Kafka