Hi Till,
We are using 1.4.0. We have not tried this on any other release.
We will try this on 1.6.2 and see what happens.
Thank you.
Regards,
Teena
From: Till Rohrmann
Sent: 07 November 2018 20:23
To: Teena Kappen // BPRISE
Cc: user
Subject: Re: RichInputFormat working differently in eclipse
Hi all,
I have implemented RichInputFormat for reading the results of aggregation queries in
Elasticsearch. There are around 10 buckets, which are of type JSON array.
Note: This is a one-time response.
My idea here is to iterate these arrays in parallel. Here is the pseudo code.
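The pseudo code was cut off in the archived snippet. As a hedged sketch only (the class name, the record type `String`, and the `fetchBucket` helper are illustrative placeholders, not from the original mail), one way to consume the ~10 buckets in parallel is to create one input split per bucket and let each parallel subtask iterate its own bucket array:

```java
import org.apache.flink.api.common.io.DefaultInputSplitAssigner;
import org.apache.flink.api.common.io.RichInputFormat;
import org.apache.flink.api.common.io.statistics.BaseStatistics;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.io.GenericInputSplit;
import org.apache.flink.core.io.InputSplitAssigner;

import java.io.IOException;
import java.util.Iterator;
import java.util.List;

// Sketch: one input split per aggregation bucket, so the ~10 buckets
// can be iterated by parallel subtasks. fetchBucket() is a placeholder
// for the actual Elasticsearch aggregation call.
public class AggregationBucketInputFormat
        extends RichInputFormat<String, GenericInputSplit> {

    private static final int NUM_BUCKETS = 10; // "around 10 buckets"
    private transient Iterator<String> bucketIterator;

    @Override
    public void configure(Configuration parameters) { }

    @Override
    public BaseStatistics getStatistics(BaseStatistics cachedStatistics) {
        return cachedStatistics; // no statistics for a one-time query
    }

    @Override
    public GenericInputSplit[] createInputSplits(int minNumSplits) {
        // One split per bucket; split number = bucket index.
        GenericInputSplit[] splits = new GenericInputSplit[NUM_BUCKETS];
        for (int i = 0; i < NUM_BUCKETS; i++) {
            splits[i] = new GenericInputSplit(i, NUM_BUCKETS);
        }
        return splits;
    }

    @Override
    public InputSplitAssigner getInputSplitAssigner(GenericInputSplit[] splits) {
        return new DefaultInputSplitAssigner(splits);
    }

    @Override
    public void open(GenericInputSplit split) throws IOException {
        // Run (or re-use) the one-time aggregation response and pick the
        // bucket assigned to this subtask.
        List<String> bucket = fetchBucket(split.getSplitNumber());
        bucketIterator = bucket.iterator();
    }

    @Override
    public boolean reachedEnd() {
        return !bucketIterator.hasNext();
    }

    @Override
    public String nextRecord(String reuse) {
        return bucketIterator.next();
    }

    @Override
    public void close() { }

    // Placeholder: the Elasticsearch client call would go here.
    private List<String> fetchBucket(int bucketIndex) throws IOException {
        throw new UnsupportedOperationException("Elasticsearch query goes here");
    }
}
```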
Yes, a custom sink with the required checks seems to be the only option.
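A minimal sketch of such a custom sink, assuming Flink's RichSinkFunction; the actual Cassandra and secondary writes are left abstract because the original thread does not show them, and the record type String is illustrative:

```java
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

// Sketch: a sink that performs the Cassandra write itself and forwards a
// record to the second system only after that write succeeded ("the
// required checks"). The two write methods are deliberately abstract;
// a concrete subclass would wire in the real clients.
public abstract class DualWriteSink extends RichSinkFunction<String> {

    /** Writes to Cassandra; must return true only if the write succeeded. */
    protected abstract boolean writeToCassandra(String record) throws Exception;

    /** Writes the record to the second sink system. */
    protected abstract void writeToSecondary(String record) throws Exception;

    @Override
    public void invoke(String record) throws Exception {
        // Only records actually persisted in Cassandra reach the second sink.
        if (writeToCassandra(record)) {
            writeToSecondary(record);
        }
    }
}
```

The job would then use `stream.addSink(new DualWriteSink() { ... })` in place of the separate Cassandra sink.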
From: Hequn Cheng
Sent: 10 July 2018 18:23
To: Teena Kappen // BPRISE
Cc: user@flink.apache.org
Subject: Re: Access the data in a stream after writing to a sink
Hi Teena,
It seems that a sink cannot output data to downstream operators.
From: Teena Kappen // BPRISE
Sent: 10 July 2018 12:50
To: user@flink.apache.org
Subject: Access the data in a stream after writing to a sink
Hi,
Is it possible to access the data in a stream that was written to a sink? I
have a Cassandra Sink in my stream job and I have to access all the records
that were written to the Cassandra sink and write them to another sink. Is there
any way to do that?
Regards,
Teena
In the flink-conf.yaml file, replace the setting
env.java.opts: "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
mentioned in the link with the setting below:
env.java.opts.taskmanager: "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
Scoping the option to the TaskManager avoids the JobManager and TaskManager JVMs both trying to bind debug port 5005 on a single node, and suspend=n lets the TaskManager start without waiting for a debugger to attach.
Regards,
Teena
Hi,
If I have to aggregate a value in a stream of records, which one of the below
approaches will be the most/least efficient?
1. Using a Global Window to aggregate the value and emit the record when it
reaches a particular threshold value.
2. Using a FlatMap with a State Variable which keeps the aggregated value and
emits the record when it reaches the threshold.
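For option 2, the state the FlatMap would keep is just a running total per key. Below is a plain-Java sketch of that logic only (class name, method names, and the threshold are illustrative); in a real Flink job the total would live in a ValueState&lt;Long&gt; inside a RichFlatMapFunction:

```java
// Plain-Java sketch of the per-key state a RichFlatMapFunction would hold
// in ValueState<Long>; names and the threshold value are illustrative.
class ThresholdAccumulator {
    private final long threshold;
    private long runningTotal = 0L; // would be ValueState<Long> in Flink

    ThresholdAccumulator(long threshold) {
        this.threshold = threshold;
    }

    /**
     * Adds a value to the running total. Returns the total once it reaches
     * the threshold (and resets the state, like ValueState#clear), else null.
     */
    Long add(long value) {
        runningTotal += value;
        if (runningTotal >= threshold) {
            long emitted = runningTotal;
            runningTotal = 0L; // reset so the next record starts a new round
            return emitted;
        }
        return null;
    }
}
```

A global window with a custom trigger achieves the same effect, but the FlatMap version keeps only one long per key and emits immediately, without window bookkeeping.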
Hi,
Is it possible to have a variable value inside the time field for a session
window?
input
.keyBy()
.window(ProcessingTimeSessionWindows.withGap(Time.minutes(10)))
.();
In the above code, instead of the value 10, which will be the same for all
records, is it possible to have a different value for each record?
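Flink does support a per-record session gap via ProcessingTimeSessionWindows.withDynamicGap, which takes a SessionWindowTimeGapExtractor (available from 1.4). A hedged sketch, assuming an illustrative record type MyEvent whose key and gapMillis fields are placeholders, with the aggregation step left as a dummy reduce:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.windowing.assigners.ProcessingTimeSessionWindows;
import org.apache.flink.streaming.api.windowing.assigners.SessionWindowTimeGapExtractor;

public class DynamicGapExample {

    // Illustrative record type: each element carries its own session gap.
    public static class MyEvent {
        public String key;
        public long gapMillis;
    }

    public static void apply(DataStream<MyEvent> input) {
        input
            .keyBy(e -> e.key)
            // The extractor returns the gap (in milliseconds) per element,
            // replacing the fixed Time.minutes(10).
            .window(ProcessingTimeSessionWindows.withDynamicGap(
                (SessionWindowTimeGapExtractor<MyEvent>) element -> element.gapMillis))
            .reduce((a, b) -> a); // placeholder aggregation
    }
}
```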
the issue. Both the sinks wrote
into Elasticsearch as expected.
Thank you for taking this up. I will report back on the test results soon.
Regards,
Teena
From: Fabian Hueske [mailto:fhue...@gmail.com]
Sent: 31 January 2018 19:41
To: Stephan Ewen <se...@apache.org>
Cc: Teena Kappen //
Thanks Fabian. I will go through it and add info if required.
From: Fabian Hueske [mailto:fhue...@gmail.com]
Sent: 23 January 2018 15:20
To: Teena Kappen // BPRISE <teena.kap...@bprise.com>
Cc: Timo Walther <twal...@apache.org>; user@flink.apache.org
Subject: Re: Multiple Elasticsearch sinks not working in Flink
Hi Teena,
what happens if you replace the second sink with a non-Elasticsearch sink? Do
you see the same result? Is the data read from KafkaTopic2?
We should determine which system is the bottleneck.
Regards,
Timo
On 1/18/18 at 9:53 AM, Teena Kappen wrote:
Hi,
I am running Flink 1.4 on a single node. My job has two Kafka consumers reading
from separate topics. After fetching the data, the job writes it to two
separate Elasticsearch sinks. So the process is like this:
KafkaTopic1 -> Kafkaconsumer1 -> create output record -> Elasticsearchsink1
KafkaTopic2 -> Kafkaconsumer2 -> create output record -> Elasticsearchsink2
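The two-pipeline layout above can be sketched as a single job, assuming the Kafka 0.11 connector and Flink 1.4+ class locations; topic names, the bootstrap address, the identity map standing in for "create output record", and print() standing in for the Elasticsearch sinks are all illustrative, since the original mail does not show the sink setup:

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

// Sketch of the described topology: two independent Kafka-source ->
// map -> sink pipelines inside one job.
public class TwoPipelineJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // illustrative
        props.setProperty("group.id", "two-pipeline-job");        // illustrative

        // Pipeline 1: KafkaTopic1 -> create output record -> Elasticsearchsink1
        DataStream<String> out1 = env
            .addSource(new FlinkKafkaConsumer011<>(
                "KafkaTopic1", new SimpleStringSchema(), props))
            .map(record -> record /* "create output record" goes here */);
        out1.print(); // stand-in for Elasticsearchsink1

        // Pipeline 2: KafkaTopic2 -> create output record -> Elasticsearchsink2
        DataStream<String> out2 = env
            .addSource(new FlinkKafkaConsumer011<>(
                "KafkaTopic2", new SimpleStringSchema(), props))
            .map(record -> record /* "create output record" goes here */);
        out2.print(); // stand-in for Elasticsearchsink2

        env.execute("two kafka topics to two elasticsearch sinks");
    }
}
```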