The fix you mentioned is part of later Flink releases (like 1.0.3).
Stephan
On Mon, May 16, 2016 at 11:46 PM, Madhire, Naveen
<naveen.madh...@capitalone.com<mailto:naveen.madh...@capitalone.com>> wrote:
Thanks Fabian. Actually I don’t see a .valid-length suffix file in the out[…]
[…]blem.
Best, Fabian
[1]
https://ci.apache.org/projects/flink/flink-docs-master/api/java/org/apache/flink/streaming/connectors/fs/RollingSink.html
2016-05-14 4:17 GMT+02:00 Madhire, Naveen
<naveen.madh...@capitalone.com<mailto:naveen.madh...@capitalone.com>>:
Thanks Fabian. Yes, I am seeing […]
Hi Naveen,
the RollingFileSink supports exactly-once output. So you should be good.
Did you see events being emitted multiple times (should not happen with the
RollingFileSink) or being processed multiple times within the Flink program
(might happen as explained before)?
Best, Fabian
2016-0[…]
of an operator.
Note, you need a SinkFunction that supports Flink's checkpointing mechanism to
achieve exactly-once output. Otherwise, it might happen that results are
emitted multiple times.
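To make the discussion above concrete, here is a minimal sketch of a job wiring a Kafka source to the rolling HDFS sink with checkpointing enabled, which is the combination needed for exactly-once output. This is an illustration only: the topic name, connection properties, output path, and checkpoint interval are placeholders, and the class names are the ones shipped with the Flink 1.0.x connectors.

```java
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.fs.RollingSink;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaToHdfsJob {
  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();

    // Checkpointing must be enabled; without it the sink cannot
    // participate in recovery and exactly-once output is lost.
    env.enableCheckpointing(5000); // placeholder interval (ms)

    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
    props.setProperty("zookeeper.connect", "localhost:2181"); // placeholder
    props.setProperty("group.id", "test-group");              // placeholder

    DataStream<String> stream = env.addSource(
        new FlinkKafkaConsumer08<>("my-topic", new SimpleStringSchema(), props));

    // On recovery, RollingSink truncates in-progress files back to the last
    // checkpoint, or writes a .valid-length companion file on file systems
    // that do not support truncate.
    stream.addSink(new RollingSink<String>("hdfs:///tmp/flink-out"));

    env.execute("Kafka to HDFS with exactly-once sink");
  }
}
```

Results emitted after the last completed checkpoint may still appear in the files after a restart; the .valid-length file tells a reader how many bytes of each file are valid.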
Cheers, Fabian
2016-05-13 22:58 GMT+02:00 Madhire, Naveen
<naveen.madh...@capitalone.com<mailto:naveen.madh...@capitalone.com>>:
I checked the JIRA and looks like FLINK-2111 should address the issue which I
am facing. I am canceling the job from dashboard.
I am using kafka source and HDFS rolling sink.
https://issues.apache.org/jira/browse/FLINK-2111
Is this JIRA part of Flink 1.0.0?
Thanks,
Naveen
From: "Madhire,
Hi,
We are trying to test the recovery mechanism of Flink with Kafka and HDFS sink
during failures.
I killed the job after processing some messages and restarted the same job
again. I am seeing that some of the messages are processed more than once,
which does not follow the exactly-once semantics.
2.9.1
0.8.2.0
Since it pulls Scala 2.9, you can probably remove this dependency; our
Kafka connector will pull that Kafka dependency anyway.
On Fri, Mar 4, 2016 at 8:07 PM, Madhire, Naveen
<naveen.madh...@capitalone.com<mailto:naveen.madh...@capitalone.com>> wrote:
Is it possible for you to share the complete dependency section of your project?
Thanks,
Stephan
On Fri, Mar 4, 2016 at 7:34 PM, Madhire, Naveen
<naveen.madh...@capitalone.com<mailto:naveen.madh...@capitalone.com>> wrote:
Thanks Stephan. I am using the new dependency […]
Hey All,
I am getting the below error while executing a simple Kafka-Flink Application.
java.lang.NoClassDefFoundError: org/apache/flink/api/java/operators/Keys
Below are the maven dependencies which I included in my application.
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.9.1</artifactId>
  <version>0.8.2.0</version>
</dependency>
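Following Stephan's advice above, the usual fix is to drop the direct `org.apache.kafka:kafka_2.9.1` dependency and rely on Flink's Kafka connector to pull a compatible Kafka client. A hypothetical pom.xml fragment is sketched below; the exact artifact name and version depend on the Flink release in use (e.g., the Scala-suffixed `_2.10` artifacts appeared with Flink 1.0), so check Maven Central for the coordinates matching your build.

```
<!-- Sketch only: artifact name and version are assumptions for Flink 1.0.x
     against Kafka 0.8; verify against your Flink release. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
  <version>1.0.0</version>
</dependency>
<!-- No separate org.apache.kafka:kafka_2.9.1 dependency: the connector pulls
     a Kafka client transitively, and kafka_2.9.1 drags in Scala 2.9, which
     conflicts with the Scala version Flink itself is built against. -->
```

Mixing Scala versions on one classpath is a common cause of `NoClassDefFoundError`s like the `org/apache/flink/api/java/operators/Keys` one reported above.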
[…]d for him using your program, I am
>a little puzzled what might go wrong in your setup. The program seems to
>be correct.
>
>
>-Matthias
>
>
>On 12/04/2015 08:55 PM, Madhire, Naveen wrote:
>> Hi Max,
>>
>> I forgot to include flink-storm-examples dependency in the
Hi Max,
Yeah, I did route the “count” bolt output to a file and I see the output.
I can see the Storm and Flink output matching.
However, I am not able to use the BoltFileSink class in the 1.0-SNAPSHOT
which I built. I think it’s better to wait for a day for the Maven sync to
happen so that I […]
Hi,
I am trying to execute a few Storm topologies on Flink, and I have a question
related to the documentation.
Can anyone tell me which of the code examples below is correct?
https://ci.apache.org/projects/flink/flink-docs-release-0.10/apis/storm_compatibility.html
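For reference, a minimal sketch of submitting a Storm topology through the compatibility layer, in the style of the 0.10 docs linked above. The spout and bolt classes here (`SentenceSpout`, `WordCountBolt`) are hypothetical stand-ins for your own Storm components, and the API changed between releases (e.g., 0.10 uses `FlinkTopologyBuilder`, while later releases moved to `FlinkTopology`), so match the docs for the exact version you build against.

```java
import org.apache.flink.storm.api.FlinkLocalCluster;
import org.apache.flink.storm.api.FlinkTopologyBuilder;

public class StormOnFlink {
  public static void main(String[] args) throws Exception {
    // Assemble the topology exactly as in Storm, but with Flink's builder.
    FlinkTopologyBuilder builder = new FlinkTopologyBuilder();
    builder.setSpout("source", new SentenceSpout());   // hypothetical spout
    builder.setBolt("count", new WordCountBolt())      // hypothetical bolt
           .shuffleGrouping("source");

    // Run it on an embedded Flink mini-cluster instead of a Storm cluster.
    FlinkLocalCluster cluster = FlinkLocalCluster.getLocalCluster();
    cluster.submitTopology("word-count", null, builder.createTopology());
  }
}
```

The intent of the compatibility layer is that spouts and bolts run unmodified; only the cluster/submission classes change, which is why the docs page shows both the Storm original and the Flink variant side by side.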