Re: Stopping a kafka consumer gracefully (no losing of inflight events, StoppableFunction)

2018-02-20 Thread Bart Kastermans
…functionality to the engine in order to properly stop and take a savepoint. Cheers, Till. On Mon, Feb 19, 2018 at 3:36 PM, Bart Kastermans wrote: > In https://ci.apache.org/projects/flink/flink-docs-release-1.4/ops/cli.html …

Stopping a kafka consumer gracefully (no losing of inflight events, StoppableFunction)

2018-02-19 Thread Bart Kastermans
In https://ci.apache.org/projects/flink/flink-docs-release-1.4/ops/cli.html it is shown that for gracefully stopping a job you need to implement the StoppableFunction interface. This appears not (yet) to be implemented for the Kafka consumers. Am I missing something, or is there a different way to grace…
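For context, a minimal sketch of what the StoppableFunction contract looks like for a custom source (the class and field names here are hypothetical; as the question notes, the Flink Kafka consumers themselves do not implement this interface in 1.4):

```scala
import org.apache.flink.api.common.functions.StoppableFunction
import org.apache.flink.streaming.api.functions.source.SourceFunction

// Illustrative source: stop() asks the source to finish cleanly, so that
// `flink stop <jobId>` can end the job without losing in-flight records,
// while cancel() is the hard path used by `flink cancel <jobId>`.
class StoppableCounterSource extends SourceFunction[Long] with StoppableFunction {
  @volatile private var isRunning = true
  private var counter = 0L

  override def run(ctx: SourceFunction.SourceContext[Long]): Unit = {
    while (isRunning) {
      // Emit under the checkpoint lock so emission and state stay consistent.
      ctx.getCheckpointLock.synchronized {
        counter += 1
        ctx.collect(counter)
      }
    }
  }

  // Graceful stop: let run() drain and return on its own.
  override def stop(): Unit = { isRunning = false }

  // Cancellation: same flag here, but no graceful-drain guarantee is implied.
  override def cancel(): Unit = { isRunning = false }
}
```

Only sources implementing this interface can be targeted by `flink stop`; for everything else (including the Kafka consumer in 1.4), `flink cancel -s <savepointDir>` (cancel with savepoint) is the usual workaround.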

akka.pattern.AskTimeoutException: Ask timed out (after upgrading to flink 1.4.0)

2018-01-22 Thread Bart Kastermans
I have upgraded to flink-1.4.0, with just a local task manager and job manager (flink/bin/start-cluster.sh). After solving the dependency issues, I now get the below error consistently on a specific job. As this means absolutely nothing to me (other than that I realise Flink uses Akka), I have no idea w…
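A common first step for AskTimeoutException on job submission is raising the Akka ask timeout in flink-conf.yaml; a sketch of the relevant settings (the values shown are illustrative, not recommendations — the 1.4 default for akka.ask.timeout is 10 s):

```yaml
# flink-conf.yaml: give internal Akka asks and the web frontend more time
# before a submission or status request is reported as timed out.
akka.ask.timeout: 60 s
web.timeout: 60000
```

If the timeout persists regardless of the value, the timeout is usually a symptom of something else (e.g. the job manager stuck or failing during job graph construction), and the job manager log is the place to look next.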

Classpath for execution of KafkaSerializer/Deserializer; java.lang.NoClassDefFoundError while class in job jar

2017-11-29 Thread Bart Kastermans
I have a custom serializer for writing/reading from kafka. I am setting this up in main with code as follows: val kafkaConsumerProps = new Properties() kafkaConsumerProps.setProperty("bootstrap.servers", kafka_bootstrap) kafkaConsumerProps.setProperty("group.id", s"normalize-call-even…
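One likely cause of a NoClassDefFoundError in this setup: a serializer class name passed via the Kafka consumer Properties (`key.deserializer` / `value.deserializer`) is instantiated by the Kafka client with a classloader that may not see the user job jar. A sketch of the usual Flink-side alternative, where deserialization is done by a `DeserializationSchema` shipped with the job jar instead (the event type, topic, and group id below are placeholders, not the original's values):

```scala
import java.nio.charset.StandardCharsets
import java.util.Properties
import org.apache.flink.api.common.serialization.AbstractDeserializationSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011

// Hypothetical event type for illustration.
case class CallEvent(raw: String)

// The schema class lives in the job jar and is loaded by Flink's user-code
// classloader, avoiding the Kafka-client classloading path entirely.
class CallEventSchema extends AbstractDeserializationSchema[CallEvent] {
  override def deserialize(bytes: Array[Byte]): CallEvent =
    CallEvent(new String(bytes, StandardCharsets.UTF_8))
}

val kafkaConsumerProps = new Properties()
kafkaConsumerProps.setProperty("bootstrap.servers", "localhost:9092") // kafka_bootstrap in the original
kafkaConsumerProps.setProperty("group.id", "example-group")           // placeholder group id
// Note: no key.deserializer / value.deserializer properties are set here.

val consumer =
  new FlinkKafkaConsumer011[CallEvent]("example-topic", new CallEventSchema, kafkaConsumerProps)
```

If the Kafka-native Deserializer must be kept, the class typically has to be on the cluster classpath (e.g. in Flink's lib/ directory) rather than only in the job jar.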

Database connection from job

2017-08-24 Thread Bart Kastermans
I am using the scala api for Flink, and am trying to set up a JDBC database connection in my job (on every incoming event I want to query the database to get some data to enrich the event). Because of the serialization and deserialization of the code as it is sent from the flink master to the flin…
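The standard pattern for this is a rich function that creates the (non-serializable) connection in `open()`, which runs on the task manager after the function has been deserialized, rather than in `main()`. A sketch, with the event type, table, and column names invented for illustration:

```scala
import java.sql.{Connection, DriverManager}
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration

// Hypothetical event types for illustration.
case class Event(userId: String, payload: String)
case class EnrichedEvent(event: Event, userName: String)

class EnrichFromDb(jdbcUrl: String) extends RichMapFunction[Event, EnrichedEvent] {
  // @transient: the Connection is never serialized with the function; it is
  // created fresh on each parallel instance in open().
  @transient private var conn: Connection = _

  override def open(parameters: Configuration): Unit = {
    conn = DriverManager.getConnection(jdbcUrl)
  }

  override def map(e: Event): EnrichedEvent = {
    val stmt = conn.prepareStatement("SELECT name FROM users WHERE id = ?")
    stmt.setString(1, e.userId)
    val rs = stmt.executeQuery()
    val name = if (rs.next()) rs.getString(1) else "unknown"
    rs.close(); stmt.close()
    EnrichedEvent(e, name)
  }

  override def close(): Unit = if (conn != null) conn.close()
}
```

Only the `jdbcUrl` string is serialized and shipped; the connection itself exists only on the workers. For a per-event database lookup, a synchronous query like this blocks the stream, so Flink's AsyncDataStream / AsyncFunction API is often the better fit at higher throughput.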