Re: Guava version conflict

2017-06-19 Thread Tzu-Li (Gordon) Tai
Thanks a lot! Please keep me updated with this :) On 19 June 2017 at 6:33:15 PM, Flavio Pompermaier (pomperma...@okkam.it) wrote: Ok, I'll let you know as soon as I recompile Flink 1.3.x. Thanks, Flavio On Mon, Jun 19, 2017 at 7:26 AM, Tzu-Li (Gordon) Tai wrote: Hi

Re: How can I get last successful checkpoint id in sink?

2017-06-19 Thread Tzu-Li (Gordon) Tai
Hi! The last completed checkpoint ID should be obtainable using the monitoring REST API [1], under the URL “/jobs/{jobID}/checkpoints/”. It is also visible in the JobManager Web UI under the “checkpoints” tab of each job. The web UI fetches its information using the monitoring REST API, so
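A minimal sketch of querying that endpoint, assuming the JobManager's web/REST interface listens on localhost:8081 and <jobId> is a placeholder; the exact JSON layout of the response depends on the Flink version, so inspect it before parsing:

    import scala.io.Source

    // Placeholder job ID and JobManager address; replace with real values.
    val jobId = "<jobId>"
    val url = s"http://localhost:8081/jobs/$jobId/checkpoints/"

    // Fetch the checkpoint statistics as raw JSON; the last completed
    // checkpoint ID is contained in this response.
    val json = Source.fromURL(url).mkString
    println(json)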

Re: Possible Data Corruption?

2017-06-19 Thread Ted Yu
See this thread: http://search-hadoop.com/m/Flink/VkLeQm2nZm1Wa7Ny1?subj=Re+Painful+KryoException+java+lang+IndexOutOfBoundsException+on+Flink+Batch+Api+scala which mentions FLINK-6398, fixed in 1.2.2 / 1.3. On Mon, Jun 19, 2017 at 5:53 PM,

Possible Data Corruption?

2017-06-19 Thread Philip Doctor
Dear Flink Users, I have a Flink (v1.2.1) process I left running for the last five days. It aggregates a bit of state and exposes it via Queryable State. It ran correctly for the first 3 days. There were no code changes or data changes, but suddenly Queryable State got weird. The process

Re: Window + Reduce produces more than 1 output per window

2017-06-19 Thread FRANCISCO BORJA ROBLES MARTIN
Hello Piotrek! Thanks for answering! Yes, I have already changed the "TimeCharacteristic" to "ProcessingTime". I need it for the ".setWriteTimestampToKafka(true)" option, as I use the timestamp in the Kafka consumer that reads this app's output. I have already changed the code a bit for using
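For reference, a minimal sketch of the two settings being discussed, assuming Flink 1.3 with the Kafka 0.10 connector; the broker address, topic name, and the String-typed output stream are placeholders, and the stream is passed as the connector's Java DataStream type because that is what writeToKafkaWithTimestamps expects:

    import java.util.Properties

    import org.apache.flink.streaming.api.TimeCharacteristic
    import org.apache.flink.streaming.api.datastream.{DataStream => JavaDataStream}
    import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema

    def attachKafkaSink(env: StreamExecutionEnvironment,
                        output: JavaDataStream[String]): Unit = {
      // processing-time semantics, as mentioned in the reply
      env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime)

      val props = new Properties()
      props.setProperty("bootstrap.servers", "localhost:9092")

      // writeToKafkaWithTimestamps returns a producer configuration on which
      // writing the record timestamp into Kafka can be switched on
      val producerConfig = FlinkKafkaProducer010.writeToKafkaWithTimestamps(
        output, "output-topic", new SimpleStringSchema(), props)
      producerConfig.setWriteTimestampToKafka(true)
    }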

Re: Window + Reduce produces more than 1 output per window

2017-06-19 Thread Piotr Nowojski
Hi, It is difficult for me to respond fully to your question. First of all, it would be really useful if you could strip down your example to a minimal version that shows the problem. Unfortunately, I was unable to reproduce your issue. I was getting only one output line per window (as expected).

Re: How choose between YARN/Mesos/StandAlone Flink

2017-06-19 Thread AndreaKinn
OK, I understand that standalone mode would be sufficient, but for my thesis I would like to set up a well-performing, ready-to-use infrastructure. My workload is not heavy, about 35 million messages a day (35 GB), but it should be easily expandable and able to run for many days... due to this I would

Re: FlinkCEP 1.3 scala, cannot apply select function

2017-06-19 Thread Sonex
Thank you for your quick response. That worked and compiled, but another error came up. At runtime it gives the following error: java.lang.ClassCastException: MyEventType cannot be cast to scala.collection.IterableLike. The error is at the line val startEvent = pattern.get("first").get.head of

Re: How to sessionize stream with Apache Flink?

2017-06-19 Thread Fabian Hueske
An alternative would be to use a FlatMapFunction with a ListState instead of a window with a custom trigger. When a new element arrives (i.e., the flatMap() method is called), you check if the value changed. If the value did not change, you append the element to the state. If the value changed,
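A minimal sketch of that approach, assuming a hypothetical Event(key, value, payload) type where a change of value closes the session; the function must run on a keyed stream so the ListState is scoped per key:

    import org.apache.flink.api.common.functions.RichFlatMapFunction
    import org.apache.flink.api.common.state.{ListState, ListStateDescriptor, ValueState, ValueStateDescriptor}
    import org.apache.flink.configuration.Configuration
    import org.apache.flink.util.Collector

    import scala.collection.JavaConverters._

    // Hypothetical event type: `key` partitions the stream, a change of
    // `value` closes the current session.
    case class Event(key: String, value: String, payload: Long)

    class SessionizeFlatMap extends RichFlatMapFunction[Event, Seq[Event]] {

      private var buffer: ListState[Event] = _
      private var lastValue: ValueState[String] = _

      override def open(parameters: Configuration): Unit = {
        buffer = getRuntimeContext.getListState(
          new ListStateDescriptor[Event]("session-buffer", classOf[Event]))
        lastValue = getRuntimeContext.getState(
          new ValueStateDescriptor[String]("last-value", classOf[String]))
      }

      override def flatMap(in: Event, out: Collector[Seq[Event]]): Unit = {
        val last = lastValue.value()
        if (last != null && last != in.value) {
          // the value changed: emit the buffered session and start a new one
          out.collect(buffer.get().asScala.toSeq)
          buffer.clear()
        }
        buffer.add(in)
        lastValue.update(in.value)
      }
    }

It would be applied as stream.keyBy(_.key).flatMap(new SessionizeFlatMap); unlike a window with a custom trigger, the last buffered session is only emitted once the value actually changes.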

Re: DataSet: combineGroup/reduceGroup with large number of groups

2017-06-19 Thread Fabian Hueske
Hi Urs, ad 1) Yes, my motivation for the bound was to prevent OOMEs. If you have enough memory to hold the AggregateT for each key in memory, you should be fine without a bound. If the size of AggregateT depends on the number of aggregated elements, you might run into skew issues though. ad 2)
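To illustrate the kind of bound discussed in 1), a sketch of a partition-local pre-aggregation that keeps at most maxKeys partial aggregates in memory and flushes them when the bound is hit; Event and AggregateT are hypothetical types, and the emitted partials still need a final groupBy/reduce to be merged:

    import org.apache.flink.api.scala._
    import org.apache.flink.util.Collector

    import scala.collection.mutable

    // Hypothetical input and aggregate types.
    case class Event(key: String, value: Long)
    case class AggregateT(key: String, sum: Long, count: Long)

    def boundedPreAggregate(ds: DataSet[Event], maxKeys: Int): DataSet[AggregateT] =
      ds.mapPartition { (events: Iterator[Event], out: Collector[AggregateT]) =>
        val partials = mutable.Map.empty[String, AggregateT]
        for (e <- events) {
          val agg = partials.getOrElse(e.key, AggregateT(e.key, 0L, 0L))
          partials(e.key) = agg.copy(sum = agg.sum + e.value, count = agg.count + 1)
          if (partials.size > maxKeys) {
            // bound reached: emit the partial aggregates and free the memory
            partials.values.foreach(out.collect(_))
            partials.clear()
          }
        }
        // flush whatever is left at the end of the partition
        partials.values.foreach(out.collect(_))
      }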

Re: FlinkCEP 1.3 scala, cannot apply select function

2017-06-19 Thread Dawid Wysakowicz
Hi, Because of some optimizations in the Java <-> Scala collection conversions, the type of Map used for the select method is scala.collection.Map instead of the Predef.Map imported by default. Try importing: import scala.collection.Map or use the fully qualified name in the function definition: def
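A minimal sketch of the corrected signature, with a stand-in MyEventType just to make it compile:

    import scala.collection.Map

    // Stand-in for the poster's event type, only so the sketch is self-contained.
    case class MyEventType(id: String)

    // Using scala.collection.Map (not the default Predef.Map) matches the
    // type that FlinkCEP's Scala select() passes to the function.
    def myFunction(pattern: Map[String, Iterable[MyEventType]]): MyEventType = {
      pattern.get("first").get.head
    }

Alternatively, keep the default imports and write the parameter type as scala.collection.Map[String, Iterable[MyEventType]].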

FlinkCEP 1.3 scala, cannot apply select function

2017-06-19 Thread Sonex
Hello, I have created a simple pattern with FlinkCEP 1.3 as well as a simple pattern select function. My simple function is as follows: def myFunction(pattern: Map[String,Iterable[MyEventType]]): MyEventType = { val startEvent = pattern.get("first").get.head val endEvent =