Re: Silly keyBy() error

2016-03-12 Thread Ron Crocker
Thanks Stefano - That helped, but just led to different pain. I think I need to reconsider how I treat these things. Alas, the subject of a different thread. Ron — Ron Crocker, Principal Engineer & Architect, New Relic rcroc...@newrelic.com M: +1 630 363 8835 > On Mar 12, 2016, at 12:11

Re: Silly keyBy() error

2016-03-12 Thread Stefano Baghino
Hi Ron, not all classes can be used to key a stream with `keyBy`. In your case in particular, it looks like you have to implement Comparable so that Flink can correctly key your stream based on AggregatableTimesliceImpl. Take a look at the first slides here for more information on keying:
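To illustrate the suggestion above: a type used as a key must provide consistent `hashCode`/`equals`, and a generic (non-POJO) type must additionally implement `Comparable` for Flink to accept it as a key. The class below is a hypothetical sketch (the thread's actual `AggregatableTimesliceImpl` is not shown); `TimesliceKey` and its fields are invented for illustration.

```scala
// Hypothetical key class; a Scala case class supplies equals/hashCode, and
// implementing Comparable makes it usable as a key even when Flink treats
// it as a GenericType rather than a POJO.
case class TimesliceKey(metricId: Long, sliceStart: Long)
    extends Comparable[TimesliceKey] {
  override def compareTo(other: TimesliceKey): Int = {
    // Order by metricId first, then by slice start time.
    val byMetric = java.lang.Long.compare(metricId, other.metricId)
    if (byMetric != 0) byMetric
    else java.lang.Long.compare(sliceStart, other.sliceStart)
  }
}
```

With a key type like this, keying by a selector function (for example, `stream.keyBy(t => TimesliceKey(t.id, t.start))`) should avoid the "This type (GenericType) cannot be used as key" error.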

Silly keyBy() error

2016-03-12 Thread Ron Crocker
I’m sure this should work, but I’m missing something… I searched the archive first, but didn’t have much luck finding any insights there. TL;DR: org.apache.flink.api.common.InvalidProgramException: This type (GenericType) cannot be used as key. I’m just getting started with a 1.0

Re: SourceFunction Scala

2016-03-12 Thread Stefano Baghino
Hi Ankur, I'm catching up with this week's mailing list right now; I hope you already solved the issue, but if you haven't, this kind of problem happens when the Scala version you use is not the one your Flink dependencies were compiled for. Make sure you append the correct Scala version to the
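As a sketch of that advice in an sbt build definition (version numbers here are illustrative, not taken from the thread): the `%%` operator appends the project's Scala binary version to the artifact name, so the Flink jars pulled in match the Scala version you compile against.

```scala
// build.sbt sketch -- versions are examples only.
// "org.apache.flink" %% "flink-scala" resolves to the artifact
// flink-scala_2.11 when scalaVersion is 2.11.x.
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"           % "1.0.0",
  "org.apache.flink" %% "flink-streaming-scala" % "1.0.0"
)
```

Mixing a `_2.10` Flink artifact with a 2.11 project (or vice versa) typically surfaces as `NoSuchMethodError` or similar binary-incompatibility failures at runtime.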

Re: Flink packaging makes life hard for SBT fat jar's

2016-03-12 Thread Till Rohrmann
Great to hear Shikhar :-) Cheers, Till On Mar 4, 2016 3:51 AM, "shikhar" wrote: > Thanks Till. I can confirm that things are looking good with RC5. > sbt-assembly works well with the flink-kafka connector dependency not marked as "provided".

Behavior of SlidingProcessingTimeWindow with CountTrigger

2016-03-12 Thread Vishnu Viswanath
Hi All, I have the below code:

val sev = StreamExecutionEnvironment.getExecutionEnvironment
val socTextStream = sev.socketTextStream("localhost",)
val counts = socTextStream
  .flatMap { _.split("\\s") }
  .map { (_, 1) }
  .keyBy(0)
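The snippet above is cut off by the archive, but the window-assignment behavior the subject asks about can be sketched without Flink. The helper below (`windowsFor` is an invented name, not Flink API) computes which sliding processing-time windows an element falls into; with a window size of 10s and a slide of 5s, each element belongs to size/slide = 2 overlapping windows, and a CountTrigger then fires each of those windows independently once that window has seen the configured number of elements.

```scala
// Sketch (not Flink code): the sliding windows, as (start, end) pairs in
// milliseconds, that contain an element with the given timestamp. This
// mirrors how overlapping sliding windows are laid out: window starts are
// aligned to multiples of the slide, working backwards from the latest
// start that still covers the timestamp.
def windowsFor(ts: Long, sizeMs: Long, slideMs: Long): Seq[(Long, Long)] = {
  val lastStart = ts - (ts % slideMs)
  (lastStart until (ts - sizeMs) by -slideMs).map(s => (s, s + sizeMs))
}
```

For example, an element at t = 12s with a 10s window sliding by 5s lands in the windows [5s, 15s) and [10s, 20s), so a count-based trigger will count that element toward both.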

Re: Flink and YARN ship folder

2016-03-12 Thread Andrea Sella
Hi Ufuk, I'm trying to execute the WordCount batch example with input and output on Alluxio; I followed "Running Flink on Alluxio" and added the library to the lib folder. Do I have to replicate this operation on the slaves or YARN

Re: External DB as sink - with processing guarantees

2016-03-12 Thread Josh
Hi Fabian, Thanks, that's very helpful. Actually most of my writes will be idempotent so I guess that means I'll get the exactly-once guarantee using the Hadoop output format! Thanks, Josh > On 12 Mar 2016, at 09:14, Fabian Hueske wrote: > > Hi Josh, > > Flink can

Re: External DB as sink - with processing guarantees

2016-03-12 Thread Josh
Thanks Nick, that sounds good. I would still like to have an understanding of what determines the processing guarantee though. Say I use a DynamoDB Hadoop OutputFormat with Flink, how do I know what guarantee I have? And if it's at-least-once, is there a way to adapt it to achieve exactly-once?