Apache IoT

2017-02-03 Thread Trevor Grant
Hey all, This year at ApacheCon Miami, there is going to be a specific track dedicated to IoT and how the Apache Family of Projects can be used in concert to create full pipelines. As Apache Flink is a relevant part of that pipeline for the processing of data coming in from edge devices, I want

Re: Parallelism and Partitioning

2017-02-03 Thread Mohit Anchlia
Any information on this would be helpful. On Thu, Feb 2, 2017 at 5:09 PM, Mohit Anchlia wrote: > What is the granularity of parallelism in Flink? E.g., if I am reading from Kafka -> map -> Sink and I set parallelism 2, does it mean it creates 2 consumer threads and
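The question (truncated above, and answered later in the thread) is about where parallelism applies. As a hedged illustration only: in the DataStream API, parallelism can be set job-wide or per operator, so "parallelism 2" on a Kafka source means two source subtasks (and thus up to two Kafka consumers). A minimal Scala sketch, assuming the Flink 1.2 API and a hypothetical topic name and consumer group:

```scala
import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object ParallelismSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(2) // job-wide default for all operators

    val props = new Properties()
    props.setProperty("bootstrap.servers", "localhost:9092") // hypothetical broker
    props.setProperty("group.id", "demo-group")              // hypothetical group

    env
      .addSource(new FlinkKafkaConsumer09[String]("events", new SimpleStringSchema(), props))
      .setParallelism(2) // two source subtasks -> up to two Kafka consumers
      .map(_.toUpperCase)
      .setParallelism(4) // individual operators may override the job default
      .print()
      .setParallelism(1)

    env.execute("parallelism-sketch")
  }
}
```

Each operator's parallelism is independent; data is redistributed between operators whenever their parallelism differs.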

Re: Clarification on state backend parameters

2017-02-03 Thread Mohit Anchlia
I thought RocksDB is used as a state backend. If that is the case, then why are there 2 configuration parameters? Or, in other words, what is the behavior if both state.backend.fs.checkpointdir and state.backend.rocksdb are set? On Fri, Feb 3, 2017 at 1:47 AM, Stefan Richter

Re: user Digest 2 Feb 2017 14:54:03 -0000 Issue 1703

2017-02-03 Thread Aljoscha Krettek
Hi, KafkaIO should be supported. Have you tried following the Quickstart and adding the right dependency for KafkaIO? Cheers, Aljoscha On Thu, 2 Feb 2017 at 16:32 Boris Lublinsky wrote: > Is KafkaIO supported on Flink Runner? I see the code in Github, but
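For reference, "adding the right dependency" for a Beam quickstart project typically means declaring the Kafka IO module alongside the Flink runner in the pom. The coordinates below are a sketch assuming a Beam 0.5.x-era build; verify the artifact version against your project:

```xml
<!-- Beam Kafka IO (KafkaIO source/sink transforms) -->
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-sdks-java-io-kafka</artifactId>
  <version>0.5.0</version>
</dependency>
<!-- Beam Flink runner -->
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-runners-flink_2.10</artifactId>
  <version>0.5.0</version>
</dependency>
```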

Re: Fink: KafkaProducer Data Loss

2017-02-03 Thread ninad
Thanks, Gordon and Till. -- View this message in context: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Fink-KafkaProducer-Data-Loss-tp11413p11431.html Sent from the Apache Flink User Mailing List archive at Nabble.com.

Re: Kafka data not read in FlinkKafkaConsumer09 second time from command line

2017-02-03 Thread Aljoscha Krettek
Ok, thanks for letting us know! On Thu, 2 Feb 2017 at 17:40 Sujit Sakre wrote: > Implementing this formula seems to have solved our problem now. Thanks. On 2 February 2017 at 21:21, Sujit Sakre wrote: > Hi Aljoscha, Thanks

Re: allowed lateness on windowed join?

2017-02-03 Thread Aljoscha Krettek
Hi, I'm afraid that's not possible, but you can use a regular stream and do the join yourself. What the code for JoinedStreams essentially does is take two streams, map them to a common data type, union them, and then perform a normal window operation. The code for this is in CoGroupedStreams (as
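A hedged sketch of the do-it-yourself join Aljoscha describes: map both streams to a common type, union them, key and window the result with the allowed lateness that JoinedStreams does not expose, and separate the two sides again inside the window function. The record types, the `Either` wrapping, and the window sizes below are my choices for illustration, not the list's:

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.api.windowing.windows.TimeWindow
import org.apache.flink.util.Collector

// Assumed record types; replace with your own.
case class Click(userId: String, url: String)
case class Purchase(userId: String, amount: Double)

def joinWithLateness(clicks: DataStream[Click],
                     purchases: DataStream[Purchase]): DataStream[(Click, Purchase)] = {
  // 1) Map both streams to a common (key, Either) type and union them.
  val unioned: DataStream[(String, Either[Click, Purchase])] =
    clicks.map(c => (c.userId, Left(c): Either[Click, Purchase]))
      .union(purchases.map(p => (p.userId, Right(p): Either[Click, Purchase])))

  // 2) Key, window, and set allowed lateness -- the knob JoinedStreams lacks.
  unioned
    .keyBy(_._1)
    .window(TumblingEventTimeWindows.of(Time.minutes(5)))
    .allowedLateness(Time.minutes(1))
    .apply { (key: String,
              window: TimeWindow,
              elements: Iterable[(String, Either[Click, Purchase])],
              out: Collector[(Click, Purchase)]) =>
      // 3) Separate the sides and emit the cross product per key and window.
      val (lefts, rights) = elements.toSeq.partition(_._2.isLeft)
      for ((_, Left(c)) <- lefts; (_, Right(p)) <- rights) out.collect((c, p))
    }
}
```

Late elements within the allowed lateness re-fire the window, so downstream operators must tolerate updated join results.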

Re: Restoring from an external checkpoint / savepoint with a local stream environment

2017-02-03 Thread Ufuk Celebi
Hey Kathleen, this is not supported "natively" in the APIs, but the following workaround is possible: int jobManagerPort = 6123; int numTaskSlots = 8; String savepointPath = "/Users/uce/Desktop/savepoint-xAsd"; Configuration config = new Configuration();
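The message is truncated, so the continuation below is my reconstruction, not Ufuk's: a Scala sketch of the likely shape of the workaround, assuming Flink 1.2's `LocalFlinkMiniCluster`, `SavepointRestoreSettings`, and manual `JobGraph` submission:

```scala
import org.apache.flink.configuration.{ConfigConstants, Configuration}
import org.apache.flink.runtime.jobgraph.SavepointRestoreSettings
import org.apache.flink.runtime.minicluster.LocalFlinkMiniCluster
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment

object LocalSavepointRestore {
  def main(args: Array[String]): Unit = {
    val jobManagerPort = 6123
    val numTaskSlots = 8
    val savepointPath = "/Users/uce/Desktop/savepoint-xAsd"

    val config = new Configuration()
    config.setInteger(ConfigConstants.JOB_MANAGER_IPC_PORT_KEY, jobManagerPort)
    config.setInteger(ConfigConstants.TASK_MANAGER_NUM_TASK_SLOTS, numTaskSlots)

    // Start a local mini cluster instead of a packaged deployment.
    val cluster = new LocalFlinkMiniCluster(config, false)
    cluster.start()

    // Build the job as usual, then attach the savepoint to the JobGraph.
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // ... build your topology on env ...
    val jobGraph = env.getStreamGraph.getJobGraph
    jobGraph.setSavepointRestoreSettings(SavepointRestoreSettings.forPath(savepointPath))
    cluster.submitJobAndWait(jobGraph, false)
  }
}
```

The key step is setting the restore settings on the `JobGraph` before submitting it to the mini cluster, which is what a normal `env.execute()` does not let you do locally.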

Re: Compiler error while using 'CsvTableSource'

2017-02-03 Thread nsengupta
Till, Many thanks. Just to confirm that it is working fine at my end, here's a screenshot. This is Flink 1.1.4, but Flink 1.2/Flink 1.3 shouldn't be any problem. It never struck me that lack

Restoring from an external checkpoint / savepoint with a local stream environment

2017-02-03 Thread Kathleen Sharp
Hi, is it possible to restore from an external checkpoint / savepoint while using a local stream environment? I ask because I want to play around with some concepts from within my IDE, without building a jar and deploying my job. Thanks, Kat --

Re: Compiler error while using 'CsvTableSource'

2017-02-03 Thread Till Rohrmann
This should do the trick: val csvTableSource = new CsvTableSource("foobar", Array("base"), Array[org.apache.flink.api.common.typeinfo.TypeInformation[_]](Types.STRING)). The problem is that arrays in Scala are not covariant. Cheers, Till On Fri, Feb 3, 2017 at 6:39 AM, nsengupta
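Till's point about invariance can be shown without Flink at all: a Scala `Array[Sub]` is not accepted where an `Array[Super]` is expected, so the element type has to be named up front, exactly as in the `Array[TypeInformation[_]](...)` fix above. A minimal stand-alone sketch (`Info` is a made-up stand-in for `TypeInformation`):

```scala
// A stand-in for TypeInformation[T]; any generic class shows the same behavior.
class Info[T](val name: String)

def describe(infos: Array[Info[_]]): String = infos.map(_.name).mkString(",")

val specific: Array[Info[String]] = Array(new Info[String]("STRING"))
// describe(specific)  // does NOT compile: Scala arrays are invariant
val widened: Array[Info[_]] = Array[Info[_]](new Info[String]("STRING"))
println(describe(widened)) // prints "STRING"
```

Spelling out `Array[Info[_]]` at construction makes the array's element type the wildcard type itself, so no (unsound) covariant conversion is needed.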

Re: Clarification on state backend parameters

2017-02-03 Thread Stefan Richter
Hi, the purpose of these configuration parameters is described in the documentation under https://ci.apache.org/projects/flink/flink-docs-release-1.2/setup/config.html . In a nutshell, state.checkpoints.dir contains
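A sketch of how these directories can appear together in flink-conf.yaml for a RocksDB setup; the key names are taken from the Flink 1.2 configuration page linked above, and the paths are placeholders:

```yaml
# Which backend holds operator state.
state.backend: rocksdb

# Where checkpoint data is written (shared by the fs and rocksdb backends).
state.backend.fs.checkpointdir: hdfs:///flink/checkpoints

# Target directory for externalized checkpoint metadata.
state.checkpoints.dir: hdfs:///flink/ext-checkpoints
```

This is why both a RocksDB setting and an fs checkpoint directory can be set at once: one selects the backend, the others say where its checkpoint data and metadata go.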