Re: [SURVEY] Remove Mesos support

2020-10-26 Thread Piyush Narang
help is much appreciated. We'll let you know once there is something. On Fri, Oct 23, 2020 at 4:28 PM Piyush Narang <p.nar...@criteo.com> wrote: Thanks Kostas. If there are items we can help with, I'm sure we'd be able to find folks who would be excited to contribute / help in a

Re: [SURVEY] Remove Mesos support

2020-10-23 Thread Piyush Narang
the previous opinions that we cannot drop code that is actively used by users, especially if it is something as deep in the stack as support for a cluster management framework. Cheers, Kostas On Fri, Oct 23, 2020 at 4:15 PM Piyush Narang wrote:

Re: [SURVEY] Remove Mesos support

2020-10-23 Thread Piyush Narang
Hi folks, We at Criteo are active users of the Flink on Mesos resource management component. We are pretty heavy users of Mesos for scheduling workloads on our edge datacenters and we do want to continue to be able to run some of our Flink topologies (to compute machine learning short term

Re: RichFunctions in Flink's Table / SQL API

2020-09-23 Thread Piyush Narang
able/flink-table-common/src/main/java/org/apache/flink/table/api/dataview/MapView.java [2] https://issues.apache.org/jira/browse/FLINK-19371 On 22.09.20 20:28, Piyush Narang wrote: > Hi folks, > > We were looking to cache some data using Flink’s MapState in one
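The MapView referenced in [1] is normally used as a field of the UDAF accumulator so the planner can back it with state where supported. A hedged sketch of that pattern follows; the class and field names are illustrative and not taken from the thread.

    import org.apache.flink.table.api.dataview.MapView
    import org.apache.flink.table.functions.AggregateFunction

    // Hypothetical accumulator: the MapView field can be state-backed by the planner.
    class DedupAcc {
      var seen: MapView[String, java.lang.Boolean] = new MapView[String, java.lang.Boolean]()
      var count: Long = 0L
    }

    class DistinctCount extends AggregateFunction[Long, DedupAcc] {
      override def createAccumulator(): DedupAcc = new DedupAcc

      // Count each key only once, remembering seen keys in the MapView.
      def accumulate(acc: DedupAcc, key: String): Unit = {
        if (key != null && !acc.seen.contains(key)) {
          acc.seen.put(key, java.lang.Boolean.TRUE)
          acc.count += 1
        }
      }

      override def getValue(acc: DedupAcc): Long = acc.count
    }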

RichFunctions in Flink's Table / SQL API

2020-09-22 Thread Piyush Narang
Hi folks, We were looking to cache some data using Flink’s MapState in one of our UDFs that are called by Flink SQL queries. I was trying to see if there’s a way to set up these state objects via the basic FunctionContext [1] we’re provided in the Table / SQL UserDefinedFunction class [2] but
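For reference, the FunctionContext passed to open() exposes the metric group, distributed-cache files, and job parameters, but not keyed state such as MapState. A small illustrative sketch of what it does provide (class and metric names are hypothetical):

    import org.apache.flink.metrics.Counter
    import org.apache.flink.table.functions.{FunctionContext, ScalarFunction}

    class CachedLookup extends ScalarFunction {
      @transient private var hits: Counter = _

      // open() receives the FunctionContext; here it only registers a counter.
      override def open(context: FunctionContext): Unit = {
        hits = context.getMetricGroup.counter("cacheHits")
      }

      def eval(key: String): String = {
        hits.inc()
        key.toUpperCase // placeholder logic
      }
    }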

Re: Flink checkpointing with Azure block storage

2020-08-24 Thread Piyush Narang
We had something like this when we were setting it in our code (now we’re passing it via config). There’s likely a better / cleaner way: private def configureCheckpoints(env: StreamExecutionEnvironment, checkpointPath: String): Unit = { if
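Since the snippet above is cut off mid-function, here is a minimal sketch of what such a helper can look like; the checkpoint interval, checkpointing mode, and example URI are assumptions, not the poster's actual values.

    import org.apache.flink.runtime.state.filesystem.FsStateBackend
    import org.apache.flink.streaming.api.CheckpointingMode
    import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

    object CheckpointSetup {
      def configureCheckpoints(env: StreamExecutionEnvironment, checkpointPath: String): Unit = {
        if (checkpointPath.nonEmpty) {
          // e.g. "wasbs://<container>@<account>.blob.core.windows.net/flink/checkpoints"
          env.setStateBackend(new FsStateBackend(checkpointPath))
          env.enableCheckpointing(60000L, CheckpointingMode.EXACTLY_ONCE)
        }
      }
    }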

Re: Understanding n LIST calls as part of checkpointing

2020-03-09 Thread Piyush Narang
-- Piyush From: Yun Tang Date: Sunday, March 8, 2020 at 11:05 PM To: Piyush Narang, user Subject: Re: Understanding n LIST calls as part of checkpointing Hi Piyush, Which version of Flink do you use? After Flink-1.5, Flink would not call any "List" operation on checkpoint side with FLI

Understanding n LIST calls as part of checkpointing

2020-03-06 Thread Piyush Narang
Hi folks, I was trying to debug a job which was taking 20-30s to checkpoint data to Azure FS (compared to typically < 5s) and as part of doing so, I noticed something that I was trying to figure out a bit better. Our checkpoint path is as follows:
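As an illustration only (the actual path in the message above is cut off), a checkpoint directory on Azure Blob Storage is typically configured along these lines in flink-conf.yaml; the container and account names are placeholders.

    state.backend: filesystem
    state.checkpoints.dir: wasbs://<container>@<account>.blob.core.windows.net/flink/checkpoints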

Re: [DISCUSS] Adding e2e tests for Flink's Mesos integration

2019-12-06 Thread Piyush Narang
+1 from our end as well. At Criteo, we are running some Flink jobs on Mesos in production to compute short term features for machine learning. We’d love to help out and contribute on this initiative. Thanks, -- Piyush From: Till Rohrmann Date: Friday, December 6, 2019 at 8:10 AM To: dev Cc:

Re: Propagating event time field from nested query

2019-11-14 Thread Piyush Narang
somehow. Best, Dawid On 11/11/2019 22:43, Piyush Narang wrote: Hi folks, We have a Flink streaming Table / SQL job that we were looking to migrate from an older Flink release (1.6.x) to 1.9. As part of doing so, we have been seeing a few errors which I was trying to figure out how to work around

Propagating event time field from nested query

2019-11-11 Thread Piyush Narang
Hi folks, We have a Flink streaming Table / SQL job that we were looking to migrate from an older Flink release (1.6.x) to 1.9. As part of doing so, we have been seeing a few errors which I was trying to figure out how to work around. Would appreciate any help / pointers. Job essentially
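As a rough illustration of the pattern under discussion: the inner query has to keep projecting the event-time (rowtime) column so the outer windowed aggregation can still reference it. Table and column names below are assumptions, not the job from the thread, and the source table is assumed to be registered elsewhere.

    import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
    import org.apache.flink.table.api.scala.StreamTableEnvironment

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tableEnv = StreamTableEnvironment.create(env)
    // Assumes raw_events is registered with eventTime as its rowtime attribute.
    val result = tableEnv.sqlQuery(
      """
        |SELECT
        |  userId,
        |  TUMBLE_END(eventTime, INTERVAL '10' MINUTE) AS windowEnd,
        |  COUNT(*) AS eventCount
        |FROM (
        |  SELECT userId, eventTime FROM raw_events WHERE eventType = 'click'
        |)
        |GROUP BY userId, TUMBLE(eventTime, INTERVAL '10' MINUTE)
      """.stripMargin)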

Re: open() setup method not being called for AggregateFunctions?

2019-06-26 Thread Piyush Narang
can wrap the returned values in a UDF and capture metrics on that, but I’d love something cleaner. Thanks, -- Piyush From: Piyush Narang Date: Tuesday, June 25, 2019 at 3:16 PM To: "user@flink.apache.org" Subject: open() setup method not being called for AggregateFunctions? Hi fo

open() setup method not being called for AggregateFunctions?

2019-06-25 Thread Piyush Narang
Hi folks, I’ve tried to create some Flink UDAFs that I’m invoking using the Table / SQL API. In these UDAFs I’ve overridden the open() method to perform some setup operations (in my case initialize some metric counters). I noticed that this open() function isn’t being invoked in either the
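For context, the kind of UDAF being described looks roughly like the sketch below (class, accumulator, and metric names are illustrative); the thread reports that the open() hook shown here was not being invoked on some execution paths.

    import org.apache.flink.metrics.Counter
    import org.apache.flink.table.functions.{AggregateFunction, FunctionContext}

    // Hypothetical mutable accumulator for the sketch.
    class CountAcc {
      var count: Long = 0L
    }

    class CountingUdaf extends AggregateFunction[Long, CountAcc] {
      @transient private var rowsSeen: Counter = _

      // The setup step the thread describes: registering a metric counter in open().
      override def open(context: FunctionContext): Unit = {
        rowsSeen = context.getMetricGroup.counter("rowsSeen")
      }

      override def createAccumulator(): CountAcc = new CountAcc

      def accumulate(acc: CountAcc, value: String): Unit = {
        if (rowsSeen != null) rowsSeen.inc()
        acc.count += 1
      }

      override def getValue(acc: CountAcc): Long = acc.count
    }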

Re: Clean way of expressing UNNEST operations

2019-06-04 Thread Piyush Narang
ut all three fields instead of t(product), I don’t face the issue. Thanks, -- Piyush From: JingsongLee Reply-To: JingsongLee Date: Tuesday, June 4, 2019 at 2:42 AM To: JingsongLee, Piyush Narang, "user@flink.apache.org" Subject: Re: Clean way of expressing UNNEST

Clean way of expressing UNNEST operations

2019-06-03 Thread Piyush Narang
Hi folks, I’m using the SQL API and trying to figure out the best way to unnest and operate on some data. My data is structured as follows: Table: Advertiser_event: * Partnered: Int * Products: Array< Row< price: Double, quantity: Int, … > > * … I’m trying to unnest the products
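For reference, a cross-join UNNEST over an array-of-rows column usually takes the shape sketched below. The table name advertiser_event and the partnerId column are taken loosely from the schema in the question, and registration of that table is assumed to have happened elsewhere; this is not the poster's exact query.

    import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
    import org.apache.flink.table.api.scala.StreamTableEnvironment

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tableEnv = StreamTableEnvironment.create(env)
    // Assumes a registered table advertiser_event with columns partnerId and
    // products ARRAY<ROW<price DOUBLE, quantity INT>>.
    val unnested = tableEnv.sqlQuery(
      """
        |SELECT e.partnerId, p.price, p.quantity
        |FROM advertiser_event AS e
        |CROSS JOIN UNNEST(e.products) AS p (price, quantity)
      """.stripMargin)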

Support for custom triggers in Table / SQL

2019-03-28 Thread Piyush Narang
Hi folks, I’m trying to write a Flink job that computes a bunch of counters which requires custom triggers and I was trying to figure out the best way to express that. The query looks something like this: SELECT userId, customUDAF(...) AS counter1, customUDAF(...) AS counter2, ... FROM (
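Since the quoted query is cut off, here is a hedged sketch of the general shape such a windowed aggregation takes in Flink SQL; the source table, event-time column, window size, and the registered customUDAF are all placeholders. Flink SQL itself has no custom-trigger syntax, so this form fires on the default watermark-driven window emission.

    import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
    import org.apache.flink.table.api.scala.StreamTableEnvironment

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tableEnv = StreamTableEnvironment.create(env)
    // Assumes an events table with rowtime column eventTime and a UDAF
    // registered as customUDAF; both are assumptions, not from the thread.
    val counters = tableEnv.sqlQuery(
      """
        |SELECT
        |  userId,
        |  TUMBLE_END(eventTime, INTERVAL '1' HOUR) AS windowEnd,
        |  customUDAF(dataField) AS counter1
        |FROM events
        |GROUP BY userId, TUMBLE(eventTime, INTERVAL '1' HOUR)
      """.stripMargin)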

Re: Expressing Flink array aggregation using Table / SQL API

2019-03-15 Thread Piyush Narang
having a retractable sink / sink that can update partial results by key? Thanks, -- Piyush From: Kurt Young Date: Tuesday, March 12, 2019 at 11:51 PM To: Piyush Narang Cc: "user@flink.apache.org" Subject: Re: Expressing Flink array aggregation using Table / SQL API Hi Piyush, I

Re: Expressing Flink array aggregation using Table / SQL API

2019-03-12 Thread Piyush Narang
Thanks for getting back Kurt. Yeah this might be an option to try out. I was hoping there would be a way to express this directly in the SQL though ☹. -- Piyush From: Kurt Young Date: Tuesday, March 12, 2019 at 2:25 AM To: Piyush Narang Cc: "user@flink.apache.org" Subject: Re:

Expressing Flink array aggregation using Table / SQL API

2019-03-11 Thread Piyush Narang
Hi folks, I’m getting started with Flink and trying to figure out how to express aggregating some rows into an array to finally sink data into an AppendStreamTableSink. My data looks something like this: userId, clientId, eventType, timestamp, dataField. I need to compute some custom
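One common way to express this kind of collect-into-array aggregation is a small UDAF like the sketch below; this is an illustrative assumption (names hypothetical), not necessarily how the thread resolved the question.

    import java.util.{ArrayList => JArrayList, List => JList}
    import org.apache.flink.table.functions.AggregateFunction
    import scala.collection.JavaConverters._

    // Collects string values into an array, e.g. one dataField array per group.
    class CollectToArray extends AggregateFunction[Array[String], JList[String]] {
      override def createAccumulator(): JList[String] = new JArrayList[String]()

      def accumulate(acc: JList[String], value: String): Unit = {
        if (value != null) acc.add(value)
      }

      override def getValue(acc: JList[String]): Array[String] = acc.asScala.toArray
    }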