Re: How can I increase Flink managed memory?

2017-05-31 Thread Sathi Chowdhury
taskmanager.memory.preallocate. But actually, what are you trying to fix? Is your node flushing data to disk yet? Or has it just not accumulated that much operator state yet? Nico [1] https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/config.html
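For reference, a minimal flink-conf.yaml sketch of the managed-memory settings discussed here and on the linked config page; the numbers are illustrative, not a recommendation:

    # fraction of the remaining heap (after network buffers) that the
    # MemoryManager may use as managed memory; 0.7 is the default
    taskmanager.memory.fraction: 0.7
    # or, instead of a fraction, an absolute amount of managed memory in MB
    # taskmanager.memory.size: 20480
    # allocate managed memory eagerly at task manager start-up
    taskmanager.memory.preallocate: true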

Re: How can I increase Flink managed memory?

2017-05-29 Thread Sathi Chowdhury
Forgot to mention: I am running 10 slots on each machine and taskmanager.network.numberOfBuffers: 1. Is there scope for improving memory usage? From: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Date: Monday, May 29, 2017 at 9:55
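For reference, the sizing rule of thumb in the Flink configuration docs for this key is roughly slots-per-TM^2 * number-of-TMs * 4. A sketch of what that gives for the setup described in this thread (10 slots per machine, and the 5 task managers mentioned in the original question), purely as an illustration:

    # rule of thumb: #slots-per-TM^2 * #TMs * 4  ->  10^2 * 5 * 4 = 2000
    taskmanager.network.numberOfBuffers: 2000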

How can I increase Flink managed memory?

2017-05-29 Thread Sathi Chowdhury
Hello Flink Dev and Community, I have 5 task managers, each with 64 GB of memory. I am running Flink on YARN with task manager heap taskmanager.heap.mb: 563200. Flink still shows that it is using about 21 GB of memory, leaving 35 GB available. How and what can I do to fix it? Please suggest. Thanks Sathi

Flink parallel tasks, slots and vcores

2017-05-25 Thread Sathi Chowdhury
Hi Till / flink-devs, I am trying to understand why adding slots in the task manager is having no impact on performance for the test pipeline. Here is my flink-conf.yaml: jobmanager.rpc.address: localhost jobmanager.rpc.port: 6123 jobmanager.heap.mb: 1024 taskmanager.memory.preallocate: false
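A minimal flink-conf.yaml sketch of the two settings that tie slots to YARN vcores; the values are illustrative. By default Flink requests one vcore per configured slot, so one thing to check when extra slots bring no speed-up is whether the container actually gets matching CPU resources:

    # parallel pipelines (slots) per task manager
    taskmanager.numberOfTaskSlots: 4
    # vcores requested per YARN container; defaults to the slot count if unset
    yarn.containers.vcores: 4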

Re: flink 1.2 job fails while trying to use -p option

2017-05-24 Thread Sathi Chowdhury
It just happens that with parallelism 1, the `markAsTemporarilyIdle` method is not called in the Kinesis connector, hence the exception did not pop up. Cheers, Gordon On 24 May 2017 at 7:27:08 AM, Sathi Chowdhury (sathi.chowdh...@elliemae.com) wrote: Need

flink 1.2 job fails while trying to use -p option

2017-05-23 Thread Sathi Chowdhury
Need quick help on this. Trying to run bin/flink run -p 6 -c CLASSNAME /mnt/flink/jarname.jar. Everything works without the -p option, but runs with parallelism 1, which is what I am trying to get past. java.lang.NoSuchMethodError:
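For reference, a sketch of the command shape with both options placed before the jar path (the class name here is a hypothetical placeholder standing in for the elided CLASSNAME):

    ./bin/flink run -p 6 -c com.example.MyJobMain /mnt/flink/jarname.jar

As the reply above explains, the exception only surfaced once parallelism was above 1 because the Kinesis connector's markAsTemporarilyIdle path is only exercised then; a NoSuchMethodError of that kind generally indicates the job jar was built against a different connector or Flink version than the one running on the cluster.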

Re: trying to externalize checkpoint to s3

2017-05-23 Thread Sathi Chowdhury
Hi Till, thanks for your reply. I have to try out my fat jar not including Hadoop classes as well. From: Till Rohrmann <trohrm...@apache.org> Date: Tuesday, May 23, 2017 at 7:12 AM To: Ted Yu <yuzhih...@gmail.com> Cc: Sathi Chowdhury <sathi.chowdh...@elliemae.com>, user <

trying to externalize checkpoint to s3

2017-05-22 Thread Sathi Chowdhury
We are running Flink 1.2 in pre-production. I am trying to test checkpoints stored in an external location in s3. I have set these below in flink-conf.yaml: state.backend: filesystem state.checkpoints.dir: s3://abc-checkpoint state.backend.fs.checkpointdir: s3://abc-checkpoint I get this failure in
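Besides those flink-conf.yaml keys, in Flink 1.2 externalized checkpoints also have to be enabled in the job itself; a minimal sketch, with an illustrative one-minute checkpoint interval:

    import org.apache.flink.streaming.api.environment.CheckpointConfig;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class ExternalizedCheckpointSetup {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            env.enableCheckpointing(60000); // checkpoint every 60 s (illustrative)
            // keep the externalized checkpoint around when the job is cancelled
            env.getCheckpointConfig().enableExternalizedCheckpoints(
                    CheckpointConfig.ExternalizedCheckpointCleanup.RETAIN_ON_CANCELLATION);
            // ... sources, transformations and sinks go here ...
            env.execute("externalized checkpoint example");
        }
    }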

Re: ERROR while creating save points..

2017-05-22 Thread Sathi Chowdhury
I was able to bypass that one by running it from bin/flink … Now encountering: java.lang.NullPointerException: Checkpoint properties say that the checkpoint should have been persisted, but missing external path. From: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Date: Monday, May 22

ERROR while creating save points..

2017-05-22 Thread Sathi Chowdhury
Hi Flink Dev, I am running Flink on YARN from EMR and I was running this command to test an external savepoint: flink savepoint 8c4c885c5899544de556c5caa984502d /mnt. The program finished with the following exception: org.apache.flink.runtime.leaderretrieval.LeaderRetrievalException: Could not
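When the CLI is run against a Flink-on-YARN session, it usually also needs to be told which YARN application hosts the JobManager, otherwise leader retrieval can fail; a sketch of the same command with a hypothetical application id:

    # -yid points the CLI at the YARN application running the Flink cluster
    ./bin/flink savepoint 8c4c885c5899544de556c5caa984502d /mnt -yid application_1495000000000_0001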

Re: put record to kinesis and then trying consume using flink connector

2017-04-26 Thread Sathi Chowdhury
by an outside service then obviously no issues.. From: Alex Reid <alex.james.r...@gmail.com> Date: Tuesday, April 25, 2017 at 4:31 PM To: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Cc: "user@flink.apache.org" <user@flink.apache.org> Subject: Re: put record to kin

Re: put record to kinesis and then trying consume using flink connector

2017-04-25 Thread Sathi Chowdhury
From: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Date: Tuesday, April 25, 2017 at 3:56 PM To: "Tzu-Li (Gordon) Tai" <tzuli...@apache.org>, "user@flink.apache.org" <user@flink.apache.org> Subject: Re: put record to kinesis and then trying consume using flink

Re: put record to kinesis and then trying consume using flink connector

2017-04-25 Thread Sathi Chowdhury
ted into is “mystream”. However, DataStream outputStream = see.addSource(new FlinkKinesisConsumer<>("myStream", new MyDeserializerSchema(), consumerConfig)); you seem to be consuming from “myStream”. Could the capital “S” be the issue? Cheers, Gordon On 24 April 201

put record to kinesis and then trying consume using flink connector

2017-04-23 Thread Sathi Chowdhury
Hi Flink Dev, I thought this would work easily with Flink since it is simple enough, yet I am struggling to make it work. I am using the Flink Kinesis connector, version 1.3-SNAPSHOT. Basically I am trying to bootstrap a stream with one event pushed into it as a warmup inside the Flink job’s
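A minimal sketch of the consumer setup this thread describes; the region, initial position and deserialization schema are illustrative assumptions, and — as the reply above points out — the stream name is case-sensitive, so it must match the producing side exactly:

    import java.util.Properties;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
    import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
    import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class KinesisWarmupJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment see = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties consumerConfig = new Properties();
            consumerConfig.setProperty(AWSConfigConstants.AWS_REGION, "us-west-2");
            // start from the oldest record so the single warm-up event is not skipped
            consumerConfig.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "TRIM_HORIZON");

            // stream names are case-sensitive: "mystream" and "myStream" are different streams
            DataStream<String> myStream = see.addSource(
                    new FlinkKinesisConsumer<>("mystream", new SimpleStringSchema(), consumerConfig));

            myStream.print();
            see.execute("kinesis warm-up consumer");
        }
    }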

Re: Flink errors out and job fails--IOException from CollectSink.open()

2017-04-15 Thread Sathi Chowdhury
) at com.esotericsoftware.kryo.io.Output.flush(Output.java:163) ... 17 more Thanks From: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Date: Friday, April 14, 2017 at 1:29 AM To: "user@flink.apache.org" <user@flink.apache.org> Subject: Re: Flink errors out and job fails--IOException from Co

Re: Flink errors out and job fails--IOException from CollectSink.open()

2017-04-13 Thread Sathi Chowdhury
…java.net.Socket.<init>(Socket.java:244) at org.apache.flink.contrib.streaming.CollectSink.open(CollectSink.java:75) ... 6 more From: Ted Yu <yuzhih...@gmail.com> Date: Thursday, April 13, 2017 at 6:01 PM To: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Cc: "user@flink.apache.o

Re: Flink errors out and job fails--IOException from CollectSink.open()

2017-04-13 Thread Sathi Chowdhury
) (8a7301a437cb2d052208ee42c994104b). From: Sathi Chowdhury <sathi.chowdh...@elliemae.com> Date: Thursday, April 13, 2017 at 5:44 PM To: Ted Yu <yuzhih...@gmail.com> Cc: "user@flink.apache.org" <user@flink.apache.org> Subject: Re: Flink errors out and job fails--IOException fro

Re: Flink errors out and job fails--IOException from CollectSink.open()

2017-04-13 Thread Sathi Chowdhury
.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:343) at org.apache.flink.runtime.taskmanager.Task.run(Task.java:655) at java.lang.Thread.run(Thread.java:745) Thanks Sathi From: Ted Yu <yuzhih...@gmail.com> Date: Thursday, April 13, 20

Flink errors out and job fails--IOException from CollectSink.open()

2017-04-13 Thread Sathi Chowdhury
Has someone encountered this error? I am using the DataStream API to read from a kinesis stream. This happens intermittently and the Flink job dies. KinesisStreamShard{streamName='dev-ingest-kinesis-us-west-2', shard='{ShardId: shardId-0009, HashKeyRange: {StartingHashKey:

elasticsearch version compatibility with 1.3-SNAPSHOT version of flink

2017-04-02 Thread Sathi Chowdhury
In order to use the latest Kibana, I wanted to know if I can use Elasticsearch 5.x and integrate it with flink-connector-elasticsearch_2.10 version 1.3-SNAPSHOT? Thanks Sathi
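For what it's worth, Flink 1.3 adds a dedicated Elasticsearch 5.x connector module, so the dependency would look roughly like the sketch below (assuming the job stays on the same Scala 2.10 / 1.3-SNAPSHOT line as the artifact named above):

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-elasticsearch5_2.10</artifactId>
        <version>1.3-SNAPSHOT</version>
    </dependency>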

s3 read while writing inside sink function

2017-03-30 Thread Sathi Chowdhury
Hi fellow Flink enthusiasts, I am trying to figure out a recommended way to read s3 data while I am writing to s3 using BucketingSink. BucketingSink s3Sink = new BucketingSink("s3://" + entityBucket + "/") .setBucketer(new EntityCustomBucketer()) .setWriter(new
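For context, a self-contained sketch along the lines of that fragment; the EntityCustomBucketer from the original post is stood in for by the built-in DateTimeBucketer, and the bucket path, input data, date pattern and batch size are illustrative:

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.fs.StringWriter;
    import org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink;
    import org.apache.flink.streaming.connectors.fs.bucketing.DateTimeBucketer;

    public class S3BucketingSinkSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            DataStream<String> entities = env.fromElements("{\"id\":1}", "{\"id\":2}"); // stand-in input

            BucketingSink<String> s3Sink = new BucketingSink<>("s3://my-entity-bucket/");
            s3Sink.setBucketer(new DateTimeBucketer<String>("yyyy-MM-dd--HH")); // one bucket per hour
            s3Sink.setWriter(new StringWriter<String>());                       // write records as text lines
            s3Sink.setBatchSize(128 * 1024 * 1024);                             // roll part files at 128 MB

            entities.addSink(s3Sink);
            env.execute("s3 bucketing sink sketch");
        }
    }

Reading s3 data back inside a custom sink would be a separate client call (e.g. via the AWS SDK); the sketch only covers the write path shown in the fragment.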

Re: Data stream to write to multiple rds instances

2017-03-02 Thread Sathi Chowdhury
does not take part in Flink's checkpointing mechanism. Unfortunately, Flink does not have a streaming JDBC connector, yet. Cheers, Till On Thu, Mar 2, 2017 at 7:21 AM, Sathi Chowdhury <sathi.chowdh...@elliemae.com> wrote: Hi All, Is there any

Re: kinesis producer setCustomPartitioner use stream's own data

2017-03-02 Thread Sathi Chowdhury
in mind? Cheers, Gordon On February 21, 2017 at 11:54:21 AM, Sathi Chowdhury (sathi.chowdh...@elliemae.com) wrote: Hi flink users and experts, In my flink processor I am trying to use the Flink Kinesis connector. I read from a kinesis

NPE while writing to s3://

2017-03-02 Thread Sathi Chowdhury
I get the NPE from the below code. I am running this from my mac in a local Flink cluster. RollingSink s3Sink = new RollingSink("s3://sc-sink1/"); s3Sink.setBucketer(new DateTimeBucketer("yyyy-MM-dd--HHmm")); s3Sink.setWriter(new StringWriter()); s3Sink.setBatchSize(200);

Data stream to write to multiple rds instances

2017-03-01 Thread Sathi Chowdhury
Hi All, Is there any preferred way to manage multiple jdbc connections from Flink? I am new to Flink and looking for some guidance around the right pattern and APIs to do this. The use case needs to route a stream to a particular jdbc connection depending on a field value. So the records are
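Since Flink had no streaming JDBC connector at the time (as noted in the reply above), one workable pattern is a custom RichSinkFunction that holds one connection per target and routes each record on a field value. A minimal sketch, assuming records arrive as Tuple2<String, String> of (target key, payload), the key-to-JDBC-URL map is known up front, and the INSERT statement is purely illustrative:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

    public class RoutingJdbcSink extends RichSinkFunction<Tuple2<String, String>> {

        private final Map<String, String> jdbcUrlsByKey;         // e.g. "tenantA" -> "jdbc:mysql://rds-a/..."
        private transient Map<String, Connection> connections;   // one open connection per target

        public RoutingJdbcSink(Map<String, String> jdbcUrlsByKey) {
            this.jdbcUrlsByKey = new HashMap<>(jdbcUrlsByKey);
        }

        @Override
        public void open(Configuration parameters) throws Exception {
            connections = new HashMap<>();
            for (Map.Entry<String, String> e : jdbcUrlsByKey.entrySet()) {
                connections.put(e.getKey(), DriverManager.getConnection(e.getValue()));
            }
        }

        @Override
        public void invoke(Tuple2<String, String> record) throws Exception {
            // route on the first field of the record
            Connection conn = connections.get(record.f0);
            try (PreparedStatement ps = conn.prepareStatement("INSERT INTO events (payload) VALUES (?)")) {
                ps.setString(1, record.f1);
                ps.executeUpdate();
            }
        }

        @Override
        public void close() throws Exception {
            if (connections != null) {
                for (Connection c : connections.values()) {
                    c.close();
                }
            }
        }
    }

As the reply above also notes, a hand-rolled sink like this does not take part in Flink's checkpointing, so it provides no exactly-once guarantees on its own.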

Re: Flink Application Log :Custom Log properties

2017-02-28 Thread Sathi Chowdhury
Thanks Sunil, this is also a question I wanted to ask the forum: how to separate the application log (aggregated into a file), similar to YARN log aggregation, but capturing Flink process-related logs from all workers and collecting them in one central location. I would love to hear suggestions and

kinesis producer setCustomPartitioner use stream's own data

2017-02-20 Thread Sathi Chowdhury
Hi Flink users and experts, In my Flink processor I am trying to use the Flink Kinesis connector. I read from a kinesis stream, and after the transformation (for which I use a RichCoFlatMapFunction), the json event needs to sink to a kinesis stream k1. DataStream myStream = see.addSource(new
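A minimal sketch of the producer side with a custom partitioner that derives the Kinesis partition key from the event itself, which is what the subject asks about; the region, target stream name, stand-in input and hash-the-payload partitioning are illustrative assumptions:

    import java.util.Properties;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisProducer;
    import org.apache.flink.streaming.connectors.kinesis.KinesisPartitioner;
    import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class KinesisCustomPartitionerSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment see = StreamExecutionEnvironment.getExecutionEnvironment();
            // stand-in for the transformed json stream from the original post
            DataStream<String> jsonEvents = see.fromElements("{\"id\":\"a\"}", "{\"id\":\"b\"}");

            Properties producerConfig = new Properties();
            producerConfig.setProperty(AWSConfigConstants.AWS_REGION, "us-west-2");

            FlinkKinesisProducer<String> kinesisSink =
                    new FlinkKinesisProducer<>(new SimpleStringSchema(), producerConfig);
            kinesisSink.setDefaultStream("k1");
            kinesisSink.setDefaultPartition("0");
            kinesisSink.setCustomPartitioner(new KinesisPartitioner<String>() {
                @Override
                public String getPartitionId(String jsonEvent) {
                    // derive the partition key from the event payload itself;
                    // a real job would extract a business key from the json instead
                    return String.valueOf(jsonEvent.hashCode());
                }
            });

            jsonEvents.addSink(kinesisSink);
            see.execute("kinesis custom partitioner sketch");
        }
    }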

Re: broadcasting a stream from a collection that is populated from an web service call

2017-02-02 Thread Sathi Chowdhury
job. If that is not feasible for you, then you can also write your own custom source function which queries the REST endpoint and whenever it receives new data it will send the data to its downstream operators. Cheers, Till On Thu, Feb 2, 2017 at 10:27 PM, Sathi Chowdhury <sathi.c
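A minimal sketch of the custom-source approach Till describes: a source that periodically polls a REST endpoint and emits whatever it fetched, which can then be broadcast to the processing operators. The endpoint URL, poll interval, and the plain-HTTP fetch are all illustrative assumptions:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.util.stream.Collectors;

    import org.apache.flink.streaming.api.functions.source.SourceFunction;

    public class RestMetadataSource implements SourceFunction<String> {

        private volatile boolean running = true;

        @Override
        public void run(SourceContext<String> ctx) throws Exception {
            while (running) {
                // fetch the metadata document from the (assumed) REST endpoint
                URL url = new URL("https://example.com/metadata");
                try (BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()))) {
                    String body = in.lines().collect(Collectors.joining("\n"));
                    ctx.collect(body);
                }
                Thread.sleep(60000); // poll once a minute
            }
        }

        @Override
        public void cancel() {
            running = false;
        }
    }

The resulting stream can then be broadcast so that every parallel instance of the downstream operator receives the metadata, e.g. see.addSource(new RestMetadataSource()).broadcast().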

broadcasting a stream from a collection that is populated from an web service call

2017-02-02 Thread Sathi Chowdhury
It’s good to be here. I have a data stream coming from kinesis. I also have a list of hashmaps which holds metadata that needs to participate in the processing. In my Flink processor class I construct this metadata (hardcoded): public static void main(String[] args) throws Exception { …….//