To: user <user@flink.apache.org>
Cc: "dyana.rose" <dyana.r...@salecycle.com>, "Bajaj, Abhinav"
<abhinav.ba...@here.com>
Subject: Re: Unable to load AWS credentials: Flink 1.2.1 + S3 + Kubernetes
Hi!
This is pretty much all in Hadoop's magic, from Flink's view, once this h
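For context, the S3 filesystem Flink 1.2 delegates to is Hadoop's, so credentials are resolved through Hadoop configuration rather than Flink's own. A minimal core-site.xml sketch (the keys and paths are placeholders; static keys are only one option in Hadoop's credential provider chain, alongside environment variables and instance profiles):

```xml
<!-- core-site.xml, picked up via fs.hdfs.hadoopconf in flink-conf.yaml -->
<configuration>
  <!-- Route s3:// paths to the s3a implementation -->
  <property>
    <name>fs.s3.impl</name>
    <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
  </property>
  <!-- Static credentials (placeholders); omit these to fall back to the
       default provider chain (env vars, instance profile, etc.) -->
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>
</configuration>
```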
> We need to support Flink 1.2.1 for now.
>
> Thanks for your response.
>
> ~ Abhinav
>
> *From: *Stephan Ewen <se...@apache.org>
> *Date: *Thursday, March 29, 2018 at 2:30 AM
> *To: *"dyana.rose" <dyana.r...@salecycle.com>
From: Stephan Ewen <se...@apache.org>
Date: Thursday, March 29, 2018 at 2:30 AM
To: "dyana.rose" <dyana.r...@salecycle.com>
Cc: user <user@flink.apache.org>
Subject: Re: Unable to load AWS credentials: Flink 1.2.1 + S3 + Kubernetes
Using AWS credentials with Kubernetes is not trivial. Have you looked at
AWS / Kubernetes docs and projects like https://github.com/jtblin/kube2iam
which bridge between containers and AWS credentials?
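If kube2iam is in play, the bridge is typically a pod annotation naming an IAM role that kube2iam then vends temporary credentials for. A hypothetical sketch for a Flink TaskManager deployment (the role and resource names are made up for illustration):

```yaml
# Hypothetical TaskManager Deployment; the iam.amazonaws.com/role annotation
# is what kube2iam matches to hand out credentials for the named role.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: flink-taskmanager
spec:
  replicas: 1
  selector:
    matchLabels:
      app: flink-taskmanager
  template:
    metadata:
      labels:
        app: flink-taskmanager
      annotations:
        iam.amazonaws.com/role: flink-s3-checkpoints-role  # hypothetical role
    spec:
      containers:
        - name: taskmanager
          image: flink:1.2.1
```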
Also, Flink 1.2.1 is quite old; you may want to try a newer version. 1.4.x
has a bit of an
Hiya,
This sounds like it may be similar to the issue I had when running on ECS. Take
a look at my ticket for how I got around this, and see if it's any help:
https://issues.apache.org/jira/browse/FLINK-8439
Dyana
On 2018/03/28 02:15:06, "Bajaj, Abhinav" wrote:
Hi,
I am trying to use Flink 1.2.1 with RocksDB as the state backend and S3 for
checkpoints.
I am using the Flink 1.2.1 docker images and running them in a Kubernetes cluster.
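For reference, the Flink 1.2-era configuration for this setup would look roughly like the sketch below; the bucket path and Hadoop config directory are placeholders:

```yaml
# flink-conf.yaml (Flink 1.2.x keys; paths are placeholders)
state.backend: rocksdb
state.backend.fs.checkpointdir: s3://my-bucket/flink/checkpoints
# Point Flink at the Hadoop configuration holding the S3 filesystem settings
fs.hdfs.hadoopconf: /etc/hadoop/conf
```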
I have followed the steps documented in the Flink documentation -