Problem with Amazon S3

2015-03-31 Thread pietro
Dear all, I have been developing a Flink application that has to run on Amazon Elastic MapReduce. For convenience, the data that the application has to read and write are on S3. But I have not been able to access S3. This is the error I got:

Re: Problem with Amazon S3

2015-03-31 Thread pietro
Thank you Ufuk! That helped a lot. But I have another problem now. Am I missing something?

Caused by: java.net.UnknownHostException: MYBUCKETNAME
    at java.net.InetAddress.getAllByName0(InetAddress.java:1250)
    at java.net.InetAddress.getAllByName(InetAddress.java:1162)
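For context (my note, not from the thread): the UnknownHostException naming the bucket means the S3 client tried to resolve the bucket name itself as a hostname and failed, so it is worth double-checking that the job's input/output path uses the usual URI form, where the bucket is the authority and the key follows as the path. A sketch (MYBUCKETNAME and the path are placeholders):

```
s3://MYBUCKETNAME/path/to/input
```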

Re: Problem with Amazon S3

2015-03-31 Thread Ufuk Celebi
Hey Pietro! You have to add the following lines to your flink-conf.yaml:

fs.s3.accessKey: YOUR ACCESS KEY
fs.s3.secretKey: YOUR SECRET KEY

I will fix the error message to include a hint on how to configure this correctly. – Ufuk

On Tue, Mar 31, 2015 at 10:53 AM, pietro
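Putting Ufuk's two lines together, the config fragment would look like this in conf/flink-conf.yaml (a sketch: the fs.s3.* key names are as given in his reply, and YOUR_ACCESS_KEY / YOUR_SECRET_KEY are placeholders for your real AWS credentials):

```yaml
# conf/flink-conf.yaml -- S3 credentials for Flink's S3 filesystem
fs.s3.accessKey: YOUR_ACCESS_KEY
fs.s3.secretKey: YOUR_SECRET_KEY
```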

Re: GC on taskmanagers

2015-03-31 Thread Maximilian Michels
Hi Emmanuel, In Java, the garbage collector always runs periodically, so triggering it remotely won't make any difference. If you want to reuse the existing Java process without restarting it, you have to stop the program code that is causing the OutOfMemoryError. Usually,

RE: GC on taskmanagers

2015-03-31 Thread Emmanuel
Max, Thanks for the answer... What I am saying is that my program is indeed not running, yet garbage collection doesn't seem to occur after cancelling the job. As you saw in the log, the memory is still 99% used even though I cancelled the job, and I cannot seem to run another job. I've had
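Max's point can be illustrated with a minimal sketch (mine, not from the thread): garbage collection, whether periodic or requested via System.gc(), can never reclaim objects that are still reachable. If a cancelled job's buffers are still referenced somewhere in the TaskManager process, heap usage stays high no matter how often the collector runs; the memory only becomes eligible once the references are dropped.

```java
import java.util.ArrayList;
import java.util.List;

public class GcDemo {
    // A still-reachable reference: nothing in this list can be collected.
    static final List<byte[]> held = new ArrayList<>();

    // Currently used heap in megabytes.
    static long usedMb() {
        Runtime rt = Runtime.getRuntime();
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 50; i++) {
            held.add(new byte[1024 * 1024]); // hold roughly 50 MB
        }
        System.gc(); // only a hint; reachable objects survive any collection
        System.out.println("used while references are held: " + usedMb() + " MB");

        held.clear(); // drop the references; only now is the memory eligible
        System.gc();
        System.out.println("used after dropping references: " + usedMb() + " MB");
    }
}
```

The first figure stays at or above the roughly 50 MB that is still referenced; only after `held.clear()` can a collection actually return that memory, which matches Max's advice that the code holding the memory has to stop (or release its references) before the heap can shrink.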