Hello all.

Some clarification: locally everything works great.
However, once we run Flink on a remote Linux machine and try to run the
client program from our machine using createRemoteEnvironment, the Flink
JobManager raises this exception.
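For anyone hitting the same "Failed to resolve JobManager" error: the host and port passed to createRemoteEnvironment have to match what the JobManager actually binds to on the cluster side. On a standalone setup those come from conf/flink-conf.yaml on the cluster; the values below are placeholders (6123 is Flink's default JobManager RPC port):

```
# conf/flink-conf.yaml on the cluster -- example values only
jobmanager.rpc.address: flink-master.example.com   # must be reachable from the client machine
jobmanager.rpc.port: 6123                          # Flink's default JobManager RPC port
```

On YARN, as Robert points out below, the JobManager runs in a container, so this address and port change with every deployment.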

On Thu, Aug 27, 2015 at 7:41 PM, Stephan Ewen <se...@apache.org> wrote:

> If you start the job via the "bin/flink" script, then simply use
> "ExecutionEnvironment.getExecutionEnvironment()" rather than creating a
> remote environment manually.
>
> That way, hosts and ports are configured automatically.
>
> On Thu, Aug 27, 2015 at 6:39 PM, Robert Metzger <rmetz...@apache.org>
> wrote:
>
>> Hi,
>>
>> Which values did you use for FLINK_SERVER_URL and FLINK_PORT?
>> Every time you deploy Flink on YARN, the host and port change, because the
>> JobManager is started on a different YARN container.
>>
>>
>> On Thu, Aug 27, 2015 at 6:32 PM, Hanan Meyer <ha...@scalabill.it> wrote:
>>
>> > Hello All
>> >
>> > When using the Eclipse IDE to submit a Flink job to a single-node YARN
>> > cluster, I'm getting:
>> > "org.apache.flink.client.program.ProgramInvocationException: Failed to
>> > resolve JobManager"
>> >
>> > Using Flink 0.9.0
>> >
>> > The jar copies a file from one HDFS location to another, and works fine
>> > when executed locally on the single-node YARN cluster -
>> > bin/flink run -c Test ./examples/MyJar.jar
>> > hdfs://localhost:9000/flink/in.txt hdfs://localhost:9000/flink/out.txt
>> >
>> > The code skeleton:
>> >
>> >     ExecutionEnvironment envRemote =
>> >         ExecutionEnvironment.createRemoteEnvironment(
>> >             FLINK_SERVER_URL, FLINK_PORT, JAR_PATH_ON_CLIENT);
>> >     DataSet<String> data =
>> >         envRemote.readTextFile("hdfs://localhost:9000/flink/in.txt");
>> >     data.writeAsText("hdfs://localhost:9000/flink/out.txt");
>> >     envRemote.execute();
>> >
>> >
>> > Please advise,
>> >
>> > Hanan Meyer
>> >
>>
>
>