Hi Jens,

If I understand correctly, you are looking for a way to submit
jar-packaged Flink jobs to the job manager. If the job manager is
running at "example.com" on port 6123, then you can submit a job
like this:


import org.apache.flink.client.RemoteExecutor;
import org.apache.flink.client.program.Client;
import org.apache.flink.client.program.PackagedProgram;
import org.apache.flink.client.program.ProgramInvocationException;
import org.apache.flink.configuration.Configuration;

import java.io.File;
import java.net.InetSocketAddress;

public class MyTestsJava {
    public static void main(String[] args) {
        String path = "/path/to/flink/job.jar";
        String in = "/tmp/input";
        String out = "/tmp/output";
        submitJar(path, in, out);
    }

    public static void submitJar(String path, String... args) {

        File file = new File(path);

        int parallelism = 1;
        boolean wait = true; // block until the job finishes

        try {
            // Load the packaged program from the jar file
            PackagedProgram program = new PackagedProgram(file, args);

            // Address of the running job manager
            InetSocketAddress jobManagerAddress =
                    RemoteExecutor.getInetFromHostport("example.com:6123");

            Client client = new Client(jobManagerAddress,
                    new Configuration(), program.getUserCodeClassLoader());

            System.out.println("Executing " + path);
            client.run(program, parallelism, wait);

        } catch (ProgramInvocationException e) {
            e.printStackTrace();
        }
    }
}


If your Flink installation is configured with the Hadoop
configuration on its classpath, the submitted job can also read from
and write to HDFS paths (e.g. "hdfs:///tmp/input") instead of local
ones. Let me know if this is what you had in mind.

Best regards,
Max
