Hi Robert,

Thanks for the prompt reply. I'm using the IterateExample from the Flink examples. In the YARN log I get entries for the YarnJobManager and ExecutionGraph, but I was wondering if there is a way to push all the logging that the client produces into the YARN log as well, including the System.out calls. Is there a way to modify the example to use a logging framework to achieve this?
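
Something along these lines is what I have in mind for the example's
functions (the class name is a placeholder, and I'm assuming the slf4j
API that the Flink examples already depend on):

    import org.apache.flink.api.common.functions.MapFunction;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Replaces System.out.println with an slf4j call, so the message
    // goes through log4j and ends up in the container's log file.
    public class LoggingMapper implements MapFunction<Long, Long> {

        private static final Logger LOG =
                LoggerFactory.getLogger(LoggingMapper.class);

        @Override
        public Long map(Long value) throws Exception {
            LOG.info("Processing value {}", value);
            return value;
        }
    }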

Also, when I submit the program using the Client's runBlocking method, the call does not return even though the taskmanager and jobmanager logs show that the application has finished. Should I call it in a separate thread?
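
In case it helps to show what I mean, here is a rough sketch of how I
would wrap the call in its own thread (the helper is hypothetical, and
I'm assuming the runBlocking(PackagedProgram, int) overload I currently
call):

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import java.util.concurrent.TimeUnit;

    // Hypothetical helper: run a blocking submission call on its own
    // thread so the caller can wait with a timeout instead of hanging.
    public class AsyncSubmitter {

        public static <T> T submitWithTimeout(Callable<T> blockingCall,
                                              long timeoutSeconds) throws Exception {
            ExecutorService executor = Executors.newSingleThreadExecutor();
            try {
                Future<T> future = executor.submit(blockingCall);
                return future.get(timeoutSeconds, TimeUnit.SECONDS);
            } finally {
                executor.shutdownNow();
            }
        }
    }

    // Usage with my client and program objects (assumed names):
    //   AsyncSubmitter.submitWithTimeout(() -> client.runBlocking(program, 1), 600);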

Cheers,
Theofilos

On 2016-06-10 22:12, Robert Metzger wrote:
Hi Theofilos,

How exactly are you writing the application output?
Are you using a logging framework?
Are you writing the log statements in the open(), map(), or invoke() methods, or in a constructor? (I'm asking because some parts are executed on the cluster and others locally.)
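
To illustrate the distinction (the class and the log statements below
are made up, just to show where each one would end up):

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Illustration only: where a statement runs decides which log it
    // lands in.
    public class WhereDoesItLog extends RichMapFunction<Long, Long> {

        private static final Logger LOG =
                LoggerFactory.getLogger(WhereDoesItLog.class);

        public WhereDoesItLog() {
            // Runs in the client JVM while the program is assembled, so
            // this only shows up in the client's stdout.
            System.out.println("constructor: client side");
        }

        @Override
        public void open(Configuration parameters) {
            // Runs inside a TaskManager container on the cluster, so this
            // goes to the container log that YARN aggregates.
            LOG.info("open(): cluster side");
        }

        @Override
        public Long map(Long value) {
            LOG.info("map(): cluster side, value {}", value);
            return value;
        }
    }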

On Fri, Jun 10, 2016 at 4:00 PM, Theofilos Kakantousis <t...@kth.se> wrote:

    Hi all,

    Flink 1.0.3
    Hadoop 2.4.0

    When running a job on a Flink cluster on YARN, the application
    output is not included in the YARN log. Instead, it is only
    printed to the stdout of the machine from which I run my
    program. For the jobmanager, I'm using the log4j.properties
    file from the flink/conf directory. YARN log aggregation is
    enabled and the YarnJobManager log is printed in the YARN log.
    The application is submitted by a Flink Client to the
    FlinkYarnCluster using a PackagedProgram.
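
    For reference, that log4j.properties is essentially the stock
    one; from memory it looks roughly like this (a sketch, not the
    exact file):

        log4j.rootLogger=INFO, file

        log4j.appender.file=org.apache.log4j.FileAppender
        log4j.appender.file.file=${log.file}
        log4j.appender.file.append=false
        log4j.appender.file.layout=org.apache.log4j.PatternLayout
        log4j.appender.file.layout.ConversionPattern=%d{HH:mm:ss,SSS} %-5p %-60c %x - %m%n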

    Is this expected behavior, and if so, is there a way to include
    the application output in the YARN aggregated log? Thanks!

    Cheers,
    Theofilos


