I think you can try the application mode[1]. In application mode the main() method of your program is executed on the cluster side (in the JobManager), so the log output of your main class ends up in the YARN container logs instead of on the client.
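
For example, with Flink 1.11 or later, something along these lines should submit the job in application mode on YARN (just a sketch; the exact options may differ depending on your version and setup):

./bin/flink run-application -t yarn-application \
    -c com.flink.transform.WordCountExample rt-1.0-jar-with-dependencies.jar

You can then fetch those logs with "yarn logs -applicationId <application id>" or through the YARN web UI.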

[1] https://ci.apache.org/projects/flink/flink-docs-master/deployment/#application-mode

Best,
Yangze Guo

On Tue, Jan 12, 2021 at 5:23 PM bat man <tintin0...@gmail.com> wrote:
>
> Thanks Yangze Guo.
> Is there a way these can be redirected to the YARN logs?
>
> On Tue, 12 Jan 2021 at 2:35 PM, Yangze Guo <karma...@gmail.com> wrote:
>>
>> The main function of your WordCountExample is executed in your local
>> environment. So, the logs you are looking for ("Entering
>> application.") are located in your console output and in the "log/"
>> directory of your Flink distribution.
>>
>> Best,
>> Yangze Guo
>>
>> On Tue, Jan 12, 2021 at 4:50 PM bat man <tintin0...@gmail.com> wrote:
>> >
>> > Hi,
>> >
>> > I am running a sample job as below -
>> >
>> > package com.flink.transform;
>> >
>> > import java.io.Serializable;
>> > import java.util.ArrayList;
>> > import java.util.List;
>> >
>> > import org.apache.flink.api.common.functions.FlatMapFunction;
>> > import org.apache.flink.api.java.DataSet;
>> > import org.apache.flink.api.java.ExecutionEnvironment;
>> > import org.apache.flink.api.java.tuple.Tuple2;
>> > import org.apache.flink.util.Collector;
>> > import org.slf4j.Logger;
>> > import org.slf4j.LoggerFactory;
>> >
>> > public class WordCountExample {
>> >     static Logger logger = LoggerFactory.getLogger(WordCountExample.class);
>> >
>> >     public static void main(String[] args) throws Exception {
>> >         final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
>> >
>> >         logger.info("Entering application.");
>> >
>> >         DataSet<String> text = env.fromElements(
>> >                 "Who's there?",
>> >                 "I think I hear them. Stand, ho! Who's there?");
>> >
>> >         List<Integer> elements = new ArrayList<Integer>();
>> >         elements.add(0);
>> >
>> >         DataSet<TestClass> set = env.fromElements(new TestClass(elements));
>> >
>> >         // register "set" as a broadcast data set for the flatMap operator
>> >         DataSet<Tuple2<String, Integer>> wordCounts = text
>> >                 .flatMap(new LineSplitter())
>> >                 .withBroadcastSet(set, "set")
>> >                 .groupBy(0)
>> >                 .sum(1);
>> >
>> >         // print() triggers execution of the DataSet program
>> >         wordCounts.print();
>> >
>> >         logger.info("Processing done");
>> >
>> >         //env.execute("wordcount job complete");
>> >     }
>> >
>> >     public static class LineSplitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
>> >
>> >         static Logger loggerLineSplitter = LoggerFactory.getLogger(LineSplitter.class);
>> >
>> >         @Override
>> >         public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
>> >             loggerLineSplitter.info("Logger in LineSplitter.flatMap");
>> >             for (String word : line.split(" ")) {
>> >                 out.collect(new Tuple2<String, Integer>(word, 1));
>> >             }
>> >         }
>> >     }
>> >
>> >     public static class TestClass implements Serializable {
>> >         private static final long serialVersionUID = -2932037991574118651L;
>> >
>> >         static Logger loggerTestClass = LoggerFactory.getLogger("WordCountExample.TestClass");
>> >
>> >         List<Integer> integerList;
>> >
>> >         public TestClass(List<Integer> integerList) {
>> >             this.integerList = integerList;
>> >             loggerTestClass.info("Logger in TestClass");
>> >         }
>> >     }
>> > }
>> >
>> > When run in the IDE I can see the logs from the main class, i.e. statements like the ones below in the console logs -
>> >
>> > 13:40:24.459 [main] INFO  com.flink.transform.WordCountExample - Entering application.
>> > 13:40:24.486 [main] INFO  WordCountExample.TestClass - Logger in TestClass
>> >
>> >
>> > When run on YARN with the command - flink run -m yarn-cluster -c com.flink.transform.WordCountExample rt-1.0-jar-with-dependencies.jar
>> >
>> > I only see the flatmap logging statements like -
>> > INFO  com.flink.transform.WordCountExample$LineSplitter - Logger in LineSplitter.flatMap
>> > INFO  com.flink.transform.WordCountExample$LineSplitter - Logger in LineSplitter.flatMap
>> >
>> > I have checked the jobmanager and taskmanager logs from YARN in EMR.
>> >
>> > This is my log4j.properties from the EMR cluster:
>> >
>> > log4j.rootLogger=INFO,file,elastic
>> >
>> > # Config ES logging appender
>> > log4j.appender.elastic=com.letfy.log4j.appenders.ElasticSearchClientAppender
>> > log4j.appender.elastic.elasticHost=http://<>:9200
>> > log4j.appender.elastic.hostName=<>
>> > log4j.appender.elastic.applicationName=<>
>> >
>> > # more options (see github project for the full list)
>> > log4j.appender.elastic.elasticIndex=<>
>> > log4j.appender.elastic.elasticType=<>
>> >
>> > # Log all infos in the given file
>> > log4j.appender.file=org.apache.log4j.FileAppender
>> > log4j.appender.file.file=${log.file}
>> > log4j.appender.file.append=false
>> > log4j.appender.file.layout=org.apache.log4j.PatternLayout
>> > log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
>> >
>> > # suppress the irrelevant (wrong) warnings from the netty channel handler
>> > log4j.logger.org.jboss.netty.channel.DefaultChannelPipeline=ERROR,file
>> >
>> >
>> > How can I access the main/driver logs when the job is run with YARN as the master?
>> >
>> > Thanks,
>> > Hemant
>> >
>> >
>> >
>> >
