Hi, Soheil

A log configuration file placed in the resource directory of the job's jar 
will not be picked up by Flink, because the log configuration is explicitly 
specified by the scripts under Flink's bin directory and by the bootstrap code 
(for example the BootstrapTools class). If you want to write the logs of the Flink 
components (such as the Client, JM and TM) to non-default files, you need to 
modify Flink's own log configuration files, as described in the document 
mentioned by Caizhi and Biao. Note that Flink's underlying logging framework 
defaults to log4j, so by default you should modify the "log4j*.properties" files; 
the "logback*.xml" files are the configuration files for logback.


But I guess you might want to specify a log file for the job itself rather than 
for the Flink components. If so, one way is to create a custom root logger, as 
shown in the example code below. The code is for log4j; if you use logback and 
are interested in that, you can adapt it yourself.


import org.apache.flink.api.common.functions.MapFunction;

import org.apache.log4j.Appender;
import org.apache.log4j.FileAppender;
import org.apache.log4j.Hierarchy;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;
import org.apache.log4j.spi.RootLogger;

import java.io.File;
import java.io.IOException;
import java.util.Enumeration;

public static final class PrintLog implements MapFunction<String, String> {
    private static final Logger LOG = CustomLogger.getLogger(PrintLog.class);

    @Override
    public String map(String value) {
        LOG.info("Custom Logger: " + value);
        return value;
    }
}

public static final class CustomLogger {
    // A root logger in its own hierarchy, so it does not interfere with Flink's logging.
    private static final Logger rootLogger =
            new Hierarchy(new RootLogger(Level.INFO)).getRootLogger();

    static {
        try {
            // Write the job's logs to "myjob.log" in the same directory as Flink's log file.
            FileAppender customAppender = new FileAppender(
                    new PatternLayout("%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n"),
                    new File(getLogPath(), "myjob.log").getAbsolutePath(),
                    false);

            customAppender.setName("custom");
            rootLogger.addAppender(customAppender);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static Logger getLogger(Class<?> clazz) {
        return rootLogger.getLoggerRepository().getLogger(clazz.getName());
    }

    private static String getLogPath() {
        String path = null;

        // Reuse the directory of the file appender configured for Flink, if there is one.
        Enumeration<?> enumeration = Logger.getRootLogger().getAllAppenders();
        while (enumeration.hasMoreElements()) {
            Appender appender = (Appender) enumeration.nextElement();
            if (appender instanceof FileAppender) {
                path = new File(((FileAppender) appender).getFile()).getParent();
                break;
            }
        }
        // Fall back to the working directory when no file appender is configured (e.g. in the IDE).
        if (path == null || path.isEmpty()) {
            path = new File("").getAbsolutePath();
        }

        return path;
    }
}
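
For completeness, the job would use PrintLog like any other MapFunction. The 
following wiring is only an illustrative sketch (the input data and job name are 
made up):

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Every record passing through PrintLog is written to myjob.log by the custom appender.
env.fromElements("first event", "second event")
   .map(new PrintLog())
   .print();

env.execute("custom-logger-example");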




Best,
Haibo

At 2019-07-19 11:21:45, "Biao Liu" <mmyy1...@gmail.com> wrote:

Hi Soheil,


> I was wondering if it is possible to save the logs into a specified file?


Yes, of course.


> I put the following file in the resource directory of the project but it has 
> no effect


I guess that is because log4j takes higher priority. The document [1] says 
"Users willing to use logback instead of log4j can just exclude log4j (or 
delete it from the lib/ folder)."


1. 
https://ci.apache.org/projects/flink/flink-docs-stable/monitoring/logging.html
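
In a Maven project, "excluding log4j" usually means something like the sketch 
below in the job's pom.xml; the artifact name and versions here are only 
placeholders, so please check the linked page for the exact dependencies for 
your Flink/Scala version:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>${flink.version}</version>
    <exclusions>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<!-- then add logback explicitly -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version>
</dependency>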




Soheil Pourbafrani <soheil.i...@gmail.com> wrote on Fri, Jul 19, 2019 at 2:03 AM:

Hi,



When we run a Flink application, some logs about the run are generated, in both 
the local and the distributed environments. I was wondering if it is possible to 
save the logs into a specified file?


I put the following file in the resource directory of the project but it has no 
effect:
logback.xml


<configuration>
    <appender name="file" class="ch.qos.logback.core.FileAppender">
        <file>flink_logs.txt</file>
        <append>false</append>
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{60} %X{sourceThread} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="file"/>
    </root>
</configuration>
