Hi guys,

I have the following use case: every day a new file is created, and log
records are periodically appended to it. I am reading the file like this:

executionEnvironment.readFile(format, directoryPath, PROCESS_CONTINUOUSLY,
        period.toMilliseconds(), filePathFilter);

However, Flink treats a modified file as a new file, so the entire content
of the modified file gets processed again. I know one workaround is to wait
until the file contains all of the day's records before processing it, but I
would like to process the file continuously. Is there a way to process only
the new records appended to a file?

Thank you in advance! :)
Nancy 

--
View this message in context: 
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/readFile-Continuous-file-processing-tp11384.html
Sent from the Apache Flink User Mailing List archive. mailing list archive at 
Nabble.com.