Steve Loughran commented on SPARK-14561:

To clarify: it's not changes to existing files that aren't showing up; *it is 
new files added to the same destination directory*.

If that's the case, something is up with the scanning.

# set the logging of {{org.apache.spark.deploy.history.FsHistoryProvider}} to a more verbose level
# have a look at the scan interval. Is it too long?
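
A minimal sketch of step 1, assuming the history server is picking up a standard {{log4j.properties}} (the logger name below is the fully qualified class from the Spark source; the rest of the file is whatever you already have):

{code}
# log4j.properties fragment: turn on debug logging for the
# history provider so each directory scan is visible in the logs
log4j.logger.org.apache.spark.deploy.history.FsHistoryProvider=DEBUG
{code}

For step 2, the scan interval should be visible in those logs; it is configured via {{spark.history.fs.update.interval}} in {{spark-defaults.conf}} (assuming the stock property name; the default is on the order of 10s).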

> History Server does not see new logs in S3
> ------------------------------------------
>                 Key: SPARK-14561
>                 URL: https://issues.apache.org/jira/browse/SPARK-14561
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.6.1
>            Reporter: Miles Crawford
> If you set the Spark history server to use a log directory with an s3a:// 
> url, everything appears to work fine at first, but new log files written by 
> applications are not picked up by the server.

This message was sent by Atlassian JIRA
