GitHub user brkyvz opened a pull request:

    https://github.com/apache/spark/pull/9285

    [SPARK-11324] Flag for closing Write Ahead Logs after a write

    Currently the Write Ahead Log in Spark Streaming flushes data as writes 
need to be made. S3 does not support flushing; data is only persisted once the 
stream is actually closed.
    In case of failure, the data for the last rolling interval (one minute by 
default) would therefore not be properly written. This PR adds a flag to close 
the stream after each write, so that we achieve read-after-write consistency.
    
    cc @tdas @zsxwing 

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/brkyvz/spark caw-wal

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/9285.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #9285
    
----
commit f80910448e4b0b4d10e79e89921d9116ef0872aa
Author: Burak Yavuz <[email protected]>
Date:   2015-10-25T04:34:25Z

    close after write

----

