Github user sarutak closed the pull request at:
https://github.com/apache/spark/pull/1580
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user jerryshao commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50226383
IMHO it would be better to let it fail when this situation happens; failing
fast is better than trying to recover, I think :).
---
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50230269
I think it sometimes depends on the kind of error.
---
Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50246267
Actually we have also seen this happen multiple times.
A few of them have been fixed, but not all have been identified.
For example, there is incorrect DCL
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50174987
@colorant @jerryshao Normally, a non-existent directory doesn't happen. I aim
at recovering from operational mistakes or accidents.
In my experience, system operator
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50184666
But the point above was that the code that creates this object goes through
DiskBlockManager.getFile, which already creates any non-existent directories.
So I don't think
Github user sarutak commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50215042
When parent directories named spark-local-* were deleted before the shuffle,
you can see a stack trace like this:
java.io.FileOutputStream.open(Native
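The failure described above can be reproduced outside Spark: `java.io.FileOutputStream` does not create missing parent directories, so opening a file under a deleted directory throws `FileNotFoundException`. A minimal standalone sketch (the directory and file names are illustrative, not Spark's):

```java
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.nio.file.Files;

public class MissingParentDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for a spark-local-* directory that an operator deleted.
        File parent = Files.createTempDirectory("spark-local-demo").toFile();
        parent.delete();

        File target = new File(parent, "shuffle_0_0_0");
        try {
            // FileOutputStream.open fails because the parent is gone.
            new FileOutputStream(target).close();
            System.out.println("opened");
        } catch (FileNotFoundException e) {
            System.out.println("FileNotFoundException as expected");
        }
    }
}
```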
GitHub user sarutak opened a pull request:
https://github.com/apache/spark/pull/1580
[SPARK-2671] BlockObjectWriter should create parent directory when the
directory doesn't exist
You can merge this pull request into a Git repository by running:
$ git pull
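A minimal sketch of the idea behind this PR, assuming a hypothetical helper (`openWithParent`) rather than Spark's actual `BlockObjectWriter` code: create the parent directory, if missing, before opening the output stream.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;

public class CreateParentDemo {
    // Hypothetical helper illustrating the PR's approach: ensure the
    // parent directory exists before opening the stream.
    static FileOutputStream openWithParent(File file) throws IOException {
        File parent = file.getParentFile();
        if (parent != null && !parent.exists() && !parent.mkdirs()) {
            throw new IOException("Failed to create " + parent);
        }
        return new FileOutputStream(file);
    }

    public static void main(String[] args) throws Exception {
        File base = Files.createTempDirectory("demo").toFile();
        // Parent subdirectory does not exist yet; the helper creates it.
        File target = new File(new File(base, "missing-subdir"), "block");
        try (FileOutputStream out = openWithParent(target)) {
            System.out.println("opened: " + target.exists());
        }
    }
}
```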
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50065852
Can one of the admins verify this patch?
---
Github user colorant commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50098663
Just curious, in which case might this non-existent directory happen in the
current Spark code path? ;)
---
Github user jerryshao commented on the pull request:
https://github.com/apache/spark/pull/1580#issuecomment-50109613
Usually a File is obtained through `DiskBlockManager.getFile()`, so the
parent directory will be created in `getFile()`; I think you needn't worry
about the parent directory if
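The behavior jerryshao and mateiz describe can be sketched as follows: a `getFile`-style lookup that hashes a block name into a subdirectory and creates that subdirectory on first use. This is a simplified illustration of the pattern, not Spark's actual `DiskBlockManager` code; the names and the hashing scheme are assumptions.

```java
import java.io.File;
import java.nio.file.Files;

public class GetFileDemo {
    // Simplified sketch: map a block name to a subdirectory under the
    // local dir and create that subdirectory lazily on first access.
    static File getFile(File localDir, String name, int subDirsPerDir) {
        int hash = Math.abs(name.hashCode() % subDirsPerDir);
        File subDir = new File(localDir, String.format("%02x", hash));
        subDir.mkdirs(); // created on demand, so callers see an existing parent
        return new File(subDir, name);
    }

    public static void main(String[] args) throws Exception {
        File localDir = Files.createTempDirectory("spark-local-demo").toFile();
        File f = getFile(localDir, "shuffle_0_0_0", 64);
        System.out.println("parent exists: " + f.getParentFile().exists());
    }
}
```

The point of contention in the thread is that this guarantee only holds while the subdirectory survives; if an operator deletes it after `getFile` returns, the subsequent open still fails.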