Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/2393
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user marmbrus commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58951776
Thanks! Merged to master.
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58605205
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/334/consoleFull)
for PR 2393 at commit
[`3a6511f`](https://github.com/
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58602694
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/334/consoleFull)
for PR 2393 at commit
[`3a6511f`](https://github.com/a
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58602197
Thank you @marmbrus , you're right!
I've just rebased onto the latest master, and the temp directory is now
deleted after the unit tests exit locally.
So, I thi
Github user marmbrus commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58601564
Also I merged #2670.
---
Github user marmbrus commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58601541
I'm not super tied to the temp dirs with prefix/suffix (though that is kind
of nice). I'm fine with changing that code to use whatever standard mechanisms the
rest of spark
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58475061
`TestHive` creates temporary files/directories with a specified prefix /
suffix; it would be great if `Utils` supported that as well, the code would be
even cleaner
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58471362
Ah, right. This is the shutdown hook:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L257
... but it require
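The shutdown-hook mechanism referenced above can be sketched roughly as follows. The names (`ShutdownDeleter`, `registerShutdownDeleteDir`) mirror the Spark utility under discussion, but this is an illustration under assumed behavior, not Spark's actual implementation:

```java
import java.io.File;
import java.util.LinkedHashSet;
import java.util.Set;

// Minimal sketch of a shutdown hook that deletes registered temp dirs on
// normal JVM exit (it will not run on kill -9 or a JVM crash).
public class ShutdownDeleter {
    private static final Set<File> registered = new LinkedHashSet<>();

    static {
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            synchronized (registered) {
                for (File dir : registered) deleteRecursively(dir);
            }
        }));
    }

    public static void registerShutdownDeleteDir(File dir) {
        synchronized (registered) {
            registered.add(dir);
        }
    }

    // File.delete() cannot remove non-empty directories, so recurse first.
    public static void deleteRecursively(File file) {
        File[] children = file.listFiles();
        if (children != null) {
            for (File child : children) deleteRecursively(child);
        }
        file.delete();
    }
}
```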
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58471183
Actually, I searched the code; it seems we haven't set the shutdown hook for
`Utils`. Do you mean I have to do that myself? I was thinking it should be set
properly som
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58471062
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/2
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58470882
Perhaps you can debug a bit first to see if the shutdown hook is called and
it attempts to delete the dir? Is there an error while deleting it? This
mechanism appears to
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58470589
@srowen thank you very much, this is quite informative.
I've updated the code with `Utils.registerShutdownDeleteDir`; the code is
very clean now! However, the t
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58465412
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/2
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58465409
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21522/consoleFull)
for PR 2393 at commit
[`ba1797a`](https://github.com/a
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58463679
Sorry for the late reply as well, but -1 since this reintroduces
Commons IO `FileUtils`, which broke the build recently. Use
`Utils.deleteRecursively`.
Why a
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58463076
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/2
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58462948
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21522/consoleFull)
for PR 2393 at commit
[`ba1797a`](https://github.com/ap
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-58462878
Sorry for the late reply. Thank you @marmbrus, using `HiveInterruptUtils`
was quite error-prone. I've updated the code with our own callback
function.
---
Github user marmbrus commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-56569695
@chenghao-intel any thoughts on avoiding the use of InterruptUtils? I'd
like to avoid relying on more Hive APIs than we have to.
---
Github user marmbrus commented on a diff in the pull request:
https://github.com/apache/spark/pull/2393#discussion_r17814084
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
---
@@ -69,11 +82,22 @@ class TestHiveContext(sc: SparkContext) extends
HiveConte
Hm, deleteOnExit should at least not hurt, and I thought it would delete dirs
if they are empty, which may be the case if the temp files inside never existed
or were cleaned up themselves. But yes, always delete explicitly in the normal
execution path, even in the event of normal exceptions.
On Sep 19, 2014 3:00
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-56127248
+1 lgtm
fyi, i checked, deleteOnExit isn't an option because it cannot recursively
delete
---
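The limitation noted above can be demonstrated directly: `deleteOnExit` is backed by `File.delete()`, which refuses to remove a non-empty directory, so it cannot clean up a temp dir that still contains files at JVM exit. A small sketch (class and method names are illustrative only):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Shows that File.delete() fails on a non-empty directory, which is why
// deleteOnExit cannot recursively remove temp dirs.
public class DeleteOnExitLimit {
    public static boolean tryDelete(File dir) {
        return dir.delete(); // false while the directory is non-empty
    }

    public static void main(String[] args) throws IOException {
        File dir = Files.createTempDirectory("demo").toFile();
        File child = new File(dir, "data.txt");
        child.createNewFile();

        System.out.println(tryDelete(dir)); // false: still contains data.txt
        child.delete();
        System.out.println(tryDelete(dir)); // true: directory is empty now
    }
}
```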
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-56124852
Thank you all, I've removed the `Signal` and use the
`Utils.deleteRecursively` instead.
---
Github user liancheng commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-55969364
+1 for the `deleteOnExit`/`deleteRecursively` pattern.
@mattf According to its
[Javadoc](http://docs.oracle.com/javase/7/docs/api/java/io/File.html#deleteOnExit
Github user mattf commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-55781471
+1 @srowen, using signal is too heavy-handed
i'm skeptical that the jvm can guarantee dir removal on failure (say kill
-9 or a jvm segv). those cases are hopefully
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-55728033
@chenghao-intel For example, in `FileServerSuite`:
```
override def beforeAll() {
  super.beforeAll()
  tmpDir = Files.createTempDir()
  tmp
```
Github user chenghao-intel commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-55694942
Thank you @srowen, I think you're right; we should provide a mechanism to
delete all of the temp files in the test framework on exit, not just for SQL. I
will investiga
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2393#discussion_r17533627
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
---
@@ -41,7 +49,27 @@ import org.apache.spark.sql.SQLConf
import scala.collect
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-55570548
[QA tests have
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/20332/consoleFull)
for PR 2393 at commit
[`4ecc9d4`](https://github.com/a
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2393#issuecomment-55563405
[QA tests have
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/20332/consoleFull)
for PR 2393 at commit
[`4ecc9d4`](https://github.com/ap
GitHub user chenghao-intel opened a pull request:
https://github.com/apache/spark/pull/2393
[SPARK-3529] [SQL] Delete the temp files after test exit
There are lots of temporary files created by TestHive under /tmp by
default, which may cause potential performance issues for testin