Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
ping @rxin I'd appreciate your feedback; please let me know what you think
of my suggestion. Thanks.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
Forgot to say: of course, this example will throw the exception only when
running in "test".
Other developers could possibly encounter this when they write test code
in the future. If we could
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
The test case I added in this PR:

    val rng = new scala.util.Random(42)
    val data = sparkContext.parallelize(Seq.tabulate(100) { i =>
      Row(Array.fill(10)(rng.nextInt(10)))
    })
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15916
Can you show an example of a leak that would happen in Executor but not in
the callback? Thanks.
---
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
@rxin BTW, I see you merged #15989 to downgrade error message level in
TaskMemoryManager.
I'd like to modify the error message in Executor too, because the current
one is a little confusing.
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
@rxin Thanks. Appreciate your feedback. I could close this now.
---
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15916
No, it's not needed if you release memory in exactly the same way as the
catch-all does. It is basically just the catch-all itself.
---
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
Hello @rxin @hvanhovell, can you let me know whether this is needed, so I
can close it if not? Thank you.
---
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
@hvanhovell @rxin What do you think? Please let me know if we need this or
not. Thanks.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15916
Merged build finished. Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15916
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68824/
Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/15916
**[Test build #68824 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68824/consoleFull)**
for PR 15916 at commit
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
Yeah, I see. As I said in a previous comment, the memory is released at the
end anyway.
I would guess the default setting of true is meant to find potential memory
leaks during development. So turn
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15916
I'd maybe set it to false. You are just adding a completion listener, which
is the same as taskMemoryManager.cleanUpAllAllocatedMemory anyway ...
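The equivalence rxin describes can be sketched with a toy model (plain Scala, not actual Spark internals; `ToyMemoryManager` and `ToyTaskContext` are hypothetical stand-ins): a completion listener that frees an operator's memory leaves nothing for the end-of-task catch-all to reclaim, so the catch-all alone achieves the same result.

```scala
import scala.collection.mutable.ArrayBuffer

// Stand-in for a task memory manager: tracks bytes and offers a catch-all
// cleanup that frees whatever is still held and reports how much leaked.
class ToyMemoryManager {
  private var allocated = 0L
  def acquire(bytes: Long): Unit = allocated += bytes
  def release(bytes: Long): Unit = allocated -= bytes
  def cleanUpAllAllocatedMemory(): Long = {
    val leaked = allocated
    allocated = 0L
    leaked
  }
}

// Stand-in for a task context that fires completion listeners when the
// task finishes, before the catch-all cleanup runs.
class ToyTaskContext {
  private val listeners = ArrayBuffer.empty[() => Unit]
  def addTaskCompletionListener(f: () => Unit): Unit = listeners += f
  def markTaskCompleted(): Unit = listeners.foreach(_.apply())
}

object ListenerDemo {
  def run(useListener: Boolean): Long = {
    val mm  = new ToyMemoryManager
    val ctx = new ToyTaskContext
    mm.acquire(1024) // an operator grabs some execution memory
    if (useListener) ctx.addTaskCompletionListener(() => mm.release(1024))
    ctx.markTaskCompleted()        // task finishes; listeners fire first
    mm.cleanUpAllAllocatedMemory() // catch-all runs last; returns leaked bytes
  }
}
```

With the listener the catch-all sees zero leaked bytes; without it, it reclaims the full 1024. Either way the memory is released, which is why the listener on its own adds no safety over the catch-all.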
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15916
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68818/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/15916
Merged build finished. Test PASSed.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/15916
**[Test build #68818 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68818/consoleFull)**
for PR 15916 at commit
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
Of course it is not actually a real memory leak because the memory is
released at the end by calling `taskMemoryManager.cleanUpAllAllocatedMemory` in
`Executor`. But with
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/15916
**[Test build #68824 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68824/consoleFull)**
for PR 15916 at commit
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
@rxin An exception will be thrown, not just a warning message. This
exception is thrown at `Executor` after it calls
`taskMemoryManager.cleanUpAllAllocatedMemory` and finds there is memory not
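The check being discussed can be sketched like this (toy Scala; `ToyTaskMemoryManager` and the flag name are hypothetical stand-ins, not actual Spark code): after the catch-all reclaims memory, throw only when a strict/testing flag is on, otherwise just log a warning.

```scala
// Stand-in task memory manager whose catch-all cleanup frees everything
// still allocated and reports the leaked amount.
class ToyTaskMemoryManager(private var allocated: Long) {
  def cleanUpAllAllocatedMemory(): Long = {
    val leaked = allocated
    allocated = 0L
    leaked
  }
}

object LeakCheckDemo {
  // Mimics the Executor-side check: throw on a detected leak only when the
  // strict flag (as in a test run) is enabled; otherwise warn and continue.
  def runTask(tmm: ToyTaskMemoryManager, exceptionOnLeak: Boolean): String = {
    val leaked = tmm.cleanUpAllAllocatedMemory()
    if (leaked > 0) {
      if (exceptionOnLeak)
        throw new IllegalStateException(
          s"Managed memory leak detected; size = $leaked bytes")
      else s"WARN: managed memory leak detected; size = $leaked bytes"
    } else "ok"
  }
}
```

Under this sketch a clean task returns "ok", a leaky task in normal mode only warns, and the same leaky task in strict mode fails with an exception, which matches the behavior described in the thread.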
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15916
What do you mean by "report"? A warning message was logged? If a warning
message was logged, it is generated by the callback itself which just releases
the memory.
---
Github user viirya commented on the issue:
https://github.com/apache/spark/pull/15916
@rxin Yeah, the added test case will report memory leak failure.
---
Github user rxin commented on the issue:
https://github.com/apache/spark/pull/15916
Did you actually see an issue with memory leak?
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/15916
**[Test build #68818 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68818/consoleFull)**
for PR 15916 at commit