GitHub user assia6 opened a pull request:
https://github.com/apache/spark/pull/20092
Memory and disk usage
Hi. I want to get memory and disk usage while running a PageRank application,
so I would like to make some changes using the code that you proposed.
The problem is that I have a prebuilt installation of Spark and I don't have
a core folder inside, only bin, conf, sbin, data, jars, ... folders.
Can you please tell me how I can add or extend classes inside my Spark
installation?
How can I get storage memory and execution memory measurements even after the
execution has finished?
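For illustration, here is a minimal sketch of the kind of measurement I am trying
to get from the driver program without rebuilding the core module (assuming the
public SparkListener, TaskMetrics and SparkContext.getExecutorMemoryStatus APIs of
Spark 2.x; the names MemoryUsageSketch and PageRankMemoryUsage are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    object MemoryUsageSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("PageRankMemoryUsage"))

        // Report per-task execution-memory and spill metrics as tasks finish.
        sc.addSparkListener(new SparkListener {
          override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
            val m = taskEnd.taskMetrics
            if (m != null) {
              println(s"stage ${taskEnd.stageId}: peakExecutionMemory=${m.peakExecutionMemory} " +
                s"memoryBytesSpilled=${m.memoryBytesSpilled} diskBytesSpilled=${m.diskBytesSpilled}")
            }
          }
        })

        // Per-executor storage memory snapshot: (max memory for caching, remaining memory).
        sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, remainingMem)) =>
          println(s"$executor: maxStorageMemory=$maxMem remaining=$remainingMem")
        }

        // ... run the PageRank job here ...

        sc.stop()
      }
    }

If I understand the monitoring documentation correctly, metrics after the
application has finished should be available from the history server REST API
(for example /api/v1/applications/<app-id>/executors) when
spark.eventLog.enabled is set to true, rather than from code inside core.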
Thank you
Please review http://spark.apache.org/contributing.html before opening a
pull request.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/ThySinner/spark master
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/20092.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #20092
----
commit 8198e90208b9795185d0df0bf8af6cd6780e8bda
Author: Takuya UESHIN <ueshin@...>
Date: 2016-10-19T09:06:43Z
[SPARK-17985][CORE] Bump commons-lang3 version to 3.5.
## What changes were proposed in this pull request?
`SerializationUtils.clone()` of commons-lang3 (<3.5) has a bug that breaks
thread safety: it sometimes gets stuck due to a race condition when
initializing a hash map.
See https://issues.apache.org/jira/browse/LANG-1251.
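As a minimal sketch of the failure mode (not taken from the patch; Payload is a
made-up Serializable type used only for illustration), concurrent clone() calls
like these could get stuck with commons-lang3 < 3.5:

    import org.apache.commons.lang3.SerializationUtils

    // Made-up Serializable payload used only for this illustration.
    case class Payload(id: Int, name: String)

    object CloneRaceSketch {
      def main(args: Array[String]): Unit = {
        val payload = Payload(1, "example")
        // Several threads cloning concurrently; before 3.5, clone() could get
        // stuck due to a race while initializing a hash map (LANG-1251).
        val threads = (1 to 8).map { _ =>
          new Thread(new Runnable {
            override def run(): Unit =
              (1 to 1000).foreach(_ => SerializationUtils.clone(payload))
          })
        }
        threads.foreach(_.start())
        threads.foreach(_.join())
      }
    }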
## How was this patch tested?
Existing tests.
Author: Takuya UESHIN <[email protected]>
Closes #15548 from ueshin/issues/SPARK-17985.
commit 054fa811f22d5b51338326598fa4266fef0f7ad0
Author: Shixiong Zhu <shixiong@...>
Date: 2016-12-14T02:36:36Z
[SPARK-18588][TESTS] Ignore KafkaSourceStressForDontFailOnDataLossSuite
## What changes were proposed in this pull request?
Disable KafkaSourceStressForDontFailOnDataLossSuite for now.
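As a minimal sketch of one common way to disable a flaky suite in ScalaTest
(the suite body here is a placeholder, not the actual Kafka test, and the real
patch may differ):

    import org.scalatest.{FunSuite, Ignore}

    // @Ignore at the class level skips every test in the suite while keeping it compiling.
    @Ignore
    class FlakyStressSuiteSketch extends FunSuite {
      test("stress test") {
        // flaky assertions omitted
      }
    }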
## How was this patch tested?
Jenkins
Author: Shixiong Zhu <[email protected]>
Closes #16275 from zsxwing/ignore-flaky-test.
----
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]