GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/22167
[SPARK-25170][DOC] Add list and short description of Spark Executor Task
Metrics to the documentation
## What changes were proposed in this pull request?
Add description of Task
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/22217
[SPARK-25228][CORE] Add executor CPU time metric
## What changes were proposed in this pull request?
Add a new metric to measure the executor's process (JVM) CPU time
## How was this patch tested?
Github user LucaCanali closed the pull request at:
https://github.com/apache/spark/pull/22217
---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/22218
[SPARK-25228][CORE] Add executor CPU time metric.
## What changes were proposed in this pull request?
Add a new metric to measure the executor's process (JVM) CPU time.
#
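The metric proposed in this PR measures the executor JVM's process CPU time. As a hedged sketch (not the PR's actual code), on HotSpot-based JVMs this value is typically obtained by casting the platform `OperatingSystemMXBean` to the `com.sun.management` variant; the object and method names below are illustrative:

```scala
import java.lang.management.ManagementFactory
import com.sun.management.{OperatingSystemMXBean => SunOsBean}

object ProcessCpuTimeSketch {
  // Returns the JVM process CPU time in nanoseconds, or -1 if unsupported.
  // Note: com.sun.management.OperatingSystemMXBean is HotSpot-specific,
  // which is the portability concern raised later in this thread.
  def processCpuTime(): Long = ManagementFactory.getOperatingSystemMXBean match {
    case bean: SunOsBean => bean.getProcessCpuTime
    case _               => -1L
  }
}
```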
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22279
@jerryshao would you have any additional comments on this?
---
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/19426
Thanks @hvanhovell for pushing the test. I see that the test build failed on the Scala style checks: the logs report "/Executor.scala:443:0:
Whitespace at end of line". However
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/19426
Thanks @umehrot2 for the review and comments. Indeed well spotted that I
had forgotten a couple of metrics and added one of them twice. This is
hopefully fixed with the latest commit. As for
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/19426
[SPARK-22190][CORE] Add Spark executor task metrics to Dropwizard metrics
## What changes were proposed in this pull request?
This proposed patch is about making Spark executor task
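PR #19426 exposes Spark executor task metrics through the Dropwizard metrics system, which models read-only values as named `Gauge` objects registered in a `MetricRegistry`. The following is a minimal sketch of that registration pattern only: the `Gauge` trait and registry here are simplified stand-ins for `com.codahale.metrics` (assumed unavailable in this sketch), and the metric name and counter are illustrative, not the PR's actual code:

```scala
import java.util.concurrent.atomic.AtomicLong
import scala.collection.mutable

// Simplified stand-ins for com.codahale.metrics.Gauge and MetricRegistry.
trait Gauge[T] { def getValue: T }

class MetricRegistry {
  private val gauges = mutable.Map.empty[String, Gauge[_]]
  def register[T](name: String, gauge: Gauge[T]): Gauge[T] = {
    gauges(name) = gauge
    gauge
  }
  def getGauges: Map[String, Gauge[_]] = gauges.toMap
}

object TaskMetricsSourceSketch {
  // Hypothetical running counter of bytes read by tasks on this executor.
  val bytesRead = new AtomicLong(0L)

  val registry = new MetricRegistry
  // A gauge re-reads the current value each time a metrics sink polls it.
  registry.register("executor.bytesRead", new Gauge[Long] {
    override def getValue: Long = bytesRead.get()
  })
}
```

Because the gauge reads the counter lazily at poll time, a Dropwizard sink always sees the current task-metric value rather than a snapshot taken at registration.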
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/19039
Add feature to permanently blacklist a user-specified list of nodes,
…SPARK-21829
## What changes were proposed in this pull request?
With this new feature I propose to
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/19039
Thanks @jiangxb1987 for the review. I have tried to address the comments in
a new commit, in particular adding the configuration to internal/config and
building a private function to handle
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/19039
@jiangxb1987 Indeed good suggestion by @jerryshao - I have replied on
SPARK-21829
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/18724
[SPARK-21519][SQL] Add an option to the JDBC data source to initialize the
target DB environment
Add an option to the JDBC data source to initialize the environment of the
remote database
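The option proposed here lets the JDBC source run a user-supplied SQL statement on the remote session before reading. A hedged usage sketch follows, assuming the option name `sessionInitStatement` (as later documented for the JDBC source under SPARK-21519); the URL, table, and statement are illustrative, and a `SparkSession` plus a reachable database are assumed:

```scala
// Usage sketch only: requires a SparkSession and a reachable database;
// the JDBC URL, table name, and init statement below are illustrative.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/service")
  .option("dbtable", "schema.table")
  .option("sessionInitStatement", "ALTER SESSION SET TIME_ZONE = 'UTC'")
  .load()
```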
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/18724
Thank you very much @gatorsmile for the review. I plan to provide the
required changes and add a test case, however it is probably going to take one
more week before I can do that.
---
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/18724#discussion_r132679699
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/jdbc/JDBCSuite.scala
---
@@ -1007,4 +1007,23 @@ class JDBCSuite extends SparkFunSuite
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r212933292
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -73,6 +75,13 @@ class ExecutorSource(threadPool: ThreadPoolExecutor
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r212939411
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -17,11 +17,13 @@
package org.apache.spark.executor
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/22279
[SPARK-25277][YARN] YARN applicationMaster metrics should not register
static metrics
## What changes were proposed in this pull request?
YARN applicationMaster metrics registration
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/22290
[SPARK-25285][CORE] Add executor task metrics, successfulTasks and
threadpool.startedTasks
## What changes were proposed in this pull request?
The motivation for these additional
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22279
Hi @jerryshao, below is an example of the metrics currently reported by the applicationMaster, illustrating the issue described here. You can find there the list of AM metrics reported
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22279#discussion_r214289112
--- Diff: core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala
---
@@ -103,6 +103,14 @@ private[spark] class MetricsSystem private
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r214393554
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -17,11 +17,13 @@
package org.apache.spark.executor
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22218
I have refactored the code to use the MBeanServer, which should address the comments about the availability of com.sun.management.OperatingSystemMXBean across different JDKs
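The MBeanServer-based approach avoids compiling against the HotSpot-specific `com.sun.management` interface by reading the `ProcessCpuTime` attribute reflectively through JMX. A minimal sketch of the technique (not the PR's exact code):

```scala
import java.lang.management.ManagementFactory
import javax.management.ObjectName

object MBeanCpuSketch {
  // Read the ProcessCpuTime attribute (nanoseconds) via JMX, avoiding a
  // compile-time dependency on com.sun.management.OperatingSystemMXBean.
  // Returns -1 if the attribute is unavailable on this JVM.
  def processCpuTime(): Long = {
    try {
      val server = ManagementFactory.getPlatformMBeanServer
      val name = new ObjectName("java.lang", "type", "OperatingSystem")
      server.getAttribute(name, "ProcessCpuTime").asInstanceOf[Long]
    } catch {
      case _: Exception => -1L
    }
  }
}
```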
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22218
I have implemented the changes from the latest comments, namely inlining the method.
---
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r214521537
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -73,6 +75,29 @@ class ExecutorSource(threadPool: ThreadPoolExecutor
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22218
I have implemented the changes from the latest comments by @maropu and @srowen
---
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22218
Thanks @srowen
---
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r214549896
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -73,6 +76,28 @@ class ExecutorSource(threadPool: ThreadPoolExecutor
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r214605458
--- Diff:
core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -73,6 +76,28 @@ class ExecutorSource(threadPool: ThreadPoolExecutor
GitHub user LucaCanali opened a pull request:
https://github.com/apache/spark/pull/22397
[SPARK-25170][DOC] Add list and short description of Spark Executor Task
Metrics to the documentation.
## What changes were proposed in this pull request?
Add description of Executor
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22167
Thanks @kiszk for reviewing this. I have addressed your comments in a new commit. Apologies, I have now moved this to a new PR:
https://github.com/apache/spark/pull/22397
I am closing
Github user LucaCanali closed the pull request at:
https://github.com/apache/spark/pull/22167
---
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22397
Thanks @srowen for reviewing this. The metrics are documented in the source code of the TaskMetrics class; I took most of the descriptions from there, adding some additional explanations where needed
Github user LucaCanali commented on the issue:
https://github.com/apache/spark/pull/22279
@attilapiros would you be interested in reviewing this as a follow-up to your work on [SPARK-24594][YARN] Introducing metrics for YARN
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22279#discussion_r225088430
--- Diff: core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala
---
@@ -103,6 +103,14 @@ private[spark] class MetricsSystem private
Github user LucaCanali commented on a diff in the pull request:
https://github.com/apache/spark/pull/22279#discussion_r225280136
--- Diff: core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala
---
@@ -103,6 +103,14 @@ private[spark] class MetricsSystem private