Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9448#discussion_r43895579
--- Diff: core/src/main/scala/org/apache/spark/ui/UIUtils.scala ---
@@ -143,14 +143,10 @@ private[spark] object UIUtils extends Logging {
// Yarn
Github user vundela commented on the pull request:
https://github.com/apache/spark/pull/9448#issuecomment-153568315
Thanks for the review @vanzin, made the changes as you suggested.
---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/9448
[SPARK-11484][WebUi] Using proxyBase set by spark AM
Use the proxyBase set by the AM; if it is not found, fall back to the environment variable. This fixes the issue where somebody accidentally sets APPLICATION_WEB_PROXY_BASE
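The fallback order described above (the AM-set proxy base first, then the environment variable) can be sketched as follows; `resolve_proxy_base` is a hypothetical helper for illustration, not Spark's actual code:

```python
import os

def resolve_proxy_base(am_proxy_base=None):
    # Prefer the proxy base set by the application master; only fall
    # back to the APPLICATION_WEB_PROXY_BASE environment variable
    # (which a user may have set accidentally) when the AM gave none.
    if am_proxy_base:
        return am_proxy_base
    return os.environ.get("APPLICATION_WEB_PROXY_BASE", "")
```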
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42043365
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -340,6 +340,14 @@ private[spark] class Client(
"for alterna
Github user vundela commented on the pull request:
https://github.com/apache/spark/pull/9118#issuecomment-148163495
@vanzin, can you please review the pull request?
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42043343
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -340,6 +340,14 @@ private[spark] class Client(
"for alterna
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/9118
[SPARK-11105][yarn] Distribute log4j.properties to executors
Currently the log4j.properties file is not uploaded to the executors, which
leads them to use the default values. This fix will make sure
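The idea of the fix, always shipping a local log4j.properties alongside the user's files, can be sketched in Python; `files_to_distribute` and its arguments are illustrative stand-ins, not the actual Client.scala logic:

```python
import os

def files_to_distribute(user_files, log4j_path="log4j.properties"):
    # Illustrative sketch: if a local log4j.properties exists, add it
    # to the set of files shipped to executors so they do not silently
    # fall back to the default logging configuration.
    files = list(user_files)
    if os.path.isfile(log4j_path) and log4j_path not in files:
        files.append(log4j_path)
    return files
```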
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42043730
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -340,6 +340,14 @@ private[spark] class Client(
"for alterna
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42058235
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -340,6 +340,14 @@ private[spark] class Client(
"for alterna
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42058190
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -340,6 +340,14 @@ private[spark] class Client(
"for alterna
Github user vundela commented on the pull request:
https://github.com/apache/spark/pull/9118#issuecomment-149328542
@tgravescs
Thanks for the review. Yes, it makes sense to update the documentation.
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42503914
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -497,6 +497,19 @@ private[spark] class Client(
*/
private def
Github user vundela commented on the pull request:
https://github.com/apache/spark/pull/9118#issuecomment-149588730
@tgravescs Yes, it still works and users see a deprecation message.
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9118#discussion_r42067281
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -340,6 +340,15 @@ private[spark] class Client(
"for alterna
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/9809
[SPARK-11799][CORE] Make it explicit in executor logs that uncaught exceptions are thrown during executor shutdown
This commit will make sure that when uncaught exceptions
Github user vundela commented on the pull request:
https://github.com/apache/spark/pull/9809#issuecomment-157873004
Can you please retest?
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/9866
[SPARK-11801][CORE] Notify driver when OOM is thrown before executor JVM is killed
This fix tries to make sure that a task which caught an OOM will update its status
to the driver so
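The intent described above can be sketched as follows; `run_task` and the `report_to_driver` callback are hypothetical names, and Python's MemoryError stands in for the JVM's OutOfMemoryError:

```python
def run_task(task, report_to_driver):
    # If the task dies with an out-of-memory error, tell the driver
    # before the process goes away instead of failing silently.
    # `report_to_driver` is a hypothetical callback, not Spark's API.
    try:
        return task()
    except MemoryError as exc:
        report_to_driver("TASK_FAILED", repr(exc))
        raise
```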
Github user vundela commented on the pull request:
https://github.com/apache/spark/pull/9866#issuecomment-158505231
Here is a snippet of the messages in the driver logs:
15/11/19 16:31:23 INFO YarnAllocator: Canceling requests for 1 executor
containers
15/11/19 16:31:23 WARN
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/9809#discussion_r45260699
--- Diff: core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala ---
@@ -46,6 +46,18 @@ private[spark] object
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/14989
[MINOR][SQL] Fixing the typo in unit test
## What changes were proposed in this pull request?
Fixing the typo in the unit test of CodeGenerationSuite.scala
## How
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/15003
@srowen, it's mostly for code readability. One of our customers hit the >64KB JVM
code size limit, and I saw a lot of new lines. I thought it would give better
readability when the code is in the logs.
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/15003
[MINOR][SQL] Stripping extra new lines from the generator code
## What changes were proposed in this pull request?
Here is the simple unit test
test("Extra new lines in gene
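The transformation this PR proposes, collapsing runs of blank lines in generated code, can be sketched with a regular expression; this is a minimal stand-in, not the actual codegen change:

```python
import re

def strip_extra_newlines(code: str) -> str:
    # Collapse any run of blank (or whitespace-only) lines in the
    # generated code down to a single line break.
    return re.sub(r"\n\s*\n", "\n", code)
```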
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/15003
The log size will definitely be smaller. When I looked into the generated code for
caseWhen expressions with UDFs, the majority was comments and extra new lines. I
don't know whether the JVM considers
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/15003
I am also seeing a lot of redundant code. I grepped for a single if
condition on the variable got98, and here is the output.
boolean got98 = false;
if (!got98
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/15003
Confirmed that comments and extra new lines in a method body will not lead to
the 64KB limit exception.
Github user vundela closed the pull request at:
https://github.com/apache/spark/pull/15003
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/15003
Agreed @srowen, I will close this PR.
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17688
Hi @felixcheung, thanks for the review. I have added a small test case.
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/17688
[MINOR][DOCS] Adding missing boolean type for replacement value in fillna
## What changes were proposed in this pull request?
Currently pyspark Dataframe.fillna API
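As a hedged illustration of the documented contract (the fillna replacement value may be an int, float, string, or bool), here is a toy pure-Python stand-in; `fillna_rows` is not Spark's implementation:

```python
def fillna_rows(rows, value):
    # Toy stand-in for DataFrame.fillna, only to illustrate the doc
    # change: the replacement value may be an int, float, string, or
    # bool. Replaces None entries in each row dict with `value`.
    if not isinstance(value, (bool, int, float, str)):
        raise TypeError("value must be an int, float, string, or bool")
    return [{k: (value if v is None else v) for k, v in row.items()}
            for row in rows]
```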
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17694
Hi @maver1ck, thanks for your time testing the patch.
I did run the patch with 1000 threads and it works fine.
Can you please check
/grid/3/hadoop/yarn/log/usercache/bi/appcache
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/17694
[SPARK-12717][PYSPARK] Resolving race condition with pyspark broadcasts when using multiple threads
## What changes were proposed in this pull request?
In pyspark when
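One way to picture the fix, serializing the broadcast-dump step so concurrent threads cannot corrupt a shared file, is the sketch below; the names are illustrative stand-ins, not pyspark's internals:

```python
import threading

# One process-wide lock: only a single thread materializes a broadcast
# payload at a time, so concurrent threads cannot interleave writes.
_dump_lock = threading.Lock()

def dump_broadcast(value, store):
    # `store` is a hypothetical stand-in for the broadcast file store.
    with _dump_lock:
        key = id(value)
        store[key] = repr(value)
        return key
```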
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/17722
[SPARK-12717][PYSPARK][BRANCH-1.6] Resolving race condition with pyspark
broadcasts when using multiple threads
## What changes were proposed in this pull request?
In pyspark
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17688
@holdenk Thanks for the review. Can you please let me know the line number
where you are expecting the list of types to be missing. Is this for fillna or another API?
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17694
Thanks for testing @maver1ck. I will look into 1.6 and 2.0.2.
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17694
Filed a PR to fix the issue in the Spark 1.6 branch.
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/17662#discussion_r111833507
--- Diff: python/pyspark/context.py ---
@@ -240,6 +240,26 @@ def signal_handler(signal, frame):
if isinstance(threading.current_thread
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/17662#discussion_r111832361
--- Diff: python/pyspark/context.py ---
@@ -240,6 +240,26 @@ def signal_handler(signal, frame):
if isinstance(threading.current_thread
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17688
@holdenk, Thanks for reviewing it. I guess L1237 can't be changed until we
support boolean type. L1240 specifically talks about the types of values
supported in dict. Please let me know if you
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17694
cc @holdenk
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17694
Thanks for your time @holdenk.
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17688
Thanks @holdenk
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/17694
ping @holdenk, checking if you have some time to review this.
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/19709
[SPARK-22483][CORE] Exposing java.nio bufferedPool memory metrics to Metric System
## What changes were proposed in this pull request?
Adds java.nio bufferedPool memory metrics
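A rough picture of what registering per-pool gauges looks like, using plain Python dicts as hypothetical stand-ins for the metrics registry and the java.nio buffer pools (such as 'direct' and 'mapped'):

```python
def register_pool_gauges(registry, pools):
    # For each buffer pool, register gauges for its used memory and
    # total capacity. `registry` (name -> zero-arg callable) and
    # `pools` (name -> stats dict) are illustrative stand-ins for
    # Spark's MetricsSystem and java.nio's BufferPoolMXBeans.
    for name, pool in pools.items():
        registry[name + ".memoryUsed"] = lambda p=pool: p["used"]
        registry[name + ".totalCapacity"] = lambda p=pool: p["capacity"]
    return registry
```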
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/19709
retest this please.
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews
GitHub user vundela opened a pull request:
https://github.com/apache/spark/pull/19699
[MINOR][Core] Fix nits in MetricsSystemSuite
## What changes were proposed in this pull request?
Fixing nits in MetricsSystemSuite file
## How was this patch tested?
Ran the tests
Github user vundela commented on a diff in the pull request:
https://github.com/apache/spark/pull/19699#discussion_r149843907
--- Diff: core/src/test/scala/org/apache/spark/metrics/MetricsSystemSuite.scala ---
@@ -42,7 +43,7 @@ class MetricsSystemSuite extends SparkFunSuite
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/19699
Thanks for the review @vanzin, @srowen. Added the appropriate summary.
Github user vundela commented on the issue:
https://github.com/apache/spark/pull/20512
cc @squito @vanzin
Can you please comment on this PR?