Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/22213
Thank you for reviews @vanzin @steveloughran @jerryshao @HyukjinKwon
---
-
To unsubscribe, e-mail: reviews-unsubscr
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/22213
rebased
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/22213#discussion_r214237801
--- Diff:
core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala ---
@@ -1144,6 +1144,46 @@ class SparkSubmitSuite
conf1.get
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/22213
@steveloughran Regarding the XML format, java.util.Properties has dedicated
storeToXML/loadFromXML methods, which Spark does not use, so we don't need to check
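For context, an illustrative sketch (not code from the PR) of the point above: the XML property format has its own dedicated entry points in the JDK, separate from the plain-text `load`/`store` pair:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, StringReader}
import java.util.Properties

// Plain-text format: the load/store pair that conf loading relies on.
val plain = new Properties()
plain.load(new StringReader("spark.master=local[2]"))

// XML format: a separate, dedicated pair -- storeToXML/loadFromXML.
val buf = new ByteArrayOutputStream()
plain.storeToXML(buf, "example comment")

val fromXml = new Properties()
fromXml.loadFromXML(new ByteArrayInputStream(buf.toByteArray))
val roundTripped = fromXml.getProperty("spark.master")
```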
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/22213
Thanks for the comment @steveloughran. I'll add more tests for now and see
how the discussion goes from there.
As for the transition to UTF, it means that to be fully correct Spark needs
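To illustrate the charset point (a minimal sketch, not the PR's code): `Properties.load(InputStream)` decodes the stream as ISO-8859-1 per its specification, so a UTF-8 property file needs the `Reader` overload with an explicit decoder:

```scala
import java.io.{ByteArrayInputStream, InputStreamReader}
import java.nio.charset.StandardCharsets
import java.util.Properties

// A property file containing a non-ASCII value, encoded as UTF-8.
val bytes = "greeting=héllo".getBytes(StandardCharsets.UTF_8)

// load(InputStream) decodes as ISO-8859-1, mangling multi-byte characters.
val latin1 = new Properties()
latin1.load(new ByteArrayInputStream(bytes))

// load(Reader) with an explicit UTF-8 decoder preserves the value.
val utf8 = new Properties()
utf8.load(new InputStreamReader(new ByteArrayInputStream(bytes), StandardCharsets.UTF_8))
```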
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/22213
@jerryshao here is my new take on the problem that should be more
acceptable. The premise is that since the JDK has already parsed out the natural
line delimiters '\r' and '\n', the remaining ones
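A sketch of that premise (illustrative key name, not from the PR): `load` consumes the line terminators itself, and whitespace that survives into the value, such as a trailing space, is preserved by the JDK unless the caller trims it afterwards:

```scala
import java.io.StringReader
import java.util.Properties

val props = new Properties()
// The '\n' is consumed by load() as a line terminator, but the
// trailing space before it remains part of the value.
props.load(new StringReader("spark.example.delimiter=| \n"))

val raw = props.getProperty("spark.example.delimiter")
val trimmed = raw.trim
```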
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/22213#discussion_r213489337
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2062,8 +2062,10 @@ private[spark] object Utils extends Logging {
try
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/22213#discussion_r213488600
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2062,8 +2062,10 @@ private[spark] object Utils extends Logging {
try
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/22213#discussion_r213049701
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2062,8 +2062,10 @@ private[spark] object Utils extends Logging {
try
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/22213#discussion_r212696259
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2062,8 +2062,10 @@ private[spark] object Utils extends Logging {
try
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/22213#discussion_r212695364
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -2062,8 +2062,10 @@ private[spark] object Utils extends Logging {
try
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/22213
@witgo please take a look since you worked on #2379
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/22213
[SPARK-25221][DEPLOY] Consistent trailing whitespace treatment of conf
values
## What changes were proposed in this pull request?
Stop trimming values of properties loaded from
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/21047
thank you for review and commit @jerryshao !
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/21047
[SPARK-23956][YARN] Use effective RPC port in AM registration
## What changes were proposed in this pull request?
We propose not to hard-code the RPC port in the AM registration
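The general pattern behind this (a sketch of the idea, not Spark's actual AM code): bind to port 0 and report the effective port the OS assigned, instead of registering a hard-coded value:

```scala
import java.net.ServerSocket

// Port 0 asks the OS for an ephemeral port; the effective port must
// be read back from the bound socket rather than taken from config.
val server = new ServerSocket(0)
val effectivePort = server.getLocalPort
server.close()
```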
Github user gerashegalov closed the pull request at:
https://github.com/apache/spark/pull/20327
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20327
Closing this PR since the bind bug is fixed; the rest is achievable via
configuration.
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176825696
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -136,6 +135,39 @@ class YarnClusterSuite
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176822894
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -136,6 +135,39 @@ class YarnClusterSuite
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176816848
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -136,6 +135,39 @@ class YarnClusterSuite
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20883
This PR's title should reference Spark UI in general as opposed to just SHS
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176598095
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -136,6 +135,39 @@ class YarnClusterSuite
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176597452
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -136,6 +135,39 @@ class YarnClusterSuite
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176584238
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -136,6 +135,39 @@ class YarnClusterSuite
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20327
It turns out `0.x.x.x` addresses are bindable on Linux, unlike on my Mac. So
I'm changing it back to the original `1.1.1.1`, which cannot be bound to on
either macOS or Linux.
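A quick way to check such addresses (an illustrative helper, not part of the PR): attempt to bind a `ServerSocket`; a non-local address like `1.1.1.1` fails with a bind error, which is what makes it a reliable "unbindable" test address:

```scala
import java.net.{InetAddress, ServerSocket}
import scala.util.{Failure, Success, Try}

// Returns true if this host can bind a listening socket to `host`.
def bindable(host: String): Boolean =
  Try(new ServerSocket(0, 1, InetAddress.getByName(host))) match {
    case Success(sock) => sock.close(); true
    case Failure(_)    => false
  }

val loopback = bindable("127.0.0.1") // bindable everywhere
val nonLocal = bindable("1.1.1.1")   // not a local interface address
```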
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r176183580
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175934086
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175869822
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175672010
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175623955
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175551377
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175544171
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r173611826
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r173611565
--- Diff: core/src/main/scala/org/apache/spark/ui/JettyUtils.scala ---
@@ -342,13 +342,13 @@ private[spark] object JettyUtils extends Logging
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r171749148
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
---
@@ -79,6 +80,19 @@ private[spark] class
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r171749008
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,7 @@ private[spark] abstract class WebUI(
def bind(): Unit
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r171748638
--- Diff:
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
---
@@ -123,6 +123,10 @@ class YarnClusterSuite
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r171747629
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
---
@@ -79,6 +80,19 @@ private[spark] class
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20575
Thanks for the suggestions, I will look into them. I agree that a solution in
the right direction will definitely involve changing the write call path. I did
not go down that path because I have
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20575
@vanzin what do you mean by "as part of parsing the logs"? This PR is about
avoiding the long wait for eventLogs to be read from a remote filesystem, and
be
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r169257050
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -744,7 +744,9 @@ object SparkSubmit extends CommandLineUtils
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/20575
[SPARK-23386][DEPLOY] enable direct application links in SHS before replay
## What changes were proposed in this pull request?
Enable direct job links already in the scan thread before
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/20470
[SPARK-23296][YARN] Include stacktrace in YARN-app diagnostic
## What changes were proposed in this pull request?
Include stacktrace in the diagnostics message upon abnormal
Github user gerashegalov commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r165206579
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -744,7 +744,9 @@ object SparkSubmit extends CommandLineUtils
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20327
Thanks for catching the client mode issue @jerryshao. Please check the updated
PR.
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20326
Thanks for the feedback, Marcelo. I was also going back and forth on where to
put this logic. Ideally YARN should provide a permalink for logs as well,
similar to the AM proxy. However, this was faster
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20326
@vanzin do you mind considering this issue?
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/20327
[SPARK-12963][CORE] NM host for driver end points
## What changes were proposed in this pull request?
Driver end points on YARN in the cluster mode are potentially bound to
incorrect
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/20326
[SPARK-23155][DEPLOY] log.server.url links in SHS
## What changes were proposed in this pull request?
Ensure driver/executor log availability via Spark History Server UI even
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20098
Thank you for review and commit @vanzin !
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20098
Thanks @vanzin for the review. Please take a look again.
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20055
Thank you for review and commit @srowen
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/20098
[SPARK-22914][DEPLOY] Register history.ui.port
## What changes were proposed in this pull request?
Register spark.history.ui.port as a known spark conf to be used in
substitution
Github user gerashegalov commented on the issue:
https://github.com/apache/spark/pull/20055
Agreed @srowen, this is a better fix, although no other goal seemed
affected. Thanks for the suggestion!
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/20055
[SPARK-22875][BUILD] Assembly build fails for a high user id
## What changes were proposed in this pull request?
Add tarLongFileMode=posix configuration for the assembly plugin
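The change described amounts to one plugin parameter; a sketch of the relevant `pom.xml` fragment (surrounding plugin declaration shown for context):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <!-- POSIX (pax) tar headers avoid overflowing the legacy ustar
         numeric fields when the build user's UID is large -->
    <tarLongFileMode>posix</tarLongFileMode>
  </configuration>
</plugin>
```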
Github user gerashegalov commented on the pull request:
https://github.com/apache/spark/pull/1483#issuecomment-49835395
@tgravescs Sorry for coming in late. I haven't tested on a secure cluster yet,
only on my pseudo-distributed cluster. But since viewfs implements
GitHub user gerashegalov opened a pull request:
https://github.com/apache/spark/pull/1483
[YARN] SPARK-2577: File upload to viewfs is broken due to mount point re...
Opting for option 2 as defined in SPARK-2577, i.e., retrieve and pass the
correct file system object to addResource