GitHub user apetresc opened a pull request:
https://github.com/apache/spark/pull/15000
[SPARK-17437] Add uiWebUrl to JavaSparkContext and pyspark.SparkContext
## What changes were proposed in this pull request?
The Scala version of `SparkContext` has a handy field called `uiWebUrl`
that tells you which URL the SparkUI spawned by that instance lives at. This is
often very useful because the value for `spark.ui.port` in the config is only a
suggestion; if that port number is taken by another Spark instance on the same
machine, Spark will just keep incrementing the port until it finds a free one.
So, on a machine running many PySpark instances, you often have to
try each UI one-by-one until you find the one showing your application name.
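The retry behavior described above can be sketched in plain Python. This is an illustrative stand-in for what Spark does internally, not Spark's actual code: attempt to bind the suggested port, and on failure keep incrementing until a free one is found.

```python
import socket

def find_free_port(suggested, max_tries=100):
    """Try to bind the suggested port; on failure, keep incrementing
    until a free port turns up (mirroring Spark's retry behavior)."""
    for offset in range(max_tries):
        port = suggested + offset
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
            s.close()
            # This port was free, so a server started here would use it.
            return port
        except OSError:
            # Port already taken (e.g. by another Spark UI); try the next one.
            s.close()
    raise RuntimeError("no free port found after %d tries" % max_tries)
```

Because the port actually used can differ from the configured one, reading the UI URL back from the running context is the only reliable way to locate it.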
Scala users have a way around this with `uiWebUrl` but Java and Python
users do not. This ticket (and the attached PR) fixes this in the most
straightforward way possible, simply propagating this field through the
`JavaSparkContext` and into pyspark through the Java gateway.
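The propagation amounts to a one-line delegation at each layer. Here is a minimal, self-contained sketch of the idea; the `FakeJavaSparkContext` stub and its method names stand in for the real Py4J-wrapped `JavaSparkContext`, and the actual pyspark implementation may differ in detail:

```python
class FakeJavaSparkContext:
    """Stand-in for the Py4J proxy to JavaSparkContext. The real proxy
    forwards to the underlying Scala SparkContext's uiWebUrl field."""
    def sc(self):
        return self

    def uiWebUrl(self):
        # Hypothetical fixed value for illustration only.
        return "http://driver-host:4040"


class SparkContext:
    """Sketch of the pyspark side: expose the field as a read-only property
    that delegates through the Java gateway."""
    def __init__(self, jsc):
        self._jsc = jsc

    @property
    def uiWebUrl(self):
        # Delegate to the JVM-side context rather than recomputing anything
        # in Python, so the value always reflects the port actually bound.
        return self._jsc.sc().uiWebUrl()


sc = SparkContext(FakeJavaSparkContext())
print(sc.uiWebUrl)  # → http://driver-host:4040
```

Delegating rather than caching keeps the Python value consistent with whatever port the JVM ultimately bound.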
Please let me know if any additional documentation/testing is needed.
## How was this patch tested?
Existing tests were run to make sure there were no regressions, and a
binary distribution was created and tested manually for the correct value of
`sc.uiWebUrl` in a variety of circumstances.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rubikloud/spark-public pyspark-uiweburl
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/15000.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #15000