Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/16137#discussion_r90948903
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -276,13 +279,17 @@ class SparkContext(config: SparkConf) extends Logging {
private[spark] def ui: Option[SparkUI] = _ui
+ /**
+ * @return `Option` containing the URL of the web UI.
+ */
def uiWebUrl: Option[String] = _ui.map(_.webUrl)
/**
* A default Hadoop Configuration for the Hadoop code (e.g. file systems) that we reuse.
*
- * @note As it will be reused in all Hadoop RDDs, it's better not to modify it unless you
- * plan to set some global configurations for all Hadoop RDDs.
+ * @note as it will be reused in all Hadoop RDDs, it's better not to modify it unless you
+ * plan to set some global configurations for all Hadoop RDDs.
+ * @return Hadoop configuration
--- End diff ---
This isn't a useful comment; I'd revert changes like this
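For context, the `uiWebUrl` accessor added in this diff returns an `Option[String]`, so callers must handle the case where no web UI is running (e.g. when it is disabled via configuration). A minimal sketch of the calling pattern, using a plain `Option` value in place of a live `SparkContext` (the `describe` helper is hypothetical, for illustration only):

```scala
object UiWebUrlSketch {
  // Stand-in for sc.uiWebUrl: Some(url) when the web UI is up, None otherwise.
  def describe(uiWebUrl: Option[String]): String =
    uiWebUrl.map(url => s"Web UI at $url").getOrElse("Web UI disabled")

  def main(args: Array[String]): Unit = {
    println(describe(Some("http://localhost:4040")))
    println(describe(None))
  }
}
```

Because the return type is `Option`, callers get a compile-time reminder to handle the absent case rather than a possible null.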