[
https://issues.apache.org/jira/browse/SPARK-24174?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Nikolay Sokolov updated SPARK-24174:
------------------------------------
Description:
Currently, the UI and the /environment API call of the HistoryServer or WebUI expose only
system properties and SparkConf. However, in some cases when Spark is used in
conjunction with Hadoop, it is useful to know the Hadoop configuration properties as well:
for example, HDFS or GS buffer sizes, Hive metastore settings, and so on.
So it would be good to have Hadoop properties exposed in the /environment
API as well, for example:
{code:none}
GET .../application_1525395994996_5/environment
{
"runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...},
"sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
"systemProperties": [["java.io.tmpdir", "/tmp"], ...],
"classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"], ...],
"hadoopProperties": [["dfs.stream-buffer-size", "4096"], ...]
}
{code}
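If the field were added, a client could read it from the JSON payload like the other property lists. A minimal Python sketch, assuming the proposed response shape above (the {{hadoopProperties}} field and the {{hadoop_property}} helper are illustrative, not part of any released Spark REST API):

```python
import json

# A response body in the proposed format. The "hadoopProperties" field is
# the proposed addition and does not exist in current Spark releases.
sample = """
{
  "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)"},
  "sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"]],
  "systemProperties": [["java.io.tmpdir", "/tmp"]],
  "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"]],
  "hadoopProperties": [["dfs.stream-buffer-size", "4096"]]
}
"""

def hadoop_property(env, name):
    """Return the value of a Hadoop property from an /environment payload,
    or None if absent (e.g. a Spark version without the proposed field)."""
    for key, value in env.get("hadoopProperties", []):
        if key == name:
            return value
    return None

env = json.loads(sample)
print(hadoop_property(env, "dfs.stream-buffer-size"))  # prints 4096
```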
was:
Currently, the /environment API call exposes only system properties and SparkConf.
However, in some cases when Spark is used in conjunction with Hadoop, it is
useful to know the Hadoop configuration properties as well: for example, HDFS or GS buffer
sizes, Hive metastore settings, and so on.
So it would be good to have Hadoop properties exposed in the /environment
API as well, for example:
{code:none}
GET .../application_1525395994996_5/environment
{
"runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...},
"sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
"systemProperties": [["java.io.tmpdir", "/tmp"], ...],
"classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"], ...],
"hadoopProperties": [["dfs.stream-buffer-size", "4096"], ...]
}
{code}
> Expose Hadoop config as part of /environment API
> ------------------------------------------------
>
> Key: SPARK-24174
> URL: https://issues.apache.org/jira/browse/SPARK-24174
> Project: Spark
> Issue Type: Wish
> Components: Spark Core
> Affects Versions: 2.1.0
> Reporter: Nikolay Sokolov
> Priority: Minor
> Labels: features, usability
>
> Currently, the UI and the /environment API call of the HistoryServer or WebUI expose only
> system properties and SparkConf. However, in some cases when Spark is used in
> conjunction with Hadoop, it is useful to know the Hadoop configuration
> properties as well: for example, HDFS or GS buffer sizes, Hive metastore settings,
> and so on.
> So it would be good to have Hadoop properties exposed in the /environment
> API as well, for example:
> {code:none}
> GET .../application_1525395994996_5/environment
> {
> "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...},
> "sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
> "systemProperties": [["java.io.tmpdir", "/tmp"], ...],
> "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"], ...],
> "hadoopProperties": [["dfs.stream-buffer-size", "4096"], ...]
> }
> {code}
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)