Nikolay Sokolov created SPARK-24174:
---------------------------------------

             Summary: Expose Hadoop config as part of /environment API
                 Key: SPARK-24174
                 URL: https://issues.apache.org/jira/browse/SPARK-24174
             Project: Spark
          Issue Type: Wish
          Components: Spark Core
    Affects Versions: 2.3.0
            Reporter: Nikolay Sokolov


Currently, the /environment API call exposes only system properties and SparkConf. 
However, when Spark is used in conjunction with Hadoop, it is often 
useful to know the Hadoop configuration properties as well: for example, HDFS or GS 
buffer sizes, Hive metastore settings, and so on.

So it would be good to have Hadoop properties exposed in the /environment 
API as well, for example:
{code:none}
GET .../application_1525395994996_5/environment
{
   "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...},
   "sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
   "systemProperties": [["java.io.tmpdir", "/tmp"], ...],
   "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"], ...],
   "hadoopProperties": [["dfs.stream-buffer-size", "4096"], ...]
}
{code}
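If the field were added in the same name-value pair-array form as the existing sparkProperties and systemProperties fields, a client could consume it with ordinary JSON parsing. A minimal sketch (the response body below is illustrative only, not output from a real endpoint; the hive.metastore.uris entry is a hypothetical example value):

```python
import json

# Illustrative fragment of the proposed /environment response,
# with "hadoopProperties" serialized as [name, value] pairs.
response_body = """
{
  "hadoopProperties": [["dfs.stream-buffer-size", "4096"],
                       ["hive.metastore.uris", "thrift://metastore:9083"]]
}
"""

env = json.loads(response_body)

# Convert the pair-array form into a dict for convenient lookup.
hadoop_conf = dict(env.get("hadoopProperties", []))
print(hadoop_conf["dfs.stream-buffer-size"])
```

Keeping the pair-array encoding (rather than a JSON object) would match how the existing /environment fields are serialized, so clients that already handle sparkProperties could reuse the same parsing code.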



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
