Maciej Bryński created SPARK-21470:
--------------------------------------
Summary: Spark History server doesn't support HDFS HA
Key: SPARK-21470
URL: https://issues.apache.org/jira/browse/SPARK-21470
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.2.0
Reporter: Maciej Bryński
With Spark versions up to 2.1.1 it was possible to configure the history server
to read from HDFS without specifying the namenode:
spark.history.fs.logDirectory hdfs:///apps/spark
This works with HDFS HA.
Unfortunately, there is a regression in Spark 2.2.0, where the same configuration
produces an error:
{code}
Caused by: java.io.IOException: Incomplete HDFS URI, no host: hdfs:///apps/spark
	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:142)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:108)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:78)
	... 6 more
{code}
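A possible workaround until the regression is fixed is to spell out the HA nameservice explicitly in the URI, so the filesystem is resolved through the HA client rather than requiring a host. Here mycluster is a placeholder for whatever value dfs.nameservices has in hdfs-site.xml on the cluster:
{code}
# spark-defaults.conf on the history server host
# "mycluster" must match dfs.nameservices from hdfs-site.xml
spark.history.fs.logDirectory hdfs://mycluster/apps/spark
{code}
This assumes the history server's classpath includes an hdfs-site.xml that defines the nameservice and its namenodes, so the logical authority can be resolved.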
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]