[
https://issues.apache.org/jira/browse/HADOOP-14399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16002631#comment-16002631
]
Steve Loughran commented on HADOOP-14399:
-----------------------------------------
I should add that there is a workaround: don't use the file:// prefix, but
instead just go with "//". This works for trunk, but not against Hadoop 2.8.
This makes me think that the absolute import process should not try to be clever
at all: the file:// prefix must be considered mandatory, and the new XML parse
code should not try to be helpful by adding a file:// prefix, at least not for any
absolute path.
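For clarity, the two forms in question look roughly like this; this is only a
sketch using the redacted path from the trace below, not the actual test config.
The failure that follows is from the "//" form.
{code}
<configuration xmlns:xi="http://www.w3.org/2001/XInclude">
  <!-- absolute file URI, as the test configs originally used;
       this is the form that the new XInclude handling fails on -->
  <xi:include href="file://home/stevel/(secret)/auth-keys.xml"/>

  <!-- workaround: drop the file: scheme entirely;
       works on trunk, but not against Hadoop 2.8 -->
  <xi:include href="//home/stevel/(secret)/auth-keys.xml"/>
</configuration>
{code}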
{code}
- Check Hadoop version *** FAILED ***
  java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/home/stevel/Projects/sparkwork/spark-cloud-examples/cloud-examples/../../cloud-test-configs/s3a.xml; lineNumber: 23; columnNumber: 47; An include with href '//home/stevel/(secret)/auth-keys.xml' failed, and no fallback element was found.
  at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2696)
  at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2553)
  at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2426)
  at org.apache.hadoop.conf.Configuration.get(Configuration.java:1240)
  at com.hortonworks.spark.cloud.CloudSuite$.loadConfiguration(CloudSuite.scala:353)
  at com.hortonworks.spark.cloud.common.HadoopVersionSuite$$anonfun$1.apply$mcV$sp(HadoopVersionSuite.scala:32)
  at com.hortonworks.spark.cloud.common.HadoopVersionSuite$$anonfun$1.apply(HadoopVersionSuite.scala:31)
  at com.hortonworks.spark.cloud.common.HadoopVersionSuite$$anonfun$1.apply(HadoopVersionSuite.scala:31)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  ...
  Cause: org.xml.sax.SAXParseException: An include with href '//home/stevel/(secret)/auth-keys.xml' failed, and no fallback element was found.
  at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:257)
  at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:339)
  at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
  at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2531)
  at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2519)
  at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2587)
  at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2553)
  at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2426)
  at org.apache.hadoop.conf.Configuration.get(Configuration.java:1240)
  at com.hortonworks.spark.cloud.CloudSuite$.loadConfiguration(CloudSuite.scala:353)
  ...
{code}
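As the trace shows, the resources are only parsed lazily on the first property
lookup (get() -> getProps() -> loadResources() -> loadResource() -> parse()), so
the exception surfaces inside Configuration.get(). A minimal sketch of how the
test reaches that point; the path and property name here are illustrative, not
the actual CloudSuite code:
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class XIncludeRepro {
  public static void main(String[] args) {
    // skip the default resources; load only the test config containing the xi:include
    Configuration conf = new Configuration(false);
    conf.addResource(new Path("/home/stevel/cloud-test-configs/s3a.xml"));

    // the first get() triggers loading and XML parsing; the SAXParseException
    // above is thrown here, wrapped in a RuntimeException
    System.out.println(conf.get("fs.s3a.access.key"));
  }
}
{code}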
> Configuration does not correctly XInclude absolute file URIs
> ------------------------------------------------------------
>
> Key: HADOOP-14399
> URL: https://issues.apache.org/jira/browse/HADOOP-14399
> Project: Hadoop Common
> Issue Type: Bug
> Components: conf
> Affects Versions: 2.9.0, 3.0.0-alpha3
> Reporter: Andrew Wang
> Priority: Blocker
>
> [Reported
> by|https://issues.apache.org/jira/browse/HADOOP-14216?focusedCommentId=15967816&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15967816]
> [[email protected]] on HADOOP-14216, filing this JIRA on his behalf:
> {quote}
> Just tracked this down as the likely cause of my S3A test failures. This is
> pulling in core-site.xml, which then xincludes auth-keys.xml, which finally
> references an absolute path, file://home/stevel/(secret)/aws-keys.xml. This
> is failing for me even with the latest patch in. Either transient XIncludes
> aren't being picked up or
> Note also I think the error could be improved. 1. It's in the included file
> where the problem appears to lie and 2. we should really know the missing
> entry. Perhaps a wiki link too: I had to read the XInclude spec to work out
> what was going on here before I could go back to finding the cause
> {quote}