[
https://issues.apache.org/jira/browse/HADOOP-8532?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13403425#comment-13403425
]
Aaron T. Myers commented on HADOOP-8532:
In my experience, the only time people hit this limit is when they use multiple
variable substitutions in a single config value that is a comma-separated list.
For example, if you configure 4 entries in dfs.namenode.name.dir, each of which
uses a variable substitution, that counts as 4 substitutions against the
MAX_SUBST limit.
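As a concrete illustration of that counting (a hypothetical hdfs-site.xml fragment; the data.root property name and the directory paths are made up for this example):

```xml
<!-- Each of the four ${data.root} references below is charged against
     MAX_SUBST, even though the substitution nesting depth is only one. -->
<property>
  <name>data.root</name>
  <value>/srv/hadoop</value>
</property>
<property>
  <name>dfs.namenode.name.dir</name>
  <value>${data.root}/nn1,${data.root}/nn2,${data.root}/nn3,${data.root}/nn4</value>
</property>
```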
So, I think a first step would be to fix the above bug, i.e. MAX_SUBST was
originally intended to be a maximum substitution _depth_, but it presently isn't
interpreted that way.
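The depth interpretation argued for here can be sketched as follows (a minimal standalone sketch, not the actual Configuration.substituteVars code; the class name, regex, and limit handling are illustrative). Each pass expands every ${var} reference currently present in the string at once, so four sibling references cost one level of depth, and only genuine nesting increments the counter:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SubstDepthSketch {
    static final int MAX_SUBST = 20; // interpreted as max nesting depth
    static final Pattern VAR = Pattern.compile("\\$\\{([^}$\\s]+)\\}");

    // Resolve one value, counting nesting depth rather than total substitutions.
    static String substitute(Map<String, String> props, String value) {
        String result = value;
        for (int depth = 0; depth < MAX_SUBST; depth++) {
            Matcher m = VAR.matcher(result);
            if (!m.find()) {
                return result; // nothing left to expand
            }
            // Expand ALL variables visible in this pass; that is one depth level.
            StringBuffer sb = new StringBuffer();
            do {
                String repl = props.get(m.group(1));
                m.appendReplacement(sb, Matcher.quoteReplacement(
                        repl != null ? repl : m.group(0)));
            } while (m.find());
            m.appendTail(sb);
            String next = sb.toString();
            if (next.equals(result)) {
                return result; // only unresolvable vars remain; no progress
            }
            result = next;
        }
        throw new IllegalStateException(
                "Variable substitution depth exceeds " + MAX_SUBST);
    }

    public static void main(String[] args) {
        Map<String, String> p = new HashMap<>();
        p.put("a", "A");
        p.put("b", "B");
        // Four sibling references expand in one pass: depth 1, not 4.
        System.out.println(substitute(p, "${a},${b},${a},${b}"));
    }
}
```

Under this counting, a value like dfs.namenode.name.dir with many sibling references never approaches the limit; only pathologically deep chains of nested definitions would.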
[Configuration] Increase or make variable substitution depth configurable
-
Key: HADOOP-8532
URL: https://issues.apache.org/jira/browse/HADOOP-8532
Project: Hadoop Common
Issue Type: Improvement
Components: conf
Affects Versions: 2.0.0-alpha
Reporter: Harsh J
We've had some users recently complain that the default MAX_SUBST hardcoded
value of 20 isn't sufficient for their substitution needs, and they wished it
were configurable rather than having to resort to workarounds such as defining
temporary smaller substitutes and then composing the full value from them. We
should consider raising the hardcoded default, or provide a way to make it
configurable instead.
Related: HIVE-2021 changed something similar for their HiveConf classes.
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators:
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira