Re: ${!var} In Scripts
> On Mar 4, 2019, at 10:00 AM, Daniel Templeton wrote:
>
> Do you want to file a JIRA for it, or shall I?

Given I haven’t done any Hadoop work in months and months …

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
Re: ${!var} In Scripts
Do you want to file a JIRA for it, or shall I?

Daniel

On 3/4/19 9:55 AM, Allen Wittenauer wrote:
> […]
Re: ${!var} In Scripts
> On Mar 4, 2019, at 9:33 AM, Daniel Templeton wrote:
>
> Thanks!  That's not even close to what the docs suggest it does--no idea
> what's up with that.

It does. Here’s the paragraph:

"If the first character of parameter is an exclamation point (!), a level of variable indirection is introduced. Bash uses the value of the variable formed from the rest of parameter as the name of the variable; this variable is then expanded and that value is used in the rest of the substitution, rather than the value of parameter itself. This is known as indirect expansion. The exceptions to this are the expansions of ${!prefix*} and ${!name[@]} described below. The exclamation point must immediately follow the left brace in order to introduce indirection."

There’s a whole section on bash indirect references in the ABS as well. (Although I think most of the examples there still use \$$foo syntax with a note that it was replaced with ${!foo} syntax. lol.)

For those playing at home, the Hadoop shell code uses them almost entirely in utility functions, to reduce the amount of code needed to process the ridiculous number of duplicated env vars (e.g., HADOOP_HOME vs. HDFS_HOME vs. YARN_HOME vs. …).

> This issue only shows up if the user uses the hadoop command to run an
> arbitrary class not in the default package, e.g. "hadoop
> org.apache.hadoop.conf.Configuration".  We've been quietly allowing that
> misuse forever.  Unfortunately, treating CLI output as an API means we
> can't change that behavior in a minor.  We could, however, deprecate it
> and add a warning when it's used.  I think that would cover us
> sufficiently if someone trips on the Ubuntu 18 regression.
>
> Thoughts?

Oh, I think I see the bug. HADOOP_SUBCMD (and equivalents in yarn, hdfs, etc.) just needs some special handling when a custom method is being called. For example, there’s no point in checking to see if it should run with privileges, so just skip over that. Probably a few other places too.

Relatively easy fix. 2 lines of code, maybe.
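To make the "utility functions" pattern concrete, here is a minimal sketch of how indirect expansion lets one function serve any project prefix. The helper name `resolve_opt` and the variable values are hypothetical, chosen for illustration; this is not the actual hadoop-functions.sh code.

```shell
#!/usr/bin/env bash
# Hypothetical helper: build a variable name from a prefix, then use
# indirect expansion to read whichever variable that name points at.
resolve_opt() {
  local varname="${1}_OPTS"   # e.g. HADOOP_OPTS, YARN_OPTS
  # ${!varname} expands the variable whose *name* is stored in varname
  echo "${!varname}"
}

HADOOP_OPTS="-Xmx1g"
YARN_OPTS="-Xmx2g"

resolve_opt HADOOP   # prints -Xmx1g
resolve_opt YARN     # prints -Xmx2g
```

Without indirection, each of the duplicated env-var families (HADOOP_*, HDFS_*, YARN_*, …) would need its own copy of this logic.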
Re: ${!var} In Scripts
Thanks!  That's not even close to what the docs suggest it does--no idea what's up with that.

With your example, I was able to figure out exactly what the issue is. On Ubuntu 18/bash 4.4, a dot is rejected in the name of the variable to substitute, which is sane in principle as dots aren't allowed in variable names, but it's a regression from Ubuntu 16/bash 4.3. For example:

% docker run -ti ubuntu:16.04 /bin/bash
root@9a36ac04f2ff:/# k=l.m
root@9a36ac04f2ff:/# echo ${!k}

root@9a36ac04f2ff:/# exit
% docker run -ti ubuntu:18.04 /bin/bash
root@36ce0eb1d846:/# k=l.m
root@36ce0eb1d846:/# echo ${!k}
bash: l.m: bad substitution
root@36ce0eb1d846:/# exit

This issue only shows up if the user uses the hadoop command to run an arbitrary class not in the default package, e.g. "hadoop org.apache.hadoop.conf.Configuration". We've been quietly allowing that misuse forever. Unfortunately, treating CLI output as an API means we can't change that behavior in a minor. We could, however, deprecate it and add a warning when it's used. I think that would cover us sufficiently if someone trips on the Ubuntu 18 regression.

Thoughts?

Daniel

On 3/1/19 3:52 PM, Allen Wittenauer wrote:
> […]
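One way to defend against the bash 4.4 behavior is to validate the stored string before performing the indirection, so that a class name like "org.apache.hadoop.conf.Configuration" yields an empty result instead of a fatal "bad substitution" error. A sketch, with a hypothetical helper name:

```shell
#!/usr/bin/env bash
# Hypothetical defensive lookup: only do ${!name} when the stored string
# is a legal shell identifier; otherwise return empty rather than letting
# bash 4.4 abort with "bad substitution".
safe_indirect() {
  local name="$1"
  if [[ ${name} =~ ^[A-Za-z_][A-Za-z0-9_]*$ ]]; then
    echo "${!name}"
  else
    echo ""   # names with dots, e.g. org.apache.hadoop.conf.Configuration, land here
  fi
}

FOO=bar
safe_indirect FOO                                    # prints bar
safe_indirect org.apache.hadoop.conf.Configuration   # prints an empty line
```

This matches bash 4.3's lenient behavior (empty expansion for an invalid name) while staying safe on 4.4.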
Re: ${!var} In Scripts
> On Mar 1, 2019, at 3:04 PM, Daniel Templeton wrote:
>
> There are a bunch of uses of the bash syntax, "${!var}", in the Hadoop
> scripts.  Can anyone explain to me what that syntax was supposed to achieve?

#!/usr/bin/env bash
j="hi"
m="bye"
k=j
echo ${!k}
k=m
echo ${!k}
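For comparison, bash 4.3 and later also offer namerefs (declare -n), which express the same indirection without the ${!var} spelling; a minimal sketch:

```shell
#!/usr/bin/env bash
# The same lookup two ways: classic indirect expansion vs. a bash 4.3+ nameref.
j="hi"
k=j              # k holds the *name* of another variable
echo "${!k}"     # indirect expansion: prints hi

declare -n ref=j # ref is now an alias for the variable j
echo "$ref"      # reading through the nameref: prints hi
ref="hello"      # assignments also pass through to j
echo "$j"        # prints hello
```

Unlike indirect expansion, a nameref can be written through as well as read, which is why the Hadoop scripts' eval-based assignment tricks could in principle be replaced by it too.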
${!var} In Scripts
There are a bunch of uses of the bash syntax, "${!var}", in the Hadoop scripts. Can anyone explain to me what that syntax was supposed to achieve? According to the user guide:

${!name[@]}
${!name[*]}

If name is an array variable, expands to the list of array indices (keys) assigned in name. If name is not an array, expands to 0 if name is set and null otherwise. When ‘@’ is used and the expansion appears within double quotes, each key expands to a separate word.

That makes sense, but the usage is odd. In many cases it's used in a conditional, e.g. [[ -z ${!1} ]], which just seems like an archaic alternative to the more readable "$1" != "". In other cases, I'm not convinced it's being used as intended, e.g.:

hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh:    declare array=("${!arrref}")
hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh:    eval "$1"='$(cygpath -p -w "${!1}" 2>/dev/null)'
hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh:    if [[ ${!uvar} != "${USER}" ]]; then

I'm asking because bash 4.4 on Ubuntu 18 appears to reject that syntax with a bad substitution warning message. Bash 4.3 on Ubuntu 16 is OK with it. I filed a support request in the bash project for the regression, but I'm wondering if we shouldn't just replace it in the Hadoop scripts. I'd love any input, especially from Allen.

Thanks!
Daniel
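For reference, [[ -z ${!1} ]] and [[ -z "$1" ]] are not interchangeable: the former tests the variable *named by* the first argument, not the argument itself. A small sketch (the function name is made up for illustration):

```shell
#!/usr/bin/env bash
# Hypothetical example: ${!1} inside a function expands the variable whose
# name was passed as the first argument, not the argument string itself.
is_var_empty() {
  [[ -z ${!1} ]] && echo "empty" || echo "set"
}

FILLED="value"
BLANK=""
is_var_empty FILLED   # $1 is "FILLED" (non-empty), but ${!1} is "value": prints set
is_var_empty BLANK    # $1 is "BLANK" (non-empty), but ${!1} is "": prints empty
```

In both calls, "$1" != "" would be true, so the two tests answer different questions.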