Hi,
consider this function:
split() (
  unset -v IFS    # default splitting
  set -o noglob   # disable glob
  set -- $1       # split+(no)glob
  [ "$#" -eq 0 ] || printf '<%s>\n' "$@"
)
Note the subshell above for the local scope for $IFS and for
the noglob option. That's a common idiom in POSIX shells when
you want to split something: subshell, set IFS, disable glob,
use the split+glob operator.
split 'foo * bar'
outputs
<foo>
<*>
<bar>
as expected. So far so good.
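The same idiom also works with an explicit separator instead of the
default; a hypothetical variant (not part of the report) that splits
on colons only:

```shell
# hypothetical variant of the idiom: split on ":" only
split_colon() (
  IFS=:           # split on colons instead of the default
  set -o noglob   # disable glob
  set -- $1       # split+(no)glob
  [ "$#" -eq 0 ] || printf '<%s>\n' "$@"
)

split_colon 'a:b c:*'
# <a>
# <b c>
# <*>
```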
Now, if that "split" function is called from within a function
that declares $IFS local, like:
bar() {
  local IFS=.
  split $1
}
Then, the "unset", instead of unsetting IFS, actually pops a
layer off the stack.
For instance
foo() {
  local IFS=:
  bar $1
}
foo 'a b.c:d'
outputs
<a b>
instead of
<a>
<b>
because after the "unset IFS", $IFS is not unset (which would
result in the default splitting behaviour) but set to ":" as it
was before "bar" ran "local IFS=.".
A simpler reproducer:
$ bash -c 'f()(unset a; echo "$a"); g(){ local a=1; f;}; a=0; g'
0
Or even with POSIX syntax:
$ bash -c 'f()(unset a; echo "$a"); a=0; a=1 eval f'
0
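With more than one nested "local", the pop is still one layer at a
time; a hypothetical three-scope extension of the reproducer above:

```shell
# hypothetical three-scope version of the reproducer: a single
# "unset" pops exactly one "local" layer
bash -c '
  f() ( unset -v a; echo "$a" )
  g() { local a=2; f; }
  h() { local a=1; g; }
  a=0
  h
'
# prints 1: only g's local is popped, revealing h's value,
# not the global one
```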
A workaround is to change the "split" function to:
split() (
  local IFS
  unset -v IFS    # default splitting
  set -o noglob   # disable glob
  set -- $1       # split+(no)glob
  [ "$#" -eq 0 ] || printf '<%s>\n' "$@"
)
For some reason, in that case (when "local" and "unset" are
called in the same function context), unset does unset the
variable.
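Applied to the earlier foo/bar example, the fixed "split" restores the
expected output:

```shell
split() (
  local IFS       # declare IFS local to this scope first
  unset -v IFS    # now really unset: default splitting
  set -o noglob
  set -- $1
  [ "$#" -eq 0 ] || printf '<%s>\n' "$@"
)
bar() { local IFS=.; split $1; }
foo() { local IFS=:; bar $1; }

foo 'a b.c:d'
# <a>
# <b>
```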
Credits to Dan Douglas
(https://www.mail-archive.com/[email protected]/msg00707.html)
for finding the bug. He did find a use for it though (get the
value of a variable from the caller's scope).
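One way to picture that use (hypothetical names, based on the
temporary-environment reproducer above): a function invoked with its
own temporary value of a variable can still reach the caller's value
by unsetting its own copy first:

```shell
# hypothetical sketch: peek at the caller's value of "msg" from a
# function that was given a temporary "msg" of its own
peek() {
  echo "own:    $msg"
  unset -v msg           # pops the temporary-environment layer
  echo "caller: $msg"
}

msg='from the caller'
msg='temporary' eval peek
# own:    temporary
# caller: from the caller
```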
--
Stephane