On 11/19/18 3:04 PM, Alexander Reintzsch wrote:
> Hello,
> 
> I think I have found some unexpected behavior related to constants in
> bash scripts. I have attached the short proof of concept bash script.
> 
> Usually bash scripts continue when they encounter an error with one command,
> but this script shows some weird behavior. It exits all the functions
> it has called without executing the remaining commands and then continues
> to run in the top scope of the script.
> 
> This only happens when a constant declared with
> declare -r myConst="myConstantValue"
> is attempted to be redefined using
> myConst="new value"
> but not with
> declare myConst="new value"
> 
> This behavior doesn't seem right.

OK, let's unpack this.

The variable assignment without declare has the posix semantics, in that
it constitutes a variable assignment error. When the shell is in posix
mode, it exits. When not in posix mode, the shell returns to the top level
and continues execution. It could instead return to the previous execution
context -- a virtual `return' -- or exit as it does in posix mode, but it
has never done so.
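The unwinding described above can be reproduced with a short script along these lines (the function and variable names here are illustrative, not taken from the original attachment):

```shell
#!/usr/bin/env bash
# Assigning to a readonly variable without `declare` is a variable
# assignment error: in non-posix mode, bash abandons the current
# function call stack and resumes at the top level of the script.

declare -r myConst="myConstantValue"

inner() {
    myConst="new value"          # error: myConst: readonly variable
    echo "inner: never printed"  # skipped -- the shell unwinds past it
}

outer() {
    inner
    echo "outer: never printed"  # also skipped during the unwind
}

outer
echo "top level: execution continues here"
```

Run the same script with `bash --posix` and the shell exits at the failed assignment instead of continuing at the top level.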

When the assignment is used as an argument to `declare', it causes the
declare command to fail, but it's not a variable assignment error, so
the script simply continues as with any other failed command.
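The contrast with the `declare` form can be seen the same way (again with illustrative names): the builtin merely returns a nonzero status, and no unwinding occurs:

```shell
#!/usr/bin/env bash
# When the assignment is an argument to `declare`, the builtin fails
# (exit status 1), but that is an ordinary command failure, so the
# function body -- and the rest of the script -- simply continue.

declare -r myConst="myConstantValue"

redefine() {
    declare myConst="new value"   # declare prints an error and returns 1
    echo "still inside the function"
}

redefine
echo "top level: reached with no unwinding"
```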

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
                 ``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
