Hi,

today I was debugging performance issues with a 200KB bash script [1] under bash 4.3 and 4.4, and it seems that much of the time came from a function call that took 0.1 seconds (and it was done in a loop over 37000 files), even though the function basically just consisted of an "if [[ 0 != 0 ]] ; then".
I also wrote this simpler reproducer:

    function func1 {
        [[ 0 != 0 ]] || return
        echo neverdone
    }

    function func2 {
        [[ 0 != 0 ]] || return
        echo neverdone
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
        :
    }

    time for i in $(seq 30000) ; do
        func1 someparameters
    done
    time for i in $(seq 30000) ; do
        func2 someparameters
    done

It shows a significant difference in execution time, with the longer function taking 2x to 4x as long. Even though its length is comparable to the function in the original file, it is not nearly as slow as that one's 0.1s per call.

1) Could it be that the overall file size or complexity influences the time it takes for a function to be parsed and executed?

2) Is there a way to avoid the slowdown caused by the length of such functions? I tried

    readonly func2
    declare -fr func2

but it did not make a difference.

Thanks in advance for your insights,
Bernhard M.

[1] https://github.com/g23guy/supportutils/blob/1e89b672d61ac6da5d8cf4a164b529693eab0cd9/bin/supportconfig#L304
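P.S. For measuring the per-call cost more directly than the "time for i in $(seq 30000)" loop, something like the following sketch can be used. The names func_short, func_long, and bench are mine, not from the original script, and the nanosecond timestamps assume a GNU date that supports %N:

    #!/bin/bash
    # Sketch of a per-call timing harness. func_short/func_long mirror the
    # func1/func2 reproducer above; bench reports average cost per call.

    func_short() {
        [[ 0 != 0 ]] || return
        echo neverdone
    }

    func_long() {
        [[ 0 != 0 ]] || return
        echo neverdone
        : ; : ; : ; : ; : ; : ; : ; : ; : ; :
        : ; : ; : ; : ; : ; : ; : ; : ; : ; :
    }

    bench() {   # usage: bench FUNCNAME ITERATIONS
        local func=$1 n=$2 start end i
        start=$(date +%s%N)             # nanoseconds since epoch (GNU date)
        for ((i = 0; i < n; i++)); do "$func"; done
        end=$(date +%s%N)
        echo "$func: $(( (end - start) / n )) ns/call"
    }

    bench func_short 10000
    bench func_long 10000

This keeps the loop overhead identical for both functions, so any difference in the reported ns/call should come from the function bodies themselves.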