Here are my comments about coding error checking, the -p option,
and coding style in response to all of the e-mails.
There is no reason to use shcomp for checking for errors, since
you can run ksh -n, which will give the same warnings. ksh -n
does a lot more checking and warns about obsolete or suspicious
behavior. It tries to check for quoting bugs, but it can be misleading:
it is suspicious of quotes that span lines, and if another quote
of the same type is on the line of the trailing quote it will
give a warning message. However, I recommend that you run ksh -n
on all scripts.
The -p option was added to provide protection for setuid/setgid shell
scripts. By default, unless -p is specified, the shell sets
the effective user and group ids to the real user and group ids. There is
a compile-time option which prevents this from happening for real uids
less than some value.
If the -p option is on, the shell checks to see whether the
real and effective uid/gid at startup are different. If they
differ, then the shell will not execute user profiles,
to prevent users from interfering with setuid shells.
Within the script, the user can toggle -p to go in and out
of privileged mode.
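For example, roughly (a sketch only, assuming the script was started
with -p in effect, say via ksh -p):

    print -r -- "effective user: $(id -un)"   # privileged work goes here
    set +p          # leave privileged mode; the effective uid/gid are
                    # reset to the real uid/gid
    print -r -- "real user: $(id -un)"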
As far as coding style is concerned, there are two commonly used
formats for if, for, and while: the one that was documented, and
        if      condition
        then    action
        fi
with all reserved words lined up.
Using [...] for conditions should be considered obsolete. You should always
use [[...]] for string tests and ((...)) for arithmetic tests.
Unlike [...], word splitting and file name expansion are not done
on the operands, so most of the problems associated with [...] are
not there, nor is the need for quoting (except on the right hand side
of == and !=, which is treated as a pattern unless quoted). In addition,
the shell checks the operators at compile time so that many errors
will be caught. Don't use the arithmetic comparison operators such as
-eq and -lt; use ((...)) instead.
For infinite loops, :, true, and ((1)) are all about equally fast.
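For example (file and i below are only illustrative variables):

    # string test: the right-hand side of == is a pattern unless quoted
    if [[ $file == *.sh ]]
    then    print -r -- "$file is a shell script"
    fi

    # arithmetic test: use ((...)), not [[ $i -ge 10 ]]
    if (( i >= 10 ))
    then    (( i = 0 ))
    fi

    # infinite loop: :, true, and ((1)) are all about equally fast
    while ((1))
    do      break
    done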
For variable naming, another often used convention is that
global variables within a script that are shared with functions
in the script start with a capital letter. Local variables
are all lower case.
As far as quoting is concerned, one rule is don't use ``, use $(...)
instead. The behavior of `` is poorly defined and is different when
embedded in "" than when it is not.
Use $'...' to quote characters that are non-printable, using the
C character constants. Thus, write $'\t' for a tab rather than putting
a literal tab or newline character inside ordinary quotes.
Use [[ ! $foo ]] to test for an empty string.
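For example (a rough sketch; the names are made up):

    here=$(pwd)           # command substitution with $(...), not `pwd`
    tab=$'\t'             # a real tab character via a C character constant
    if [[ ! $here ]]      # true if $here is the empty string
    then    print -r -- "pwd produced nothing"
    fi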
Avoid using full pathnames in your script. The script will not
be portable and will likely be slower. Much better to explicitly
set PATH in your script.
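For instance (the directories below are only an illustration; use the
ones your script actually needs):

    PATH=/usr/bin:/bin    # set once near the top of the script
    export PATH
    date                  # rather than /usr/bin/date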
The comment about ksh -p is misleading. The .profile and $ENV files are not
executed for shell scripts, so this doesn't apply. Unfortunately,
the #! interpreter magic makes a script non-portable; it should
only be inserted as part of an installation procedure.
Note that everything that I said (except possibly -p) also applies to bash
and zsh.
There are many other possible coding conventions and suggestions.
Here are a few.
1. Put the command name and arguments before redirections.
You can legally do "> file date" instead of "date > file"
but don't do it.
2. Unless you want to do word splitting, put IFS=
at the beginning of the script. This way spaces in
file names won't be a problem. You can do
IFS='delims' read -r line
to override IFS just for the read command. However,
you can't do this for set.
3. If you don't expect to expand files, you can do set -f
(set -o noglob) as well. This way the need to use "" is
greatly reduced.
4. Use print or printf rather than echo. The behavior of echo
is not portable, and in any case you need to worry about \ in words
and about the first word beginning with a -. In fact in almost all cases
use print -r -- rather than print alone.
5. Use the -r option of read to read a line. You never know
when a line will end in \ and without -r multiple
lines can be read. (See the line-reading sketch after this list.)
6. If the first operand of a command is a variable, use --
for any command that accepts this as end of argument to
avoid problems if the variable expands to a value starting with -.
7. When you create a file that needs to be deleted at the
end of the script, also set an EXIT trap at the same time.
(See the trap sketch after this list.)
8. Use ksh -n on your script to locate syntax errors that
might otherwise only show up at runtime.
9. Use standard argument conventions when writing a script
and use getopts to process the options. In fact with
ksh93 the getopts string allows you to specify a complete
manual page as well, so that documentation can be kept
in sync. (A getopts sketch follows this list.)
10. Put $LINENO in your PS4 prompt so that you will get line
numbers when you run with -x. If you are looking at performance
issues, put $SECONDS in the PS4 prompt as well.
11. Use functions to break up your code. Use function name rather
than name() and use local variables with typeset.
12. Avoid using eval unless absolutely necessary. Subtle things
can happen when a string is passed back through the shell
parser. You can use name references to avoid uses such as
eval $name="$value" (see the nameref sketch after this list).
13. Don't use eval. All the functionality of eval is built
into the shell and will be less prone to failure as well
as faster.
14. Don't use awk or sed for simple string manipulation. Use
the built-in features of the shell.
15. Use $"..." instead of "..." for strings that need to be
localized for different locales.
16. Use command exec, rather than exec, for opening files, since
you can check the status on failure; otherwise a failed
redirection will cause the script to exit.
17. When opening a file in a function that will also close
the file, use {n}<file, where n is a local variable, rather
than specifying a digit, so that it won't interfere with the
rest of the script. (See the file-descriptor sketch after this list.)
18. Use inline here documents, for example
command <<< $x
rather than
print -r -- "$x" | command
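To illustrate a few of the points above, here are some short sketches;
the file and variable names in them are made up.
First, reading lines safely (points 2, 4, and 5):

    while IFS= read -r line        # -r keeps a trailing \ literal;
    do      print -r -- "$line"    # IFS= keeps leading/trailing spaces
    done < file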
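Setting an EXIT trap when a temporary file is created (point 7):

    tmp=/tmp/myscript.$$           # name is illustrative only
    trap 'rm -f "$tmp"' EXIT       # set the trap at the same time
    date > "$tmp"
    # ... use "$tmp"; it is removed automatically when the script exits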
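A plain getopts loop for standard option handling (point 9). The options
here are invented, and the longer self-documenting option string that
ksh93 accepts is not shown:

    usage='usage: myscript [-v] [-o file] arg ...'
    vflag= ofile=
    while getopts :vo: opt
    do      case $opt in
            v)      vflag=1;;
            o)      ofile=$OPTARG;;
            *)      print -r -u2 -- "$usage"; exit 2;;
            esac
    done
    shift $((OPTIND - 1))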
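A function using typeset for local variables and a name reference
instead of eval (points 11 and 12):

    function setvar
    {
            typeset -n var=$1      # nameref to the variable named by $1
            typeset value=$2       # local
            var=$value             # assigns to that variable; no eval
    }
    setvar greeting hello
    print -r -- "$greeting"        # prints: hello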
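Opening a file with command exec and a descriptor chosen by the shell
(points 16 and 17); /etc/passwd is only an example file:

    function firstline
    {
            typeset fd line
            if      command exec {fd}< /etc/passwd
            then    read -u$fd -r line
                    print -r -- "$line"
                    exec {fd}<&-   # close the descriptor again
            else    print -r -u2 -- "cannot open /etc/passwd"
                    return 1
            fi
    }
    firstline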
David Korn
[EMAIL PROTECTED]