I've been leaving this thread alone because I've been very interested in 
various people's contributions.  But I'm a security professional so I know 
stuff most programmers don't know, and I've seen enough wrong information here 
that I'm concerned someone will use it to design something badly.  So please 
excuse me.  I'm not pouncing on any one person's post, since no one post has 
been worth pouncing on.  I'm reacting to the thread in general.

--------

Useful and plausible initial variable values lead to situations where a 
programmer/debugger may think things are working when they're not.  Clutter is 
good.  And intentional clutter is better than random clutter.

Zero is the worst value to have as a default because it is useful and plausible 
and will trigger no errors.  Zero is a useful initial value in many situations. 
 It can be used as a pointer without triggering alignment errors (it's small 
and always on a word boundary no matter what word size you're using).  It can 
be used as a memory offset without errors.  It is a low number and thus can be 
included in calculations without the result looking obviously wrong.
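
For example (a hypothetical sketch; the struct and field names are my own, not 
from anyone's post), a field left at a zero default runs straight through a 
calculation with no error at all:

#include <stdio.h>
#include <string.h>

struct account {
    double balance;
    double rate;    /* forgotten: never assigned, stays at the 0.0 default */
};

int main(void)
{
    struct account a;
    memset(&a, 0, sizeof a);   /* the zero default many runtimes hand you */
    a.balance = 1000.0;
    /* No crash, no warning: the interest is just silently zero. */
    printf("interest: %.2f\n", a.balance * a.rate);
    return 0;
}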

Common values (for over 30 years) for initialisation are numbers like 
0xDEADBEEF.  This value has these advantages:

a) It is memorable and easy to recognise so if any programmer/debugger sees it 
they know they have a bug somewhere.
b) It is odd, meaning that any attempt to use it as a pointer on CPUs which 
demand that pointers be aligned to words will immediately trigger an alignment 
fault, 'bus error', or equivalent.
c) Not just the whole value but every byte has the top bit set, which means 
that however you slice it and use it as a pointer, the address lands far 
outside the memory assigned to your App and you will get an 'illegal memory 
access' (or equivalent) error.
d) It has a very high value, meaning that if it's used as a memory offset 
you'll get an 'illegal memory access' (or equivalent) error because you're 
reaching far beyond the memory that was assigned in that block.
e) It has a very high value (3,735,928,559 in decimal), meaning that if it is 
included in any calculation the result will be obviously wrong.
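
To make this concrete, here is a minimal sketch (the wrapper name is my own, 
not from any particular library) of a debug allocator which poisons fresh 
blocks with this pattern:

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

void *debug_malloc(size_t size)
{
    void *block = malloc(size);
    if (block != NULL) {
        /* Write the poison as whole 32-bit words so an aligned read of an
         * "uninitialised" int or pointer comes back as 0xDEADBEEF; any
         * trailing bytes are poisoned individually. */
        const uint32_t poison = 0xDEADBEEFu;
        uint8_t *p = block;
        size_t i = 0;
        for (; i + sizeof poison <= size; i += sizeof poison)
            memcpy(p + i, &poison, sizeof poison);
        for (; i < size; i++)
            p[i] = 0xEF;
    }
    return block;
}

Any pointer or counter read out of such a block carries the poison value and 
fails loudly in the ways listed above.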

It is bad for this initialisation to be done by the compiler (or its runtime 
library) when an App is given new memory.  You could write your own compiler 
which skipped it, and in that way spy on other Apps' leftover data.  In 
operating systems designed for security it is done by the operating system 
/when the previous application releases the memory, before that memory is 
returned to the free pool/.  This means that even if someone finds a way to 
trick the OS into letting an App read memory which hadn't been allocated to it 
yet, the contents would already have been overwritten.
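
A user-space analogue of the same scrub-on-release idea (a sketch only; the 
wrapper name is made up) is to wipe a block before it goes back to the 
allocator, using a wipe the optimiser is not allowed to delete:

#define _DEFAULT_SOURCE        /* for explicit_bzero() on glibc */
#include <stdlib.h>
#include <string.h>

void secure_free(void *block, size_t size)
{
    if (block != NULL) {
        /* A plain memset() right before free() can be optimised away as a
         * dead store; explicit_bzero() (glibc, the BSDs) cannot.  Other
         * platforms offer memset_s() or SecureZeroMemory() instead. */
        explicit_bzero(block, size);
        free(block);
    }
}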

Of course, many of us are still using operating systems written under the 
assumption that one App must trust another.  This got better for a while as 
people who worried about desktop computers started worrying about rogue Apps.  
But for the last decade it has got worse because so many of our computing 
devices are tiny and need to operate on battery power (phones, watches, GPS 
devices) and don't have enough free CPU to wipe memory every time it's 
released.  So, annoyingly, the device which knows the most personal information 
about you, your mobile phone, is probably the one with the worst security.

In closing, I hope that the construction used in C and other languages

int i, loopCounter;
float interestRate, currentDebt;

is going to go away.  I want the next generation of compilers to require that 
the programmer specify an initial value (a constant, not a calculated value) 
for every variable they define, including every array element when they define 
an array.
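
Until such compilers exist, the closest we can get in C is to write the 
initial values explicitly ourselves (purely illustrative, reusing the variable 
names above):

int i = 0, loopCounter = 0;
float interestRate = 0.0f, currentDebt = 0.0f;

/* Designated initialisers (C99) let you spell out each array element;
 * anything you leave out is silently zero-filled, which is exactly the
 * kind of default argued against above. */
float monthlyRate[12] = { [0] = 0.05f, [1] = 0.05f,  [2] = 0.05f,
                          [3] = 0.05f, [4] = 0.05f,  [5] = 0.05f,
                          [6] = 0.05f, [7] = 0.05f,  [8] = 0.05f,
                          [9] = 0.05f, [10] = 0.05f, [11] = 0.05f };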

Simon.
