Quoting "Fredrik Alströmer" <r...@excu.se>:
> A couple of very small coins.
> On Thu, Apr 22, 2010 at 06:55, Martin Nordholts <ense...@gmail.com> wrote:
>> On 04/22/2010 03:54 AM, Marc Lehmann wrote:
>>> On Wed, Apr 21, 2010 at 08:14:33PM +0200, Martin
>>> Nordholts<ense...@gmail.com> wrote:
>>>> The compiler doesn't catch all cases, like this one:
>>>> #include <stdio.h>
>>>> int main(int argc, char **argv)
>>>> {
>>>>     int var;
>>>>     if (argc == 2)
>>>>         var = 42;
>>>>     printf ("var = %d", var);  /* var may be uninitialized here */
>>>>     return 0;
>>>> }
>>> 1. initialising var will not fix the bug, if there is any.
>> It won't, but it will make the bug consistently occur, which is a big plus.
>>> 2. initialising var will prevent other static analysers
>>> to diagnose a possible problem.
>> The problem to diagnose would be that of using an uninitialized variable,
>> no? The fix would then be to initialize the variable.
> I think what he's trying to say here is that initializing it to 0 is
> still uninitialized. Just deterministically so.
That's just a rhetorical way of putting it.
> And no valgrind, or
> static analyzers will notice that you're reading an uninitialized
You have a defined value, and each run gives you the same value.
That means: the bug is not fixed, but it can be reproduced with every run.
In other words: you can track it down, because it always shows the same behaviour.
In this case the value seems not to be changed... a good start: you
know what to look for.
And if the bug is that the value gets changed in between, then it's also
easier to track down, because a value that is not 0, when it definitely
has to be 0, is easier to detect than a value that can range over
everything the variable can hold, compared against any other value.
For example: you set it to zero, know that no function should work on it,
and it changes nevertheless. If it starts with <any value>, have fun
with debugging. ;-)
It's always good to know where to start from.
Languages like OCaml, for example, never have undefined values.
If you create something, it has a value.
> The fix would be to initialize the variable in all possible
> execution paths, but not necessarily to 0.
Can you explain that?
Why would not every initialization make sense?
You first set it to 0 (or NULL when it's a pointer),
and right afterwards you set the real value.
So in the case of correct code you get your initialization,
which is what you want to have.
If it's a value that is definitely always !0, but fixed
(a constant start value), then setting it to THAT value is OK too.
But then it's best done at definition time, not one or many lines later.
And it's also good not to hard-code that fixed starting point there,
but to use a #define.
If you have a fixed starting point, that's good for debugging.
If you later remove your init, or if the function that does the init
produces nonsense, you can at least detect the difference.
difference = A - B
If one of A and B is fixed, you can find the difference.
If both are undetermined, happy debugging. ;-)
Bugs that are untrackable are untrackable because of such problems.
If, for example, the x-mouse value is always 0, even when you move the
mouse, you know what's wrong at the beginning of debugging. And you know
what to look for. And a constant 0 is easier to spot than an arbitrary value.
It's distinct and clear. No need to look up man pages or science books.
You know there is always 0 or NULL. Fine, if that's wrong. :)
>>> 3. initialising var will prevent "weird effects"
>>> and just *might* decrease chances of finding the bug further.
>> Why would you want "weird effects" in software? That's exactly what you
>> don't want. At worst, a bug should manifest itself by making a program
>> not do what it was intended to do, not doing something unpredictable.
> Undeterministic behavior will expose a bug, deterministic but slightly
> wrong will probably hide it.
Deterministic behaviour will also expose a bug: it will always show you
the same wrong result.
Always the same wrong result is easier to track down than behaviour
that differs every time. And it's even more complicated if it has
to be different on every run: you have to compare all possible values
that should occur with all possible values that do occur.
See the difference example above.
In other words:
error = actual_value - wanted_value
If you know math, you know how to find the problem.
>>>> Since use of uninitialized variables very well can cause severe and
>>>> hard-to-reproduce crashes, and since unpredictability never is a good
>>> Actually, it's easy to diagnose those bugs though, just look at the
>> The coredump gives you the state of the program when it crashed, not the
>> cause leading up to the crash, which could have been an uninitialized
>> local variable that's no longer in any stack frame.
>>> Yes, don't do it unnecessarily, it tends to hide bugs.
>> Rather "As a rule of thumb, initialize local variables.". As always
>> there are cases where it's better not to initialize local variables.
> The compiler is actually smart enough to give you a warning "might be
> used uninitialized", always initializing to something will hide that
> warning. And you'll use your uninitialized value (which will always be
> zero, or whatever) unaware of that it's not sensibly initialized.
You don't need that warning anymore.
It was invented for languages that allow undefined values,
and for programmers who leave them undefined (mostly by accident).
You definitely know that there is one certain start value.
But which value does it have? Is it always the same? And the same
on all computers?
For example, it's also good after freeing memory to set the pointer to NULL.
Some people say that's not necessary.
But it has helped me find bugs.
Gimp-developer mailing list