On Thu, Apr 22, 2010 at 14:00, Oliver Bandel <oli...@first.in-berlin.de> wrote:
> Zitat von "Fredrik Alströmer" <r...@excu.se>:
>> And no valgrind, or
>> static analyzers will notice that you're reading an uninitialized
>> zero.
> No problem.
> You have that defined value, and with each run it gives you the same value.
> That means: the bug is not fixed, but can be recreated with every run.
> This is: you can track it down, because it always gives you the same behaviour.
> In this case: the value seems not to be changed... a good start: you
> know what to look for.

Let's say you pass that value down the stack, 0 IS a valid value for
this particular algorithm, and will produce results which are similar,
but not identical, to a non-zero value. If there are many factors
which might distort the result, you'd have no idea where to start
looking. If you DIDN'T initialize your variable, valgrind would tell
you that the algorithm is reading uninitialized memory, and it'd also
tell you which one.
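A minimal sketch of that difference (function and values made up for illustration): with no blanket initialization, both the compiler and valgrind point straight at the forgotten path; pre-set to 0, the same call silently computes with a plausible-looking value.

```c
/* Hypothetical example: 'scale' is set on two of three paths.  Left
 * uninitialized, gcc -Wall warns that 'scale' may be used
 * uninitialized, and valgrind flags the read when mode == 2 is hit.
 * Pre-set it to 0 and both tools go quiet, while mode == 2 silently
 * returns 0. */
int scaled(int mode, int x)
{
    int scale;                 /* deliberately no blanket 0 here */
    if (mode == 0)
        scale = 1;
    else if (mode == 1)
        scale = 10;
    /* mode == 2 forgotten: reading 'scale' there is exactly the bug
     * the tools can catch */
    return x * scale;
}
```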


> Languages like OCaml for example never have undefined values.
> If you create something, it has a value.
> That's fun.

Right, so does Java, yet variables can still be uninitialized there, and
using one produces a warning (or, in some cases, an actual error).

>> The fix would be to initialize the variable in all possible
>> execution paths, but not necessarily to 0.
> Can you explain that?

Well, if you end up with an uninitialized variable, there's an
execution path which CAN reach that state, and chances are, you didn't
think of it. And IN that case the variable needs to be initialized,
but to what depends on the case (the one you forgot about).
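A minimal sketch of that idea, with made-up names: the fix gives the forgotten path its own, case-appropriate value, rather than papering over it with an arbitrary 0 at the declaration.

```c
/* Hypothetical config lookup: each mode has its own sensible value,
 * so the originally-forgotten default path gets the value THAT case
 * needs, not a blanket 0 at definition time. */
int buffer_size(int mode)
{
    int size;
    if (mode == 0)
        size = 512;        /* interactive mode */
    else if (mode == 1)
        size = 4096;       /* batch mode */
    else
        size = 1024;       /* the execution path that was forgotten */
    return size;
}
```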

> Why shouldn't every initialization make sense?
> You first set it to 0 (or NULL when it's a pointer),
> and right afterwards you set the right value.
> So in the case of correct code, you get your initialisation,
> which you want to have.

This is just... weird. If you can initialize it directly, why bloat the code?
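A tiny illustration of that bloat (function names hypothetical): when the real value is known at the definition, the extra zero-assignment is dead code that is immediately overwritten.

```c
#include <string.h>

/* Redundant: the 0 is written and immediately overwritten. */
size_t redundant_init(const char *s)
{
    size_t len = 0;
    len = strlen(s);
    return len;
}

/* Direct: initialize once, at the definition. */
size_t direct_init(const char *s)
{
    size_t len = strlen(s);
    return len;
}
```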

> If it's a value that is definitely always !0, but fixed
> (a constant start value), then setting it to THAT value is OK too.
> But then it's best to do it at definition time, not one or many lines later.
> And it's also good not to hard-code that fixed starting point there,
> but to use #define.
> If you have a fixed starting point, that's good for debugging.
> If you later remove your init, or if the function that does the init
> produces nonsense, you at least can detect the difference.
> difference = A - B
> If one of A and B is fixed, you can find the difference.
> If both are undetermined, happy debugging. ;-)
> Bugs that are untrackable are untrackable because of those problems.

You've never used tools like static analyzers and valgrind, right?
Bugs that are untrackable are usually not of this kind, rather
pertaining to race conditions or intricate semantic problems.

-- snip --
>>>> 3. initialising var will prevent "weird effects"
>>>>     and just *might* decrease chances of finding the bug further.
>>> Why would you want "weird effects" in software? That's exactly what you
>>> don't want. At worst, a bug should manifest itself by making a program
>>> not do what it was intended to do, not doing something unpredictable.
>> Nondeterministic behavior will expose a bug; deterministic but slightly
>> wrong will probably hide it.
> Heheh. funny.


> Deterministic behaviour will also expose a bug: it will show you
> always the same wrong result.
> Always the same wrong result is easier to track down than
> behaviour that is different each time. And it's even more complicated if it has
> to be different every time. You have to compare all possible values
> that should occur with all possible values that do occur.
> See the difference-example above
> In other words:
> error = actual_value - wanted_value
> If you know math, you know how to find the problem.

This is true for obvious bugs. For not so obvious ones, where 0 IS a
valid value, and SHOULD be 0 most of the time, you will NOT spot the
error until someone hits that corner case where it shouldn't be.
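A hedged sketch of that kind of corner case (all names and values made up): the value is legitimately 0 almost all of the time, so a blanket zero-initialization that masks a forgotten path produces correct-looking results for nearly every input.

```c
/* Hypothetical pricing: 'discount' is a valid 0 for regular
 * customers, so blanket-zeroing it hides the forgotten premium
 * path.  Totals are right for almost everyone; only premium
 * customers get a wrong (but entirely plausible) price, and
 * neither the compiler nor valgrind has anything to complain
 * about. */
double price(int is_premium, double base)
{
    double discount = 0.0;      /* blanket init masks the bug */
    if (!is_premium) {
        discount = 0.0;         /* the common, correct case */
    }
    /* premium path forgotten: it should have set a real discount */
    return base * (1.0 - discount);
}
```

The silent failure: `price(1, 100.0)` returns the same full price as `price(0, 100.0)`, and nothing flags it.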

-- snip --
>> The compiler is actually smart enough to give you a warning "might be
>> used uninitialized", always initializing to something will hide that
>> warning. And you'll use your uninitialized value (which will always be
>> zero, or whatever) unaware of that it's not sensibly initialized.
> You don't need that warning anymore.
> They were invented for languages that allow undefined values,
> and programmers who leave them undefined (mostly by accident).

This is simply wrong. It's rather naive to equate uninitialized with
undefined, as I've tried to explain above. As I also said, Java will
also give you an error in this case even though the value is still well
defined. Using something uninitialized (defined or not) suggests
you've forgotten an execution path, and your code is semantically
incomplete.

> You definitely know that there is one certain start value.
> But which value does it have? Is it always the same? And the same
> on all computers?

If there was a way to define a value without initializing it, I'd be
all for it. But AFAIK, there isn't. Unfortunately.

> Ciao,
>    Oliver
> P.S.:
> For example it's also good after freeing memory, to set the pointer to NULL.
> Some people say: not necessary.
> But it has helped me finding bugs.

That's a completely different case, and makes a lot more sense, as
after freeing it's initialized but undefined (yes, I know, it keeps
pointing where it used to point, but that's not the point, no pun
intended).
For the record, I'm not necessarily against setting a predefined value
to variables sometimes. I'm just against doing it for the wrong
reasons, and I'd much rather have the compiler say "Warning: might be
used uninitialized in this context" as a part of the static analysis,
rather than chase down the bug where a value is 0 at run time
(remember, I'm primarily talking corner cases here).

Gimp-developer mailing list
