> Agreed, it's not pretty.  The fundamental problem is that a primitive
> like an C<int> simply cannot be undefined... there's no flag for that
> (which is why they're primitive.)

Certainly there's no way of _storing_ C<undef>.

> So it having a 'default value' at all is perhaps a bit of a misnomer.

Why does that follow?  I'd say the opposite is true: it's because the
type _can't_ store C<undef> that a _valid_ default is required.

> A simple solution is perhaps to say that C<is default> can only be
> applied to types that can be undef (scalar,ref,Int,Str...) and can't
> be used on things that have no undefined state (bit,int,str...).

But what's the disadvantage of permitting things of type int to have
non-zero default values?

And, just because C<undef> can't be stored in an int, why does it mean
that _trying_ to store an C<undef> can't be the action that triggers the
default being stored there?

For an int variable with a default of 5, you seem to have gone from the
suggestion that attempting to store either zero or undef would result in
5 being stored, to the suggestion that either would result in zero being
stored.

Why can't zero and undef do different things?  People obviously want to
be able to store zeros in integer variables, and having code that looks
like it stores zero -- a valid integer -- actually store some other
integer is ridiculous.  So have zero store zero, always.  But storing
C<undef> denotes clearing the element of any particular value, which
seems like a good time to use the default.
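
For instance, borrowing the C<is default> syntax from earlier in this
thread (the exact spelling is, of course, still speculative):

  my int @counts is default(5);

  @counts[0] = 0;      # stores 0 -- the code does what it looks like
  @counts[1] = undef;  # "clear this element": the default, 5, is stored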

Damian yesterday argued in favour of C<undef>, the value which is used
when no other value is known, not being permitted in Int arrays which
have a specific (integer) default, and hence that marking an element as
C<undef> should put the default value there.

That would make the rule very simple indeed:

  Assigning C<undef> to an array element causes that element to take the
  array's default value.

That's it.  It's what I assumed Damian meant yesterday, but I could be
mistaken.
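
In code that would look something like this (the syntax, again, being
purely illustrative):

  my Str @names is default('anonymous');

  @names[2] = undef;     # element 2 now holds 'anonymous', the default
  @names[2] = 'Damian';  # and now holds 'Damian', as you'd expect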

The effects of this are:

  * Assigning a particular integer to an array of int or Int always does
    what it looks like it's doing, irrespective of whether or not that
    integer is zero or whether the array happens to have a default.

  * In an array of Int, attempting to store C<undef> will, by default,
    actually store C<undef>.  If the array has a different default
    defined then that will be stored instead.

  * In an array of int, attempting to store C<undef> will, by default,
    store zero.  If the array has a different default defined then that
    will be stored instead.
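
Or, spelling out those three cases in (equally hypothetical) code:

  my Int @a;                # no default defined
  @a[0] = undef;            # stores undef -- an Int can hold it

  my Int @b is default(7);
  @b[0] = undef;            # stores 7, the defined default
  @b[1] = 0;                # stores 0, exactly as it appears to

  my int @c;                # no default defined
  @c[0] = undef;            # stores 0 -- an int can't hold undef

  my int @d is default(7);
  @d[0] = undef;            # stores 7, the defined default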

Smylers
