The fact that these are "small" (at most 64 bits) is incidental, not essential; 
introducing a new quadruple type would not destabilize our concept of a 
primitive value.

If we can tip the user's mental model so that they believe "small is
good" for B3 values, then we help them hit the sweet spot of the
design and avoid tearing issues.  It doesn't change the model, but
the more we can encourage the belief that B3 values should be <= 64
bits, the happier users will be with the results.

I think it's reasonable to say that “we can flatten 64 bits better than we can 
flatten 256, but go ahead and write the code you want, and we’ll do what we 
can.”  Recent data suggests that we can get to 128 more quickly than we had 
initially expected, and (especially if we can drop the null footprint tax, as 
B3 does) you can do a lot in 128.  Presumably in some future hardware 
generation this number will go up again; whether that is 5 or 10 years from 
now, we don’t know right now.
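As a rough sketch of what the "null footprint tax" means for flattened storage (the layout and names here are illustrative, not actual JVM internals): if each 128-bit value is stored inline as two longs, a null-free element fits in exactly two slots, while a nullable one needs a third slot to carry the null channel.

```java
// Hedged sketch: model flattened heap storage for a 128-bit value as
// slots in a long[]. Names and layout are hypothetical illustrations.
public class FlattenSketch {
    // Null-free layout: exactly 2 longs of payload per element.
    static void storeNullFree(long[] heap, int i, long lo, long hi) {
        heap[2 * i]     = lo;
        heap[2 * i + 1] = hi;
    }

    // Nullable layout: 2 longs of payload plus 1 slot for the null
    // flag -- the per-element "footprint tax" for keeping nullity.
    static void storeNullable(long[] heap, int i, Long lo, Long hi) {
        if (lo == null || hi == null) {
            heap[3 * i + 2] = 0;        // null channel: empty
        } else {
            heap[3 * i]     = lo;
            heap[3 * i + 1] = hi;
            heap[3 * i + 2] = 1;        // null channel: present
        }
    }

    public static void main(String[] args) {
        long[] nullFree = new long[10 * 2];  // 10 elements, 20 slots
        long[] nullable = new long[10 * 3];  // 10 elements, 30 slots
        storeNullFree(nullFree, 0, 1L, 2L);
        storeNullable(nullable, 0, 1L, 2L);
        // Same 10 elements, 50% more storage for the nullable layout.
        System.out.println(nullFree.length + " " + nullable.length);
    }
}
```

Dropping nullity is what lets the null-free layout pack payload with zero overhead, which is exactly why B3 can make 128 bits attractive sooner.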

The tangible things I think we want permission from the user to do are:

 - drop identity
 - drop nullity
 - drop atomicity (non-tearing)

B2, as currently sketched, drops the first; B3.val further drops nullity and 
atomicity together.  Whether this is the right stacking is a good discussion to 
be having now, but ultimately we need permission for each of these.  While 
“small” objects may sidestep the atomicity constraint, we’d like this to remain 
an implementation detail, not an in-your-face aspect of the programming model.
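To make the atomicity permission concrete, here is a hedged sketch of what tearing looks like for a 128-bit value stored as two longs: if the two word writes are not atomic, a racy reader can observe a value no thread ever wrote. (The interleaving is simulated single-threaded for determinism; an actual tear requires a real data race.)

```java
// Hedged sketch: a 128-bit "value" held as two 64-bit halves, written
// non-atomically. We simulate a reader landing between the two word
// writes and observing a torn value (half old, half new).
public class TearingSketch {
    long lo, hi;   // the two 64-bit halves of a 128-bit value

    long[] snapshot() { return new long[] { lo, hi }; }

    public static void main(String[] args) {
        TearingSketch v = new TearingSketch();
        v.lo = 1; v.hi = 1;            // value (1, 1) fully written

        v.lo = 2;                      // first half of writing (2, 2)...
        long[] torn = v.snapshot();    // ...simulated racy read lands here
        v.hi = 2;                      // ...second half completes

        // The reader saw (2, 1): neither (1, 1) nor (2, 2).
        System.out.println(torn[0] + "," + torn[1]); // prints "2,1"
    }
}
```

A value that fits in 64 bits can be written with a single atomic store, which is why "small" objects sidestep the problem; the point above is that whether the JVM exploits that should stay an implementation detail.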
