On 8/2/12 5:14 AM, Era Scarecrow wrote:
On Thursday, 2 August 2012 at 09:03:54 UTC, monarch_dodra wrote:
I had an (implementation) question for you: Does the implementation
actually require knowing what the size of the padding is?

e.g.:
struct A
{
    int a;
    mixin(bitfields!(
        uint,  "x", 2,
        int,   "y", 3,
        ulong, "",  3  // <- This line right there
    ));
}

Is that highlighted line really mandatory?
I'm fine with having it optional, in case I'd want to have, say, a 59-bit
padding, but can't the implementation figure it out on its own?
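
(For illustration, the 59-bit case spelled out in the same bitfields syntax; the struct and field names here are just placeholders:)

import std.bitmanip : bitfields;

struct Wide
{
    mixin(bitfields!(
        uint,  "flags", 5,   // 5 bits of payload
        ulong, "",      59   // explicit 59-bit padding fills the ulong
    ));
}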

The original code has it set that way; why? Perhaps so you are aware of,
and have explicitly in place, where all the bits are assigned (even if you
aren't using them). It would be horrible if you accidentally used 33 bits and
it extended to 64 without telling you (wouldn't it?).

Yes, that's the intent. The user must define exactly how an entire ubyte/ushort/uint/ulong is filled; otherwise, ambiguities and bugs soon follow.
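
A minimal sketch of that check, assuming the current std.bitmanip (the struct names are placeholders): the field widths must total exactly 8, 16, 32, or 64 bits, so omitting the padding is rejected at compile time.

import std.bitmanip : bitfields;

struct Full
{
    mixin(bitfields!(
        uint,  "x", 2,   // bits 0..1
        int,   "y", 3,   // bits 2..4 (signed, range -4..3)
        ulong, "",  3    // padding completes the 8-bit store
    ));
}

// This variant does not compile: 2 + 3 = 5 bits, and the
// widths must sum to 8, 16, 32, or 64.
//
// struct Partial
// {
//     mixin(bitfields!(
//         uint, "x", 2,
//         int,  "y", 3));
// }

unittest
{
    Full f;
    f.x = 3;
    f.y = -2;
    assert(f.x == 3 && f.y == -2);
}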

However, having it fill in the size and ignore the last x bits wouldn't
be too hard to do; I've been wondering if I should remove it.

Please don't. The effort on the programmer's side is virtually nil, and the requirement keeps things in check. In no case would the use of bitfields() be so intensive that the bloat of one extra line becomes significant.


Andrei
