The following code example

import std.stdio, std.bitmanip;

enum E2 { a, b, c, d, e } // five members, values 0 .. 4: E2.e needs 3 bits

immutable bf = bitfields!(uint, "x", 6,
                          E2, "e2", 2); // only 2 bits reserved for e2

struct A { mixin(bf); }

void main(string[] args)
{
    A obj;
    obj.x = 2;
    obj.e2 = E2.a;

    import core.exception: AssertError;
    bool threw = false;
    try
    {
        obj.e2 = E2.e; // E2.e == 4 does not fit in 2 bits
    }
    catch (AssertError e)
    {
        threw = true; // ok to throw: the generated setter asserts the value fits
    }
    assert(threw, "AssertError not thrown");
}

shows how brilliantly generic D is with regard to std.bitmanip.bitfields.

However, wouldn't it be better to detect such mismatches between an enum's required bit width and the declared bitfield length at compile time instead of at run time?

This is possible because all of the size information is available at compile time, as template parameters to bitfields.
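
For example, here is a minimal sketch of such a check. The checkedBitfields wrapper below is hypothetical, not a std.bitmanip symbol: it static-asserts that the enum's largest member fits into the requested width before delegating to the real bitfields template.

import std.bitmanip : bitfields;

enum E2 { a, b, c, d, e } // E2.max == E2.e == 4, needs 3 bits

// Hypothetical wrapper: rejects undersized widths at compile time,
// then forwards to std.bitmanip.bitfields as usual.
template checkedBitfields(T, string tName, size_t tBits,
                          E, string eName, size_t eBits)
    if (is(E == enum))
{
    static assert(cast(ulong) E.max < (1UL << eBits),
                  E.stringof ~ " does not fit in the given bit width");
    enum checkedBitfields = bitfields!(T, tName, tBits, E, eName, eBits);
}

struct A { mixin(checkedBitfields!(uint, "x", 5, E2, "e2", 3)); } // OK: 3 bits hold 0 .. 4

// struct B { mixin(checkedBitfields!(uint, "x", 6, E2, "e2", 2)); }
// would fail to compile: static assert "E2 does not fit in the given bit width"

void main()
{
    A obj;
    obj.e2 = E2.e; // now also legal at run time
    assert(obj.e2 == E2.e);
}

With the field widened to 3 bits, the assignment that previously tripped the run-time assert is accepted, and the undersized declaration never compiles in the first place.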
