Nicholas Wilson wrote:
On Thursday, 15 February 2018 at 16:51:05 UTC, Kyle wrote:
Hi. Is there a convenient way to convert a ubyte[4] into a signed int?
I'm having trouble handling the static arrays returned by
std.bitmanip.nativeToLittleEndian. Is there some magic sauce to make the
static arrays into input ranges or something? As a side note, I'm used
to using D on Linux, and DMD's error messages on Windows are comparatively
terrible. Thanks!
You mean you want to convert the bit pattern represented by the ubyte[4] to
an int?
You want a reinterpret-style cast:
ubyte[4] foo = ...;
int baz = *cast(int*)&foo;
Better to use `&foo[0]`; that way it will also work with slices.
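A minimal runnable sketch putting the two together (the byte values here are just illustrative):

import std.stdio;

void main()
{
    // illustrative bytes in little-endian order
    ubyte[4] foo = [0x78, 0x56, 0x34, 0x12];

    // reinterpret the address of the first byte as an int*;
    // &foo[0] also works when foo is a slice (ubyte[])
    int baz = *cast(int*)&foo[0];

    writefln("0x%08x", baz); // prints 0x12345678 on a little-endian machine
}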