On 5/19/06, Kimball Larsen <[EMAIL PROTECTED]> wrote:
Ok, so I've got a degree in CS and all, but it's been a long time since
I earned that degree, and I haven't had to mess with stuff like this
since.  I am working on someone else's code (this someone is no longer
at my disposal) and have the following method:


        public static int get2Little(byte abyte0[], int i) throws Exception
        {
                return abyte0[i + 1] << 8 & 0xff00 | abyte0[i + 0] & 0xff;
        }

(of course, there are no comments anywhere in the entire class... )

So, a byte[] and an index are passed in; the byte at i+1 is shifted
left 8 and masked with 0xff00 (the shift binds tighter than the & in
Java), then or'ed with the byte at i, which is masked with 0xff.

Thus, this appears to be reading 2 bytes out of the array and
returning them as an int.

So, if the array has 0xC0 as the first byte, and 0x1C as the second
byte, this should return the integer 49180.

Right?

Isn't there a much easier way to do this?

Does not seem to be doing what I expect...
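[A hand trace of the expression, assuming the two-byte array from the example above, shows why the result differs from the expectation:]

```java
public class Trace {
    public static void main(String[] args) {
        byte[] abyte0 = { (byte) 0xC0, (byte) 0x1C };

        // Java sign-extends bytes to int before the bit ops, hence the masks.
        int hi = abyte0[1] << 8 & 0xff00;  // 0x1C shifted up: 0x1C00
        int lo = abyte0[0] & 0xff;         // 0xC0 kept low:   0x00C0

        // The first byte stays in the LOW half: the data is little-endian.
        System.out.println(hi | lo);       // prints 7360 (0x1CC0)
    }
}
```

The routine treats the *first* byte as the low-order byte, so it returns 0x1CC0 = 7360; 49180 (0xC01C) is what a big-endian read of the same two bytes would give.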

Anyone out there able to enlighten?

-- Kimball

It looks like the method is reading a 16-bit little-endian value out of
the byte array (so the bytes come out "swapped" relative to Java's
big-endian default).  I believe there is an easier way to do this with
a standard library, but the above routine will work.
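[One standard-library route is java.nio.ByteBuffer, which can be told the byte order explicitly; a minimal sketch, assuming the caller wants the same unsigned 0..65535 result as the original method:]

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Little {
    // Same semantics as get2Little, via java.nio (Java 1.4+).
    public static int get2Little(byte[] b, int i) {
        return ByteBuffer.wrap(b)
                .order(ByteOrder.LITTLE_ENDIAN)
                .getShort(i) & 0xffff;  // mask: getShort() is signed
    }

    public static void main(String[] args) {
        byte[] data = { (byte) 0xC0, (byte) 0x1C };
        System.out.println(get2Little(data, 0));  // prints 7360
    }
}
```

The & 0xffff matters: getShort() returns a signed short, so without the mask a value like 0xC01C would come back negative.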

-Bryan

/*
PLUG: http://plug.org, #utah on irc.freenode.net
Unsubscribe: http://plug.org/mailman/options/plug
Don't fear the penguin.
*/
