I'm working on a hybrid CPU/FPGA design with an Arm Cortex-A9 sharing die space with an FPGA. The FPGA works by applying rotations to individual elements of a vector simultaneously, but in a non-linear fashion.
I'm running into a problem, though. For instance, take the char array for HELLO: 72, 69, 76, 76, 79. The algorithm calls for a rotate left by 7 on each byte, which I would think is the same as a rotate right by 1. 72, which is 01001000, becomes 00100100, or 36, when I hand-calculate it, no matter which direction I rotate. When I simulate it, however, I get a value of 9216. A rotate right by 1 gives the correct answer. Is this an endianness problem, or am I missing something? Thanks!

/*
PLUG: http://plug.org, #utah on irc.freenode.net
Unsubscribe: http://plug.org/mailman/options/plug
Don't fear the penguin.
*/
