So od's output depends on the endianness of the underlying CPU/ABI
combination.
The CPU alone is not enough to determine it, as some CPUs support both
byte orders, and I suspect that on an OS supporting multiple ABIs the
choice becomes even harder to predict.
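
For example, here is a minimal demonstration (output shown as it would
appear on a little-endian host such as x86; a big-endian host would
print the byte-swapped form, " 0102 0403" becoming " 0102 0304"):

    $ printf '\001\002\003\004' | od -An -tx2
     0201 0403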
Is there a way of forcing little- or big-endian behaviour in od?
Would that not be a good addition, as it would also draw attention, in
the documentation, to the somewhat surprising default behaviour?
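
In the meantime, two portable workarounds come to mind (sketches,
with output shown as it would appear on a little-endian host): dump
single bytes, which is byte-order independent, or swap adjacent byte
pairs with dd conv=swab before a 16-bit dump:

    # byte-at-a-time dump: identical on any host
    $ printf '\001\002\003\004' | od -An -tx1
     01 02 03 04

    # swap byte pairs first, flipping the 16-bit interpretation
    $ printf '\001\002\003\004' | dd conv=swab 2>/dev/null | od -An -tx2
     0102 0304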

Charlie Gordon.