* Costello, Roger L. wrote:
>On page 62 it says:
>
>    ... when we store ... data on disk, we write
>    not 32-bit (or 16-bit) numbers but series of
>    four (or two) bytes. And according to the
>    type of processor (Intel or RISC), the most
>    significant byte will be written either first
>    (the "little-endian" system) or last (the
>    "big-endian" system). Therefore we have
>    both a UTF-32BE and a UTF-32LE, a UTF-16BE
>    and a UTF-16LE.
>
>Then, on page 63 it says:
>
>    ... UTF-16 or UTF-32 ... if we specify one of
>    these, either we are in memory, in which case
>    the issue of representation as a sequence of
>    bytes does not arise, or we are using a method
>    that enables us to detect the endianness of the
>    document.
>
>When data is in memory, isn't it important to know
>whether the most significant byte is first or last?
The idea is that this knowledge is implied because there is only a single system with a single convention involved, with the assumption that you do not look behind the curtain (do not access the "first" byte of a multi-byte integer, for instance).
-- 
Björn Höhrmann · mailto:[email protected] · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/

