Based on your pictures, let me assume the bitOrder is mostSignificantBitFirst.


The way a binary integer is constructed from the 10 bits is not the way you
are showing it.


First you create the contiguous set of 10 bits. This is done by appending all 
the bits together according to the bitOrder. So the 10 bits are


1111000 000


The first 7 of those come from the first byte.

The last 3 come from the 2nd byte.
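
To make this concrete, here is a small Python sketch of the extraction (just
an illustration, not Daffodil's actual code), assuming the input bytes
0x78 0x01 and that the 1 bit for element One has already been consumed:

    data = bytes([0x78, 0x01])     # 01111000 00000001

    # bitOrder mostSignificantBitFirst: bits are consumed from the
    # most-significant end of each byte. The MSB of byte 0 is already
    # used (element One), so element Ten takes the low 7 bits of
    # byte 0 followed by the top 3 bits of byte 1.
    low7 = data[0] & 0x7F              # 1111000
    top3 = data[1] >> 5                # 000
    bitstr = format(low7, "07b") + format(top3, "03b")
    print(bitstr)                      # -> 1111000000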


Now you divide those up into 8-bit bytes. Since bitOrder is
mostSignificantBitFirst, you do this starting from the left.


11110000 00


Now pad the 2nd byte so it is a full byte. Again this is done based on the 
bitOrder, so those 2 bits are shifted right by 6 bits.


11110000 xxxxxx00


The "x" are zeros, but I'm using "x" just to illustrate the shift. This is the 
confusing and subtle step due to bitOrder MSBF. When using bitOrder MSBF, a 
chunk of bits will come from the most-significant bits of the next byte, but if 
that's the last part of the number, the place-value of those bits is wrong as 
far as what they contribute numerically to a number. They have to be shifted to 
the other end of the byte to contribute the proper numeric value.


Having done this shift, it is only now that byte order comes into play. So far, 
all we were doing was constructing the proper bytes from the bits in the data.


Now, treat this as a 2-byte littleEndian integer. So let's write the bits in 
normal left-to-right place-value order with most significant on the left:


xxxxxx00 11110000


So the 10-bit number is 0011110000, or 0x0F0, which is 240.
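
And the last step of the sketch, assembling the two constructed bytes as a
littleEndian integer:

    # byteOrder littleEndian: the first constructed byte is the
    # least-significant byte of the integer.
    value = int.from_bytes(bytes([0xF0, 0x00]), "little")
    print(value)                       # -> 240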




-mike beckerle

Tresys


________________________________
From: Costello, Roger L. <[email protected]>
Sent: Tuesday, February 19, 2019 6:15:44 PM
To: [email protected]
Subject: Two bytes ... bitOrder="mostSignificantBitFirst" ... 
byteOrder="littleEndian" ... I don't understand Daffodil's output


Hi Folks,



Here’s my input displayed in a hex editor:



[cid:[email protected]]



So the input is hex 78 01



Here is the equivalent binary:



[cid:[email protected]]

My DFDL schema declares:

                byteOrder="littleEndian"

bitOrder="mostSignficantBitFirst"



My schema declares this sequence of elements:



element One, length 1 bit

element Ten, length 10 bits

element Five, length 5 bits



I ran the input plus schema through Daffodil and this is the XML that was 
generated:



<Test>
  <One>0</One>
  <Ten>240</Ten>
  <Five>1</Five>
</Test>



Please help me understand that output.



Let’s work through it. The schema first declares the One element. So, Daffodil 
consumes the first bit of the first byte:



[cid:[email protected]]

Is that correct thus far?



Next, the Ten element wants 10 bits. There are only 7 remaining bits in the 
first byte, so 3 bits are consumed from the second byte. Which 3 bits? The 3 
most significant bits. Since byteOrder is littleEndian, those 3 bits are more 
significant than the 7 bits; therefore, the 3 bits go to the front of the 7 
bits:



[cid:[email protected]]



The result after prepending the 3 bits onto the 7 bits is this:



0001111000



That binary corresponds to decimal 120



But Daffodil says the value of Ten is 240. Why? Where did I go wrong?



/Roger
