Hi Folks,

My input, displayed in a hex editor, is the two bytes hex 78 01.

Here is the equivalent binary:

0111 1000 0000 0001

My DFDL schema declares:

    byteOrder="littleEndian"
    bitOrder="mostSignificantBitFirst"

My schema declares this sequence of elements:

element One, length 1 bit
element Ten, length 10 bits
element Five, length 5 bits

I ran the input plus schema through Daffodil and this is the XML that was 
generated:

<Test>
  <One>0</One>
  <Ten>240</Ten>
  <Five>1</Five>
</Test>

Please help me understand that output.

Let's work through it. The schema first declares the One element, so Daffodil
consumes the first bit of the first byte: the most significant bit of 0x78,
which is 0.

Is that correct so far?
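Here's a quick Python sketch of my mental model of that first step (my own
reasoning, not Daffodil's actual implementation):

```python
# The two input bytes, as shown in the hex editor: 78 01
data = bytes([0x78, 0x01])

# With bitOrder="mostSignificantBitFirst", the first bit consumed
# is the most significant bit of the first byte.
first_bit = (data[0] >> 7) & 1
print(first_bit)  # 0, matching <One>0</One>
```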

Next, the Ten element wants 10 bits. There are only 7 bits remaining in the
first byte, so 3 bits are consumed from the second byte. Which 3 bits? The 3
most significant bits. Since byteOrder is littleEndian, those 3 bits are more
significant than the 7 bits; therefore, the 3 bits go to the front of the 7
bits.

The result after prepending the 3 bits onto the 7 bits is this:

0001111000

That binary corresponds to decimal 120.
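Here is that whole calculation as a small Python sketch, so you can see
exactly how I am assembling the bits (again, my own model of littleEndian
bit assembly, not what Daffodil does internally):

```python
data = bytes([0x78, 0x01])

# All 16 bits, most significant bit of each byte first.
bits = ''.join(format(b, '08b') for b in data)  # '0111100000000001'

seven = bits[1:8]    # remaining 7 bits of byte 1: '1111000'
three = bits[8:11]   # 3 most significant bits of byte 2: '000'

# My understanding: littleEndian makes the second byte's bits more
# significant, so the 3 bits are prepended to the 7 bits.
ten = three + seven  # '0001111000'
print(int(ten, 2))   # 120, not the 240 that Daffodil reports
```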

But Daffodil says the value of Ten is 240. Why? Where did I go wrong?

/Roger
