Hello DFDL community,
My input is binary: a 2-bit unsigned integer, followed by a 3-bit unsigned
integer, followed by padding out to an 8-bit boundary. The bit order is
leastSignificantBitFirst. Here is my input (hex):
0E
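To spell out my expectation, here is 0E broken down bit by bit, taking fields
from the least significant end first (each field written most-significant-bit
first):

    0E = 0000 1110

    bits 1-2 (two-bits):   10  -> 2
    bits 3-5 (three-bits): 011 -> 3
    bits 6-8 (pad):        000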
Here is my DFDL schema:
<xs:element name="input">
<xs:complexType>
<xs:sequence>
<xs:element name="two-bits" type="unsignedint2" />
<xs:element name="three-bits" type="unsignedint3" />
<xs:sequence dfdl:hiddenGroupRef="padToByteBoundary" />
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:group name="padToByteBoundary">
<xs:sequence dfdl:alignment="8" dfdl:alignmentUnits="bits"/>
</xs:group>
Parsing produces this XML:
<input>
  <two-bits>2</two-bits>
  <three-bits>3</three-bits>
</input>
Perfect!
However, unparsing produces incorrect binary (hex):
CE
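Decoding CE the same way shows that both value fields round-trip correctly;
only the three alignment-fill bits differ:

    CE = 1100 1110

    bits 1-2 (two-bits):   10  -> 2   (correct)
    bits 3-5 (three-bits): 011 -> 3   (correct)
    bits 6-8 (pad):        110        (expected 000)

So on unparse the alignment-fill region is being filled with something other
than zero bits.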
Yikes! What am I doing wrong, please?
/Roger