I have these types:

TYPE1 ::= SEQUENCE { x1 [0] INTEGER,
                     x2 [1] TYPE2,
                     x3 [2] TYPE3 }

TYPE2 ::= SEQUENCE { y1 [0] INTEGER OPTIONAL,
                     y2 [1] INTEGER OPTIONAL,
                     y3 [2] SEQUENCE { yy1 [0] INTEGER,
                                       yy2 [1] INTEGER } OPTIONAL }

TYPE3 ::= CHOICE { z1 [0] INTEGER,
                   z2 [1] INTEGER,
                   z3 [2] INTEGER }

Is the tagging I entered correct, or does it in fact have to be like this:

TYPE1 ::= SEQUENCE { x1 [0] INTEGER,
                     x2 [1] TYPE2,
                     x3 [7] TYPE3 }

TYPE2 ::= SEQUENCE { y1 [2] INTEGER OPTIONAL,
                     y2 [3] INTEGER OPTIONAL,
                     y3 [4] SEQUENCE { yy1 [5] INTEGER,
                                       yy2 [6] INTEGER } OPTIONAL }

TYPE3 ::= CHOICE { z1 [8] INTEGER,
                   z2 [9] INTEGER,
                   z3 [10] INTEGER }

The reason I am asking is that, in my first case, I am confused about how the
chosen value of the CHOICE type should be encoded under BER.
If the AUTOMATIC TAGS clause is not used in the module header, it is
recommended to add a context-specific tag in front of every CHOICE type.

If TYPE2 omits its optional y3 member (whose encoding would have started with
[A2]: context-specific, constructed, tag number 2, unique within TYPE2), the
next thing in the stream is the encoding of TYPE3, which also starts with
[A2]: context-specific, constructed, tag number 2, unique within TYPE1.
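To make the bytes I have in mind concrete, here is a rough Python sketch that
hand-builds the BER TLVs for one sample value (x1 = 5, y1 = 1, y2 and y3
absent, z1 = 7 chosen). It assumes IMPLICIT TAGS as the module default (with
the usual rule that a tag on a CHOICE is still encoded explicitly), uses
single-byte values and short-form lengths only, and the helper names are mine:

```python
def tlv(tag, content):
    """Build a definite-length BER TLV (short-form length only)."""
    assert len(content) < 0x80
    return bytes([tag, len(content)]) + content

def integer(tag, value):
    """Implicitly tagged primitive INTEGER, single positive byte only."""
    return tlv(tag, bytes([value]))

# TYPE2 value with only y1 present (y2 and y3 absent)
type2 = integer(0x80, 1)                      # y1 [0] INTEGER = 1

# x3: a tag on a CHOICE is always explicit, so [2] wraps the chosen z1 [0]
x3 = tlv(0xA2, integer(0x80, 7))              # z1 chosen, value 7

type1 = tlv(0x30, integer(0x80, 5)            # x1 [0] INTEGER = 5
                  + tlv(0xA1, type2)          # x2 [1] TYPE2
                  + x3)                       # x3 [2] TYPE3

print(type1.hex(' '))
# 30 0d 80 01 05 a1 03 80 01 01 a2 03 80 01 07
# Note: y3, if present, would also begin with tag octet 0xA2 --
# the same first octet as x3, just one nesting level deeper.
```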

Obviously, this creates confusion on the decoding side. I guess it all boils
down to the following question: do those tags have to be unique at the
sequence/choice level, or at a "global" level?

Any help would be greatly appreciated.

Thanks,
Eddie

_______________________________________________
ASN1 mailing list
[email protected]
http://lists.asn1.org/mailman/listinfo/asn1
