Ramaswamy wrote:
But the problem I faced is that the ambiguity is within the same context: not an ambiguity between, say, SequenceValue and ObjectIdentifierValue, but within the DefinedSyntax itself. DefinedSyntax is basically a sequence of the production 'Setting' separated by commas. So when the parser sees a Type followed by, let's say, a SequenceOfValue, it cannot determine whether that is a Type followed by a value, or a single parameterized type.
You cannot determine at parse time whether the sentence "T{1,2}" is a single setting representing a type instantiation, or a type setting "T" followed by a value setting "{1,2}".
If you have collected the tokens as per method (a) that I mentioned previously, then parsing "T{1,2}" is trivial once the full type information is known: you will know whether a type setting is expected, whether that type setting is parameterized, and how to parse the tokens accordingly.
If you are trying to parse a setting completely, without context information (as per method (b)), then any sentence that has "{" immediately after a Type is ambiguous. It is best to assume a type followed by an ArgumentList/DefinedSyntax/whatever. Thus, the settings are split up as much as possible. When semantic analysis is performed later, the settings can be recombined if necessary.
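To make method (b) concrete, here is a minimal sketch (all names hypothetical, not from any real ASN.1 tool) of splitting a DefinedSyntax fragment maximally: a type reference and a following braced group become two separate settings, with the braced group left marked as ambiguous for later analysis.

```python
import re

# Tiny tokenizer covering only the examples in this message:
# identifiers, numbers, braces, commas, and "::=".
TOKEN = re.compile(r"\s*([A-Za-z][A-Za-z0-9-]*|\d+|\{|\}|,|::=)")

def tokenize(text):
    tokens, pos = [], 0
    while pos < len(text):
        m = TOKEN.match(text, pos)
        if not m:
            raise ValueError(f"unexpected input at {text[pos:]!r}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

def split_settings(tokens):
    """Split as much as possible: never commit to "T{...}" being one
    setting; emit the braced group as a separate, ambiguous setting."""
    settings, i = [], 0
    while i < len(tokens):
        if tokens[i] == "{":
            # Collect the balanced braced group as one ambiguous setting.
            depth, j = 1, i + 1
            while depth:
                depth += {"{": 1, "}": -1}.get(tokens[j], 0)
                j += 1
            settings.append(("ambiguous", tokens[i:j]))
            i = j
        else:
            settings.append(("typeref", tokens[i]))
            i += 1
    return settings

print(split_settings(tokenize("T{1,2}")))
# -> [('typeref', 'T'), ('ambiguous', ['{', '1', ',', '2', '}'])]
```

The point of the sketch is only the splitting discipline: the parser emits the smallest pieces it can, and nothing here decides whether "{1,2}" is a value or an argument list.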
The parser I'm messing about with at the moment splits the settings as described above. For example, the following input:
Foo DEFINITIONS ::= BEGIN
T ::= T{1,2}
v T ::= {TYPE T{1,2} IDENTIFIED BY {}}
END
is parsed and printed back as follows:
-- ASN.1 for module 'Foo'
Foo DEFINITIONS IMPLICIT TAGS ::=
BEGIN
EXPORTS ALL;
-- no imports
T ::= T{1,2}
v T ::= {TYPE T {1,2} IDENTIFIED BY {}}
END
Note that the sentence "T{1,2}" occurs twice: the first time, it is recognized as a type instantiation. The second time, it is split into a type reference T and an ambiguous value {1,2}. The semantic analyzer is responsible for converting the value {1,2} into an argument list if it turns out that arguments are expected.
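The recombination step could be sketched like this (again with hypothetical names): once the semantic analyzer knows which type references name parameterized types, it merges an adjacent ambiguous braced group back into a single instantiation, and otherwise leaves it as a value.

```python
def recombine(settings, parameterized_types):
    """Merge ('typeref', T) followed by an ambiguous braced group into
    one instantiation when T is known to take parameters; otherwise
    keep the split produced by the parser."""
    out, i = [], 0
    while i < len(settings):
        kind, val = settings[i]
        nxt = settings[i + 1] if i + 1 < len(settings) else None
        if (kind == "typeref" and val in parameterized_types
                and nxt is not None and nxt[0] == "ambiguous"):
            # Arguments were expected: the braced group is an argument list.
            out.append(("instantiation", val, nxt[1]))
            i += 2
        else:
            out.append(settings[i])
            i += 1
    return out

# The split form of "T{1,2}" as the parser would emit it:
split = [("typeref", "T"), ("ambiguous", ["{", "1", ",", "2", "}"])]

print(recombine(split, parameterized_types={"T"}))  # merged into one instantiation
print(recombine(split, parameterized_types=set()))  # left split: type, then value
```

This mirrors the behaviour described above: the same token sequence is interpreted one way when the context says a parameterized type is expected, and the other way when it is not.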
Cheers, Geoff
