Orlando wrote:
What are you talking about when you say exponential heap? With the most
naive implementation possible, the largest your memoization heap could be
is (n * m). That's linear!
Not O(n log n), not O(n^2), not O(n^3), not O(n^m) and certainly not
O(m^n).
Ok, I was going to let this go, seeing that it was useful to Terence,
but my first reaction was, what on earth are they teaching these days?
Then the lecturing "Not..." line... Gee. Is this a list of all the
non-linear complexities you could think of? Here are a couple you forgot:
O(n!), not linear. O(n*m), not linear.
It could be that you were thinking of O(n+m), the complexity of
depth-first search, or that you were recalling reports that the n*m table
in the original packrat algorithm is very sparse when parsing, e.g., Java
or C. In fact the number of rule results actually cached per input
position seems to be bounded by a small constant, so for those grammars
and inputs the storage overhead is for all practical purposes O(n).
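To spell out what that table is: a packrat memo is essentially a map from
(rule, input position) to a cached result, so its capacity is one entry
per rule per position, i.e. n*m. A minimal sketch in Python (the names
are made up for illustration, not taken from any particular
implementation):

# Minimal sketch of a packrat memo table; names are invented for
# illustration, not taken from any existing parser generator.
# Worst case it holds one entry per (rule, position) pair: n * m entries.
memo = {}

def apply_rule(rule_name, pos, parse_fn):
    """Memoize parse_fn(pos), caching successes and failures alike."""
    key = (rule_name, pos)
    if key not in memo:
        memo[key] = parse_fn(pos)   # result: an end position, or None
    return memo[key]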
But it is still possible to write a PEG and find input for which almost
every rule successfully matches at almost every input position, which
means that packrat storage cost is still O(n*m) in the worst case, i.e.
non-linear. Here's one simple example:
start -> a+ 'z'
a -> b 'x' / c
b -> 'a'
c -> 'a'
Rules a, b and c each successfully match at every position except the
final 'z' in any input described by the regular expression /a*z/.
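If anyone wants to watch the table fill up, here is a throwaway
packrat-style recognizer for that grammar in Python (a rough sketch
written for this post, not code from any existing library). On inputs
matching /a*z/ it caches a result for each of a, b and c at every
position:

def run(text):
    # Illustrative packrat sketch for the grammar above, not a real
    # library.  memo maps (rule, pos) to the end position of a
    # successful match, or to None on failure; both outcomes are cached.
    memo = {}

    def cached(rule, pos, fn):
        key = (rule, pos)
        if key not in memo:
            memo[key] = fn(pos)
        return memo[key]

    def b(pos):  # b -> 'a'
        return cached('b', pos, lambda p: p + 1 if text[p:p + 1] == 'a' else None)

    def c(pos):  # c -> 'a'
        return cached('c', pos, lambda p: p + 1 if text[p:p + 1] == 'a' else None)

    def a(pos):  # a -> b 'x' / c
        def body(p):
            end = b(p)
            if end is not None and text[end:end + 1] == 'x':
                return end + 1
            return c(p)  # ordered choice: fall back to c
        return cached('a', pos, body)

    def start(pos):  # start -> a+ 'z'
        end = a(pos)
        if end is None:
            return None
        while True:
            nxt = a(end)
            if nxt is None:
                break
            end = nxt
        return end + 1 if text[end:end + 1] == 'z' else None

    return start(0) == len(text), len(memo)

for k in (10, 100, 1000):
    text = 'a' * k + 'z'
    print(len(text), run(text))  # three memo entries per input symbol

With only these three rules the constant factor is 3, but write more
alternatives in the same pattern and the cached-entries-per-position
figure grows with the size of the grammar, which is exactly the n*m
behaviour.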
Bob
_______________________________________________
PEG mailing list
PEG@lists.csail.mit.edu
https://lists.csail.mit.edu/mailman/listinfo/peg