Hans Åberg wrote:

> > I don't understand why they did that. They even say "memory
> > exhausted" which is probably wrong in most cases. (There's enough
> > memory available, the parser just chooses not to use it.) As the
> > bash example shows, such arbitrary limits keep hitting users many
> > years after (just like the Y2K bug), so it's better to avoid them
> > from the start (like the C++ parsers do).
> 
> A stack of 10000 might have been thought of as one that would never
> be reached.

Like I said, Y2K. And 640 KB is enough for everybody. :)

> Some tests showed that compilers can optimize better than humans.
> So it's probably better to just write well-structured code, and
> then profile to find hot spots.

Certainly. I know that in my programs the parser is not the hot spot
at all. Even where it is, the dynamic reallocation of the vector
probably still isn't the hot spot. If you find a case where it is,
you can switch to deque, but I'll keep vector as the default.

Regards,
Frank
