> she thought it's like a cook book with an end and she kept going
> because she wanted to get an overview.

She is right about that. It is a cookbook, and it is finite at any given
time.
But it is also huge and growing. Depending on her processing speed, she
can parse it to the end (as it stands at that time) or might end up
parsing "forever" because new recipes are added faster than she parses
the old ones.

Another case would be a service that generates new recipes on request. I
imagine an arts project or expert system that starts with simple recipes
and a huge set of rules and then generates increasingly complex N-course
meals of more or less exotic (and tasty) dishes. You would certainly
reach the 10-course meal at some point (and most likely be bored to
death by then). But if the ingredient database is big enough, that would
take a really long time. Such a service would in fact provide an
infinite stream of meals (of decreasing practicality as N grows).

In the context of langsec, the former and the latter are examples of
real-world data streams that have to be processed (for the latter, you
may assume a stream of log lines instead). Obviously, they are regularly
processed by splitting them into independent chunks (in these cases,
single recipes or meals).
But if someone sends a recipe or log line of infinite length, the chunk
would be of infinite length too. So limits have to be enforced, or the
parse buffer would overflow.
And (at least in most humans) there are! She did not parse until the end
before interrupting the process and doing something else. She most
likely will not parse "forever" if she encounters an infinite or far too
long stream. And the same probably holds for single recipes: if she
encountered a recipe with a far too long list of ingredients, she would
most probably abort parsing and discard that recipe.

Every parser needs to limit input chunk size, even when the expected
stream length is infinite, because processing memory is never infinite.
And a lot of real-world applications need to limit stream lengths too.
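A minimal sketch of that rule in Python (the limit values and the name
bounded_records are my own assumptions, not from any particular parser):
read a newline-delimited stream in fixed-size pieces, and when a record
grows past a hard limit, drop it and keep going, instead of letting the
buffer grow without bound.

```python
import io

MAX_RECORD = 4096   # hard per-record limit (assumed value)
READ_SIZE = 1024    # bounded read, so memory use stays constant

def bounded_records(stream):
    """Yield newline-delimited records, skipping any that exceed MAX_RECORD.

    An over-long record is consumed and discarded, mirroring "abort
    parsing and discard that recipe": the buffer never grows far past
    the limit, no matter how long the incoming record is.
    """
    buf = bytearray()
    discarding = False
    while True:
        piece = stream.read(READ_SIZE)
        if not piece:
            if buf and not discarding:
                yield bytes(buf)
            return
        buf.extend(piece)
        # Extract every complete record currently in the buffer.
        while True:
            nl = buf.find(b"\n")
            if nl == -1:
                break
            record, buf = bytes(buf[:nl]), buf[nl + 1:]
            if discarding:
                discarding = False          # tail of the over-long record
            elif len(record) <= MAX_RECORD:
                yield record
        # No newline yet and already over the limit: enforce it.
        if len(buf) > MAX_RECORD:
            buf.clear()
            discarding = True
```

A record of infinite length now costs at most MAX_RECORD + READ_SIZE
bytes of buffer space; without the final check, it would grow forever.

```python
data = b"short one\n" + b"x" * 10000 + b"\nanother short\n"
records = list(bounded_records(io.BytesIO(data)))
# The 10000-byte record is discarded; the short ones survive.
```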

It all seems obvious at first. But a look at all the buffer overflows
out there shows that it obviously is not... :(

But you might be glad that your mother did not overflow while parsing
that looooong stream (which of course is long). ;)



-- 
Allan Wegan
Jabber: allanwe...@erdor.de
ICQ: 209459114


_______________________________________________
langsec-discuss mailing list
langsec-discuss@mail.langsec.org
https://mail.langsec.org/cgi-bin/mailman/listinfo/langsec-discuss
