Hi Henrik,

> segfault. Hope this helps.

Thanks. Unfortunately, I cannot reproduce it. I even checked with my
special GC check setup, where a garbage collection is performed before
each 'cons'. That setup usually exposes errors in the data handling
(though such a test runs for hours).


Could it be a stack overflow, caused by that long argument? What does

   $ ulimit -s

report in your case? Does it work if you set the limit to 'unlimited'?
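
For reference, on many Linux systems the default stack limit is 8192
KiB; checking and lifting it for the current shell (and the programs
started from it) might look like this, assuming bash or a similar shell:

```shell
ulimit -s              # print the current stack limit in KiB (or "unlimited")
ulimit -s unlimited    # lift the limit for this shell and its children
# then re-run the PicoLisp program from this same shell
```

Note that 'ulimit' is a shell builtin, so the change only affects the
shell session it is executed in.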

If so, I would reconsider the design of that function. 'match'ing such
large lists is not very efficient. Can't you step through the data with
'from' and 'till', extracting individual tokens, instead of reading them
into a huge list?
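
As an untested sketch (assuming, for illustration, that the tokens are
double-quoted strings read from a file "data.txt", and that 'process' is
some handler of yours — adjust the delimiters to your actual format):

```picolisp
# Hypothetical sketch: step through the input with 'from' and 'till',
# handling one token at a time instead of 'match'ing one huge list
(in "data.txt"
   (while (from "\"")         # skip ahead to the next opening quote
      (process (till "\"" T)) # read the token, pass it to your handler
      (char) ) )              # consume the closing quote
```

This way only a single token is in memory at any time, and no deep
recursion on a long list can blow the stack.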

Cheers,
- Alex
-- 
UNSUBSCRIBE: mailto:picol...@software-lab.de?subject=unsubscribe