On 4/11/2019 12:28 PM, Peter Rolf wrote:

> I have a compilation problem with a data driven document.
> This is what I get after around 12 min (first run) ...

I'm still puzzled by this 12 min ... that is less than one page per second. Do you need to process all that data each run? It's been over a decade since I could start a run and come back after an enforced break to check. Is there some memory build-up (swapping)? Is the speed linear?


Will show the time spent per page.

> pages           > flushing realpage 991, userpage 970, subpage 991
> pages           > flushing realpage 992, userpage 971, subpage 992
> pages           > flushing realpage 993, userpage 972, subpage 993
> mtx-context     | fatal error: return code: -1073741571
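As an aside on the fatal error above: on Windows, a negative return code like this is usually an NTSTATUS value printed in two's complement. A quick conversion (a minimal Python sketch, nothing ConTeXt-specific) shows which status it is:

```python
# Interpret a negative Windows exit code as an unsigned 32-bit NTSTATUS value.
code = -1073741571        # the return code reported by mtx-context above

# Two's-complement reinterpretation: mask down to 32 bits.
ntstatus = code & 0xFFFFFFFF

print(hex(ntstatus))      # prints 0xc00000fd
```

0xC00000FD is STATUS_STACK_OVERFLOW, which would fit the observation below that the run only dies once the document grows past a certain size.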

> It compiles fine if I just use the first or second "half" of the data,
> but as soon as I reach a certain number of pages, it crashes.
> An older version compiled fine with 958 user pages (977 total). It seems
> that the data has grown to a size that causes a resource problem.

> Any advice? I looked into "texmf.cnf" and "texmfcnf.lua" for possible
> bottlenecks, but I lack the knowledge of what to change. And it simply
> takes too much (run)time to just play around with some values.
I'll add a \showusage command that reports some stats per page, so that you can see what grows out of control.


                                          Hans Hagen | PRAGMA ADE
              Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
       tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
If your question is of interest to others as well, please add an entry to the wiki!

maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://context.aanhet.net
archive  : https://bitbucket.org/phg/context-mirror/commits/
wiki     : http://contextgarden.net
