FM 7.2, P4 with dual AMD 2 GHz CPUs and 2 GB of fast DRAM, 110 GB free HDD space. All processing done locally (i.e., off the network).

I'm still hacking away at my DocBook project and learning structured Frame. Until now I've been developing my FM and WebWorks templates with a subset of the available data, which produces a book of about 1,250 pages (one chapter of more than 1,000 pages and four smaller ones that consist mostly of xref "lists" pointing to the main chapter). The previous XML files were about 2 MB; the latest file is 24 MB.
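
For what it's worth, a quick way to gauge how much structure is in the new file before handing it to Frame is to stream-parse it and count elements. A rough sketch, assuming Python with lxml installed; "book.xml" stands in for the real export:

    # count_elements.py -- rough gauge of how much structure is in the file.
    # Streams the document, so the whole tree is never held in memory.
    from collections import Counter
    from lxml import etree

    counts = Counter()
    for _event, elem in etree.iterparse("book.xml", events=("end",)):
        counts[elem.tag] += 1
        elem.clear()                      # drop this element's own content
        while elem.getprevious() is not None:
            del elem.getparent()[0]       # and any finished siblings
    print("total elements:", sum(counts.values()))
    for tag, n in counts.most_common(10):
        print(tag, n)

If the element count has ballooned out of proportion to the 2 MB-to-24 MB jump, that would point at the data rather than at Frame.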

Both of my attempts to open/convert the big bugger have failed. After about 90 minutes, Task Manager's Performance window shows a rapid rise in Mem Usage from about 700 MB to 2.15 GB, and Frame displays one of its "Hmmm. Dunno what's happening, but send us the error file and we'll look at it (but we won't let you know the outcome)" messages. Dismissing that message kills Frame.
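
For anyone who wants to watch that climb without babysitting Task Manager, the process's working set can be logged from outside Frame. A minimal sketch, assuming Python with the psutil package; "FrameMaker.exe" is my guess at the image name, so check what Task Manager actually shows:

    # log_fm_memory.py -- poll FrameMaker's working set once a second.
    import time
    import psutil

    PROC = "FrameMaker.exe"   # assumption -- confirm the name in Task Manager
    while True:
        for p in psutil.process_iter(["name", "memory_info"]):
            if p.info["name"] == PROC:
                mb = p.info["memory_info"].rss / (1024 * 1024)
                print(time.strftime("%H:%M:%S"), round(mb, 1), "MB")
        time.sleep(1)

A timestamped log would at least show whether the growth is steady for the whole 90 minutes or spikes right before the crash.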

Q1 Is the file simply too big for Frame? I can remember opening bigger files in FM 5.1 on UNIX.

Q2 Will I gain anything by grovelling for another GB or two of DRAM?

Q3 Is there anything else I can do? We would rather not cut up the XML file, as we want all the chapters to talk to each other without spending long stretches resolving broken xrefs.
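
(If cutting up the file ever becomes unavoidable, DocBook's modular-document machinery might keep the chapters talking to each other anyway: split the chapters into their own files, leave every ID in place, and generate a top-level book that XIncludes them, so any processor that assembles the whole book still resolves every xref. A rough sketch, assuming Python with lxml, a namespace-free DocBook book whose chapters sit directly under the root, and made-up filenames:

    # split_book.py -- split a DocBook book into per-chapter files plus a
    # top-level book that pulls them back in with XInclude, so xref/@linkend
    # targets still resolve when the assembled book is processed as a whole.
    from lxml import etree

    XI = "http://www.w3.org/2001/XInclude"

    tree = etree.parse("book.xml")          # the monolithic 24 MB export
    book = tree.getroot()

    for i, chap in enumerate(book.findall("chapter"), start=1):
        fname = "chapter%02d.xml" % i
        etree.ElementTree(chap).write(fname, xml_declaration=True,
                                      encoding="UTF-8")
        # Swap the inline chapter for an XInclude pointing at the new file.
        incl = etree.Element("{%s}include" % XI, href=fname)
        book.replace(chap, incl)

    tree.write("book-master.xml", xml_declaration=True, encoding="UTF-8")
    # etree.parse("book-master.xml").xinclude() reassembles the whole book,
    # so downstream tools still see one document with intact IDs.

Whether Frame itself honours XInclude on import is another question, so this would need testing against the full toolchain.)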

jjj



_________________________
John Pitt, technical writer
47 Gottenham St
Glebe NSW 2037
Ph: 02 9692 8096
Mob: 0438 92 8096
[EMAIL PROTECTED]
www.pitt.net.au
