Sorry for the delay in thanking you, Jakob -- I'm on digest.
The XML file is undoubtedly less than well-formed, but so were the
previous files, unfortunately.
Your suggestion of using rxp is gratefully received. I've tried several,
including XML Spy, but found them extremely slow to u
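[An aside not in the original thread: a quick well-formedness check can be scripted with Python's bundled expat bindings, similar in spirit to running a lightweight parser such as rxp over the file before handing it to FM. The filename below is a placeholder, and this checks syntax only, not DTD validity.]

```python
# Minimal well-formedness check using Python's standard-library expat parser.
# It streams the input, so even a very large file is checked in seconds.
import xml.parsers.expat

def is_well_formed(path):
    """Return (True, None) if the file parses, else (False, reason)."""
    parser = xml.parsers.expat.ParserCreate()
    try:
        with open(path, "rb") as f:
            parser.ParseFile(f)
        return True, None
    except xml.parsers.expat.ExpatError as err:
        # The error message includes the line and column of the first problem.
        return False, str(err)
```

Running this over the big instance first would at least separate "the file is broken" from "FM can't cope with it".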
FM 7.2, P4 with dual AMD 2 GHz CPUs and 2 GB fast DRAM, 110 GB free HDD
space. All processing done locally (i.e., off the network).
I'm still hacking away at my DocBook project and learning structured
Frame. Until now I've been developing my FM and WebWorks templates using
a subset of the data
... this is our biggest hurtle.
What a great word!
When a hurdle is causing great pain, then it is a hurtle!
The word also applies if a hurdle is making your progress very slow
indeed.
;-)
I need some coffee.
john
[...ollye=clearpath.cc at lists.frameusers.com] On Behalf
Of Lynne A. Price
Sent: Monday, February 06, 2006 9:16 AM
To: John Pitt; framers@lists.frameusers.com
Subject: Re: Importing a very large DocBook XML file kills FM 7.2
At 12:48 AM 2/6/2006, John Pitt wrote:
>Q3 Is there anything I can do? We would prefer not to cut up the xml
>file as we want all chapters to talk to each other without spending long
>periods resolving broken xrefs.
John,
On 06/02/06, John Pitt wrote:
> Q3 Is there anything I can do? We would prefer not to cut up the xml
> file as we want all chapters to talk to each other without spending long
> periods resolving broken xrefs.
coming from the XML end of things, my first question would be: Is the
XML file
At 08:47 AM 2/6/2006, Mollye Barrett wrote:
How do you recommend tweaking the DTD to avoid losing the xrefs? We're
round-tripping and this is our biggest hurtle.
Mollye,
Support for external xrefs during XML import/export was introduced in FM
7.1. Sorry, I'm about to catch a plane and don't
xref "lists" that are pointing to the main chapter). The previous xml
files were about 2 Mb; the latest file is 24Mb.
Both of my attempts to open/convert the big bugger have failed. After
about 90 min, Task Manager's Performance window shows a rapid rise in MEM
Usage from about 700Mb to 2.15 G
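[A hedged aside, not from the thread: one way to see whether the memory blow-up is FM-specific is to stream the same file with an incremental parse, which holds only one element at a time. The tag name assumes DocBook `chapter` elements; adjust it for your DTD.]

```python
# Stream a large XML file and count elements with a given tag,
# clearing each element after its end tag so memory stays flat
# regardless of file size.
import xml.etree.ElementTree as ET

def count_tag(path, tag):
    n = 0
    for _event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            n += 1
        elem.clear()  # drop children so the tree never accumulates
    return n
```

If this walks the whole 24 Mb instance in flat memory, the growth to 2+ Gb is FM building its internal structures, not the parse itself.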
At 12:48 AM 2/6/2006, John Pitt wrote:
Q3 Is there anything I can do? We would prefer not to cut up the xml file
as we want all chapters to talk to each other without spending long
periods resolving broken xrefs.
John,
Forgot to add that FM does support importing XML documents with xrefs to
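[Not a suggestion from the thread, but if the file ever does have to be split, XInclude is one common way to keep chapters in separate files while still assembling a single instance before import, so IDs and xrefs resolve as before. A minimal sketch with hypothetical file names, to be assembled by any XInclude-aware processor:]

```xml
<!-- book.xml: assembles the (hypothetical) chapter files into one instance -->
<book xmlns:xi="http://www.w3.org/2001/XInclude">
  <xi:include href="chapter-main.xml"/>
  <xi:include href="chapter-lists.xml"/>
</book>
```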
At 12:48 AM 2/6/2006, John Pitt wrote:
Until now I've been developing my FM and WebWorks templates using a
subset of the data available, which produces a book of about 1250 pages
(one chapter of >1000pp and 4 smaller ones which mostly consist of xref
"lists" that are pointing to the main chapt