It's always difficult to estimate whether there will be a problem; trying it out will tell you a lot. :-) Anyway, given that the latest code hasn't received much attention on memory consumption yet, there's a good chance that your document might not work. Having a lot of RAM (and an increased VM heap size) helps a lot, which you obviously don't really have. The images are probably less of a problem than the forward references. FOP has a special area tree model which can serialize pages to disk, but I've only fixed it, not done any extensive tests with it, and activating it requires Java knowledge. In the end, if your TeX solution works, use it. There's a good chance you'd run into the problems you noted below even with bigger documents and the latest FOP release.
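A minimal sketch of what "an increased VM size" means in practice, assuming a standard FOP command-line setup: the JVM's default maximum heap is quite small, so you raise it with `-Xmx`. The `FOP_OPTS` environment variable (which the stock `fop` startup script passes to the JVM) and the file names below are assumptions for illustration, not something from this thread.

```shell
# Hypothetical invocation: give the JVM a larger maximum heap before
# rendering. 300m is an example value; pick what your 500 MB box allows.
export FOP_OPTS="-Xmx300m"
fop -fo thesis.fo -pdf thesis.pdf

# Equivalent direct java call (class/jar paths are illustrative only):
# java -Xmx300m -cp fop.jar:lib/* org.apache.fop.apps.Fop -fo thesis.fo -pdf thesis.pdf
```

Note that a bigger heap only delays, not removes, the cost of forward references such as a table of contents, since unresolved pages have to stay in memory until the references are resolved.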
On 13.03.2006 00:03:56 Paul Tremblay wrote:
> I want know if FOP will run out of memory if I try to convert a thesis
> to PDF.
>
> I will be using .91 beta and am running java 1.4.1. I am using linux
> on an older box with a speed of 400 MHZ and ram of 500 Mb.
>
> The document will be about 80 pages long. It will have a table of
> contents at the beginning.
>
> The appendix will contain a table of 100 rows, with each row
> containing a small jpg graphic, probably around 100 Kb in size.
>
> Right now I am using TeX to convert the document. I would like to use
> FOP instead, but before I do all the work of changing the XSLT
> stylesheets, I want to know if FOP can handle the memory. I know that
> a table of contents at the beginning uses a lot of memory, right?
> Doesn't a table of contents require FOP to hold the whole document in
> memory? Likewise, I am not sure if the number of images (though small)
> will require too much memory.

Jeremias Maerki
