I mean O(C), where C is a constant amount of memory.

This paragraph, taken from the PDF spec (sec. 2.3.5, p. 24), makes me
believe it is possible to produce PDFs in (approximately) constant memory:

  "Because of system limitations and efficiency considerations,
   it may be desirable or necessary for an implementation of a
   program that produces PDF to create a PDF file in a single
   pass.  This may be, for example, because the application has
   access to limited memory or is unable to open temporary
   files.  For this reason, PDF supports single-pass generation
   of files.  While PDF requires certain objects to contain a
   number specifying their length in bytes, a mechanism is
   provided allowing the length to be located in the file after
   the object.  In addition, information such as the number of
   pages in the document can be written into the file after all
   pages have been written into the file."

While this doesn't specifically say, "You can generate PDFs in constant
memory," it seems to imply it.

So it seems to me that if a PDF generator can run in O(C) memory, that C
would be roughly the memory required to generate a single page, plus some
fixed overhead.
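
In other words, something like the loop below (pure speculation on my
part about how a generator could be structured; none of these names are
FOP's): peak memory stays at one page's worth of data no matter how many
pages stream through.

import java.io.IOException;
import java.io.OutputStream;
import java.util.Iterator;
import java.util.function.Function;

// Hypothetical driver loop: only the current page is ever held in
// memory, so peak usage is bounded by the largest page, not by the
// page count.
final class StreamingReport {
    static long run(Iterator<String> pageData,
                    Function<String, byte[]> layoutOnePage,
                    OutputStream out) throws IOException {
        long pageCount = 0;
        while (pageData.hasNext()) {
            byte[] page = layoutOnePage.apply(pageData.next()); // one page
            out.write(page);       // written out, then eligible for GC
            pageCount++;
        }
        return pageCount;          // total known only after the last page
    }
}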

The reason I ask is that when generating a relatively simple PDF of around
100 pages, it looked like FOP used somewhere between 100 and 200 MB of
memory (I forget the exact amount).  I'm trying to evaluate the scalability
of FOP for use as a real-time report generation tool, so its memory usage
is obviously important to know.

Having a lot of fun working with FOP.  I can't believe how easy it is to use
and what an impressive job it already does.  Great work to all the
developers!  Good luck!

Thanks,

Chris Rorvick
