Hey guys (my emails were tidier, but Apache's spam protection is ridiculous :)),

But anyway ... 

Fair enough on your response, kindaian – thanks for your answer (and you too, 
Jeremy!)

Perhaps the software I am using for the PDF-to-PCL conversion could be 
improved. I tried PostScript, but so far it has been processing for 21 hours, 
and my 79 MB PDF file is now a 94 GB PostScript file! (That can't be right, 
surely?)

PCL takes about 6 hours for roughly 20,000 pages (5,333 recipients * four 
pages), with every 4th and 5th recipient in duplexing mode and the 1st page 
pulling from the letterhead tray, the rest from the plain-paper tray. The 
current process creates 20,000 files and appends them to one larger file (this 
step is fast), marking the duplexing and paper requirements as it aggregates 
them.

So it seems my format options are somewhat limited: PCL/PS/AFP [we call the AFP 
the coppers over here ☺].

The XSLT -> FO and FO -> PDF processes work quickly. It’s simply the PDF -> 
PCL/AFP/PS stage which is killing me.

I'm using Ghostscript for this, via its standard PDF2PS.bat script (and a 
slightly hacked version of it for PCL).
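For comparison, it may be worth calling gs directly rather than through the 
.bat wrapper, so the output device and resolution are explicit. A sketch, 
assuming a reasonably recent Ghostscript (the file names and the 300 dpi 
setting are placeholders, not from my actual job):

```shell
# PDF -> PostScript using the ps2write device
# (-o sets the output file and implies -dBATCH -dNOPAUSE)
gs -sDEVICE=ps2write -o out.ps in.pdf

# PDF -> PCL XL (PCL 6) using the pxlmono device; -r sets the resolution
gs -sDEVICE=pxlmono -r300 -o out.pcl in.pdf
```

If the enormous PostScript file is coming from images being expanded, a device 
like ps2write that emits higher-level PostScript may keep the size down, but 
that's a guess without seeing the job.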

Can you recommend better software for this purpose? Or is converting to PDF 
first potentially hazardous to performance for my desired output type (PCL or 
PS)?

Like, maybe XSL -> FO -> SOMETHING -> PCL, or even XSL -> FO -> PCL directly?
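On that last idea: if Apache FOP is already doing the FO -> PDF step, FOP can 
also render PCL and PostScript directly, which would skip the PDF and 
Ghostscript stages entirely. A sketch of the command line (file names are 
placeholders, and you'd want to verify the PCL renderer supports the duplex and 
tray selections you need):

```shell
# FO straight to PCL, no PDF intermediate
fop -fo letters.fo -pcl letters.pcl

# Or run XML + XSLT -> FO -> PCL in one pass
fop -xml data.xml -xsl letters.xsl -pcl letters.pcl

# PostScript output works the same way
fop -fo letters.fo -ps letters.ps
```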

Thanks guys, your help is most appreciated!
Martin.

-------------------------------

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] 
Sent: Saturday, 31 May 2008 11:11 PM
To: [email protected]
Subject: Re: best format for high speed printing


Is there an efficient file format available that is much faster, and still
allows me to control duplexing and tray selection per page?

Some high-end printers have special markings and a control "language" (with 
barcodes) that you can use outside the printable area to control how the 
printing on that page is handled.

I've seen it used in conjunction with printing, folding, and enveloping systems 
(but I can't be more precise, because I've never used one, only watched the 
systems working at expos).

As for optimization, one thing I noticed is that if you have loads of citation 
references, document production takes hugely longer. And I'm only referring to 
FO -> PDF production, mind.

For further optimization, you can also decouple the XML+XSLT -> FO and 
FO -> output stages. Moving from a 32-bit to a fully fledged 64-bit environment 
may also allow further improvements (and lift the memory-allocation limit that 
Java hits in all 32-bit applications). In a 32-bit environment with 4 GB of RAM 
in the machine, I wasn't able to use more than 1.2 GB of RAM in Java.

The project I was involved in was a book of more than 5,000 pages, with 
extensive use of page references to produce indexes automatically. To be able 
to "render" the book, it had to be split into chapters, producing an index for 
each chapter. Not a nice hack, but it worked.

Hope this helps,

Cheers,
Luís Ferro


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
