Hello,

I have a LaTeX document which contains hundreds of PDF images. After
compiling the document, I get a PDF file which is too big: ~10 MB
instead of the expected ~1 MB. Investigating the issue, I found that
each image consists of two parts: ~10 KB for the image itself and
~75 KB of preamble, such as a subset of Helvetica Condensed. LaTeX
embeds the PDF images as is, so the final PDF document contains
hundreds of very similar preambles.

I'd like to optimize the PDF document by removing these duplicated
preambles using iText. However, I don't have experience with this
library and can't estimate whether the task is realistic or how long
it would take. What is your opinion?
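For context, here is the direction I am imagining — a sketch only, which I have not tested. It assumes iText 2.x's PdfSmartCopy class, which is documented to detect identical indirect objects while copying pages and store each distinct object only once; the file names in.pdf and out.pdf are placeholders. Note this would only help if the duplicated font subsets are byte-identical across images.

```java
import java.io.FileOutputStream;

import com.lowagie.text.Document;
import com.lowagie.text.pdf.PdfReader;
import com.lowagie.text.pdf.PdfSmartCopy;

public class DedupFonts {
    public static void main(String[] args) throws Exception {
        // Read the bloated document produced by LaTeX (placeholder name).
        PdfReader reader = new PdfReader("in.pdf");

        // PdfSmartCopy compares indirect objects as it copies pages and
        // writes each distinct object only once, so many identical font
        // subsets should collapse into a single embedded copy.
        Document document = new Document();
        PdfSmartCopy copy =
            new PdfSmartCopy(document, new FileOutputStream("out.pdf"));
        document.open();
        for (int i = 1; i <= reader.getNumberOfPages(); i++) {
            copy.addPage(copy.getImportedPage(reader, i));
        }
        document.close();
        reader.close();
    }
}
```

Would a post-processing pass like this be the right approach, or is there a better way to share the font resources across the images?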

Thanks.



-- 
Oleg Parashchenko  olpa@ http://uucode.com/
http://bbAntiSpam.com/   Universal antispam for forums, blogs, etc
http://uucode.com/blog/  Generative Programming, XML, TeX, Scheme

_______________________________________________
iText-questions mailing list
iText-questions@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/itext-questions

