Recently I tried iText 1.3.
Everything went smoothly until I tried to generate a BIG .rtf file, at which
point I got "java.lang.OutOfMemoryError: Java heap space" :)
This happens no matter which data cache style is used (CACHE_MEMORY or
CACHE_DISK); the result is the same (OutOfMemoryError).
A brief look at the source of RtfWriter2.java and RtfDocument.java gives the answer:

a snippet from the RtfWriter2 class:

/**
 * Closes the RtfDocument. This causes the document to be written
 * to the specified OutputStream
 */
public void close() {
    try {
        os.write(rtfDoc.writeDocument());
/*---------------^^^^^^-^^^^^^^^^^^^^^^ call of RtfDocument's method */
        os.close();
    } catch(IOException ioe) {
        ioe.printStackTrace();
    }
}

a snippet from the RtfDocument class:

/**
 * Writes the document
 *
 * @return A byte array containing the complete rtf document
 */
public byte[] writeDocument() {
    ByteArrayOutputStream docStream = new ByteArrayOutputStream();
    try {
        docStream.write(OPEN_GROUP);
        docStream.write(RtfDocument.RTF_DOCUMENT);
        docStream.write(documentHeader.write());
        data.writeTo(docStream);
        docStream.write(CLOSE_GROUP);
    } catch(IOException ioe) {
        ioe.printStackTrace();
    }
    return docStream.toByteArray();
/*  -----------------^^^^^^^^^^^^^ this array eats up a lot of RAM */
}

I don't know why it was implemented this way. It obviously puts a limit on the
.rtf file size: the entire document must fit into a single byte array on the heap.
Why not change this piece of code like this:

in the RtfWriter2 class:
/* the OutputStream of .rtf file is passed to writeDocument */ 
public void close() {
    try {
        rtfDoc.writeDocument(os);
        os.close();
    } catch(IOException ioe) {
        ioe.printStackTrace();
    }
}

in the RtfDocument class:
/* The ByteArrayOutputStream is removed. So everything from cache
 * goes directly into the .rtf OutputStream. */
public void writeDocument(OutputStream os) {
    try {
        os.write(OPEN_GROUP);
        os.write(RtfDocument.RTF_DOCUMENT);
        os.write(documentHeader.write());
        data.writeTo(os);
        os.write(CLOSE_GROUP);
    } catch(IOException ioe) {
        ioe.printStackTrace();
    }
}
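
The buffering-vs-streaming trade-off behind this patch can be illustrated
outside of iText with plain java.io classes. This is just a sketch:
`writeBuffered` and `writeStreaming` are hypothetical names standing in for the
old and new `writeDocument`, not actual iText methods.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class StreamingDemo {

    // Buffered variant (old writeDocument): the whole document is
    // materialized as one byte[] before it reaches the target stream,
    // so peak memory grows with the document size.
    static byte[] writeBuffered(byte[][] chunks) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        for (byte[] chunk : chunks) {
            buf.write(chunk);
        }
        return buf.toByteArray(); // a second full copy of the data
    }

    // Streaming variant (patched writeDocument): each chunk goes straight
    // to the target stream; peak memory is only the largest chunk.
    static void writeStreaming(byte[][] chunks, OutputStream os) throws IOException {
        for (byte[] chunk : chunks) {
            os.write(chunk);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[][] chunks = {
            "{\\rtf1 ".getBytes(), "Hello".getBytes(), "}".getBytes()
        };

        byte[] buffered = writeBuffered(chunks);

        ByteArrayOutputStream target = new ByteArrayOutputStream();
        writeStreaming(chunks, target);

        // Both paths produce identical bytes; only the memory profile differs.
        System.out.println(java.util.Arrays.equals(buffered, target.toByteArray()));
        System.out.println(new String(target.toByteArray()));
    }
}
```

With a real file the target would be a FileOutputStream, so the document never
has to exist in memory all at once.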

Using this approach I generated a 448 MB .rtf file (a simple one). Of course
it was done with the CACHE_DISK data cache style. Not very fast, but reliable :)

Eh... Now it's time to ask the local GURUs ;)
Does my approach have any limitations?

Dron.



_______________________________________________
iText-questions mailing list
iText-questions@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/itext-questions
