Hi, I'm doing a largish tabular dump from a database query and it works fine (I've even produced some 600+ MB files) when split up into smaller tables. However, profiling indicates that quite a bit of time is spent adding a cell to a table. Is there, for example, some measuring done in the table that could be deferred to a later point to improve performance? Currently I get (yes, I know, benchmarks are quite relative) about 90 ms/cell when reading from a fast DB. Any suggestions other than "insert faster CPU and press any key to continue"?
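
For reference, here is roughly what the dump loop looks like (heavily simplified; the query, column count, and class name are just placeholders, not my real code). Profiling points at the addCell() call inside the inner loop:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;

    import com.lowagie.text.Document;
    import com.lowagie.text.Phrase;
    import com.lowagie.text.pdf.PdfPCell;
    import com.lowagie.text.pdf.PdfPTable;

    public class TableDump {
        // 'document' is assumed to be open with a PdfWriter attached elsewhere.
        public void dump(Connection conn, Document document) throws Exception {
            final int columns = 8;  // placeholder column count
            PdfPTable table = new PdfPTable(columns);
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT * FROM big_table");  // placeholder query
            while (rs.next()) {
                for (int c = 1; c <= columns; c++) {
                    // This is where the ~90 ms/cell shows up in the profile.
                    table.addCell(new PdfPCell(new Phrase(rs.getString(c))));
                }
            }
            rs.close();
            stmt.close();
            document.add(table);
        }
    }
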
--
Nik
