https://issues.apache.org/bugzilla/show_bug.cgi?id=45778


Antti Koskimäki <[EMAIL PROTECTED]> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|RESOLVED                    |REOPENED
         Resolution|FIXED                       |




--- Comment #2 from Antti Koskimäki <[EMAIL PROTECTED]>  2008-10-24 00:05:28 PST ---
> I re-ran your test (cloning 50 sheets from a 22KB input workbook).
> Here are the resulting file sizes:
> before fix: 13,101KB
> after fix:     588KB

Thanks, verified that this is OK.

But when I tried to stress-test the fix, I noticed that with more cloning
iterations the workbook got corrupted at some point (at 83 clones, to be
exact :=). A workbook without auto-filters does not seem to get corrupted,
not even with 1000 clones.
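The stress test amounts to something like the following (a minimal sketch
against POI's HSSF API; the file names and the fixed clone count are
placeholders, not the actual test code):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class CloneStressTest {
    public static void main(String[] args) throws Exception {
        // Load the input workbook (placeholder path; the real test
        // uses the 22KB workbook attached to this bug).
        HSSFWorkbook wb = new HSSFWorkbook(new FileInputStream("input.xls"));

        // Clone the first sheet repeatedly. With an auto-filter on the
        // sheet, corruption appears once the count reaches 83; without
        // auto-filters, even 1000 clones produce a valid file.
        int clones = 83;
        for (int i = 0; i < clones; i++) {
            wb.cloneSheet(0);
        }

        // Write the result; opening this file then triggers the
        // "File error, data may have been lost" pop-ups.
        FileOutputStream out = new FileOutputStream("output.xls");
        wb.write(out);
        out.close();
    }
}
```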

When I tried to open the workbook I got several "File error, data may have been
lost" pop-ups in a row, one for each (!) cloned sheet. Besides being fatal, this
is also very annoying :=)

I'm not sure how closely this relates to this bug or its fix; it seems to
reproduce with 3.2-FINAL too. But since you mentioned buffer overruns, I'm using
the same test case to reproduce it, and auto-filters seem to have something to
do with it, I decided to re-open this bug instead of reporting a new one.




-- 
Configure bugmail: https://issues.apache.org/bugzilla/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
