The advice to try a different serializer is spot on.

Serialize any object tree to a file with Java's standard serializer, then open
that file in a binary editor, and you'll see why the standard Java
serialization stream takes a surprisingly large number of bytes to store each
object.
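You can see the overhead without a binary editor by just measuring the stream. A minimal sketch (the `Row` class here is a made-up example, not from the thread): a single serializable object carrying one int comes out far larger than its 4 bytes of payload, because the stream also records the magic header, the class descriptor, the class name, and the serialVersionUID.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationOverheadDemo {

    // Hypothetical example class: one int field, i.e. 4 bytes of actual data.
    static class Row implements Serializable {
        private static final long serialVersionUID = 1L;
        int value;
        Row(int value) { this.value = value; }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Row(42));
        }
        // Everything beyond 4 bytes is stream and class metadata.
        System.out.println("Serialized size: " + bytes.size() + " bytes");
    }
}
```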

I had this problem in a desktop application years ago. We were pulling in
.CSV files, converting each row to an object, and then serializing the lot.
We were getting massive 15 MB files from relatively small .CSV files. For
most small objects, the header information stored in the stream is orders of
magnitude larger than the space taken up by the object's attributes.
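A rough sketch of that comparison, under the assumption that each CSV row maps to a small object (the three-int `Row` below is hypothetical): serializing a list of such objects with `ObjectOutputStream` produces a noticeably larger stream than writing the raw field values with `DataOutputStream`, which is essentially what a leaner serializer buys you.

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

public class RowSizeComparison {

    // Hypothetical stand-in for a parsed CSV record: 12 bytes of actual data.
    static class Row implements Serializable {
        private static final long serialVersionUID = 1L;
        int a, b, c;
        Row(int a, int b, int c) { this.a = a; this.b = b; this.c = c; }
    }

    public static void main(String[] args) throws IOException {
        List<Row> rows = new ArrayList<>();
        for (int i = 0; i < 1000; i++) {
            rows.add(new Row(i, i, i));
        }

        // Standard Java serialization of the whole list.
        ByteArrayOutputStream std = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(std)) {
            out.writeObject(rows);
        }

        // Writing only the field values: 1000 rows * 3 ints * 4 bytes.
        ByteArrayOutputStream raw = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(raw)) {
            for (Row r : rows) {
                out.writeInt(r.a);
                out.writeInt(r.b);
                out.writeInt(r.c);
            }
        }

        System.out.println("ObjectOutputStream: " + std.size() + " bytes");
        System.out.println("DataOutputStream:   " + raw.size() + " bytes");
    }
}
```

The gap is smaller than the worst case because `ObjectOutputStream` writes the class descriptor only once and back-references it for subsequent objects, but the per-object overhead still adds up across thousands of rows.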

>-----Original Message-----
>From: Per []
>Sent: Saturday, 25 February 2012 12:13 PM
>Subject: Re: Performance optimization
>Martin Makundi wrote
>> The problem is that the SERIALIZATION takes time. So it does not help
>> ZIP AFTER serialization...
>Well, if you really only have one page in your session, and that page's
>serialisation is killing you, then you're right. But if you have
>page versions and other pages in your session, and your session is,
>say, 50 MB, then the zipping might help: not for this particular page,
>but for all the *others* that also have to be read and restored.
>Also, have you considered trying other serialisers? I'm not an expert
>on that topic, but I've heard from other developers that there are faster
>libraries. They have tradeoffs, but maybe one of them works for you.
