On 8/22/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:


I had this problem as well.
I read the whole file into memory, deflated it, and then encoded it with
Base64 (so it could be carried inside XML), which actually made it come
out bigger than the original source.
All was fine until I passed it around the JMS setup, and then I got some
serious grind going on, even when I made it an async transfer.

I then decided that, as I was going to end up moving from 1 MB files to 50 MB
and in extreme cases 2 GB files, I needed another approach. This is when I
started looking into streams in ActiveMQ, and I can tell you streams are not
explained very well and are not easy in ActiveMQ.

Basically you can stream the file onto the JMS queue, but it will be
broken up into many messages. So far, so good. The problem then appears
when you have to take the messages off the queue, one at a time, and
create a new stream to write them out to.

Just create an InputStream?

http://activemq.org/site/jms-streams.html

via InputStream in = connection.createInputStream(destination)
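Something along these lines (a rough sketch, not tested; the broker URL, queue
name and file names are just placeholders):

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

import javax.jms.Destination;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnection;
import org.apache.activemq.ActiveMQConnectionFactory;

public class JmsStreamsSketch {

    // Producer side: stream the big file onto the queue; ActiveMQ breaks it
    // up into many messages behind the scenes.
    public static void send(ActiveMQConnection connection, Destination dest,
                            String path) throws Exception {
        OutputStream out = connection.createOutputStream(dest);
        InputStream file = new FileInputStream(path);
        byte[] buffer = new byte[64 * 1024];
        int read;
        while ((read = file.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        file.close();
        out.close(); // closing flushes the final chunk onto the queue
    }

    // Consumer side: read the chunks back as one continuous stream and write
    // them wherever you like (a local file here, but it could just as well be FTP).
    public static void receive(ActiveMQConnection connection, Destination dest,
                               String path) throws Exception {
        InputStream in = connection.createInputStream(dest);
        OutputStream file = new FileOutputStream(path);
        byte[] buffer = new byte[64 * 1024];
        int read;
        while ((read = in.read(buffer)) != -1) {
            file.write(buffer, 0, read);
        }
        file.close();
        in.close();
    }

    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory factory =
            new ActiveMQConnectionFactory("tcp://localhost:61616");
        ActiveMQConnection connection =
            (ActiveMQConnection) factory.createConnection();
        connection.start();

        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Destination queue = session.createQueue("BIG.FILES");

        send(connection, queue, "big-input.bin");
        receive(connection, queue, "big-output.bin");

        connection.close();
    }
}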


In my case that means another stream for FTP, as I do not really want to
write temporary 2 GB files on my base OS (Windows). This is a real pain and
not really documented anywhere properly. Even in the ActiveMQ API there is
no explanation of what the stream classes do, let alone examples!

See the page above
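
For the FTP leg you should be able to hand the JMS InputStream straight to the
FTP client, so nothing ever lands on the local disk. For example with Commons
Net (just a sketch of one way to do it; the host, credentials and file name
are made up):

import java.io.InputStream;

import javax.jms.Destination;

import org.apache.activemq.ActiveMQConnection;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class QueueToFtp {
    // Streams a file off the queue straight to an FTP server, so no 2 GB
    // temp file ever hits the local disk.
    public static void upload(ActiveMQConnection connection, Destination dest)
            throws Exception {
        InputStream in = connection.createInputStream(dest);
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");
        ftp.login("user", "password");
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        ftp.storeFile("bigfile.bin", in); // reads 'in' to EOF and uploads it
        ftp.logout();
        ftp.disconnect();
        in.close();
    }
}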


Sounds like not a big deal?
Well, this means you may have streams hanging around in memory: either
waiting to be used, in use, or worse still hanging around and not being
disposed of!
Yep, you will have to write the code to manage all of that and make sure
it's thread safe.

I don't follow this part?
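
If the concern is just streams that never get closed, then wrapping the copy
in try/finally so close() always runs should be all it takes, something like
(a minimal sketch):

import java.io.InputStream;

import javax.jms.Destination;

import org.apache.activemq.ActiveMQConnection;

public class CleanupSketch {
    // Close the stream even if the copy blows up part way, so nothing is
    // left hanging around in memory waiting to be disposed of.
    public static void consume(ActiveMQConnection connection, Destination dest)
            throws Exception {
        InputStream in = connection.createInputStream(dest);
        try {
            // ... copy 'in' to its target here ...
        } finally {
            in.close();
        }
    }
}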
--

James
-------
http://radio.weblogs.com/0112098/
