Hi Andreas,

Is there any other way to get around this? Please let me know if you
have another solution.

Thanks
Yuva



-----Original Message-----
From: Andreas Veithen [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, March 11, 2008 5:41 PM
To: axis-user@ws.apache.org
Subject: Re: Question regarding attachments with Axis and DataHandler

Yuva,

The implementation of the ByteArrayDataSource constructor you are  
using looks as follows:

public ByteArrayDataSource(InputStream is, String type) throws IOException {
    ByteArrayOutputStream os = new ByteArrayOutputStream();
    byte[] buf = new byte[8192];
    int len;
    // The whole input stream is copied into an in-memory buffer...
    while ((len = is.read(buf)) > 0)
        os.write(buf, 0, len);
    // ...and then copied a second time into a byte array.
    this.data = os.toByteArray();
    this.type = type;
}

As you can see, it will indeed read the entire Blob into a  
ByteArrayOutputStream and then copy it to a byte array. It is  
therefore not surprising that you run out of memory.
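
One possible way around the copy (a minimal sketch, not something tested
against your setup) is a custom javax.activation.DataSource that hands the
Blob's InputStream straight to the DataHandler. This assumes the JDBC
connection and result set stay open until the attachment has actually been
written, and that the Axis attachment handling in your configuration does
not buffer the whole stream itself; the class name is illustrative:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.SQLException;
import javax.activation.DataSource;

// Hypothetical sketch: streams the Blob instead of buffering it in memory.
// Only usable while the underlying JDBC connection/result set is still open.
public class BlobDataSource implements DataSource {

    private final Blob blob;
    private final String name;

    public BlobDataSource(Blob blob, String name) {
        this.blob = blob;
        this.name = name;
    }

    public InputStream getInputStream() throws IOException {
        try {
            // No intermediate byte[]: the caller reads directly from the database stream.
            return blob.getBinaryStream();
        } catch (SQLException e) {
            throw new IOException("Cannot open Blob stream: " + e.getMessage());
        }
    }

    public OutputStream getOutputStream() throws IOException {
        throw new IOException("Read-only data source");
    }

    public String getContentType() {
        return "application/octet-stream";
    }

    public String getName() {
        return name;
    }
}

You would then build the handler with
DataHandler data_handler = new DataHandler(new BlobDataSource(file_blob, fileName));
Whether this streams end to end still depends on how the engine serializes
attachments, so treat it as a starting point rather than a guaranteed fix.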

Andreas


On 11 Mar 2008, at 21:42, Chandolu, Yuva wrote:

> Hi,
>
> I am trying to replace a servlet that serves huge files with a web
> service. When a request comes in, the servlet opens an InputStream on
> the file blob in the database, reads from that input stream, and
> writes to the HTTP output stream. When the client starts pulling the
> data, it comes straight from the servlet's input stream (pure
> streaming). That way the servlet can handle hundreds of clients,
> because the servlet threads never load the whole file into memory on
> the server side; they just send the data down as the client reads it
> (see the sketch after this message).
>
> Now I am trying to replace the servlet code with a web service. I am
> using a DataHandler for attachments. We have big files in the
> database (each 10 MB or more). When a client calls my
> downloadFile(fileName) service, I need to pull the file from the
> database, create a DataHandler, and return it to the client. My
> concern is what will happen if 100 clients request big files (say
> 10 MB each) through the downloadFile() service. My question is how
> the DataHandler works: will it read the whole file blob from the
> database into memory before we return it to the client? It looks
> like it does, because my Tomcat runs out of memory when I try 20
> client threads in parallel requesting big files. How can I solve
> the problem?
>
> Following is my code snippet:
>
> Blob file_blob = rs.getBlob(1);
> ByteArrayDataSource bds = new ByteArrayDataSource(
>     file_blob.getBinaryStream(), "application/octet-stream");
> DataHandler data_handler = new DataHandler(bds);
>
> return data_handler;
>
> Thanks in advance
> Yuva
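
For comparison, a minimal sketch of the streaming pattern the original
servlet uses, as described in the quoted message above (the class and
method names and the 8 KB buffer size are illustrative):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.SQLException;
import javax.servlet.http.HttpServletResponse;

public class BlobStreamer {

    // Hypothetical helper showing the servlet copy loop: bytes are read from
    // the Blob stream and written straight to the HTTP response, so only one
    // small buffer per request is ever held in memory.
    public static void streamBlob(Blob blob, HttpServletResponse response)
            throws IOException, SQLException {
        response.setContentType("application/octet-stream");
        InputStream in = blob.getBinaryStream();
        OutputStream out = response.getOutputStream();
        try {
            byte[] buf = new byte[8192];
            int len;
            while ((len = in.read(buf)) != -1) {
                out.write(buf, 0, len);
            }
            out.flush();
        } finally {
            in.close();
        }
    }
}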


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]

