You are correct in the analysis you posted to the Commons mailing list.
GetMethod streams everything to disk or memory in parseResponse.
PostMethod.sendData(InputStream is) creates a memory buffer.

PostMethod also has sendData(URL url), and that is probably the solution
to your problem.

destinationResource.putMethod(sourceResource.getHttpURL().toURL());

This way the JDK functions are used to get the content, so your
enhancement request is still valid. But try this first; if it doesn't
work, we can start improving HttpClient. You can always submit a patch
if you want.
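
In full, the transfer would then look roughly like this (just a sketch;
apart from the putMethod call above, the class, method and variable
names here are placeholders, not from your code):

    import java.net.URL;
    import org.apache.webdav.lib.WebdavResource;

    // Sketch only: hand the source URL to putMethod so the content is
    // streamed through the JDK URL connection instead of being buffered
    // by HttpClient first.
    public class UrlTransferSketch {
        public static void transfer(WebdavResource sourceResource,
                                    WebdavResource destinationResource)
                throws Exception {
            URL sourceUrl = sourceResource.getHttpURL().toURL();
            destinationResource.putMethod(sourceUrl);
        }
    }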

I would implement GetMethod.setUseBuffer to disable buffering and delay
reading the response content. getData() could then return the response
input stream directly. A second call to getData() should call reset()
on the InputStream, which will probably throw an exception, so you can
only get the data once.
PostMethod can use the same method: setUseBuffer would then make
sendData(InputStream is) skip buffering the input stream. A second
execute needs a reset() in streamQuery, throwing an IOException if that
is not possible.
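
From the caller's side, the proposed behaviour would look something like
this. To be clear, setUseBuffer, the streaming getData() and the
unbuffered sendData() do not exist yet -- this only illustrates the
suggestion above:

    import java.io.InputStream;
    import org.apache.commons.httpclient.HttpClient;
    import org.apache.commons.httpclient.methods.GetMethod;
    import org.apache.commons.httpclient.methods.PutMethod;

    // Hypothetical usage of the proposed unbuffered mode; none of the
    // setUseBuffer/getData/sendData behaviour shown here is in
    // HttpClient today.
    public class StreamCopySketch {
        public static void copy(HttpClient client, String sourcePath,
                                String destPath) throws Exception {
            GetMethod get = new GetMethod(sourcePath);
            get.setUseBuffer(false);          // proposed: skip response buffering
            client.executeMethod(get);
            InputStream body = get.getData(); // proposed: raw response stream

            PutMethod put = new PutMethod(destPath);
            put.setUseBuffer(false);          // proposed: skip request buffering
            put.sendData(body);               // pipe the stream straight through
            client.executeMethod(put);
        }
    }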


Dirk


Dmitry Beransky wrote:
> 
> Hi,
> 
> I'm working on an application that schedules resource transfers between two
> WebDAV servers.  The application gets two URLs -- a source and a destination
> -- and pumps data from one into the other by means of the following code:
> 
>           InputStream in = sourceResource.getMethodData();
>           destinationResource.putMethod( in );
> 
> The problem is that the resources being transferred are huge media files
> several hundred MBs in size.  And, apparently, getMethodData() always
> transfers the data to the local system first.  Since I don't need a local
> copy of the data, I'd like to be able to pump the data from one resource
> directly to the other (via a small memory buffer).  Is that currently
> possible?  How?  If not, could the current implementation of the client be
> modified to support such behavior?
> 
> Another problem I have with the current behavior is that it is unable to deal
> with transfers of big files.  Even though the client saves data to a
> temporary local file, I still get an OutOfMemoryError when trying to
> write to the destination:
> 
> java.lang.OutOfMemoryError:
>          at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:109)
>          at org.apache.commons.httpclient.methods.PutMethod.sendData(PutMethod.java:187)
>          at org.apache.webdav.lib.WebdavResource.putMethod(WebdavResource.java:2193)
>          at org.apache.webdav.lib.WebdavResource.putMethod(WebdavResource.java:2174)
>          at edu.ucsd.mediamanagerscheduler.TransferTask.run(TransferTask.java:46)
>          at java.util.TimerThread.mainLoop(Timer.java:435)
>          at java.util.TimerThread.run(Timer.java:385)
> 
> Plus, when this happens, the temporary file is permanently left on the
> system (in my case, each file is at least 0.5 GB in size, so this adds up
> really fast).
> 
> Any recommendations?
> 
> Regards
> Dmitry
> 
> ---
> Dmitry Beransky
> Software Engineer
> 
> University of California, San Diego
> Multimedia Interactive Learning Lab (http://mill.ucsd.edu)

