On Jul 19, 2008, at 02:35, Dean Landolt wrote:

I seem to be running up against an arbitrary file size limit in MochiWeb. For
smaller files I'm in good shape, but once I get upward of a few megs I get
this back from MochiWeb:

{"error":"EXIT","reason":"{body_too_large,6099917}"}

I poked around in the MochiWeb source and it looks like there's a setting
for this, but I haven't explored further yet.

I tried to use chunked encoding, but I seem to run into a socket error from
httplib (104, connection reset by peer). It looks like MochiWeb doesn't like
that very much, though it could just as easily be httplib's fault. Still
trying to work through that...
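
For what it's worth, the chunked attempt looks roughly like this (same
placeholder URL as above; httplib won't frame the chunks for me, so I'm
doing it by hand):

    import httplib

    # Same placeholder URL (db, doc id, rev) as the plain PUT above.
    conn = httplib.HTTPConnection("localhost", 5984)
    conn.putrequest("PUT", "/mydb/mydoc/big.bin?rev=1-2345")
    conn.putheader("Transfer-Encoding", "chunked")
    conn.putheader("Content-Type", "application/octet-stream")
    conn.endheaders()

    f = open("big.bin", "rb")
    while True:
        chunk = f.read(64 * 1024)
        if not chunk:
            break
        # Each chunk: size in hex, CRLF, data, CRLF.
        conn.send("%x\r\n%s\r\n" % (len(chunk), chunk))
    conn.send("0\r\n\r\n")  # zero-length chunk ends the body
    f.close()

    resp = conn.getresponse()
    print resp.status, resp.read()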

In any event, has anyone pushed large attachments in with the new API? Is
there something I'm missing? Thanks...

The current limit for request sizes is 1MB (didn't we up that to 4GB? Christopher,
maybe that got lost with recent MochiWeb updates?).

You can change the limit yourself in src/mochiweb/mochiweb_request.erl; the
relevant line is -define(MAX_RECV_BODY, (1024*1024)). Raise that value,
recompile, and you are good to go. I was able to push 2.5GB into an
attachment.

Please note that CouchDB holds the entire attachment in memory, even with a
chunked request, before writing it to disk. This is a current design
limitation and will be removed for 0.9. We know this stinks, but the new API
already removes the need for base64 encoding, which was another performance
killer. Streamed read and write operations will follow.

I used curl to make successful chunked requests.

Cheers
Jan
--
