Hi, we're trying to upload relatively large images (~5MB) to S3 via heroku/cedar/rails-3.2.3/dragonfly, and we're hitting the 30-second request timeout pretty consistently. Has anyone found a workaround for uploading large files? I'm exploring a few options:

1) Upload directly to S3, bypassing Heroku. This would be ideal, but I haven't yet figured out how to integrate it with Dragonfly, which seems to rely on a specific combination of MIME types and a metadata file to display things properly (rough sketch of what I've been looking at after this list).

2) Streaming the request somehow (https://devcenter.heroku.com/articles/request-timeout) with the 'Transfer-Encoding: chunked' header. I've toyed with this, but I think it's aimed more at streaming outgoing responses than incoming uploads(?). This looks similar (http://icelab.com.au/articles/money-stress-and-the-cloud/), but again I don't think it helps with incoming data (second sketch below).

3) Switch to something like https://github.com/dwilkie/carrierwave_direct - this would be a fairly major change, and we'd like to avoid it if possible (third sketch below).
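
For 1), here's a minimal sketch of the direction I've been looking at:
generate a presigned S3 PUT URL server-side, so the browser uploads the
file straight to S3 and the dyno never sees the bytes. The bucket name,
key scheme and 15-minute expiry are all made up, and it assumes the
aws-sdk (v1) gem:

    require 'aws-sdk'
    require 'securerandom'

    s3 = AWS::S3.new(
      :access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
      :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
    )
    key = "uploads/#{SecureRandom.uuid}.jpg"    # hypothetical key scheme
    obj = s3.buckets['my-bucket'].objects[key]  # 'my-bucket' is a placeholder

    # Signed URL the client can PUT the file to directly. The Content-Type
    # is part of the signature, so the client must send the same one.
    url = obj.url_for(:write, :expires => 15 * 60,
                              :content_type => 'image/jpeg')

The bit I can't work out is the Dragonfly side: presumably you'd then
assign the key to the model's uid column by hand (image_uid or whatever
yours is called), but I don't know whether Dragonfly's S3 data store
will happily serve an object it didn't write its own metadata for -
hence the question.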
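
For 2), the most direct test I can think of is forcing a chunked request
body from a script and seeing whether the router still cuts it off at
~30 seconds. A sketch using Ruby's stdlib (the URL and filename are
placeholders):

    require 'net/http'
    require 'uri'

    uri = URI('https://myapp.herokuapp.com/photos')  # placeholder endpoint
    req = Net::HTTP::Post.new(uri.request_uri)
    req['Transfer-Encoding'] = 'chunked'   # stream the body, no Content-Length
    req['Content-Type']      = 'image/jpeg'
    req.body_stream = File.open('big-image.jpg', 'rb')

    res = Net::HTTP.start(uri.host, uri.port, :use_ssl => true) do |http|
      http.request(req)
    end
    puts res.code

My suspicion is that the 30-second window covers the whole request up to
the first response byte, in which case this changes nothing - but I
haven't confirmed that.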
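
And for completeness, 3) as I understand it from the carrierwave_direct
README: you mix a module into the uploader and swap the form helper, so
the form POSTs straight to S3 instead of to the app (AvatarUploader /
avatar are their example names, not ours):

    # app/uploaders/avatar_uploader.rb
    class AvatarUploader < CarrierWave::Uploader::Base
      include CarrierWaveDirect::Uploader   # form targets S3 directly
    end

plus a direct_upload_form_for @uploader form in the view. Those two
lines aren't the problem - it's migrating everything off Dragonfly and
onto CarrierWave that makes it a major change for us.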

Any suggestions or experiences handling large uploads?

Cheers,

Dave
