On May 30, 2008, at 16:01, Simon Kapeniak wrote:

Hello,
I'd like to ask what the expected performance of CouchDB is with heavy binary data attachments. The scenario I'd like to evaluate CouchDB for implies a somewhat different usage than a web-based application or the like, which as I understand it involves dozens of requests for small chunks rather than constant access to big files.

I wonder what the possible issues with that are: I want to store, let's say, 1000 binary files of about 1-20 MB each. Many of them will have a number of versions (around ten). There won't be many simultaneous requests, but rather constant access from 1 to 20 clients per minute asking for a single file, or (less often) for a sequence of attachments.
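For concreteness, a minimal sketch of that usage against CouchDB's standalone attachment API (raw PUT/GET on /db/docid/attname over HTTP) might look like the following Python; the server URL, database name, and file names are assumptions:

    import requests

    BASE = "http://localhost:5984"   # assumed local CouchDB server
    DB = "assets"                    # hypothetical database name

    # Create the database; a 412 response just means it already exists.
    requests.put(f"{BASE}/{DB}")

    # Upload a binary file as a standalone attachment. A PUT to a document
    # id that does not exist yet creates the document implicitly.
    with open("render_0001.exr", "rb") as f:       # hypothetical file
        resp = requests.put(
            f"{BASE}/{DB}/render_0001/v1",         # /db/docid/attname
            data=f,
            headers={"Content-Type": "application/octet-stream"},
        )
    resp.raise_for_status()
    rev = resp.json()["rev"]                       # needed for later updates

    # Attach a second version to the same document; updating an existing
    # document requires passing its current revision.
    with open("render_0001_new.exr", "rb") as f:   # hypothetical file
        resp = requests.put(
            f"{BASE}/{DB}/render_0001/v2",
            params={"rev": rev},
            data=f,
            headers={"Content-Type": "application/octet-stream"},
        )
    resp.raise_for_status()

    # Stream a version back to disk without holding it all in memory.
    with requests.get(f"{BASE}/{DB}/render_0001/v1", stream=True) as resp:
        resp.raise_for_status()
        with open("fetched.exr", "wb") as out:
            for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MB
                out.write(chunk)

Keeping each version as a separately named attachment on one document is just one way to model the versioning; separate documents per version would work as well.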

In general, for one project, my database should handle a few TB of data generated over a period of ~60 days, allowing access to that data (at a rate of ~500 MB per minute) for something like 10-20 people in total.
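(For scale, ~500 MB per minute works out to roughly 8.3 MB/s, or about 67 Mbit/s, of sustained reads.)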

Are there any issues I don't see now? What possible problems would such an application introduce? Thanks for any comments!

Cheers,
Simon.

The current state is not optimised for that scenario, but this is simply because we are in the getting-it-right rather than the getting-it-fast phase of the project. Other than that, CouchDB and its attachment system should be a good fit for such a project.

Any help in this (or any, really) area (testing, bug reports, profiling, patches) is highly appreciated.
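For anyone who wants to try the testing side of this, a minimal read-throughput check against a database populated as in the sketch above might look like this in Python (the server URL and document/attachment names are assumptions):

    import time
    import requests

    BASE = "http://localhost:5984"   # assumed local CouchDB server
    DB = "assets"                    # hypothetical database name

    # Hypothetical document/attachment ids matching the upload sketch.
    targets = [(f"render_{i:04d}", "v1") for i in range(1, 101)]

    start = time.time()
    total_bytes = 0
    for doc_id, att in targets:
        # Stream each attachment and count the bytes received.
        with requests.get(f"{BASE}/{DB}/{doc_id}/{att}", stream=True) as resp:
            resp.raise_for_status()
            for chunk in resp.iter_content(chunk_size=1 << 20):
                total_bytes += len(chunk)

    elapsed = time.time() - start
    print(f"read {total_bytes / 1e6:.1f} MB in {elapsed:.1f} s "
          f"-> {total_bytes / 1e6 / elapsed:.1f} MB/s")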

Cheers
Jan
--


