On 4/03/10 17:03, Tom Sante wrote:
On 4/03/10 16:59, km wrote:
BTW, what is the data size?

Krishna
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
On Thu, Mar 4, 2010 at 11:59 PM, Tom Sante <[email protected]> wrote:

On 4/03/10 15:52, Simon Metson wrote:

Hi,

I will probably use that strategy to partition my data per experiment.


If you have a logical division (say season or month the data was taken
in) I'd use that - it'll be easier to work out what your databases
hold.

For now I will probably divide per year and per experiment.
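(A rough sketch of how that per-year, per-experiment split might look as database names; the prefix and id format are my own assumptions, not from this thread:)

```python
# Hypothetical helper for the "per year and per experiment" split:
# builds one CouchDB database name per slice, e.g. "expdata_2010_exp0042".
# CouchDB database names must be lowercase, start with a letter, and may
# contain letters, digits, and _$()+-/ characters.

def partition_db_name(year: int, experiment_id: int, prefix: str = "expdata") -> str:
    """Return a CouchDB database name for one year/experiment slice."""
    return f"{prefix}_{year}_exp{experiment_id:04d}"

print(partition_db_name(2010, 42))  # expdata_2010_exp0042
```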


And use an external watcher script like in couchdb-lounge to replicate
my common design documents in all databases.


I think I'd use CouchApp for that - means you can easily version your
views, too.


I'll take a look at both approaches.
The data gets imported per experiment so I could probably integrate the
replication of design docs into the import script too.
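(For what it's worth, CouchDB's /_replicate endpoint can copy just the design docs if you pass a doc_ids filter, so an import script could push them to every partition DB in one loop. A minimal sketch; the server URL, database names, and design-doc id below are made-up examples:)

```python
# Sketch: push the shared design docs from one "master" DB into every
# partition DB via CouchDB's POST /_replicate, filtered by doc_ids.
import json
import urllib.request

def replication_payload(server, source_db, target_db, design_ids):
    """Build one /_replicate request body copying only the listed docs."""
    return {
        "source": f"{server}/{source_db}",
        "target": f"{server}/{target_db}",
        "doc_ids": design_ids,  # replicate only these documents
    }

def replicate_design_docs(server, source_db, target_dbs, design_ids):
    """POST one replication job per target database."""
    for target in target_dbs:
        body = json.dumps(
            replication_payload(server, source_db, target, design_ids)
        ).encode()
        req = urllib.request.Request(
            f"{server}/_replicate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # raises on HTTP errors

# e.g.:
# replicate_design_docs("http://localhost:5984", "design_master",
#                       ["expdata_2010_exp0001", "expdata_2010_exp0002"],
#                       ["_design/probes"])
```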


And store summary data and metadata in a separate database so I can
easily run queries spanning different DBs.


Sounds good. Pulling a view out of all the DBs and into the summary
database is pretty simple to do.
Cheers
Simon
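(One way that "pull a view into the summary database" step could be sketched: query the same view in each partition DB and write one summary doc per DB. The db names, view shape, and doc layout here are illustrative assumptions only:)

```python
# Sketch: collapse per-DB view output into summary documents, one per
# partition database, ready to be bulk-inserted into the summary DB.

def summarise(view_results):
    """view_results maps db_name -> [(key, value), ...] view rows;
    returns one summary document per partition database."""
    docs = []
    for db, rows in sorted(view_results.items()):
        docs.append({
            "_id": f"summary/{db}",      # one doc per source DB
            "source_db": db,
            "n_rows": len(rows),
            "total": sum(v for _, v in rows),
        })
    return docs

docs = summarise({
    "expdata_2010_exp0001": [("probe_a", 2), ("probe_b", 3)],
    "expdata_2010_exp0002": [("probe_a", 1)],
})
```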

As it is stored in MySQL right now, the table (singular, so it's all in
one table) takes up about 15 GB: 6,000 experiments of around 180,000
probe values (= table rows) each.
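(A quick back-of-the-envelope check of those figures, purely arithmetic on the numbers quoted above:)

```python
# 6,000 experiments x ~180,000 probe rows each, in a ~15 GB MySQL table.
experiments = 6_000
rows_per_experiment = 180_000
total_rows = experiments * rows_per_experiment  # 1,080,000,000 rows
bytes_total = 15 * 1024**3                      # 15 GB

print(total_rows)                    # 1080000000
print(bytes_total / total_rows)      # ~14.9 bytes per row on average
```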
