Hi all, I've been experimenting with CouchDB. I'm using Net::CouchDB to batch-insert 20 docs at a time, and I'm simply setting _id to a sequence that is incremented for each doc. For just over 9 million rows, where each row is just 6 small fields, the resulting DB is 3.4G. When I was letting CouchDB set the _id, the resulting database was over 20G. The input source, a tab-delimited file, is just over 500MB.
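For anyone curious what I mean by assigning sequential ids: under the hood, a batch insert is just a POST to CouchDB's _bulk_docs endpoint with the _id set on each doc. Here's a rough Python sketch of building that payload (I'm actually using Net::CouchDB in Perl; the field name "field1" and the batch size are just placeholders):

```python
import json

def bulk_docs_payload(start_id, rows):
    """Build the JSON body for CouchDB's POST /db/_bulk_docs,
    assigning sequential string _ids instead of letting CouchDB
    generate random UUIDs for each document."""
    docs = []
    for offset, row in enumerate(rows):
        doc = dict(row)
        doc["_id"] = str(start_id + offset)  # client-assigned sequential id
        docs.append(doc)
    return json.dumps({"docs": docs})

# One batch of 20 docs, posted as:
#   POST /mydb/_bulk_docs  (Content-Type: application/json)
payload = bulk_docs_payload(1, [{"field1": "a"}] * 20)
```

The next batch would then start at start_id + 20, and so on through the whole file.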
So is it normal for CouchDB to create such a large database file when it assigns the document ids itself? -- Jeff Macdonald Ayer, MA
