Hello,

We (Superfeedr) are currently diving into Riak to make it our key-value
store for our Google Reader API replacement.

So far, so good. However, as we run our tests, we're trying to estimate
how much storage we'll need for Riak. For that, we obviously need the
average size of a stored value.

Of course we can run "estimates", but the data we store is actually quite
diverse and doesn't have a consistent schema, so we thought it's probably
easier/better to work from "real" data rather than theoretical figures.

We tried the top-down approach: disk space used / (n_val * number of keys),
but that's not ideal because we store different types of objects in the
same cluster too...
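
For concreteness, the kind of back-of-the-envelope arithmetic we mean (all
numbers below are made up):

    # Top-down estimate: average value size from total disk usage.
    disk_bytes = 120 * 1024**3      # total disk used by Riak across the cluster (made-up)
    n_val = 3                       # replication factor
    num_keys = 40 * 1000 * 1000     # total number of keys (made-up)

    avg_value_size = disk_bytes / (n_val * num_keys)
    print("~%d bytes per value" % avg_value_size)

The problem is that this averages over every object type we store at once,
which is exactly what we'd like to avoid.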

Is there any way to do that from a bottom-up approach, with a map/reduce job?
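
Something along the lines of the sketch below is what we have in mind: sum
the size of every value in a bucket through Riak's HTTP /mapred endpoint and
the built-in Riak.reduceSum JavaScript reduce, then divide by the key count.
It's untested; host, port and bucket name are placeholders, and measuring
v.values[0].data.length only approximates the on-disk size since it ignores
per-object metadata.

    import json
    import requests

    # Map phase: emit the length of each stored value (roughly its size in bytes).
    # Reduce phase: sum the lengths with Riak's built-in Riak.reduceSum.
    job = {
        "inputs": "my_bucket",   # placeholder bucket name
        "query": [
            {"map": {"language": "javascript",
                     "source": "function(v) { return [v.values[0].data.length]; }"}},
            {"reduce": {"language": "javascript", "name": "Riak.reduceSum"}}
        ]
    }

    resp = requests.post("http://127.0.0.1:8098/mapred",
                         data=json.dumps(job),
                         headers={"Content-Type": "application/json"})
    total_bytes = resp.json()[0]
    print("total value bytes: %d" % total_bytes)
    # average value size = total_bytes / number_of_keys (key count obtained separately)

The obvious worry is that using a whole bucket as the input means Riak has to
list every key, which we understand is expensive on a busy cluster, hence the
question.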


Thanks!


--
Got a blog? Make following it simple: https://www.subtome.com/

Julien Genestoux,
http://twitter.com/julien51

+1 (415) 830 6574
+33 (0)9 70 44 76 29
