Hey Julien,

You can get a rough size per object by using the map_datasize/3 function in riak_mapreduce_utils [1]. If you want a more precise estimate, you also need to take into account things like siblings and bucket-related metadata.
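As a minimal sketch of what the job could look like over the HTTP interface (assuming riak_mapreduce_utils is compiled and on your Riak nodes' code path; the bucket name and host below are placeholders), you can feed map_datasize into the built-in riak_kv_mapreduce:reduce_sum to total the bytes:

    # assumes riak_mapreduce_utils is deployed on every node;
    # "my_bucket" and localhost:8098 are placeholders for your setup
    curl -XPOST http://localhost:8098/mapred \
      -H 'Content-Type: application/json' \
      -d '{
        "inputs": "my_bucket",
        "query": [
          {"map":    {"language": "erlang",
                      "module":   "riak_mapreduce_utils",
                      "function": "map_datasize"}},
          {"reduce": {"language": "erlang",
                      "module":   "riak_kv_mapreduce",
                      "function": "reduce_sum"}}
        ]
      }'

Divide the summed result by your object count to get the average. Keep in mind that running a MapReduce over a whole bucket triggers a full key listing, which is expensive, so you probably don't want to do this against a busy production cluster.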
[1] https://github.com/whitenode/riak_mapreduce_utils

Olav

2013/4/26 Julien Genestoux <[email protected]>

> Hello,
>
> We (Superfeedr) are currently diving into Riak to make it our key-value
> store for our Google Reader API replacement.
>
> So far, so good. However, as we're running tests, we're trying to evaluate
> the capacity we'll need from Riak. For that, we obviously need the average
> size of a stored value.
>
> Of course we can run "estimates", but the data we store is actually quite
> diverse and doesn't have a consistent schema, so we thought it's probably
> easier/better to take "real" data into account rather than theoretical
> data.
>
> We tried the top-down approach: disk space used / (n_val * number of
> keys), but that's not ideal because we store different types of objects
> too...
>
> Is there any way to do that from a bottom-up approach with a map/reduce
> job?
>
> Thanks!
>
> --
> Got a blog? Make following it simple: https://www.subtome.com/
>
> Julien Genestoux,
> http://twitter.com/julien51
>
> +1 (415) 830 6574
> +33 (0)9 70 44 76 29

--
Best regards,
Olav Frengstad
Systems Developer // FWT
+47 920 42 090
