This is an old thread, but I figured I'd give it an answer now, since we have a 
much better story in Riak 1.4.x. As you may know, we now support result 
limiting and pagination in 2i. You can read more about it here [1], in the 
'pagination' section. This lets you buffer only `N` results at a time out of 
a much larger result set.

[1] http://docs.basho.com/riak/latest/dev/using/2i/
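To make the pattern concrete, here's a small sketch of the pagination loop. 
The `max_results` and `continuation` parameters come from the 1.4 2i API; the 
`fetch_page` callable is a hypothetical stand-in for the HTTP GET against 
`/buckets/<bucket>/index/<index>/<value>`, injected here so the loop itself is 
clear without a live cluster:

```python
# Sketch: page through a 2i query using Riak 1.4's max_results/continuation.
# `fetch_page(max_results=..., continuation=...)` is assumed to return a dict
# shaped like the 2i JSON response: {"keys": [...], "continuation": "..."}.
# The final page carries no "continuation" entry.

def iter_index_keys(fetch_page, page_size=1000):
    """Yield matching keys one page at a time, buffering at most
    `page_size` results in memory per round trip."""
    continuation = None
    while True:
        page = fetch_page(max_results=page_size, continuation=continuation)
        for key in page["keys"]:
            yield key
        # Absent continuation token means we've consumed the last page.
        continuation = page.get("continuation")
        if not continuation:
            break
```

With a real cluster, `fetch_page` would issue the GET (for example with an 
HTTP client passing `max_results` and, on subsequent requests, the opaque 
`continuation` token returned by the previous page) and decode the JSON body.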

Reid


On Mar 4, 2013, at 7:18 AM, Pavel Kirienko <[email protected]> 
wrote:

> Hi everyone,
> 
> Is there any way to request a large number of keys through 2i streaming? Say 
> there is an index with 10M entries and I want to extract 1M of them. Obviously a 
> block request (i.e. all data packed into a single response) is not the best 
> idea, since it requires a good amount of memory on both the client and the 
> server.
> 
> One could feed the 2i output into a Map/Reduce job with streaming 
> output, but that approach is not so hot either: it is really slow (our 3-node 
> cluster stumbles on 100k keys for minutes), and sometimes it just doesn't 
> work (streaming may occasionally stop before all the data has been emitted). 
> Not to mention that on 1M keys the Map/Reduce job never starts at all.
> 
> Is it possible to perform 2i queries for a large number of keys, or should I 
> use another store for indexing instead? (like Redis, maybe)
> 
> Thanks in advance.
> 
> Pavel.
> _______________________________________________
> riak-users mailing list
> [email protected]
> http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
