I am running it on a single node.
Using the Erlang Riak client. I added a map bucket type, created a new index, set
the index on a bucket, and saved a few map objects into the bucket. Then I ran some
searches. All good. Then, as far as I can remember, I tried an invalid search query like
title_s_register:[New movie one]
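For context, Riak Search (Yokozuna) uses Solr/Lucene query syntax, where square brackets introduce a *range* query of the form `field:[A TO B]`, so `title_s_register:[New movie one]` is malformed. A multi-word value is normally written as a quoted phrase instead. The helper below is a hypothetical illustration of building such a phrase query string, not part of the Erlang client:

```python
# Hypothetical helper: build a Solr phrase query for a multi-word value.
# Square brackets would mean a range query ([A TO B]); a quoted phrase
# is the usual way to match a multi-word value.

def phrase_query(field, value):
    # Escape embedded backslashes and quotes, then wrap in double quotes.
    escaped = value.replace('\\', '\\\\').replace('"', '\\"')
    return f'{field}:"{escaped}"'

print(phrase_query("title_s_register", "New movie one"))
# title_s_register:"New movie one"
```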
From http://docs.basho.com/riak/latest/theory/concepts/:
In general, large numbers of buckets within a Riak cluster is not a problem.
In practice, there are two potential restrictions on the maximum number of
buckets a Riak cluster can handle
Is there any further, more specific documentation?
Hi Hao,
Looks like you might be running into an EMFILE error:
http://docs.basho.com/riak/latest/community/faqs/logs/#riak-logs-contain-error-emfile-in-the-message
What’s the open files limit?
Single node with search=on. I previously indexed 2 map objects. Then I did a
restore of my previous bitcask folder. I can see the newly created bucket type
is still there, and when I search by index, the index data is also there. So I
removed the index folder (under
Hi Joe,
For most use cases, there is no limit to the number of buckets you
can have on a LevelDB cluster (aside from the obvious limit that eventually
you'd run out of disk space for all the objects).
Riak essentially treats the bucket as merely a prefix for the key. (It
basically
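The bucket-as-prefix idea above can be sketched as follows. This is only an illustration of the concept (a backend flattening bucket type, bucket, and key into one composite key, so buckets cost nothing beyond the bytes they add to each key), not Riak's actual on-disk format; the function name is made up:

```python
# Illustrative sketch: flatten (bucket_type, bucket, key) into a single
# flat key by prefixing. Length-prefixing each part avoids collisions
# between e.g. ("ab", "c") and ("a", "bc"). Not Riak's real encoding.

def flat_key(bucket_type: bytes, bucket: bytes, key: bytes) -> bytes:
    parts = (bucket_type, bucket, key)
    return b"".join(len(p).to_bytes(4, "big") + p for p in parts)

k1 = flat_key(b"maps", b"movies", b"title-1")
k2 = flat_key(b"maps", b"movie", b"s-title-1")
assert k1 != k2  # length prefixes keep different buckets distinct
```

Because the bucket is just part of the key, creating more buckets adds no per-bucket bookkeeping; only the total object count and disk space matter.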
Which one? I am a bit lost.
[root@localhost riak_data]# cat /proc/sys/fs/file-max
786810
[root@localhost riak_data]# ulimit -Hn
4096
[root@localhost riak_data]# ulimit -Sn
1024
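A note on those numbers: the soft limit (`ulimit -Sn`, 1024 here) is what a process actually gets; a process may raise its own soft limit up to the hard limit (`ulimit -Hn`, 4096) without root, but raising the hard limit itself requires root (e.g. via limits.conf). A minimal sketch of this distinction, using only the Python standard library on a Unix system:

```python
# Sketch: inspect and raise the per-process open-file limit.
# The soft limit can be raised up to the hard limit without root.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft={soft} hard={hard}")

# Raise the soft limit to the hard limit when the hard limit is finite;
# otherwise leave it unchanged.
target = hard if hard != resource.RLIM_INFINITY else soft
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
new_soft, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
assert new_soft == target
```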
--Hao
On 2015-08-07 23:58:56, Alex Moore amo...@basho.com wrote:
Hi Hao,
Looks like you might be running into an
Hi Hao --
As Alex said, the error you're receiving is generally related to the open file
limits set by your OS configuration. Riak requires a large number of
available open file handles. You can find information on how to raise your
limits here: