I am trying to run a map-reduce job against all keys in a bucket.  The job
works fine on buckets with roughly 60,000 or fewer entries.  However, on
buckets with more than 63,000 keys I get the following error every time:

Input:

curl -X POST -H "content-type: application/json" \
  http://testdw0b01.be.weather.com:8098/mapred?chunked=true --data @-

{"inputs": "profile_63000",
 "query": [
   {"map": {"language": "javascript",
            "source": "function(v) { var data = Riak.mapValuesJson(v)[0]; var r = []; for (var i in data.locations) { var o = {}; o[data.locations[i]] = 1; r.push(o); } return r; }"}},
   {"reduce": {"language": "javascript",
               "source": "function(v) { var r = {}; for (var i in v) { for (var w in v[i]) { if (w in r) r[w] += v[i][w]; else r[w] = v[i][w]; } } return [r]; }"}}],
 "timeout": 600000}
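For readability, here are the same two phase functions as standalone JavaScript. This is just an illustrative sketch: the function names and the sample inputs are mine, and Riak.mapValuesJson(v)[0] is replaced by an already-parsed object of the form {"locations": [...]}.

```javascript
// Map phase: emit one {location: 1} object per location in the value.
// `data` stands in for Riak.mapValuesJson(v)[0].
function mapLocations(data) {
  var r = [];
  for (var i in data.locations) {
    var o = {};
    o[data.locations[i]] = 1;
    r.push(o);
  }
  return r;
}

// Reduce phase: merge the per-value objects by summing counts per key.
function reduceCounts(v) {
  var r = {};
  for (var i in v) {
    for (var w in v[i]) {
      if (w in r) r[w] += v[i][w];
      else r[w] = v[i][w];
    }
  }
  return [r];
}

// Example: two values with overlapping locations.
var mapped = mapLocations({locations: ["nyc", "atl"]})
  .concat(mapLocations({locations: ["nyc"]}));
var reduced = reduceCounts(mapped);  // [{nyc: 2, atl: 1}]
```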

Output:

{"error":"map_reduce_error"}

Any ideas?  I am running a 4-box CentOS cluster with Riak installed via the
64-bit RPMs, and default settings.

Thanks,
Scott

_______________________________________________
riak-users mailing list
[email protected]
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
