Ok thanks Evan, I'll try that.
I just saw errors that I've never seen before (I must have missed them earlier):
2013-03-22 12:02:18.719 [error] <0.16959.2526> gen_server <0.16959.2526>
terminated with reason: no function clause matching
riak_core_pb:encode({ts,{1363,205559,674898}},
{{ts,{1363,205559,674898}},<<131,104,7,100,0,8,114,95,111,98,106,101,99,116,109,0,0,0,14,101,...>>})
line 40
2013-03-22 12:02:18.720 [error]
<0.13316.2558>@riak_core_handoff_sender:start_fold:215 hinted_handoff
transfer of riak_kv_vnode from '[email protected]'
627988984790622347665645826557777860008408514560 to '[email protected]'
627988984790622347665645826557777860008408514560 failed because of
error:{badmatch,{error,{worker_crash,{function_clause,[{riak_core_pb,encode,[{ts,{1363,205559,674898}},{{ts,{1363,205559,674898}},<<131,104,7,100,0,8,114,95,111,98,106,101,99,116,109,0,0,0,14,101,107,108,97,98,108,111,103,45,99,97,99,104,101,109,0,0,0,35,45,45,45,109,90,71,85,99,120,51,84,99,78,101,108,122,72,75,90,80,115,85,85,51,121,87,85,64,56,48,48,120,54,48,48,108,0,0,0,1,104,3,100,0,9,114,95,99,111,110,116,101,110,116,104,9,100,0,4,100,105,99,116,97,6,97,16,97,16,97,8,97,80,97,48,104,16,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,106,104,1,104,16,106,106,108,0,0,0,1,108,0,0,0,1,109,0,0,0,5,76,105,110,107,115,106,106,106,106,106,106,106,106,106,108,0,0,0,2,108,0,0,0,11,109,0,0,0,12,99,111,110,116,101,110,116,45,116,121,112,101,97,105,97,109,97,97,97,103,97,101,97,47,97,106,97,112,97,101,97,103,106,108,0,0,0,23,109,0,0,0,11,88,45,82,105,97,107,45,86,84,97,103,97,49,97,55,97,109,97,110,97,117,97,66,97,56,97,116,97,108,97,114,97,72,97,82,97,55,97,115,97,114,97,88,97,73,97,73,97,78,97,79,97,67,97,85,106,106,108,0,0,0,1,108,0,0,0,1,109,0,0,0,5,105,110,100,101,120,106,106,106,108,0,0,0,1,108,0,0,0,1,109,0,0,0,20,88,45,82,105,97,107,45,76,97,115,116,45,77,111,100,105,102,105,101,100,104,3,98,0,0,5,83,98,0,3,34,247,98,0,10,75,84,106,106,108,0,0,0,1,108,0,0,0,1,109,0,0,0,11,88,45,82,105,97,107,45,77,101,116,97,106,106,109,0,4,54,114,255,216,255,224,0,16,74,70,73,70,0,1,1,1,1,44,1,44,0,0,255,219,0,67,0,3,2,2,2,2,2,3,2,2,2,3,3,3,3,4,6,4,4,4,4,4,8,6,6,5,6,9,8,10,10,9,8,9,9,10,12,15,12,10,11,14,11,9,9,13,17,13,14,15,16,16,17,16,10,12,18,19,18,16,19,15,16,16,16,255,219,0,67,1,3,3,3,4,3,4,8,4,4,8,16,11,9,11,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,255,192,0,17,8,2,88,3,32,3,1,33,0,2,17,1,3,17,1,255,196,0,29,0,0,3,0,3,1,1,1,1,0,0,0,0,0,0,0,0,4,5,6,2,3,7,1,8,0,9,255,196,0,70,...>>}],...},...]},...}}}
[{riak_core_handoff_sender,start_fold,5,[{file,"src/riak_core_handoff_sender.erl"},{line,161}]}]
2013-03-22 12:02:18.722 [error] <0.16959.2526> CRASH REPORT Process
<0.16959.2526> with 0 neighbours exited with reason: no function clause
matching riak_core_pb:encode({ts,{1363,205559,674898}},
{{ts,{1363,205559,674898}},<<131,104,7,100,0,8,114,95,111,98,106,101,99,116,109,0,0,0,14,101,...>>})
line 40 in gen_server:terminate/6 line 747
2013-03-22 12:02:18.725 [error] <0.7442.164> Supervisor poolboy_sup had
child riak_core_vnode_worker started with
{riak_core_vnode_worker,start_link,undefined} at <0.16959.2526> exit with
reason no function clause matching
riak_core_pb:encode({ts,{1363,205559,674898}},
{{ts,{1363,205559,674898}},<<131,104,7,100,0,8,114,95,111,98,106,101,99,116,109,0,0,0,14,101,...>>})
line 40 in context child_terminated
2013-03-22 11:28:53.379 [error] <0.28559.1189> gen_fsm <0.28559.1189> in
state active terminated with reason: no case clause matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059
2013-03-22 11:28:53.379 [error]
<0.15853.1198>@riak_core_handoff_receiver:handle_info:80 Handoff receiver
for partition 79925870791533753339264014289171727637433810944 exited
abnormally after processing 5337 objects:
{{{case_clause,{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-laprovence"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-laprovence"}},{<<"storage-hotviber">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178727>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-hotviber"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-hotviber"}},{<<"storage-mwc">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178265>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-mwc"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-mwc"}},{<<"storage">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178239>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage"}}],<<"storage">>}}},[{riak_kv_vnode,do_diffobj_put,3,[{file,"src/riak_kv_vnode.erl"},{line,1059}]},{riak_kv_vnode,handle_handoff_data,2,[{file,"src/riak_kv_vnode.erl"},{line,551}]},{riak_core_vnode,handle_sync_event,4,[{file,"src/riak_core_vnode.erl"},{line,472}]},{gen_fsm,handle_msg,7,[{file,"gen_fsm.erl"},{line,494}]},{proc_lib,init_p_do_apply,3,[{file,"proc_lib.erl"},{line,227}]}]},{gen_fsm,sync_send_all_state_event,[<0.28559.1189>,{handoff_data,<<156,186,9,60,148,97,212,55,60,37,21
7,98,66,178,155,10,201,158,45,187,73,182,16,146,236,203,88,179,140,37,89,99,152,74,246,157,80,132,132,36,49,217,151,48,246,41,18,89,179,51,200,158,25,203,184,153,197,59,61,239,242,125,191,247,123,190,239,247,188,223,53,191,115,141,185,239,115,238,57,231,186,206,242,63,215,96,164,119,131,123,59,57,123,251,185,179,241,42,251,120,91,155,133,90,89,153,62,241,246,48,176,212,212,51,115,50,176,112,125,234,113,211,200,211,148,135,128,167,137,242,160,115,5,209,63,129,249,57,123,185,185,4,250,128,64,160,255,37,251,239,195,255,151,184,55,245,254,25,15,26,87,16,195,19,152,139,159,111,160,155,111,160,7,131,43,232,156,171,167,75,160,211,121,167,139,212,23,189,211,125,167,155,30,23,189,254,183,225,113,230,223,181,255,144,255,143,233,223,55,209,222,243,244,133,7,252,47,150,127,215,207,254,155,152,254,221,188,240,63,158,47,25,248,244,177,155,147,167,147,143,147,147,147,187,147,155,147,180,147,151,211,99,234,187,251,127,240,115,254,99,101,178,146,124,224,233,4,151,180,120,72,229,80,112,122,234,228,235,100,227,100,226,100,230,164,237,244,196,201,213,41,216,233,174,147,37,85,234,14,85,202,197,41,192,201,194,41,204,201,217,201,220,73,225,255,161,143,167,175,171,91,168,215,255,126,249,210,255,120,252,61,167,128,64,73,35,63,87,207,71,158,110,174,30,52,206,84,129,7,206,32,250,130,239,206,32,22,135,177,255,93,234,127,42,101,228,22,232,228,229,69,189,66,32,158,254,62,93,4,93,52,208,213,215,5,157,57,67,101,163,190,64,167,20,144,170,214,3,29,205,135,38,15,84,174,184,187,74,122,61,118,115,191,18,44,35,117,243,202,141,160,0,79,95,247,43,250,6,122,87,12,238,235,232,93,9,190,37,43,42,113,197,63,200,201,219,51,240,233,21,245,43,202,10,140,167,51,32,45,208,217,51,255,198,191,249,236,191,113,142,230,223,76,123,238,28,205,185,243,180,231,207,255,7,209,49,208,83,137,238,252,121,122,38,122,6,198,127,131,250,215,5,38,198,11,255,62,252,123,200,127,23,61,75,75,67,67,203,72,119,158,142,241,255,...>>},...]}}
2013-03-22 11:28:53.390 [error] <0.28559.1189> CRASH REPORT Process
<0.28559.1189> with 1 neighbours exited with reason: no case clause
matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059 in gen_fsm:terminate/7 line 611
2013-03-22 11:28:53.397 [error] <0.139.0> Supervisor riak_core_vnode_sup
had child undefined started with {riak_core_vnode,start_link,undefined} at
<0.28559.1189> exit with reason no case clause matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059 in context child_terminated
2013-03-22 11:28:53.400 [error] <0.29144.1189> gen_fsm <0.29144.1189> in
state ready terminated with reason: no case clause matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059
2013-03-22 11:28:53.401 [error] <0.29144.1189> CRASH REPORT Process
<0.29144.1189> with 10 neighbours exited with reason: no case clause
matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059 in gen_fsm:terminate/7 line 611
2013-03-22 11:28:53.402 [error] <0.29145.1189> Supervisor poolboy_sup had
child riak_core_vnode_worker started with
riak_core_vnode_worker:start_link([{worker_args,[79925870791533753339264014289171727637433810944,[],worker_props]},{worker_callback_mod,...},...])
at undefined exit with reason no case clause matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059 in context shutdown_error
2013-03-22 11:28:53.404 [error] <0.29145.1189> gen_server <0.29145.1189>
terminated with reason: no case clause matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059
2013-03-22 11:28:53.406 [error] <0.29145.1189> CRASH REPORT Process
<0.29145.1189> with 0 neighbours exited with reason: no case clause
matching
{error,bad_crc,{state,[{<<"cache">>,riak_kv_memory_backend,{state,57458783,57450590,57442397,10737418240,31415183,43200}},{<<"storage-kazeo">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178788>,"79925870791533753339264014289171727637433810944",[{data_root,"/data/riak/bitcask/storage-kazeo"},{read_write,true}],79925870791533753339264014289171727637433810944,"/data/riak/bitcask/storage-kazeo"}},{<<"storage-laprovence">>,riak_kv_bitcask_backend,{state,#Ref<0.0.461.178779>,"79925870791533753339...",...}},...],...}}
in riak_kv_vnode:do_diffobj_put/3 line 1059 in gen_server:terminate/6 line
747
2013/3/21 Evan Vigil-McClanahan <[email protected]>
> busy_dist_port usually means that you're trying to push too much data
> over your connections, but there are also other things that can
> trigger it.
>
> you can tune the amount of buffer that distributed erlang provides by
> adding
> +zdbbl <some_number>
>
> where <some_number> is the buffer size *in KB* *per other node in the
> cluster*. The default is 1024 (i.e. one megabyte), but raising it to
> 8 or 16 MB is often called for in busy clusters. If that doesn't help,
> you may be affected by one of the other issues that I mentioned. That
> said, I am not sure this will help handoff, as handoff does not go
> over distributed Erlang.
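[As a concrete sketch of the flag above: it goes in Riak's vm.args. The path varies by packaging — /etc/riak/vm.args is a common location, assumed here:]

```
## /etc/riak/vm.args (assumed path)
## Distribution buffer busy limit, in KB, per peer node in the cluster.
## Default is 1024 (1 MB); 8192 raises it to 8 MB per connection.
+zdbbl 8192
```

[The node must be restarted for vm.args changes to take effect.]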
>
>
>
> On Thu, Mar 21, 2013 at 12:22 PM, Godefroy de Compreignac
> <[email protected]> wrote:
> > Ok thanks Evan.
> > I didn't post them before on this thread because I thought it was just
> > "info", but I have a lot of messages like these:
> >
> > 2013-03-21 20:13:23.945 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.29558.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.4656154>,'[email protected]'}
> > 2013-03-21 20:13:26.295 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.29046.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.21529382>,'[email protected]'}
> > 2013-03-21 20:13:27.807 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.29046.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{erlang,crc32,2}},{message_queue_len,0}]
> > {#Port<0.21529382>,'[email protected]'}
> > 2013-03-21 20:13:27.843 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.21168.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.6629407>,'[email protected]'}
> > 2013-03-21 20:13:30.626 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.29558.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.6629407>,'[email protected]'}
> > 2013-03-21 20:13:30.771 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.24361.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.4656154>,'[email protected]'}
> > 2013-03-21 20:13:34.447 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.29558.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.4656154>,'[email protected]'}
> > 2013-03-21 20:13:36.210 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.6726.946>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.4656154>,'[email protected]'}
> > 2013-03-21 20:13:36.501 [info]
> > <0.12276.176>@riak_core_sysmon_handler:handle_event:85 monitor
> > busy_dist_port <0.32186.176>
> >
> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm,loop,7}},{message_queue_len,0}]
> > {#Port<0.6629407>,'[email protected]'}
> >
> > I guess I have a problem with my network config...
> >
> > I should mention that the servers hosting my Riak cluster also run
> > Couchbase, Nginx and Elasticsearch, so there is a lot of traffic and
> > many connections.
> > /proc/sys/net/netfilter/nf_conntrack_count = 30-100K
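[Since these nodes already sit near the conntrack ceiling, a quick sanity check is to compare the tracked-connection count against the kernel limit. A minimal sketch — the 80% threshold is arbitrary, and the /proc paths are standard Linux locations, not anything confirmed in this thread:]

```shell
#!/bin/sh
# Warn when conntrack usage exceeds 80% of the table limit
# (threshold is an assumption, tune to taste).
conntrack_check() {
    count=$1
    max=$2
    if [ $((count * 100)) -ge $((max * 80)) ]; then
        echo "WARN: conntrack table ${count}/${max} is nearly full"
    else
        echo "OK: conntrack table ${count}/${max}"
    fi
}

# On a live host, feed in the real values:
#   conntrack_check "$(cat /proc/sys/net/netfilter/nf_conntrack_count)" \
#                   "$(cat /proc/sys/net/netfilter/nf_conntrack_max)"
conntrack_check 30000 65536
```

[When the table fills, the kernel drops new connections — "nf_conntrack: table full, dropping packet" in dmesg — which would look exactly like sockets being interrupted mid-transfer; raising net.netfilter.nf_conntrack_max via sysctl is the usual remedy.]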
> >
> >
> >
> > --
> > Godefroy de Compreignac
> >
> > Eklaweb CEO - www.eklaweb.com
> > EklaBlog CEO - www.eklablog.com
> >
> > +33(0)6 11 89 13 84
> > http://www.linkedin.com/in/godefroy
> > http://twitter.com/Godefroy
> >
> >
> > 2013/3/21 Evan Vigil-McClanahan <[email protected]>
> >>
> >> It could be a large number of things, unfortunately. Going through
> >> them all is somewhat outside my skill set. Maybe someone more
> >> network-savvy can provide some pointers?
> >>
> >> Perhaps check with your network admin, or, as a test, turn any
> >> software firewalls on your nodes completely off?
> >>
> >> On Thu, Mar 21, 2013 at 11:44 AM, Godefroy de Compreignac
> >> <[email protected]> wrote:
> >> > But I don't understand what could stop transfers. Maybe a kernel
> >> > setting?
> >> > How could I find out?
> >> >
> >> > --
> >> > Godefroy de Compreignac
> >> >
> >> > Eklaweb CEO - www.eklaweb.com
> >> > EklaBlog CEO - www.eklablog.com
> >> >
> >> > +33(0)6 11 89 13 84
> >> > http://www.linkedin.com/in/godefroy
> >> > http://twitter.com/Godefroy
> >> >
> >> >
> >> > 2013/3/21 Evan Vigil-McClanahan <[email protected]>
> >> >>
> >> >> Handoff is done by default on port 8099.
> >> >>
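[For reference, handoff bypasses HAProxy entirely; the listener is configured in the riak_core section of app.config. A minimal sketch — 8099 is the default, and the surrounding file layout is assumed from standard Riak 1.x packaging:]

```
%% /etc/riak/app.config (assumed path), riak_core section.
%% Handoff listener: separate from the HTTP (8098) and PB client ports.
{riak_core, [
    {handoff_port, 8099}
]}
```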
> >> >> I guess what I'm getting at here is that this doesn't look like an
> >> >> obvious Riak problem; it's more likely that something on your network
> >> >> or on your nodes is closing or interrupting those sockets. You'd most
> >> >> likely get a different error if something internal to Riak were
> >> >> causing the transfers to fail.
> >> >>
> >> >> On Thu, Mar 21, 2013 at 10:09 AM, Godefroy de Compreignac
> >> >> <[email protected]> wrote:
> >> >> > The only limitation I can see is HAProxy, which has these timeouts:
> >> >> > contimeout 5000
> >> >> > clitimeout 50000
> >> >> > srvtimeout 3600000
> >> >> >
> >> >> > But HAProxy serves Riak on port 8098, and I configured Riak to use
> >> >> > port
> >> >> > 8097:
> >> >> > {pb_port, 8087 }
> >> >> > {http, [ {"5.39.68.152", 8097 } ]}
> >> >> >
> >> >> > So I guess Riak uses only port 8097 internally, without any
> >> >> > limitation.
> >> >> >
> >> >> > And checking the logs, I see that a vnode transfer fails after a
> >> >> > random duration, sometimes only a few minutes.
> >> >
> >> >
> >
> >
>
_______________________________________________
riak-users mailing list
[email protected]
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com