Hoi,
<grin> Markus we agree. </grin> Given that the lag of updates is
measurable, it would be good to have an algorithm that allows bots to
negotiate their speed and thereby maximise throughput. Once such an
algorithm is built into bot environments such as pywiki, any and all
pywiki bots can safely go wild and do the good they are known for. <grin>
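(MediaWiki does in fact expose such a mechanism: a client can send a
maxlag=N parameter, and the server refuses the request with a "maxlag"
error whenever database replication lag exceeds N seconds, so a bot can
retry with backoff instead of piling on load. A minimal sketch of the
client-side logic; the helper names and defaults here are illustrative,
not pywikibot's actual implementation:)

```python
MAX_LAG = 5  # seconds of replication lag to tolerate (a commonly used default)

def is_lagged(api_response, max_lag=MAX_LAG):
    """Return the reported lag in seconds if the API refused the request
    because of replication lag, else None.

    With maxlag=N in the request, MediaWiki answers a lagged request with
    {"error": {"code": "maxlag", "lag": <seconds>, ...}} instead of
    performing the action.
    """
    error = api_response.get("error", {})
    if error.get("code") == "maxlag":
        return error.get("lag", max_lag)
    return None

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Exponential backoff between retries: 1 s, 2 s, 4 s, ... capped at `cap`."""
    return min(cap, base * 2 ** attempt)
```

(A bot would then wrap every write in a loop: send the request with
maxlag set, and on a maxlag error sleep for backoff_delay(attempt)
before retrying, so its throughput automatically tracks what the site
can currently absorb.)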
Thanks,
GerardM
On 19 November 2015 at 11:06, Markus Krötzsch <[email protected]> wrote:
> On 19.11.2015 10:40, Gerard Meijssen wrote:
>
>> Hoi,
>> Because once it is a requirement and not a recommendation, it will be
>> impossible to reverse this. The insidious creep of more rules and
>> requirements will make Wikidata increasingly less of a wiki. Arguably
>> most of the edits done by bot are of a higher quality than those done by
>> hand. It is for the people maintaining the SPARQL environment to ensure
>> that it is up to the job, as it does not affect Wikidata itself.
>>
>> I think each of these arguments holds its own. Together they are
>> hopefully potent enough to prevent such silliness.
>>
>
> Maybe it would not be that bad. I actually think that many bots right now
> are slower than they could be because they are afraid to overload the site.
> If bots checked the lag, they could operate close to the maximum load
> that the site can currently handle, which is probably more than most bots
> are doing now.
>
> The "requirement" vs. "recommendation" thing is maybe not so relevant,
> since bot rules (mandatory or not) are currently not enforced in any strong
> way. Basically, the whole system is based on mutual trust and this is how
> it should stay.
>
> Markus
>
>
> _______________________________________________
> Wikidata mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>