It's rather easy to write in pywikibot; I just need some information from
you about your wiki (e.g. are all edits after X date bad, do we only have
Y valid users and here are their names). Details like that allow me
to tailor the script to your needs.
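A minimal sketch of the selection logic such a script might use, assuming the two criteria mentioned above (a cutoff date and a whitelist of valid users). All names, dates, and edits here are hypothetical; a real pywikibot script would pull authors and timestamps from the wiki's API rather than a hard-coded list.

```python
from datetime import datetime, timezone

# Hypothetical criteria of the kind described above: every edit after a
# cutoff date is spam unless it was made by a known-good user.
CUTOFF = datetime(2012, 6, 1, tzinfo=timezone.utc)
VALID_USERS = {"Yury", "John"}

def is_spam(author, timestamp):
    """Flag an edit for deletion if it postdates the cutoff and was not
    made by one of the whitelisted users."""
    return timestamp > CUTOFF and author not in VALID_USERS

# A few fake (author, timestamp) edits to exercise the filter
edits = [
    ("Yury",    datetime(2012, 7, 1, tzinfo=timezone.utc)),  # whitelisted
    ("Sp4mB0t", datetime(2012, 7, 2, tzinfo=timezone.utc)),  # spam
    ("Sp4mB0t", datetime(2012, 5, 1, tzinfo=timezone.utc)),  # before cutoff
]
to_nuke = [(a, t) for a, t in edits if is_spam(a, t)]
print(to_nuke)  # only the post-cutoff, non-whitelisted edit remains
```

The whitelist-plus-cutoff test is the whole trick: with those two facts pinned down, mass deletion is just iterating pages and calling the delete API on whatever the filter flags.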

On Fri, Aug 24, 2012 at 12:03 PM, Yury Katkov <[email protected]> wrote:
> Hi John, thanks! Take your time! If you already have such a script
> and can share it, please do! If not, I think it will be a good
> exercise in pywikipediabot or extension development for me.
> -----
> Yury Katkov
>
>
>
> On Fri, Aug 24, 2012 at 7:55 PM, John <[email protected]> wrote:
>> Like I said, if you want I can whip up a script to nuke the spam;
>> just drop me an email off-list.
>>
>> On Fri, Aug 24, 2012 at 11:54 AM, Yury Katkov <[email protected]> wrote:
>>> Here is the manual on how to purge the archive database:
>>> http://www.mediawiki.org/wiki/Manual:Reduce_size_of_the_database
>>> Thanks John, that's a perfect solution!
>>> -----
>>> Yury Katkov
>>>
>>>
>>>
>>> On Fri, Aug 24, 2012 at 7:51 PM, John <[email protected]> wrote:
>>>> What can be done after mass deleting is to purge the archive database
>>>> table, which should reduce the database size significantly. If you take
>>>> a look at the example where I cleaned up an existing site, I
>>>> reduced the database size by about 90%.
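[The effect John describes can be demonstrated in miniature. MediaWiki runs on MySQL/MariaDB, where the analogous steps would be `DELETE FROM archive` followed by `OPTIMIZE TABLE archive`; the sketch below uses Python's stdlib sqlite3 purely to illustrate the principle, with fabricated spam rows. The point: deleting rows alone leaves the file size untouched, and the space is only reclaimed by the vacuum/optimize pass afterwards.]

```python
import os
import sqlite3
import tempfile

# Build a throwaway database with a mock "archive" table full of spam text.
path = os.path.join(tempfile.mkdtemp(), "wiki.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE archive (ar_id INTEGER PRIMARY KEY, ar_text TEXT)")
conn.executemany(
    "INSERT INTO archive (ar_text) VALUES (?)",
    [("spam page text " * 100,) for _ in range(5000)],
)
conn.commit()
size_full = os.path.getsize(path)

# Deleting the rows does NOT shrink the file: the pages just move to
# the database's internal free list.
conn.execute("DELETE FROM archive")
conn.commit()
size_deleted = os.path.getsize(path)

# Only a VACUUM (OPTIMIZE TABLE in MySQL) rebuilds the file and
# actually returns the space to the filesystem.
conn.execute("VACUUM")
size_vacuumed = os.path.getsize(path)
conn.close()

print(size_full, size_deleted, size_vacuumed)
```

This is why mass-deleting the spam pages must be followed by a purge of the archive table and an optimize pass, as the manual linked above describes.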
>>>>
>>>> On Fri, Aug 24, 2012 at 11:47 AM, Yury Katkov <[email protected]> 
>>>> wrote:
>>>>> Hi everyone! I agree with everyone in this thread, but the main
>>>>> problem is that even if I create a bot or use extensions that remove
>>>>> pages, the actual database records won't be deleted. If I understand
>>>>> correctly, the MediaWiki philosophy tells us that we cannot simply
>>>>> drop a page or an account from the database - deletion only hides
>>>>> those nasty spam pages.
>>>>>
>>>>> Consequently, after the deletions the size of my database won't
>>>>> shrink back to the original 100 MB; it remains around 3 GB, which is
>>>>> a problem for hosting.
>>>>>
>>>>> The proposed solution of exporting all the pages to a brand-new wiki
>>>>> solves this problem. Are there any other solutions that don't involve
>>>>> dropping my old spammed database?
>>>>> -----
>>>>> Yury Katkov
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 24, 2012 at 4:13 PM, John <[email protected]> wrote:
>>>>>> Given enough facts it would be rather easy for me to write a script
>>>>>> that nukes said spam; I did something similar on
>>>>>> http://manual.fireman.com.br/wiki/Especial:Registro/Betacommand
>>>>>>
>>>>>> _______________________________________________
>>>>>> Wikitech-l mailing list
>>>>>> [email protected]
>>>>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
