Regarding family bot configuration: I may be wrong, but some time ago I
found a special page on [[Special:Specialpages]] about bot
configuration, with namespaces and other data, much like a family file.
Maybe Wikia was testing some extension, but it's gone now.

A special page like this would be useful, though.

2008/4/27 DanTMan wrote:
>
>  Took the tip, though I did the monobook force manually myself. That patch
> may have aged and missed new things; also, the one for Export was redundant
> (Export is a raw page with no skin).
>
>  Actually, I was planning on doing something one up on what you thought of,
> though when I first suggested it, the pywikipediabot guys rejected it.
>  I was going to tweak the function that grabs family files so that when you
> use -family:wikia_name and it cannot find a Wikia family, it falls back to a
> special class which uses the API to grab namespaces and other information
> about the wiki. It would then save that data to a flat file (what's the
> point of building a brand-new family file for each wiki when you can just
> store it all in a flat file?), so the framework would work all over Wikia
> without ever needing a family file, unless your wiki is desynced from Wikia
> (e.g. Uncyclopedia) or is one of the few which use the /w format because
> they moved to Wikia from somewhere else and need link compatibility.
>  Of course, that would be changed so that you would simply use -family:name
> once I get around to building that for this repo.
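The fallback Daniel describes could look roughly like this. This is a minimal sketch, not pywikipediabot code: the function and file names are made up, it assumes the wiki exposes the standard MediaWiki API at `/api.php` (the `meta=siteinfo` query), and the "flat file" is just JSON on disk.

```python
import json
import os
import urllib.request

API_PATH = "/api.php"  # assumption; wikis using the /w format would need /w/api.php


def parse_namespaces(siteinfo):
    """Turn the API siteinfo response into a simple {id: name} map."""
    return {int(k): v.get("*", "")
            for k, v in siteinfo["query"]["namespaces"].items()}


def load_cached_family(path):
    """Return namespace data from the flat file, or None if not cached yet."""
    if os.path.exists(path):
        with open(path) as f:
            return {int(k): v for k, v in json.load(f).items()}
    return None


def save_family(path, namespaces):
    """Persist namespace data so it is fetched at most once per wiki."""
    with open(path, "w") as f:
        json.dump(namespaces, f)


def fetch_namespaces(host):
    """Ask the wiki's API for its namespace table (network call)."""
    url = ("http://%s%s?action=query&meta=siteinfo"
           "&siprop=namespaces&format=json") % (host, API_PATH)
    with urllib.request.urlopen(url) as resp:
        return parse_namespaces(json.loads(resp.read()))


def get_family(name, cache_dir="families"):
    """Use the cached flat file if present; otherwise hit the API and cache."""
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, name + ".json")
    cached = load_cached_family(path)
    if cached is not None:
        return cached
    namespaces = fetch_namespaces("%s.wikia.com" % name)
    save_family(path, namespaces)
    return namespaces
```

With something like this behind -family:name, the first run queries the API and every later run reads the flat file, which is the point Daniel makes about not hand-building a family file per wiki.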
>
>  As for your patch:
>  I did some file searching before I made my change, but I did it in the
> reverse direction from you and found some other info:
>  - export_address is the function which gives out the export page address.
>  -- _GetAll.getData is the only thing which calls for that address (besides
> perhaps one of my scripts or something really odd we don't need to worry
> about).
>  --- getall is the function which scripts actually use to grab multiple
> pages of data.
>  I don't know what call chain leads replace.py to getall, because I didn't
> do enough hunting. However, you know the "Getting ## pages from somesite..."
> output that replace.py prints to indicate data being grabbed? That is part
> of getall, and that's the only place which produces that output.
>
>  The issue with editing get, aside from the fact that you are also applying
> that replacement to the edit-page text you should not be applying it to, is
> that there are 7 other scripts which call getall. Not only that, but that
> includes the pagegenerators (which are likely what replace.py is actually
> using to get the data), and even the disambiguation scripts. So by editing
> get instead of _GetAll, the scripts which use getall or a page generator as
> they are supposed to are all still broken.
>
>  ~Daniel Friesen(Dantman) of:
> -The Gaiapedia (http://gaia.wikia.com)
> -Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG)
> -and Wiki-Tools.com (http://wiki-tools.com)
>
>  C Stafford wrote:
>  re:
> http://svn.nadir-point.com/viewvc/wikia-pywikibot/wikipedia.py?r1=10&r2=9&pathrev=10
> see
> https://sourceforge.net/tracker/index.php?func=detail&aid=1885569&group_id=93107&atid=603139
>
> also, in general
> https://sourceforge.net/tracker/index.php?func=detail&aid=1916496&group_id=93107&atid=603141
>
>
> Since I was doing random bot work all around Wikia, I got tired of
> manually creating a family file for each wiki, especially when I knew
> I was only going to be botting there this one time.
> I figured it would be worth the time to write something to generate
> the family files dynamically (well, with some caching) from the wiki's
> current namespace structure (yay Special:Export), rather than keep an
> archive of only certain ones.
>
> So I wrote one. It works quite nicely too: a random sample URL from the
> wiki goes in, a family file comes out.
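The Special:Export trick works because the export XML's <siteinfo> block lists the wiki's full namespace table. A rough sketch of pulling it out (the element names come from the MediaWiki export format; the function name is made up, and real export dumps may use a different schema version in the xmlns):

```python
import xml.etree.ElementTree as ET


def namespaces_from_export(xml_text):
    """Extract the {id: name} namespace table from a Special:Export dump."""
    root = ET.fromstring(xml_text)
    # Export XML carries a version-specific default XML namespace;
    # recover it from the root tag so lookups work for any schema version.
    ns_uri = root.tag.split("}")[0].strip("{") if root.tag.startswith("{") else ""
    qualify = lambda tag: "{%s}%s" % (ns_uri, tag) if ns_uri else tag
    result = {}
    for elem in root.iter(qualify("namespace")):
        # The main namespace (key 0) has no text content, hence the "or ''".
        result[int(elem.get("key"))] = elem.text or ""
    return result
```

Feed it the XML of any exported page from the target wiki and you have the data a family file needs, which is presumably the core of what the generator does.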
>
> Sadly, it's only here on my local machine right now (I had it online
> briefly for some testing; thanks again manticore and jack). I'll see
> what I can do to get it cleaned up (and protected against attacks),
> give it a better interface, and put it online for people.
>
> -ubrfzy
> _______________________________________________
> Wikia-l mailing list
> [email protected]
> http://lists.wikia.com/mailman/listinfo/wikia-l
