Yes, I know about that extension... I found out about it before it was put up, by browsing some of Wikia's source code.

I personally had some issues with the extension. The family files it generated were largely redundant. All of them were prefixed with wikia_ (which breaks all my bots that use 'anime', etc.), a prefix which is redundant when you're already targeting Wikia. And because of how they were generated, they had a large redundant list of namespaces which don't need to be specified because they're already part of family.py. I created a wikia_basefamily.py which all my family files now inherit from. This means that we only need to specify special flags like the required self.wikia['projectns'], which sets both the project namespace and its talk namespace (note that this is geared towards English wikis only; other languages use a different format. I added it this way because most of Wikia's wikis are English). More importantly, there are flags like self.wikia['smw'], which when set to True automatically populates the list of SMW namespaces, starting at 300. A different starting namespace can also be specified instead.
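For illustration, here is a minimal sketch of what such a base class might look like. The self.wikia['projectns'] and self.wikia['smw'] keys are from the description above; everything else (including the setup() hook and the smw_start key) is my own guess at the shape, based on the old pywikipedia family-file format:

# wikia_basefamily.py -- illustrative sketch only, not the actual file.
import family

class Family(family.Family):
    def __init__(self):
        family.Family.__init__(self)
        self.wikia = {}  # subclasses fill this in, then call setup()

    def setup(self):
        # Required: the project namespace name, e.g. 'Animepedia'.
        project = self.wikia['projectns']
        self.namespaces[4] = {'_default': project}            # Project
        self.namespaces[5] = {'_default': project + ' talk'}  # Project talk
        # Optional: Semantic MediaWiki namespaces, starting at 300
        # unless a different starting namespace is given.
        if self.wikia.get('smw'):
            start = self.wikia.get('smw_start', 300)
            for offset, name in enumerate(('Relation', 'Property', 'Type')):
                self.namespaces[start + offset * 2] = {'_default': name}
                self.namespaces[start + offset * 2 + 1] = \
                    {'_default': name + ' talk'}

A family file inheriting from it then shrinks to something like:

import wikia_basefamily

class Family(wikia_basefamily.Family):
    def __init__(self):
        wikia_basefamily.Family.__init__(self)
        self.name = 'anime'
        self.langs = {'en': 'anime.wikia.com'}
        self.wikia = {'projectns': 'Animepedia', 'smw': True}
        self.setup()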

Actually, I find how pywikibot does things largely pointless. The API itself could be used to grab all of the data that pywiki forces you to manually enter into family files. All you would need to give it is a base URL, and it should be able to generate everything else by itself.
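A rough sketch of that idea, assuming the wiki serves api.php directly under the base URL (as Wikia wikis do); the function name and the shape of the returned dict are mine:

import urllib
import xml.dom.minidom

def fetch_site_data(base_url):
    # One API request gets everything a family file hand-codes.
    query = ('/api.php?action=query&meta=siteinfo'
             '&siprop=general|namespaces&format=xml')
    doc = xml.dom.minidom.parseString(urllib.urlopen(base_url + query).read())
    general = doc.getElementsByTagName('general')[0]
    namespaces = {}
    for ns in doc.getElementsByTagName('ns'):
        key = int(ns.getAttribute('id'))
        # The main namespace (id 0) has no text content.
        namespaces[key] = ns.firstChild.data if ns.firstChild else ''
    return {
        'version': general.getAttribute('generator'),
        'lang': general.getAttribute('lang'),
        'namespaces': namespaces,
    }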

Honestly, I'm thinking of trying to make it work more like SVN/Git commands:

*Remote name aliases:*

$ pywiki remote "anime" http://en.anime.wikia.com
Added name alias "anime" to base url [http://en.anime.wikia.com]
$ pywiki remote "anime" http://anime.wikia.com
Error: The name alias "anime" already exists.
$ pywiki remote -f "anime" http://anime.wikia.com
Changed name alias "anime" from [http://en.anime.wikia.com] to 
[http://anime.wikia.com]
$ pywiki remote -r "anime"
Removed name alias "anime" from base url [http://anime.wikia.com]
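
A minimal sketch of the storage a remote subcommand like this could sit on top of, assuming a plain tab-separated name-to-URL file under ~/.pywiki (the path and helper names are hypothetical):

import os

REMOTES_FILE = os.path.expanduser('~/.pywiki/remotes')

def load_remotes():
    # One "name<TAB>url" pair per line.
    remotes = {}
    if os.path.exists(REMOTES_FILE):
        for line in open(REMOTES_FILE):
            name, url = line.rstrip('\n').split('\t', 1)
            remotes[name] = url
    return remotes

def save_remotes(remotes):
    dirname = os.path.dirname(REMOTES_FILE)
    if not os.path.isdir(dirname):
        os.makedirs(dirname)
    f = open(REMOTES_FILE, 'w')
    for name, url in remotes.items():
        f.write('%s\t%s\n' % (name, url))
    f.close()

def add_remote(name, url, force=False):
    remotes = load_remotes()
    if name in remotes and not force:
        raise ValueError('The name alias "%s" already exists.' % name)
    remotes[name] = url
    save_remotes(remotes)

The same file-per-concern pattern would back the user subcommand further down, with a second file mapping each alias to its normal and sysop usernames.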

*Getting the site data:*

$ pywiki sitedata show "anime"
There is no site data saved for "anime"
$ pywiki sitedata update "anime"
Getting site data for "anime" from 
http://anime.wikia.com/api.php?action=query&meta=siteinfo&siprop=general|namespaces&format=xmlfm
$ pywiki sitedata show "anime"
Version: 1.12alpha
Lang: en
Namespaces:
(-2) "Media"
(-1) "Special"
(0) ""
(1) "Talk"
(2) "User"
(3) "User talk"
(4) "Animepedia"
(5) "Animepedia talk"
(6) "Image"
(7) "Image talk"
(8) "MediaWiki"
(9) "MediaWiki talk"
(10) "Template"
(11) "Template talk"
(12) "Help"
(13) "Help talk"
(14) "Category"
(15) "Category talk"
(110) "Forum"
(111) "Forum talk"
(112) "World"
(113) "World talk"
(300) "Relation"
(301) "Relation talk"
(302) "Property"
(303) "Property talk"
(304) "Type"
(305) "Type talk"
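
Behind the scenes, sitedata update/show might be nothing more than the earlier fetch_site_data() sketch plus a per-alias cache; something like this (pickle and the cache path are my own choices):

import os
import pickle

def sitedata_path(alias):
    return os.path.expanduser('~/.pywiki/sitedata/%s' % alias)

def update_sitedata(alias, base_url):
    data = fetch_site_data(base_url)  # hypothetical helper sketched earlier
    path = sitedata_path(alias)
    dirname = os.path.dirname(path)
    if not os.path.isdir(dirname):
        os.makedirs(dirname)
    pickle.dump(data, open(path, 'wb'))

def show_sitedata(alias):
    path = sitedata_path(alias)
    if not os.path.exists(path):
        print 'There is no site data saved for "%s"' % alias
        return
    data = pickle.load(open(path, 'rb'))
    print 'Version:', data['version']
    print 'Lang:', data['lang']
    print 'Namespaces:'
    for key in sorted(data['namespaces']):
        print '(%d) "%s"' % (key, data['namespaces'][key])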

*Setting up users:*

$ pywiki user show "anime"
There is no user set for "anime"
$ pywiki user set "anime" "AnimeBot"
Set [[User:AnimeBot]] as normal user for "anime"
$ pywiki user set -s "anime" "AnimeBotSys"
Set [[User:AnimeBotSys]] as sysop user for "anime"
$ pywiki user show "anime"
Normal: AnimeBot
Sysop: AnimeBotSys
$ pywiki user clean "anime"
Unset users for "anime"
$ pywiki user set "anime" "AnimeBot" "AnimeBotSys"
Set [[User:AnimeBot]] as normal user for "anime"
Set [[User:AnimeBotSys]] as sysop user for "anime"


And so on... All configured via the CLI instead of by messing with a config file you shouldn't need to touch.


Or I could always make it interactive: instead of using "pywiki command ...", use the interactive Python shell to create a bot shell.
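
If I went the interactive route, Python's standard cmd module would do most of the work; a quick sketch, with command names mirroring the hypothetical CLI above:

import cmd

class PywikiShell(cmd.Cmd):
    prompt = 'pywiki> '

    def __init__(self):
        cmd.Cmd.__init__(self)
        self.remotes = {}  # in-memory for the sketch; the real one would persist

    def do_remote(self, line):
        """remote NAME URL -- add a name alias for a base url."""
        try:
            name, url = line.split()
        except ValueError:
            print 'usage: remote NAME URL'
            return
        self.remotes[name] = url
        print 'Added name alias "%s" to base url [%s]' % (name, url)

    def do_quit(self, line):
        """quit -- leave the shell."""
        return True

if __name__ == '__main__':
    PywikiShell().cmdloop()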

~Daniel Friesen(Dantman) of:
-The Gaiapedia (http://gaia.wikia.com)
-Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG)
-and Wiki-Tools.com (http://wiki-tools.com)

Jesús Martínez wrote:
About family bot configuration: maybe I'm wrong, but some time ago I found on [[Special:Specialpages]] a special page about bot configuration, with namespaces and other stuff, like a family file. Maybe Wikia was testing some extension, but it's gone now.

A special page like this would be useful, though.

2008/4/27 DanTMan wrote:
 Took the tip, though I did the monobook force manually myself. That patch may have aged and missed new things; also, the one for Export was redundant (Export is a raw page with no skin).

 Actually, I was planning on doing something one up on what you thought of. Though, when I first thought of it, the pywikipediabot guys rejected it.
 I was going to tweak the function that grabs family files, so that when you use -family:wikia_name and it cannot find a Wikia family, it falls back to a special class which uses the API to grab namespace and other information about the wiki. It then saves that data to a flat file (what's the point of building a brand new family file for each wiki when you can just store it all in a flat file?), and thus the framework works all over Wikia without ever needing a family file, unless your wiki is desynced from Wikia (i.e. Uncyclopedia), or is one of the few which use the /w format because they moved to Wikia from another place and need link compatibility.
 Of course, that would be changed so that you would simply use -family:name when I get around to building that for this repo.
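
 For illustration, that fallback class might look roughly like this, reusing the hypothetical fetch_site_data() sketched earlier (the class name and cache location are made up):

import os
import pickle
import family

CACHE_DIR = os.path.expanduser('~/.pywiki/families')

class AutoFamily(family.Family):
    """Family built from the API instead of a hand-written family file."""
    def __init__(self, name, hostname):
        family.Family.__init__(self)
        self.name = name
        self.langs = {'en': hostname}
        cache = os.path.join(CACHE_DIR, name)
        if os.path.exists(cache):
            # Previously fetched wikis come from the flat-file cache.
            data = pickle.load(open(cache, 'rb'))
        else:
            data = fetch_site_data('http://' + hostname)
            if not os.path.isdir(CACHE_DIR):
                os.makedirs(CACHE_DIR)
            pickle.dump(data, open(cache, 'wb'))
        for key, nsname in data['namespaces'].items():
            self.namespaces[key] = {'_default': nsname}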

 As for your patch: I did some file searching before I made my change, but I did it in the reverse direction from you and found some other info:
 - export_address is the function which gives out the export page address.
 -- _GetAll.getData is the only thing which calls for that address (besides perhaps one of my scripts or something really odd we don't need to worry about).
 --- getall is the function which scripts actually use to grab multiple pages of data.
 I don't know what call tree leads replace.py to getall, because I didn't do enough hunting. However, you know the "Getting ## pages from somesite..." output that shows up in replace.py to indicate data being grabbed? That is part of getall, and that's the only place which produces that output.

 The issue with editing get, other than the fact that you are also applying that replacement to the edit-page text you should not be applying it to, is that there are seven other scripts which call getall. Not only that, but that includes the pagegenerators (which are likely what replace.py actually uses to get the data), and even the disambiguation scripts. So by editing get instead of _GetAll, the scripts which use getall or a page generator like they are supposed to are all still broken.

 ~Daniel Friesen(Dantman) of:
-The Gaiapedia (http://gaia.wikia.com)
-Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG)
-and Wiki-Tools.com (http://wiki-tools.com)

 C Stafford wrote:
 re:
http://svn.nadir-point.com/viewvc/wikia-pywikibot/wikipedia.py?r1=10&r2=9&pathrev=10
see
https://sourceforge.net/tracker/index.php?func=detail&aid=1885569&group_id=93107&atid=603139

also, in general
https://sourceforge.net/tracker/index.php?func=detail&aid=1916496&group_id=93107&atid=603141


Since I was doing random bot work all around Wikia, I got tired of manually creating a family file for each wiki, especially when I knew I was only going to be botting there this one time. I figured it would be worth the time to write something to generate the family files dynamically (well, with some caching) from the wiki's current namespace structure (yay Special:Export), rather than to keep an archive of only certain ones.

So I wrote one. It works quite nicely too: a random sample URL from the wiki goes in, a family file comes out.
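
(The core trick, for the curious: the Special:Export output for any page embeds a <siteinfo> block listing every namespace, so one sample page is enough to recover the wiki's namespace structure. A sketch, assuming the wiki uses the /wiki/ path; the names here are illustrative:)

import urllib
import xml.dom.minidom

def namespaces_from_export(base_url, sample_page='Main_Page'):
    # The export XML carries <namespace key="..."> elements in <siteinfo>.
    url = '%s/wiki/Special:Export/%s' % (base_url, sample_page)
    doc = xml.dom.minidom.parseString(urllib.urlopen(url).read())
    namespaces = {}
    for ns in doc.getElementsByTagName('namespace'):
        key = int(ns.getAttribute('key'))
        namespaces[key] = ns.firstChild.data if ns.firstChild else ''
    return namespaces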

Sadly, it's only here on my local machine right now (I had it online briefly for some testing; thanks again manticore and jack). I'll see what I can do to get it cleaned up (and protected against attacks), give it a better interface, and put it online for people.

-ubrfzy
_______________________________________________
Wikia-l mailing list
[email protected]
http://lists.wikia.com/mailman/listinfo/wikia-l
