Re: [Wikitech-l] Maps extension graphical editor.
Slight user-experience improvement you might want to make: I'd make it so that clicking a tool icon a second time deselects it. I'd also remove the hand tool and make its function the default when no tool is selected; that icon seems out of place to me, since all the other icons are things I can create.

On Thu, May 31, 2012 at 1:33 PM, Kim Eik k...@heldig.org wrote:
> Just did some updates on it: added a slider to handle the opacity fields, and a color picker for color fields.

On Thu, May 31, 2012 at 11:48 AM, Strainu strain...@gmail.com wrote:
> Looks like some email bug. :) Let's try without anything behind the URL: http://ec2-46-137-28-172.eu-west-1.compute.amazonaws.com/static/google-draw2.html

2012/5/31 Daniel Werner daniel.wer...@wikimedia.de:
> Looks good and helpful to me. One thing not working yet is the marker icons switching color when assigned to a group. You could take the markers from the Maps extension directly for that. By the way, the URL is without the "-" at the end.

2012/5/31 Ole Palnatoke Andersen palnat...@gmail.com:
> URL correction: http://ec2-46-137-28-172.eu-west-1.compute.amazonaws.com/static/google-draw2.html - there was no space between the URL and "and".

On Thu, May 31, 2012 at 8:49 AM, Kim Eik k...@heldig.org wrote:
> Hi guys, I have created a simple map editor which works with the Maps extension, and I'm looking for some feedback on your impression of it. Please take a look @ http://ec2-46-137-28-172.eu-west-1.compute.amazonaws.com/static/google-draw2.htmland let me know what you think. Also, please note it's a work in progress. My idea is to implement this as a special page in the Maps extension so that people can easily create and edit maps. Cheers, Kim

-- Jon Robson http://jonrobson.me.uk @rakugojon
[Wikitech-l] Give create gerrit repo right to all WMF engineers
Hi all, Ryan Lane just showed me that in Gerrit there is a separate right for creating repositories. I suggest we give this right to all WMF engineers. A repo is free and fun and will prevent unnecessary delays. Best, Diederik
Re: [Wikitech-l] Give create gerrit repo right to all WMF engineers
On Fri, Jun 1, 2012 at 10:33 AM, Diederik van Liere dvanli...@gmail.com wrote:
> Ryan Lane just showed me that in Gerrit there is a separate right for creating repositories. I suggest we give this right to all WMF engineers. A repo is free and fun and will prevent unnecessary delays.

For the record, this is a reference to https://gerrit.wikimedia.org/r/#/admin/groups/119,info -Jeremy
Re: [Wikitech-l] Give create gerrit repo right to all WMF engineers
I don't want to give this right to all engineers, because setting up a new repository is more than just choosing the name. There's also the issue of understanding how Gerrit permissions work so you can set them up properly. I did make a new Project Creators group that I'm more than willing to add people to, once they've learned Gerrit permissions. In addition, unless you make a group you're in the owner of the repo (which can't be done via the GUI, only the CLI; this is a bug), you won't be able to set permissions at all (this is by design). So yeah, it's not as easy as it sounds, so I don't want to hand this out en masse. In an ideal world, I want us to have a special page where people can request repos and we can automate the icky backend stuff. -Chad

On Jun 1, 2012 10:33 AM, Diederik van Liere dvanli...@gmail.com wrote:
> Hi all, Ryan Lane just showed me that in Gerrit there is a separate right for creating repositories. I suggest we give this right to all WMF engineers. [...]
Re: [Wikitech-l] Give create gerrit repo right to all WMF engineers
Could you please add David Schoonover and Andrew Otto to the Project Creators group? Best, Diederik

On 2012-06-01, at 5:41 PM, Chad wrote:
> I don't want to give this right to all engineers because setting up new repositories is more than just choosing the name. [...]
[Wikitech-l] Update on IPv6
Hi all,

June 6, 2012 is IPv6 Day ( http://www.worldipv6day.org/ ). The goal of this global event is to move more ISPs, equipment manufacturers and web services to permanent adoption of IPv6. We're planning to do limited production testing of IPv6 during the Berlin Hackathon 2012 (June 2-3). Provided that the number of issues we encounter is manageable, we may fully enable IPv6 on IPv6 Day, and keep it enabled.

MediaWiki has been used with IPv6 by third-party wikis for some time. Wikimedia uses a set of additional features (GlobalBlocking, CheckUser, etc.) which weren't fully IPv6-ready until recently. In addition, we're working to ensure that all of Wikimedia's various services (mailing lists, blogs, etc.) are IPv6-ready.

== What's the user impact going to be? ==

At least in the June 2-3, 2012 time window, you may see a small number of edits from IPv6 addresses, which are in the form 2001:0db8:85a3:0000:0000:8a2e:0370:7334. See [[w:IPv6 address]]. These addresses should behave as any other IP address would: you can leave messages on their talk pages, you can track their contributions, and you can block them. CIDR notation is supported for rangeblocks.

An important note about blocking: a single user may have access to a much larger number of addresses than in the IPv4 model. This means that range blocks (e.g. an address with /64) will have to be applied in more cases to prevent abuse by more sophisticated users. In the mid term, user scripts and tools that use simple regular expressions to match IPv4 addresses will need to be adapted for IPv6 support to behave correctly.

We suspect that IPv6 usage is going to be very low initially, meaning that abuse should be manageable, and we will assist in monitoring the situation. User:Jasper Deng is maintaining a comprehensive analysis of the long-term implications of the IPv6 migration here: https://en.wikipedia.org/wiki/User:Jasper_Deng/IPv6

We've set up a test wiki where you can see IPv6 addresses in action. It works by assigning you a fake IPv6 address the moment you visit the wiki, and it lets you see how various tools behave with the new address format: http://ipv6test.wmflabs.org/wiki/index.php/Main_Page

The best way to report issues is to file them in Bugzilla and to ensure that they are marked as blockers for the IPv6 tracking bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=35540

We'll post updates to wikitech-l and elsewhere as appropriate.

All best, Erik

-- Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
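[Editorial note: the snippet below is an illustration of the point about IPv4-only regular expressions, not part of the original announcement.] A gadget or user script that recognizes anonymous contributors by matching a dotted-quad pattern will silently misclassify IPv6 edits. In PHP, filter_var() can validate both address families, so an adapted check might look like this minimal sketch:

```php
<?php
// Illustration only (not from the original mail): why IPv4-only regular
// expressions in user scripts and tools need updating for IPv6.

function classifyIp( $addr ) {
	if ( filter_var( $addr, FILTER_VALIDATE_IP, FILTER_FLAG_IPV4 ) !== false ) {
		return 'IPv4';
	}
	if ( filter_var( $addr, FILTER_VALIDATE_IP, FILTER_FLAG_IPV6 ) !== false ) {
		return 'IPv6';
	}
	return 'not an IP address';
}

// A typical dotted-quad pattern from an existing tool:
$ipv4Only = '/^\d{1,3}(\.\d{1,3}){3}$/';

$v6 = '2001:0db8:85a3:0000:0000:8a2e:0370:7334';
var_dump( (bool)preg_match( $ipv4Only, $v6 ) ); // bool(false): the edit is missed
echo classifyIp( $v6 ), "\n";                    // IPv6
echo classifyIp( '192.0.2.16' ), "\n";           // IPv4
```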
[Wikitech-l] Volunteers for Wikimania mobile app?
Greetings, As someone involved with Wikimania programming, I'm wondering if there are volunteers interested in developing a Wikimania mobile app. It may not be feasible given the short timeline, but I've seen a number of conference apps coming out over the past week and figured it should at least be pondered. :) Essentially the idea would be to include the schedule, local info, maps, etc.: things that can be pulled from the Wikimania 2012 wiki. Perhaps something PhoneGap-based? Any thoughts or interest? -greg aka varnent
[Wikitech-l] Making MW aware of 'view types'
As we're starting to migrate functionality provided by the MobileFrontend extension into MediaWiki core, we have to grapple with some existing limitations in MediaWiki. I'd like to propose a core change that I think will greatly increase MW's flexibility in rendering content for different 'view types'. By 'view type', I mean things like:

* The standard web browser view
* A print view
* A mobile view
* Other device-specific views, views for particularly slow connections, views with no graphics, etc.

We could use the idea of a 'view type' to allow us to do some cool stuff, for instance:

*Dynamic view-specific skin loading*

MobileFrontend already displays content from a mobile-specific skin. We could make it possible for MediaWiki to dynamically determine the skin to load based on the current view type, and provide a pattern for skin authors to follow for building view-type-specific skins, for example:

* Standard: SkinName.php
* Print: SkinNamePrint.php
* Mobile: SkinNameMobile.php

We could also provide a hook for people to define their own view types, as well as allow for configuration to define default skins for specific view types. We could then more easily build view types for specific devices, for instance. This would greatly lower the barrier of entry for someone who wants to write a mobile-specific skin, print-specific skin, etc., and would enrich the pool of available MW skins.

*View-specific functionality*

Different view types will likely have different bits of functionality that they require, separate from the rest of MW, that may not make sense to live in a skin. We could give each view type the opportunity to bootstrap view-specific functionality when appropriate. If we segment this functionality to load only for a specific view type, we can prevent unnecessary components from loading when they are not appropriate for a given view. I haven't yet really fleshed this concept out, and am very open to suggestions on implementation.

I started hacking a bit at a few core files to give a code example of how this might be implemented, at least for dynamically loading view-specific skins: http://pastie.org/4010775 I'd love to hear people's thoughts on this and to integrate feedback as we move forward with migrating MobileFrontend into MW core. For more info on the MobileFrontend-to-core migration, see http://www.mediawiki.org/wiki/Mobile_support_in_MediaWiki_core

-- Arthur Richards
Software Engineer, Mobile
[[User:Awjrichards]] IRC: awjr +1-415-839-6885 x6687
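[Editorial note: the pastie link above may no longer resolve. The following is a minimal, hypothetical sketch of the kind of change Arthur describes, not his actual patch; the function names, view-type strings, and the per-view-type default-skin array are all invented for illustration.]

```php
<?php
// Hypothetical sketch only: getViewType(), resolveSkinClass() and the
// view-type names are invented for illustration; this is not existing
// MediaWiki core code or the contents of the pastie.

/**
 * Guess the view type from request parameters and the User-Agent header.
 * In core this would sit behind WebRequest plus a hook, so extensions
 * could register their own view types.
 */
function getViewType( array $query, $userAgent ) {
	if ( !empty( $query['printable'] ) ) {
		return 'print';
	}
	// Crude placeholder; real detection would use a device database.
	if ( preg_match( '/Mobile|Android|iPhone/i', (string)$userAgent ) ) {
		return 'mobile';
	}
	return 'standard';
}

/**
 * Map a base skin plus a view type onto a skin class name, following the
 * SkinName / SkinNamePrint / SkinNameMobile convention from the proposal,
 * with an optional per-view-type default skin from configuration.
 */
function resolveSkinClass( $baseSkin, $viewType, array $defaultSkinPerViewType = array() ) {
	if ( isset( $defaultSkinPerViewType[$viewType] ) ) {
		$baseSkin = $defaultSkinPerViewType[$viewType];
	}
	$suffix = ( $viewType === 'standard' ) ? '' : ucfirst( $viewType );
	$class  = 'Skin' . ucfirst( $baseSkin ) . $suffix;
	// Fall back to the base skin when no view-specific skin exists.
	return class_exists( $class ) ? $class : 'Skin' . ucfirst( $baseSkin );
}

// Example: a mobile User-Agent with no explicit override resolves to
// SkinVectorMobile, if a skin author has defined such a class.
$viewType = getViewType( $_GET, isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '' );
echo resolveSkinClass( 'vector', $viewType ), "\n";
```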
Re: [Wikitech-l] [Xmldatadumps-l] XML dumps/Media mirrors update
I have the cron archiving running now every 30 minutes: http://ia700802.us.archive.org/34/items/wikipedia-delete-2012-06/ It is amazing how fast the stuff gets deleted on Wikipedia. What about the proposed deletes, are there categories for that? Thanks, Mike

On Wed, May 30, 2012 at 6:26 AM, Mike Dupont jamesmikedup...@googlemail.com wrote:
> https://github.com/h4ck3rm1k3/wikiteam code here

On Wed, May 30, 2012 at 6:26 AM, Mike Dupont jamesmikedup...@googlemail.com wrote:
> Ok, I merged the code from wikiteam and have a full-history dump script that uploads to archive.org; next step is to fix the bucket metadata in the script. mike

On Tue, May 29, 2012 at 3:08 AM, Mike Dupont jamesmikedup...@googlemail.com wrote:
> Well, I have now updated the script to include the XML dump in raw format. I will have to add more information to the archive.org item, at least a basic readme. The other thing is that the pywikipediabot does not seem to support the full history, so I will have to move over to the wikiteam version and rework it. I just spent 2 hours on this, so I am pretty happy for a first version. mike

On Tue, May 29, 2012 at 1:52 AM, Hydriz Wikipedia ad...@alphacorp.tk wrote:
> This is quite nice, though the item's metadata is too little :)

On Tue, May 29, 2012 at 3:40 AM, Mike Dupont jamesmikedup...@googlemail.com wrote:
> First version of the script is ready: it gets the versions, puts them in a zip and puts that on archive.org. https://github.com/h4ck3rm1k3/pywikipediabot/blob/master/export_deleted.py Here is an example output: http://archive.org/details/wikipedia-delete-2012-05 http://ia601203.us.archive.org/24/items/wikipedia-delete-2012-05/archive2012-05-28T21:34:02.302183.zip I will cron this, and it should give a start on saving deleted data. Articles will be exported once a day, even if they were exported yesterday, as long as they are in one of the categories. mike

On Mon, May 21, 2012 at 7:21 PM, Mike Dupont jamesmikedup...@googlemail.com wrote:
> Thanks! And run that one time per day; they don't get deleted that quickly. mike

On Mon, May 21, 2012 at 9:11 PM, emijrp emi...@gmail.com wrote:
> Create a script that makes a request to Special:Export using this category as feed: https://en.wikipedia.org/wiki/Category:Candidates_for_speedy_deletion More info: https://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export [a sketch of this approach appears after the thread]

2012/5/21 Mike Dupont jamesmikedup...@googlemail.com:
> Well, I would be happy for items like this: http://en.wikipedia.org/wiki/Template:Db-a7 Would it be possible to extract them easily? mike

On Thu, May 17, 2012 at 2:23 PM, Ariel T. Glenn ar...@wikimedia.org wrote:
> There are a few other reasons articles get deleted: copyright issues, personal identifying data, etc. This makes maintaining the sort of mirror you propose problematic, although a similar mirror is here: http://deletionpedia.dbatley.com/w/index.php?title=Main_Page The dumps contain only data publicly available at the time of the run, without deleted data. The articles aren't permanently deleted, of course. The revision texts live on in the database, so a query on the Toolserver, for example, could be used to get at them, but that would need to be for research purposes. Ariel

On Thu, 17-05-2012 at 13:30 +0200, Mike Dupont wrote:
> Hi, I am thinking about how to collect articles deleted based on the "not notable" criterion. Is there any way we can extract them from the MySQL binlogs? How are these mirrors working? I would be interested in setting up a mirror of deleted data, at least that which is not spam/vandalism, based on tags. mike

On Thu, May 17, 2012 at 1:09 PM, Ariel T. Glenn ar...@wikimedia.org wrote:
> We now have three mirror sites, yay! The full list is linked from http://dumps.wikimedia.org/ and is also available at http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Current_Mirrors Summarizing, we have: C3L (Brazil) with the last 5 known good dumps; Masaryk University (Czech Republic) with the last 5 known good dumps; and Your.org (USA) with the complete archive of dumps, plus http/ftp/rsync access to the latest version of uploaded media. Thanks to Carlos, Kevin and Yenya respectively at the above sites for volunteering space, time and effort to make this happen. As people noticed earlier, a series of per-project media tarballs (excluding Commons) is being generated. As soon as the first run of these is complete we'll announce its location and start generating them on a semi-regular basis. As we've been getting the bugs out of the mirroring setup, it is getting easier to add new locations. Know anyone interested? Please let us know; we would love to have them. Ariel
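[Editorial note: the Special:Export approach emijrp suggests above is straightforward to script. The sketch below is illustrative only; it is not the export_deleted.py linked in the thread, which is Python and built on pywikipediabot. It assumes the API's list=categorymembers query and Special:Export's pages/history/action=submit parameters; double-check these against Manual:Parameters_to_Special:Export before relying on them.]

```php
<?php
// Illustrative sketch (not the thread's export_deleted.py). Parameter names
// come from the MediaWiki web API and Manual:Parameters_to_Special:Export;
// verify them before relying on this.

$api       = 'https://en.wikipedia.org/w/api.php';
$exportUrl = 'https://en.wikipedia.org/w/index.php?title=Special:Export';
$userAgent = "User-Agent: deleted-article-archiver-sketch/0.1\r\n";

// Step 1: list the pages currently tagged for speedy deletion.
// (Continuation for categories with >500 members is omitted for brevity.)
$listUrl = $api . '?' . http_build_query( array(
	'action'  => 'query',
	'list'    => 'categorymembers',
	'cmtitle' => 'Category:Candidates_for_speedy_deletion',
	'cmlimit' => '500',
	'format'  => 'json',
) );
$listCtx = stream_context_create( array( 'http' => array( 'header' => $userAgent ) ) );
$data    = json_decode( file_get_contents( $listUrl, false, $listCtx ), true );

$titles = array();
foreach ( $data['query']['categorymembers'] as $member ) {
	$titles[] = $member['title'];
}
if ( !$titles ) {
	exit( "Nothing to archive right now.\n" );
}

// Step 2: ask Special:Export for the full history of those pages.
$post = http_build_query( array(
	'pages'   => implode( "\n", $titles ),
	'history' => '1',      // full revision history, not just the current text
	'action'  => 'submit',
) );
$postCtx = stream_context_create( array( 'http' => array(
	'method'  => 'POST',
	'header'  => $userAgent . "Content-Type: application/x-www-form-urlencoded\r\n",
	'content' => $post,
) ) );
$xml = file_get_contents( $exportUrl, false, $postCtx );

// One timestamped snapshot per cron run, mirroring the zip names above.
$outFile = 'speedy-deletion-' . gmdate( 'Y-m-d\TH-i-s' ) . '.xml';
file_put_contents( $outFile, $xml );
echo 'Wrote ', count( $titles ), " pages to $outFile\n";
```

Uploading the resulting XML to archive.org, as the thread's script does, is left out of this sketch.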