Re: [Wikitech-l] HtmlBuilder virus
I believe it is an ironic call to action to move from the scripted HtmlBuilder to its native equivalent.

On Sat, Jun 7, 2014 at 1:33 AM, Cristian Consonni <kikkocrist...@gmail.com> wrote:
> 2014-06-07 0:24 GMT+02:00 Ricordisamoa <ricordisa...@openmailbox.org>:
>> The HtmlBuilder module virus, also known as HB-N1, which originated on
>> the English Wikipedia, has infected many other sites:
>> b:id:Modul:HtmlBuilder <https://id.wikibooks.org/wiki/Modul:HtmlBuilder>
>> Since the creation of the mw.html library
>> <https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#HTML_library>,
>> very few attempts to actually defeat it have been made. We need your
>> help. Spread this out to your wiki!
>
> Sorry, is there some more context available? What does this virus do?
> From where is it coming?
>
> Cristian

--
With regards,
Pavel Selitskas (Павел Селіцкас)
Wizardist @ Wikimedia projects

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
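For context, the mw.html library mentioned above is the native, builder-style replacement for the scripted HtmlBuilder module. A minimal sketch of how it is used (this only runs inside a MediaWiki Scribunto environment; the module and function names are made up for illustration):

```
-- Sketch of the native mw.html builder (Scribunto environment assumed).
local p = {}

function p.box( frame )
    local div = mw.html.create( 'div' )
    div:addClass( 'example-box' )          -- chainable attribute setters
       :css( 'border', '1px solid #aaa' )
       :wikitext( 'Hello' )
    return tostring( div )                 -- serializes to an HTML string
end

return p
```

The chainable interface is what makes a per-wiki HtmlBuilder copy redundant: the builder ships with Scribunto itself.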
Re: [Wikitech-l] Non-Violent Communication
Thanks for a nice tasty bikeshed on a technical mailing list. I assume your good faith, and I foresee its consequences. You couldn't employ your NVC skills because you were, quote, "in a hurry", end quote. That means NVC just doesn't work when it's needed. I don't think everyone here has a lot of spare time to mix original thoughts with a dump of meaningless requests and pardons. You want to share how you feel? I don't think this is the right place to do it. Don't ask to ask, just ask, and so on.

On Tue, Feb 18, 2014 at 5:38 PM, Derric Atzrott <datzr...@alizeepathology.com> wrote:
>> Question for Derric: why didn't you formulate your suggestion using NVC?
>
> I was excited and in a hurry. In retrospect I really think that I should
> have. After reading some of the replies I felt rather disappointed and
> frustrated, and even a little sad, as I didn't feel my need for
> understanding was met. In the future I will try to take a little more
> time writing emails to the list.
>
> I'm sorry to anyone who felt offended by it or felt that my email was,
> well, violent. That was not my intention at all. I myself just began
> looking into and trying to practice NVC in the past six months or so, and
> I am, as of now, still not terribly great at it. Again, I want to express
> my apologies, and I really hope that I didn't turn anyone off to the
> subject.
>
> I guess all I was really trying to say in that email is that when
> conversation on this list gets heated, I feel frustrated because my needs
> for calm and community are not met. I end up not wanting to participate
> because I don't think that I will be heard or understood. I would like to
> request that people on-list look into strategies to help everyone get
> along; whether that is AGF, or NVC, or something else does not matter as
> much to me. I suggested NVC because it has been a very useful tool for me
> in the past.
>
> Thank you,
> Derric Atzrott
Re: [Wikitech-l] FWD: [Bug 58236] New: No longer allow gadgets to be turned on by default for all users on Wikimedia sites
Yep, that's what I did too a year or two ago. Since some parts of front-end code work quite differently in Common.js and in gadgets, it's better to put modular stuff (especially things users would like to opt out of) in gadgets.

On Tue, Dec 10, 2013 at 12:38 AM, Liangent <liang...@gmail.com> wrote:
> MediaWiki:Common.js can also create disagreement, and defaults change
> without notice. Actually, what I did sometimes was to move code from
> MediaWiki:Common.js to default gadgets, so it can be disabled per user
> (for example, some users are on a too-slow computer or network). At the
> same time, this makes maintenance easier for developers, because gadgets
> can be modularized instead of using one big JavaScript and CSS file
> (Common.js/css).
>
> -Liangent
>
> On Tue, Dec 10, 2013 at 5:02 AM, K. Peachey <p858sn...@gmail.com> wrote:
>> For wider discussion ---
>>
>> From: bugzilla-daemon at wikimedia.org
>> Subject: [Bug 58236] New: No longer allow gadgets to be turned on by
>>   default for all users on Wikimedia sites
>> Date: 2013-12-09 20:41:41 GMT
>> https://bugzilla.wikimedia.org/show_bug.cgi?id=58236
>>
>> Product: Wikimedia
>> Version: wmf-deployment
>> Component: Site requests
>> Reporter: jared.zimmerman at wikimedia.org
>>
>> Gadgets being turned on for all site users (including readers) can cause
>> a confusing user experience, especially when there is some disagreement
>> and defaults change without notice (readers are rarely, if ever, part of
>> these discussions). Move to a model where gadgets are per user rather
>> than part of a default experience for users.
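For illustration, moving code out of MediaWiki:Common.js into a default-on but per-user-disableable gadget amounts to one definition line on MediaWiki:Gadgets-definition; a hedged sketch (the gadget name and file names here are made up):

```
* exampleTools[ResourceLoader|default]|exampleTools.js|exampleTools.css
```

The `default` flag enables it for everyone, while each user can still untick it under Preferences → Gadgets, which is exactly the opt-out Liangent describes.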
Re: [Wikitech-l] Officially supported MediaWiki hosting service?
I think Brion should have drawn a distinction between wiki services (like Wikia) and hosting services that provide everything MediaWiki needs to run smoothly, including caching software and other fancy stuff.

On Tue, Oct 1, 2013 at 9:23 PM, Jay Ashworth <j...@baylink.com> wrote:
> Something other than Wikia, then?
>
> ----- Original Message -----
> From: Brion Vibber <bvib...@wikimedia.org>
> To: Wikimedia-tech list <wikitech-l@lists.wikimedia.org>
> Sent: Tuesday, October 1, 2013 2:11:16 PM
> Subject: [Wikitech-l] Officially supported MediaWiki hosting service?
>
>> Question for the group: Would an officially supported general-purpose
>> MediaWiki hosting service be useful to people who would like to run
>> wikis, but don't have the time, expertise, or resources to maintain
>> their own installation? If so, what can we (as interested parties in
>> MediaWiki development and use) do to make this happen?
>>
>> [Please do not consider the existence of this email to imply that only
>> regular posters on wikitech-l are allowed to read, comment on, or give
>> opinions in this matter -- on the contrary, wider input is being
>> requested. Please forward this question to anyone to whom it may be of
>> interest. If you would like to get more input from other people, please
>> feel free to contact them on your own, with or without a forward of this
>> mail, and to make follow-up posts or comments as you need or want to.
>> Please feel free to modify the question, the idea, the proposal, or make
>> comments or additions. Be bold and get involved!]
>>
>> -- brion
>
> -- Jay R. Ashworth, Baylink <j...@baylink.com>
Re: [Wikitech-l] question about wikimedia apache module mod_pagespeed
Hi Luke,

Max Semenik has already evaluated mod_pagespeed. You can see the report here:
https://www.mediawiki.org/wiki/User:MaxSem/mod_pagespeed

On Wed, Sep 11, 2013 at 2:25 PM, Luke Frank <corte...@hotmail.it> wrote:
> Hello, is it possible to add the Google PageSpeed module to the Wikipedia
> Apache servers, for speed purposes? I want to discuss this possibility
> with you.
>
> Thanks,
> Luke
Re: [Wikitech-l] [Wikitech-ambassadors] VisualEditor weekly update - 2013-09-05 (MW 1.22wmf16)
On Fri, Sep 6, 2013 at 3:40 AM, James Forrester <jforres...@wikimedia.org> wrote:
> All,
>
> We also added a set of keyboard shortcuts for setting the block
> formatting: Ctrl+0 sets a block as a paragraph; Ctrl+1 up to Ctrl+6 sets
> it as a Heading 1 (Page title) to Heading 6 (Sub-heading 4); Ctrl+7 sets
> it as pre-formatted (bug 33512,
> https://bugzilla.wikimedia.org/show_bug.cgi?id=33512). The help/'beta'
> menu now exposes the build number next to the Leave feedback link, so
> users can give better reports about issues they encounter (bug 53050,
> https://bugzilla.wikimedia.org/show_bug.cgi?id=53050).

Here it is! I have some characters assigned to the third keyboard layer (via Ctrl+Alt/AltGr), and they conflict with the new keyboard shortcuts. It's very, very annoying and makes VE almost unusable. Is there a way to distinguish Ctrl from Ctrl+Alt?

> If you have any questions, please do ask.
>
> Yours,
> James D. Forrester
> Product Manager, VisualEditor
> Wikimedia Foundation, Inc.
Re: [Wikitech-l] New search backend live on mediawiki.org
Will it be set as the search backend on other Wikimedia projects further on? Is the Elasticsearch source code available on Gerrit? I couldn't find it. Stemming doesn't work at all for some languages, so search there only matches exact forms.

On Wed, Aug 28, 2013 at 9:20 PM, Nikolas Everett <never...@wikimedia.org> wrote:
> Today we threw the big lever and turned on our new search backend at
> mediawiki.org. It isn't the default yet, but it is just about ready for
> you to try.
>
> Here is what we think we've improved:
> 1. Templates are now expanded during search, so:
>    1a. You can search for text included in templates.
>    1b. You can search for categories included in templates.
> 2. The search engine is updated very quickly after articles change.
> 3. A few funky things around intitle and incategory:
>    3a. You can combine them with a regular query (incategory:kings peaceful).
>    3b. You can use prefix searches with them (incategory:norma*).
>    3c. You can use them everywhere in the query (roger incategory:normans).
>
> What we think we've made worse and are working on fixing:
> 1. Because we're expanding templates, some things that probably shouldn't
>    be searched are being searched. We've fixed a few of these issues, but
>    I wouldn't be surprised if more come up. We opened bug 53426 regarding
>    audio tags.
> 2. The relative weighting of matches is going to be different. We're
>    still fine-tuning this, and we'd appreciate any anecdotes describing
>    search results that seem out of order.
> 3. We don't currently index headings beyond the article title in any
>    special way. We'll be fixing that soon. (Bug 53481)
> 4. Searching for file names or clusters of punctuation characters doesn't
>    work as well as it used to. It still works reasonably well if you
>    surround your query in quotes, but it isn't as good as it was.
>    (Bugs 53013 and 52948)
> 5. "Did you mean" suggestions currently aren't highlighted at all, and
>    sometimes we'll suggest things that aren't actually better.
>    (Bugs 52286 and 52860)
> 6. incategory:"category with spaces" isn't working. (Bug 53415)
>
> What we've changed that you probably don't care about:
> 1. Updating search in bulk is much slower than before. This is the cost
>    of expanding templates.
> 2. Search is now backed by a horizontally scalable search backend that is
>    being actively developed (Elasticsearch), so we're in a much better
>    place to expand on the new solution as time goes on.
>
> Neat stuff if you run your own MediaWiki: CirrusSearch is much easier to
> install than our current search infrastructure.
>
> So what will you notice? Nothing! That is because, while the new search
> backend (CirrusSearch) is indexing, we've left the current search
> infrastructure as the default while we work on our list of bugs. You can
> see the results from CirrusSearch by performing your search as normal and
> adding srbackend=CirrusSearch to the URL parameters.
>
> If you notice any problems with CirrusSearch, please file bugs directly
> for it:
> https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensions&component=CirrusSearch
>
> Nik Everett
Re: [Wikitech-l] You know, we really should shift to Windows
This very article seems like yet another "Hey, check out my brand new Wikipedia redesign!" story. No, you're not wrong: they took the 170/385 numbers from the Labs stats.

On Wed, Aug 21, 2013 at 6:12 PM, hoo <h...@online.de> wrote:
> Am I wrong, or did they actually calculate that for Labs only (which
> would be rather funny)? At least they link to
> https://wikitech.wikimedia.org/wiki/Special:Ask/-5B-5BResource-20Type::instance-5D-5D/-3FInstance-20Name/-3FInstance-20Type/-3FProject/-3FImage-20Id/-3FFQDN/-3FLaunch-20Time/-3FPuppet-20Class/-3FModification-20date/-3FInstance-20Host/-3FNumber-20of-20CPUs/-3FRAM-20Size/-3FAmount-20of-20Storage/searchlabel%3Dinstances/offset%3D0
> ("[...] that run on up to 385 instances [...]"), which AFAIK doesn't have
> any production servers.
>
> Cheers,
> Marius
>
> On Wed, 2013-08-21 at 15:44 +0100, David Gerard wrote:
>> Analysts agree!
>> http://www.rightscale.com/blog/cloud-cost-analysis/cloud-cost-analysis-how-much-could-wikipedia-save-cloud
>>
>> - d.
[Wikitech-l] HTTPS central notice - translation needed?
Usually CentralNotice banners link to an announcement that can be viewed in many languages. The HTTPS banner being displayed at the moment links to a rough page[0] that has only an English version. Could anyone craft an announcement suitable for translation?

[0] https://meta.wikimedia.org/wiki/HTTPS
Re: [Wikitech-l] How's the SSL thing going?
Can we enable full security mode (as an optional feature) geographically, based on the governments of most concern, if the whole effort isn't moving fast due to lack of resources?

On Wed, Jul 31, 2013 at 11:35 PM, Tyler Romeo <tylerro...@gmail.com> wrote:
> Like I've said before, the NSA spying on what users are reading is still
> the least of our concerns. We should focus on making sure passwords
> aren't sent over plaintext before attempting to evade a government-run
> international spy network.
>
> Tyler Romeo
> Stevens Institute of Technology, Class of 2016
>
> On Wed, Jul 31, 2013 at 4:32 PM, Matthew Flaschen <mflasc...@wikimedia.org> wrote:
>> On 07/31/2013 03:23 PM, Risker wrote:
>>> Just one question from a relatively non-technical person: what falls
>>> off the map if everything is done using SSL? Is this the protocol that
>>> would make it essentially impossible to read/edit Wikipedia using a
>>> normal internet connection from China?
>>>
>>> Risker
>>
>> Good question. I'm not aware of the current status, but Tim Starling
>> said SSL connections to Wikipedia have been blocked in China
>> (https://bugzilla.wikimedia.org/show_bug.cgi?id=47832#c16).
>>
>> Matt Flaschen
Re: [Wikitech-l] How's the SSL thing going?
Yes, that is exactly what I do. But Google, for instance, redirects me to HTTP, and if I've logged in via HTTPS recently, I have to log in once again via HTTP. It's very frustrating.

Are there public statistics on the share of HTTPS vs. HTTP requests served by Wikimedia? Rough numbers?

For users who are inexperienced yet concerned about privacy, there should be an HTTP/HTTPS switch on the Preferences page. We have one at the registration/log-in page, but I'd like MediaWiki to remember that I want to use HTTPS only.

On Wed, Jul 31, 2013 at 11:50 PM, Ryan Lane <rlan...@gmail.com> wrote:
> On Wed, Jul 31, 2013 at 1:39 PM, Paul Selitskas <p.selits...@gmail.com> wrote:
>> Can we enable full security mode (as an optional feature) geographically
>> based on the most concerned governments, if the whole thing isn't going
>> fast due to lack of resources?
>
> No. That's in fact much, much harder. There's nothing stopping you (and
> anyone else who is concerned about their privacy) from using HTTPS
> Everywhere. We support HTTPS natively as is right now.
>
> - Ryan
Re: [Wikitech-l] [Wikidata-tech] Incorrect language code?
https://bugzilla.wikimedia.org/show_bug.cgi?id=41723

On Sun, Jul 21, 2013 at 9:23 AM, Hazard-SJ <hazard...@yahoo.com> wrote:
> Just to add to this, the issue also exists for other pairs of codes
> (als/gsw, bat-smg/sgs, and probably more).
>
> Hazard-SJ
>
> From: Hazard-SJ <hazard...@yahoo.com>
> Sent: Sunday, July 21, 2013 12:46 AM
> Subject: [Wikidata-tech] Incorrect language code?
>
>> Hello,
>>
>> As far as I checked, we should be using "nb" as the language code of
>> nowiki. As is known (see bugs 46455 and 37459), Wikidata allows both
>> codes. I've just come upon a bot request that is over 2 months old that
>> I'm willing to tackle; I just need verification: terms should use the
>> language code "nb" rather than "no", correct? As for statistics, there
>> are at present 534741 uses of the "no" code and only 294725 uses of the
>> "nb" code. As I said, I'm willing to have my bot make the necessary
>> fixes, but I just need to verify in advance.
>>
>> Thanks,
>> Hazard-SJ
Re: [Wikitech-l] Git config trick.
Father of God^WGit. Thanks, works like a charm!

On Fri, Jul 19, 2013 at 8:40 PM, Ori Livneh <o...@wikimedia.org> wrote:
> In ~/.gitconfig, add:
>
>   [url "ssh://your_usern...@gerrit.wikimedia.org:29418/mediawiki/extensions/"]
>     insteadOf = ext:
>
> Now you can: git clone ext:UploadWizard !
>
> --- Ori Livneh <o...@wikimedia.org>
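As a sketch of what the insteadOf rewrite does, git can show the expanded URL without any network access via `git ls-remote --get-url`. The anonymous https Gerrit URL below is an assumption (chosen so no account is needed; the original trick uses an authenticated ssh URI), and `git` must be installed:

```shell
# Use a throwaway HOME so the example does not touch your real ~/.gitconfig.
export HOME="$(mktemp -d)"

# Configure a prefix rewrite equivalent to the trick above.
git config --global url."https://gerrit.wikimedia.org/r/mediawiki/extensions/".insteadOf "ext:"

# --get-url applies insteadOf rewriting and prints the result without
# contacting the server.
git ls-remote --get-url ext:UploadWizard
# prints: https://gerrit.wikimedia.org/r/mediawiki/extensions/UploadWizard
```

Any git command that takes a URL (clone, fetch, push) performs the same expansion, which is why `git clone ext:UploadWizard` works.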
Re: [Wikitech-l] Is it possible to opt out of LiquidThreads?
Oh, just in case there was no such question on the list yet: will LQT threads be somehow converted to the Flow format, whatever that turns out to be? How would one access old talks otherwise?

On Wed, Jul 17, 2013 at 9:34 PM, Brion Vibber <bvib...@wikimedia.org> wrote:
> On Wed, Jul 17, 2013 at 11:29 AM, Jon Robson <jdlrob...@gmail.com> wrote:
>> Is there any way to opt out of LiquidThreads and simply see the
>> underlying talk page? I really, really dislike the interface and it's
>> been bugging me for some time... mostly due to its reliance on
>> JavaScript, and links generated to it tend to be broken.
>
> LiquidThreads creates a page per post/reply; there's not a single
> underlying page. Keep in mind that LQT is on its way out as well; Flow is
> coming at some point.
>
> -- brion
Re: [Wikitech-l] PHP 5.4 (we wish)
We should have stayed on PHP 4 if this is trouble. Performance may be a problem (though not much of one in 5.4, iirc); syntax flaws may be a problem (and this change fixes one of those flaws). Yes, it will take us some time to upgrade MW to 5.4 or 5.5, but deprecations are detected by any contemporary PHP IDE, and fixing them is a matter of minutes.

On Fri, Jun 21, 2013 at 1:22 AM, Bartosz Dziewoński <matma@gmail.com> wrote:
> PHP 5.4 does seem to be able to cause some trouble, see for example
> https://gerrit.wikimedia.org/r/#/c/69807/
>
> -- Matma Rex
[Wikitech-l] Automating Main Page with Lua
Turning back to the automation thing and the Main Page. I've got tired of updating the "Other Wikipedias" section (congratulations to the Swedish Wikipedia!), so I wrote some code to automate the job. There is a bot that updates various statistics per wiki. I decided to parse the data page and push it through a MediaWiki message, to avoid hard-coded pieces of text inside the module.

Here we have two expensive parts: getContent() for a template with the necessary data, and retrieving a message for the view. Is it OK to have expensive calls on the Main Page?

The module is placed here: http://goo.gl/3V5St
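The two expensive parts described above can be sketched roughly as follows (Scribunto environment assumed; the data-page title, message key, and function names are made up for illustration, not the actual module):

```
-- Sketch of the two expensive steps (Scribunto environment assumed;
-- names are hypothetical).
local p = {}

function p.otherWikipedias( frame )
    -- Expensive part #1: fetching the bot-updated data page. This also
    -- registers the page as a transclusion dependency, so the Main Page
    -- re-renders when the bot updates the statistics.
    local data = mw.title.new( 'Template:Wiki statistics' ):getContent()

    -- ...parse `data` for the per-wiki article counts here...

    -- Expensive part #2: rendering the view through a MediaWiki message,
    -- which keeps the display text out of the module code.
    return mw.message.new( 'otherwikis-list' ):params( data ):plain()
end

return p
```

Since the Main Page is parsed once and then served from cache, a couple of expensive calls mostly costs parse time on bot edits rather than on every view.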
Re: [Wikitech-l] Deprecating use of the style attribute (part 1)
I think Template:Foo.css would be just a MediaWiki template, and there would be a way to forward some arguments to the CSS template.

On Wed, Jun 12, 2013 at 5:44 AM, Brad Jorsch <bjor...@wikimedia.org> wrote:
> On Tue, Jun 11, 2013 at 8:39 PM, Jon Robson <jdlrob...@gmail.com> wrote:
>> All the patch does is allow Template:Foo to have an associated
>> stylesheet Template:Foo.css which is included in pages that use it.
>
> How would this handle something like
> https://en.wikipedia.org/wiki/Template:Colorbox ?
>
> -- Brad Jorsch
> Software Engineer, Wikimedia Foundation
[Wikitech-l] Strange PHPUnit behaviour
Hola!

I noticed something very strange in the unit tests for /languages/Language.php (/tests/phpunit/languages/LanguageTest.php): Language::sprintfDate() is tested to conform with expected formatted date strings. In the test file, the method provideSprintfDateSamples() provides a test case for the format character 'W', which means "ISO 8601 week number, zero-padded". Language::sprintfDate() outputs the result for 'W' as documented. However, one of the test cases for this format contains the expected value '1' (instead of '01'). As I said above, sprintfDate() adds the zero (printing '01'), but when I run the tests, they just pass! What is wrong here: a wrong test case, an issue in PHPUnit, or what?
Re: [Wikitech-l] ???!!! ResourceLoader loading extension CSS DYNAMICALLY?!!
Why so many question marks? :)

There is OutputPage::addModuleStyles(), which is documented to emit module CSS via <link> tags instead of loading it by means of JavaScript. IIRC, it works just as advertised.

On Wed, Jun 5, 2013 at 3:43 PM, <vita...@yourcmc.ru> wrote:
> Hello!
>
> I've got a serious issue with ResourceLoader. WHAT FOR is it made to load
> extension styles DYNAMICALLY using JavaScript? It's a very bad idea: it
> leads to page style flickering during load. I.e., first the page is
> displayed using only the skin CSS, and then you see how extension styles
> are dynamically applied to it. Of course it's still rather fast, but it's
> definitely noticeable, even in Chrome. Why didn't you just output
> <link rel="stylesheet" href="load.php?ALL MODULES"> ?? Am I free to
> implement it and submit a patch?
>
> -- With best regards, Vitaliy Filippov
Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote
Tried setting up the global config in /etc. git-review went wild with an exception (very sorry, I switched the terminal off and cannot provide the backtrace; I remember it wanted an 'updates' section or something of the kind). Luckily, ~/.config/ worked well. Thanks!

On Sat, Jun 1, 2013 at 1:14 AM, Ori Livneh <o...@wikimedia.org> wrote:
> Hey,
>
> The new version of git-review released today (1.22) includes a patch I
> wrote that makes it possible to work against a single 'origin' remote.
> This amounts to a workaround for git-review's tendency to frighten you
> into thinking you're about to submit more patches than the ones you are
> working on. It makes git-review more pleasant to work with, in my
> opinion.
>
> To enable this behavior, you first need to upgrade to the latest version
> of git-review, by running pip install -U git-review. Then you need to
> create a configuration file: either /etc/git-review/git-review.conf
> (system-wide) or ~/.config/git-review/git-review.conf (user-specific).
> The file should contain these two lines:
>
>   [gerrit]
>   defaultremote = origin
>
> Once you've made the change, any new Gerrit repos you clone using an
> authenticated URI will just work. You'll need to perform an additional
> step to migrate existing repositories. In each repository, run the
> following commands:
>
>   git remote set-url origin $(git config --get remote.gerrit.url)
>   git remote rm gerrit
>   git review -s
>
> Hope you find this useful.
Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)
Yeah, let's make them configurable so people can set the scope of projects (all wikis, specific families, etc. ... languages?).

On Sun, Jun 2, 2013 at 2:21 PM, Mathieu Stumpf <psychosl...@culture-libre.org> wrote:
> On Friday, 31 May 2013 at 11:15 -0400, David Cuenca wrote:
>> Hi all,
>>
>> After a talk with Brad Jorsch during the Hackathon (thanks again, Brad,
>> for your patience), it became clear to me that Lua modules can be
>> localized either by using system messages or by getting the project
>> language code (mw.getContentLanguage():getCode()) and then switching
>> the message. This second option is less integrated with the translation
>> system, but can serve as an intermediate step to get things running.
>
> Interesting, but why make a central code repository only for Wikisource?
>
>> For Wikisource it would be nice to have a central repository (sitting on
>> wikisource.org) of localized Lua modules and associated templates. The
>> documentation could be translated using Extension:Translate. These
>> modules, templates, and associated documentation would then be
>> synchronized with all the language Wikisources that subscribe to an
>> opt-in list. Users would then be advised to modify the central module,
>> so all language versions would benefit from the improvements. This could
>> be the first experiment in having a centralized repository of modules.
>>
>> What do you think of this? Would anyone be available to mentor an
>> Outreach Program for Women project?
>>
>> Thanks,
>> David Cuenca --Micru
Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)
That's the worst-case scenario. Bot syncing is bad practice (and has been since the beginning of time). Since we have banners (CentralNotice) and interwiki links (Wikidata) centralized, perhaps we should move forward and start centralizing on-wiki development efforts. There are also global gadgets (i.e. ones suitable for most wikis), and what I said above should be the case for Gadgets 2.0 as well.

On Fri, May 31, 2013 at 6:19 PM, Yuvi Panda <yuvipa...@gmail.com> wrote:
> I'm pretty sure that the 'syncing' can be accomplished by a simple bot,
> and it might even already exist(?). Will be happy to help write the bot
> if it doesn't exist yet.
>
> -- Yuvi Panda
> http://yuvi.in/blog
Re: [Wikitech-l] Code style: overuse of Html::element()
Besides, it adds much more overhead. What should we favour: readability or performance? On Wed, May 15, 2013 at 11:43 PM, Tim Landscheidt t...@tim-landscheidt.de wrote: Chris Steipp cste...@wikimedia.org wrote: [...] It looks like we can define custom filters in Twig, so we may be able to move the review to making sure the template correctly escapes the value for the context with the correct function. Something like:

    <ul id=users>
    {% for user in users %}
      <li><a href={{ user|getMediaWikiUserURL }}>{{ user.name|e }}</a></li>
    {% endfor %}

[...] So new MediaWiki developers would have to learn yet another syntax, debugging would require another level of indirection, and if these custom filters include those we use for HTML5 now, the output would subtly differ from the template. What was the expected benefit of this operation again, and how often do we get to reap it? :-) Tim ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Processing unknown named template parameters with Scribunto
Yes, Lula (lol :)) seems to fit your purpose well. Look here[1], it just works! --- [1] https://en.wikipedia.org/wiki/Template:Navbox On Tue, May 14, 2013 at 11:09 AM, Toni Hermoso Pulido toni...@cau.cat wrote: Hello, yesterday I found myself with a problem similar to the one described here: http://stackoverflow.com/questions/15164710/mediawiki-templates-inherit-parameters But, actually, I would like to process template parameters without first having to return the values associated with them - {{{param1}}}, {{{param2}}}. Let's say, something like processing argv, where argv may contain argv['param1'], argv['param2'], but without knowing first whether param1 or param2 exist (I should iterate over keys in this case…) Is this something that might be done somehow by using Lula (Scribunto)? Thanks! -- Toni Hermoso Pulido http://www.cau.cat ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
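For the question above: in a Scribunto module the incoming template parameters arrive as a table, so unknown parameter names can be iterated with pairs() without declaring them first (a minimal sketch; the module and function names are illustrative):

```lua
-- Sketch: list every named parameter passed to the invoking template,
-- without knowing the parameter names in advance.
local p = {}

function p.listParams( frame )
    -- frame:getParent().args holds the parameters given to the template
    -- that invoked this module, e.g. {{MyTemplate|param1=a|foo=b}}
    local args = frame:getParent().args
    local out = {}
    for name, value in pairs( args ) do
        -- name may be a string ('param1') or a number for positional params;
        -- note that pairs() iteration order is unspecified
        out[#out + 1] = tostring( name ) .. ' = ' .. value
    end
    return table.concat( out, ', ' )
end

return p
```

This is essentially how Template:Navbox's module handles its open-ended list1/list2/… parameters.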
Re: [Wikitech-l] Code style: overuse of Html::element()
Also, standards can change over time and the same tags may change with them. It's better to change a thing in one place than to chase a mysterious bug. On Mon, May 13, 2013 at 9:22 PM, Tyler Romeo tylerro...@gmail.com wrote: Chris makes a good point. Also, it should be noted that the Html class does a lot more than just escape stuff. It does a whole bunch of attribute validation and standardization to make output HTML5-sanitary. While in simple cases like the one above it will not make a difference, it is probably better to maintain a uniform approach when generating HTML output. *-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2015 Major in Computer Science www.whizkidztech.com | tylerro...@gmail.com On Mon, May 13, 2013 at 2:05 PM, Chris Steipp cste...@wikimedia.org wrote: On Mon, May 13, 2013 at 10:26 AM, Max Semenik maxsem.w...@gmail.com wrote: Hi, I've seen recently a lot of code like this:

    $html = Html::openElement( 'div', array( 'class' => 'foo' ) )
        . Html::rawElement( 'p', array(),
            Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ), $somePotentiallyUnsafeText )
        )
        . Html::closeElement( 'div' );

IMO, cruft like this makes things harder to read and adds additional performance overhead. It can be simplified to

    $html = '<div class="foo'><p>'
        . Html::rawElement( 'p', array(),
            Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ), $somePotentiallyUnsafeText )
        )
        . '</p></div>';

What's your opinion, guys and gals? I'm probably a bad offender here, but you've unintentionally proved my point ;). Note that in your example, you used a single quote instead of a double quote after foo. Obviously, if you're using an IDE, syntax highlighting would have helped you, but my point being that when you use the classes, you're less likely to make those little mistakes that could potentially have disastrous consequences (like using single quotes around an entity and relying on htmlspecialchars for escaping, etc). 
And for security, I prefer for people to use whatever will cause the least amount of mistakes. Personally also, when I'm code reviewing I don't like to see raw HTML in the PHP, but that's my personal preference. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Feature proposal: backups while editing articles
While editing Wikipedia articles, I have often faced a situation where I accidentally pressed Back in the browser, then, after spontaneously pressing random buttons, returned to the edit form and... got a blank editor or the last submitted version of the article. I've been deeply frustrated every time I ruined a new article and had to start it over again. What can we do to fix this? I propose keeping the current textarea state in localStorage, at least one version per article (we can store versions by the relevant article ID, also adding the current timestamp to the structure). We can also let the user disable keypress-triggered storage updates and instead save backups every N seconds, or just save them by pressing a button. [The timestamp and possibly other info would be stored to allow garbage collection. As we know, localStorage has a small quota.] There is another approach: storing the backups on the server side. Some people ( :] ) suggest that it's quite cheap and may be reasonable. Anyhow, we cannot allow per-keypress backup updates due to request latency. And it would be a disaster to process a huge bunch of mostly useless requests simultaneously, unless dedicated servers are run to serve them swiftly and obediently. All this stuff can be developed as a gadget. You could make it a WikiEditor plugin. And it would be re-e-eally fantastic to make this work with VisualEditor. Who would try to make this thing look good? Or should I approach this differently? :) -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
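A minimal sketch of the client-side approach proposed above (all names such as DRAFT_PREFIX and the one-week limit are illustrative choices, not an existing MediaWiki API): keep one draft per article, keyed by article ID, with a timestamp used for garbage collection. The storage object is passed in, so a gadget would hand it window.localStorage.

```javascript
// Sketch of per-article draft backups. DRAFT_PREFIX and MAX_AGE_MS are
// illustrative; a real gadget would pick its own key scheme and policy.
var DRAFT_PREFIX = 'editDraft-';
var MAX_AGE_MS = 7 * 24 * 60 * 60 * 1000; // drop drafts older than a week

// Save the textarea contents for one article, with a timestamp.
function saveDraft( storage, articleId, text, now ) {
	storage.setItem( DRAFT_PREFIX + articleId, JSON.stringify( {
		text: text,
		savedAt: now
	} ) );
}

// Load a draft back, discarding it if it is missing or too old.
function loadDraft( storage, articleId, now ) {
	var raw = storage.getItem( DRAFT_PREFIX + articleId );
	if ( raw === null ) {
		return null;
	}
	var draft = JSON.parse( raw );
	if ( now - draft.savedAt > MAX_AGE_MS ) {
		// localStorage quota is small, so collect stale drafts eagerly
		storage.removeItem( DRAFT_PREFIX + articleId );
		return null;
	}
	return draft.text;
}
```

In a gadget, saveDraft would be bound to the textarea's input events (throttled to every N seconds, as suggested above), with storage = window.localStorage and now = Date.now(); restoring on page load is then a single loadDraft call.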
Re: [Wikitech-l] Feature proposal: backups while editing articles
Oh, thanks for pointing this out too. I should have prepared better. Unfortunately, this patch has some obvious flaws. But they can be fixed, I hope. On Tue, Apr 30, 2013 at 5:27 PM, Bartosz Dziewoński matma@gmail.com wrote: Somebody tried to implement that in MediaWiki core about a year ago: https://gerrit.wikimedia.org/r/#/c/5130/. It turned out to be harder than it looks and the patch is in limbo now; maybe you could find someone to continue the work. -- Matma Rex ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Feature proposal: backups while editing articles
We could mix these two approaches, but the working copy falling behind the latest revision is the main issue. We're not talking about git here, and it is natural-language text being merged, unlike programming languages. That is why I'm certain that every draft stored on the server side should be treated as outdated if a new change is contributed before the draft is first used. That is why I think it's expensive: we pay money for nothing, for spurious machine time. That's why it is better to work locally on the client side. One more point: drafts should work slightly differently when the user creates a new article. Every new article has article ID == 0, which is why a different way to store such drafts (dedicated named drafts?) should be chosen. On Tue, Apr 30, 2013 at 6:18 PM, Chad innocentkil...@gmail.com wrote: On Tue, Apr 30, 2013 at 10:59 AM, Bartosz Dziewoński matma@gmail.com wrote: On Tue, 30 Apr 2013 16:32:17 +0200, Chad innocentkil...@gmail.com wrote: What about the Drafts extension? It seems to be slightly different, saving the drafts server-side instead of client-side, and apparently only on demand or every few minutes instead of basically continuously. Granted, but it's worth keeping in mind since these are all possible solutions to people losing their hard work :) -Chad ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Countdown to SSL for all sessions?
There are some situations where HTTPS won't work (for example, when it is blocked by a provider or government). How does one disable HTTPS without actually accessing the HTTPS version, if the user is redirected from HTTP automatically? HTTPS was once blocked in Belarus, disabling access to Gmail, Facebook, Twitter and so on. There should always be an option (like ?noSecure=1). On Mon, Apr 29, 2013 at 8:03 PM, Tyler Romeo tylerro...@gmail.com wrote: On Mon, Apr 29, 2013 at 12:59 PM, Chris Steipp cste...@wikimedia.org wrote: Using $wgSecureLogin with CentralAuth, if a global account logged in and unchecked the box to continue using SSL, then SUL didn't correctly log them in. This has been fixed in some of the updates to SUL that we're working on right now (https://www.mediawiki.org/wiki/Auth_systems/SUL2). Aha. I figured it had something to do with CentralAuth or another extension. *-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2015 Major in Computer Science www.whizkidztech.com | tylerro...@gmail.com ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Countdown to SSL for all sessions?
On Tue, Apr 30, 2013 at 5:55 AM, Tyler Romeo tylerro...@gmail.com wrote: On Mon, Apr 29, 2013 at 9:07 PM, Paul Selitskas p.selits...@gmail.com wrote: There are some situations where HTTPS won't work (for example, when it is blocked by a provider or government). How does one disable HTTPS without actually accessing the HTTPS version, if the user is redirected from HTTP automatically? HTTPS was once blocked in Belarus, disabling access to Gmail, Facebook, Twitter and so on. There should always be an option (like ?noSecure=1). Well, with $wgSecureLogin the idea is that it is completely disallowed to log in, i.e., enter a password, over an insecure connection. Ah, I missed that point. Thanks. -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] GSoC proposals: Wikidata language fallback and conversion ( + one backup: category redirects )
How about if I don't want such a fallback to work for me? What if I'd like to see what is labeled and what is not? Have you considered making this a user option with a flexible fallback schema, or a site-wide preference with a fixed one? In general, this is a very good Wikidata feature that is not yet implemented. And thanks for raising category redirects once again! :) On Sat, Apr 27, 2013 at 11:07 PM, Liangent liang...@gmail.com wrote: Hello, I've drafted my proposal about language fallback and conversion issues for Wikidata at [1]. Currently Wikidata stores multilingual content. Labels (names, descriptions etc.) are expected to be written in every language, so every user can read them in their own language. But there are some problems currently: * If some content doesn't exist in a specific language, users with that exact language set in their preferences see something meaningless (its ID instead). This renders languages with fewer users (and thus fewer labels filled in) even unusable. * There are some similar languages which may often share the same value. Having strings populated for every language one by one wastes resources and may let them fall out of sync later. * Even for languages which are not that similar, MediaWiki already has some facility to transliterate (aka convert) contents from a sister language (aka variant) which can be used to provide better results for users. This proposal aims at resolving these issues by displaying contents from another language to users based on user preferences (some users may know more than one language), language similarity (the language fallback chain), or the possibility to do transliteration, and by allowing proper editing of these contents. Although Wikidata is in its fast development stage, lots of data have been added to it. 
The later we resolve these issues, the more duplications may be created which will require more clean up work in the future, like what we had to face before / when the language converter (that transliteration system) was introduced for the Chinese Wikipedia. So I'm planning to do this project in this summer. There's also a backup proposal about category redirects at [2]. I wrote it because I really want to see it implemented too, either by me or someone else. Some of its contents may be also useful for other participants willing to do this project. Comments are welcome and appreciated. [1] https://www.mediawiki.org/wiki/User:Liangent/wb-lang [2] https://www.mediawiki.org/wiki/User:Liangent/cat-redir -Liangent ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
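The fallback behaviour the proposal describes could be sketched as follows (a minimal sketch; the chain data below is illustrative, not the real MediaWiki fallback configuration):

```lua
-- Sketch: pick the first available label along a language fallback chain.
-- The chains here are examples only; real fallback data lives in MediaWiki.
local fallbackChains = {
    ['be-tarask'] = { 'be', 'ru', 'en' },
    ['zh-hans'] = { 'zh', 'en' },
}

local function resolveLabel( labels, lang )
    if labels[lang] then
        return labels[lang], lang
    end
    for _, fallback in ipairs( fallbackChains[lang] or {} ) do
        if labels[fallback] then
            -- returning the language it came from lets a UI mark the value
            -- as a fallback, addressing the "I want to see what is not
            -- labeled" concern raised above
            return labels[fallback], fallback
        end
    end
    -- nothing found: today Wikidata shows the item ID here
    return nil
end
```

Making the chain per-user rather than the fixed table above would cover the user-option variant discussed in the reply.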
Re: [Wikitech-l] Support for multiple content languages in MW core
I've already tried both using page properties to store the page content language and modifying ContentHandler::getPageLanguage()[1]. In both cases the parser worked in a different language scope and didn't process magic words written in the default wiki language (e.g. Russian [[Категория:Test]] wouldn't work on a German page; English had to be used on both pages). It's OK for a wiki with English as the default language, but if such a multilingual wiki had worked for years with German on board and you then implemented the above, pages in other languages wouldn't be parsed properly. I couldn't achieve page content manipulation at parse time (by means of magic words). It may be either my own blind spot or the current parser design. P.S. With page properties, I had to set the page properties through the command line. You have to make an Action^WSpecial Page for that. Also, it will need some sort of restriction policy to prevent vandalism. -- [1] By determining the postfix (/en, /ru, /zh, etc.) On Wed, Apr 24, 2013 at 8:00 AM, MZMcBride z...@mzmcbride.com wrote: Erik Moeller wrote: I'd like to start a broader conversation about language support in MW core [...] Mailing lists are good for conversation, but a lot of your e-mail was insightful notes that I want to make sure don't get lost. I hope you'll eventually put together an RFC (https://www.mediawiki.org/wiki/RFC) or equivalent. [...] I'll stop there - I'm sure you can think of other issues with the current approach. For third party users, the effort of replicating something like the semi-acceptable Commons or Meta user experience is pretty significant, as well, due to the large number of templates and local hacks employed. Well, for Commons, clearly the answer is for everyone to write in glyphs. Wingdings, Webdings, that fancy new color Unicode that Apple has. Meta-Wiki, on the other hand, now that's a real problem. 
;-) Would it make sense to add a language property to pages, so it can be used to solve a lot of the above issues, and provide appropriate and consistent user experience built on them? (Keeping in mind that some pages would be multilingual and would need to be identified as such.) If so, this seems like a major architectural undertaking that should only be taken on as a partnership between domain experts (site and platform architecture, language engineering, Visual Editor/Parsoid, etc.). I'm not sure I'd call what you're proposing a major architectural undertaking, though perhaps I'm defining a much narrower problem scope. Below is my take on where we are currently and where we should head with regard to page properties. We need better page properties (metadata) support. A few years ago, a page_props table was added to MediaWiki: * https://www.mediawiki.org/wiki/Manual:Page_props_table Within the past year, MediaWiki core has seen the info action resuscitated and Special:PagesWithProp implemented: * https://www.mediawiki.org/w/index.php?title=MediaWikiaction=info * https://www.mediawiki.org/wiki/Special:PagesWithProp That is, a lot of the infrastructure needed to support a basic language property field already exists, in my mind. However, where we currently fall short is providing a reasonable interface for adding or modifying page properties. Currently, we use the page text to set nearly any property, via magic words (e.g., __NEWSECTIONLINK__ or {{DISPLAYTITLE:}}). The obvious advantage to doing this is the accountability, transparency, and reversibility of using the same system that edits rely on (text table, revision table). The obvious disadvantage is that the input system is a giant textarea. If we could design a sane interface for modifying page properties (such as display title and a default category sort key) that included logging and accountability and reversibility, adding page content language as an additional page property would be pretty trivial. 
(MediaWiki could even do neat tricks like take a hint from either the user interface language of the page creator or examine the page contents themselves to make an educated guess about the page content language.) And as a fallback, I believe every site already defines a site-wide content language (even Meta-Wiki and Commons). The info action can then report this information on a per-page basis and Special:PagesWithProp can allow lookups by page property (i.e., by page content language). MZMcBride ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Tutorial on Semantic MediaWiki in Russian
That is a nice starter guide! Thank you, Yury! On Fri, Apr 19, 2013 at 7:35 PM, Yury Katkov katkov.ju...@gmail.com wrote: Hi everyone! In Semantic MediaWiki we have great but very long documentation. I've tried to write a small introduction to SMW. It's only in Russian now (I know that some amount of Russian MediaWikers are reading this mailing list), but I think to write something similar in English (maybe to IBM DeveloperWorks). http://habrahabr.ru/post/173877/ If you don't know Russian you can enjoy the pictures anyway ;) Cheers, - Yury Katkov, WikiVote ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Project Idea for GSoC 2013 - Bayesian Spam Filter
On Sat, Apr 13, 2013 at 2:42 AM, Brian Wolff bawo...@gmail.com wrote: Qgill wrote: It might have a performance penalty in a site like English Wikipedia with plenty of concurrent edits, but for starters it could be potentially useful to the 99% of MediaWiki instances that have a significantly smaller number of daily edits and especially a very small number of editors and tools able / happy to deal with spam. Hmm. I was playing with nlp-ish automated newpage patrol recently. One thing that crossed my mind was if it becomes too expensive, one could run the classifier in the job queue (and hence on a dedicated server(s) ) and tag changes shortly after the fact. We have Parsoid running separately, don't we? Perhaps, the same approach could work here as well. -bawolff ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A new project - Wikicards
Indeed. We have a Universal Language Selector for 3rd-party websites already. This one would be super-duper great. This is where Wikidata can play its part as well! As for images, we already have the PageImages extension on all wikis, and we have Wikidata, where item properties may point to images. On Sun, Apr 7, 2013 at 7:05 PM, Petr Bena benap...@gmail.com wrote: On first sight I was like WTF is he proposing. Then I read your proposal and it sounds very nice. I hope you will be able to make it happen. On Sun, Apr 7, 2013 at 4:58 PM, Gaurav Chawla grvi...@gmail.com wrote: Hello, I am Gaurav Chawla, an undergraduate student at IIT Roorkee, India. I am applying for this year's Google Summer of Code as a developer for Wikimedia. For the past few days, I have been researching any new improvement or extension in functionality that can be introduced to this awesome wiki and finally came up with this idea: WikIcards - small information cards. I have drafted a project proposal with a full description of WikIcards and also a sample structure of it at http://www.mediawiki.org/wiki/User:Grv99 I have also filed it as an extension request (Bug 46970) at https://bugzilla.wikimedia.org/show_bug.cgi?id=46970 I request you to go through this new project and give your suggestions and feedback. You can give the feedback on the bug report. Alternatively, you can also add your suggestions in the Suggestions section near the end of the project proposal page. Please comment on its feasibility too. Hoping for a good response. Thank you. Regards, Gaurav Chawla ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] whatcanidoforwikipedia.org
Please don't forget to make the website translatable. Although most tech people speak English, they will be much more pleased to contribute if they are invited in their mother tongue. :) On Sun, Apr 7, 2013 at 7:17 PM, Quim Gil q...@wikimedia.org wrote: Thank you Yuvi for putting effort into new contributor outreach! Let me plug in the thoughts and discussions we have had in a similar direction so far. Back in January Ross (CCed) had the same idea and we started discussing, until I encouraged him to move to wiki pages here for further details. Also, let's start applying the lessons learned with the 'Wikitech contributors' debate. On 04/07/2013 03:11 AM, Yuvi Panda wrote: I came across http://www.whatcanidoformozilla.org/ today, and proceeded to register whatcanidoforwikipedia.org :) Why not do exactly the same but on mediawiki.org directly? All the effort put into promoting a brand new site could instead be put into promoting mediawiki.org in a new, fresh way. Why not use the technologies we are already developing? like https://www.mediawiki.org/wiki/Extension:GettingStarted https://www.mediawiki.org/wiki/Extension:GuidedTour We would become their users, and we would help test and improve them. The effort you are putting into contributing code on this new project could instead go into patches to those extensions, better CSS and look & feel for our site, etc. Producing and eating our own dog food. Why not improve mediawiki.org pages, making them friendly to newcomers, instead of creating content from scratch that would point to the same mediawiki.org pages that we would need to improve anyway? For example, no matter how nice your page on Lua is, the users clicking it would still land on https://www.mediawiki.org/wiki/Lua Thoughts on what to put there? I can already think of the following languages to put up: 1. PHP 2. JS 3. Lua 4. Python 5. Java 6. Obj-C 7. 
'Design' This recollection is exactly the same work we are calling One ontology at http://www.mediawiki.org/wiki/Project:New_contributors#One_ontology Why not creating http://www.mediawiki.org/wiki/Project:New_contributors/One_ontology , move this list there and continue defining more categories on that page? I'll start a wiki page sometime to collect content, and then spend some time writing the code - we can even fork the original site's code and use it. Yes please, start a wiki page documenting this effort that I hope becomes the same effort that we want to put in mediawiki.org. URL suggested: http://www.mediawiki.org/wiki/Project:New_contributors/What_can_I_do -- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] whatcanidoforwikipedia.org
Sure. Translatewiki FTW. :) Perhaps you would try to ask the Language Engineering Team for some consulting and assisting (Amire, Nikkerabit or Siebrand, saying off-hand). On Sun, Apr 7, 2013 at 7:25 PM, Yuvi Panda yuvipa...@gmail.com wrote: On Sun, Apr 7, 2013 at 9:50 PM, Paul Selitskas p.selits...@gmail.com wrote: Please, don't forget to make the website translatable. Although most tech people speak English, they will be much more pleased to contribute if they are invited in their mother tongue. :) Oh sure! What do you think will be the best way to go about this? Translatewiki.net? -- Yuvi Panda T http://yuvi.in/blog ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] whatcanidoforwikipedia.org
One more point (a problem less common for English): the lack of reliable sources. We've already had experience in getting access to various scientific works. We should pursue this goal as well. There may be people willing to write good articles, but those people lack the sources to start their work from. On Sun, Apr 7, 2013 at 10:43 PM, Brandon Harris bhar...@wikimedia.org wrote: I would love a single landing page that asks What Can I Do… and then has some really big, pretty buttons: [Write Content] [Write Code] [Donate Money] [Donate Services] And then each goes to separate pages that explain what can be done. On Apr 7, 2013, at 12:39 PM, Jon Robson jdlrob...@gmail.com wrote: +1 to what Stephen said!! On 7 Apr 2013 12:17, Steven Walling steven.wall...@gmail.com wrote: On Sunday, April 7, 2013, Yuvi Panda wrote: I came across http://www.whatcanidoformozilla.org/ today, and proceeded to register whatcanidoforwikipedia.org :) Thoughts on what to put there? I can already think of the following languages to put up: 1. PHP 2. JS 3. Lua 4. Python 5. Java 6. Obj-C 7. 'Design' I'll start a wiki page sometime to collect content, and then spend some time writing the code - we can even fork the original site's code and use it. Thoughts? Wikipedia needs editors just as much or more than it needs people who know PHP etc. I would prefer to focus on higher levels of contribution, one of which would be Code, which then drills down into the languages or frameworks. P.S. Can we *please* not bikeshed on the domain name? 
Domain names are cheap -- Yuvi Panda T http://yuvi.in/blog ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org javascript:; https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l --- Brandon Harris, Senior Designer, Wikimedia Foundation Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] 1.21wmf12 re-deployed (was Re: wmf12 rollback, all wikis (except test2) are on wmf11)
May this be connected with the deployment? https://www.wikidata.org/wiki/Wikidata:Project_chat#Special:ItemByTitle_changes_.22_.22_with_.22_.22_in_.22site.22 On Sat, Mar 23, 2013 at 12:26 AM, Greg Grossmeier g...@wikimedia.orgwrote: quote name=Greg Grossmeier date=2013-03-21 time=11:11:15 -0700 We're still diagnosing/etc. Thanks to Aaron Schulz for debugging with help from Chris Steipp and Aude testing we fixed the issue. We now have re-deployed 1.21wmf12 to the phase 1 and 2 wikis (see: https://www.mediawiki.org/wiki/MediaWiki_1.21/Roadmap ) To see the changes/reverts that were made, see the log on gerrit, here: https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/core+branch:wmf/1.21wmf12+topic:wmf/1.21wmf12,n,z (if that doesn't work, try http://goo.gl/Rw3nF ) Thanks, all, and have a good weekend, Greg -- | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E | | identi.ca: @gregA18D 1138 8E47 FAC8 1C7D | ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Who is responsible for communicating changes in MediaWiki to WMF sites?
Example: we are rolling out a fix to category sorting collations. That is a fix for a bug (introduced by developers, 3rd-party software, whatever), not an enhancement. Even so, notifying the community and getting its approval was requested. On Thursday, March 21, 2013, Quim Gil q...@wikimedia.org wrote: On 03/21/2013 02:55 AM, Niklas Laxström wrote: I've seen a couple of instances where changes to MediaWiki are blocked until someone informs the community. Someone is a volunteer. Community is actually just the Wikimedia project communities. Or at least the biggest ones, which are expected to complain and where the complaining would hurt. This situation seems completely unfair to me. WMF should be able to communicate upcoming changes itself, not throw it to volunteers. Volunteers can help, but they should not be responsible for this to happen. Can you point to the changes blocked, or to anything that would give a better idea to those of us who don't know what cases you are talking about? I agree with the principle, but without more details it is difficult to help fix the problem. -- Quim Gil Technical Contributor Coordinator @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Category sorting in random order
Can you add the Belarusian projects as well? 'bewiki' => 'uca-be', 'bewikisource' => 'uca-be', 'be_x_oldwiki' => 'uca-be', I was denied while sending a patch for review. On Mon, Mar 11, 2013 at 3:18 PM, Bartosz Dziewoński matma@gmail.com wrote: This is temporary, due to bug 45446 being fixed right now. Give it 24 hours :) (and then purge the cache). https://bugzilla.wikimedia.org/show_bug.cgi?id=45446 -- Matma Rex ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Category sorting in random order
Git review, of course. The log is here: http://pastebin.com/iC4N1am0 On Tue, Mar 12, 2013 at 1:46 AM, Matthew Flaschen mflasc...@wikimedia.orgwrote: On 03/11/2013 08:38 AM, Paul Selitskas wrote: Can you add Belarusian projects as well? 'bewiki' = 'uca-be', 'bewikisource' = 'uca-be', 'be_x_oldwiki' = 'uca-be', I was denied while sending a patch for review. Please file a bug if you haven't already. How did you attempt to do a patch? The recommended way is now Gerrit (https://www.mediawiki.org/wiki/Gerrit/Getting_started). Matt Flaschen ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Category sorting in random order
Yes, thanks for that question. I didn't check but this is obviously it. Sorry for spamming :) On Tue, Mar 12, 2013 at 2:15 AM, Brian Wolff bawo...@gmail.com wrote: To ask the obvious question, is the key you have scp configured to use the same as the one in your gerrit prefs? -bawolff On 2013-03-11 8:05 PM, Paul Selitskas p.selits...@gmail.com wrote: Git review, of course. The log is here: http://pastebin.com/iC4N1am0 On Tue, Mar 12, 2013 at 1:46 AM, Matthew Flaschen mflasc...@wikimedia.orgwrote: On 03/11/2013 08:38 AM, Paul Selitskas wrote: Can you add Belarusian projects as well? 'bewiki' = 'uca-be', 'bewikisource' = 'uca-be', 'be_x_oldwiki' = 'uca-be', I was denied while sending a patch for review. Please file a bug if you haven't already. How did you attempt to do a patch? The recommended way is now Gerrit (https://www.mediawiki.org/wiki/Gerrit/Getting_started). Matt Flaschen ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Global user CSS and JS
I may be saying rubbish, but... I think we should have a checkbox in Preferences where we can switch off global JS and CSS for the wiki where this checkbox is set/unset. Let's imagine I have a script which fits well for every project but Wikidata. Then I go to the preferences and just disable the global script in Wikidata. On Tue, Mar 5, 2013 at 4:00 AM, James Forrester jforres...@wikimedia.orgwrote: On 4 March 2013 14:59, Krenair kren...@gmail.com wrote: On 04/03/13 22:57, Matthew Flaschen wrote: Has anyone looked at allowing a user to have global CSS and JS across all WMF wikis? I know you can hack it with a mw.loader.load on all the wikis you use, but it would be useful if CentralAuth had it built in. Is there a bug for this? It seems so, yes: https://gerrit.wikimedia.org/r/7274 Bug: https://bugzilla.wikimedia.org/13953 Yes, it would be really lovely to get this enhancement fulfilled (either with that or new code); it's now on the backlog for admin tools development[*]. (I speak conflicted, as someone who's used the bot to fake this globally for my staff account.) [*] - https://www.mediawiki.org/wiki/Admin_tools_development/Roadmap#Other_tasks J. -- James D. Forrester Product Manager, VisualEditor Wikimedia Foundation, Inc. jforres...@wikimedia.org | @jdforrester ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
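[Editor's sketch] The per-wiki hack mentioned in this thread (a mw.loader.load call in each local common.js pulling one script from a "home" wiki) can be illustrated as follows. buildGlobalJsUrl is a hypothetical helper, not part of MediaWiki; the action=raw and ctype parameters are the standard index.php ones.

```javascript
// Hedged sketch of the workaround described above: each local
// User:Name/common.js would call mw.loader.load() with a URL like
// the one built here, fetching a single user script from a home
// wiki. This helper is an assumption for illustration only.
function buildGlobalJsUrl(homeWiki, userName) {
  return 'https://' + homeWiki + '/w/index.php' +
    '?title=User:' + encodeURIComponent(userName) + '/global.js' +
    '&action=raw' +            // serve the raw page content
    '&ctype=text/javascript';  // with a JavaScript content type
}
```

On each wiki one would then run something like mw.loader.load( buildGlobalJsUrl( 'meta.wikimedia.org', mw.config.get( 'wgUserName' ) ) ) — which is exactly the repetition a CentralAuth-level feature (or the preferences checkbox suggested above) would remove.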
Re: [Wikitech-l] Proper category collation support finally implemented!
The result in the link you provided is ideal. Г and Ґ are in one bucket, while У and Ў are separated. That's what we need. Great job done! On Thu, Feb 28, 2013 at 6:24 PM, Bartosz Dziewoński matma@gmail.com wrote: On Thu, 28 Feb 2013 00:33:57 +0100, Paul Selitskas p.selits...@gmail.com wrote: I had to add 'be-tarask' to $tailoringFirstLetters and set $wgCategoryCollation explicitly to make this thing work. Yes, it's not enabled by default. That should probably wait until the support is more battle-tested :) Can character mapping be also implemented here? For example, in Belarusian letter «Ґ» should be in the same section as «Г», and «Ў» in the same section as «У». It's not an urgent request, just my curiosity. I created a testwiki in Belarussian with uca-be collation to test this: http://users.v-lo.krakow.pl/~matmarex/testwiki-be/index.php?title=%D0%9A%D0%B0%D1%82%D1%8D%D0%B3%D0%BE%D1%80%D1%8B%D1%8F:Test It seems like Ґ and Г behave correctly. I don't know why Ў and У are separate; probably most languages they're used in consider them entirely separate letters. This is certainly doable, though; we simply need to make Ў not create a heading in the same way we made ё create one; it should start sorting under У then. I didn't realize this kind of behavior is possible :) (If they are sorted / separated differently on your install, you probably need to run the maintenance/languages/generateCollationData.php script - see https://bugzilla.wikimedia.org/show_bug.cgi?id=43740 .) -- Matma Rex ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
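[Editor's sketch] The character-mapping behaviour discussed here (Ґ sorting under the Г heading the way ё folds into a heading) can be shown with a toy first-letter function. The map below is an assumed example for Belarusian; MediaWiki actually achieves this through ICU collation tailoring, not a lookup table like this.

```javascript
// Toy illustration of bucket mapping for category headings: fold
// certain letters into the heading of a base letter before grouping
// entries. Assumed Belarusian mapping (Ґ→Г, Ў→У); not MediaWiki's
// real implementation, which uses ICU collation data.
var firstLetterMap = { 'Ґ': 'Г', 'Ў': 'У' };

function categoryHeading(title) {
  var first = title.charAt(0).toUpperCase();
  return firstLetterMap[first] || first;
}
```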
Re: [Wikitech-l] Proper category collation support finally implemented!
Does this need any maintenance/* runs? I want to test this for Belarusian (be + be-tarask), although now I have what I had before the git pull. On Thu, Feb 28, 2013 at 1:50 AM, Bartosz Dziewoński matma@gmail.com wrote: Just yesterday I managed to get https://gerrit.wikimedia.org/r/#/c/49776/ merged. Based heavily on Tim's work on the IcuCollation, it allows one to *finally* get articles to be correctly sorted on category pages for 67 languages based in latin, greek and cyrillic alphabets. I also created https://bugzilla.wikimedia.org/show_bug.cgi?id=45443 to track the process of getting this deployed to Wikimedia wikis. The process is already underway for uk.wiki and pl.wiki; if anybody technical wishes to get it on their wiki first, please create a sub-bug and start a community discussion/vote - I can provide a testwiki in your language :) Eventually, I'd like this to be deployed on all wikis in those 67 languages. I'll start poking people about this (and will drop a mail to -ambassadors) once wmf11 is deployed and the change goes live on a few wikis. -- Matma Rex ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Proper category collation support finally implemented!
I had to add 'be-tarask' to $tailoringFirstLetters and set $wgCategoryCollation explicitly to make this thing work. But it damn works! Awesome, thanks! Can character mapping be also implemented here? For example, in Belarusian letter «Ґ» should be in the same section as «Г», and «Ў» in the same section as «У». It's not an urgent request, just my curiosity. On Thu, Feb 28, 2013 at 2:27 AM, Bartosz Dziewoński matma@gmail.com wrote: On Thu, 28 Feb 2013 00:20:09 +0100, Paul Selitskas p.selits...@gmail.com wrote: Does this need any maintenance/* runs? I want to test this for Belarusian (be + be-tarask), although now I have what I had before the git pull. Yes, you need to run maintenance/updateCollation.php and then purge all category pages. And if you run into any weird display bugs (like letters sorting under headings containing weird symbols), check out https://bugzilla.wikimedia.org/show_bug.cgi?id=43740 . -- Matma Rex ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Problem with CentralAuth in MobileFrontend
Do you use the same protocol on Wikipedia and the other projects? When I first log in via HTTPS and then somehow get to HTTP, I need to log in again. On Thu, Feb 28, 2013 at 4:10 AM, Yuri Astrakhan yuriastrak...@gmail.com wrote: I am seeing this issue right now on desktop - I am logged in into en-wiki, and all other languages work, but the moment I switch to commons / wikiversity / wikiquote / etc, I need to log in. Seems like all the cross-site is broken (has it even worked before?) On Wed, Feb 27, 2013 at 8:02 PM, Juliusz Gonera jgon...@wikimedia.org wrote: Hi, Yesterday we released photo uploads in mobile (actually moved it from beta to stable). We're logging errors we get when people try to upload photos and it seems the most common one is that they're not logged in to Commons even though they logged in to Wikipedia. It seems to be happening quite randomly and doesn't seem to depend on any particular browser. We're not sure how to debug this. Have any similar issues ever happened on desktop (people not being logged in to other projects)? Is there anyone who has a good knowledge of how CentralAuth works and could help us? Thanks, Juliusz ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] WikiEditor caching (??)
WikiEditor is initialized when the 'ready' event is fired in JavaScript (/extensions/WikiEditor/modules/ext.wikiEditor.js):

$( document ).ready( function () {
	// Initialize wikiEditor
	$( '#wpTextbox1' ).wikiEditor();
} );

First we need to find out when the right moment to initialize WikiEditor is, that is, when all the needed modules have loaded. I'm not sure whether using mw.loader.using instead of $( document ).ready() would help fix the issue much. Well, I may be wrong :) On Sat, Feb 16, 2013 at 11:59 PM, vita...@yourcmc.ru wrote: vita...@yourcmc.ru wrote 2013-02-14 21:38: Hello Wiki Developers! I have a question: I think it's slightly annoying that WikiEditor shows up only some moment after the editing page loads and that the textarea gets moved down (because WikiEditor is only built dynamically via JS). Do you think it's possible to cache the generated WikiEditor HTML code in some way to speed up loading? Anyone? ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
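[Editor's sketch] The timing question above — waiting for both DOM readiness and module loading before building the toolbar — can be sketched with a small gate. This is generic illustrative JavaScript, not WikiEditor's or ResourceLoader's actual code; in MediaWiki terms, a mw.loader.using callback would play the modulesAreReady role.

```javascript
// Illustrative sketch, not actual WikiEditor code: run init() only
// once BOTH the DOM is ready and the required modules have loaded,
// whichever event arrives last, and never more than once.
function createInitGate(init) {
  var domReady = false, modulesReady = false, done = false;
  function maybeInit() {
    if (domReady && modulesReady && !done) {
      done = true;
      init();
    }
    return done;
  }
  return {
    domIsReady: function () { domReady = true; return maybeInit(); },
    modulesAreReady: function () { modulesReady = true; return maybeInit(); }
  };
}
```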
Re: [Wikitech-l] Font at dv.wikipedia and dv.wiktionary
These fonts seem to be packaged in a .deb package: http://bazaar.launchpad.net/~mvishah/ttf-dhivehi-fonts/trunk/view/head:/debian/copyright On Tue, Feb 12, 2013 at 7:30 PM, Gerard Meijssen gerard.meijs...@gmail.com wrote: Hoi, I have been looking at the font. The one thing that is important and I cannot find is an indication of the license. When the font is available under a free license, it will be relatively easy to get it and possibly other fonts as well included as a web font in MediaWiki. Thanks, GerardM On 12 February 2013 09:56, oәuɐႡɔsʇnәp nɐႡsn y.1...@hotmail.com wrote: Hello. The fonts at Dhivehi Wikipedia and Wiktionary are not webfonts. If possible could the fonts be changed to Faruma, which can be downloaded from here: http://bazaar.launchpad.net/~mvishah/ttf-dhivehi-fonts/trunk/view/head:/ttf-thaana-fonts/faruma.ttf. Originally this issue was raised at http://dv.wikipedia.org/wiki/User_talk:Glacious#Fonts by User:Vituzzu. I have tried http://www.mediawiki.org/wiki/Extension:WebFonts#Preparing_webfonts but to no result. Can someone help me regarding this matter. Links: dv.wikipedia.org dv.wiktionary.org MediaWiki - 1.21wmf8 (17a7840) PHP - 5.3.10-1ubuntu3.4+wmf1 (apache2handler) MySQL - 5.1.53-wm-log I have posted this question at the MediaWiki support desk here: http://www.mediawiki.org/wiki/Thread:Project:Support_desk/Font_at_dv.wikipedia_and_dv.wiktionary and User:MarkAHershberger directed me to https://bugzilla.wikimedia.org/show_bug.cgi?id=42812 but no one is responding in there. So he told me to mail to this list. Regards, Ushau ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Technical design review of extension
Hello. I wrote an extension[1] for Wikinews and Wiktionary to replace the JS-driven custom tabs (Opinions in Wikinews, Citations/Template documentation in Wiktionary). According to the manual, since nothing was heard back from Design-l, the extension must now undergo a technical design review before it is decided whether it can fly on production sites (the deployment review). -- [1] https://www.mediawiki.org/wiki/Extension:NamespaceRelations -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Technical design review of extension
On Sat, Feb 9, 2013 at 4:56 AM, Matthew Flaschen mflasc...@wikimedia.orgwrote: On 02/08/2013 08:35 PM, Paul Selitskas wrote: According to the manual, as there were no pros heard from Design-l, now the extension must undergo a technical design review, before it is decided whether the extension is able to fly on production sites - deployment review. Out of curiosity, where is this in the manual? Matt Flaschen https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment? -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Technical design review of extension
Well, those tabs are already used in Wikinews and Wiktionary, so I personally find no reason for design review. On Sat, Feb 9, 2013 at 5:14 AM, Tyler Romeo tylerro...@gmail.com wrote: I suppose that means there should be a review to see if the extension design is OK to be deployed live. *--* *Tyler Romeo* Stevens Institute of Technology, Class of 2015 Major in Computer Science www.whizkidztech.com | tylerro...@gmail.com On Fri, Feb 8, 2013 at 9:07 PM, Paul Selitskas p.selits...@gmail.com wrote: On Sat, Feb 9, 2013 at 4:56 AM, Matthew Flaschen mflasc...@wikimedia.org wrote: On 02/08/2013 08:35 PM, Paul Selitskas wrote: According to the manual, as there were no pros heard from Design-l, now the extension must undergo a technical design review, before it is decided whether the extension is able to fly on production sites - deployment review. Out of curiosity, where is this in the manual? Matt Flaschen https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment? -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l P.S. bikeshedding, guys... bikeshedding... -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] wikiscan / similar for english wikipedia
Perhaps we should not limit such requests to just the English Wikipedia. It's better to contact the project maintainers directly first. If this tool is really powerful and useful, maybe Wikimedia Labs could be a new home for it? On Thu, Jan 31, 2013 at 12:38 AM, rupert THURNER rupert.thur...@gmail.com wrote: hi, is there any possibility to have a list of users with contributions similar to: http://wikiscan.org/?menu=userstatsuserlist=Cat%C3%A9gorie%3AUtilisateur+participant+au+projet+Afrip%C3%A9dia for the english wikipedia? kr, rupert ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
It definitely needs a redesign or a different approach. I believe that putting the rendered view into data attributes is the worst practice ever. Data is for data, and if you want to put rendering onto the client's shoulders (that is why you want these data attributes, right?), then you should not mix the client and server sides together. On Wed, Jan 23, 2013 at 4:46 PM, Alex Brollo alex.bro...@gmail.com wrote: In the meantime, I tested the urlencode:...|WIKI trick; it runs perfectly for quotes, HTML tags such as br /, and link wikicode. Now it can be used both for tl|Autore and tl|Intestazione on it.wikisource, and I hope for tl|MediaWiki:Proofreadpage_index_template too. But it fails with templates; templates passed as a parameter are parsed before urlencode can do its masking job. See [[:commons:User:Alex brollo/Sandbox]] for my test, which uses an instance of a modified tl|Book (my interest is focused on the Book and Creator templates). Presently my way of recovering the data is a simple AJAX query, but as an ecologist I'd like to save both bandwidth and server load. :-) Alex brollo ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
What do you mean by any wikicode (template call, parameter, link) present in the value of an infobox parameter breaks the stuff, since it is parsed and expanded by the parser with unpredictable results? If your {{{author}}} may be empty and that's acceptable, then make it {{{author|}}}, or {{#if:{{{author|}}}|span .}}. Please clarify the statement above. On Tue, Jan 22, 2013 at 8:50 AM, Alex Brollo alex.bro...@gmail.com wrote: I tried to build a template which wraps template parameters into data- attributes. The first results have been encouraging; then I found something logical but unexpected, crushing the whole idea. I wrote into the code of an infobox-like template something like this: span data-author={{{author}}} data-birthdate={{{birthDate}}}/span and I very happily saw that the html code had my data wrapped into such span tags. But I was testing my code with clean templates, i.e. templates which have no wikicode in their parameter values (as usually occurs on it.wikisource). As soon as I tested my idea on another project (Commons) I found that any wikicode (template call, parameter, link) present in the value of an infobox parameter breaks the stuff, since it is parsed and expanded by the parser with unpredictable results. So... I ask you again: is there any sound reason (i.e. safety related, or server-load related) to avoid that HTML comments wrapped into the raw page wikicode are sent back into the html rendering as-they-are? Alex brollo ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
There will be no mess. You'll just get span data-author=[[Alessandro Manzoni]]/span (did you even lift^Wtry, bro? :)), at least on Wikipedia that is what I get. If it could pass raw HTML into attributes, you'd get a huge hole for XSSploits lovers. On Wed, Jan 23, 2013 at 1:03 AM, Alex Brollo alex.bro...@gmail.com wrote: 2013/1/22 Paul Selitskas p.selits...@gmail.com What do you mean by any wikicode (template call, parameter, link) present into the value of infobox parameter breaks the stuff, since it is parsed and expanded by parser with unpredictable results. If your {{{author}}} doesn't have anything and it's acceptable, then make it {{{author|}}}, or {{#if:{{{author|}}}|span .}}. Please clarify the statement above. Imagine that my infobox had a parameter author=, and imagine a clean content such as this: author=Alessandro Manzoni With my template code: span data-author={{{author}}}/span I get into the parsed html: span data-author=Alessandro Manzoni/span Perfect! But imagine that my template parameter is: author=[[Alessandro Manzoni]] When I pass the parameter content to span data-author={{{author}}}/span, I don't get into the html page what I'd like: span data-author=[[Alessandro Manzoni]]/span since the wikicode [[Alessandro Manzoni]] will be interpreted by the server and parsed/expanded into an html link as usual, resulting in a big mess. The same occurs for any wikicode and/or html passed into an infobox template parameter. Alex brollo ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
It will just strip the whole attribute if there is a quote in it. That is why we have {{urlencode:{{{1}}}|WIKI}} (or any other mode). On Wed, Jan 23, 2013 at 1:18 AM, Bawolff Bawolff bawo...@gmail.com wrote: On 2013-01-22 6:03 PM, Alex Brollo alex.bro...@gmail.com wrote: 2013/1/22 Paul Selitskas p.selits...@gmail.com What do you mean by any wikicode (template call, parameter, link) present into the value of infobox parameter breaks the stuff, since it is parsed and expanded by parser with unpredictable results. If your {{{author}}} doesn't have anything and it's acceptable, then make it {{{author|}}}, or {{#if:{{{author|}}}|span .}}. Please clarify the statement above. Imagine that my infobox had a parameter author=, and imagine a clean content as this: author=Alessandro Manzoni With my template code: span data-author={{{author}}}/span I get into parsed html: span data-author=Alessandro Manzoni/span Perfect! But imagine that my template parameter is: author=[[Alessandro Manzoni]] When I pass the parameter content to span data-author={{{author}}}/span, I don't get into the html page what I'd like: span data-author=[[Alessandro Manzoni]]/span since wikicode [[Alessandro Manzoni]] will be interpreted by the server, and parsed/expanded into a html link as usual, resulting in a big mess. The same occurs for any wikicode and/or html passed into an infobox template parameter. Alex brollo ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l Have you tried {{#tag:nowiki|{{{author} to prevent interpretation? There may still be issues with quotes. I'm not sure. -bawolff ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
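[Editor's sketch] The quote problem being debated here, encoding a value before it lands in an HTML attribute, looks roughly like the helper below. escapeAttr is hypothetical, not an existing MediaWiki parser function (the thread notes that urlencode is URL-encoding, which is not the same thing).

```javascript
// Hypothetical helper showing what HTML-encoding for an attribute
// means in this thread: quotes (and &, <, >) must become entities
// before the value is embedded, otherwise the attribute is truncated
// or becomes an XSS vector. Not an existing parser function.
function escapeAttr(value) {
  return String(value)
    .replace(/&/g, '&amp;')   // must run first, before new entities appear
    .replace(/"/g, '&quot;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}
```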
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
Filed a bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=44262. On Wed, Jan 23, 2013 at 1:34 AM, Matthew Flaschen mflasc...@wikimedia.orgwrote: On 01/22/2013 05:24 PM, Paul Selitskas wrote: It will just strip the whole attribute if there is a quote in. That is why we have {{urlencode:{{{1}}}|WIKI}} (or any other mode). URL-encoding is not the same as HTML-encoding for an attribute. I'm not sure if we have a parser function for the latter, though. Matt ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Extension code review
Hi, I'm writing a new extension and would like to get some code reviewers (even 1 would be fantastic). May I have any assistance? Extension: http://www.mediawiki.org/wiki/Extension:NamespaceRelations Current pending reviews: https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/NamespaceRelations,n,z -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] wikiCodeEditor - Code Editor for MediaWiki CSS and JavaScript pages
Hello. It looks fantastic at first sight, but it still has a lot of things to polish. I left some issue tickets on GitHub. And I hope that you'll see some pull requests soon! :) On Mon, Jan 14, 2013 at 1:25 AM, Sibi Prabakaran psibi2...@gmail.com wrote: Hello everyone! We have developed a Code Editor for MediaWiki CSS and JavaScript pages as part of the IIT Hackathon, which was mentored by Yuvi Panda. A basic description of the project along with snapshots can be found on this page: https://github.com/psibi/wikiCodeEditor#readme Any suggestions/improvements are welcome :) Regards, Sibi ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
Exactly. Nevertheless: is HTML5 already in use? If it isn't, when will it be introduced on any wiki? HTML5 was recently introduced on Wikipedia (and is on in MediaWiki by default, see $wgHtml5[1]). FYI, on be.wikisource data attributes are used to show links to both Belarusian Wikipedias on link hover![2] [1] http://www.mediawiki.org/wiki/Manual:$wgHtml5 [2] http://be.wikisource.org/wiki/MediaWiki:Common.js (bottom of the code) On Mon, Dec 31, 2012 at 2:17 PM, Alex Brollo alex.bro...@gmail.com wrote: Perfect! A data- attribute can contain anything and it runs perfectly. It can also contain a JSON-stringified object added in edit mode into a span (so that a whole dictionary can be passed in a single data- attribute). It's just what I needed. Alex brollo ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
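[Editor's sketch] The "whole dictionary in one data- attribute" idea quoted above can be sketched as a round trip: JSON-stringify on the way in, entity-escape the quotes so the attribute survives, and reverse both on the way out. These helpers are illustrative assumptions, not MediaWiki or jQuery APIs ($(selector).data() does the parsing step automatically in real code).

```javascript
// Illustrative round trip for packing an object into a data-
// attribute. Encode: stringify, then escape & and " so the value can
// sit inside a double-quoted HTML attribute. Decode: reverse both
// steps in the opposite order. Hypothetical helpers, for this sketch.
function toDataAttr(obj) {
  return JSON.stringify(obj)
    .replace(/&/g, '&amp;')
    .replace(/"/g, '&quot;');
}
function fromDataAttr(attrValue) {
  return JSON.parse(attrValue
    .replace(/&quot;/g, '"')
    .replace(/&amp;/g, '&'));
}
```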
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
Perhaps you chose the wrong approach. Dig into HTML5 data attributes, for example. That's a better data interface between the wikipage code and the view. You can then access them with the $(selector).data() method. On Sun, Dec 30, 2012 at 12:23 AM, Alex Brollo alex.bro...@gmail.com wrote: I'd like to use html comments in the raw wiki text, to use them as effective, server-inexpensive data containers that could be read and parsed by a js script in view mode. But I see that html comments written into the raw wiki text are stripped away by the parsing routines. I can access the raw code of the current page in view mode by js with an index.php or an api.php call, and I do, but this is much more server-expensive IMHO. Is there any sound reason to strip html comments away? If there is no sound reason, could such stripping be avoided? Alex brollo ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- З павагай, Павел Селіцкас/Pavel Selitskas Wizardist @ Wikimedia projects p.selits...@gmail.com, +375257408304 Skype: p.selitskas ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l