Re: [Wikitech-l] [Wikidata] JADE needs your feedback
On Thu, Oct 5, 2017 at 4:51 PM, Aaron Halfaker <aaron.halfa...@gmail.com> wrote:
> Before anyone comes up with something weird, JADE is pronounced like the
> ornamental rock. https://en.wikipedia.org/wiki/Jade :)

Since we're discussing the name, there used to be a nodejs templating language called "jade", but the people with a trademark on "jade" asked/forced them to change the name[1]. I'm not sure if that's something we need to be concerned about.

[1] https://github.com/pugjs/pug/issues/2184

--
Legoktm

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] MediaWiki-CodeSniffer 13.0.0 released
Hello!

MediaWiki-Codesniffer 13.0.0 is now available for use in your MediaWiki extensions and other projects. This release features new sniffs and improvements to existing ones.

And as you may have noticed, we jumped from 0.12.0 to 13.0.0. This is to comply with semantic versioning (new releases will just bump the major version) and to indicate the maturity of the project.

The changelog for this release:
* Add sniff for @cover instead of @covers (James D. Forrester)
* Add sniff to find and replace deprecated constants (Kunal Mehta)
* Add sniff to find unused "use" statements (Kunal Mehta)
* Add space after keyword require_once, if needed (Umherirrender)
* Fix @returns and @throw in function docs (Umherirrender)
* Prohibit some globals (Max Semenik)
* Skip function comments with @deprecated (Umherirrender)
* Sniff & fix lowercase @inheritdoc (Gergő Tisza)

Libraryupgrader is submitting patches for most Gerrit repositories. :)

Thanks,
--
Legoktm
Re: [Wikitech-l] HHVM vs. Zend divergence
Hi,

On 09/19/2017 10:50 AM, C. Scott Ananian wrote:
> Chad: the more you argue that it's not even worth considering, the more I'm
> going to push back and ask for actual numbers and facts. ;)

Migrating to Hack simply isn't feasible for most users of MediaWiki. Most distros don't have packages for HHVM, and it's not straightforward to build, AIUI. Once you've figured that out, you're going to quickly realize that you need a cron job to regularly restart HHVM because it'll OOM/crash/etc. So you either have downtime or multiple HHVM processes.

On the MediaWiki development side, we'd be forced to fork all of the external libraries we depend upon to support Hack. This includes basic tooling like composer, phpunit, codesniffer, and so on.

There are probably more reasons that could be listed, but I think you get the idea of why Chad considers it to be a non-starter.

--
Legoktm
Re: [Wikitech-l] HHVM vs. Zend divergence
Hi,

On 09/19/2017 10:27 AM, C. Scott Ananian wrote:
> To be super clear: MediaWiki is in *PHP5*. The choices are:

Well, kind of. MediaWiki has full support for PHP 7 while still maintaining PHP 5 support.

> 1) MediaWiki will always be in PHP5.
> 2) MediaWiki will eventually migrate to PHP7, or
> 3) MediaWiki will eventually migrate to Hack.
>
> Is anyone arguing for #1?

I don't think anyone has argued for anything besides #2.

> So we've got two backwards-incompatible choices to make, eventually.

There's already an open RfC[1] about bumping the minimum PHP requirement to 7.x.

[1] https://phabricator.wikimedia.org/T172165

--
Legoktm
Re: [Wikitech-l] HHVM vs. Zend divergence
Hi,

On 09/18/2017 05:13 PM, Tim Starling wrote:
> * The plan to also drop PHP 5 compatibility, on a short timeline (1 year).
> * Rather than "drifting away" from PHP, their top priority plans
> include removing core language features like references and destructors.

On Reddit[1], a member of the HHVM team clarified they plan on dropping support for destructors *from Hack* soon. (Not that I think it really makes any difference in what our long-term plan should be.)

[1] https://www.reddit.com/r/PHP/comments/70wtky/the_future_of_hhvm/dn6skdn/

--
Legoktm
[Wikitech-l] MinusX - a new tool to make sure files aren't executable (+x)
Hi,

Occasionally someone will unintentionally create a new file, or modify an existing one, with the executable bit (+x) set. This is almost always an accident, and usually someone will come along later and fix them en masse[1][2].

With help from Anomie, I've written a tool, MinusX[3], that will search for files that are executable but shouldn't be, and optionally fix them. It's written in PHP and should run as part of "composer test", but it operates on all types of files, not just PHP. I've proposed adding this tool to all repositories[4] in an automated manner.

If you have any suggestions/feature requests/bugs, feel free to create a ticket in Phabricator or reply here.

[1] https://phabricator.wikimedia.org/T168659
[2] https://phabricator.wikimedia.org/P5913
[3] https://www.mediawiki.org/wiki/MinusX
[4] https://phabricator.wikimedia.org/T175794

Thanks,
--
Legoktm
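The core idea - walk the tree, flag files with an executable bit that don't look like they should have one, and optionally clear it - can be sketched in a few lines. This is an illustration of the concept only, not MinusX's actual (PHP) implementation; the allowlist of extensions here is invented for the sketch:

```python
import os
import stat

# Extensions that legitimately carry +x in this sketch; a real tool's
# allowlist (MinusX's included) will differ.
ALLOWED_EXTENSIONS = {".sh"}

EXEC_BITS = stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH

def find_wrongly_executable(root):
    """Return paths under `root` that are executable but shouldn't be."""
    bad = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Skip VCS metadata, as a real tool would.
        dirnames[:] = [d for d in dirnames if d != ".git"]
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if mode & EXEC_BITS and os.path.splitext(name)[1] not in ALLOWED_EXTENSIONS:
                bad.append(path)
    return sorted(bad)

def fix(paths):
    """Clear the executable bits, mirroring the tool's optional fix mode."""
    for path in paths:
        os.chmod(path, os.stat(path).st_mode & ~EXEC_BITS)
```

Running the checker and then `fix()` over its output is roughly what "check" versus "fix" mode would do.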
Re: [Wikitech-l] Recommending Firefox to users using legacy browsers?
Hi,

On 09/01/2017 07:06 PM, Chad wrote:
> (3) I would *really* like to have 2--maybe 3--browsers to list. There's
> zero reason to make users think there's only one option when there's a
> couple of valid ones.

I think Neil hit my goals on the head, and ideally there would be multiple browsers we could recommend. Skipping the "support a like-minded organization" part for a bit, here are the criteria I would want to see for a web browser we recommend:

1. Free software/open source
2. Privacy friendly
3. Good security track record and active security team
4. Easy for non-technical users to use

Firefox and Chromium both clear 1-3 pretty easily. For #4 I think Firefox also meets it - I know people have concerns with Firefox dropping legacy XUL addons or other changes to defaults, but I don't think people coming from IE 8 would really care or even notice. According to the "Download Chromium" page[1], their builds are based on the current master and don't auto-update, so I don't think Chromium meets #4, since users would need to download updates manually. And Google Chrome, which does auto-update, doesn't meet #1 or #2.

After Firefox and Chromium, there's a bunch of open source web browsers listed on [2], but a brief spot check showed many as being Linux-only (or having outdated Mac builds). One that looked promising was Brave[3], though it's a relatively new browser and I would need to do more research regarding #3.

Ultimately I think Firefox is the best and probably only choice for us to recommend to users. Given that Mozilla is an organization that is aligned with us ideologically, I feel comfortable recommending their product on Wikimedia sites, and trust that they're not suddenly going to add privacy-invasive features. That said, if there's another browser that also meets the criteria listed above, I would be totally open to recommending it as well.
[1] https://www.chromium.org/getting-involved/download-chromium
[2] https://en.wikipedia.org/wiki/Comparison_of_web_browsers
[3] https://en.wikipedia.org/wiki/Brave_(web_browser)

--
Legoktm
Re: [Wikitech-l] libraryupgrader will now upgrade libraries
Sorry, I totally forgot to include a link to the main project page: <https://www.mediawiki.org/wiki/Libraryupgrader>.

--
Legoktm
[Wikitech-l] MediaWiki-CodeSniffer 0.12.0
Hello!

MediaWiki-CodeSniffer 0.12.0 is now available for use in your MediaWiki extensions and other projects. This release fixes bugs from 0.11.0 and adds some new features:

* Add sniff to ensure floats have a leading `0` if necessary (Kunal Mehta)
* Add sniff to ensure the class name matches the filename (Kunal Mehta)
* Change bootstrap-ci.php to match PHP CodeSniffer 3.0.0 (Umherirrender)
* Check for unneeded punctuation in @param and @return (Umherirrender)
* Check spacing after type in @return (Umherirrender)
* Check spacing before type in @param and @return (Umherirrender)
* Clean up test helpers (Kunal Mehta)
* Do not mess long function comments on composer fix (Umherirrender)
* Enforce "short" type definitions in multi types in function comments (Umherirrender)
* Make it easier to figure out which test failed (Kunal Mehta)
* phpunit: replace deprecated strict=true (Umherirrender)
* Remove GoatSniffer integration (Kunal Mehta)
* Remove unmatched @codingStandardsIgnoreEnd (Umherirrender)
* Rename OpeningKeywordBracketSniff to OpeningKeywordParenthesisSniff (Reedy)
* Use local OneClassPerFile sniff for only one class/interface/trait (Kunal Mehta)

Patches were submitted by libraryupgrader for all MediaWiki extensions and skins to update them to the newest version (see my other email for more details).

Thanks,
--
Legoktm
[Wikitech-l] libraryupgrader will now upgrade libraries
Hi,

In the past I used a few scripts on my laptop to submit patches and review bumps to libraries, usually for MediaWiki-CodeSniffer. I've now migrated this to a separate account, "libraryupgrader", that is now semi-automatically doing the entire process, and runs inside a Cloud VPS project. I've also filed a request for the bot to have +2 permissions for simple library bumps[1].

(And thank you to Niharika for stepping up and helping review today's library update of MediaWiki-CodeSniffer!)

[1] https://phabricator.wikimedia.org/T174760

Thanks,
--
Legoktm
Re: [Wikitech-l] Recommending Firefox to users using legacy browsers?
Hi,

On 08/31/2017 02:20 PM, Fæ wrote:
> +1 on appearing to be a slippery slope and benefiting from wider,
> political, discussion.

Just to clarify, I fully plan on turning this into a wider discussion on Meta or an alternative venue if/when pursuing this further. I was just trying to use wikitech-l as a place to gauge initial reactions from. Where do you think the slippery slope would lead us? I don't think we're ever going to tell our users to start using GNU/Linux or something.

> I've promoted Wikimedia and projects as being deliberately agnostic.

I think we aim for this, but it isn't actually the case when it comes to browser support. For some time Chromium users had better load performance than Firefox users due to how localStorage was used, and in another case Opera 12 users couldn't access some pages with apostrophes in them.

In this case, I'm deliberately proposing that we do take a side and align ourselves with Mozilla/Firefox. The main takeaway I got from the Wikimania session I mentioned earlier was that all of us free software and open content projects need to work together and support each other. We've already seen the open web lose when Mozilla gave in to EME, simply because it didn't have enough market share to actually make a difference[1]. I'm afraid of a future where we no longer have an ally who can defend and push the shared Wikimedian ideals in the web browser space.

> Strategically, locking Wikimedia into fixed relationships with other
> organizations with their own drives and timelines, is going to
> increase risks downstream.

I do agree this adds risks for us, like in terms of public image if something bad happens regarding Firefox. But I don't think it should be a locked/fixed relationship; it should be something that we can say "no, this isn't working" about and turn off whenever we need to.
[1] https://blog.mozilla.org/blog/2014/05/14/drm-and-the-challenge-of-serving-users/

--
Legoktm
Re: [Wikitech-l] Recommending Firefox to users using legacy browsers?
Hi,

On 08/31/2017 01:51 PM, Max Semenik wrote:
> +1 to that. Additionally, the proposed method wouldn't even work because we
> blacklist crappy browsers from receiving JS.

This isn't strictly true, we give legacy browsers some JS to make new HTML5 elements work using html5shiv[1]. And I think the specific implementation can be fleshed out a little later, that was just a concept/idea. :)

[1] https://gerrit.wikimedia.org/r/#/c/369991/

--
Legoktm
[Wikitech-l] Recommending Firefox to users using legacy browsers?
Hello,

This was something that came up during "The Big Open" at Wikimania, when Katherine Maher talked with Ryan Merkley (CEO of Creative Commons) and Mark Surman (ED of Mozilla Foundation). One of the themes mentioned was that our projects need to work together and support each other.

In that vein, I'm interested in what people think about promoting Firefox to users who are using legacy browsers that we don't support at Grade A (or some other criteria). As part of the "drop IE8 on XP" project[1] we're already promoting Firefox as the alternative option. I was imagining it could be a small and unobtrusive bubble notification[2], similar to the ones Google uses to push Chrome on people.

If users use modern browsers, they're going to have better security support, and most likely a better experience browsing Wikimedia sites too. We'd be improving the web by reducing legacy browsers, and allowing us to move forward with newer technology sooner (ideally). And we'd be supporting a project that is ideologically aligned with us: Mozilla.

Thoughts, opinions?

[1] https://phabricator.wikimedia.org/T147199
[2] https://www.mediawiki.org/wiki/Bubble_notifications

Thanks,
--
Legoktm
[Wikitech-l] MediaWiki-GoatSniffer 0.11.1 released
Hello!

Straight from Wikimania, MediaWiki-GoatSniffer 0.11.1 is now available for use in your MediaWiki extensions and other projects. This release fixes bugs from 0.10.0 and adds some new features:

* Add GoatSniffer ASCII art (Kunal Mehta)
* Added OpeningKeywordBraceSniff (Umherirrender)
* Add sniff to forbid PHP 7 scalar type hints (Kunal Mehta)
* Enable Squiz.WhiteSpace.OperatorSpacing (Umherirrender)
* Enforce "short" type definitions on @param in comments (Umherirrender)
* Fix phpunit test on windows (Umherirrender)
* Fix Undefined offset in FunctionCommentSniff (Umherirrender)

I already updated most extensions to 0.11.0 a few days ago; it's up to each extension maintainer if they'd also like to update to 0.11.1.

Thanks,
--
Legoktm
[Wikitech-l] What makes a high quality MediaWiki extension so nice?
Hi,

I've proposed a roundtable-like discussion for the Hackathon on Thursday to discuss what makes high quality extensions so nice. The goal is to collect a list of things that developers like and look for when reviewing extensions, as advice for newer developers.

Even if you aren't able to attend in person, you can leave suggestions in the comments at <https://phabricator.wikimedia.org/T172845>.

See you then!
--
Legoktm
[Wikitech-l] MediaWiki-Codesniffer 0.10.1 released
Hello!

MediaWiki-Codesniffer 0.10.1 is now available for use in your MediaWiki extensions and other projects. This release fixes bugs from 0.10.0 and adds some new features:

* Add .gitattributes (Umherirrender)
* Add Squiz.Classes.SelfMemberReference to ruleset (Kunal Mehta)
* build: Added php-console-highlighter (Umherirrender)
* Don't ignore files or paths with "git" in them, only .git (Kunal Mehta)
* Fix exclude of common folders (Umherirrender)
* Fix "Undefined index: scope_opener" in SpaceBeforeClassBraceSniff (Reedy)
* Forbid backtick operator (Matthew Flaschen)
* Ignore returns in closures for MissingReturn sniff (Kunal Mehta)
* PHP CodeSniffer on CI should only lint HEAD (Antoine Musso)
* Reduce false positives in ReferenceThisSniff (Kunal Mehta)
* Sniff that the short type form is used in @return tags (Kunal Mehta)
* Swap isset() === false to !isset() (Reedy)
* track=1 rather than defaultbranch (Reedy)
* Update PHP_CodeSniffer to 3.0.2 (Kunal Mehta)

I'll be working on submitting patches for extensions again shortly.

Thanks,
--
Legoktm
[Wikitech-l] MediaWiki-Codesniffer 0.10.0 released
Hello!

MediaWiki-Codesniffer 0.10.0 is now available for use in your MediaWiki extensions and other projects. This release fixes bugs from 0.9.0 and adds some new features:

* Add sniff to prevent against using PHP 7's Unicode escape syntax (Kunal Mehta)
* Add sniff to verify type-casts use the short form (bool, int) (Kunal Mehta)
* Add sniff for `&$this` that causes warnings in PHP 7.1 (Kunal Mehta)
* Clean up DbrQueryUsageSniff (Umherirrender)
* Ensure all FunctionComment sniff codes are standard (Kunal Mehta)
* Exclude common folders (Umherirrender)
* Fix handling of nested parenthesis in ParenthesesAroundKeywordSniff (Kunal Mehta)
* IllegalSingleLineCommentSniff: Check return value of strrpos strictly (Kunal Mehta)
* Improve handling of multi-line class declarations (Kunal Mehta)
* Include sniff warning/error codes in test output (Kunal Mehta)
* Make DisallowEmptyLineFunctionsSniff apply to closures too (Kunal Mehta)
* Use correct notation for UTF-8 (Umherirrender)

I'll be working on submitting patches for extensions again shortly.

Thanks,
--
Legoktm
[Wikitech-l] MediaWiki-Codesniffer 0.9.0 released
Hi!

MediaWiki-CodeSniffer 0.9.0 is now available for use in your MediaWiki extensions and other projects. Here are the notable changes since the last release:

* Add sniff to enforce "function (" for closures (Kunal Mehta)
* Add usage of && in generic_pass (addshore)
* Disallow `and` and `or` (Kunal Mehta)
* Don't require documentation for constructors without parameters (Kunal Mehta)
* Don't require documentation for '__toString' (Kunal Mehta)
* Don't require return/throws/param for doc blocks with @inheritDoc (Kunal Mehta)
* Expand list of standard methods that don't need documentation (Kunal Mehta)
* Fix FunctionComment.Missing sniff code (Kunal Mehta)
* Fix indentation (Umherirrender)
* Fix WhiteSpace/SpaceAfterClosureSniff (Antoine Musso)
* Make sure all files end with a newline (Kunal Mehta)
* test: ensure consistent report width (Antoine Musso)
* Update for CodeSniffer 3.0 (Kunal Mehta)
* Update squizlabs/PHP_CodeSniffer to 3.0.1 (Reedy)
* Use upstream CharacterBeforePHPOpeningTag sniff (Kunal Mehta)

The 3.0 upstream release should bring in a lot of under-the-hood improvements, including support for parallel processing! Note that due to an upstream bug, PHPCS 3.x doesn't work with repos that have "prepend-autoloader": false set.

I've begun to submit patches to upgrade to the latest version and automatically fix some of the issues: <https://gerrit.wikimedia.org/r/#/q/topic:bump-dev-deps+status:open>.

Thanks to everyone who contributed to this release; please file bugs or feature requests in the MediaWiki-Codesniffer Phabricator project if you find any.

--
Legoktm
Re: [Wikitech-l] "Unloading"/reloading ResourceLoader modules the proper way?
Hi,

On 06/18/2017 01:00 PM, Jack Phoenix wrote:
> What would be the proper way to signal to ResourceLoader, "actually I
> need you to load this module again"?

I don't think ResourceLoader was really designed for the concept of "unloading" modules. Instead I'd suggest scoping the CSS to some class like "theme-name" and then using JS to set and remove that class from the <body> element (or whatever other element) as needed.

--
Legoktm
[Wikitech-l] +2 request for Ladsgroup (Amir)
Hi,

Ladsgroup filed a request[1] for +2 in mediawiki/* repos that I missed until now, and it hadn't been sent to this mailing list. Please comment if you haven't already and I'll close it in a few days.

[1] https://phabricator.wikimedia.org/T165860

--
Legoktm
Re: [Wikitech-l] use of @inheritdoc
Hi,

On 05/13/2017 03:43 PM, Ryan Kaldari wrote:
> A question that came up during Documentation Day was whether or not we
> should use the @inheritdoc convention for indicating that methods are
> documented in parent classes (rather than leaving them blank).
>
> Some arguments for using it include:
> * It can help code-sniffers and linters know that the documentation isn't
> missing.
> * It lets humans who are looking at the code know that they can find
> documentation elsewhere.
> * We are already using it. (It appears over 200 times in core JavaScript
> and about 40 times in core PHP. It's also used in at least 19 production
> extensions.)
>
> Some arguments for not using it:
> * It isn't necessary for generating docs at doc.wikimedia.org (Doxygen
> automatically inherits method docs if available).
> * It would require adding thousands of extra comment blocks to our code to
> implement it consistently.
>
> What are people's opinions on using it?

I think the pros you mentioned outweigh the cons. I currently have a patch[1] for MW-Codesniffer to recognize methods with "@inheritDoc" (but not when surrounded by braces) as documented and not flag them.

[1] https://gerrit.wikimedia.org/r/#/c/353623/

--
Kunal
Re: [Wikitech-l] Global language preference
Hi,

On 05/11/2017 07:29 AM, Amir E. Aharoni wrote:
> 2017-05-11 17:16 GMT+03:00 Brad Jorsch (Anomie) <bjor...@wikimedia.org>:
>> Of course, if someone does want to work on implementing a global
>> preferences extension and doing the work to get it deployed, then more
>> power to them. A better solution to be sure, but it'll likely take more
>> time and effort.
> My understanding is that the code is kind of there already, but it would
> have to be deployed, and before deployment it would have to be reviewed. So
> the manager to convince would be the manager of the team that needs to
> review and deploy it.

So I wrote Extension:GlobalPreferences[1] a while back because the Tool Labs tool I wrote was a pretty hacky prototype that I was unhappy with. The main place the extension needs work is a UI that isn't extremely hacky. The part that loads user options out of a central table and overrides locally set ones is basically done.

[1] https://www.mediawiki.org/wiki/Extension:GlobalPreferences

--
Legoktm
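The override behaviour described here - options loaded from a central table, with locally set options winning - boils down to layered merging. A minimal sketch of that semantics (the function name, layer names, and preference keys are invented for illustration; this is not the extension's actual code):

```python
def effective_options(site_defaults, global_prefs, local_prefs):
    """Merge preference layers, later layers winning:
    site defaults < globally saved preferences < locally set preferences.
    Mirrors the behaviour described for GlobalPreferences, not its API."""
    merged = dict(site_defaults)
    merged.update(global_prefs)  # central table overrides site defaults
    merged.update(local_prefs)   # locally set options override global ones
    return merged
```

So a wiki where the user explicitly set a skin keeps that skin, while the globally saved language still applies everywhere else.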
[Wikitech-l] MediaWiki-Codesniffer 0.8.0 released
Hi,

MediaWiki-Codesniffer 0.8.0 is now available for use in your MediaWiki extensions and other projects. Here's the full list of changes since the last release (0.8.0-alpha.1):

* Add sniff for cast operator spacing (Sam Wilson)
* Allow filtering documentation requirements based on visibility (Kunal Mehta)
* Don't require documentation for test cases (Kunal Mehta)
* Don't require @return annotations for plain "return;" (Kunal Mehta)
* Explicitly check for method structure before using (Sam Wilson)
* Fix test result parsing, and correct new errors that were exposed (Sam Wilson)
* Prevent abstract functions being marked eligible (Sam Wilson)
* PHP_CodeSniffer to 2.9.0 (Paladox)

This release contains a lot of new rules around documenting functions and ensuring parameters and return types are specified. It's most likely that existing code will not pass the new ruleset without documentation improvements (see the MediaWiki Documentation Day email ;)).

If you encounter any bugs or have suggestions for new rules, please reply to this thread or file a bug in the #MediaWiki-Codesniffer Phabricator project.

Thanks,
--
Legoktm
Re: [Wikitech-l] Issue loading JS with ResourceLoader but not with addScript
Hi,

On 04/17/2017 02:17 PM, James Montalvo wrote:
> Uncaught Error: Module "jquery" is not loaded.

Does your module definition have a dependency on 'jquery'? You should remove that, as it will cause errors like this. jQuery is always loaded before any other RL modules are.

--
Legoktm
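Concretely, the fix is to drop "jquery" from the module's dependency list in its ResourceLoader registration. A hypothetical extension.json fragment (the module and file names here are invented for illustration) would keep only real module dependencies:

```json
{
	"ResourceModules": {
		"ext.myExtension.thing": {
			"scripts": [ "modules/ext.myExtension.thing.js" ],
			"dependencies": [
				"mediawiki.api"
			]
		}
	}
}
```

The module's scripts can still use `$` freely, since jQuery is loaded before any registered module runs.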
Re: [Wikitech-l] ArchCom Minutes, News, and Outlook
Hi,

On 03/15/2017 03:58 PM, Gabriel Wicke wrote:
> Thanks for everyone who participated in the discussion. Unfortunately, we
> ran into a technical issue with setting up the youtube stream that we
> weren't able to resolve quickly (my apologies to those unable to follow the
> stream), but we did take detailed notes
> <https://docs.google.com/document/d/1jlBl_qAIrGPF7zqOmK77Db8y4InO1j3qX1qqipzJNmc/edit#>
> that you can peruse and comment on.

As requested earlier, will these documents be moved to mediawiki.org, or is it now required to use Google's proprietary software to discuss the architecture of MediaWiki?

--
Legoktm
Re: [Wikitech-l] ArchCom Minutes, News, and Outlook
Hi,

On 03/10/2017 01:11 PM, Gabriel Wicke wrote:
> On Fri, Mar 10, 2017 at 12:06 PM, Legoktm <legoktm.wikipe...@gmail.com>
> With audio and video, hangouts provide a somewhat higher bandwidth, and
> avoid the problem of many people discussing several topics at the same time
> that larger IRC meetings frequently run into.
>
>> At least for me, Google Hangouts simply isn't an option
>> to participate when I'm in a crowded library/classroom.
>
> You can still listen in & ask questions via the hangout chat, or the
> regular office IRC channel. Adam volunteered to monitor the channel, so
> that questions can be addressed. The meeting will also be recorded at the
> youtube link I provided, so you can catch up later, and ask follow-up
> questions by mail.

I should have been more explicit - listening in isn't an option for me either. I'll watch the recording later, but that entirely defeats the "higher bandwidth" point.

--
Legoktm
Re: [Wikitech-l] ArchCom Minutes, News, and Outlook
Hi,

On 03/09/2017 08:17 AM, Daniel Kinzler wrote:
> Next week’s RFC meeting (tentative, pending confirmation):
> * explore High - Level Mobilefrontend Requirements (JavaScript frameworks,
> Progressive Apps, and all that jazz)
> <https://docs.google.com/document/d/1id-E_KELGGA3X5H4K44I6zIX3SEgZ0sF_biOY4INCqM/edit#heading=h.xs2aq4j4wzse>.

I didn't try to open the link until now, but it requires a Google account to view, and is only visible to those in the WMF - could it be moved to mediawiki.org please?

--
Legoktm
Re: [Wikitech-l] ArchCom Minutes, News, and Outlook
Hi,

On 03/09/2017 08:17 AM, Daniel Kinzler wrote:
> * NOTE: we plan to experiment with having a public HANGOUT meeting, instead of
> using IRC.

Can I ask why? At least for me, Google Hangouts simply isn't an option to participate when I'm in a crowded library/classroom.

--
Legoktm
Re: [Wikitech-l] 2017-02-10 Scrum of Scrums meeting notes
Hi,

On 02/01/2017 11:03 AM, Grace Gellerman wrote:
> == Product ==
> === Reading ===
> Web
> ** PagePreviews schema changes.
> ** Ongoing work towards enabling PagePreviews to use the RESTBase endpoint.
> * Next week: https://phabricator.wikimedia.org/project/view/2460/

What exactly is "PagePreviews"? From <https://www.mediawiki.org/w/index.php?title=Special:Search=pagepreviews> it looks to be the second renaming of the Popups extension/Hovercards, except this time it's named very similarly to the actual "show preview" feature...? Is there a rationale somewhere about the renaming?

--
Legoktm
Re: [Wikitech-l] +2 request for yurik in mediawiki and maps-dev
Hi,

On 01/26/2017 08:03 AM, Quim Gil wrote:
> For the record: Yuri got his permissions restored 52 minutes after the task
> was filed, without much discussion. The task is Resolved since then. Can we
> resolve this thread with this title as well, please?

Yes, thanks. I respectfully ask people not to respond further to this thread; if you have questions, please ask me off-list.

--
Legoktm
[Wikitech-l] +2 request for yurik in mediawiki and maps-dev
Hi,

After speaking with Yurik, I've filed <https://phabricator.wikimedia.org/T156219> on his behalf to restore his membership in the mediawiki and maps-dev groups. I would appreciate guidance on whether these rights can be summarily granted since he used to have them, or whether it needs to go through the full process.

--
Legoktm
Re: [Wikitech-l] How to clear backlinks cache?
Hi,

On 01/24/2017 02:34 PM, Victor Porton wrote:
> It seems that the problem is that ParserAfterTidy is not called for old
> pages (pages existed before I installed my extension which adds
> additional links in HtmlPageLinkRendererEnd hook).

You can run the refreshLinks.php maintenance script to force the pages to be re-parsed.

--
Legoktm
[Wikitech-l] Updated texvc (Math rendering) Debian packages now available
Hi,

texvc is an OCaml program that generates PNG images for the Math extension. Packages for Debian and Ubuntu are now available; please see <https://www.mediawiki.org/wiki/User:Legoktm/Packages/texvc> for more details. This package is useful regardless of whether you are using the mediawiki package or not - it allows you to use texvc without needing to install the full OCaml toolchain to build it manually.

Finally, in the long term the Math extension maintainers would like to phase out texvc in favor of the mathoid-based code. If you do use texvc (packaged or not), your comments on <https://phabricator.wikimedia.org/T155201> would be appreciated.

--
Legoktm
Re: [Wikitech-l] [SECURITY] Math extension - shell invocation followup
Hi,

Somewhat related: in the last MediaWiki security release, the bugs already had CVE numbers assigned to them. Would it be possible to get CVE IDs for extension security issues in advance as well?

--
Legoktm
Re: [Wikitech-l] pywikibot troubleshooting in recitation-bot
Hi, +cc pywiki...@lists.wikimedia.org On 12/22/2016 03:55 PM, Anthony Di Franco wrote: > Hi all, > I'm doing some renovations on recitation-bot and running into trouble when > the time comes for pywikibot to upload article data to wikisource and > commons. The thread doing so hangs without any sort of informative error. I > made sure that the unix user under which the web service that is using > pywikibot is running is logged into each wiki per Max's advice but I still > have the problem. I'm going to try to get more information about what's > going on but would also appreciate pointers about what might be going > wrong. Particularly, the web service is now running under Kubernetes rather > than sun grid engine, so I suspect that the login state might not be making > it into the container - can anyone advise on where the login state is > maintained and whether this will be transferred into the kubernetes > container? Pywikibot stores all of its state in the same directory that your user-config.py file is in. In my experience, Python hanging is typically an accidental infinite loop. Adding debug logging can help pinpoint where it starts hanging and narrow down the problematic code. -- Legoktm
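A generic, hedged illustration of the "add debug logging" advice above: Python's standard-library faulthandler module can periodically dump every thread's stack trace, which shows exactly where an apparently hung process is stuck. This is ordinary Python debugging, not pywikibot-specific API, and the 60-second interval is an arbitrary choice.

```python
# Sketch: diagnosing an apparently hung Python process with the standard
# library's faulthandler module. Not pywikibot-specific; the interval is
# an arbitrary illustrative value.
import faulthandler
import sys

# Periodically dump every thread's stack trace to stderr. If the process
# is stuck in a loop or blocked on I/O, the repeating traceback shows
# which line it is stuck on.
faulthandler.dump_traceback_later(60, repeat=True)

# ... long-running bot code would go here ...

# A one-off dump can also be triggered manually at any point:
faulthandler.dump_traceback(file=sys.stderr)
```

If the same frames appear in every dump, that is almost certainly the loop or blocked call responsible for the hang.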
Re: [Wikitech-l] +2 nomination for Fomafix in mediawiki/
Hi, > I've filed <https://phabricator.wikimedia.org/T153177>, nominating > Fomafix for +2 in mediawiki/ repositories. I've now closed this as successful. Congrats! -- Legoktm
[Wikitech-l] Goodbye grrrit-wm, hello again wikibugs
Hi, grrrit-wm has been subsumed by wikibugs. You'll now see events from Gerrit reported to IRC by wikibugs. The main purpose of this was to unify the pretty similar codebases and infrastructure instead of having to maintain them separately. I also took the opportunity to fix some silly bugs like the "[]" one. I'll be archiving the "grrrit-wm" Phabricator project shortly - you can use the "wikibugs" one to report any issues you notice. -- Legoktm
[Wikitech-l] +2 nomination for Fomafix in mediawiki/
Hi, I've filed <https://phabricator.wikimedia.org/T153177>, nominating Fomafix for +2 in mediawiki/ repositories. -- Legoktm
Re: [Wikitech-l] phan (new static analysis tool) is now voting on MediaWiki core
Hi, On 12/12/2016 10:03 PM, Sam Wilson wrote: > Is it possible now to add Phan to extensions' CI as well? That's the next step :). Filed <https://phabricator.wikimedia.org/T153039> for it. -- Legoktm
[Wikitech-l] phan (new static analysis tool) is now voting on MediaWiki core
Hi! Thanks to some awesome work by Erik Bernhardson, phan[1], a new static analysis tool, is now voting[2] on MediaWiki core patches. It's significantly more advanced than any of our other current tools, and should help identify some types of errors. It uses PHP 7's AST to process code, but is capable of analyzing PHP5 code. There's documentation on mediawiki.org[3] about how it is currently configured, and how to set it up locally. You'll need PHP 7 with the ast extension to actually run phan. If that's not possible for your system, you can rely on jenkins to run it for you. [1] https://github.com/etsy/phan [2] https://phabricator.wikimedia.org/T132636 [3] https://www.mediawiki.org/wiki/Continuous_integration/Phan -- Legoktm
Re: [Wikitech-l] Changes in colors of user interface
Hi, On 12/09/2016 01:56 PM, Strainu wrote: > 2016-12-09 10:53 GMT+02:00 Amir Ladsgroup <ladsgr...@gmail.com>: >> But these changes are too small to notice and even smaller to dislike. > > Amir, I believe you've been a wikimedian for too long to really > believe that. No change is to small to be disliked by one or more > people! <https://xkcd.com/1770/> seems pretty timely! -- Legoktm
Re: [Wikitech-l] Deployments: upcoming holidays and their impact
On 12/06/2016 03:32 PM, Greg Grossmeier wrote: > Reminder: This next week (Dec 12th) is the last normal week before the > end of year holiday season deploy freeze goes into effect. Plan > accordingly. Does the freeze also affect the beta cluster? Or just production? Thanks, -- Legoktm
Re: [Wikitech-l] First release candidate for 1.28 (1.28.0-rc.0)
1.28 has a bunch of changes, but <https://www.mediawiki.org/wiki/MediaWiki_1.28> is rather bare and needs more people to fill it out and highlight the cool new things that were developed in this release cycle! -- Legoktm
Re: [Wikitech-l] Deploying the Linter extension to Wikimedia wikis
Hi, On 10/24/2016 06:42 AM, MZMcBride wrote: > How will the errors be queried? Will be there an api.php module to query > on a per-error or per-page basis? Yes, per-error is implemented, and I've also implemented per-namespace filtering, but not per-page yet. > Does the extension distinguish between errors and warnings? Are there > gradations of errors? For example, deprecated syntax v. invalid syntax? Not really. Each category has a name like "obsolete-tag" or "bogus-image-options", and that's about it. > I wonder if the name "Linter" is overly generic. This extension will only > activate on wikitext, correct? It won't lint other content models/types > such as JavaScript and CSS? The Linter extension doesn't care about that at all - it gets told about errors from Parsoid (or any whitelisted service), stores them, and displays them to users. So if someone set up a JSCS service or something to provide errors about JavaScript pages, we could hook that up to Linter. (There's currently only one line of code in Linter that's Parsoid-specific). -- Legoktm
Re: [Wikitech-l] 2016W43 ArchCom-RFC meeting: Allow HTML in SVG?
Hi, On 10/25/2016 03:14 PM, Rob Lanphier wrote: > 3. Should we turn our SVG validation code into a proper library? Yes! This is <https://phabricator.wikimedia.org/T86874>. :) -- Legoktm
[Wikitech-l] Deploying the Linter extension to Wikimedia wikis
Hi, Arlo and I have been working on a new MediaWiki extension to expose Parsoid's "lint errors" to users. A little bit of background: Parsoid has a linter that identifies some issues in wikitext that, while they may not result in user-facing errors, are still not wanted in wikitext. An example might be [[File:Example.png|foo|bar|baz]]. In this case, "foo" and "bar" are ignored, and "baz" is the actual caption - but the bogus image options should be removed. Other errors include missing end tags, obsolete HTML tags, fostered content, etc. The main advantage of this over tracking categories is that we know the location in the wikitext, so it should be easier to identify the error and fix it, as well as knowing whether the issue was caused via a template or not. The main ticket tracking deployment is <https://phabricator.wikimedia.org/T148609>. -- Legoktm
[Wikitech-l] Future of magic links / making ISBN linking easier
Hi, In the RfC meeting for the Future of magic links RfC, it was agreed to disable magic links for the MW 1.28 release by default and begin the deprecation for Wikimedia sites. One of the subpoints was to provide migration tools and similar functionality. For RFC and PMID links, they were added to the default core interwiki map, and the Wikimedia one. But for ISBN, it points to the local wiki, so an interwiki link doesn't make sense. The two main proposals are:

1. A virtual namespace ([[ISBN:0-7475-3269-9]]), which is similar to existing link syntax (double brackets), but is also a bit magical since there haven't been any new virtual namespaces since Special and Media in the initial version of MW.
2. A parser function ({{#isbn:0-7475-3269-9}}), which is typically how new syntax/functions are added, but is farther away from traditional link syntax.

I would appreciate comments on <https://phabricator.wikimedia.org/T148274> - I've written a patch for the virtual namespace as a POC, but writing one for the parser function, if people prefer that, should be trivial. -- Legoktm
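Whichever syntax wins, the target page still has to validate the ISBN it is handed. As illustration only (this is the standard ISBN-10 check-digit rule, not MediaWiki's actual Special:BookSources code), a sketch using the example ISBN from the thread:

```python
# Illustrative sketch of ISBN-10 check-digit validation. This is NOT
# MediaWiki's implementation - just the standard ISBN-10 rule: the
# weighted digit sum (weights 10 down to 1) must be divisible by 11.
def is_valid_isbn10(isbn: str) -> bool:
    digits = isbn.replace("-", "").replace(" ", "").upper()
    if len(digits) != 10:
        return False
    total = 0
    for i, ch in enumerate(digits):
        if ch == "X" and i == 9:      # 'X' is only legal as the check digit
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += (10 - i) * value     # weights 10, 9, ..., 1
    return total % 11 == 0

print(is_valid_isbn10("0-7475-3269-9"))  # the example ISBN above → True
```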
Re: [Wikitech-l] Disabled local uploads
Hi, On 10/15/2016 07:23 AM, Danny B. wrote: > I've just noticed, that local uploads are completely disabled on projects > which originally had them disabled for users, but allowed for admins and > bureaucrats. What wikis are you referring to? What error message is showing up? > Could somebody please point me to the relevant discussion about this topic > as well as to relevant commits which are setting this behavior? Without specifying which project, that's kind of impossible to answer. If you look at <https://tools.wmflabs.org/robin/?tool=uploadconfig> you can see how diverse the different upload configurations are. -- Legoktm
Re: [Wikitech-l] 2016W40 ArchCom-RFC meeting: future of magic links
Hi, On 10/04/2016 04:56 PM, Rob Lanphier wrote: > The RFC has three components: > 1. Moving this functionality to an extension (via parser hook) in > time for the MediaWiki 1.28 release Errr, not exactly. Step 1 is to disable the magic link functionality by default for the MediaWiki 1.28 release, and mark it as deprecated. We would add a tracking category for each type of magic link if they are still enabled. And regardless of magic link enabling status, we would add a {{ISBN:...}} parser function to make a convenient link to Special:Booksources. (RFC/PMID are replaced using interwiki links). At this time nothing would be moving to an extension. > 2. Deprecation strategy for Wikimedia wikis (e.g. Wikipedia) Most of the migration can be done using a bot with some basic regexes, but the key part will be adapting templates to generate links (e.g. ISBN citation templates) instead of relying upon magic link functionality. > 3. Disable magic links hooks a year later (in time for the next > MediaWiki LTS release) More specifically, removing all magic link functionality from MediaWiki core. We would move the Booksources code and ISBN parser function to an extension. > Given that the implications of steps 2 and 3 are much broader than > just the MediaWiki community, this won't likely result in a "final > comment" period for anything other than step 1. But maybe(?) we can > agree to move forward with step 1. Yes, I think having a decision about the time sensitive parts (for MediaWiki 1.28) would be a good goal. > Let's chat about that tomorrow. Same time as always (Wednesday 21 > UTC, 14 PDT, 23 CEST) and place (#wikimedia-office). Unfortunately I won't be able to attend due to a conflict with school, but cscott has volunteered to represent me and the RfC tomorrow. :) Thanks, -- Legoktm
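A hedged sketch of the kind of "basic regex" such a migration bot might use for RFC magic links. The [[rfc:...]] interwiki target format is an assumption for illustration, not necessarily the exact replacement the Wikimedia bots used:

```python
import re

# Hypothetical migration regex: turn an "RFC 1234" magic link into an
# explicit interwiki link. The [[rfc:...]] prefix is an assumption for
# illustration; real bots would have to confirm the wiki's interwiki map.
def replace_rfc_magic_links(wikitext: str) -> str:
    return re.sub(r"\bRFC (\d+)\b", r"[[rfc:\1|RFC \1]]", wikitext)

print(replace_rfc_magic_links("See RFC 2324 for details."))
# → See [[rfc:2324|RFC 2324]] for details.
```

A real bot would additionally need to skip text that is already inside a link, a `<nowiki>` section, or a template parameter - which is why the email notes that adapting templates is the harder part.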
Re: [Wikitech-l] Wikimedia Developer Summit 2017 discussion
in it. It makes me especially happy that Phab is free software, > and that the time and money we invest in it is making the universe of > free software better. The Wikimedia Foundation has invested a significant amount of money in Phabricator, via resourcing, staffing, training users, etc. I too am glad when we are able to contribute to the growth of another free software project; however, if that comes at the cost of improving MediaWiki and, by extension, the Wikimedia movement, then I think we should be extremely cautious. [1] https://www.mediawiki.org/wiki/Principles [2] https://phabricator.wikimedia.org/T6323 -- Legoktm
Re: [Wikitech-l] Wikimedia Developer Summit 2017 discussion
Hi, On 09/27/2016 03:57 AM, Quim Gil wrote: > * Phabricator form to submit Wikimedia developer Summit 2017 proposals > (urgent because it blocks the opening of the call for participation) > https://phabricator.wikimedia.org/T146377 I have proposed the opposite on the ticket[1] - I believe we should be using mediawiki.org to organize the summit instead of Phabricator. I find it demoralizing that we cannot use our own online collaboration software to plan our own summit. We need more dogfooding[2], not less. [1] https://phabricator.wikimedia.org/T146377#2670042 [2] https://en.wikipedia.org/wiki/Eating_your_own_dog_food -- Legoktm
[Wikitech-l] Debian and Ubuntu packages for MediaWiki now available
Hi, I'm very excited to announce that Debian and Ubuntu packages for MediaWiki are now available. These packages will follow the MediaWiki LTS schedule and currently contain 1.27.1. If you've always wanted to "sudo apt install mediawiki", then this is for you :) For Debian users, you can get the mediawiki package for Jessie from the official Debian repositories using jessie-backports, and it will be included in the upcoming Stretch release. Ubuntu users will need to enable a PPA for Xenial or Trusty to set up the package. Instructions, links, and help can all be found at <https://www.mediawiki.org/wiki/User:Legoktm/Packages>. Please let me know if you have any questions or feedback. Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff, Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of the package for making this a reality. -- Legoktm
[Wikitech-l] MediaWiki-Codesniffer 0.8.0-alpha.1 released
Hi, As the subject line hints, we're doing something a little different with this release. There was a lot of work done on MW-CS over the summer by our GSoC student, Lethexie. There are a significant number of changes, especially to do with documentation comments. So we are releasing this as alpha quality to ask for more feedback on the changes to the ruleset. You can upgrade in the same manner as normal - by updating the version number in the composer.json file. Feel free to provide feedback by responding to this thread or by filing a bug in the MediaWiki-Codesniffer Phabricator project. We would like to release 0.8.0 by mid-October. Thanks! -- Legoktm
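For anyone unsure what "updating the version number in the composer.json file" looks like, a minimal sketch, assuming the package is pulled in under its usual mediawiki/mediawiki-codesniffer name in require-dev (the surrounding keys in your composer.json will differ):

```json
{
    "require-dev": {
        "mediawiki/mediawiki-codesniffer": "0.8.0-alpha.1"
    }
}
```

After editing the file, running `composer update mediawiki/mediawiki-codesniffer` pulls in the new version.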
[Wikitech-l] Cleaning up ContentHandler technical debt (was Re: Opening up MediaWiki dev summit in January?)
On 09/15/2016 01:09 AM, Daniel Kinzler wrote: > On 15.09.2016 at 01:54, Chad wrote: >> Bummer. I think paying down tech debt is fun and way more rewarding >> than making shiny new things. >> >> But I'm also weird as hell... > > /me waves to a kindred soul. > Lovely! I just filed about 20 tasks on cleaning up ContentHandler's technical debt (you know, just in time to rewrite it all over again) - help would be appreciated! <https://phabricator.wikimedia.org/T145728>. -- Legoktm
[Wikitech-l] RfC: Future of magic links
Hi, I wrote an RfC to determine the future of the ISBN/PMID/RFC magic links feature (spoiler: I propose getting rid of them). Comments on the talk page would be appreciated! <https://www.mediawiki.org/wiki/Requests_for_comment/Future_of_magic_links> -- Legoktm
Re: [Wikitech-l] Allowing users to change contentmodels of pages
Hi, A few more blockers were identified (and some resolved!) so I'm going to postpone this until at least Thursday (2016-09-15). Thanks, -- Legoktm On 09/07/2016 12:24 PM, Legoktm wrote: > Hi, > > On Monday (2016-09-12), I plan on deploying a change that allows most > users to change the "content model" of a page[1] (the "editcontentmodel" > user right). This functionality has been available to administrators for > a few months now. The permission is similar conceptually to "move", so > it is being granted to all groups that have the "move" permission (and > taken away from those that don't have "move"). Specifics can be seen in > the Gerrit change[2]. > > This functionality is mostly technical, and documentation can be found > at <https://www.mediawiki.org/wiki/Help:ChangeContentModel>. Currently, > the main usecase is to allow any user to create a MassMessage target > list[3], but others are planned in the future. > > If you find documentation missing, or want to change your wiki's > configuration, or have questions in general, feel free to respond via > email, or file a task in the MediaWiki-ContentHandler project and CC me. > > [1] https://phabricator.wikimedia.org/T85847 > [2] https://gerrit.wikimedia.org/r/#/c/309066 > [3] https://phabricator.wikimedia.org/T92795 > > Thanks! > -- Legoktm >
[Wikitech-l] +2 for Thiemo Mättig in mediawiki/*
Hi, Per[1], I have given Thiemo Mättig +2 rights in all mediawiki repositories. Congrats! [1] https://phabricator.wikimedia.org/T144250 -- Legoktm
[Wikitech-l] PHP 7 now available and usable in CI
Hi! PHP 7.0 is now available and usable in CI. As a small starter, we have added a job that runs "composer test" using PHP 7.0 for all repositories that run some form of MediaWiki unit tests or composer tests. It's not enabled by default, so you'll need to run "check experimental" on a patch to see if it passes (though I doubt there will be a lot of issues). You can follow [1] for progress on enabling those jobs by default, and [2] for the next step of running MediaWiki unit tests under PHP 7. Thank you to ops and the CI team for all of their help and work on this :) [1] https://phabricator.wikimedia.org/T144961 [2] https://phabricator.wikimedia.org/T144962 -- Legoktm
[Wikitech-l] Allowing users to change contentmodels of pages
Hi, On Monday (2016-09-12), I plan on deploying a change that allows most users to change the "content model" of a page[1] (the "editcontentmodel" user right). This functionality has been available to administrators for a few months now. The permission is similar conceptually to "move", so it is being granted to all groups that have the "move" permission (and taken away from those that don't have "move"). Specifics can be seen in the Gerrit change[2]. This functionality is mostly technical, and documentation can be found at <https://www.mediawiki.org/wiki/Help:ChangeContentModel>. Currently, the main usecase is to allow any user to create a MassMessage target list[3], but others are planned in the future. If you find documentation missing, or want to change your wiki's configuration, or have questions in general, feel free to respond via email, or file a task in the MediaWiki-ContentHandler project and CC me. [1] https://phabricator.wikimedia.org/T85847 [2] https://gerrit.wikimedia.org/r/#/c/309066 [3] https://phabricator.wikimedia.org/T92795 Thanks! -- Legoktm
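For third-party wikis that want a different arrangement than "everyone with move gets editcontentmodel", a minimal LocalSettings.php sketch using MediaWiki's standard $wgGroupPermissions array (the group names below are just the common defaults; adjust for your wiki):

```php
<?php
// Sketch for LocalSettings.php: decouple "editcontentmodel" from "move".
// $wgGroupPermissions is MediaWiki's standard rights map; "user" and
// "sysop" are the default group names, used here for illustration.
$wgGroupPermissions['user']['editcontentmodel'] = false;  // withhold from ordinary users
$wgGroupPermissions['sysop']['editcontentmodel'] = true;  // keep it for administrators
```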
Re: [Wikitech-l] git release branches
Hi, On 08/29/2016 10:33 AM, Brad Jorsch (Anomie) wrote: > On Mon, Aug 29, 2016 at 11:04 AM, John <phoenixoverr...@gmail.com> wrote: > >> I am running into issues where I cant set a branch on the >> extensions/ repo for REL 1_27 and get all extensions as of that branching, >> Instead I am forced to go thru one by one and manually set it, if and only >> if a given extension has said branch set, otherwise I am out of luck. >> > > I see the mediawiki/extensions and mediawiki/skins repos both have a > REL1_27 branch. Do they somehow not work correctly to check out all > submodules at the appropriate revisions? That should work, you'd just need to remember to run "git submodule update" after switching branches, which should be pretty easy. At that point you still need to manage 4 different git repos though (core, vendor, skins, extensions). -- Legoktm
[Wikitech-l] +2 for Glaisher in mediawiki
Hi, Per[1], I have given Glaisher +2 rights in all mediawiki repositories. Congrats, and thanks for all of your contributions so far! [1] https://phabricator.wikimedia.org/T141730 -- Legoktm
Re: [Wikitech-l] New scap version is live
Hi, On 08/04/2016 01:41 PM, Tyler Cipriani wrote: > Canary deployments: > >1. Sync your change(s) to the api and appserver canary hosts >2. Wait (20 seconds) for traffic to hit those host >3. If there isn't a large increase in the error rate on those hosts > (10x), release your changes to the remainder of the fleet. Does scap/whatever make any requests against those hosts? Or is it just depending upon normal traffic to those hosts to possibly cause errors? -- Legoktm
[Wikitech-l] Autodiscovery of extension unittests
Hi, One thing that's always annoyed me about writing tests for an extension is that for the first test, you need to bootstrap it by adding a UnitTestsList hook, and copying the code for it from some other extension. Now that we have a proper extension registry, I wrote <https://gerrit.wikimedia.org/r/#/c/302944/> (merged by Florian!), which will automatically register any tests in an extension's "tests/phpunit/" directory. If you already have a UnitTestsList hook, the tests won't be duplicated or anything, but it is now safe to remove the hook. <https://phabricator.wikimedia.org/T142120> tracks updating extensions to do so by moving their tests into the standardized path and removing the hook. Note: this requires that the extension has already been converted to use extension.json. -- Legoktm
Re: [Wikitech-l] Acquiring list of templates including external links
Hi, On 07/31/2016 07:53 AM, Takashi OTA wrote: > Such links are stored in externallinks.sql.gz, in an expanded form. > > When you want to check increase/decrease of linked domains in chronological > order through edit history, you have to check pages-meta-history1.xml etc. > In a such case, traditional links and links by templates are mixed, > Therefore, the latter ones (links by templates) should be expanded to > traditional link forms. If you have the revision ID, you can make an API query like: <https://en.wikipedia.org/w/api.php?action=parse&oldid=387276926&prop=externallinks>. This will expand all templates and give you the same set of externallinks that would have ended up in the dump. -- Legoktm
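The query in the reply can be parameterized by revision ID. A small sketch that just assembles the same URL (adding format=json, an assumption on my part, for machine-readable output):

```python
from urllib.parse import urlencode

# Build the api.php request from the reply above, parameterized by
# revision ID. action=parse / oldid / prop=externallinks are the
# parameters given in the reply; format=json is an assumed addition
# for machine-readable output.
def externallinks_query_url(rev_id: int) -> str:
    params = {
        "action": "parse",
        "oldid": rev_id,
        "prop": "externallinks",
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)

print(externallinks_query_url(387276926))
```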
Re: [Wikitech-l] [Ops] Canary Deploys for MediaWiki
Hi, On 07/25/2016 04:12 PM, Bryan Davis wrote: > On Mon, Jul 25, 2016 at 3:07 PM, Alex Monk <kren...@gmail.com> wrote: >> On 25 July 2016 at 21:54, Roan Kattouw <roan.katt...@gmail.com> wrote: > I think Alex is "more right" here. If you are introducing a new $wmgX > var And to continue tooting the extension.json horn, if you're using extension.json, there shouldn't be any need for $wmg*[1] variables anymore. You can set 'wgFooBar' in InitialiseSettings.php and it'll just work, because loading through extension.json doesn't set the defaults to global scope upon loading like the old PHP entry points did, requiring the $wg = $wmg hack. Since a lot of already-deployed extensions use the $wg = $wmg pattern, we're tracking this cleanup at <https://phabricator.wikimedia.org/T119117>. [1] Okay, you'll still need the $wmgUseExtensionName variable, but that is only introduced when deploying the extension for the first time. -- Legoktm
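A hedged sketch of the pattern described above, using the email's own wgFooBar placeholder. These would be entries inside the big settings array in wmf-config/InitialiseSettings.php; the names and values are illustrative only:

```php
// Entries inside the settings array in wmf-config/InitialiseSettings.php
// (sketch; "FooBar" and the values are placeholders from the email above).
// The only $wmg variable left is the one controlling whether to load the
// extension at all:
'wmgUseFooBar' => [
    'default' => false,
    'testwiki' => true,
],
// With extension.json, the real setting can be set directly - no
// $wgFooBar = $wmgFooBar glue in CommonSettings.php is needed:
'wgFooBar' => [
    'default' => 'some-default',
    'testwiki' => 'test-override',
],
```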
Re: [Wikitech-l] Gerrit 2.12.2 test instance - PLEASE TEST
On 07/12/2016 07:45 PM, Bartosz Dziewoński wrote: > I have tested it and I love it. It is all I was hoping it would be and > more. The inline commit editing is quite nice, the lists of related > commits are useful, but the warnings about merge conflicts are the best > thing ever. The sooner it's live, the better! This. Thanks to everyone who is making this upgrade happen <3 -- Legoktm
Re: [Wikitech-l] phpunit: "Extension mysql is required"?
Hi, On 07/13/2016 08:19 AM, Daniel Barrett wrote: > @requires extension mysql > > and this extension is gone in PHP 7.0. The old "mysql" extension was removed in favor of the "mysqli" extension. MediaWiki should automatically use mysqli when it's available. -- Legoktm
Re: [Wikitech-l] [Engineering] The train will resume tomorrow (was Re: All wikis reverted to wmf.8 last night due to T119736)
Hi, On 07/12/2016 04:56 PM, Ori Livneh wrote: > Is it actually fixed? It doesn't look like it, from the logs. > > Since midnight UTC on July 7, 3,195 distinct users have tried and failed to > log in a combined total of 25,047 times, or an average of approximately > eight times per user. The six days that have passed since then were > business as usual for the Wikimedia Engineering. We should not be blocking login anymore. The patch[1] I deployed last night catches the exceptions so users are able to log in, but still continues to log them. I'm not sure if there's a way to tell the difference between an exception that was shown to a user and one that was just logged. [1] https://gerrit.wikimedia.org/r/#/c/298416/ -- Legoktm
[Wikitech-l] MediaWiki-Codesniffer 0.7.2 released (bugfix)
Hi, I just released MediaWiki-Codesniffer 0.7.2, which is a bugfix release for 0.7.1. There was an issue[1] in the SpaceyParenthesisSniff that would sometimes eat the content inside the parentheses or brackets. I ran a script to check, and the only extension that was affected and had bad code committed was ArticlePlaceholder, which has already been fixed. Thanks to Thiemo and Nikerabbit for reporting, and PleaseStand for the patch. Additionally, EBernhardson overhauled how our tests run, so we now have the ability to test sniffs that automatically fix code, which will hopefully prevent regressions like this in the future! I submitted autogenerated patches to upgrade all extensions that were already at 0.7.1 to 0.7.2 - they should only touch composer.json. [1] https://phabricator.wikimedia.org/T134857 -- Legoktm
Re: [Wikitech-l] Linker::link() rewrite
Hi, On 05/16/2016 07:51 AM, Chris Steipp wrote: > Is there any way we can default to having the body of the link not be > passed as html? It's called $html, well documented that it's raw html, and > I've lost track of the number of times people pass unsanitized text to it. > I'd rather it not be something developers have to worry about, unless they > know they need to handle the sanitization themselves. Maybe typehint a > Message object, and people can add raw params if they really need to send > it raw html? Yeah, that sounds good, I implemented that change in PS22. Clients that still need to pass in raw HTML can use the new MediaWiki\Linker\HtmlArmor class to prevent it from being escaped. -- Legoktm
[Wikitech-l] Linker::link() rewrite
Hi, For the past few weeks I've been working[1] on rewriting Linker::link() to be non-static, use LinkTarget/TitleValue and some of the other fancy new services stuff. Yay! For the most part, you'd use it in similar ways:

  Linker::link( $title, $html, $attribs, $query );

is now:

  $linkRenderer = MediaWikiServices::getInstance()->getHtmlPageLinkRenderer();
  $linkRenderer->makeLink( $title, $html, $attribs, $query );

And there are makeKnownLink() and makeBrokenLink() entry points as well. Unlike Linker::link(), there is no $options parameter to pass in every time a link needs to be made. Those options are set on the HtmlPageLinkRenderer instance, and then applied to all links made using it. MediaWikiServices has an instance using the default settings, but other classes like Parser will have their own that should be used[2]. I'm also deprecating the two hooks called by Linker::link(), LinkBegin and LinkEnd. They are being replaced by the mostly-equivalent HtmlPageLinkRendererBegin and HtmlPageLinkRendererEnd hooks. More details are in the commit message. [3] is an example conversion for Wikibase. The commit is still a WIP because I haven't gotten around to writing specific tests for it (it passes all the pre-existing Linker and parser tests though!), and will be doing that in the next few days. Regardless, reviews / comments / feedback on [1] is appreciated! [1] https://gerrit.wikimedia.org/r/#/c/284750/ [2] https://gerrit.wikimedia.org/r/#/c/288572/ [3] https://gerrit.wikimedia.org/r/#/c/288674/ -- Legoktm
Re: [Wikitech-l] [Breaking Change] Scap change for deployers
Hi, On 05/10/2016 10:08 AM, Tyler Cipriani wrote: > Long story short, you will now run: > > scap sync-file [message] > > Instead of: > > sync-file [message] > Would it be possible to have tab completion for the new scap subcommands? "mwv" → mwversionsinuse versus typing out all of "scap wikiversions-inuse" ;) Same with "sync-f", etc. It's not that huge of a deal, but I think we all use those commands frequently enough that tab completion saves a lot of keystrokes... -- Legoktm
Re: [Wikitech-l] Getting rid of $wgWellFormedXml = false;
Hi, On 05/02/2016 11:42 AM, Brian Wolff wrote: > See gerrit patch https://gerrit.wikimedia.org/r/286495 I would > appreciate everyone's feedback. Given the lack of objections here and on Gerrit, I went ahead and merged it today. -- Legoktm
Re: [Wikitech-l] Proposal to invest in Phabricator Calendar
Hi, On 05/13/2016 12:36 PM, Quim Gil wrote: > The Technical Collaboration team has some budget that we could use to fund > the Phabricator maintainers to prioritize some improvements in their > Calendar. If you think this is a bad idea and/or you have a better one, > please discuss in the task (preferred) or here. If you think this is a good > idea, your reassuring feedback is welcome too. ;) If we're going to be investing money into improving Phabricator upstream (a great idea IMO), I think we should start with problem areas that affect a large number of users/developers. There's plenty of low-hanging fruit like non-drag-n-drop file uploads[1]. [2] was also mentioned on #wikimedia-tech a few days ago, or some of the UI/UX issues Nemo brought up after the last Phabricator upgrade[3]. [1] https://phabricator.wikimedia.org/T165#2289766 [2] https://secure.phabricator.com/T10691#167705 [3] https://lists.wikimedia.org/pipermail/wikitech-l/2016-May/085489.html -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] MediaWiki-Codesniffer 0.7.1 released
Hello! MediaWiki-Codesniffer 0.7.1 is now available for use in your MediaWiki extensions and other projects. 0.7.0 was a botched release that had bugs; please don't use it. Here are the notable changes since the last release (0.6.0): * Also check for space after elseif in SpaceAfterControlStructureSniff (Lethexie) * Factor out tokenIsNamespaced method (addshore) * Make IfElseStructureSniff detect and fix multiple white spaces after else (Lethexie) * Make SpaceyParenthesisSniff fix multiple white spaces between parentheses (Lethexie) * Make spacey parenthesis sniff work with short array syntax (Kunal Mehta) * Speed up PrefixedGlobalFunctionsSniff (addshore) * Update squizlabs/php_codesniffer to 2.6.0 (Paladox) Thanks to some awesome work by Addshore, MW-CS is now twice as fast for large repos like mediawiki/core! This release also features contributions from GSoC student Lethexie, who will be working on improving our static analysis tools this summer! -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] 2016-05-04 Scrum of Scrums meeting notes
On 05/05/2016 08:24 AM, Grace Gellerman wrote: > https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-05-04 > > = 2016-05-04 = > > ==Analytics == > *Still trouble with jenkins Are there any more details about this or a bug filed for it? -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Requiring composer 1.0.0 as minimum version for 1.27?
Hi, Composer recently released its first stable version, 1.0.0, which among other things mandates usage of secure connections and validates certificates[1]. I'd like for 1.27 to require 1.0.0 as a minimum version people must use when installing MediaWiki dependencies (people can always use mediawiki/vendor instead of composer though). As a side-effect, this would let us get rid of some old back-compat code that is currently triggering a deprecation notice on every composer install command[2]. Thoughts? [1] https://phabricator.wikimedia.org/T119272#2125086 [2] https://phabricator.wikimedia.org/T119590#2234183 -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Feelings
Hi, It's well known that Wikipedia is facing threats from other social networks and losing editors. While many of us spend time trying to make Wikipedia different, we need to be cognizant that what other social networks are doing is working. And if we can't beat them, we need to join them. I've written a patch[1] that introduces a new feature to the Thanks extension called "feelings". When hovering over a "thank" link, five different emoji icons will pop up[2], representing five different feelings: happy, love, surprise, anger, and fear. Editors can pick one of those options instead of just a plain thanks, to indicate how they really feel, which the recipient will see[3]. Of course, some might consider this feature to be controversial (I suspect they would respond to my email with "anger" or "fear"), so I've added a feature flag for it. Setting $wgDontFixEditorRetentionProblem = true; will disable it for your wiki. Please give the patch a try; I've only tested it in MonoBook so far, and it might need some extra CSS in Vector. [1] https://gerrit.wikimedia.org/r/280961 [2] https://phabricator.wikimedia.org/F3810964 [3] https://phabricator.wikimedia.org/F3810963 -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
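For wiki operators, the opt-out named above is a one-line LocalSettings.php change:

```php
// LocalSettings.php — disable the "feelings" feature of the Thanks patch
$wgDontFixEditorRetentionProblem = true;
```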
[Wikitech-l] extension.json schema validation is now voting for extensions
Hi, With [1] now merged, the extension phpunit tests that are run by jenkins will also validate loaded extension.json and skin.json files against the formal JSON schema (docs/extension.schema.json in core). To validate locally, you can either run the structure phpunit tests or use the validateRegistrationFile.php maintenance script. [1] https://gerrit.wikimedia.org/r/#/c/273751/ -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
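Running the maintenance script mentioned above looks roughly like this (run from the root of a MediaWiki core checkout; the extension path is illustrative):

```shell
# Validate an extension registration file against docs/extension.schema.json
php maintenance/validateRegistrationFile.php extensions/MyExtension/extension.json
```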
[Wikitech-l] T43327: Add page views graph(s) to MediaWiki's info action for Wikimedia wikis
Hi! (I meant to send this out earlier but got distracted by other things :() At the Wikimedia developer summit, Addshore and I started working on a MediaWiki extension to resolve T43327[1], which asked for page view graphs to be added to action=info. The "WikimediaPageViewInfo"[2] extension has passed a security review, and should be deployed soon™. If you want to try it out, there's a test wiki[3] which uses the pageview data for en.wp. And for updates on the deployment status, please follow T125917[4]. [1] https://phabricator.wikimedia.org/T43327 [2] https://www.mediawiki.org/wiki/Extension:WikimediaPageViewInfo [3] http://bf-wmpageview.wmflabs.org/wiki/Taylor_Swift?action=info [4] https://phabricator.wikimedia.org/T125917 -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] New RFC: remove mbstring fallbacks
Hi, On 03/09/2016 06:47 PM, Max Semenik wrote: > Hey, I have a new topic I'd like to discuss. It's about mbstring and > whether we really need to support running without it. > > The RFC is at https://gerrit.wikimedia.org/r/#/c/267309/ I think you meant <https://phabricator.wikimedia.org/T129435> :) -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Fwd: [Ops] Minor Phab instability
Forwarded Message Subject: [Ops] Minor Phab instability Date: Wed, 2 Mar 2016 19:55:13 + From: Chad Horohoe To: Operations Engineers, Development and Operations engineers (WMF only) Hi all, As you may have noticed lately, there's been a bit of a problem with Phabricator--mostly of the sort where it complains about not being able to connect to the m3 database. This is known, and is unfortunately a one-time migration cost associated with getting the remaining Gerrit repositories indexed in Phabricator. The work should be done by the end of the week, and I'm trying to batch it as small as possible to try and limit disruption. Thanks for bearing with me! -Chad ___ Ops mailing list o...@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/ops ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] RfC: Notifications in core
Hi, I've written an RfC about moving most of the Echo extension code into MediaWiki core: <https://www.mediawiki.org/wiki/Requests_for_comment/Notifications_in_core>. RobLa has written a summary of it on the Phab task: <https://phabricator.wikimedia.org/T128351>. Feedback is appreciated. :) -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] pt.wikimedia.org - database naming
Hi, On 02/24/2016 06:18 AM, Antoine Musso wrote: > Given most of the useful data/history has been exported, I would suggest > to rename on WMF cluster the ptwikimedia DB to something like > ptwikimedia_old or even just archive a dump of it and drop it. > > Then create a new ptwikimedia and import the external db there. I think reusing the "ptwikimedia" db name would be ideal. My only concern with doing so is that it might be in databases somewhere, but I checked CentralAuth and there are no ptwikimedia entries, so at least that will be fine. -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] ORES extension soon be deployed, help us test it
Hi, On 02/19/2016 02:35 PM, Amir Ladsgroup wrote: > Hey all, > TLDR: ORES extension [1] which is an extension that integrates ORES service > [2] with Wikipedia to make fighting vandalism easier and more efficient is > in the progress of deployment. You can test it in > https://mw-revscoring.wmflabs.org (Enable it in your preferences first) I just played with it, and the changes and work you and Adam have put into it are awesome! I can't wait to use this on actual wikis :) -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?
On 02/16/2016 01:52 PM, Legoktm wrote: > On 02/12/2016 07:27 AM, Daniel Kinzler wrote: >> Please give a quick PRO or CON response as a basis for discussion. > > No one has responded in a few days, and the current count is 13-5-2, so > I'm going to find a time to do the mass migration when there aren't that > many people making core changes and do this today or tomorrow. {{done}}. -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] MediaWiki-Codesniffer 0.6.0 released and [arrays, like, this]
Hello! MediaWiki-Codesniffer 0.6.0 has been released; here's what's changed: * Add Generic.Arrays.DisallowLongArraySyntax to ruleset, autofix this repo (Kunal Mehta) * Add sniff to detect consecutive empty lines in a file (Vivek Ghaisas) * Disable Generic.Functions.CallTimePassByReference.NotAllowed (Kunal Mehta) * Update squizlabs/php_codesniffer to 2.5.1 (Paladox) Notably, this release now requires []-style arrays, as well as PHP 5.5.9 or higher to run. To help automatically migrate, you can use the "phpcbf" tool. I recommend setting up a "composer fix" command (for example [1]). Running just "composer fix" will try to auto-fix every file in the repo, while "composer fix filename" will fix only that file. [1] https://gerrit.wikimedia.org/r/#/c/271220/ Thanks, -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
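The "composer fix" command referenced above boils down to a scripts entry in composer.json, roughly like this (the exact entry may differ from the Gerrit change):

```json
{
	"scripts": {
		"fix": "phpcbf"
	}
}
```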
Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?
On 02/12/2016 07:27 AM, Daniel Kinzler wrote: > Please give a quick PRO or CON response as a basis for discussion. No one has responded in a few days, and the current count is 13-5-2, so I'm going to find a time to do the mass migration when there aren't that many people making core changes and do this today or tomorrow. -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?
Hi, On 02/12/2016 07:27 AM, Daniel Kinzler wrote: > Now that we target PHP 5.5, some people are itching to make use of some new > language features, like the new array syntax, e.g. > <https://gerrit.wikimedia.org/r/#/c/269745/>. > > Mass changes like this, or similar changes relating to coding style, tend to > lead to controversy. I want to make sure we have a discussion about this here, > to avoid having the argument over and over on any such patch. > > Please give a quick PRO or CON response as a basis for discussion. > > In essence, the discussion boils down to two conflicting positions: > > PRO: do mass migration to the new syntax, style, or whatever, as soon as > possible. This way, the codebase is in a consistent form, and that form is the > one we agreed is the best for readability. Doing changes like this is > gratifying, because it's low hanging fruit: it's easy to do, and has large > visible impact (well ok, visible in the source). I'll offer an alternative, which is to convert all of them at once using PHPCS and then enforce that all new patches use [] arrays. You then only have one commit which changes everything, not hundreds you have to go through while git blaming or looking in git log. > CON: don't do mass migration to new syntax, only start using new styles and > features when touching the respective bit of code anyway. The argument is here > that touching many lines of code, even if it's just for whitespace changes, > causes merge conflicts when doing backports and when rebasing patches. E.g. if > we touch half the files in the codebase to change to the new array syntax, who > is going to manually rebase the couple of hundred patches we have open? There's no need to do it manually. Just tell people to run the phpcs autofixer before they rebase, and the result should be identical to what's already there. And we can have PHPCS run in the other direction for backports ([] -> array()). 
But if we don't do that, people are going to start converting things manually whenever they work on the code, and you'll still end up with hundreds of open patches needing rebase, except it can't be done automatically anymore. > My personal vote is CON. No rebase hell please! Changing to the syntax doesn't > buy us anything. Consistency buys us a lot. New developers won't be confused about whether to use [] or array(). It makes entry easier for people coming from other languages where [] is used for lists. I think you're going to end up in rebase hell regardless, so we should rip off the bandaid quickly and get it over with, and use the automated tools we have to our advantage. So, if we're voting, I'm PRO. -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
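For anyone who hasn't seen the two forms side by side, the migration is purely syntactic:

```php
<?php
// PHP 5.3-compatible long form:
$options = array( 'known', 'noclasses' );
$nested  = array( 'a' => array( 1, 2 ) );

// Equivalent short form (PHP 5.4+), which phpcbf can produce mechanically:
$options = [ 'known', 'noclasses' ];
$nested  = [ 'a' => [ 1, 2 ] ];
```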
[Wikitech-l] MediaWiki core master now requires 5.5.9+ and other CI changes
Hi, <https://gerrit.wikimedia.org/r/266931> has been merged, so MediaWiki core now requires PHP 5.5.9 or higher to run. We had to make some (read: a lot) of CI changes for that to happen, here's a quick summary: * php53 jobs are only triggered for REL1_2[3-6] branches * php55 jobs are only triggered for branches that are not REL1_2[3-6] * All extensions that previously had php53 tests now run them under both hhvm and php55 * composer related jobs were renamed to standardize with other CI jobs Since this is a large change for extensions, I went ahead and ran jobs for all extensions that have unittests, and collected the results: <https://www.mediawiki.org/wiki/User:Legoktm/PHP_5.5/Extensions>. Please take a look to see if your extension(s) are failing, and fix or file bugs as needed! Some of the failures aren't even related to HHVM/PHP5.5, the repos have just been broken. And if you run into any CI related issues, please file a bug in the #CI-Config Phab project! -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Close test2wiki?
On 01/28/2016 08:57 AM, Ori Livneh wrote: > The setup of test2.wikipedia.org is no longer meaningfully different from > test.wikipedia.org. Is there a good reason for keeping test2? Especially when debugging and testing cross-wiki features, it is extremely useful to have two test wikis to use. MassMessage, GlobalCssJs, GlobalUserPage, and now cross-wiki notifications were all initially deployed using testwiki as the "central" wiki, and test2wiki as a "client" wiki. Maybe testwikidatawiki could be used in place of test2wiki, but that's already a bit special... -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Scope of ArchCom
Hi, On 01/27/2016 12:46 PM, Matthew Flaschen wrote: > On 01/25/2016 03:16 PM, Rob Lanphier wrote: >> In the short-term, I believe a non-Wikimedia focused subgroup of ArchCom >> may make sense. The declining MediaWiki use outside of Wikimedia has >> been >> a longstanding problem for us, but not the biggest problem. > > Are there stats that show a decline? Just curious. These aren't exactly stats, but I think the Google Trends graph of "MediaWiki" vs. "Confluence" shows a relevant picture: <https://www.google.com/trends/explore#q=mediawiki%2C%20confluence=q=Etc%2FGMT-11> -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Modern module loading needing code review
Hi, On 01/21/2016 10:01 AM, Jon Robson wrote: > As part of the work of the frontend standards group > https://gerrit.wikimedia.org/r/#/c/260071/ adds require and > module.exports to MediaWiki ResourceLoader. This will make it easier > to share code outside MediaWiki and with nodejs systems as well as > discouraging the overloading of the mediawiki variable. > > I'm looking for code review at the moment. If you are familiar with > ResourceLoader's internals I'd appreciate your input. > > See https://phabricator.wikimedia.org/T108655 for further background I'll ask since we just discussed this in the last IRC meeting...why wasn't/isn't this an RfC? -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] New MediaWiki structure test for API module's i18n completeness
Hi, Anomie has written a new structure test[1] that checks whether API modules have all the proper i18n messages needed for their documentation. It's possible that your extension's tests will start failing due to this; if you need help figuring it out, please ping one of us and we can help out. [1] https://gerrit.wikimedia.org/r/#/c/260598/ -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Bulk link rewrites for HTTP -> HTTPS migration?
On 01/13/2016 09:09 AM, Chris Adams wrote: > I've been working with a number of colleagues getting ready to turn HTTPS > on by default for various loc.gov domains. This has been fairly successful > and we're working through the old legacy apps now. Awesome! > When that work completes, we'll have somewhere around half a million links > which differ only in the URL scheme. What would be the best way to rewrite > all of those URLs? I'd like to reduce the window during which users transit > from HTTPS -> HTTP -> HTTPS. You can use Pywikibot's replace.py[1], which lets you provide regex find/replace and can get a list of pages from the API equivalent of Special:LinkSearch. You should also consider setting up HSTS[2] so that even if users click on an HTTP link, they'll be sent to the HTTPS version of the site. [1] https://www.mediawiki.org/wiki/Manual:Pywikibot/replace.py [2] https://en.wikipedia.org/wiki/HTTP_Strict_Transport_Security -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
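The scheme-only rewrite itself is a one-line regex; in PHP terms it's roughly the following (the loc.gov pattern is an illustrative assumption — replace.py does the equivalent on-wiki):

```php
<?php
// Upgrade http:// links to https:// for a specific domain only,
// leaving links to all other hosts untouched.
$wikitext = 'See [http://www.loc.gov/item/1 this item] and [http://example.org x].';
echo preg_replace( '!http://((?:www\.)?loc\.gov)!', 'https://$1', $wikitext );
// Only the loc.gov link changes scheme; the example.org link is unchanged.
```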
[Wikitech-l] Updating the MediaWiki Debian package
Hi, Over the past few months, I've been working with LFaraone, Faidon, and Moritz on updating the super old 1.19.x Debian package of MediaWiki to use a more modern version (currently 1.25.x) and just generally improve the quality of it. (Yes, it uses MySQL by default, not postgres!) It's currently waiting in the "new" queue[1], but if you'd like to test it out, I've uploaded the debs at [2]. If you find any bugs or have feature requests, you can let me know or file a request in the MediaWiki-Debian[3] Phabricator project. The debian/ files used to create the package can be found in Gerrit in the mediawiki/debian repository[4]. [1] https://ftp-master.debian.org/new.html [2] https://people.wikimedia.org/~legoktm/debian/ [3] https://phabricator.wikimedia.org/tag/mediawiki-debian/ [4] https://gerrit.wikimedia.org/r/#/projects/mediawiki/debian,dashboards/default -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] The right way to inject user-supplied JavaScript?
Hi, On 12/29/2015 12:59 PM, Daniel Barrett wrote: > tl;dr: What's the right way for a tag extension to execute JavaScript > provided by the user? (On a private wiki without Internet access.) > > Details: > I run a private wiki for developers (not accessible from the Internet) that > lets any wiki page author run JavaScript on a page by adding a tag: > > alert("hi"); > > (We understand the security implications, which is why the wiki isn't > accessible by the world.) When we upgraded to MediaWiki 1.26 (from 1.24), a > problem occurred: the tag stopped recognizing the "mediawiki" > and "mw" objects, but otherwise works. The following code reports an > undefined variable "mw": > > mw.loader.using() > > I assume this is because the extension builds a
[Wikitech-l] MediaWiki-Codesniffer 0.5.1 released
Hello! MediaWiki-Codesniffer 0.5.1 is now available for use in your MediaWiki extensions and other projects. This release shouldn't contain any changes in the ruleset or sniffs, so upgrading should be trivial. Here are the notable changes since the last release (0.5.0): * Avoid in_array for performance reasons (Thiemo Mättig) * Remove dead code from SpaceBeforeSingleLineCommentSniff (Thiemo Mättig) * Revert "CharacterBeforePHPOpeningTagSniff: Support T_HASHBANG for HHVM >=3.5,<3.7" (Legoktm) * Simplify existing regular expressions (Thiemo Mättig) * Update squizlabs/php_codesniffer to 2.5.0 (Paladox) Special thanks to Thiemo Mättig for looking into the performance of PHPCS, and to Paladox for following the upstream changelog. -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Mediawiki message delivery problems
Hi, On 12/11/2015 05:42 AM, Bináris wrote: > 2015-12-11 11:52 GMT+01:00 Purodha Blissenbach <puro...@blissenbach.org>: >> Reported as https://phabricator.wikimedia.org/T121209 > > Nemo closed it as invalid. While he is right that correct solution is ~ > (5 tildes rather than 4), he is not right that the issue is purely a user > error. > > As mass delivery is a MW service (as far as I see, an extension), the > software should handle it: either put the tildes itself or refuse a > malformed message. MassMessage will warn[1] you if your message is missing a timestamp. It won't forcibly make you add one because the detection is imperfect and in principle, MassMessage doesn't restrict users, but tries to guide them into not doing stupid things. I'm not sure if the warning would have triggered in this case as a UTC timestamp is valid on Meta-Wiki, which is where the message originated from. There's also some discussion about how signatures should be added at [2]. [1] https://commons.wikimedia.org/wiki/File:MassMessage_timestamp_warning.png [2] https://meta.wikimedia.org/wiki/Talk:MassMessage#Signature_recommendation -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Procedural updates to the RFC process (Re: Objections to PHP 5.5 version requirement)
Hi, On 12/02/2015 09:52 PM, Rob Lanphier wrote: > We also discussed having some sort of minimum discussion time on > wikitech-l prior to having the decision-oriented RFC meeting. That > seems sensible, with the caveat that we discussed: we should ask that > those proposing RFCs announce their intention on wikitech-l, directing > the conversation toward the Phab ticket associated with the RFC. The > Phab ticket should be the "discussion of record", whereas wikitech-l > can be "announcement of record" (torturing the "newspaper of record" > metaphor[2]) This has always been part of the RfC process[1] (steps 4-5), I think it would be a good idea to enforce that all steps of the process have been followed first before scheduling a meeting and making a decision. [1] https://www.mediawiki.org/wiki/Requests_for_comment/Process#Making_a_proposal -- Legoktm ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l