Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 2014-06-10, 8:59 PM, John wrote:
> There is zero reason that this shouldn't be in an extension. Basically a
> few users want to install a shiny new toy called SwiftMailer into core,
> just because it's shiny. In doing so they add complexity and headache.
> Such an addition should be done as an extension.

Sorry, but these strawmen are quite annoying, and frankly disrespectful to the developers trying to improve core.

SwiftMailer is not some toy or shiny trinket; it is a serious and well-maintained library dedicated to sending email from PHP. Adding it to core is not some new feature that would be better in an extension. It is a serious improvement to our core handling of email within PHP that replaces our crude UserMailer code.

Our current UserMailer code is ridiculous when you look at it. By default it will just use PHP's creaky mail() function, which has some 'features' like:

- Unpredictable message headers
- Lack of feedback regarding delivery failures
- And this beautiful comment in our codebase:
  # PHP's mail() implementation under Windows is somewhat shite, and
  # can't handle "Joe Bloggs <j...@bloggs.com>" format email addresses,
  # so don't bother generating them

If you want to send email a different way, you could of course instead use $wgSMTP, which:

- First requires you to install PEAR Mail.
  - Who the hell uses PEAR anymore!
  - And don't forget, PEAR is a tool that installs modules globally and is difficult to impossible to use without shell and even admin access.
  - ;) And this is the hell we put tarball users through, not devs.
- This PEAR Mail library we're relying on to handle all users who can't use mail() or don't want its features -- here's the dev page:
  - http://pear.php.net/package/Mail
  - Current stable: 1.2.0, released in March of 2010
  - Next release: 1.2.1, scheduled release date in January of 2011, never released.
- ((In other words, the library we're entrusting all our SMTP handling to is practically dead and no longer maintained. Whoops.))
- Admittedly, if you hunt down PEAR Mail's GitHub repo there has been a bit of activity:
  - https://github.com/pear/Mail/commits/trunk
  - But none of this will be installed by PEAR; it'll only install old code from 2010 missing fixes for a number of bugs in PEAR Mail
  - The majority of changes in 2014 have been minor/trivial things (adding Travis builds, whitespace fixes)
  - And, ;) making PEAR Mail installable via Composer *snicker*

And to top this all off, because mail() and PEAR Mail are so different, half the code in UserMailer::send() is split into two different large code paths, which is a recipe for maintenance headaches.

Using Swift Mailer, on the other hand:

- The library has ample development and maintenance going on: https://github.com/swiftmailer/swiftmailer/commits/master
- SMTP and mail() are abstracted away into transports, so instead of two large (unmaintained?) code paths we just craft and send an email in one code path; a well-maintained library takes care of the difference between transports.
- In the future we can move away from using mail() and add the ability to use sendmail as a transport directly, without the bugs (in theory I think it would even be possible to try swapping a different transport in place of mail() automatically).
- All this gets bundled into the tarball directly, so $wgSMTP now works out of the box and doesn't require installation of something that in some situations is impossible to install.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
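[Editor's note: for readers unfamiliar with the setup Daniel is describing, the $wgSMTP path is configured roughly like this; a sketch with placeholder host and credentials, and it only works once PEAR Mail is installed.]

```php
// LocalSettings.php -- sketch of the existing $wgSMTP configuration
// (placeholder host and credentials; requires PEAR Mail to be installed).
$wgSMTP = array(
    'host'     => 'mail.example.com', // SMTP server to relay through
    'IDHost'   => 'example.com',      // domain for generated Message-IDs
    'port'     => 587,
    'auth'     => true,
    'username' => 'wiki@example.com',
    'password' => 'secret',
);
```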
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/14 16:18, Daniel Friesen wrote:
> - ((In other words, the library we're entrusting all our SMTP handling
>   to is practically dead and no longer maintained. Whoops.))

Only 3 open bugs though. Sometimes code just keeps working for decades, even without being maintained. Some might even call this a goal.

> And to top this all off, because mail() and PEAR Mail are so different,
> half the code in UserMailer::send() is split into two different large
> code paths, which is a recipe for maintenance headaches.

It's only 50 lines of code each way, which is not quite enough to give me a headache. It could use refactoring, but it wouldn't take long to refactor 100 lines of code.

> - All this gets bundled into the tarball directly, so $wgSMTP now works
>   out of the box and doesn't require installation of something that in
>   some situations is impossible to install.

That is a nice feature, yes.

-- Tim Starling
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11.06.2014 9:01, Matthew Walker wrote:
> This also has knock-on impacts elsewhere. BD808 has a patch that uses
> PSR-log and Monolog for logging. We're starting to move to a model where
> we recognize that we shouldn't write everything and that things in core
> have significantly better replacements in the wider PHP community. It
> doesn't make sense to keep maintaining the vastly inferior core
> components when more and more core and extensions are going to want to
> rely on the newer interfaces and features.

The PSR autoloading standard is OK. The PSR standard of using four spaces instead of tabs for indentation is strange. It prevents easily adjusting the indentation (good editors can visually render tabs as N spaces according to user preference) and has other drawbacks.

> The question Tim posed in the commit comes down to:
> * Do we bundle third party dependencies, or
> * Do we allow composer to do what it was designed to do and manage the
>   dependencies for us

Composer can use git to fetch dependencies; at least it did so when I developed with Symfony2.
https://getcomposer.org/doc/05-repositories.md

Dmitriy
Re: [Wikitech-l] Making a plain MW core git clone not be installable
Hi,

from my point of view this discussion mixes two or three different questions:

* Do we want core to regularly use external libraries (which are installed via one git submodule or composer)?
* Should SwiftMailer be used in core?
* (How) do we want to handle this for a plain git clone to be usable?

I think the first question is sort of answered, although maybe not everybody is committed to that.

The second question is something different entirely. From my point of view it doesn't make any sense to discuss it here, because (considering how we IMHO answered the first question) we will have a required dependency eventually, even if that is not going to be SwiftMailer.

The third question is what Tim actually asked, and it's an interesting, technical question we should focus on. I like Tim's suggestion of having an optional git submodule. The installer could then show an error if vendor/ is missing, proposing to either run composer install or git submodule update --init. Do we also have to consider composer.lock?

Regards,
Adrian
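[Editor's note: Adrian's proposed installer check could be sketched in shell roughly as below; this is illustrative only, since MediaWiki's actual installer would do the check in PHP.]

```shell
# Sketch of the proposed "vendor/ is missing" installer check
# (illustrative; the real installer would implement this in PHP).
check_vendor() {
    if [ ! -e "$1/vendor/autoload.php" ]; then
        echo "Dependencies missing: run 'composer install' or 'git submodule update --init'."
        return 1
    fi
}
check_vendor "$(mktemp -d)" || true   # a fresh clone: prints the hint
```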
Re: [Wikitech-l] Making a plain MW core git clone not be installable
Just to add to the technical stuff:

* We made a local repo to fork the original swiftmailer repo at https://github.com/wikimedia/mediawiki-core-vendor-swiftmailer/tree/5.2.0-patch, and our https://gerrit.wikimedia.org/r/#/c/137538/ makes composer take the files from our repo and not the upstream one, which is for the development end.
* Our VERP project is going on at https://gerrit.wikimedia.org/r/#/c/138655/

Thanks,
Tony Thomas http://tttwrites.in
FOSS@Amrita http://foss.amrita.ac.in
*where there is a wifi, there is a way*

On Wed, Jun 11, 2014 at 1:10 PM, Adrian Lang adrian.l...@wikimedia.de wrote:
> The third question is what Tim actually asked, and it's an interesting,
> technical question we should focus on. I like Tim's suggestion of having
> an optional git submodule. [...]
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 2014-06-11, 12:07 AM, Tim Starling wrote:
> On 11/06/14 16:18, Daniel Friesen wrote:
>> - ((In other words, the library we're entrusting all our SMTP handling
>>   to is practically dead and no longer maintained. Whoops.))
> Only 3 open bugs though. Sometimes code just keeps working for decades,
> even without being maintained. Some might even call this a goal.

Well yes, there are only a few (actually 4, not 3) open bugs:
http://pear.php.net/bugs/search.php?cmd=display&package_name[]=Mail&status=OpenFeedback&bug_type=Bugs

Although there's also this:
http://pear.php.net/bugs/roadmap.php?package=Mail&roadmapdetail=1.2.1#a1.2.1

Because 1.2.1 was never released to PEAR, there are 6 extra already-fixed bugs which everyone using PEAR Mail is still affected by.

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/2014 04:30, Tim Starling wrote:
> In CR comments on https://gerrit.wikimedia.org/r/#/c/135290/ it has been
> proposed that we make a git clone of the MW core not be installable
> until $IP/vendor is populated somehow -- either by separately cloning
> the mediawiki/core/vendor project, or preferably by running composer to
> obtain dependencies.
>
> I have suggested, as a compromise, to make the vendor directory be a
> submodule pointing to mediawiki/core/vendor. Then users can either run
> git submodule update --init to obtain dependencies, or they can omit
> submodule initialisation and instead run composer.
>
> I would like to hear more comments on this.

Hello,

I would prefer us to avoid embedding third-party libraries directly in core:

- the repository is already big enough as it is (recently Chad proposed to drop history to shrink the repo).
- we might be tempted to make local hacks and forget to push them back upstream.

Assuming the above, we would require an extra step, and I am fine with that. For people running MediaWiki out of a git clone that should not be too much of a hassle; the rest of the user base uses the tarball or a package, which could both embed all the third-party libraries.

I do not think we should mix both systems; we should make a choice between git submodule and composer. A rough comparison off the top of my head follows -- we should probably start a formal RFC about it, though some are probably going on already.
== git submodule ==
+ command already available
+ already used by Wikimedia to handle extension dependencies in the wmf branches
+ lets us review the code
+ ability to patch third parties
- requires fully cloning each repository
- version tracked by a git sha1 pointer instead of a version number

== composer ==
+ generates autoloader entries
+ has a post-install system which could be used to inject settings into LocalSettings.php
+ could be used to handle extension dependencies
- depends on upstream to incorporate our patches
- needs composer to be installed
- might not be usable as-is on the Wikimedia cluster

Before we make any decision, I would love to hear from Bryan Davis (he is on vacation this week) and Jeroen De Dauw, who have already put a lot of effort into dependency management.

cheers,
-- Antoine "hashar" Musso
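[Editor's note: Tim's submodule compromise would amount to a .gitmodules entry along these lines; a sketch, and the exact repository URL is an assumption based on the mediawiki/core/vendor name used in this thread.]

```
[submodule "vendor"]
    path = vendor
    url = https://gerrit.wikimedia.org/r/mediawiki/core/vendor
```

The commit each clone checks out is then pinned by the sha1 recorded in the superproject, which is what makes `git submodule update --init` reproducible.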
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 1:58 PM, Antoine Musso hashar+...@free.fr wrote:
> A rough comparison off the top of my head -- we should probably start a
> formal RFC about it, though some are probably going on already.

Please find Bryan's RFC here:
https://www.mediawiki.org/wiki/Requests_for_comment/Composer_managed_libraries_for_use_on_WMF_cluster

Thanks,
Tony Thomas http://tttwrites.in
[Wikitech-l] problem offline browsing wikipedia using polipo http proxy
Hi,

I'm using the polipo HTTP proxy, which works fine with Wikipedia as long as the machine is online. For offline usage, polipo has the option to store cached HTTP data persistently on disk, and something, perhaps style-sheet related, goes wrong when trying to do this with Wikipedia and Firefox.

The polipo version I am using is this:
http://www.pps.univ-paris-diderot.fr/~jch/software/files/polipo/polipo-1.1.1.tar.gz
http://www.pps.univ-paris-diderot.fr/~jch/software/files/polipo/polipo-1.1.1.tar.gz.asc

Testing with http://en.wikipedia.org/wiki/Sanskrit :

* connect network, log out from WP, set Firefox proxy, load Wikipedia page - works fine
* now configure polipo as offline (e.g. /usr/bin/curl -m 5 -d 'proxyOffline=true' 'http://127.0.0.1:8123/polipo/config?' )
* clear Firefox cache
* reload page - content is loaded, but it looks like the CSS is not applied at all.

Watching the load process with web-console/network does not show any obvious problems. All GET requests succeed and appear to return sensible data from the polipo cache, including 2 CSS requests. Only the CentralAutoLogin request fails, which is probably expected and should be irrelevant for CSS?

Does anyone have an idea what goes wrong here?

Richard

---
Name and OpenPGP keys available from PGP key servers
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 4:28 AM, Antoine Musso hashar+...@free.fr wrote:
> - depends on upstream to incorporate our patches
> - might not be used as-is on Wikimedia cluster

I would just like to point out that you can override where Composer finds packages using the composer.json file. In other words, if we wanted to force Composer to use WMF's git repositories to check out dependencies, that is entirely possible. In the end, Composer uses git to check out the packages anyway.

https://getcomposer.org/doc/04-schema.md#repositories

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
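[Editor's note: concretely, the override Tyler mentions is a `repositories` entry in composer.json pointing at a VCS mirror, which Composer then prefers over Packagist. A sketch; the version constraint is illustrative, and the URL is the Wikimedia fork mentioned earlier in this thread.]

```json
{
    "repositories": [
        {
            "type": "vcs",
            "url": "https://github.com/wikimedia/mediawiki-core-vendor-swiftmailer"
        }
    ],
    "require": {
        "swiftmailer/swiftmailer": "5.2.*"
    }
}
```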
Re: [Wikitech-l] Making a plain MW core git clone not be installable
OK, now that I have some more time, I will expand on what I've already said in the Gerrit patch.

== Introduction ==

The WMF does not have that many developers. And it's not that they haven't hired enough or anything like that; it's just that they do not have that many people in general. As of last year, they had under 200 employees, total. I work at an enterprise software company, and we have 200 developers alone on the app/dev team, let alone the additional people (including myself) in the architecture group, in the business analyst team, in management, etc. To think that under 200 people can develop a 400,000+ line codebase is insane.

That's why the WMF gets help from the open source community. Not only because people love to work on open source projects that benefit the world and humanity, but because such projects *need* volunteers, since they simply do not have the billion-dollar operating revenue necessary to develop an enterprise product.

With that said, it is absolutely necessary that the WMF get as much help as it can. Right now it has a lot of help: numerous volunteer developers working on core, working on translatewiki, etc. However, it's not easy convincing people to join an open source software project. I'm sure Quim and Sumana can attest to this. What we can do, however, is "export" the work.

== Third-party libraries ==

=== How can they help? ===

Third-party libraries, i.e., other open-source software projects, are helpful in that:

1) They have existing communities of interested volunteer (and sometimes even paid) developers.
2) They are structured in an abstract manner, meaning they can be used inside MediaWiki without extensive knowledge of the library's internals.
3) They help to remove possibly hundreds if not thousands of lines of code from MediaWiki core that do not need to be there.

The premise is: why reinvent the wheel? Or, to be more specific, why reinvent the wheel by carving a square stone rock?
There are libraries out there (such as SwiftMailer, which I'll get to) that do certain tasks that MediaWiki needs to do. However, they are ten times as comprehensive as MediaWiki's makeshift implementations, and are maintained a lot more.

=== The qualifications ===

Obviously, we cannot just start throwing in third-party libraries willy-nilly. There are major implications, specifically:

* Possible security issues introduced by the library
* Having to submit patches upstream when bugs are discovered

These are real issues. However, if the third-party library we are thinking of including is: 1) popular, i.e., used by other projects, especially commercial and/or commercially-backed projects; 2) uses a sane and reliable code review and release process; and 3) is stable and unlikely to change drastically, then the library is reliable, and we do not really have to worry about these issues.

One argument that people seem to be making is that since the code did not go through *our* review process, the code might be vulnerable or insecure. That is a fallacy. To say that code going through our own review process is more secure than others' is ridiculous. Doing a security review of code requires security knowledge, and those with security knowledge (i.e., Chris) are not reviewing every line of code that goes into core.

Long story short, we do need to evaluate what libraries we put in, but if they are reliable it will not be a problem.

== SwiftMailer ==

Now we get to SwiftMailer. Some are calling it a new shiny toy, and like Daniel said, this is a bit insulting to anybody who has even looked at the UserMailer.php code. With SwiftMailer, the only mail code that will be in MediaWiki will be what is necessary to get the e-mail address of a user. Everything else is handled inside the library. We no longer have to maintain mail-sending code at all.
Putting PEAR aside, considering the major annoyances in using PHP's mail() function, including differences in handling between operating systems, it is a big win if we do not have to deal with the mail() function inside MediaWiki.

--
Tyler Romeo
0xC86B42DF
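[Editor's note: to make the "one code path" argument concrete, here is roughly what sending looks like with SwiftMailer 5. A sketch, not MediaWiki's actual integration; the host, credentials, and addresses are placeholders.]

```php
// Sketch: SwiftMailer 5.x uses the same send path for every transport.
require_once 'vendor/autoload.php';

// Swap in Swift_MailTransport::newInstance() to use PHP's mail() instead;
// nothing below this line would change.
$transport = Swift_SmtpTransport::newInstance('mail.example.com', 587)
    ->setUsername('wiki@example.com')
    ->setPassword('secret');

$mailer  = Swift_Mailer::newInstance($transport);
$message = Swift_Message::newInstance('Password reset')
    ->setFrom(array('wiki@example.com' => 'MediaWiki'))
    ->setTo(array('user@example.com'))
    ->setBody('Your password reset link: ...');

// Unlike mail(), send() reports failed recipients back to the caller.
$failures = array();
$mailer->send($message, $failures);
```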
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 4:28 AM, Antoine Musso hashar+...@free.fr wrote:
> == git submodule ==
> + command already available
> + already used by Wikimedia to handle extensions dependencies in the
>   wmf branches
> + let us review the code
> + ability to patch third parties
> - require to fully clone each repositories

Couldn't you do a shallow clone, if disk space or bandwidth is a concern?

> - version tracked by git sha1 in .gitmodules instead of a version number
>
> == composer ==
> + generates autoloader entries
> + has a post install system which could be used to inject settings in
>   LocalSettings.php

This could well be a minus if it winds up being done poorly.

> + could be used to handle extensions dependencies
> - depends on upstream to incorporate our patches

Well, we could fork if necessary. Although that's seldom fun.

> - needs to install composer

This is one of my pet peeves. It's one thing to have to install dependencies to run MediaWiki, but having to install various dependencies just to *install* it? Ugh!

Composer is a first step. Then there are the proposals floating around to make various nodejs services required (which themselves tend to be a pain because they seem to require dozens of npm packages), which leads to proposals that MediaWiki require puppet/docker/something like that just to be installed because of the overly complex interdependencies. I have nothing against puppet/docker/etc. in general, but it seems like overkill for running a simple MediaWiki installation off of a single host.

> - might not be used as-is on Wikimedia cluster

--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Re: [Wikitech-l] Making a plain MW core git clone not be installable
I will mention that any solution short of sucking the third-party dependencies into the main repo (not as a submodule) -- which no one wants to do anyway -- will be slightly awkward to git bisect. Not impossible; the pain is about the same for both main options:

a) In theory git-bisect should adjust submodules to the correct hash. In practice you need to run `git submodule update` after every step in order to check out the appropriate submodule commits.

b) Similarly, for composer you need to run a command to update the 3rd-party packages, if any dependencies have changed. (For node.js projects, which have similar issues, you need to run `npm install` after every step.)

For regressions that are easily found by running a test suite, you can arrange for `git bisect run` to do the appropriate git, composer, or npm command before running the test. So it's somewhat awkward, but manageable. And the 3rd-party dependencies typically don't change as often as the core code, so this doesn't come up all that often.

Two main benefits of `git submodule`: (1) perhaps one day `git bisect` and `git submodule` will play nicer together; (2) since references are to a git hash, crawling through history is repeatable. One disadvantage: because git allows `.gitmodules` to differ from your local set of module repo sources (in `.git/config`), it is rather too easy to forget to push a submodule commit referenced from the main repo -- although hopefully jenkins will catch errors of this sort.

The main disadvantage of using `composer`/`npm`/etc. directly is that you are at the mercy of the upstream package repo to behave well. That is, if the upstream repo allows uploaders to change the tarball associated with version X.Y.Z, you may find it hard to reproduce past configurations. Similarly, if you specify loose package version constraints, you are at the mercy of the upstream maintainers to actually maintain compatibility between minor versions or what-have-you.
Parsoid and some other WMF projects actually use a hybrid approach: we use `npm` and not submodules, but we maintain a separate deploy repo which combines the main code base (as a submodule) and specific versions of the third-party code.

--scott
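[Editor's note: Scott's `git bisect run` suggestion can be sketched as a small wrapper script. Hypothetical throughout; the phpunit entry point is an assumed example, and DRYRUN defaults to on here so the sketch only prints the commands it would run.]

```shell
#!/bin/sh
# bisect-step.sh (sketch): a `git bisect run` helper that refreshes
# third-party deps before testing each bisected commit. With DRYRUN=1
# (the default here) it only prints the commands it would run.
run() { if [ "${DRYRUN:-1}" = 1 ]; then echo "+ $*"; else "$@"; fi; }

if [ -f .gitmodules ]; then
    run git submodule update --init        # submodule-managed vendor/
elif [ -f composer.json ]; then
    run composer install --no-interaction  # composer-managed vendor/
fi
run php tests/phpunit/phpunit.php          # hypothetical test entry point
```

With DRYRUN=0 this could be invoked as `git bisect run ./bisect-step.sh`.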
Re: [Wikitech-l] Making a plain MW core git clone not be installable
I'm not a developer, so it's perfectly normal that I can't understand anything of your talk; nevertheless, please remember the KISS principle when building any installation tools for us poor, final users. I'm waiting for something like pip install core.

Alex

2014-06-11 15:58 GMT+02:00 C. Scott Ananian canan...@wikimedia.org:
> I will mention that any solution short of sucking the third-party
> dependencies into the main repo (not as a submodule) -- which no one
> wants to do anyway -- will be slightly awkward to git bisect. [...]
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 10:51 AM, Tyler Romeo tylerro...@gmail.com wrote:
> curl -sS https://getcomposer.org/installer | php

... That's just awful.
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 10:42 AM, Alex Brollo alex.bro...@gmail.com wrote:
> I'm not a developer, so it's perfectly normal that I can't understand
> anything of your talk; nevertheless, please remember the KISS principle
> when building any installation tools for us poor, final users. I'm
> waiting for something like pip install core.

Just to clarify the implications of using Composer, specifically:

1) If you are using MediaWiki through tarball releases (or rather, anything other than git), this will *not* affect you. Tarballs will ship with the dependencies pre-loaded, so nothing changes.

2) If you are using MediaWiki through git, the extra steps you will need to install MediaWiki are:

   curl -sS https://getcomposer.org/installer | php
   ./composer.phar install

And hopefully, if the user has enough knowledge to clone a git repository, they should understand the command line and how to run these two basic commands. (As a side note, you can install composer globally by moving composer.phar to /usr/local/bin/composer, which makes things simpler.)

Of course, all of this will be thoroughly documented. Additionally, it is also possible to just integrate Composer into the MediaWiki installer and have the normal install process load the dependencies on its own. Of course, this assumes the web server has write permissions to the install directory.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 10:56 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:
> ... That's just awful.

How so?

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
Re: [Wikitech-l] Making a plain MW core git clone not be installable
Fwiw, that's not required. Lots of packages exist already. Even if you don't have a package for your chosen distro, you can easily install it to your $PATH in less scary ways.

-Chad

On Jun 11, 2014 7:56 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:
> On Wed, Jun 11, 2014 at 10:51 AM, Tyler Romeo tylerro...@gmail.com wrote:
>> curl -sS https://getcomposer.org/installer | php
> ... That's just awful.
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 10:58 AM, Tyler Romeo tylerro...@gmail.com wrote:
> On Wed, Jun 11, 2014 at 10:56 AM, Brad Jorsch (Anomie)
> bjor...@wikimedia.org wrote:
>> ... That's just awful.
> How so?

Well, it makes *me* wince because you're directing people to pull code over the network and feed it straight to the PHP interpreter, probably as root, without inspecting it first. And the site is happy to send it to you via plain HTTP, which means a one-character typo gives an active attacker a chance to pwn your entire installation.

No, nobody bothers to read all the code they just checked out of git, but it's integrity-protected by design, independent of the transport channel -- you know that the code you just received is the exact same code everyone else is getting, so you can trust that *someone* did the security audit. (And yeah, no one does *that* either, which is how we got the OpenSSL fiasco, but computers can't solve that problem.)

zw
Re: [Wikitech-l] Making a plain MW core git clone not be installable
Ouch, thanks for wasting a few of my brain cells. This is why we don't add stupid code to core. My web server doesn't have curl installed, nor does it have /usr/bin/local/. You haven't bothered to think your code through. Why don't you un-fuck your code, configure it as an extension, and go from there? At that point you can find out exactly how many sites you're going to break. Once you have a stable, reviewed extension we can *think* about merging it to core.

On Wed, Jun 11, 2014 at 11:21 AM, Tyler Romeo tylerro...@gmail.com wrote:
> It's over HTTPS. As long as you trust that getcomposer.org is the domain
> you are looking for, this is really no different than installing via a
> package manager.
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 11:05 AM, Zack Weinberg za...@cmu.edu wrote: Well, it makes *me* wince because you're directing people to pull code over the network and feed it straight to the PHP interpreter, probably as root, without inspecting it first. And the site is happy to send it to you via plain HTTP, which means a one-character typo gives an active attacker a chance to pwn your entire installation. It's over HTTPS. As long as you trust that getcomposer.org is the domain you are looking for, this is really no different than installing via a package manager. *-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
Can we kill the subthread dealing with the awful "pipe the output of curl to php" install for Composer? Its evilness is not really on topic (not until we start writing suggested install directions on the wiki). As Chad noted, there are sane-sysadmin ways to install Composer. I think it would be more productive to continue discussing how we want to handle third-party dependencies, rather than arguing over install instructions. --scott On Wed, Jun 11, 2014 at 11:21 AM, Tyler Romeo tylerro...@gmail.com wrote: On Wed, Jun 11, 2014 at 11:05 AM, Zack Weinberg za...@cmu.edu wrote: Well, it makes *me* wince because you're directing people to pull code over the network and feed it straight to the PHP interpreter, probably as root, without inspecting it first. And the site is happy to send it to you via plain HTTP, which means a one-character typo gives an active attacker a chance to pwn your entire installation. It's over HTTPS. As long as you trust that getcomposer.org is the domain you are looking for, this is really no different than installing via a package manager. *-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science -- (http://cscott.net) ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 11:21 AM, Tyler Romeo tylerro...@gmail.com wrote: It's over HTTPS. As long as you trust that getcomposer.org is the domain you are looking for, this is really no different than installing via a package manager. Nothing stops you from installing it over insecure HTTP. (I filed https://github.com/composer/composer/issues/3047 for that.) But this is bad practice even with HTTPS; you're relying on *transport* integrity/authenticity to secure *document* authenticity. Yeah, we do that all the time on today's Web, but software installation is (I don't think this is hyperbole) more security-critical than anything else and should be held to higher standards. In this case, there should be an independently verifiable (i.e. not tied to the TLS PKI) PGP signature on the installer and people should be instructed to check that before executing it. Note that Git submodules do this for you automatically, because the revision hash is unforgeable. zw ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
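The "verify the document, not just the channel" idea Zack describes can be sketched in shell. This is a hedged illustration, not Composer's official install procedure: the expected digest is assumed to come from an out-of-band, independently authenticated source (for example a PGP-signed release announcement), never from the same server as the download itself.

```shell
# Minimal sketch: refuse to execute a downloaded installer unless its digest
# matches a value obtained out-of-band. The file name and the way the expected
# digest is distributed are assumptions for illustration.
verify_installer() {
    local file="$1" expected="$2" actual
    actual=$(sha384sum "$file" | cut -d' ' -f1)
    if [ "$actual" = "$expected" ]; then
        echo "digest ok"
    else
        echo "digest MISMATCH, refusing to run" >&2
        return 1
    fi
}
```

Only after the check succeeds would the file be handed to the PHP interpreter; a mismatch means the download and the published digest disagree, exactly the case a transport-only check cannot catch.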
Re: [Wikitech-l] Making a plain MW core git clone not be installable
No one has really addressed the point of making this an extension and not adding the excessive overhead to core, especially for something that may have such a wide impact. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
Thanks Zack for actually explaining the reasoning to me, rather than trying to insult my intelligence and then use it as an argument against the proposal. -- Tyler Romeo 0xC86B42DF From: Zack Weinberg za...@cmu.edu Reply: Wikimedia developers wikitech-l@lists.wikimedia.org Date: June 11, 2014 at 11:47:34 To: Wikimedia developers wikitech-l@lists.wikimedia.org Subject: Re: [Wikitech-l] Making a plain MW core git clone not be installable Nothing stops you from installing it over insecure HTTP. (I filed https://github.com/composer/composer/issues/3047 for that.) But this is bad practice even with HTTPS; you're relying on *transport* integrity/authenticity to secure *document* authenticity. Yeah, we do that all the time on today's Web, but software installation is (I don't think this is hyperbole) more security-critical than anything else and should be held to higher standards. In this case, there should be an independently verifiable (i.e. not tied to the TLS PKI) PGP signature on the installer and people should be instructed to check that before executing it. Note that Git submodules do this for you automatically, because the revision hash is unforgeable. signature.asc Description: Message signed with OpenPGP using AMPGpg ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] How to make MediaWiki easier to install: use cases
In the current discussion about git submodules vs. composer there are several different underlying assumptions about the user's situation. I think it would help the discussion to clarify which use cases we are dealing with. Here is an attempt: 1) Shared hosting without shell. The user uploads code with (s)ftp, and can't install anything globally. 2) Shared hosting with non-root shell and git installed. The user can use git directly on the server, but can't install anything globally without root. They can manually download composer to their home directory. 3) Root on a (virtual) server. The user can install packages, and do any of the above. The git submodules vs. composer discussion seems to focus on case 2). Case 1) could be addressed by providing a 'bundle' tar file with all dependencies that can be uploaded via (s)ftp. In case 2) composer or git can be used on the server to fetch dependencies separately. When using git, it might be worth considering Parsoid's method of making the core repository a submodule of a 'core-deploy' repository which has all dependencies, rather than making the dependencies a submodule of core. This avoids issues with git complaining about dirty submodules in the common case of updating core often. In case 3) the user has a full packaging system at their disposal, which means that it is theoretically possible to set up a fully-featured MediaWiki system with a few commands. So far we don't have any special support for this case (we expect users to follow the manual tarball setup), which made sense in the past as folks running their own server were fairly rare. Many of our users are starting to take advantage of cheap virtual machines though, which are now widely available at a price point comparable to shared hosting. 
For this reason I think that we should put more effort into supporting case 3), for example by providing good Debian packaging which lets you do apt-get install mediawiki-full and get a MediaWiki install with caching, VisualEditor and so on. There are also other benefits here beyond the initial install, like automatic security updates with unattended-upgrades. So far we don't have a good idea of how common the different use cases are, and how this distribution is changing. I think that we should try to get this information so that we can have a more informed debate. Gabriel ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Tech Talk: How, What, Why of WikiFont
Hello! Reminder: This TechTalk is starting in 30 minutes. *If you would like to join the hangout or follow along on YouTube:* https://plus.google.com/events/chpgv8usjd6dn38on07njjk28hg IRC: #wikimedia-office *More information about this Tech Talk from Jared Zimmerman: * If you or someone you love ever uses raster or vector (svg) assets in any of your work, this tech talk is for you. *What is Wikifont?* Wikifont is an embedded webfont https://en.wikipedia.org/wiki/Web_typography which contains single-color glyphs for common iconography used in our projects. It is part of our move toward a unified cross-device/platform user interface, sometimes referred to as mediawiki.ui. It's also a way to reduce the number of calls to the server for assets (yay performance!) *What you may learn at this talk:* Is wikifont relevant to my work? (*hint: yes*) How do I use WikiFont? How do I request new assets in WikiFont? Who is using WikiFont already? (Mobile Apps, Flow, Mobile Web, Some Beta Features) How does WikiFont work with i18n/l10n? Do we hate IE6 users? *Bring your questions, your lunch, and come hear Shahyar, May, and Monte share some cool new things. * On Mon, Jun 9, 2014 at 5:55 PM, Rachel Farrand rfarr...@wikimedia.org wrote: Please join us on google hangout June 11 @ 1900 UTC http://www.timeanddate.com/worldclock/fixedtime.html?msg=WikiFontiso=20140611T12p1=224ah=1 for the following Tech Talk: *How, What, Why of WikiFont * *Presented by Core Features Engineer: Shahyar Ghobadpour, Visual Designer: May Galloway and Mobile Apps Engineer: Monte Hurd* *Wikifont-glyphs is a collection of icons that have been used in our projects, made into a font. We will be showcasing how you can take advantage of this icon font set, how to request more icons and contribute to the set. * If you want to watch live you can join the hangout here https://plus.google.com/events/chpgv8usjd6dn38on07njjk28hg and follow along and ask questions in #wikimedia-office on IRC.
More info about Wikifont here https://www.mediawiki.org/wiki/Design/Wikifont The WebFont Tech Talk will also be available for viewing later here https://www.youtube.com/user/watchmediawiki on the mediawiki youtube channel. Thanks! Rachel ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Improving event announcements (was Re: Tech Talk: How, What, Why of WikiFont)
On Jun 10, 2014 10:19 AM, Quim Gil q...@wikimedia.org wrote: The video can be watched from the Google+ URL that we are advertising. Even if not logged in? Please double check with a clean cookie jar. (Seems to just give me a login prompt. No chance to play the video. I just tried with the wikifont link.) If for whatever reason you prefer to watch it in YouTube, that page also contains a link to the corresponding YouTube page, under Links. If it makes sense, I would prefer to avoid sending two different URLs in our announcements pointing to the same video. Maybe then the youtube link should be the primary link advertised. Also, a more general question: emails are nice, but how about an onwiki page per event? The email could link to that, and maybe YouTube too; for extra links like Google+ you would go to the wiki to get them. -Jeremy ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] How to make MediaWiki easier to install: use cases
On 2014-06-11, 11:47 AM, Gabriel Wicke wrote: In the current discussion about git submodules vs. composer there are several different underlying assumptions about the user's situation. I think it would help the discussion to clarify which use cases we are dealing with. Here is an attempt: 1) Shared hosting without shell. The user uploads code with (s)ftp, and can't install anything globally. ... The git submodules vs. composer discussion seems to focus on case 2). Case 1) could be addressed by providing a 'bundle' tar file with all dependencies that can be uploaded via (s)ftp. In case 2) composer or git can be used on the server to fetch dependencies separately. Shared hosting users using (s)ftp (1) inherently already use our tarball releases exclusively. No-one has suggested requiring extra steps for tarball users, any dependencies we use would be bundled with tarball releases. The only thing being discussed is whether users that have already chosen to use git (and already have git) to clone core (which is technically supposed to be a dev tool even though people like to abuse it for production use) should have to run one extra command (either `composer install` or `git submodule update --init`) before it can be used. ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/] ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
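The "one extra command" situation Daniel describes can be illustrated with a small shell check. `vendor/autoload.php` is the file Composer generates on `composer install`; the helper names here are invented for illustration, not part of any MediaWiki tooling.

```shell
# After a bare `git clone` of core, dependencies are absent until one of the
# proposed commands has been run. Composer signals completion by generating
# vendor/autoload.php, so its presence is a cheap readiness check.
deps_ready() {
    [ -f "$1/vendor/autoload.php" ]
}

setup_hint() {
    if deps_ready "$1"; then
        echo "ready"
    else
        echo "run 'composer install' or 'git submodule update --init' first"
    fi
}
```

A tarball would ship with `vendor/` prepopulated, so only git users ever see the hint.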
Re: [Wikitech-l] How to make MediaWiki easier to install: use cases
As Daniel hinted at, I'd like to add one more use case: (4) prospective developers who want to do a small install for local testing and contribute patches. This turns out to be very similar to use case (2), but it motivates the use of git (rather than a tarball) more strongly. Case (4) also prioritizes low-overhead installs, i.e. can we get the developer set up and productive quickly enough that they don't lose interest. Approaches like using a sqlite3 database help case (4), even though it might not be reasonable to use sqlite for a real wiki (case 2) for performance reasons, regardless of the hosting situation. Gerrit-vs-Phabricator-vs-github and other tooling tradeoffs also matter for use case (4), as we want to minimize the total number of tools and packages the prospective developer needs to install/learn before they can test and submit their first patch. --scott ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/2014 15:51, Brad Jorsch (Anomie) wrote: - requires fully cloning each repository Couldn't you do a shallow clone, if disk space or bandwidth is a concern? The git submodule add|update subcommands support --depth to produce a shallow clone. The commit: https://github.com/git/git/commit/275cd184d Introduced in git v1.8.4 ( git tag --contains 275cd184 ) So yeah, that would save time/bandwidth/disk. -- Antoine hashar Musso ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
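Antoine's point is easy to check locally. The sketch below clones over the file:// protocol because git ignores --depth for plain local path clones; the repository names are throwaway.

```shell
# Demonstrate that --depth 1 transfers only the most recent commit.
tmp=$(mktemp -d) && cd "$tmp"
git init -q source
git -C source -c user.name=t -c user.email=t@example.org \
    commit -q --allow-empty -m 'first'
git -C source -c user.name=t -c user.email=t@example.org \
    commit -q --allow-empty -m 'second'
# file:// forces the network code path, so the shallow option takes effect
git clone -q --depth 1 "file://$tmp/source" shallow
git -C shallow rev-list --count HEAD
```

The shallow clone reports a single reachable commit while the source repository has two; `git submodule update --init --depth 1` applies the same mechanism to submodules on git >= 1.8.4.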
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/2014 15:51, Brad Jorsch (Anomie) wrote: - needs to install composer This is one of my pet peeves. It's one thing to have to install dependencies to run MediaWiki, but having to install various dependencies just to *install* it? Ugh! Composer is a first step. Then there's the proposals floating around to make various nodejs services be required (which themselves tend to be a pain because they seem to require dozens of npm packages), which leads to proposals that MediaWiki require puppet/docker/something like that just to be installed because of the overly complex interdependencies. I have nothing against puppet/docker/etc in general, but it seems like overkill for running a simple MediaWiki installation off of a single host. Composer is more and more widely used in the PHP world, and I see it as essentially replacing PEAR, which probably nobody is going to regret. So if someone is going to install MediaWiki from git, there is a very good chance they already have Composer, and they would certainly be able to figure out how to install it. End users would rely on tarballs that ship the third-party dependencies. The orientation toward service-oriented architecture (i.e. a bunch of well-defined APIs implemented in whatever language) will surely make it harder and harder to keep supporting third parties. But that is another can of worms. -- Antoine hashar Musso ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] 45 minutes till RfC lightning round!
On 06/06/2014 11:05 PM, Sumana Harihareswara wrote: At the Wednesday, 11 June RfC meeting we'll briefly look at several RfCs to decide on next actions. Early in the week I'll make a list and send it out onlist. https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-06-11 would be the page to suggest your RfC! And at a Friday, 13 June discussion we'll be discussing the security guidelines that Chris Steipp is working on. That's 1500 UTC. https://www.mediawiki.org/wiki/Architecture_meetings/Security_guidelines_discussion_2014-06-13 Time: http://www.timeanddate.com/worldclock/fixedtime.html?iso=20140613T15p1=1440 Both will be in #wikimedia-office on Freenode IRC. I'm sorry I didn't make the list of stuff we'll discuss today earlier. I always welcome your suggestions and they make it easier and faster! * HTML templating library - per Gabriel Wicke's note, does anyone from Mobile have feedback on the prototype? Initial thoughts on HTML content templating? * Reducing image quality for mobile - the patch has been merged into core. Yuri asked: do we use it via JavaScript rewrite or Varnish-based rewrite? * Debugging at production server - any thoughts on the patch or proposed implementation? * VERY BRIEF thoughts on Composer if we have time. -- Sumana Harihareswara Senior Technical Writer Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] How to make MediaWiki easier to install: use cases
On 06/11/2014 12:29 PM, C. Scott Ananian wrote: As Daniel hinted at, I'd like to add one more use case: (4) prospective developers who want to do a small install for local testing and contribute patches. Scott I have started to summarize the use cases at https://www.mediawiki.org/wiki/Distribution/Use_cases Lets refine that page, so that we can use it as a reference while discussing solutions. Gabriel ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] How to make MediaWiki easier to install: use cases
Also: people with shell but no git. Sent from my mobile device ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Improving event announcements (was Re: Tech Talk: How, What, Why of WikiFont)
On Wednesday, June 11, 2014, Jeremy Baron jer...@tuxmachine.com wrote: On Jun 10, 2014 10:19 AM, Quim Gil q...@wikimedia.org wrote: The video can be watched from the Google+ URL that we are advertising. Even if not logged in? Please double check with a clean cookie jar. After saying good-bye to my entire collection of convenient cookies... I can still access https://plus.google.com/events/chpgv8usjd6dn38on07njjk28hg as an anonymous user, and from there I can play the video and I can find the YouTube link under Links. (Seems to just give me a login prompt. No chance to play the video. I just tried with the wikifont link.) Anybody else willing to test, please? If for whatever reason you prefer to watch it in YouTube, that page also contains a link to the corresponding YouTube page, under Links. If it makes sense, I would prefer to avoid sending two different URLs in our announcements pointing to the same video. Maybe then the youtube link should be the primary link advertised. Also, more general question: emails are nice but how about an onwiki page per event? Email could link to that and maybe youtube too and then for extra links like plus you would have to go to the wiki to get them. The Google+ page is automatically created when we schedule a hangout, and it can have the description and links required. It is not perfect but it saves us one of the many steps we have to do to announce an event, which are just too many. I used to create those wiki pages, and I felt it was too much work for too little return. -- Quim Gil Engineering Community Manager @ Wikimedia Foundation http://www.mediawiki.org/wiki/User:Qgil ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11 Jun 2014, at 22:05, Antoine Musso hashar+...@free.fr wrote: On 11/06/2014 15:51, Brad Jorsch (Anomie) wrote: - needs to install composer This is one of my pet peeves. It's one thing to have to install dependencies to run MediaWiki, but having to install various dependencies just to *install* it? Ugh! Composer is a first step. [..] Composer is being more and more used in the PHP world and I see it as essentially replacing PEAR, which probably nobody is going to regret. [..] True, Composer should be easier to work with than PEAR[1]. However, afaik we never depended on any PEAR packages. We either shipped them (Services_JSON), provided a fallback or made the feature optional / in an extension. -- Krinkle [1] Easier to install because it requires fewer permissions, since it uses a local directory. Easier to deploy because of this. And managed with a manifest in composer.json instead of reading a manual of sorts, etc. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
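The per-project manifest Krinkle mentions is worth seeing concretely. A minimal hypothetical composer.json declaring the libraries discussed in this thread; the version constraints are illustrative only, not a proposal:

```json
{
    "require": {
        "swiftmailer/swiftmailer": "~5.0",
        "psr/log": "~1.0"
    }
}
```

Unlike PEAR's global install, `composer install` run against this file places everything under the project's own `vendor/` directory, which is what makes it usable without shell or admin access to the wider system.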
Re: [Wikitech-l] LiquidThreads - how do we kill it?
On 9 Jun 2014, at 20:58, Bartosz Dziewoński matma@gmail.com wrote: On Mon, 09 Jun 2014 20:52:44 +0200, Martijn Hoekstra martijnhoeks...@gmail.com wrote: In this case, which post are you replying to in flow when you reply to multiple people? In mediawiki you sort of work around the issue, and it sort of works because you try to create some ad-hoc solution. When the software creates a hard dependency between posts, where it is difficult now to keep track of these kinds of discussion, it may become even more difficult to follow them then. Since we've established that this is something that currently does happen, I think even if it is (to be polite (?)) completely insane, it's something that should be supported anyway. When I encounter this issue on mailing lists, I usually just reply to the lowest common ancestor of all the posts I want to reply to at once, or split my reply and respond to each separately. (And mailing lists are interesting by itself, because most actual e-mail clients display the discussion in a threaded fashion, while most webmails like GMail display a flat list of replies.) Introducing a structured discussion is hard enough, let's not invent issues where there are none. :) I have used many different desktop and mobile e-mail applications that aren't web based. The last time I saw threaded instead of flattened display by default was when I installed Microsoft Office on Windows 98 SE, which hid Outlook Express, introduced Microsoft Outlook, and with it exposed me to threaded display of mailing lists[1]. Every other mail client I've used did not do this (not for mailing lists, not for the regular inbox): Outlook Express, Eudora, Thunderbird, Apple Mail and various web-based clients. These different clients may have carried over my preferences, but they all had an option to display it flattened. And I believe it was the default.
While we may or may not know what the default was, I'm pretty sure that "most e-mail clients display discussions in a threaded fashion" is patently not true. -- Krinkle [1] Would've roughly looked like this: http://i.imgur.com/fK1NV2G.png ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Tech Talk: How, What, Why of WikiFont
On Mon, Jun 9, 2014 at 5:55 PM, Rachel Farrand rfarr...@wikimedia.org wrote: Please join us on google hangout June 11 @ 1900 UTC http://www.timeanddate.com/worldclock/fixedtime.html?msg=WikiFontiso=20140611T12p1=224ah=1 for the following Tech Talk: *How, What, Why of WikiFont* *Presented by Core Features Engineer: Shahyar Ghobadpour, Visual Designer: May Galloway and Mobile Apps Engineer: Monte Hurd* *Wikifont-glyphs is a collection of icons that have been used in our projects, made into a font. We will be showcasing how you can take advantage of this icon font set, how to request more icons and contribute to the set. * If you want to watch live you can join the hangout here https://plus.google.com/events/chpgv8usjd6dn38on07njjk28hg and follow along and ask questions in #wikimedia-office on IRC. More info about Wikifont here https://www.mediawiki.org/wiki/Design/Wikifont The WebFont Tech Talk will also be available for viewing later here https://www.youtube.com/user/watchmediawiki on the mediawiki youtube channel. Thanks! Rachel Thank you very much for the interesting talk, everybody. I do have a follow-up question though, that I didn't think of during the talk. It was indicated that the glyphs will occupy the private use space. Many symbols already have their own codepoints over several blocks, http://en.wikipedia.org/wiki/Unicode_Symbols . Would it make sense for those symbols that already have a code point to use the already defined code point rather than a private use code point? I'm thinking about the star, black star, pencil, paperclip and gear for example, but I think the same goes for most glyphs on the image in the github page. --Martijn ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] How to make MediaWiki easier to install: use cases
On 12/06/14 05:29, C. Scott Ananian wrote: As Daniel hinted at, I'd like to add one more use case: (4) prospective developers who want to do a small install for local testing and contribute patches. For development, I like to have everything checked out from some kind of version control system, so that if I need to edit a third-party component, I can easily do that and have my changes tracked, so that a diff or commit can easily be generated. Also, on my laptop, I have a privilege separation boundary between code editing (which is done as the main desktop user) and execution of bleeding-edge code. I think that's a very sensible boundary, since there is all sorts of private data available to the main desktop user, and code from git is often not reviewed. So I'm not going to run composer install hooks on my laptop as a user that can actually install code. So I think for developers, either submodules or explicit git clones are the best solutions. But I don't think anyone is proposing a solution that would prevent explicit git clones. -- Tim Starling ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Nesting of mediawiki/core/vendor inside mediawiki/core
One thing that concerns me about the proposed composer setup, that I haven't mentioned yet, is the nesting of the project hierarchy, with mediawiki/core/vendor inside mediawiki/core. You know that we have mediawiki/extensions instead of mediawiki/core/extensions -- that makes it easy to check out all Gerrit repos in the natural directory hierarchy and to still have a clean $IP/extensions which you can use for installer testing or whatever. I wonder if a similar solution is possible with vendor -- could we have mediawiki/vendor instead of mediawiki/core/vendor? Then it would be possible to play around with dropping things into $IP/vendor while still having the vendor git repo checked out in a sensible location, available for editing. -- Tim Starling ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
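Composer itself can be pointed at an out-of-tree vendor location through its `vendor-dir` config key, so the layout Tim describes is at least mechanically possible on Composer's side. A hedged sketch of the relevant composer.json fragment; the `../vendor` path is purely illustrative, and whether the autoloader and deployment tooling cope with it would still need checking:

```json
{
    "config": {
        "vendor-dir": "../vendor"
    }
}
```

With that in place, `composer install` run from mediawiki/core would populate a sibling mediawiki/vendor directory instead of mediawiki/core/vendor, mirroring the mediawiki/extensions arrangement.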
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/14 05:01, Matthew Walker wrote: This also has knock on impacts elsewhere. BD808 has a patch that uses PSR-log and Monolog for logging. We're starting to move to a model where we recognize that we shouldn't write everything and that things in core have significantly better replacements in the wider PHP community. It doesn't make sense to keep maintaining the vastly inferior core components when more and more core and extensions are going to want to rely on the newer interfaces and features. The question Tim posed in the commit comes down to: * Do we bundle third party dependencies, or * Do we allow composer to do what it was designed to do and manage the dependencies for us ~Matt Walker Wikimedia Foundation Fundraising Technology Team Problem with composer is it doesn't actually work for all use cases (farms in particular can be problematic, even if the wmf did get it working), so depending on it isn't really a good idea unless there is a sane fallback. What folks have mentioned as alternatives so far (submodules in core etc) do not sound sane. I'm all for having better handling for things in core, but we need a sane way to include them, and sane defaults that don't require weird dependencies and bending over backwards when all you want to do is clone the thing and get set up. -I ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On Wed, Jun 11, 2014 at 7:20 PM, Isarra Yos zhoris...@gmail.com wrote: Problem with composer is it doesn't actually work for all use cases (farms in particular can be problematic, even if the wmf did get it working), so depending on it isn't really a good idea unless there is a sane fallback. What folks have mentioned as alternatives so far (submodules in core etc) do not sound sane. Which use cases are you referring to exactly? (I'm curious as to how composer does not work with farms.) *-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Nesting of mediawiki/core/vendor inside mediawiki/core
On 11/06/14 22:26, Tim Starling wrote: One thing that concerns me about the proposed composer setup, that I haven't mentioned yet, is the nesting of the project hierarchy, with mediawiki/core/vendor inside mediawiki/core. You know that we have mediawiki/extensions instead of mediawiki/core/extensions -- that makes it easy to check out all Gerrit repos in the natural directory hierarchy and to still have a clean $IP/extensions which you can use for installer testing or whatever. I wonder if a similar solution is possible with vendor -- could we have mediawiki/vendor instead of mediawiki/core/vendor? Then it would be possible to play around with dropping things into $IP/vendor while still having the vendor git repo checked out in a sensible location, available for editing. -- Tim Starling Actually, I like the sound of this. The extensions thing is sane (well, relatively) and useful. This would also perhaps be sane. Hopefully we won't wind up with an explosion of these pocket repositories down the road, though, especially necessary ones... technically none of them are actually necessary currently, and I don't think we want core to eventually just be replaced by a meta repository of a whole bunch of component repositories that then have all their own component dealies and fish and squid and a bunch of goats scattered throughout? Or... something. -I ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/14 23:21, Tyler Romeo wrote: On Wed, Jun 11, 2014 at 7:20 PM, Isarra Yos zhoris...@gmail.com wrote: Problem with composer is it doesn't actually work for all use cases (farms in particular can be problematic, even if the wmf did get it working), so depending on it isn't really a good idea unless there is a sane fallback. What folks have mentioned as alternatives so far (submodules in core etc) do not sound sane. Which use cases are you referring to exactly? (I'm curious as to how composer does not work with farms.) I'm afraid the sysadmins responsible for the decisions at ShoutWiki and Uncyclomedia would have to speak for themselves as to details; I only know it didn't work for either, and that such is something to consider. (Apparently there was an rfc on this - did nobody bring it up there? Or did they even know about it? Or am I thinking of something else?) -I ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Nesting of mediawiki/core/vendor inside mediawiki/core
I actually think that Composer-installed stuff should be under $IP/includes, as that's where most code should be.

On Wed, Jun 11, 2014 at 3:26 PM, Tim Starling tstarl...@wikimedia.org wrote:
> One thing that concerns me about the proposed composer setup, that I
> haven't mentioned yet, is the nesting of the project hierarchy, with
> mediawiki/core/vendor inside mediawiki/core.
>
> You know that we have mediawiki/extensions instead of
> mediawiki/core/extensions -- that makes it easy to check out all Gerrit
> repos in the natural directory hierarchy and to still have a clean
> $IP/extensions which you can use for installer testing or whatever. I
> wonder if a similar solution is possible with vendor -- could we have
> mediawiki/vendor instead of mediawiki/core/vendor? Then it would be
> possible to play around with dropping things into $IP/vendor while
> still having the vendor git repo checked out in a sensible location,
> available for editing.
>
> -- Tim Starling

--
Best regards,
Max Semenik ([[User:MaxSem]])
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/14 14:49, I wrote:
> In particular, there is a mail library called SwiftMailer which
> provides bounce detection, among other things. Bounce detection would
> be a nice thing to have.

Sorry, it seems I was misled on this. SwiftMailer does not have any special handling for bounces, judging by the code review I have done just now and
http://pookey.co.uk/wordpress/archives/183-using-verp-with-swiftmailer-symfony-and-exim

All it has is a setReturnPath() method, and of course we have equivalent code in UserMailer.php already:

    $headers['Return-Path'] = $from->address;

With VERP this would become:

    $headers['Return-Path'] = $this->makeReturnPath( $from );

or some such. With SwiftMailer, you could instead have:

    $message->setReturnPath( $this->makeReturnPath( $from ) );

...with an identical implementation of makeReturnPath(). SwiftMailer does not give you any help in receiving bounces. It has no MIME parsers or clients for POP, IMAP, MTA pipes, etc.

On [[Talk:VERP]], Tyler suggested writing a SwiftMailer plugin to implement VERP, by which he apparently means a plugin which would serialize the relevant parameters and calculate an HMAC, and then set the return path. Presumably this would look like:

    $verp = new Swift_Plugins_VerpPlugin(
        array(
            'wiki' => wfWikiId(),
            'ua' => 'UserMailer',
            'time' => wfTimestampNow()
        ),
        $secretKey
    );
    $mailer->registerPlugin( $verp );
    $mailer->send( $message );

With Swift_Plugins_VerpPlugin being along the lines of:

    class Swift_Plugins_VerpPlugin {
        function __construct( $context, $secretKey ) {
            $this->context = $context;
            $this->secretKey = $secretKey;
        }

        function beforeSendPerformed( $evt ) {
            $message = $evt->getMessage();
            $message->setReturnPath(
                $this->makeReturnPath( $message->getReturnPath() ) );
        }
        ...
    }

...with makeReturnPath() being the serialization code, equivalent to my suggestion of UserMailer::makeReturnPath() above.

On [[Talk:VERP]], Jeff Green was fairly skeptical about the value of doing it this way, and I guess I am too.
You have to wonder where the boundary is between abstraction and needless complexity. And where do you put the receive code? Surely the return path parser and the return path printer should be maintained together.

SwiftMailer is basically just an SMTP client. I would think that a bounce processing feature as a whole would be an elegant thing to put a module boundary around, rather than having half of it implemented as an SMTP client plugin and the other half implemented as a MediaWiki-specific CLI interface for an MTA pipe.

In other news, I found a serious security vulnerability in SwiftMailer and have reported it upstream.

-- Tim Starling
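For concreteness, the serialization half of the scheme discussed above (the "return path printer") might look something like this minimal sketch. Everything here, from the address format to the truncated HMAC and the example domain, is an illustrative assumption rather than actual MediaWiki or SwiftMailer code:

```php
<?php
// Hypothetical sketch of a VERP-style makeReturnPath().
// Field layout, separator, and signature length are all assumptions.
function makeReturnPath( $recipient, $secretKey,
	$prefix = 'bounce', $domain = 'wiki.example.org'
) {
	// Pack the original recipient and a timestamp into the local part.
	// base64 avoids characters that would need quoting in an address,
	// and contains no '-', so the separator stays unambiguous.
	$payload = base64_encode( $recipient ) . '-' . time();

	// Sign the payload so the bounce processor can reject forgeries.
	$sig = substr( hash_hmac( 'sha1', $payload, $secretKey ), 0, 12 );

	return "$prefix-$payload-$sig@$domain";
}
```

A bounce processor reading from an MTA pipe would then split the local part on '-', recompute the HMAC, and only decode the recipient if the signature matches; keeping that parser next to this printer is exactly the module boundary argued for above.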
Re: [Wikitech-l] Making a plain MW core git clone not be installable
On 11/06/14 16:18, Daniel Friesen wrote:
> - Current stable 1.2.0, released in March of 2010

Note that the SMTP part is implemented in Net_SMTP, which was last released in 2013, and the MIME printer part (which we don't currently use) is in Mail_Mime, last released in May 2014. Swift, by contrast, is a 43 KLOC monolith which does all three things and a kitchen sink of other bits and pieces, like its own dependency injection system (Swift_DependencyContainer) supported by a custom autoloader.

> - And, ;) making PEAR Mail installable via Composer *snicker*

You snicker at the idea that the maintainers of PEAR Mail are not soldiers in a pitched battle for the minds of developers, PEAR on one side and Composer on the other? Installer neutrality doesn't seem so strange to me.

Note that the PEAR core itself is installable via Composer, and has a composer.json in its git source tree. Net_SMTP and Mail_Mime also have a composer.json. Maybe we should think about installing PEAR Mail in the vendor directory using Composer, instead of Swift.

-- Tim Starling
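If the PEAR mail stack were installed via Composer as suggested above, the require section of a composer.json might look roughly like this. The package names and version constraints are assumptions based on the composer.json files mentioned in the message, and would need to be checked against what is actually published before anyone relies on them:

```json
{
    "require": {
        "pear/mail": "~1.2",
        "pear/net_smtp": "~1.6",
        "pear/mail_mime": "~1.8"
    }
}
```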