Re: [Wikitech-l] SSL certificates for Wikimedia sites
Tim Landscheidt t...@tim-landscheidt.de wrote in message news:m3skfsna0i@passepartout.tim-landscheidt.de... Domas Mituzas midom.li...@gmail.com wrote: I know what happens when a self-signed certificate is used. Why the heck is that an issue with the wikitech.wikimedia.org wiki? Because when you access URI:https://wikitech.wikimedia.org/, it will bark :-). Had all references to wikitech.leuksman.com not been advertising the HTTPS access (and the Google ratio is still about 55900:209 :-)), I would not care. But IMVHO *if* HTTPS requests are served, that should be done properly. Firefox, for example, gives a very scary notice if you visit that address. I for one would not trust anything for which such a scary notice was generated, even if I trust the owners of the site (as I do here). The message indicates that the site may have been compromised, and that is too much of a risk to take these days. IE gives a less scary message, but it still very firmly informs you: "close this webpage and do not continue to this website". Again, not a message I would ignore. Seriously, unless you are intentionally trying to scare people away from the site, this should be fixed. - Mark Clements (HappyDog) ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
We used to provide a search using htdig, but it failed to update and finally got deactivated. What about adding a new search with Lucene, just as the wiki search? Then the mediawiki.org search could incorporate a 'search mediawiki-l' checkbox. :) Seems like a neat project for the codeathon. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
On Mon, Aug 24, 2009 at 1:16 AM, jida...@jidanni.org wrote: Why have each user jump through such hoops, and still leave this door open to "the bad guys", whoever they are. [snip] If you wish to have a productive discussion with people you'll be most successful if you try to understand and empathize with their concerns, so that you can find a solution which satisfies everyone. You won't go far with scare-quoted phrases like "the bad guys" and hyperbole like "held for ransom" and "North Korean style". The current behaviour was established as the result of experience: It's not something that was done speculatively, but as a solution to real problems which were occurring. Removing messages from archives was found to be time-consuming and ineffective, because once a message was out, the removal often did nothing. The annoyance of dealing with it was magnified because it had to be done by someone with shell access and because it was, naturally, always urgent. People make mistakes, both the "clicked the wrong button" type and the "failed to consider the consequences" type, and people often play fast and loose with other people's privacy. As an example— an issue we've had in the past is people responding with private details to a message which included a public list buried in its carbon-copy chain. So admonishing "be more careful" really doesn't solve it: The lack of google indexing is intended to address the cases where "be careful" failed. The intent isn't to stop people from searching for information in the lists, which would be an impossible goal, but to prevent material from the lists from showing up at the top of google when people perform random searches for various people's names and to make removals actually effective. So the availability of archive files is not a problem. Perhaps this is more of a problem for the Wikimedia lists than many others due to the high search placement of the Wiki(p|m)edia sites in general.
I think the comparison to LKML is entirely inappropriate: not only can you make an entirely different set of assumptions about the users' technical prowess, but LKML is open for posting to non-subscribers … the level of SPAM received through it in the past has exceeded the volume of some of our lists. It's like arguing that we shouldn't wear underwear because the nice folks at the nudist colony don't either. :) Different culture, different issues, different solutions. Other people do have the same problems and concerns— though obviously you're less likely to see them if they aren't indexed by google! Being able to keep your messages out of the search indexes while remaining open to anyone who is willing to click a few buttons is a primary attraction of the yahoo-groups service. Be thankful that we don't force you through an infuriating web interface like they do. I think everyone would like better search than we currently have available. It should be possible to provide a solid search interface without increasing the level of exposure. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
--- On Sun, 8/23/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote: If they can run commands on the command line, then they can use environment variables. If they can't, then your suggestion doesn't help. If there are administrators who can execute command lines, but cannot set environmental variables (e.g., they are confined to use a special shell) There aren't. That would make no sense. Thanks for clarifying the situation. Given this information I suggest changing all code in command line utilities of the form: $IP = getenv( 'MW_INSTALL_PATH' ); if ( $IP === false ) { $IP = dirname(__FILE__).'/../..'; } to: $IP = getenv( 'MW_INSTALL_PATH' ); if ( $IP === false ) { echo "Error. The environmental variable MW_INSTALL_PATH must be set to the root of the MW distribution. Exiting.\n"; die(); } This would eliminate file-position-dependent code from the command line utilities, making them easier to maintain (i.e., they can be moved in the distribution without breaking them). Dan ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
On Mon, Aug 24, 2009 at 6:44 AM, Gregory Maxwell gmaxw...@gmail.com wrote: [full message quoted above; snipped] I think everyone would like better search than we currently have available. It should be possible to provide a solid search interface without increasing the level of exposure.
I'd like to echo the last point. I'd certainly like to see a decent search function for the mailing lists. (Though given the number of sites that already archive some of our mailing lists, even opening them to Google doesn't seem likely to increase exposure by all that much.) How difficult would it be for someone to set up Lucene (or similar) to go through the collective mailing list archives and provide some form of centralized search interface? If someone is feeling really ambitious, one might even look at replacing the pipermail archive with something more stable (links can break when the index gets rebuilt) and easier to manage with respect to things like removing private info. (There might even be workable alternatives already in existence somewhere.) We've been using what appears to be a more or less generic Mailman install for ages. Seems like a good target for improvements.
-Robert Rohde ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
--- On Mon, 8/24/09, Chad innocentkil...@gmail.com wrote: Why skip trying to find the location? If MW_INSTALL_PATH is already missing, what have we got to lose from trying to guess the location? The vast majority of people don't screw with the default structure, so it should be just fine. That's a reasonable question, stating in another way the useful maxim, "if it ain't broke, don't fix it." The problem is I think it's broke. Here is my take on the pros/cons of leaving things unchanged: Pros: * Some administrators are used to simply typing the line "php utility.php". Making them type: MW_INSTALL_PATH=/var/wiki/mediawiki php utility.php would be inconvenient. In answer to this, for the MW installations running on unix, it is pretty simple to define an alias such as alias mwphp='MW_INSTALL_PATH=/var/wiki/mediawiki php' and put the definition into .bash_profile (or the appropriate shell initialization script). This is a one-time effort, so the change isn't as onerous as it might seem. I assume there is a similar tactic available for Windows systems. Cons: * The use of file-position-dependent code is a problem during development and much less of a problem during installation and production (as you suggest). Right now there are ~400 sub-directories in the extensions directory. It seems to me a reorganization of the extensions directory would help in understanding the relationship between individual extensions and the core. For example, having two subdirectories, one for CLI utilities and another for hook-based extensions, would clarify the role each extension plays. However, currently there are 29 extensions where $IP is set using the relative position of the file in the MW directory structure (a couple of other extensions set $IP based on MW_INSTALL_PATH). Reorganizing the directory structure has the potential of breaking them. * CLI utilities are moved around for reasons other than a reorganization of the extensions directory.
For example, as I understand it, DumpHTML was moved from maintenance/ to extensions/. dumpHTML.php sets $IP based on its relative position in the distribution tree. It was a happy coincidence that when it was moved, its relative position didn't change. However, it is unreasonable to think such reclassifications will always be as fortunate. Since the cons outweigh the pros, I remain convinced that the change I suggested (using die()) improves the code. Dan ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
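For readers less familiar with per-command environment variables, here is a small shell sketch of the two styles contrasted above. The /var/wiki/mediawiki path is only an example, and `sh -c 'echo …'` stands in for a real `php maintenance/update.php` invocation; note also that an alias cannot carry the assignment in its *name*, only in its body.

```shell
# Style 1: one-off — the variable is set only for this single command.
MW_INSTALL_PATH=/var/wiki/mediawiki sh -c 'echo "MW root: $MW_INSTALL_PATH"'

# Style 2: persistent — export once (e.g. from ~/.bash_profile) and every
# later command inherits it.
export MW_INSTALL_PATH=/var/wiki/mediawiki
sh -c 'echo "MW root: $MW_INSTALL_PATH"'

# A working alias wraps the whole prefix in its body; something like
# "alias MW_INSTALL_PATH=... php" is not valid alias syntax.
alias mwphp='MW_INSTALL_PATH=/var/wiki/mediawiki php'
```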
Re: [Wikitech-l] A potential land mine
2009/8/24 dan nessett dness...@yahoo.com: Pros: * Some administrators are used to simply typing the line php utility.php. Making them type: MW_INSTALL_PATH=/var/wiki/mediawiki php utility.php would be inconvenient. Here's a question: These utilities are php. Is there any reason in principle that they couldn't be set up to be accessed from a MediaWiki control panel page by WikiSysop? - d. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
Then we shall never agree. I believe it's pretty much accepted that the approach of "use the environment variable if it's available, guess where we think it is when it's not" is good. Of course, the env variable is best: that's why it was added. What I fail to understand is why trying to guess the path when the variable isn't around is such a bad idea? In my mind, the logic should be: 1) try the variable, 2) try to guess the path if we don't know #1, and 3) fail. You seem to want to take out step 2, which makes zero sense to me. Most installs never touch the default directory structure, and should be able to fall back to that just fine. -Chad On Aug 24, 2009 1:30 PM, dan nessett dness...@yahoo.com wrote: [full message quoted earlier in the thread; snipped] ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
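As a sketch (not the actual MediaWiki source), the three-step logic described above would look roughly like this; the function name and the pluggable "does this look like a wiki root?" probe are hypothetical, introduced so the resolution order is easy to see and test in isolation.

```php
<?php
// Sketch of "1) try the variable, 2) guess, 3) fail". $exists is a
// callback standing in for a check such as
// file_exists( "$IP/includes/Defines.php" ).
function resolveInstallPath( $envValue, $scriptDir, $exists ) {
	// 1) Try the environment variable (getenv() returns false when unset).
	$IP = $envValue;
	// 2) Fall back to guessing from the script's position in the tree.
	if ( $IP === false ) {
		$IP = $scriptDir . '/../..';
	}
	// 3) Fail only when neither candidate looks like a MediaWiki root;
	//    the caller prints an error naming MW_INSTALL_PATH and exits.
	if ( !call_user_func( $exists, $IP ) ) {
		return null;
	}
	return $IP;
}
```

A caller would invoke it as `resolveInstallPath( getenv( 'MW_INSTALL_PATH' ), dirname( __FILE__ ), ... )` and `exit( 1 )` with a message when it returns null, so step 3 still gives the explicit error Dan wants while keeping the convenient guess in step 2.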
Re: [Wikitech-l] A potential land mine
Most require far too much execution time to be done over HTTP; you hit timeouts too easily. -Chad On Aug 24, 2009 1:37 PM, David Gerard dger...@gmail.com wrote: 2009/8/24 dan nessett dness...@yahoo.com: Pros: * Some administrators are used to simply typing the line php utility.php. Making them t... Here's a question: These utilities are php. Is there any reason in principle that they couldn't be set up to be accessed from a MediaWiki control panel page by WikiSysop? - d. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
2009/8/24 Chad innocentkil...@gmail.com: Most require far too much execution time to be done over HTTP, you hit timeouts too easily. Huh, that's annoying. Would a progress ticker be sufficient to keep the connection alive and the admin interested? (I'm putting this forward as an inchoate feature request not quite up to Bugzilla yet :-) ) - d. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
On Mon, Aug 24, 2009 at 1:52 PM, David Gerard dger...@gmail.com wrote: 2009/8/24 Chad innocentkil...@gmail.com: Most require far too much execution time to be done over HTTP, you hit timeouts too easily. Huh, that's annoying. Would a progress ticker be sufficient to keep the connection alive and the admin interested? (I'm putting this forward as an inchoate feature request not quite up to Bugzilla yet :-) ) - d. There are some extensions that make a stab at it (Maintenance and MaintenanceShell), but I'm not sure what they do to try and overcome the timeout issue. I'm sure it's doable, and it's probably more doable now with the new Maintenance class, rather than trying to shell out to other scripts like we used to :) -Chad ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
dan nessett wrote: [full message quoted earlier in the thread; snipped] Since the cons outweigh the pros, I remain convinced that the change I suggested (using die()) improves the code. Except that the cons are mostly hypothetical. I don't believe anyone except you has actually proposed restructuring the extensions directory. But even if we do, it still wouldn't affect most users because A) There aren't that many extensions that add command line utilities (several extensions include both scripts and hook-based code, so they wouldn't fit neatly into such categories) and B) Most users don't check out the entire extensions directory, just the few extensions they're actually going to use, so they can continue to put them all in the same directory. It may be, technically, a slight improvement to code quality, but it's arguably a degradation to the end-user experience. It makes things slightly easier for developers in the event of a hypothetical change (though time-wise, it's really only a benefit after the second such change) but makes it so things that have just worked for years for users no longer work without changing settings or adding extra command line parameters. -- Alex (wikipedia:en:User:Mr.Z-man) ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] SSL certificates for Wikimedia sites
On Mon, Aug 24, 2009 at 8:50 AM, Mark Clements (HappyDog)gm...@kennel17.co.uk wrote: Seriously, unless you are intentionally trying to scare people away from the site, then this should be fixed. wikitech is mainly intended for Wikimedia tech staff, not the general public, so I assume that they don't care very much if the general public is scared away. Anyone who can use the site usefully presumably knows enough about HTTPS to understand that they can safely ignore the warning. ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
David Gerard wrote: 2009/8/24 Chad innocentkil...@gmail.com: Most require far too much execution time to be done over HTTP, you hit timeouts too easily. Huh, that's annoying. Would a progress ticker be sufficient to keep the connection alive and the admin interested? (I'm putting this forward as an inchoate feature request not quite up to Bugzilla yet :-) ) Actually SMW has something like this. It has a Special:SMWAdmin page used to set up and upgrade the DB tables, and refresh the data. I believe for the data refresh it splits it into chunks and puts it in the job queue. -- Alex (wikipedia:en:User:Mr.Z-man) ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] SSL certificates for Wikimedia sites
On 8/24/09 3:04 PM, Aryeh Gregor wrote: On Mon, Aug 24, 2009 at 8:50 AM, Mark Clements (HappyDog)gm...@kennel17.co.uk wrote: Seriously, unless you are intentionally trying to scare people away from the site, then this should be fixed. wikitech is mainly intended for Wikimedia tech staff, not the general public, so I assume that they don't care very much if the general public is scared away. Anyone who can use the site usefully presumably knows enough about HTTPS to understand that they can safely ignore the warning. Pretty much, yeah. :) We put real certs on public-facing sites, but just haven't bothered with what is essentially our tech department intranet. (But since we're crazy people it's open if you want to look at it!) -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
On 8/24/09 2:09 PM, Robert Rohde wrote: On Mon, Aug 24, 2009 at 6:44 AM, Gregory Maxwellgmaxw...@gmail.com wrote: I think everyone would like better search than we currently have available. It should be possible to provide a solid search interface without increasing the level of exposure. I'd like to echo the last point. I'd certainly like to see a decent search function for the mailing lists. (Though given the number of sites that already archive some of our mailing lists, even opening them to Google doesn't seem likely to increase exposure by all that much.) How difficult would it be for someone to set up Lucene (or similar) to go through the collective mailing list archives and provide some form of centralized search interface? It's not hard in theory, just needs an interested party and some elbow grease to replace the horror that is pipermail. :) The existing solutions we've tried (htdig integration) have not been very stable, hence the status quo. In the meantime, the wide availability of searchable archive copies through multiple third-party list aggregation services means most of our lists are *already* searchable via Google etc, so we're not missing much functionality. -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
On 8/24/09 2:40 PM, Chad wrote: Most require far too much execution time to be done over HTTP, you hit timeouts too easily. Yup. I'd love to see a good infrastructure for breaking up a lot of these things into smaller chunks that could be queued up and run more cleanly through a control panel. Nice project for someone? :) -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
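The chunked approach Brion sketches could look roughly like this; the helper below is hypothetical (not an existing MediaWiki class), but it shows the shape: each web request processes a bounded slice and returns a resume point, so no single request outlives the HTTP timeout.

```php
<?php
// Hypothetical sketch: process at most $chunkSize items per request and
// report where the next request (or queued job) should resume, instead of
// doing all the work in one long-running pass.
function processChunk( array $ids, $offset, $chunkSize ) {
	$batch = array_slice( $ids, $offset, $chunkSize );
	foreach ( $batch as $id ) {
		// ... per-item maintenance work would go here ...
	}
	$next = $offset + $chunkSize;
	// Return the next offset, or null when everything has been processed.
	return $next < count( $ids ) ? $next : null;
}
```

A control-panel page (or a job-queue worker, as SMW reportedly does for its data refresh) would call this repeatedly, persisting the returned offset between requests until it comes back null.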
Re: [Wikitech-l] SSL certificates for Wikimedia sites
On 8/24/09 3:38 PM, Lane, Ryan wrote: [snip] Wouldn't it be safer, and more convenient, to have internal sites use an internally created CA instead of self-signed certificates? Safer, but less convenient, as it would take us a few extra minutes to set up, which we might as well spend on buying an $8 public-friendly cert. ;) -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] SSL certificates for Wikimedia sites
Pretty much, yeah. :) We put real certs on public-facing sites, but just haven't bothered with what is essentially our tech department intranet. (But since we're crazy people it's open if you want to look at it!) Wouldn't it be safer, and more convenient, to have internal sites use an internally created CA instead of self-signed certificates? At least then users would simply have to trust the CA once and not get the warning on other, or future, internal sites. V/r, Ryan Lane ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] A potential land mine
On 8/23/09 9:48 PM, dan nessett wrote: --- On Sun, 8/23/09, Andrew Garrett agarr...@wikimedia.org wrote: $ MW_INSTALL_PATH=/var/wiki/mediawiki php maintenance/update.php I don't understand the point you are making. If an MW administrator can set environmental variables, then, of course, what you suggest works. However, Brion mentions in his Tues, Aug 11, 10:09 email that not every MW installation admin can set environmental variables and Aryeh states in his Tues, Aug. 11, 10:09am message that some MW administrators only have FTP access to the installations they manage.
For a little background -- there are basically two ways to get at 'mediawiki stuff':
1) Through the web (PHP run via web server) -- this is the main MediaWiki user interface.
2) On the command line (PHP run from a login shell) -- the various maintenance scripts.
Everybody can do the web, but configuration may be limited:
* environment variables can be set in web server configuration, but this might not be available in a limited shared hosting environment
* environment variables can be set with 'putenv' within a PHP script... but this may be disabled in some shared hosting environments.
Some folks also have no command-line shell access to their server, in which case they can't run any of the maintenance scripts -- so have no place to set an environment variable there either. -- brion ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
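For the first of Brion's two configuration routes, the web-server-side setting would be a one-liner like the following; this assumes Apache with mod_env, and the path is only an example:

```apache
# In httpd.conf, a <VirtualHost>, or .htaccess (requires mod_env);
# the PHP scripts then see the value via getenv( 'MW_INSTALL_PATH' ).
SetEnv MW_INSTALL_PATH /var/wiki/mediawiki
```

The second route, when web-server configuration is out of reach but PHP itself is not restricted, is a `putenv( 'MW_INSTALL_PATH=/var/wiki/mediawiki' );` call early in the entry script, subject to the shared-hosting caveat Brion notes.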
Re: [Wikitech-l] A potential land mine
--- On Mon, 8/24/09, Alex mrzmanw...@gmail.com wrote:
> I don't believe anyone except you has actually proposed restructuring the extensions directory.

Perhaps not. But I don't see why that is relevant. I am making arguments for why the extensions directory should be restructured. I may convince no one, but I don't think I should presume that.

> A) There aren't that many extensions that add command line utilities (several extensions also have scripts and hook based extensions so wouldn't neatly fit into such categories)

Here are the files in /extensions/ that reference /maintenance/commandLine.inc. There are 65 of them (the line number of the reference is at the end). I don't know which of these are commonly used and therefore included in installed extensions/ directories, but I assume all of them are used by at least a small number of sites (otherwise, why include them in the extensions directory at all?)

/extensions/AbuseFilter/install.php:8
/extensions/AbuseFilter/phpTest.php:8
/extensions/AdvancedSearch/populateCategorySearch.php:9
/extensions/AntiSpoof/batchAntiSpoof.php:6
/extensions/AntiSpoof/generateEquivset.php:4
/extensions/Babel/txt2cdb.php:9
/extensions/BoardVote/voterList.php:6
/extensions/CentralAuth/migratePass0.php:8
/extensions/CentralAuth/migratePass1.php:8
/extensions/CentralAuth/migrateStewards.php:3
/extensions/CentralNotice/rebuildLocalTemplates.php:3
/extensions/CentralNotice/rebuildTemplates.php:3
/extensions/CheckUser/importLog.php:4
/extensions/CheckUser/install.php:8
/extensions/cldr/rebuild.php:11
/extensions/CodeReview/svnImport.php:6
/extensions/CommunityVoice/CLI/Initialize.php:4
/extensions/Configure/findSettings.php:18
/extensions/Configure/manage.php:19
/extensions/Configure/migrateFiles.php:17
/extensions/Configure/migrateToDB.php:16
/extensions/Configure/writePHP.php:18
/extensions/DataCenter/CLI/Import.php:4
/extensions/DataCenter/CLI/Initialize.php:4
/extensions/DumpHTML/dumpHTML.php:61
/extensions/DumpHTML/wm-scripts/old/filterNamespaces.php:4
/extensions/DumpHTML/wm-scripts/queueController.php:6
/extensions/FlaggedRevs/maintenance/clearCachedText.php:13
/extensions/FlaggedRevs/maintenance/reviewAllPages.php:8
/extensions/FlaggedRevs/maintenance/updateAutoPromote.php:8
/extensions/FlaggedRevs/maintenance/updateLinks.php:10
/extensions/FlaggedRevs/maintenance/updateQueryCache.php:8
/extensions/FlaggedRevs/maintenance/updateStats.php:8
/extensions/LiquidThreads/compat/generateCompatibilityLocalisation.php:6
/extensions/LiquidThreads/import/import-parsed-discussions.php:4
/extensions/LiquidThreads/migrateDatabase.php:7
/extensions/LocalisationUpdate/update.php:7
/extensions/MetavidWiki/maintenance/download_from_archive_org.php:4
/extensions/MetavidWiki/maintenance/maintenance_util.inc.php:15
/extensions/MetavidWiki/maintenance/metavid2mvWiki.inc.php:16
/extensions/MetavidWiki/maintenance/metavid_gov_templates.php:2
/extensions/MetavidWiki/maintenance/mv_oneTime_fixes.php:2
/extensions/MetavidWiki/maintenance/mv_update.php:6
/extensions/MetavidWiki/maintenance/ogg_thumb_insert.php:15
/extensions/MetavidWiki/maintenance/scrape_and_insert.inc.php:12
/extensions/MetavidWiki/maintenance/transcode_to_flv.php:13
/extensions/MetavidWiki/maintenance/video_ocr_thumb_insert.php:15
/extensions/OAI/oaiUpdate.php:17
/extensions/ParserFunctions/testExpr.php:4
/extensions/SecurePoll/voterList.php:11
/extensions/SemanticMediaWiki/maintenance/SMW_conceptCache.php:18
/extensions/SemanticMediaWiki/maintenance/SMW_dumpRDF.php:34
/extensions/SemanticMediaWiki/maintenance/SMW_refreshData.php:41
/extensions/SemanticMediaWiki/maintenance/SMW_setup.php:46
/extensions/SemanticMediaWiki/maintenance/SMW_unifyProperties.php:27
/extensions/SemanticResultFormats/Ploticus/SRF_Ploticus_cleanCache.php:24
/extensions/SemanticTasks/ST_CheckForReminders.php:6
/extensions/SpamBlacklist/cleanup.php:9
/extensions/SwarmExport/swarmExport.php:23
/extensions/TitleKey/rebuildTitleKeys.php:3
/extensions/TorBlock/loadExitNodes.php:7
/extensions/TrustedXFF/generate.php:8
/extensions/UsabilityInitiative/PrefStats/populatePrefStats.php:9
/extensions/WikiAtHome/internalCmdLineEncoder.php:6
/extensions/WikiTrust/sql/create_db.php:74

Dan

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
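[For anyone wanting to reproduce or refresh a list like this, a grep one-liner from the MediaWiki root would do it. This is a guess at how such a list could be generated, not necessarily how Dan produced his.]

```shell
# From the root of a MediaWiki checkout: list every PHP file under
# extensions/ that mentions commandLine.inc, as path:line-number.
grep -rn 'commandLine\.inc' extensions/ --include='*.php' | cut -d: -f1,2
```

Piping the output through `cut -d/ -f2 | sort -u | wc -l` would then give the count of distinct extensions involved.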
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
In the meantime, the wide availability of searchable archive copies through multiple third-party list aggregation services means most of our lists are *already* searchable via Google etc., so we're not missing much functionality. -- brion

The key exception of course being our private mailing lists like stewards-l and in particular Checkuser-l, which desperately needs to be searchable. -Mike
Re: [Wikitech-l] RFC on API for Html::input() and related functions
On Mon, Aug 24, 2009 at 11:24 AM, Tim Starling tstarl...@wikimedia.org wrote:
> I'm afraid your thinking on this is going in precisely the opposite direction to mine. I think long lists of formal parameters make code almost impossible to read and nearly as difficult to write. It might take slightly longer to type Html::img( array( 'src' => $src, 'alt' => $alt ) ); but at least you can read that, and write it, without having to look up the parameter order in the documentation, and without having to memorise the parameter orders for large numbers of similar functions. Unless a programmer uses a function regularly, recalling what order the parameters should be in will be a difficult task, requiring several seconds if it's possible at all. . . . It would be good if we could have some sort of consensus on this so that we don't end up converting each others' code.

I agree with pretty much everything you've said. I only wish PHP supported named parameters like Python, since PHP's array syntax is pretty ugly. I'll see about removing Html::input(), at least. I'm not sure yet whether HTMLForm is a suitable replacement, since it has essentially no comments and I haven't yet done enough experimentation to figure out how it works . . .

Before being gradually replaced by this starting in MW 1.14:

function link( $target, $text = null, $customAttribs = array(), $query = array(), $options = array() )

which is an unfortunate example of bad design from the outset, since there are 5 formal parameters, 4 of them optional, and a number of callers only override $options. Consider this:

$sk->link( $target, null, array(), array(), array( 'known' ) );

versus this:

$sk->link( $target, array( 'known' => true ) );

and tell me which one is easier to understand and type. To be fair, $sk->knownLink( $target ) is easier to type than either and no harder to understand, and that's what's used now for this *specific* case.
I agree that condensing the options is a good idea, though -- it's rare that any caller uses all five parameters, but many use at least two or three, and often not the first two or three. Something like

$sk->link( $this->mTitle, array( 'text' => wfMsgHtml( 'markaspatrolledtext' ), 'query' => array( 'action' => 'markpatrolled', 'rcid' => $rcid ), 'known', 'noclasses' ) )

isn't very pretty, but it's certainly much more readable. I can't remember the order of the parameters myself, and I wrote them. It's actually pretty easy in PHP to totally switch over the kind of parameters that a function takes. In the case of link(), we could collapse the last four arguments into an associative array, and use func_get_args() to fall back to the old way if the second argument is a non-array.
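[The func_get_args() fallback described above might look something like this. A minimal sketch: makeLink() and its output are hypothetical stand-ins, not the real Linker::link().]

```php
<?php
// Collapse trailing positional parameters into one associative options
// array, using func_get_args() to keep old-style calls working.
function makeLink( $target /* , $optionsOrLegacyText, ... */ ) {
	$args = func_get_args();
	if ( isset( $args[1] ) && is_array( $args[1] ) ) {
		// New style: makeLink( $target, array( 'text' => ..., 'known' => true ) )
		$options = $args[1];
	} else {
		// Old style: makeLink( $target, $text, $customAttribs, $query, $flags )
		$options = array(
			'text'          => isset( $args[1] ) ? $args[1] : null,
			'customAttribs' => isset( $args[2] ) ? $args[2] : array(),
			'query'         => isset( $args[3] ) ? $args[3] : array(),
			'flags'         => isset( $args[4] ) ? $args[4] : array(),
		);
	}
	$text = isset( $options['text'] ) ? $options['text'] : $target;
	return "<a href=\"/wiki/$target\">$text</a>";
}
```

Both calling styles keep working during migration, and callers who only want one option no longer have to spell out the defaults for everything before it.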
Re: [Wikitech-l] A potential land mine
dan nessett wrote:
>> I don't believe anyone except you has actually proposed restructuring the extensions directory.
> Perhaps not. But I don't see why that is relevant. I am making arguments for why the extensions directory should be restructured. I may convince no one, but I don't think I should presume that.

Most of your argument in favor of changing the method for determining the include path seems to revolve around the assumption that we're going to rearrange the directory at some point, possibly multiple times. But if nobody else wants to do that, then it's just academic, and even then, it assumes that end-users will also structure their own extensions directories in the same way.

>> A) There aren't that many extensions that add command line utilities (several extensions also have scripts and hook based extensions so wouldn't neatly fit into such categories)
> Here are the files in /extensions/ that reference /maintenance/commandLine.inc. There are 65 of them (the line number of the reference is at the end). I don't know which of these are commonly used and therefore included in installed extensions/ directories, but I assume all of them are used by at least a small number of sites (otherwise, why include them in the extensions directory at all?)
> [snip -- the 65-file list quoted in full in the parent message]

Of those 65 files, they appear to be in ~30 extensions (of around 400 total) and, as far as I can tell, only 2 are CLI-only extensions (SwarmExport and DumpHTML).

-- Alex (wikipedia:en:User:Mr.Z-man)
Re: [Wikitech-l] A potential land mine
2009/8/24 David Gerard dger...@gmail.com:
>> 2009/8/24 Brion Vibber br...@wikimedia.org:
>>> Yup. I'd love to see a good infrastructure for breaking up a lot of these things into smaller chunks that could be queued up and run more cleanly through a control panel. Nice project for someone? :)
>> Sounds tempting! (if anyone else does it first good luck to them ;-) ) What sort of environment should be presumed? i.e., how locked down is a locked down server in this context? This sort of thing will make MediaWiki so much nicer in quite a lot of ways.

I should point out that I asked from personal annoyance, i.e. never quite managing to figure out the requisite environment variables from the command line myself. And it's an obvious nice interface idea. Even if making it technically feasible is more than a little fiddly.

- d.
Re: [Wikitech-l] SSL certificates for Wikimedia sites
Brion Vibber br...@wikimedia.org wrote:
> Pretty much, yeah. :) We put real certs on public-facing sites, but just haven't bothered with what is essentially our tech department intranet. (But since we're crazy people it's open if you want to look at it!)
>> Wouldn't it be safer, and more convenient, to have internal sites use an internally created CA instead of self-signed certificates?
> Safer, but less convenient, as it would take us a few extra minutes to set up, which we might as well spend on buying an $8 public-friendly cert. ;)

Does this mean that if I make an earmarked donation we could close this thread? :-)

Tim
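[For the record, the internal-CA route is only a few openssl invocations. A sketch; the subject names below are placeholders, not Wikimedia's actual setup.]

```shell
# 1. Create the internal CA: a private key plus a self-signed CA certificate.
openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
    -subj "/CN=Example Internal CA" -keyout ca.key -out ca.crt

# 2. Create a key and certificate signing request for the internal host.
openssl req -newkey rsa:2048 -nodes \
    -subj "/CN=wikitech.example.org" -keyout server.key -out server.csr

# 3. Sign the CSR with the CA. Browsers that import ca.crt once will then
#    trust server.crt (and any other cert the CA signs) without warnings.
openssl x509 -req -days 365 -in server.csr \
    -CA ca.crt -CAkey ca.key -CAcreateserial -out server.crt
```

The win over per-host self-signed certs is that staff import ca.crt a single time instead of clicking through a scary warning for every internal hostname.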
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
Thank you everybody for your comments. Pipermail is fine, if you would only let Google™ index it. Every mailing list in the world occasionally sees the accidental slip of the cut-and-paste finger, and the need for an administrator to remove the spilled beans, which he should then do. But there is no need to not let Google index it.

When we think search, we think Google. I hate proprietary software, and RMS http://jidanni.org/comp/index.html#rms is my idol, but when I think search, I think Google, and will not remember to use a special search for a special list. I end up reposting this thread every time I realize I can't find something again due to someone's arbitrary decision, so this time I Cc'd jwales to perhaps get a second arbitrary opinion.

Anyway, you are throwing the baby out with the bathwater, and as you mention the stuff is mostly in Google indirectly anyway, so why keep this crippleware concept of not letting it be indexed? Also there is no need to reinvent the wheel with a substitute search engine... OK to have it alongside Google, but don't block Google. There are a lot of tools, noindex, nofollow, just don't block entirely. OK, maybe you all operate on some higher logic.

P.S., if removing a message will cause renumbering, just leave a stub message. I have an idea: put a message on each subscription page, and post to all subscribers: "Starting 9.9.2009 all lists will once again be open to indexing in Google™. This means, due to MediaWiki.org ranking, anything you say can and will end up at the top of search engine results; you have been warned."

Tim, good job finding that. I in fact long ago gave up on searching for anything related to these mailing lists.
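[For completeness, the "tools" alluded to are one-liners. A sketch of a per-page robots directive that keeps an archive page out of search result listings while still letting crawlers follow its links; this is illustrative, not what Pipermail actually emits.]

```html
<!-- In the <head> of each archive page: exclude the page itself from
     the index, but still allow the crawler to follow outgoing links. -->
<meta name="robots" content="noindex, follow">
```

A blanket robots.txt Disallow, by contrast, blocks crawling entirely, which is the all-or-nothing behaviour being complained about here.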
Re: [Wikitech-l] A potential land mine
On 8/24/09 6:09 PM, David Gerard wrote:
>> 2009/8/24 Brion Vibber br...@wikimedia.org:
>>> On 8/24/09 2:40 PM, Chad wrote:
>>>> Most require far too much execution time to be done over HTTP, you hit timeouts too easily.
>>> Yup. I'd love to see a good infrastructure for breaking up a lot of these things into smaller chunks that could be queued up and run more cleanly through a control panel. Nice project for someone? :)
>> Sounds tempting! (if anyone else does it first good luck to them ;-) ) What sort of environment should be presumed? i.e., how locked down is a locked down server in this context?

Assume web-only; no shell, no guarantee that you can shell out to a command-line PHP.

-- brion
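[The web-only constraint usually pushes designs toward bounded batches with a continuation token: do a fixed amount of work per request, well under the timeout, and return where to resume. A minimal sketch with hypothetical names, not an actual MediaWiki API.]

```php
<?php
// One bounded batch of a long-running maintenance task. Each web request
// calls this with the offset returned by the previous request, so no
// single request ever exceeds the execution-time limit.
function runBatch( array $items, $offset, $batchSize ) {
	$end = min( $offset + $batchSize, count( $items ) );
	for ( $i = $offset; $i < $end; $i++ ) {
		// ... do one unit of work on $items[$i] ...
	}
	// Return the next offset, or null when everything is done.
	return $end < count( $items ) ? $end : null;
}
```

A control panel would then loop: fire a request, read back the offset, and fire the next, until null signals completion.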
Re: [Wikitech-l] A potential land mine
On 8/24/09 5:49 PM, Alex wrote:
> Most of your argument in favor of changing the method for determining the include path seems to revolve around the assumption that we're going to rearrange the directory at some point, possibly multiple times. But if nobody else wants to do that, then it's just academic, and even then, it assumes that end-users will also structure their own extensions directories in the same way.

Exactly; I'd appreciate it if we don't spend more time on this list on the subject. :)

-- brion
Re: [Wikitech-l] please make wikimedia.org mailing lists searchable
[snip]

As already noted in this thread and your several previous repetitions of it, all the public lists you're talking about are already searchable: a Google search for wikitech-l jidanni returns 3,010 results. I'm placing Jidanni under moderation on this list; further repetitions of this previously-answered question will not be let through.

-- brion