Re: [Wikitech-l] Wikitech-l Digest, Vol 90, Issue 48
Reply to message 7: It seems it can be used anywhere, but it does not support Chinese, at least. It does not support http://en.wikipedia.org/wiki/Chinese_input_methods_for_computers . Can this be fixed? HW

From: wikitech-l-requ...@lists.wikimedia.org
To: wikitech-l@lists.wikimedia.org
Date: 2011/1/23 (Sun) 9:40:33 AM
Subject: Wikitech-l Digest, Vol 90, Issue 48

Send Wikitech-l mailing list submissions to wikitech-l@lists.wikimedia.org. To subscribe or unsubscribe via the World Wide Web, visit https://lists.wikimedia.org/mailman/listinfo/wikitech-l or, via email, send a message with subject or body 'help' to wikitech-l-requ...@lists.wikimedia.org. You can reach the person managing the list at wikitech-l-ow...@lists.wikimedia.org. When replying, please edit your Subject line so it is more specific than "Re: Contents of Wikitech-l digest..."

Today's Topics:
1. Re: File licensing information support (Bryan Tong Minh)
2. Re: File licensing information support (Platonides)
3. Re: Announcing OpenStackManager extension (Platonides)
4. Re: File licensing information support (Krinkle)
5. Re: File licensing information support (Platonides)
6. Re: File licensing information support (Magnus Manske)
7. Re: WYSIFTW status (Magnus Manske)
8. Re: Farewell JSMin, Hello JavaScriptDistiller! (Maciej Jaros)

Message: 1
Date: Sat, 22 Jan 2011 20:15:10 +0100
From: Bryan Tong Minh bryan.tongm...@gmail.com
Subject: Re: [Wikitech-l] File licensing information support
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Message-ID: aanlktinn_n3km0s_ycymh5jysu47h8mtdaqzx9bxp...@mail.gmail.com
Content-Type: text/plain; charset=ISO-8859-1

On Fri, Jan 21, 2011 at 3:36 AM, Michael Dale md...@wikimedia.org wrote: On 01/20/2011 05:00 PM, Platonides wrote: I would have probably gone by the page_props route, passing the metadata from the wikitext to the tables via a parser function.
I would also say it's probably best to pass metadata from the wikitext to the tables via a parser function, similar to categories and all other user-edited metadata. This has the disadvantage that it's not "as easy" to edit via a structured API entry point, but has the advantage of working well with all the existing tools, templates and versioning.

This is actually the biggest decision that has been made; the rest is mostly implementation details. (Please note that I'm not presenting you with a fait accompli; it is of course still possible to change this.) Handling metadata separately from wikitext provides two main advantages: it is much more user friendly, and it allows us to properly validate and parse data.

Having a clear, separate input text field "Author:" is much more user friendly than {{#fileauthor:}}, which is, so to say, a kind of obscure MediaWiki jargon. I know that we could probably hide it behind a template, but that is still not as friendly as a separate field. I keep on hearing that, especially for newbies, a big blob of wikitext is plain scary. We regulars may be able to quickly parse the structure in {{Information}}, but for newbies this is certainly not so clear. We actually see that there is demand from the community for separating the metadata from the wikitext -- this is after all why they implemented the uselang= hacked upload form with a separate text box for every meta field.

Also, a separate field allows MediaWiki to understand what a certain input really means. {{#fileauthor:[[User:Bryan]]}} means nothing to MediaWiki or re-users, but "Author: Bryan___ [checkbox] This is a Commons username" can be parsed by MediaWiki to mean something. It also allows us to mass-change, for example, the author. If I want to change my attribution from "Bryan" to "Bryan Tong Minh", I would need to edit the wikitext of every single upload, whereas in the new system I go to Special:AuthorManager and change the attribution.
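The indirection described above (attribution resolved through a central author table rather than stored in each page's wikitext) can be sketched roughly as follows. This is a hedged illustration, not MediaWiki code; the `authors` and `files` mappings are invented stand-ins for the proposed database tables.

```python
# Minimal sketch of author-by-reference: file pages store an author *id*,
# so changing attribution is one update, not an edit to every upload.

authors = {4: "Bryan"}                       # author id -> display attribution
files = {"Example.jpg": 4, "Other.png": 4}   # file -> author id

def attribution(filename):
    """Resolve a file's attribution through the author table."""
    return authors[files[filename]]

# Renaming the author touches a single entry instead of every wikitext blob.
authors[4] = "Bryan Tong Minh"
```

With the wikitext approach, the same rename would require rewriting both file pages.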
Similar to categories, and all other user-edited metadata. Categories are a good example of why metadata does not belong in the wikitext: if you have ever tried renaming a category... you need to edit every page in the category and rename it in the wikitext. Commons is running multiple bots to handle category rename requests.

All these advantages outweigh the pain of migration (which could presumably be handled by bots), in my opinion. Best regards, Bryan

Message: 2
Date: Sat, 22 Jan 2011 21:04:05 +0100
From: Platonides platoni...@gmail.com
Subject: Re: [Wikitech-l] File licensing information support
To: wikitech-l@lists.wikimedia.org
Message-ID: ihfd31$buv$1...@dough.gmane.org
Content-Type: text/plain; charset=ISO-8859-1

An internally handled parser function doesn't conflict with showing it as a textbox. We could for instance store it as a hidden page prefix. Data stored in the text blob: Author: [[Author:Bryan]]
[Wikitech-l] Sentence-level editing / InlineEditor extension update
Hey all, I've been working on the InlineEditor extension again, primarily working on a new interface that doesn't use the different edit modes anymore, as the usability testing showed that this was not the right approach. Luckily, without a change in the underlying algorithms, it was doable to combine all the edit modes into one interface. This also resulted in the deletion of tens of files plus hundreds of lines of code [1], which usually is a good thing, because it shows it's a simpler and more natural approach. If you're interested in testing this on your own wiki or looking at the code, grab your copy from SVN. [2] If you're interested in testing this, a wiki has been set up here: http://janpaulposma.nl/sle/wiki As you might know, some more usability testing will be done soon by GRNET [3], so we can see if this interface works better or not. Feel free to ask questions and throw in suggestions, etc. Best regards, Jan Paul [1] http://www.mediawiki.org/w/index.php?path=%2Ftrunk%2Fextensions%2FInlineEditortitle=Special%3ACode%2FMediaWiki%2Fpath [2] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/InlineEditor/ [3] https://code.grnet.gr/projects/wikipedia-wysiwyg/wiki ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] 1.17 revisions to review by tag
Hi everyone, Here's a breakdown of the revisions left to review: http://www.mediawiki.org/wiki/MediaWiki_roadmap/1.17/Revision_report Current count of branches plus extensions: 283. There's a script to generate this (publishing source later; requires toolserver), so we should be able to maintain this list up until release. The main goal right now is to give us better visibility into the revisions that don't have reviewer tags on them, since that's hard to see otherwise. Rob
Re: [Wikitech-l] File licensing information support
* Magnus Manske magnusman...@googlemail.com [Sun, 23 Jan 2011 00:38:53 +]: On Sat, Jan 22, 2011 at 10:09 PM, Platonides platoni...@gmail.com wrote: Krinkle wrote:

So PHP would extract {{#author:4}} and {{#license:12}} from the text blob when showing the edit page, show the remaining wikitext in the textarea and the author/license as separate form elements, and upon saving, generate {{#author:4}} {{#license:12}}\n again and prepend it to the text blob. Double instances of these would be ignored (i.e. stripped automatically, since they're not re-inserted into the text blob upon saving). One small downside would be that if someone edited the textarea manually to do stuff with author and license, the next edit would re-arrange them, since they're extracted and re-inserted, thus showing messy diffs (not a major point as long as it's done independently of JavaScript, which it can be if done from core / PHP). If that's what you meant, I think it is an interesting concept that should not be ignored; however, personally I am not yet convinced this is the way to go. But when looking at the complete picture of up/down sides, this could be something to consider. -- Krinkle

That's an alternative approach. I was thinking of accepting them only at the beginning of the page, but extracting them from everywhere is also an alternative.

OK, my 2 cents: I would be in favour of extracting data from the {{Information}} template via the parser, but we talked about this over a year ago at the Paris meeting, and it was deemed too complicated (black caching magick etc.), and no one has stepped forward to do anything along those lines, so I guess it's dead and buried. Things like {{#author:4}} seem to be a nice hack to Get Things Done (TM). As was mentioned before, the temptation is great to expand it into a generic triplet storage a la Semantic MediaWiki, but that would probably complicate things to an extent where nothing gets done, again.
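The extract-and-prepend scheme Krinkle describes above can be sketched as follows. This is a rough Python illustration of the text-blob handling only (the real thing would be PHP in core); the function names and regex are my assumptions, not anyone's actual proposal code.

```python
import re

# Sketch of the scheme quoted above: pull {{#author:N}} / {{#license:N}} out
# of the text blob when building the edit form, and re-prepend canonical
# copies on save, so stray duplicates get stripped automatically.

META_RE = re.compile(r'\{\{#(author|license):(\d+)\}\}\n?')

def extract_meta(textblob):
    """Return (metadata dict, remaining wikitext). Duplicates collapse."""
    meta = {}
    for key, value in META_RE.findall(textblob):
        meta[key] = int(value)          # last instance wins; rest are dropped
    return meta, META_RE.sub('', textblob)

def reassemble(meta, wikitext):
    """Prepend normalized metadata lines to the saved wikitext."""
    header = ''.join('{{#%s:%d}}\n' % (k, v) for k, v in sorted(meta.items()))
    return header + wikitext
```

Note how the normalization in `reassemble` is exactly what produces the "messy diffs" downside mentioned above: hand-placed instances always migrate back to the top of the blob.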
But one thing comes to mind: if someone implements an abstraction layer ("4" to a specific author) anyway, it should be dead simple to use it for tags as well. Just allow multiple {{#tag}}s per page (as opposed to {{#author}}), done. The same code that will allow for editing author and license information centrally should make it possible to edit tag information, i18n for example, so the tag display could be in the current user language (with en fallback). Search for tags i18n-style could be possible as well, if the translation information is also encoded machine-readably, e.g. as language links ([[de:Pferd]] on the [[Tag:Horse]] page).

You are correct - triple definitions are always meant to be as generic as possible, something like categorizing or tagging. It is better to define them separately from templates and to include the references to their values in the template. That way it would not complicate the parsing too much. Although one might want fancy visual forms to edit these, and that probably brings caching issues? It might be too much to try to activate all of that in the first round, but IMHO the code should keep the use as tags in mind; it would be dreadful to waste such an opportunity. Dmitriy
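The tag-i18n idea above (language links on a tag page acting as a translation table) could be read off a page like this. A hedged sketch: the regex and the assumption that language codes are 2-3 lowercase letters are mine, not part of the proposal.

```python
import re

# Sketch: parse language links such as [[de:Pferd]] from a [[Tag:Horse]]
# page into a code -> localized-name map, so tag search can work across
# languages with en as the fallback.

LANGLINK_RE = re.compile(r'\[\[([a-z]{2,3}):([^\]]+)\]\]')

def tag_translations(tag_page_text):
    """Map language code -> localized tag name found on the tag page."""
    return {code: name for code, name in LANGLINK_RE.findall(tag_page_text)}
```

Ordinary links like [[User:Bryan]] are not matched, since namespaces start with an uppercase letter.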
Re: [Wikitech-l] MATH markup question
On Fri, Jan 21, 2011 at 7:08 AM, Aryeh Gregor simetrical+wikil...@gmail.com wrote: When I load their homepage, the formulas don't appear for about two seconds of 100% CPU usage, on Firefox 4b9. And that's for two small formulas.

Works OK in Safari. WebKit, perhaps?

I'm not impressed. IMO, the correct way forward is to work on native MathML support

I used to think that too. Then I looked at the examples on the wiki page on the issue. Although I find TeX rather opaque, a much worse issue is obscurity through verbosity, which not only makes the formula difficult to understand, but the entire source of the article too. That's why I don't use CITE either.
Re: [Wikitech-l] MATH markup question
Aryeh Gregor wrote: When I load their homepage, the formulas don't appear for about two seconds of 100% CPU usage, on Firefox 4b9. And that's for two small formulas. I'm not impressed. IMO, the correct way forward is to work on native MathML support -- Gecko and WebKit both support it these days, and Opera somewhat does too. I'm sure the support is a bit spotty, but if Wikipedia used it (even as an off-by-default option) that would surely drive a lot of progress. These days (with the deployment of HTML5 parsers) it can be embedded directly into HTML; it's not limited to XML.

Looking at http://www.mathjax.org/demos/tex-samples/ it may indeed take a couple of seconds to convert from TeX to the graphical view, but without 100% CPU usage or looking blocked. I'm not using 4b9 but 3.6.12, though. I see a similar result in Chromium. A disadvantage is that showing the formula needs to reposition the content, instead of reserving the space in advance.
Re: [Wikitech-l] Sentence-level editing / InlineEditor extension update
Jan Paul Posma jp.posma at gmail.com writes: Hey all, I've been working on the InlineEditor extension again [...] If you're interested in testing this, a wiki has been set up here: http://janpaulposma.nl/sle/wiki [...]

Hi Jan,

I will start testing with http://janpaulposma.nl/sle/wiki . The following will be covered in the initial test run:
1) Sentence-level editing
2) Browser compatibility

Thanks, Janesh Kodikara
Re: [Wikitech-l] Sentence-level editing / InlineEditor extension update
Jan Paul Posma wrote: Hey all, I've been working on the InlineEditor extension again [...] If you're interested in testing this, a wiki has been set up here: http://janpaulposma.nl/sle/wiki [...]

I completely missed the "You can edit the article below, by clicking on blue elements in the page." line. I only found it after thinking "this needs some kind of notice on how to edit", since it's not clear what to do to change the page.
Re: [Wikitech-l] Sentence-level editing / InlineEditor extension update
Jan,

Very cool. It took me a minute to figure out how to use it, but once I did I really liked it. I think the user should have some way to correct any incorrectly parsed sentences though. Doing this would help develop a better sentence boundary parser in the long run too. Nice work. -- Jeff

On Sun, Jan 23, 2011 at 11:35 AM, Platonides platoni...@gmail.com wrote: Jan Paul Posma wrote: Hey all, I've been working on the InlineEditor extension again [...] I completely missed the "You can edit the article below, by clicking on blue elements in the page." line. I only found it after thinking "this needs some kind of notice on how to edit", since it's not clear what to do to change the page.
[Wikitech-l] Saturday SUL breakage
Hello,

I have made a mistake Saturday evening (around 18:30 UTC) which broke some SUL-related functions. The issue was fixed by Apergos about an hour later while I was out of home. Here is the report:

I tried to create the Esperanto Wikisource (bug 26136 [1]) by following our guide on wikitech, [[Add a wiki]] [2]. I ran addwiki.php with the following invocation:

$ php addwiki.php eo wikisource eowikisource eo.wikisource.org

This complained about eowikisource not existing in wgLanguageName, most probably because the first two arguments are eaten by the Maintenance class or something like that. Re-checking our guide, it says: "You need to put in --wiki=aawiki after the addwiki.php and before the langcode for now. Script is wonky." I thought aawiki was a placeholder for the wiki database name and ran:

$ php addwiki.php --wiki eowikisource eo wikisource \ eowikisource eo.wikisource.org

That triggered a database error saying the database eowikisource does not exist, which is the intended result. I then edited all.dblist and pmtpa.dblist to manually add eowikisource and ran sync-dblist. At this point, SUL was broken but I did not know about it :-( Please note those additions are normally handled by the addwiki.php script.

Root cause: addition of a non-existent database name.
Impact: any script / function querying all databases and expecting them to exist without error checking.
How it got fixed: Apergos removed eowikisource from the dblists and synced the list.
How it could have been avoided: I should have reverted my change in the dblist; that would have fixed the issue. I should have stayed idling in IRC for some time after my change. I was not reachable since my contact details were a bit old; I have updated our local file with my mobile and home phone.
Real solution: run, as instructed in the guide, addwiki.php --wiki aawiki [...]

TODO:
- really create eowikisource
- investigate potential damage (e.g. bug 26877 [3])
- properly handle non-existing databases in functions / scripts
- fix addwiki.php in 1.6wmf4
- review the [[Add a wiki]] article

I hope this report clarifies the issue. I would like to thank Apergos, JeLuF and RobH for their kind words on IRC on Sunday morning; it is always appreciated when you are ashamed of such a stupid mistake.

[1] https://bugzilla.wikimedia.org/26136
[2] http://wikitech.wikimedia.org/view/Add_a_wiki
[3] https://bugzilla.wikimedia.org/26877

-- Ashar hashar Voultoiz
Re: [Wikitech-l] Saturday SUL breakage
Ashar Voultoiz wrote: Hello, I have made a mistake Saturday evening (around 18:30 UTC) which broke some SUL-related functions. The issue was fixed by Apergos about 1 hour later while I was out of home.

Don't worry more about it, Ashar. I think the need for --wiki aawiki is fixed in r80837 ($wgNoDBParam sets aawiki internally). sync-dblist should have verified that all the listed dbs existed (somewhere) before syncing them.
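The check Platonides suggests (refuse to sync a dblist naming a database that exists nowhere) amounts to something like the sketch below. This is a hedged illustration: `validate_dblist` and the `existing_dbs` set are invented stand-ins for a real query against the cluster, not the actual sync-dblist code.

```python
# Sketch of a pre-sync sanity check for dblist files: every entry must
# name a database that actually exists somewhere on the cluster.

def validate_dblist(dblist_lines, existing_dbs):
    """Return entries that do not exist; an empty list means safe to sync."""
    entries = [line.strip() for line in dblist_lines if line.strip()]
    return [db for db in entries if db not in existing_dbs]

# In the incident above, this would have flagged "eowikisource" before
# the broken list was synced out.
missing = validate_dblist(["enwiki", "eowikisource"], {"enwiki", "eowiki"})
```

Aborting the sync when `missing` is non-empty would have turned the SUL breakage into an immediate, local error message.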
Re: [Wikitech-l] Farewell JSMin, Hello JavaScriptDistiller!
On 1/22/11 5:40 PM, Maciej Jaros wrote: Just remember that people all over the world are hacking into MediaWiki all the time. Making it harder won't help a bit.

I think minification is orthogonal to the hacking question. I've said it before here, but the key to enabling hackers is to have a stable and documented API (both a remote web service, and an API for things one might want to do in JavaScript on the page). We know that people are going to hack and that we like that, but if we don't want to be trapped into supporting old APIs forever we should document what's likely to change and what we will aim to keep stable. -- Neil Kandalgaonkar (| ne...@wikimedia.org
Re: [Wikitech-l] Best way to track RELEASE-NOTES changes each svn update?
jida...@jidanni.org wrote in message news:87mxmwfie4.fsf...@jidanni.org... On Sun, Jan 16, 2011 at 7:16 PM, Happy-melon happy-me...@live.com wrote: Isn't that what release notes are for?

Say, how do you pros see what changed? Here is my extra stupid way. I do it every few weeks:

cp RELEASE-NOTES /tmp
svn update
diff --ignore-space-change -U0 /tmp RELEASE-NOTES

Often old lines are shown again, because somebody tidied their formatting. A svn diff wouldn't be any better. The worst thing is each 1.17 to 1.18, 1.18 to 1.19 change, when the whole file changes. So how do you folks track RELEASE-NOTES level changes (not source code level changes, too many), coinciding with your SVN updates?

One generally gets information out of a release notes file by reading it. The purpose of release notes is to be a list of changes between versions; if I want to upgrade from 1.15 to 1.17, I read the 1.16 and 1.17 release notes, and I know all the changes I should be expecting. Why would I want the 1.16 release notes to contain changes which do not affect 1.16? If you are determined to deploy bleeding-edge code on production sites, you will need to be a little more adventurous in getting hold of the latest changes. You can go to [1] and select the latest revision, and the last revision you checked out, and read the diff in a slightly prettier format. Pretty much by definition, there isn't a nicer way of reading them. --HM

[1] http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/RELEASE-NOTES
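The "old lines shown again because somebody tidied their formatting" complaint above can be worked around by normalizing whitespace before diffing, much as `diff --ignore-space-change` does. A hedged sketch in Python (the function name and approach are mine, not a tool anyone on the thread uses):

```python
import difflib

# Sketch: report lines added to RELEASE-NOTES between two snapshots,
# comparing with whitespace collapsed so reformatted old lines don't
# reappear as "changes".

def new_release_notes(old_text, new_text):
    """Lines added in new_text relative to old_text, ignoring spacing."""
    norm = lambda s: ' '.join(s.split())
    old = [norm(l) for l in old_text.splitlines()]
    new = new_text.splitlines()
    added = []
    matcher = difflib.SequenceMatcher(None, old, [norm(l) for l in new])
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag in ('insert', 'replace'):
            added.extend(new[j1:j2])    # report the un-normalized new lines
    return added
```

This does nothing for the wholesale 1.17-to-1.18 file swap, which really is a new file rather than a diffable change.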
Re: [Wikitech-l] 1.17 revisions to review by tag
Rob Lanphier robla at wikimedia.org writes: Current count of branches plus extensions: 283

Hi Rob, I quickly checked the count and it matches the actuals. Thanks, Janesh
Re: [Wikitech-l] File licensing information support
On 01/22/2011 08:15 PM, Bryan Tong Minh wrote: Having a clear separate input text field Author: is much more user friendly {{#fileauthor:}}, which is so to say, a type of obscure MediaWiki jargon.

I disagree. In real life, there are always more complicated cases, where an author is not an author, but two authors, or a sculptor, or one painter and one photographer. These things never fit in a single author field, and the same goes for any other separated fields. But the free-form Wikipedia can handle all real-world cases in plain human language. Various expert systems based on artificial intelligence have existed since the 1980s, but none of them produced a universal encyclopedia. Only the text-based Wikipedia did. After this humiliating fact, the same AI people (now dressed as semantic web scholars) come and claim that they too could have built Wikipedia, if only it were more structured. They are wrong, of course. Lack of structure is precisely what built Wikipedia. -- Lars Aronsson (l...@aronsson.se) Aronsson Datateknik - http://aronsson.se
Re: [Wikitech-l] File licensing information support
* Lars Aronsson l...@aronsson.se [Mon, 24 Jan 2011 07:06:02 +0100]: I disagree. In real life, there are always more complicated cases, where an author is not an author, but two authors or a sculptor, or one painter and one photographer. These things never fit in a single author field [...] Lack of structure is precisely what built Wikipedia.

One may have not just a single triple for that, but a list / set of triples for the same person in different roles (different kinds of author). There are two extremes - having no structure at all, and being overly structural. If there are a few extra fields for an image description, why not generalize it for all kinds of measured data - geographical, historical, population statistics, financial and economic data and so on? Why are only the images allowed to have structural and measurable data?

However, I don't think that Wikipedia should have AI, because it requires huge computing power, and the problem is that AI algorithms are not efficient enough. Having the data structured is not a bad thing. It probably should not even try to do SPARQL, but offer these things to external sites. Don't make complex queries; leave them for offline tools / bots or the toolserver. Semantic bots are a good idea - they might mine the data, finding the cross-sets. It should be even lighter than SMW. However, I might be wrong. Dmitriy