Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins

2012-08-30 Thread Antoine Musso
On 29/08/12 16:27, Chad wrote:
 Question: why does the config for non-extension tests attempt
 to load extensions? -Parser and -Misc both seem to be failing
 due to a broken inclusion of Wikibase.

The -Parser and -Misc jobs are triggered by both the MediaWiki core job
and the one testing the Wikidata branch.  I originally thought it was a
good idea to have a job dedicated to each PHPUnit group; instead, I will
end up creating a job dedicated to testing the Wikidata branch.

 Core tests should be run without any extensions.

Fully agree. We can later create a job to test core plus the extensions
deployed on the WMF cluster, and another one for a Semantic MediaWiki setup.

-- 
Antoine "hashar" Musso




Re: [Wikitech-l] 5 tips to get your code reviewed faster

2012-08-30 Thread Antoine Musso
On 30/08/12 00:05, Thomas Gries wrote:
 Something which needs so much text is broken in some respect.
 Mies van der Rohe: "Less is more."

Well, we probably nickname the community team "too long, did not read" for
a reason: their job is indeed to write full reports, which I myself find
enjoyable.  Tip for people in a hurry: only read the titles :-)

-- 
Antoine "hashar" Musso




Re: [Wikitech-l] 5 tips to get your code reviewed faster

2012-08-30 Thread Antoine Musso
On 29/08/12 23:55, Sumana Harihareswara wrote:
 1) Write small commits.

I can't stress enough how important this is. Git has several ways to split
a commit:
- git rebase --interactive <sha1 of parent commit>
- reset to master, then git cherry-pick --no-commit <sha1> and use git
add --patch to select the hunks to craft a new small commit (see the
sketch below).
  -- http://nuclearsquid.com/writings/git-add/
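
A minimal sketch of the second approach (<sha1> stands for the commit
being split; the commit message is a placeholder):

  git checkout master
  git cherry-pick --no-commit <sha1>  # apply the change without committing
  git reset                           # unstage everything, keep the files
  git add --patch                     # interactively stage the first hunks
  git commit -m 'First small commit'
  # repeat git add --patch / git commit until the working tree is clean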

<snip>
 3) Don't mix rebases with changes.
 
 When rebasing, only rebase. That makes it easier to use the Old Version
 History dropdown, which greatly quickens reviews. If non-rebase changes
 are made inside a rebase changeset, you have to read through a lot more
 code to find them, and it's non-obvious.

The way to go is usually to git rebase and send that, then make the new
change as a separate patchset. That makes reviewing ten times easier.

It is also a good practice to add a cover message on the new patchset to
explain what it changes compared to the previous one. For example:

  PS3: rebased on latest master
  PS4: fix the, now obsolete, function() call

Where "PS" is short for "patchset".

 4) Add reviewers.

If you do not know who could review your change, do ask in #mediawiki or
#wikimedia-dev. People with a long experience will be able to tell you
who might be able to help.

 5) Review more.

Asking a question about the code, commenting on anything (whitespace,
code you do not understand or don't like), giving a +1 / -1: none of that
is EVER going to harm anyone. I have seen reviews from newbies who did not
understand some part, and that part ended up being a bug.
Reading code is a great way to learn the framework, and asking questions
is even better :)

-- 
Antoine "hashar" Musso




Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins

2012-08-30 Thread Oren Bochman
I've tried to do this for the Translate extension last week - so here are a
couple of questions:
1. is successfully running the tests a requirement for a successful score in
gerrit? (i.e. how is gerrit integrated)
2. does the extension need to include PHPUnit?

On Thu, Aug 30, 2012 at 9:43 AM, Antoine Musso hashar+...@free.fr wrote:

 On 29/08/12 16:27, Chad wrote:
  Question: why does the config for non-extension tests attempt
  to load extensions? -Parser and -Misc both seem to be failing
  due to a broken inclusion of Wikibase.

 The -Parser and -Misc jobs are triggered by both the MediaWiki core job
 and the one testing the Wikidata branch.  I originally thought it was a
 good idea to have a job dedicated to each PHPUnit group; instead, I will
 end up creating a job dedicated to testing the Wikidata branch.

  Core tests should be run without any extensions.

 Fully agree. We can later create a job to test core plus the extensions
 deployed on the WMF cluster, and another one for a Semantic MediaWiki setup.

 --
 Antoine "hashar" Musso






-- 

Oren Bochman

Office tel. 061 4921492
Mobile +36 30 866 6706
skype id: orenbochman
e-mail: o...@romai-horizon.com
site http://www.riverport.hu

Re: [Wikitech-l] $( '<div>' ) vs. $( '<div />' ) in coding conventions (and methods for building DOMs with jQuery)

2012-08-30 Thread Daniel Werner
 In that case, perhaps we should just say that all of the options are fine:
 $( '<div>' )
 $( '<div/>' )
 $( '<div></div>' )
 but emphasize not to use attributes in the tag creation.


+1
All three will return the same thing, and this is not HTML, it's JavaScript.
It really is just a matter of personal taste in coding style, nothing else.

By the way, you can also use

$( '<div/>', { 'class': 'foo', 'title': 'myTitle', ... } );

for adding attributes right away. It should be faster, more readable and less
error-prone than writing the attributes directly in the tag string.

Cheers,
Daniel W


Re: [Wikitech-l] AJAX sharing of Commons data/metadata

2012-08-30 Thread Alex Brollo
2012/8/30 Brion Vibber br...@pobox.com



 Luckily, if you're using jQuery much of the low-level stuff can be taken
 care of for you. Something like this should work for API calls,
 automatically using the callback behind the scenes:


Thanks! I had really tested some inter-project AJAX API calls with no luck,
but I had never tried a similar syntax. Can I ask you for some more help by
email, if needed (and I guess it will be...)? Obviously I'll share any good
result in this thread.
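
For the record, here is the kind of call I understand Brion to mean (a
sketch only; the file title and parameters are just an example):

  $.getJSON( '//commons.wikimedia.org/w/api.php?callback=?', {
      format: 'json',
      action: 'query',
      titles: 'File:Example.jpg',
      prop: 'imageinfo',
      iiprop: 'url'
  }, function ( data ) {
      // jQuery fills in the callback parameter, so this runs
      // cross-domain via JSONP with no same-origin trouble
      console.log( data.query.pages );
  } );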

Unluckily, I found that the fields of the Information and Book templates are
clearly tagged with informative ids, while the Creator fields are not. I
posted a proposal there, but I know that editing complex templates is a
hard job.

Alex


[Wikitech-l] Please help MediaWiki I18n/L10n improve

2012-08-30 Thread Niklas Laxström
tl;dr: Please help in getting https://bugzilla.wikimedia.org/38638
resolved and ensure that your code doesn't cause additional issues for
translators.

At translatewiki.net we have an [ask question] button for translators,
and they have really used it to report various kinds of issues with
the messages they are translating. In fact, we barely have time to
sort those things into the correct places, let alone poke developers to
act on them. We are experimenting with different ways to improve this
process – the last experiment is using Semantic MediaWiki to track
open issues; see [1].

Basically the backlog is growing, and we need some help to reverse the
direction. We have tried filing bugs and poking developers on IRC and
via email, but that is not scaling anymore; moreover, bugs and pokes
are sometimes ignored [2].

There are many ways you can help:
1. Follow the best i18n practices like documenting new messages as
they are added [4]. Some of you may have noticed that we have started
to review any i18n and L10n changes more closely. This is not to annoy
you. This is to help you write code that can be translated better and
without too many questions from translators.
2. Have a look at [1] and act on issues.
3. Have a look at [1] and poke someone else to act on issues.
4. We need maintainers for [[Support]] [3] to sort stuff into the
correct places.
5. When committing code which has i18n changes, add Nikerabbit (me) or
Siebrand as reviewers.
6. Ideas on how to improve this process are welcome.

We think that we are all responsible for providing translators with the
information they need. We feel it's our (translatewiki.net staff's)
obligation to make sure translators can do their work without too many
distractions and wasted efforts.

[1] https://translatewiki.net/wiki/Support/Open_requests
[2] https://bugzilla.wikimedia.org/38638
[3] https://translatewiki.net/wiki/Support
[4] https://www.mediawiki.org/wiki/Localisation

-- 
Niklas Laxström
Siebrand Mazeland
Raimond Spekking
Federico Leva
Amir E. Aharoni


Re: [Wikitech-l] Disabling direct pushing to master for extensions

2012-08-30 Thread Chad
On Wed, Aug 29, 2012 at 10:12 PM, Ryan Lane rlan...@gmail.com wrote:
 For a bunch of reasons, we need to look at disabling direct pushing
 on the master branch for all extensions in Gerrit […] Before I make
 the change though, I wanted to ask about it publicly to make sure
 there's no major blockers to me doing so.

 E3 is using Phabricator for code-reviews, so this would complicate things 
 for us. We could, I guess, have a separate Gerrit / +2 step. I'm curious, 
 though: why is the scope for this proposal limited to extensions? Are other 
 types of projects exempt?


 I think most projects already have it disabled.


Indeed. Most extensions do as well.

-Chad


[Wikitech-l] quality at Google, 2006-2011

2012-08-30 Thread Chris McMahon
This crossed my desk this morning. It is a long, detailed (and honest!)
account by an insider of Google's efforts to increase code quality and
product quality.  I think it's relevant to what we're doing at WMF, and
to what we might do in the future.
http://mike-bland.com/2012/07/10/test-mercenaries.html


Re: [Wikitech-l] $( '<div>' ) vs. $( '<div />' ) in coding conventions (and methods for building DOMs with jQuery)

2012-08-30 Thread Trevor Parscal
On Thu, Aug 30, 2012 at 3:24 AM, Daniel Werner
daniel.wer...@wikimedia.de wrote:


 By the way, you can also use

 $( '<div/>', { 'class': 'foo', 'title': 'myTitle', ... } );


Just be aware that this also allows you to use keys like 'html' and 'text',
which are not attributes at all but call .html() or .text() internally.
There are also property aliases here.

In short: be aware of what the code does by reading the manual.

http://api.jquery.com/
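
A quick sketch of the difference (class name and text are illustrative):

  // 'text' is not an attribute: jQuery calls the .text() method instead
  $( '<div/>', { 'class': 'foo', text: 'Hello' } );

  // ...so the line above is equivalent to:
  $( '<div/>' ).addClass( 'foo' ).text( 'Hello' );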

- Trevor


Re: [Wikitech-l] Work offer inside

2012-08-30 Thread Akshay Agarwal
This is definitely a fraud message; I checked it on various scam-reporting sites.

On Thu, Aug 30, 2012 at 8:45 PM, bo...@wikimedia.org wrote:

 I would like to take this time to welcome you to our hiring process
 and give you a brief synopsis of the position's benefits and requirements.

 If you are taking a career break, are on a maternity leave,
 recently retired or simply looking for some part-time job, this position
 is for you.

 Occupation: Flexible schedule 2 to 8 hours per day. We can guarantee a
 minimum 20 hrs/week occupation
 Salary: Starting salary is 2000 GBP per month plus commission, paid every
 month.
 Business hours: 9:00 AM to 5:00 PM, MON-FRI, 9:00 AM to 1:00 PM SAT or
 part time (UK time).

 Region: United Kingdom.

 Please note that there are no startup fees or deposits to start working
 for us.

 To request an application form, schedule your interview and receive more
 information about this position
 please reply to to...@xpatjobsuk.com,with your personal identification
 number for this position IDNO: 3985






Re: [Wikitech-l] Work offer inside

2012-08-30 Thread Daniel Friesen

This brings up a question: why does wikimedia.org not have an SPF record?

We should be rejecting wikimedia.org emails that we know do not come from
Wikimedia.


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]

On Thu, 30 Aug 2012 10:19:52 -0700, Akshay Agarwal  
akshay.leadin...@gmail.com wrote:



This is definitely a fraud message; I checked it on various scam-reporting sites.

On Thu, Aug 30, 2012 at 8:45 PM, bo...@wikimedia.org wrote:


I would like to take this time to welcome you to our hiring process
and give you a brief synopsis of the position's benefits and  
requirements.


If you are taking a career break, are on a maternity leave,
recently retired or simply looking for some part-time job, this position
is for you.

Occupation: Flexible schedule 2 to 8 hours per day. We can guarantee a
minimum 20 hrs/week occupation
Salary: Starting salary is 2000 GBP per month plus commission, paid  
every

month.
Business hours: 9:00 AM to 5:00 PM, MON-FRI, 9:00 AM to 1:00 PM SAT or
part time (UK time).

Region: United Kingdom.

Please note that there are no startup fees or deposits to start working
for us.

To request an application form, schedule your interview and receive more
information about this position
please reply to to...@xpatjobsuk.com,with your personal identification
number for this position IDNO: 3985





Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins

2012-08-30 Thread Antoine Musso
On 30/08/12 10:11, Oren Bochman wrote:
 1. is successfully running the tests a requirement for a successful score in
 gerrit? (i.e. how is gerrit integrated)

When a change is submitted in Gerrit, a notification is sent to Jenkins.
If a job is set up for the repository, it will be triggered. For
extension tests the process is (see the shell sketch after this list):
 - copy a snapshot of the current master version of MediaWiki
 - clone the latest master version of the extension
 - apply the change
 - load the extension entry point (Foobar/Foobar.php by convention)
 - run PHPUnit against the 'extensions' directory
 - aggregate the test results and publish them under Jenkins
 - if all tests are successful, Jenkins marks the change in Gerrit as
Verified +2; otherwise it marks it as Verified -2
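
In rough shell terms, the job does something like this (URLs, paths and
the $GERRIT_REF variable are illustrative, not the actual job config):

  cp -a /srv/mediawiki-core-master workspace      # snapshot of core master
  git clone \
    https://gerrit.wikimedia.org/r/p/mediawiki/extensions/Foobar.git \
    workspace/extensions/Foobar
  cd workspace/extensions/Foobar
  git fetch origin $GERRIT_REF                    # the change under review
  git checkout FETCH_HEAD
  echo 'require_once( "$IP/extensions/Foobar/Foobar.php" );' \
    >> ../../LocalSettings.php                    # load the entry point
  cd ../../tests/phpunit
  php phpunit.php extensions/                     # run the extension tests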

 2. does the extension need to include PHPUnit?

The PHPUnit software is installed on the continuous integration server.
The only thing you need in the extension is to have your tests in a PHP
file whose name ends with 'Test.php'.

The TitleBlacklist extension is an example.

cheers,

-- 
Antoine "hashar" Musso



[Wikitech-l] upgrades to search cluster

2012-08-30 Thread Peter Youngmeister
Hi All,

I have moved all search traffic to hosts now running Ubuntu 12.04
(precise). This should be a no-op, as the search software itself has not
changed, and all automated tests looked good, but I wanted to make you all
aware, as I have neither the time nor the language skills to manually test
every language.

If you notice any irregularities, please let me know!

--peter


Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins

2012-08-30 Thread Roan Kattouw
On Thu, Aug 30, 2012 at 1:11 AM, Oren Bochman orenboch...@gmail.com wrote:
 I've tried to do this for the Translate extension last week - so here are a
 couple of questions:
 1. is successfully running the tests a requirement for a successful score in
 gerrit? (i.e. how is gerrit integrated)
Jenkins sets a Verified +1 or -1 on the change, depending on whether
the tests succeed or not, just like for core. A V+1 is required to
merge a change. So to merge any change into an extension, the tests
(that is, the core tests and any extension tests) have to pass. But
the extension isn't required to have tests, it's just that you can't
submit changes that break the core tests or break any tests that are
present in the extension. Existing extensions without any tests or
awareness of phpunit will continue to work just fine, as long as they
don't try to do crazy stuff that breaks a test in core.

 2. does the extension need to include PHPUnit?

No, all of the PHPUnit wrapper stuff is in MediaWiki core. Adding
tests to extensions is simple; see
https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Writing_unit_tests_for_extensions

Roan



Re: [Wikitech-l] [testing] TitleBlacklist now tested under Jenkins

2012-08-30 Thread Jon Robson
Excellent!
Can't wait to get MobileFrontend using this.

Great work Antoine.

On Thu, Aug 30, 2012 at 12:59 PM, Roan Kattouw roan.katt...@gmail.com wrote:
 On Thu, Aug 30, 2012 at 1:11 AM, Oren Bochman orenboch...@gmail.com wrote:
 I've tried to do this for the Translate extension last week - so here are a
 couple of questions:
 1. is successfully running the tests a requirement for a successful score in
 gerrit? (i.e. how is gerrit integrated)
 Jenkins sets a Verified +1 or -1 on the change, depending on whether
 the tests succeed or not, just like for core. A V+1 is required to
 merge a change. So to merge any change into an extension, the tests
 (that is, the core tests and any extension tests) have to pass. But
 the extension isn't required to have tests, it's just that you can't
 submit changes that break the core tests or break any tests that are
 present in the extension. Existing extensions without any tests or
 awareness of phpunit will continue to work just fine, as long as they
 don't try to do crazy stuff that breaks a test in core.

 2. does the extension need to include PHPUnit?

 No, all of the PHPUnit wrapper stuff is in MediaWiki core. Adding
 tests to extensions is simple; see
 https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Writing_unit_tests_for_extensions

 Roan




-- 
Jon Robson
http://jonrobson.me.uk
@rakugojon



Re: [Wikitech-l] 5 tips to get your code reviewed faster

2012-08-30 Thread Krinkle
On Aug 30, 2012, at 9:59 AM, Antoine Musso hashar+...@free.fr wrote:

 It is also a good practice to add a cover message on the new patchset to
 explain what it changes compared to the previous one. 

Yes, very important. If you submit a new patch set, please do leave a quick
comment explaining what you changed. I personally like to use bullet points
for those comments, like:

* Rebased

or

* Addressed comments from John.
* Removed redundant code in foo.js.

 
  PS3: rebased on latest master
  PS4: fix the, now obsolete, function() call
 
 Where "PS" is short for "patchset".
 

This is not needed, because if you leave a comment (and do it right, as in,
click "Review" on the main Gerrit change page under the correct "Patch set"
heading), Gerrit prefixes this to your comments automatically.

The only reason to need such a prefix is if you're putting it somewhere else,
such as the commit message – where such comments don't belong in the first place.

Putting it in the commit message is imho annoying because:
* It is not at all helpful once the commit is merged, because only the last
version is merged. The rest are in-between steps during the review process,
and discussion and details of that process belong in the review thread, not
in the commit message.
* And, as the author, it is kind of hard because you don't know the patch set
number until it is submitted, and someone else can submit a version while
you're working.

Having said that, do make sure that your commit message is still accurate and
explains the full change (not just the first version of the patch, nor just the
last amendment to the patch).

-- Krinkle




Re: [Wikitech-l] Work offer inside

2012-08-30 Thread Tim Starling
On 31/08/12 04:15, Daniel Friesen wrote:
 This brings up a question:
 why does wikimedia.org not have an SPF record?

 We should be rejecting wikimedia.org emails that we know do not come
 from Wikimedia.

In May, Jeff Green proposed deploying it with "softfail", but it
wasn't ever actually done. Nobody wanted to use a "fail" qualifier,
due to the risk of legitimate mail not being delivered. So even if he
had deployed it, it probably wouldn't have helped in this case.
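
For reference, an SPF record is just a DNS TXT entry; a softfail version
would have looked something like this (the address range here is
illustrative, not Jeff's actual proposal):

  wikimedia.org.  IN TXT  "v=spf1 ip4:208.80.152.0/22 ~all"

The "~all" (softfail) qualifier only asks receivers to treat non-matching
senders with suspicion; "-all" would request outright rejection, which is
the part nobody wanted to risk.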

Mailman's security weaknesses are inherent to the protocol it uses;
there's no way to repair them. The scam email could have been sent with
a From: header copied from anyone who has posted to the list
recently. In the unlikely event that SPF "fail" was used for that sender
and the receiver respected it, the scammer could have just picked
another address. We should use a web interface for posting to groups; web
interfaces can be password-protected without breaking 99% of clients.

I removed bo...@wikimedia.org from the list of email addresses
that are allowed to post to the list without being subscribed.

-- Tim Starling




Re: [Wikitech-l] AJAX sharing of Commons data/metadata

2012-08-30 Thread Alex Brollo
Thanks again Brion: it runs perfectly and, strange to say, I had no real
difficulty, just a little bit of reviewing of the API calls and of the
structure of the resulting (formidable) objects. It's really much simpler to
parse the original template contents than the HTML resulting from their
expansion ;-) and the AJAX API calls are SO fast!

Alex


[Wikitech-l] MediaWiki security release: 1.19.2 and 1.18.5

2012-08-30 Thread Chris Steipp
I would like to announce the release of MediaWiki 1.19.2 and 1.18.5.
These releases fix six security-related bugs that could affect users of
MediaWiki. Download links are given at the end of this email.

* Wikipedia administrator Writ Keeper discovered a stored XSS (HTML
injection) vulnerability. This was possible due to the handling of
link text on File: links for nonexistent files. MediaWiki 1.16 and
later is affected. For more details, see
https://bugzilla.wikimedia.org/show_bug.cgi?id=39700

* User Fomafix reported several DOM-based XSS vulnerabilities, made
possible by a combination of loose filtering of the uselang parameter,
and JavaScript gadgets on various language Wikipedias. For more
details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=37587

* During internal review, it was discovered that CSRF tokens,
available via the api, were not protected with X-Frame-Options
headers. This could lead to a CSRF vulnerability if the API response
is embedded in an external website using an iframe. For more details,
see https://bugzilla.wikimedia.org/show_bug.cgi?id=39180

* During internal review, it was discovered that extensions were not always
allowed to prevent the account creation action. This allowed users
blocked by the GlobalBlocking extension to create accounts. For more
details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=39824

* During internal review, it was discovered that password data was
always saved to the local MediaWiki database, even if authentication
was handled by an extension, such as LDAP. This could allow a
compromised MediaWiki installation to leak information about users'
LDAP passwords. Additionally, in situations where an authentication
plugin returned false in its strict() function, this would allow old
passwords to be used indefinitely for accounts that did not exist in
the external system. For details on how to clean up the data leakage,
see: https://bugzilla.wikimedia.org/show_bug.cgi?id=39184

* During internal review, it was discovered that metadata about
blocks, hidden by a user with suppression rights, was visible to
administrators. For more details, see
https://bugzilla.wikimedia.org/show_bug.cgi?id=39823


Full release notes for 1.19.2:
https://www.mediawiki.org/wiki/Release_notes/1.19

Full release notes for 1.18.5:
https://www.mediawiki.org/wiki/Release_notes/1.18

For information about how to upgrade, see
https://www.mediawiki.org/wiki/Manual:Upgrading

**************************************************
   1.19.2
**************************************************
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.2.tar.gz

Patch to previous version (1.19.1):
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.2.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.2.patch.gz.sig

Public keys:
https://secure.wikimedia.org/keys.html

**************************************************
   1.18.5
**************************************************
Download:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.5.tar.gz

Patch to previous version (1.18.4), without interface text:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.5.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-i18n-1.18.5.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.5.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.18/mediawiki-1.18.5.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.18/mediawiki-i18n-1.18.5.patch.gz.sig

Public keys:
https://secure.wikimedia.org/keys.html



Re: [Wikitech-l] Flagged Reviews default by quality levels

2012-08-30 Thread Dovi Jacobs
 When a page reaches X level of quality, that version becomes the default.

When creating all the interface messages for editing, viewing, and history,
this is definitely not easy to get right and keep simple for new users.
Anyway, to be clear, you can make the latest version the default for all
pages and manually make the latest reviewed version the default on a
per-page basis already. You just can't use quality versions as the default
version.

Hi, and thanks for the info. On an immediate practical level I will inform
them of this at the wiki, and we will unfortunately be forced to turn off
the latest approved version as default, so as not to put off new users.

In terms of the future, I respectfully disagree with you that there is any
essential real-life problem for users if the extension were implemented in
the way I described. Even for those who share your evaluation (which I
don't), there should be no reason not to provide such basic functionality
at least as an option. Furthermore, manually making the latest reviewed
version the default on a per-page basis is extremely cumbersome (even using
bots) and not a reasonable solution to the problem.

To conclude, I personally think the extension you have developed (I was
only recently reminded that you are the primary developer) is already one
of the most important developments for the future of MediaWiki (and often
under-appreciated as such). Please let me know if there is any positive way
I can become involved in helping this functionality become part of the
program.