Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-12 Thread Ori Livneh



On Monday, March 11, 2013 at 6:51 PM, Rob Lanphier wrote:

 On Sun, Mar 10, 2013 at 3:32 PM, Kevin Israel pleasest...@live.com wrote:
  On 03/10/2013 06:03 PM, Bartosz Dziewoński wrote:
   A shallow clone certainly shouldn't be as large as a normal one.
   Something's borked.
   
   
   
  --depth 0 is what's broken. --depth 1 works fine.
   
  $ git clone --depth 1
 [...]
  Receiving objects: 100% (2815/2815), 17.87 MiB | 1.16 MiB/s, done.
  
  
  
 Yup, I'm seeing more or less the same thing. Importantly:
 $ du -sh .git
 19M .git
  
 I was able to do the clone in 50 seconds over HTTPS. Most of that
 time was spent in data transfer (which would be the same for a
 snapshot).
  
 Ori, have you tried this with --depth 1?
  
 Rob
Rob, thanks for checking. I tried it yesterday and again just now, and in both 
cases it took around 15 minutes:

vagrant@precise32:~$ time git clone --depth 1 
https://gerrit.wikimedia.org/r/p/mediawiki/core.git
Cloning into 'core'...
remote: Counting objects: 46297, done
remote: Finding sources: 100% (46297/46297)
remote: Getting sizes: 100% (25843/25843)
remote: Compressing objects:  76% (19864/25833)
remote: Total 46297 (delta 33063), reused 26399 (delta 20010)
Receiving objects: 100% (46297/46297), 102.66 MiB | 194 KiB/s, done.
Resolving deltas: 100% (37898/37898), done.

real 15m14.500s
user 0m27.562s
sys 0m13.421s

The output of 'git config --list' is blank; this is vanilla git. 'Compressing 
objects' took the longest.

For comparison:

vagrant@precise32:~$ time wget -q https://github.com/wikimedia/mediawiki-core/archive/master.zip && unzip -x -q master.zip

real 1m15.592s
user 0m0.184s
sys 0m3.480s


--
Ori Livneh



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-12 Thread Daniel Friesen
On Sat, 09 Mar 2013 08:15:05 -0800, Platonides platoni...@gmail.com  
wrote:



On 09/03/13 15:47, Waldir Pimenta wrote:

So mw-config can't be deleted after all? Or you mean the installer at
includes/installer?
If you mean the former, then how about run-installer instead of my
previous proposal of first-run?
Any of these would be clearer than mw-config, imo.

--Waldir


You can delete it, but then you can't use it to upgrade the wiki (or
rather, you would need to copy it again from the new tree).


mw-config only updates the database to the currently installed version of  
MW. So it's fine to delete mw-config because you won't need that mw-config  
anymore. When you upgrade you'll need the mw-config that comes with the  
code for the new version of MediaWiki you're installing.


--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-12 Thread Daniel Friesen
On Fri, 08 Mar 2013 18:18:52 -0800, Waldir Pimenta wal...@email.com  
wrote:



On Wed, Feb 27, 2013 at 9:13 PM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:


I wouldn't even include mw-config in entrypoint modifications that would
be applied to other entrypoint code.


You mean like this one: https://gerrit.wikimedia.org/r/#/c/49208/ ? I can
understand, in the sense that it gives people the wrong idea regarding its
relationship with the other access points, but if the documentation is
clear, I see no reason not to have mw-config/index.php benefit from changes
when the touched code is the part common to all *entry* points (in the
strict meaning of files that can be used to enter the wiki from a web
browser).


No, different changes. I was talking about this RFC:
https://www.mediawiki.org/wiki/Requests_for_comment/Entrypoint_Routing_and_404_handling

Part of the plan was to move all the entrypoint handling into the common
code. All the entrypoint files like api.php would become non-mandatory.
They would basically be tiny files that all just include the MediaWiki
code in the exact same way, while the common code decides what code to run
for which entrypoint.
After that, most of the paths could be blindly directed right to index.php
and the codebase would handle delegating paths like api.php to the API,
handling 404s, etc...


mw-config/index.php would never get that tweak.



--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Andre Klapper
On Mon, 2013-03-11 at 17:09 +0100, Christian Aistleitner wrote:
 On Mon, Mar 11, 2013 at 12:43:44AM +0100, Andre Klapper wrote:
  Are there some thoughts already how to make a match between a
  project in Gerrit and its related Bugzilla product/component?
  [Disclaimer: I'm a fan of DOAP metadata files, if used properly.]
 
 While DOAP looks great, is there a standard tag to specify component
 etc for a bug tracker?

There are no rules for how values should look; guidelines and checks
are required. (That's what I meant by "if used properly".) 
http://bugzilla.wikimedia.org/enter_bug.cgi?product=PRODUCT&component=COMPONENT
might work. Plus you could define custom parameters if wanted.

 That looks like being too generic to be useful for us.

Yes, it would require strict rules for the URI values, etc. 
Generic URIs are as unhelpful as empty values.

 However, gerrit already comes with a meta/config ref that already
 stores the project configuration as plain git configuration. And its
 publicly visible for our gerrit, so any script can use that. We could
 just put something like
 
 [bugzilla]
product = ComponentA
component = ComponentA
 
 in there and pick it up from gerrit's hooks-bugzilla plugin.

Wasn't aware of that. Are there projects in Gerrit that don't use
bugzilla.wikimedia.org? A lot of MediaWiki extensions, I assume? 
In that case we're back to the problem that you need a URI value for the
bugtracker.
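For what it's worth, the proposed [bugzilla] section is plain git-config syntax, which is close enough to INI that "any script" really can read it once project.config has been fetched from refs/meta/config. A minimal sketch (the section name and keys are the proposal from this thread, and the values here are made-up stand-ins):

```php
<?php
// Stand-in for a project.config fetched from a project's refs/meta/config
// ref; the [bugzilla] section is the proposal from this thread, not an
// existing Gerrit convention.
$projectConfig = <<<'EOT'
[bugzilla]
  product = MediaWiki extensions
  component = ComponentA
EOT;

// git-config syntax is close enough to INI for simple sections like this.
$cfg = parse_ini_string( $projectConfig, true );

echo $cfg['bugzilla']['product'], "\n";   // MediaWiki extensions
echo $cfg['bugzilla']['component'], "\n"; // ComponentA
```

From there, building the enter_bug.cgi URL above is a one-liner with http_build_query().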

The rest is off-topic and about DOAP only:
There are a few things I like about DOAP. One thing is that Apache's
DOAP files also include the programming language of a project, so it's
easier to redirect new contributors according to their skills. Or to
store who is the maintainer of a project (in theory). Or where to find
its mailing list, homepage, tarball releases, ... 
I like that when I want to find out how active a project is or how large
its community is, I can look beyond code repository activity alone.

andre
-- 
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/



Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Chad
On Tue, Mar 12, 2013 at 6:45 AM, Andre Klapper aklap...@wikimedia.org wrote:
 Wasn't aware of that. Are there projects in Gerrit that don't use
 bugzilla.wikimedia.org? A lot of MediaWiki extensions, I assume?
 In that case we're back to the problem that you need a URI value for the
 bugtracker.


I'd say the opposite is true--the vast majority of extensions in Gerrit *do use*
bugzilla.wm.o. Granted, some may not, so having a tracker uri would also be
nice (if not the default).

-Chad


[Wikitech-l] GLAM-Wiki 2013: THATCamp hackathon

2013-03-12 Thread Richard Nevell
As a part of the GLAM-Wiki 2013 Conference
(http://uk.wikimedia.org/wiki/GLAM-WIKI_2013), Wikimedia UK is putting on
a free unconference event on Sunday 14 April and you are most welcome to
attend.

The event is organised by THATCamp as a free unconference and hackathon
(http://uk.wikimedia.org/wiki/GLAM-WIKI_2013/Hackathon), where the agenda
and session topics are determined by attendees on the day. THATCamp
London 2013 (http://london2013.thatcamp.org/) will be an unconference
exploring the humanities and technology. *We’re hoping to see lots of
exciting creations and thoughts around free-licensing, open access and the
interface between humanities and technology.*

To attend, please follow these steps:

   1. Register (free!): http://glamwiki2013.eventbrite.co.uk/
   2. Create an account on the THATCamp website:
      http://london2013.thatcamp.org/
   3. Check any details on the wiki page:
      http://uk.wikimedia.org/wiki/GLAM-WIKI_2013/Hackathon


Regards,
Richard Nevell
-- 
Richard Nevell
Wikimedia UK
+44 (0) 20 7065 0753

Wikimedia UK is a Company Limited by Guarantee registered in England and
Wales, Registered No. 6741827. Registered Charity No.1144513. Registered
Office 4th Floor, Development House, 56-64 Leonard Street, London EC2A 4LT.
United Kingdom. Wikimedia UK is the UK chapter of a global Wikimedia
movement. The Wikimedia projects are run by the Wikimedia Foundation (who
operate Wikipedia, amongst other projects).

*Wikimedia UK is an independent non-profit charity with no legal control
over Wikipedia nor responsibility for its contents.*

[Wikitech-l] Jenkins upgraded to 1.480.3

2013-03-12 Thread Antoine Musso
Hello,

We have upgraded our Jenkins installation to version 1.480.3 (LTS
release). My basic regression tests did not catch any issues; if you see
anything wrong, please ping me and/or open a bug report against
"Wikimedia > Testing Infrastructure" in Bugzilla (I am in CC).

The most interesting bug fix is "Remember me on this computer does not
work, cookie is not accepted in new session" (issue 16278).


cheers,

-- 
Antoine hashar Musso



Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Guillaume Paumier
Hi,

On Sun, Mar 10, 2013 at 2:11 AM, Rob Lanphier ro...@wikimedia.org wrote:
 Hi folks,

 Short version: This mail is fishing for feedback on proposed work on
 Gerrit-Bugzilla integration to replace code review tags.

 I preferred that if we were going to have our own hacky solution, it
 should at least be implemented as a Gerrit plugin, so that it would at
 least stand a chance of becoming a well-integrated solution.

 A Bugzilla-based solution would be an ideal replacement for fixme,
 since fixmes are basically bugs anyway.  It would work reasonably well
 for scaptrap, since they generally imply something that needs to be
 done prior to deployment.  It would be an awkward replacement for
 backcompat and others.

Thank you for this detailed e-mail. One thing I think I'm missing is
why the bugzilla-based solution is better than the gerrit plugin one.

It seems to me that if the tagging functionality was developed as a
gerrit plugin, it would have all the advantages of the bugzilla-based
solution (good integration, etc.) without its drawbacks (awkwardness
for non-bugs tags, e-mail addresses mismatches, dependency on
bugzilla).

Admittedly, I'm not a primary user of gerrit, but I've been pondering
the idea of using tags in order to surface noteworthy changes, so they
can be easily listed and communicated about to our users. This would
make it much easier to identify the most important changes on pages
like https://www.mediawiki.org/wiki/MediaWiki_1.21/wmf7 , and could
also perhaps be used for release notes summaries.

A bugzilla-based tagging system seems too restrictive for this kind of
use, but perhaps I'm just not seeing how it would work. It's difficult
to predict the kinds of tags people will come up with in the future,
and I feel it would be a pity to develop a tagging solution that
restricts the type of tags you can use with it.

Just my $0.02 data point.

-- 
Guillaume Paumier


Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-12 Thread Rob Lanphier
Hi Ori,

I'm at the office now, but (heh) that may be a tougher test than my
home connection.  More below

On Tue, Mar 12, 2013 at 12:00 AM, Ori Livneh o...@wikimedia.org wrote:
 Rob, thanks for checking. I tried it yesterday and again just now, and in 
 both cases it took around 15 minutes:

 vagrant@precise32:~$ time git clone --depth 1 
 https://gerrit.wikimedia.org/r/p/mediawiki/core.git
 Cloning into 'core'...
 remote: Counting objects: 46297, done
 remote: Finding sources: 100% (46297/46297)
 remote: Getting sizes: 100% (25843/25843)
 remote: Compressing objects:  76% (19864/25833)
 remote: Total 46297 (delta 33063), reused 26399 (delta 20010)
 Receiving objects: 100% (46297/46297), 102.66 MiB | 194 KiB/s, done.
 Resolving deltas: 100% (37898/37898), done.

 real 15m14.500s
 user 0m27.562s
 sys 0m13.421s

 The output of 'git config --list' is blank; this is vanilla git. 'Compressing 
 objects' took the longest.

 For comparison:

 vagrant@precise32:~$ time wget -q https://github.com/wikimedia/mediawiki-core/archive/master.zip && unzip -x -q master.zip

 real 1m15.592s
 user 0m0.184s
 sys 0m3.480s

I'm on Ubuntu 12.10 (Quantal), so that might have something to do with it.
$ git --version
git version 1.7.10.4

$ time git clone --depth 1
https://gerrit.wikimedia.org/r/p/mediawiki/core.git core-shallow
Cloning into 'core-shallow'...
remote: Counting objects: 3456, done
remote: Finding sources: 100% (3456/3456)
remote: Getting sizes: 100% (3074/3074)
remote: Compressing objects:  63% (1958/3069)
remote: Total 3456 (delta 690), reused 1496 (delta 379)
Receiving objects: 100% (3456/3456), 18.71 MiB | 1.49 MiB/s, done.
Resolving deltas: 100% (816/816), done.

real 0m34.507s
user 0m4.252s
sys 0m0.940s


I wasn't able to find anything in the Git release notes to indicate
anything different on the client side, though I didn't look too hard.
However, it appears as though JGit server-side support for shallow
clones is relatively recent:
https://bugs.eclipse.org/bugs/show_bug.cgi?id=394543

...characterized here as "coming along bit by bit":
http://dev.eclipse.org/mhonarc/lists/jgit-dev/msg01976.html

...which would imply that maybe the JGit server side stuff hasn't been
tested with a lot of client versions.

Rob


Re: [Wikitech-l] Mediawiki's access points and mw-config

2013-03-12 Thread Brian Wolff
On 2013-03-12 6:31 AM, Daniel Friesen dan...@nadir-seen-fire.com wrote:

 On Sat, 09 Mar 2013 08:15:05 -0800, Platonides platoni...@gmail.com
wrote:

 On 09/03/13 15:47, Waldir Pimenta wrote:

 So mw-config can't be deleted after all? Or you mean the installer at
 includes/installer?
 If you mean the former, then how about run-installer instead of my
 previous proposal of first-run?
 Any of these would be clearer than mw-config, imo.

 --Waldir


 You can delete it, but then you can't use it to upgrade the wiki (or
 rather, you would need to copy it again from the new tree).


 mw-config only updates the database to the currently installed version of
MW. So it's fine to delete mw-config because you won't need that mw-config
anymore. When you upgrade you'll need the mw-config that comes with the
code for the new version of MediaWiki you're installing.


 --
 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



There are cases where you still want mw-config even without upgrading. For
example, changing $wgCategoryCollation or installing an extension with
schema changes.

-bawolff

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Greg Grossmeier
Hello!

On 2013-03-12 15:03:04 +0100, Guillaume Paumier wrote:
 Hi,
 
 On Sun, Mar 10, 2013 at 2:11 AM, Rob Lanphier ro...@wikimedia.org wrote:
  Hi folks,
 
  Short version: This mail is fishing for feedback on proposed work on
  Gerrit-Bugzilla integration to replace code review tags.
 
  I preferred that if we were going to have our own hacky solution, it
  should at least be implemented as a Gerrit plugin, so that it would at
  least stand a chance of becoming a well-integrated solution.
 
  A Bugzilla-based solution would be an ideal replacement for fixme,
  since fixmes are basically bugs anyway.  It would work reasonably well
  for scaptrap, since they generally imply something that needs to be
  done prior to deployment.  It would be an awkward replacement for
  backcompat and others.
 
 Thank you for this detailed e-mail. One thing I think I'm missing is
 why the bugzilla-based solution is better than the gerrit plugin one.

I've been wavering back and forth on this myself, honestly.

 It seems to me that if the tagging functionality was developed as a
 gerrit plugin, it would have all the advantages of the bugzilla-based
 solution (good integration, etc.) without its drawbacks (awkwardness
 for non-bugs tags, e-mail addresses mismatches, dependency on
 bugzilla).
 
 Admittedly, I'm not a primary user of gerrit, but I've been pondering
 the idea of using tags in order to surface noteworthy changes, so they
 can be easily listed and communicated about to our users. This would
 make it much easier to identify the most important changes on pages
 like https://www.mediawiki.org/wiki/MediaWiki_1.21/wmf7 , and could
 also perhaps be used for release notes summaries.

So, a few things that are related to the above in no specific order:

1) Major changes/issues aren't always associated with just one merge. In
this case a bug in BZ would be nice as it is more topical as opposed to
tied to a specific merge.

2) Tracking the most important changes quality is something I think we
need to do better, but I honestly haven't thought of The One Right Way
(TM) yet. :) I experimented with keeping track of merges that I thought
were interesting via 'starring' them in Gerrit, but the limit there is
those stars are tied to my account, not a shared thing (ie: you can't
add one to the list without having me do it).

Manually adding them to a wiki page is about as productive as something
really unproductive. ;) (I tried that for a bit, too.)

Maybe clicking on a "file bug about this merge" link and making that bug
block a "Release Notes" tracking bug is useful? That also seems less
efficient than a Gerrit ReleaseNotes/Scaptrap tag.

3) I have a personal preference for building light-weight tools first to
see if that addresses the need then building more detailed/complex ones
later as needed.

4) A Gerrit-based tagging plugin would need some engineering that might
not be apparent at first blush, for example: who can set tags and remove
them? Does that vary by tag? How could I, for example, keep track of all
scaptrap-type things and be sure I don't miss something because someone
mistakenly removed the scaptrap tag from a merge and I didn't
notice/remember (ie: can I *trust* the tag, both what is there and what
isn't, so I don't have to remember everything/double check every week?).


Just my $0.04 (penny per item).

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread K. Peachey
On Wed, Mar 13, 2013 at 2:43 AM, Greg Grossmeier g...@wikimedia.org wrote:

 4) A Gerrit-based tagging plugin would need some engineering that might
 not be apparent at first blush, for example: who can set tags and remove
 them? Does that vary by tag? How could I, for example, keep track of all
 scaptrap-type things and be sure I don't miss something because someone
 mistakenly removed the scaptrap tag from a merge and I didn't
 notice/remember (ie: can I *trust* the tag, both what is there and what
 isn't, so I don't have to remember everything/double check every week?).


I don't think we need to be that granular with the tagging permissions. In
CR we used to just require the Coder right, and I'm not aware of many
instances where tagging was abused. You just need some sort of log of who
added/removed the tags.

Although there were instances where people forgot to tag changes, that is
unavoidable no matter what level(s) of permissions you have.

[Wikitech-l] Mediawiki branches and deployments

2013-03-12 Thread Greg Grossmeier
Hello all,

--
tl;dr: There's a difference between how the WMF Platform team and
Feature Teams commit to master. Why? How can we unify and/or make
testing from development branches easier for everyone?
--

Two weeks ago Ops and Platform both had all staff local in SF for a week
for a slew of meetings. I took that opportunity to make even more
meetings with people/teams regarding our dev and deployment process.
(I promise I'll be nicer in the future.)

You can see the fruits of that labor here:
https://wikitech.wikimedia.org/wiki/Deployments/Features_Process/General_Feedback

(More info linked from:
https://wikitech.wikimedia.org/wiki/Deployments/Features_Process )

One of the things that became apparent is that the Features Teams (eg:
E2, E3, Visual Editor) tend to have separate branches that they develop
on and only merge to master closer to their deploy windows. There is
some variation among them, of course, but the general idea stands.

This contrasts with the way the WMF Platform team develops; we
predominately commit to master as we develop with any new features set
as disabled in a config until it is ready. The Features Teams also use
the disabled in a config until ready bit, of course.

What is the reasoning for the use of this development/deployment
distinction on the Feature Teams? Mostly it is testing, from what I
could tell. Many teams have the development branch running on a test
instance that they, well, test against. Then, as things stabilize they
merge to master via Gerrit.

(Feature Teams members, please correct me if I'm wrong anywhere above,
generally: specifics may differ, of course, but is the overall notion
accurate?)

How do we encourage a more unified development, testing, and deployment
process across all WMF teams and community members (sometimes they are
one and the same)?

This issue should also be looked at with betalabs in mind; how can we
utilize betalabs to be the best test environment it can be? What code
should be running there and how often should it be updated? What tests
should be automatically run against it? These questions are almost out
of scope of this thread, but a solution to this thread should also have
an eye towards betalabs.


Thanks,

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


[Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Toni Hermoso Pulido
Hello,

I'm checking whether I can detect that a process is run from a
maintenance script in a parser function extension.

What would be the best / most recommendable way to detect it?

Thanks!
-- 
Toni Hermoso Pulido
http://www.cau.cat


Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Tyler Romeo
On Tue, Mar 12, 2013 at 1:47 PM, Toni Hermoso Pulido toni...@cau.cat wrote:

 Hello,

 I'm checking whether I can detect that a process is run from a
 maintenance script in a parser function extension.

 Which would be the best way / more recommendable to detect it?

 Thanks!


$wgCommandLineMode should be able to tell you, although I think checking if
the RUN_MAINTENANCE_IF_MAIN constant is set is probably a better method.
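To make that concrete, here is a minimal sketch of both checks wrapped in one helper. The helper's name is made up; $wgCommandLineMode and RUN_MAINTENANCE_IF_MAIN are the real MediaWiki symbols mentioned above, and the PHP_SAPI fallback is plain PHP rather than anything MediaWiki-specific:

```php
<?php
// Hypothetical helper an extension might use; only the two MediaWiki
// symbols referenced in this thread are real.
function isMaintenanceContext(): bool {
    global $wgCommandLineMode;
    // Set to true by MediaWiki's CLI entry points.
    if ( !empty( $wgCommandLineMode ) ) {
        return true;
    }
    // Defined once maintenance/Maintenance.php has been loaded.
    if ( defined( 'RUN_MAINTENANCE_IF_MAIN' ) ) {
        return true;
    }
    // Plain-PHP fallback: are we running under the CLI SAPI?
    return PHP_SAPI === 'cli';
}
```

Inside MediaWiki either of the first two checks should fire before the fallback is ever reached.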

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Platonides
On 12/03/13 18:47, Toni Hermoso Pulido wrote:
 Hello,
 
 I'm checking whether I can detect that a process is run from a
 maintenance script in a parser function extension.
 
 Which would be the best way / more recommendable to detect it?
 
 Thanks!

Why do you want to do it? It is probably a bad idea.





Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Brian Wolff
On 2013-03-12 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Tue, Mar 12, 2013 at 1:47 PM, Toni Hermoso Pulido toni...@cau.cat
wrote:

  Hello,
 
  I'm checking whether I can detect that a process is run from a
  maintenance script in a parser function extension.
 
  Which would be the best way / more recommendable to detect it?
 
  Thanks!
 

 $wgCommandLineMode should be able to tell you, although I think checking
if
 the RUN_MAINTENANCE_IF_MAIN constant is set is probably a better method.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

A more interesting question: why do you need to know?

Making wikitext vary between maintenance scripts and normal requests may
cause a bit of breakage, given the job queue etc.

-bawolff

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Platonides
Doing the tags in Gerrit is the right thing.
Who can change them is not a problem; just make a changetags log. (BTW,
you would have the same problem in Bugzilla with people removing that
superimportant tag.)
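A changetags log needs very little structure. A hypothetical sketch (none of these names correspond to an existing Gerrit or MediaWiki API):

```php
<?php
// Hypothetical append-only log of tag changes: the current tag set for a
// change is derived by replaying the log, so adds and removes are always
// auditable.
function logTagChange( array &$log, string $change, string $tag,
    string $action, string $user ): void {
    $log[] = [
        'change' => $change, // Gerrit change number
        'tag'    => $tag,    // e.g. 'scaptrap'
        'action' => $action, // 'add' or 'remove'
        'user'   => $user,
        'time'   => gmdate( 'c' ),
    ];
}

// Replay the log to get the tags currently on a change.
function currentTags( array $log, string $change ): array {
    $tags = [];
    foreach ( $log as $entry ) {
        if ( $entry['change'] !== $change ) {
            continue;
        }
        if ( $entry['action'] === 'add' ) {
            $tags[ $entry['tag'] ] = true;
        } else {
            unset( $tags[ $entry['tag'] ] );
        }
    }
    return array_keys( $tags );
}
```

Because the log is append-only, a mistakenly removed tag is still visible in the history, which also addresses the trust concern raised earlier in the thread.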



Re: [Wikitech-l] Mediawiki branches and deployments

2013-03-12 Thread Matthew Flaschen
On 03/12/2013 01:44 PM, Greg Grossmeier wrote:
 One of the things that became apparent is that the Features Teams (eg:
 E2, E3, Visual Editor) tend to have separate branches that they develop
 on and only merge to master closer to their deploy windows. There is
 some variation among them, of course, but the general idea stands.

We should clarify whether we're talking about core, extensions, or both.

To my knowledge, E3 does not have any server-side branches for core.  In
other words, we have what Gerrit calls topics, but not what Gerrit calls
branches.  Even for big changes like a major update to core login and
account creation (https://gerrit.wikimedia.org/r/#/c/30637/), it's still
just a big single Gerrit change on master.

For extensions, for the most part we also use master and topics.
However, there was recently a notable exception for v2 (a major change)
of https://www.mediawiki.org/wiki/Extension:GettingStarted .

 This contrasts with the way the WMF Platform team develops; we
 predominately commit to master as we develop with any new features set
 as disabled in a config until it is ready. The Features Teams also use
 the disabled in a config until ready bit, of course.

Right, the login/creation change (Gerrit link above) is using that kind
of gating (as well as a query-string override).
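For readers unfamiliar with the pattern, the gating both teams use boils down to something like the sketch below. The flag and parameter names are hypothetical, not the ones the login/creation change actually uses:

```php
<?php
// Hypothetical feature flag: stays false in production config until the
// feature is ready, flipped to true on test wikis.
$wgUseNewLoginUI = false;

// Testers can opt in early via a query-string override, e.g. ?useNew=1
// ($queryParams would be $_GET or a WebRequest in practice).
function shouldUseNewLoginUI( array $queryParams ): bool {
    global $wgUseNewLoginUI;
    return $wgUseNewLoginUI || !empty( $queryParams['useNew'] );
}
```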

 What is the reasoning for the use of this development/deployment
 distinction on the Feature Teams? Mostly it is testing, from what I
 could tell. Many teams have the development branch running on a test
 instance that they, well, test against. Then, as things stabilize they
 merge to master via Gerrit.

In the case of the GettingStarted extension v2, we used a feature branch
because it was a user-facing change that took a few weeks to get ready
for deployment. Because it was user-facing, there were interactions, and
we wanted to test it; we wanted to do it in one shot.

Matt Flaschen


Re: [Wikitech-l] Category sorting in random order

2013-03-12 Thread Bartosz Dziewoński

On Mon, 11 Mar 2013 13:00:22 +0100, Lars Aronsson l...@aronsson.se wrote:


This category is sorted strangely,
http://sv.wikipedia.org/wiki/Kategori:Svenska_kokboksf%C3%B6rfattare

A, B, E, G, H, L, B, M, N, F, J, R, Þ, S, W, Z, Å

Not only does B appear twice, and F after N, but
the article sorted under Þ has
{{STANDARDSORTERING:Tunberger, Pernilla}}
where STANDARDSORTERING is Swedish for DEFAULTSORT,
and there is no Þ (Icelandic Thorn) in sight.


Okay, so as you can see in the comments on
https://bugzilla.wikimedia.org/show_bug.cgi?id=45446 , there is something weird 
happening. WMF people are working on it, and worst-case, it should fix itself 
in a week or so.

Sorry about the mess, but I asked you all to try out the collations on the 
testwiki first... :)

--
Matma Rex


Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-12 Thread Ori Livneh
On Tuesday, March 12, 2013 at 8:51 AM, Rob Lanphier wrote:
 ...which would imply that maybe the JGit server side stuff hasn't been
 tested with a lot of client versions.

Rob, thanks again for investigating. I filed a bug in Bugzilla so we can track 
this: https://bugzilla.wikimedia.org/show_bug.cgi?id=46041

Could you capture the output of the following command:

GIT_CURL_VERBOSE=1 GIT_TRACE=true git clone --verbose --depth 1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git &> ~/core-git-clone.log

And upload it as an attachment? I already uploaded mine. This way we can 
compare.

--
Ori Livneh




Re: [Wikitech-l] Mediawiki branches and deployments

2013-03-12 Thread Chris McMahon
On Tue, Mar 12, 2013 at 2:21 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:


 In the case of the GettingStarted extension v2, we used a feature branch
 because it was a user-facing change that took a few weeks to get ready
 for deployment.  Because it was user-facing, there were interactions,
 and we wanted to test it, we wanted to do it in one shot.


If there were a convenient mechanism to do it, this is a great example of
something I would like to see enabled on beta labs during development but
disabled in production until ready.

Re: [Wikitech-l] Mediawiki branches and deployments

2013-03-12 Thread Matthew Flaschen
On 03/12/2013 04:43 PM, Chris McMahon wrote:
 On Tue, Mar 12, 2013 at 2:21 PM, Matthew Flaschen
 mflasc...@wikimedia.org wrote:
 

 In the case of the GettingStarted extension v2, we used a feature branch
 because it was a user-facing change that took a few weeks to get ready
 for deployment.  Because it was user-facing, there were interactions,
 and we wanted to test it, we wanted to do it in one shot.

 
 If there were a convenient mechanism to do it, this is a great example of
 something I would like to see enabled on beta labs during development but
 disabled in production until ready.

The problem is, gating can become a maintenance burden when you're
making larger changes.  You end up with long strings of code in if/else
blocks, and sometimes unnecessary files.

In such cases, I think branching might be better.  In our case, we
tested on http://toro.wmflabs.org/wiki/Main_Page .  If it were
considered alright to test branches on Beta labs, that would have been
an option.

Matt Flaschen


[Wikitech-l] Vague question re: future of Vector skin

2013-03-12 Thread David Gerard
A few people on rationalwiki.org have been muttering about doing a
customised Vector skin.

The trouble with Vector is that, as I understand it, it's an odd
melange of extension and skin, with functionality that should be in
one being in the other, both ways.

I also understand that there are plans to refactor it to be sensibly
organised with the right functionality in the right place. (Though I
have no idea if there is actually anyone assigned to such a task.)

Is this the case? If not, what is? What's the present and future of Vector?


- d.


Re: [Wikitech-l] Vague question re: future of Vector skin

2013-03-12 Thread Matthew Flaschen
On 03/12/2013 05:07 PM, David Gerard wrote:
 A few people on rationalwiki.org have been muttering about doing a
 customised Vector skin.
 
 The trouble with Vector is that, as I understand it, it's an odd
 melange of extension and skin, with functionality that should be in
 one being in the other, both ways.
 
 I also understand that there are plans to refactor it to be sensibly
 organised with the right functionality in the right place. (Though I
 have no idea if there is actually anyone assigned to such a task.)

Yes, people are working on eliminating the extension (removing features
that didn't work and moving the good stuff to core).

https://bugzilla.wikimedia.org/show_bug.cgi?id=45051

It's a big task, so help would probably be welcome.

Remember, you can also do customization on-wiki at MediaWiki:Vector.js
and MediaWiki:Vector.css, or through gadgets (which can be defaulted to
on).  Of course, if it's generally useful you may want to try to merge
it into core or the extension.

Matt Flaschen


Re: [Wikitech-l] Vague question re: future of Vector skin

2013-03-12 Thread Bartosz Dziewoński

On Tue, 12 Mar 2013 22:07:29 +0100, David Gerard dger...@gmail.com wrote:


I also understand that there are plans to refactor it to be sensibly
organised with the right functionality in the right place. (Though I
have no idea if there is actually anyone assigned to such a task.)



Is this the case? If not, what is? What's the present and future of Vector?


There are plans, yes, and things are even starting to get done. See 
https://bugzilla.wikimedia.org/show_bug.cgi?id=45051 . And feel free to help :)

I sort of voluntarily assigned myself to it. As in, I want it done, and 
probably no one will do it unless I get on it myself, so I started working on 
it. I'm mostly fighting the footer cleanup right now (which is still disabled 
by default); the rest is free to take (there is a breakdown on the bug).

My plan for the extension is to leave in it only the disabled features that no 
one was using anyway, and just let it stay like that. It'll probably be disabled 
on WMF servers once all the features that are actually in use are ported.



The trouble with Vector is that, as I understand it, it's an odd
melange of extension and skin, with functionality that should be in
one being in the other, both ways.


Yes, but really, a much worse problem is the hardcore CSS hackery used to get it 
to look pretty in IE6 and FF2. Cleaning this up would make it much easier to 
customize the skin, but I don't see a way to do it without dropping support (or 
at least breaking some rendering a bit), and I don't see this getting any WMF 
support :)


--
Matma Rex


Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Toni Hermoso Pulido
On 12/03/13 21:08, Brian Wolff wrote:
 On 2013-03-12 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Tue, Mar 12, 2013 at 1:47 PM, Toni Hermoso Pulido toni...@cau.cat
 wrote:

 Hello,

 I'm checking whether I can detect that a process is run from a
 maintenance script in a parser function extension.

 Which would be the best way / more recommendable to detect it?

 Thanks!


 $wgCommandLineMode should be able to tell you, although I think checking
 if
 the RUN_MAINTENANCE_IF_MAIN constant is set is probably a better method.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com
 
 More interesting question - why do you need to know?
 
 Making wikitext vary between maintenance scripts and normal requests may
 cause a bit of breakage, given the job queue etc.

Hello,

Maybe it's a bit weird and a little unorthodox…
In any case, it's for batch processing (with WikiPage::doEdit) some wiki
pages that have a UserFunctions parser function in their wikitext
(http://www.mediawiki.org/wiki/Extension:UserFunctions),
so that the parser function is ignored when building the page.

Cheers,
-- 
Toni Hermoso Pulido
http://www.cau.cat


Re: [Wikitech-l] Vague question re: future of Vector skin

2013-03-12 Thread MZMcBride
David Gerard wrote:
A few people on rationalwiki.org have been muttering about doing a
customised Vector skin.

The trouble with Vector is that, as I understand it, it's an odd
melange of extension and skin, with functionality that should be in
one being in the other, both ways.

I also understand that there are plans to refactor it to be sensibly
organised with the right functionality in the right place. (Though I
have no idea if there is actually anyone assigned to such a task.)

Is this the case? If not, what is? What's the present and future of
Vector?

Several people (myself included) railed against having both a skin and a
MediaWiki extension named Vector, but we were ultimately unsuccessful in
avoiding the creation of the current clusterfuck.

Relevant (current) bugs:

* https://bugzilla.wikimedia.org/show_bug.cgi?id=45051
  Phase out the Vector extension; merge the good parts into core

* https://bugzilla.wikimedia.org/show_bug.cgi?id=43689
  Implement Templates used on this page collapsing below edit window and
  the rest of footer cleanup module in core

* https://bugzilla.wikimedia.org/show_bug.cgi?id=45977
  Better, central default for MediaWiki:Edithelppage

* https://bugzilla.wikimedia.org/show_bug.cgi?id=42630
  Implement English Wikipedia-only enhancements for other Wikimedia wikis
  (tracking)

I believe the Wikimedia Foundation would like to eventually phase out the
Vector skin in favor of the Athena skin (cf.
https://www.mediawiki.org/wiki/Athena).

I recently tripped across https://wikitech-test.wmflabs.org/wiki/ which
is using an interesting skin, but I'm not sure which.

MZMcBride




Re: [Wikitech-l] Vague question re: future of Vector skin

2013-03-12 Thread Chad
On Tue, Mar 12, 2013 at 6:24 PM, MZMcBride z...@mzmcbride.com wrote:
 I recently tripped across https://wikitech-test.wmflabs.org/wiki/ which
 is using an interesting skin, but I'm not sure which.


I believe this was based on Twitter's Bootstrap.

-Chad


Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-12 Thread Ori Livneh


On Tuesday, March 12, 2013 at 1:36 PM, Ori Livneh wrote:

 I filed a bug in Bugzilla so we can track this: 
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46041

Platonides figured it out. Incredibly enough, in the half-sub-sub-version delta 
from 1.7.9.5 to 1.7.10, the behavior of git-clone was fixed so that it only 
fetches a single branch rather than all of them.

The workaround for git 1.7.10 is:

mkdir core; cd core
git init
git fetch --depth=1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git \
    master:refs/remotes/origin/master
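Note that the fetch above only creates a remote-tracking ref, so a checkout is still needed before any files appear in the working tree. Putting the whole thing together, a script could pick a strategy based on the installed git version. This is only a sketch under stated assumptions: the function names are made up, and `sort -V` assumes GNU coreutils.

```shell
# Succeeds when version $1 is at least version $2 (sort -V compares
# version strings component-wise, so 1.7.10 sorts after 1.7.9.5)
version_ge() {
  [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

# Shallow-clone one branch of repo $1 into directory $2, choosing a
# strategy based on the installed git version
shallow_clone() {
  repo=$1; dir=$2
  # "git version 1.7.9.5" -> take the third field
  ver=$(git --version | awk '{print $3}')
  if version_ge "$ver" 1.7.10; then
    # 1.7.10+ fetches only the cloned branch, so --depth 1 stays small
    git clone --depth 1 "$repo" "$dir"
  else
    # Older git: emulate a single-branch shallow clone by hand
    mkdir "$dir" && cd "$dir" && git init &&
    git fetch --depth=1 "$repo" master:refs/remotes/origin/master &&
    git checkout -b master origin/master
  fi
}

# Usage: shallow_clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git core
```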



--
Ori Livneh





Re: [Wikitech-l] Vague question re: future of Vector skin

2013-03-12 Thread Matthew Flaschen
On 03/12/2013 06:24 PM, MZMcBride wrote:
 I believe the Wikimedia Foundation would like to eventually phase-out the
 Vector skin in favor of the Athena skin (cf.
 https://www.mediawiki.org/wiki/Athena).

Maybe, but right now there is a lot more discussion about Agora
(basically, what https://en.wikipedia.org/wiki/Special:UserLogin/signup
looks like signed out).

Where these Agora styles will eventually end up is being discussed, but
currently it's in Extension:Agora (though that version of the signup
page uses its own Agora styles).

Matt Flaschen


Re: [Wikitech-l] Detect running from maintenance script in a parser function extension

2013-03-12 Thread Brian Wolff
On 3/12/13, Toni Hermoso Pulido toni...@cau.cat wrote:
 On 12/03/13 21:08, Brian Wolff wrote:
 On 2013-03-12 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote:

 On Tue, Mar 12, 2013 at 1:47 PM, Toni Hermoso Pulido toni...@cau.cat
 wrote:

 Hello,

 I'm checking whether I can detect that a process is run from a
 maintenance script in a parser function extension.

 Which would be the best way / more recommendable to detect it?

 Thanks!


 $wgCommandLineMode should be able to tell you, although I think checking
 if
 the RUN_MAINTENANCE_IF_MAIN constant is set is probably a better method.

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

 More interesting question - why do you need to know.

 Making wikitext vary between maintenance script and normal may cause a bit
 of breakage given jobQueue etc.

 Hello,

 maybe it's a bit weird and little orthodox…
 In any case, it's for batch processing (with WikiPage::doEdit) some wiki
 pages that have a UserFunctions parserfunction in their wikitext
 http://www.mediawiki.org/wiki/Extension:UserFunctions
 so that such parser function is ignored in building the page.

 Cheers,
 --
 Toni Hermoso Pulido
 http://www.cau.cat


Ok, that's probably safe, since that extension disables caching.

--bawolff


Re: [Wikitech-l] Nightly shallow clones of mediawiki/core

2013-03-12 Thread Chad
On Tue, Mar 12, 2013 at 5:31 PM, Ori Livneh o...@wikimedia.org wrote:


 On Tuesday, March 12, 2013 at 1:36 PM, Ori Livneh wrote:

 I filed a bug in Bugzilla so we can track this: 
 https://bugzilla.wikimedia.org/show_bug.cgi?id=46041

 Platonides figured it out. Incredibly enough, in the half-sub-sub-version 
 delta from 1.7.9.5 to 1.7.10, the behavior of git-clone was fixed so that it 
 only fetches a single branch rather than all.

 The workaround for git 1.7.10 is:

 mkdir core; cd core
 git init
 git fetch --depth=1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git \
 master:refs/remotes/origin/master


This totally doesn't surprise me at all. Version numbers in Git don't
reflect any
sort of reality in terms of features or things to look forward to--if
memory serves,
a series of really bad performance regressions (and fixes) were
introduced in the
course of just 1.6.x.y.

-Chad


[Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
We just finished deploying a new SSL certificate to the sites. Now all *.m
and *. certificates are included in a single certificate, except
mediawiki.org. Unfortunately we somehow forgot mediawiki.org when we
ordered the updated cert. We'll be replacing this soon with another cert
that has mediawiki.org included.

This should fix any certificate errors that folks have been seeing on
non-wikipedia m. domains.

- Ryan

Re: [Wikitech-l] Category sorting in random order

2013-03-12 Thread Bartosz Dziewoński

And fixed now. Say thanks to Reedy and Tim.

--
Matma Rex


[Wikitech-l] A new feedback extension - review urgently needed

2013-03-12 Thread Lukas Benedix
Hi there!

I am Lukas Benedix, a student of computer science at the Freie Universität
Berlin in Germany. In cooperation with the Wikidata developer team, I’m
currently working on my bachelor thesis about usability testing in open
source software projects, and I’d like to offer the Wikidata community the
feedback mechanisms I developed (only as a test). Wikidata is a very active,
emerging project, which is why I think it’s a great platform for my
project.

And now here's the problem: the deadline of my bachelor thesis is
approaching soon. The test is designed to run for two weeks, and I
unfortunately underestimated how much time it takes to get my extension
reviewed before deployment.

Is it possible to accelerate that review process somehow? The extension is
in gerrit (https://gerrit.wikimedia.org/r/#/c/50004)

Do you have any advice what I can do?

For further information about my project: Here's a little description I
wrote for the Wikidata community:
http://www.wikidata.org/wiki/User:Lbenedix/UIFeedback

Best regards,
Lukas Benedix




Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Brion Vibber
On Tue, Mar 12, 2013 at 3:43 PM, Ryan Lane rlan...@gmail.com wrote:
 We just finished deploying a new SSL certificate to the sites. Now all *.m
 and *. certificates are included in a single certificate, except
 mediawiki.org. Unfortunately we somehow forgot mediawiki.org when we
 ordered the updated cert. We'll be replacing this soon with another cert
 that had mediawiki.org included.

 This should fix any certificate errors that folks have been seeing on
 non-wikipedia m. domains.

Thanks guys!

-- brion


Re: [Wikitech-l] A new feedback extension - review urgently needed

2013-03-12 Thread Sumana Harihareswara
On 03/12/2013 07:37 PM, Lukas Benedix wrote:
 Hi there!
 
 I am Lukas Benedix, a student of computer science at the Freie Universität
 Berlin in Germany. In cooperation with the Wikidata developer team, I’m
 currently working on my bachelor thesis about usability testing in open
 source software projects and I’d like to provide the Wikidata community my
 developed feedback mechanisms (only as a test). Wikidata is a very active,
 emerging project which is why I think it’s a great platform for my
 project.
 
 And now here's the problem: The deadline of my bachelor thesis is
 approaching soon. The test is designed to run for two weeks and I
 unfortunately underestimated how much time it needs to get a review for my
 extension before deployment.
 
 Is it possible to accelerate that review process somehow? The extension is
 in gerrit (https://gerrit.wikimedia.org/r/#/c/50004)
 
 Do you have any advice what I can do?
 
 For further information about my project: Here's a little description I
 wrote for the Wikidata community:
 http://www.wikidata.org/wiki/User:Lbenedix/UIFeedback
 
 Best regards,
 Lukas Benedix

Lukas, can you give more specifics regarding your deadline?  A variety
of people would have to help you get through all the steps of
https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment and
so it might not be feasible to get all these things done in time. :(  In
that case you should possibly consider setting up a Wikidata variant in
Wikimedia Labs -- https://wikitech.wikimedia.org/ .

-- 
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation


[Wikitech-l] Virtual machines for testing IE

2013-03-12 Thread Matthew Flaschen
http://www.modern.ie/en-us/virtualization-tools is offering VMs for
testing various versions of IE.

Unlike before, they now even offer VirtualBox and VMware images, so you
don't have to convert the Virtual PC ones.

Matt Flaschen


[Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Lars Aronsson

In Wiktionary, it's very convenient that some words
have sound illustrations, e.g.
http://en.wiktionary.org/wiki/go%C3%BBter

These audio bites are simple 2-3 second OGG files, e.g.
http://commons.wikimedia.org/wiki/File:Fr-go%C3%BBter.ogg

but they are limited in number. It would be very
easy to record more of them, but before you get
started it takes some time to learn the details,
and then you need to upload to Commons and specify
a license, and provide a description, ... It's not
very likely that the person who does all that is
also a good voice in each desired language.

Here's a better plan:

Provide a tool on the toolserver, or any other
server, having a simple link syntax that specifies
the language code and the text, e.g.
http://toolserver.org/mytool.php?lang=fr&text=gouter

The tool uses a cookie, that remembers that this
user has agreed to submit contributions using cc0.
At the first visit, this question is asked as a
click-through license.

The user is now prompted with the text (from the URL)
and recording starts when pressing a button. The
user says the word, and presses the button again.
The tool saves the OGG sound, uploads it to Commons
with the filename fr-gouter-XYZ789.ogg and
the cc0 declaration and all metadata, placing it
in a category of recorded but unverified words.

Another user can record the same word, and it will
be given another random letter-digit code.

As a separate part of the tool, other volunteers are
asked to verify or rate (1 to 5 stars) the recordings
available in a given language. The rating is stored
as categories on commons.

Now, a separate procedure (manual or a bot job) can
pick words that need new or improved recordings,
and list them (with links to the tool) on a normal
wiki page.

I know HTML supports uploading of a file, but I don't
know how to solve the recording of sound directly to
a web service. Perhaps this could be a Skype application?
I have no idea. Please just be creative. It should be
solvable, because this is 2013 and not 2003.


--
  Lars Aronsson (l...@aronsson.se)
  Aronsson Datateknik - http://aronsson.se




Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
On Tue, Mar 12, 2013 at 4:47 PM, Brion Vibber br...@pobox.com wrote:

 Thanks guys!

 -- brion


Don't thank us too quickly. We needed to revert this for mobile. Seems
*.m.wikipedia.org was also missing from the cert. Needless to say, I'll be
writing a script that can be run against a cert to ensure it's not missing
anything. We'll also be adding monitoring to check for invalid certificates
for any top level domain.

- Ryan
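The kind of cert sanity check Ryan describes can be sketched with the openssl command-line tool. This is a minimal sketch, not the actual script: it assumes OpenSSL 1.1.1+ (for the `-ext` option), and the function names and file paths are illustrative.

```shell
# List the DNS names in a certificate's Subject Alternative Name extension
list_sans() {
  # -noout -ext subjectAltName prints only the SAN block of the cert;
  # split the comma-separated names and strip the "DNS:" prefixes
  openssl x509 -in "$1" -noout -ext subjectAltName |
    tr ',' '\n' | sed -n 's/^ *DNS://p'
}

# Print a MISSING line for every expected domain absent from the cert
check_cert() {
  cert=$1; shift
  sans=$(list_sans "$cert")
  for domain in "$@"; do
    printf '%s\n' "$sans" | grep -Fqx "$domain" || echo "MISSING: $domain"
  done
}

# Example: check_cert unified.pem wikipedia.org '*.m.wikipedia.org' mediawiki.org
```

Run against the unified cert with the full list of expected domains, any forgotten entry (like mediawiki.org here) would show up as a MISSING line.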

Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Brian Wolff
On 3/12/13, Lars Aronsson l...@aronsson.se wrote:
 In Wiktionary, it's very convenient that some words
 have sound illustrations, e.g.
 http://en.wiktionary.org/wiki/go%C3%BBter

 These audio bites are simple 2-3 second OGG files, e.g.
 http://commons.wikimedia.org/wiki/File:Fr-go%C3%BBter.ogg

 but they are limited in number. It would be very
 easy to record more of them, but before you get
 started it takes some time to learn the details,
 and then you need to upload to Commons and specify
 a license, and provide a description, ... It's not
 very likely that the person who does all that is
 also a good voice in each desired language.

 Here's a better plan:

 Provide a tool on the toolserver, or any other
 server, having a simple link syntax that specifies
 the language code and the text, e.g.
 http://toolserver.org/mytool.php?lang=fr&text=gouter

 The tool uses a cookie, that remembers that this
 user has agreed to submit contributions using cc0.
 At the first visit, this question is asked as a
 click-through license.

 The user is now prompted with the text (from the URL)
 and recording starts when pressing a button. The
 user says the word, and presses the button again.
 The tool saves the OGG sound, uploads it to Commons
 with the filename fr-gouter-XYZ789.ogg and
 the cc0 declaration and all metadata, placing it
 in a category of recorded but unverified words.

 Another user can record the same word, and it will
 be given another random letter-digit code.

 As a separate part of the tool, other volunteers are
 asked to verify or rate (1 to 5 stars) the recordings
 available in a given language. The rating is stored
 as categories on commons.

 Now, a separate procedure (manual or a bot job) can
 pick words that need new or improved recordings,
 and list them (with links to the tool) on a normal
 wiki page.

 I know HTML supports uploading of a file, but I don't
 know how to solve the recording of sound directly to
 a web service. Perhaps this could be a Skype application?
 I have no idea. Please just be creative. It should be
 solvable, because this is 2013 and not 2003.


 --
Lars Aronsson (l...@aronsson.se)
Aronsson Datateknik - http://aronsson.se




It was solvable with a java applet (or flash, but that's usually
considered evil) back in 2003. However it still requires someone to
actually do it.

With modern web browsers, you can do it with html5/webRTC [1].

Someone could probably make an extension that integrates with
MediaWiki, so all a user has to do is go to Special:RecordAudio and they
could record/upload from there. Perhaps that would make a good GSoC
project (not sure if the scope is big enough, but one could probably add
stuff like a slick UI to make it big enough).

[1] http://www.html5rocks.com/en/tutorials/getusermedia/intro/


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Tyler Romeo
On Tue, Mar 12, 2013 at 9:29 PM, Brian Wolff bawo...@gmail.com wrote:

 It was solvable with a java applet (or flash, but that's usually
 considered evil) back in 2003. However it still requires someone to
 actually do it.


For security purposes, I'm really hoping we don't plan on using a Java
applet. :P

*--*
*Tyler Romeo*
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com

Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Matthew Flaschen
On 03/12/2013 09:01 PM, Lars Aronsson wrote:
 Provide a tool on the toolserver, or any other
 server, having a simple link syntax that specifies
 the language code and the text, e.g.
 http://toolserver.org/mytool.php?lang=fr&text=gouter

Good idea, though I agree with Brian that a special page would be preferable.

 The tool uses a cookie, that remembers that this
 user has agreed to submit contributions using cc0.
 At the first visit, this question is asked as a
 click-through license.

Why CC0 (public domain)?  Your example
(http://commons.wikimedia.org/wiki/File:Fr-go%C3%BBter.ogg) is CC-BY,
which is not public domain and requires attribution (which I think all
Wikimedia projects do for text).  I'd say CC-BY-SA or CC-BY would be a
better default.

Matt Flaschen


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Brian Wolff
On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
 On Tue, Mar 12, 2013 at 9:29 PM, Brian Wolff bawo...@gmail.com wrote:

 It was solvable with a java applet (or flash, but that's usually
 considered evil) back in 2003. However it still requires someone to
 actually do it.


 For security purposes, I'm really hoping we don't plan on using a Java
 applet. :P

 *--*
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com

Why? There's nothing inherently insecure about java applets. We
already use them to play ogg files on lame browsers that don't support
html5.

--bawolff


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Tyler Romeo
On Mar 12, 2013 10:08 PM, Brian Wolff bawo...@gmail.com wrote:

 On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
  On Tue, Mar 12, 2013 at 9:29 PM, Brian Wolff bawo...@gmail.com wrote:
 
  It was solvable with a java applet (or flash, but that's usually
  considered evil) back in 2003. However it still requires someone to
  actually do it.
 
 
  For security purposes, I'm really hoping we don't plan on using a Java
  applet. :P
 
  *--*
  *Tyler Romeo*
  Stevens Institute of Technology, Class of 2015
  Major in Computer Science
  www.whizkidztech.com | tylerro...@gmail.com

 Why? There's nothing inherently insecure about java applets. We
 already use them to play ogg files on lame browsers that don't support
 html5.

 --bawolff


Can you say that for sure? With the number of exploits in Java over the
past few months, everybody I know has already disabled their browser plugin.

--Tyler Romeo

Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Brian Wolff
On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
 On Mar 12, 2013 10:08 PM, Brian Wolff bawo...@gmail.com wrote:

 On 3/12/13, Tyler Romeo tylerro...@gmail.com wrote:
  On Tue, Mar 12, 2013 at 9:29 PM, Brian Wolff bawo...@gmail.com wrote:
 
  It was solvable with a java applet (or flash, but that's usually
  considered evil) back in 2003. However it still requires someone to
  actually do it.
 
 
  For security purposes, I'm really hoping we don't plan on using a Java
  applet. :P
 
  *--*
  *Tyler Romeo*
  Stevens Institute of Technology, Class of 2015
  Major in Computer Science
  www.whizkidztech.com | tylerro...@gmail.com

 Why? There's nothing inherently insecure about java applets. We
 already use them to play ogg files on lame browsers that don't support
 html5.

 --bawolff


 Can you say that for sure? With the number of exploits in Java over the
 past few months, everybody I know has already disabled their browser plugin.

 --Tyler Romeo

Those types of people will probably have an html5-capable web browser :P

Let me rephrase my previous statement: using Java as a fallback
doesn't introduce any new issues that wouldn't already be there if we
didn't use Java as a fallback (since we'd only fall back to Java if
the user already had it installed). Furthermore, I imagine (or at least
hope) that Oracle fixes the security vulnerabilities of their plugin
as they are discovered.

-bawolff


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread K. Peachey
On Wed, Mar 13, 2013 at 11:29 AM, Brian Wolff bawo...@gmail.com wrote:

 Someone could probably make an extension that integrates with
 MediaWiki, so all user has to do is go to special:recordAudio and they
 could record/upload from there. Perhaps that would make a good gsoc
 project (Not sure if the scope is big enough, but could probably add
 stuff like making a slick ui to make it big enough).


That wouldn't be a bad project for GSoC, as it isn't too large, which means
we could actually see some results. And if it turned out to be too small, the
student could probably do a couple of smaller projects (this being one of
them), focusing on one after the other.

Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Rob Lanphier
Hi Guillaume,

Good point.  Comments below...

On Tue, Mar 12, 2013 at 7:03 AM, Guillaume Paumier
gpaum...@wikimedia.org wrote:
 On Sun, Mar 10, 2013 at 2:11 AM, Rob Lanphier ro...@wikimedia.org wrote:
 Short version: This mail is fishing for feedback on proposed work on
 Gerrit-Bugzilla integration to replace code review tags.

 I preferred that if we were going to have our own hacky solution, it
 should at least be implemented as a Gerrit plugin, so that it would at
 least stand a chance of becoming a well-integrated solution.

 A Bugzilla-based solution would be an ideal replacement for fixme,
 since fixmes are basically bugs anyway.  It would work reasonably well
 for scaptrap, since they generally imply something that needs to be
 done prior to deployment.  It would be an awkward replacement for
 backcompat and others.

 Thank you for this detailed e-mail. One thing I think I'm missing is
 why the bugzilla-based solution is better than the gerrit plugin one.

The Bugzilla-based solution has some of the advantages of the
MediaWiki-based solution.  We may be able to implement it more quickly
than something native to Gerrit because we're already working on
Bugzilla integration, and we get features like queries for free, as
well as the minor convenience of not having to have a new database
table or two to manage.  It may be useful for Chad and/or Christian to
weigh in on this point.

Workflow advantages:
*  We're already managing Bugzilla (Andre is full-time on it), so
managing tags is straightforward.
*  We get all sorts of extra management functions, such as assignee
tracking in BZ, whereas fixmes, for example, only have an implicit,
unchangeable assignee (the committer)

 It seems to me that if the tagging functionality was developed as a
 gerrit plugin, it would have all the advantages of the bugzilla-based
 solution (good integration, etc.) without its drawbacks (awkwardness
 for non-bugs tags, e-mail addresses mismatches, dependency on
 bugzilla).

True.  They may not be mutually exclusive, but a question of ordering
and priority.  A BZ-based solution can potentially be our short-term
solution, while we still continue to prod upstream on a better
long-term solution.  We may also still decide that both features are
useful enough to implement them both ourselves.

The nice thing about a BZ-based solution is that we will probably
still want to keep using it for some things after there's a proper
upstream solution.  Perhaps we'll also grow attached to a homegrown
tagging solution for Gerrit, but it seems less likely we'll want to
keep it unless the Gerrit devs never get around to implementing
tagging in core.

 Admittedly, I'm not a primary user of gerrit, but I've been pondering
 the idea of using tags in order to surface noteworthy changes, so they
 can be easily listed and communicated about to our users. This would
 make it much easier to identify the most important changes on pages
 like https://www.mediawiki.org/wiki/MediaWiki_1.21/wmf7 , and could
 also perhaps be used for release notes summaries.

 A bugzilla-based tagging system seems too restrictive for this kind of
 use, but perhaps I'm just not seeing how it would work. It's difficult
 to predict the kinds of tags people will come up with in the future,
 and I feel it would be a pity to develop a tagging solution that
 restricts the type of tags you can use with it.

I think we probably just need to be accommodating for more uses of
Bugzilla than narrowly using it for bugs only.  I think these would be
perfectly fine to track in Bugzilla.

If the problem is with the rigidity of keywords, one thing I should
note is that, in addition to tagging, there's also the whiteboard in
Bugzilla, which I believe you can use for free-form text that you can
query on.  I believe Andre only enabled that in recent months, so it may
not have been available the last time you went looking, and it's a
feature we don't yet use for much (or anything?)
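As an illustration of querying such free-form whiteboard tags: the parameter names below follow stock Bugzilla's buglist.cgi advanced search, and the tag "scaptrap" is just the example from this thread. A minimal sketch, not a confirmed description of the Wikimedia Bugzilla setup:

```python
from urllib.parse import urlencode

BASE = "https://bugzilla.wikimedia.org/buglist.cgi"

def whiteboard_query_url(term):
    """Build a buglist URL for bugs whose status whiteboard contains the
    given free-form term (e.g. a deployment tag such as 'scaptrap')."""
    params = {
        "status_whiteboard_type": "allwordssubstr",  # match all words as substrings
        "status_whiteboard": term,
    }
    return BASE + "?" + urlencode(params)

print(whiteboard_query_url("scaptrap"))
```

Since the whiteboard is a plain text field, anything that can build a URL can produce a saved "tag query" like this.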

Part of why I sent my proposal to the list is because I'm not too
wedded to a Bugzilla-based solution.  The main things I wanted to
check were:
1.  Should we tackle Gerrit-Bugzilla bug filing as a higher priority
than a Gerrit tagging system?  (My original assumption is yes, but
this is the point I'm most malleable on)
2.  Is a MediaWiki-based solution with some JS integration to Gerrit an
acceptable alternative to building a Gerrit-only plugin?  (My
assumption is no)

Rob


Re: [Wikitech-l] Replacement for tagging in Gerrit

2013-03-12 Thread Rob Lanphier
On Tue, Mar 12, 2013 at 10:14 AM, K. Peachey p858sn...@gmail.com wrote:
 On Wed, Mar 13, 2013 at 2:43 AM, Greg Grossmeier g...@wikimedia.org wrote:

 4) A Gerrit-based tagging plugin would need some engineering that might
 not be apparent at first blush, for example: who can set tags and remove
 them? Does that vary by tag? How could I, for example, keep track of all
 scaptrap-type things and be sure I don't miss something because someone
 mistakenly removed the scaptrap tag from a merge and I didn't
 notice/remember (ie: can I *trust* the tag, both what is there and what
 isn't, so I don't have to remember everything/double check every week?).


 I don't think we need to be that granular with the tagging permissions. In
 CR we used to just have it so you needed the Coder right, and I'm not aware
 of many instances where tagging was abused. You just need to have some sort
 of record of who added/removed the tags.

 Although there were instances where people forgot to tag changes, that
 is unavoidable no matter what level(s) of permissions you have.

While I agree that we can model our permissions after what we did with
the old SVN/MediaWiki code review system, I'll note that there were
some other features that made this more complicated under the hood,
such as keeping a history of who tagged what.  It's all stuff we've
done before in MediaWiki, which is likely part of the reason Chad was
tempted to do it there, but doing it in Gerrit may end up being more
complicated.
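For illustration only (this is not how Gerrit or the old CR extension actually stored tags): the "history of who tagged what" is typically kept as an append-only event log, from which the current tag set is replayed. A minimal sketch:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class TagEvent:
    change_id: int   # Gerrit change number
    tag: str         # e.g. "scaptrap", "fixme", "backcompat"
    user: str        # who performed the action
    added: bool      # True = tag added, False = tag removed
    when: datetime

def current_tags(events, change_id):
    """Replay the append-only log to compute a change's current tag set.
    The log itself is the full audit trail of who tagged what, and when."""
    tags = set()
    for e in sorted(events, key=lambda e: e.when):
        if e.change_id == change_id:
            (tags.add if e.added else tags.discard)(e.tag)
    return tags
```

With this shape, "can I trust the tag?" becomes answerable: a mistaken removal is still in the log, so it can be audited and reinstated.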

Rob


Re: [Wikitech-l] Pronunciation recording tool wanted

2013-03-12 Thread Antoine Musso
Le 13/03/13 04:07, K. Peachey wrote:
 That wouldn't be a bad project for GSoC, as it isn't too large, which means
 we could actually see some results. And if it turned out to be too small, the
 student could probably do a couple of smaller projects (this being one of
 them), then focus on them one after the other.

The smaller big project: get its code deployed on the cluster and
enabled for all wikis!

-- 
Antoine hashar Musso



Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Risker
On 12 March 2013 21:15, Ryan Lane rlan...@gmail.com wrote:

 On Tue, Mar 12, 2013 at 4:47 PM, Brion Vibber br...@pobox.com wrote:

  Thanks guys!
 
  -- brion
 
 
Don't thank us too quickly. We needed to revert this for mobile. Seems
*.m.wikipedia.org was also missing from the cert. Needless to say I'll be
 writing a script that can be run against a cert to ensure it's not missing
 anything. We'll also be adding monitoring to check for invalid certificates
 for any top level domain.


I think it might also be missing some of the small/private wikis.  I got
bad certificate messages for the English Wikipedia Arbcom wiki tonight.

But thanks for working on this.

Risker/Anne

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
You mean: https://arbcom.en.wikipedia.org ?

Our certificates have never covered that. That's a sub-subdomain, and our
certs only cover single subdomains. We really need to rename all of our
sub-subdomains to single subdomains for them to be covered (or we need to
include every sub-subdomain in the unified cert, but that's going to bloat
it).
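Ryan's point about wildcard coverage, and the cert-checking script mentioned earlier in the thread, can be sketched as follows. This is a simplified illustration (real certificate validation follows RFC 6125 and has more corner cases), and the SAN list here is hypothetical:

```python
def wildcard_matches(pattern, hostname):
    """RFC 6125-style check: '*' stands in for exactly one DNS label,
    and only in the left-most position."""
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    if len(p_labels) != len(h_labels):
        return False  # a wildcard never spans multiple labels
    if p_labels[0] != "*" and p_labels[0] != h_labels[0]:
        return False
    return p_labels[1:] == h_labels[1:]

def uncovered(sans, required):
    """Report which required hostnames no SAN entry covers."""
    return [h for h in required if not any(wildcard_matches(s, h) for s in sans)]

# Hypothetical SAN list, for illustration only:
sans = ["wikipedia.org", "*.wikipedia.org", "*.m.wikipedia.org"]
print(uncovered(sans, ["en.wikipedia.org",
                       "en.m.wikipedia.org",
                       "arbcom.en.wikipedia.org"]))
# → ['arbcom.en.wikipedia.org']
```

This is why *.wikipedia.org covers en.wikipedia.org but can never cover arbcom.en.wikipedia.org: the wildcard matches exactly one label.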


On Tue, Mar 12, 2013 at 9:18 PM, Risker risker...@gmail.com wrote:

 I think it might also be missing some of the small/private wikis.  I got
 bad certificate messages for the English Wikipedia Arbcom wiki tonight.

 But thanks for working on this.

 Risker/Anne


Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Risker
Yes, that's the wiki I mean.  And I can see your point about all those
sub-subdomains; there must be a stack of them.  The domain name was changed
fairly recently and we got the bad cert messages then, and added our
exceptions. Tonight we got the messages again.  Perhaps it was because the
subdomain's cert changed.

Risker/Anne

On 13 March 2013 00:30, Ryan Lane rlan...@gmail.com wrote:

 You mean: https://arbcom.en.wikipedia.org ?

 Our certificates have never covered that. That's a sub-sub domain, and our
 certs only cover single subdomains. We really need to rename all of our
 sub-sub domains to single subdomains for them to be covered (or we need to
 include every sub-subdomain in the unified cert, but that's going to bloat
 it).



[Wikitech-l] Wikimedia engineering February 2013 report

2013-03-12 Thread Sumana Harihareswara
Hi,

The report covering Wikimedia engineering activities (including upcoming
events, new hires, and open positions) in February 2013 is
now available.  Thanks to Guillaume Paumier, Priyanka Nag, Tilman Bayer
and the engineers who helped put this together.

Wiki version:
https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2013/February
Blog version:
https://blog.wikimedia.org/2013/03/12/engineering-february-2013-report/

We're also providing a shorter, simpler and translatable version of this
report that does not assume specialized technical knowledge:
https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2013/February/summary

Below is the full HTML text of the report, as previously requested.

As always, feedback is appreciated about the usefulness of the report
and its summary, and on how to improve them.

--


Major news in February include:

   - The Wikipedia Zero project got a Knight News Challenge grant:
     https://blog.wikimedia.org/2013/02/22/getting-wikipedia-to-the-people-who-need-it-most/
   - Additional input methods were made available for jQuery.IME:
     https://blog.wikimedia.org/2013/02/20/report-from-the-spring-2013-open-source-language-summit/
   - The Translate extension introduced a new iteration of the Translation Editor:
     https://blog.wikimedia.org/2013/02/15/inching-towards-enabling-our-improvements-to-the-translation-user-experience/
   - The Wikimedia mobile web team launched the ability to view or add pages
     to the watchlist, all from mobile devices:
     https://blog.wikimedia.org/2013/02/13/follow-your-favorite-wikipedia-pages-on-the-mobile-web/
   - Echo, a new notification system for Wikipedia, was announced:
     https://blog.wikimedia.org/2013/02/07/echo-a-new-notification-system-for-wikipedia/
   - The Technical Operations team found ways to stop problems in their tracks:
     https://blog.wikimedia.org/2013/02/05/how-the-technical-operations-team-stops-problems-in-their-tracks/
   - Wikipedia Mobile hit 3 billion monthly page views:
     https://blog.wikimedia.org/2013/02/01/wikipedia-mobile-hits-3-billion-monthly-page-views/

Note: We're also providing a shorter, simpler and translatable version of this
report that does not assume specialized technical knowledge:
https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2013/February/summary
Upcoming events

There are many opportunities for you to get involved and contribute to
MediaWiki and technical activities to improve Wikimedia sites, both for
coders and contributors with other talents.

For a more complete and up-to-date list, check out the Project:Calendar.
   - Mar 7: QA: General MediaWiki reports Bug Triage
     (https://www.mediawiki.org/wiki/Bug_management/Triage/20130307).
     Contacts: AKlapper (https://www.mediawiki.org/wiki/User:AKlapper_%28WMF%29),
     Valeriej (https://www.mediawiki.org/wiki/User:Valeriej)
   - Mar 13: Increase the backlog: Given/When/Then explained, with examples
     from Search tests and suggestions for more
     (https://www.mediawiki.org/wiki/QA/Browser_testing/Increase_the_backlog).
     Contacts: Zeljko.filipin (https://www.mediawiki.org/wiki/User:Zeljko.filipin%28WMF%29),
     Qgil (https://www.mediawiki.org/wiki/User:Qgil),
     Cmcmahon (https://www.mediawiki.org/wiki/User:Cmcmahon%28WMF%29)
   - Mar 18: QA: LiquidThreads (LQT) Bug Triage
     (https://www.mediawiki.org/wiki/Bug_management/Triage/20130318).
     Contacts: AKlapper (https://www.mediawiki.org/wiki/User:AKlapper_%28WMF%29),
     Valeriej (https://www.mediawiki.org/wiki/User:Valeriej)
   - Mar 19: Office hour about Wikimedia's issue tracker
     (https://www.mediawiki.org/wiki/Bugzilla) and Bug management
     (https://www.mediawiki.org/wiki/Bug_management) in #wikimedia-office
     (connect: http://webchat.freenode.net/?channels=#wikimedia-office).
     Contact: AKlapper (https://www.mediawiki.org/wiki/User:AKlapper_%28WMF%29)
   - Mar 20: SMWCon Spring 2013
     (http://semantic-mediawiki.org/wiki/SMWCon_Spring_2013), New York City, USA
   - Mar 22: LibrePlanet (https://www.mediawiki.org/wiki/Events/LibrePlanet2013),
     Cambridge, MA, USA
   - Mar 25: QA: Collaborate with Weekend Testers Americas: test new tools for
     new users (E3). Contacts: Cmcmahon (https://www.mediawiki.org/wiki/User:Cmcmahon%28WMF%29),
     Qgil (https://www.mediawiki.org/wiki/User:Qgil)

Personnel

Work with us: https://wikimediafoundation.org/wiki/Work_with_us

Are you looking to work for Wikimedia? We have a lot of hiring coming up,
and we really love talking to active community members about these roles.

   - Software Engineer – Editor Engagement:
     http://hire.jobvite.com/Jobvite/Job.aspx?j=ovvXWfwD
   - Software Engineer – Parser:
     http://hire.jobvite.com/Jobvite/Job.aspx?j=oIsbXfw2
   - Software Engineer – Apps:
     http://hire.jobvite.com/Jobvite/Job.aspx?j=oqU0Wfw0
   - Software Engineer – Mobile:
     http://hire.jobvite.com/Jobvite/Job.aspx?j=o4cKWfwG
   - Software Engineer – Multimedia Systems:
     http://hire.jobvite.com/Jobvite/Job.aspx?j=oj40Wfw3
   - Software Engineer – Multimedia User Interfaces:
     http://hire.jobvite.com/Jobvite/Job.aspx?j=ohqbXfwz
   - Software Engineer –

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
On Tue, Mar 12, 2013 at 9:45 PM, Risker risker...@gmail.com wrote:

 Yes, that's the wiki I mean.  And I can see your point about all those
 sub-subdomains; there must be a stack of them.  The domain name was changed
 fairly recently and we got the bad cert messages then, and added our
 exceptions. Tonight we got the messages again.  Perhaps it was because the
 subdomain's cert changed.


Ah. Yes. You're likely to get another round of messages when we get the new
cert in as well. All of the arbcom wikis and other sub-subdomain wikis are
in the same cluster, use the same IP addresses, and as such use the same
certificates. I wonder if we have a bug filed about renaming sub-subdomain
wikis...

- Ryan

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Jeremy Baron
On Wed, Mar 13, 2013 at 12:50 AM, Ryan Lane rlan...@gmail.com wrote:
 I wonder if we have a bug filed about renaming sub-subdomain
 wikis...

Let's use Bug 31335 - https://bugzilla.wikimedia.org/31335

-Jeremy


Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Risker
On 13 March 2013 00:50, Ryan Lane rlan...@gmail.com wrote:

 Ah. Yes. You're likely to get another round of messages when we get the new
 cert in as well. All of the arbcom wikis, and other sub-subdomain wikis are
 in the same cluster and use the same IP addresses and as such use the same
 certificates. I wonder if we have a bug in about renaming sub-subdomain
 wikis



Well, if you need to rename the wiki again, a bit of notice would be
appreciated; it was rather a shock when folks went to log in using
bookmarks, only to find they no longer worked. We still haven't finished
cleaning up links. :-)

But as to security certificates, I do want to thank the team - this does
make a difference for a lot of users.

Risker/Anne

Re: [Wikitech-l] New unified SSL certificate deployed

2013-03-12 Thread Ryan Lane
On Tue, Mar 12, 2013 at 10:06 PM, Risker risker...@gmail.com wrote:


 Well, if you need to rename the wiki again, a bit of notice would be
 appreciated; it was rather a shock when folks went to log in using
 bookmarks, only to find they no longer worked. We still haven't finished
 cleaning up links. :-)


Hm. Kind of annoying that they were renamed, but weren't renamed to
something that would solve the certificate issues. :(

If we rename them to solve the certificate issues, I'll make sure the
community is notified well in advance.


 But as to security certificates, I do want to thank the team - this does
 make a difference for a lot of users.


Great. Glad to hear it!

- Ryan