Re: [Wikitech-l] wmf-deployment branch

2009-07-14 Thread Gerard Meijssen
Hoi,
When the updates for the live Wikimedia sites have to go into this new
branch, what is then the status of trunk? Will all the updates *have* to go
into trunk as well, and will they? Or will trunk suffer from bit rot, with
everyone consequently working on this new branch? What will be the status
of the extensions NOT used by the WMF? When an update to a new release
happens, will trunk and this new branch be merged, and how will the effects
on extensions be assessed?

How will this work?
Thanks,
 GerardM


2009/7/14 Brion Vibber br...@wikimedia.org

 Since we seem to routinely have our deployment linger a bit behind trunk
 but still need to make updates on it, I've gone ahead and created a
 branch with our live versions of MediaWiki & extensions on it:

 http://svn.wikimedia.org/svnroot/mediawiki/branches/wmf-deployment

 Misc coders: please don't commit to it or we break you. :)


 With Subversion 1.5 or later on the client, merging updates from trunk
 should be easier, as SVN now saves merge info into properties and can
 more intelligently figure out where you last left off. For instance to
 update the UsabilityInitiative extension to the current version from trunk:

 cd extensions/UsabilityInitiative
 svn merge \
   svn+ssh://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/UsabilityInitiative
 svn commit


 The theoretical workflow is:
 * Merge updates that need to go live into branches/wmf-deployment
 * Test it locally & on staging server (once we've got a nicer staging
 area set up)
 * Update deployment master: svn up
 * Poke live test on test.wikipedia.org
 * Deploy updates to all servers: scap


 For the moment I've applied the various 'svn up' and manually merged
 live patches from our live deployment, with the exception of any
 _manually added_ files. There's a lot of cruft and I'm not sure yet
 which of the extra files are ancient one-offs and which we really need. :)

 We'll want to test a fresh checkout on test.wikipedia for a bit before
 syncing live to all wikis.

 -- brion vibber (brion @ wikimedia.org)


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wmf-deployment branch

2009-07-14 Thread Gerard Meijssen
Hoi,
I forgot: how will this affect our localisation and internationalisation?
What version of MediaWiki is it to support? It does support trunk, and imho
that is the right thing. I seriously do not get it.
Thanks,
  GerardM

2009/7/14 Gerard Meijssen gerard.meijs...@gmail.com

 Hoi,
 When the updates for the live Wikimedia sites have to go into this new branch, what
 is then the status of the trunk? Will all the updates *have* to go into
 trunk as well and will they ? Or will trunk suffer from bit rot and
 consequently will everyone work on this new branch ? What will be the status
 of the extensions NOT used by the WMF ?? When an update to a new release
 happens, will trunk and this new branch be merged and how will the effects
 on extensions be assessed ???

 How will this work ?
 Thanks,
  GerardM


 2009/7/14 Brion Vibber br...@wikimedia.org

 Since we seem to routinely have our deployment linger a bit behind trunk
 but still need to make updates on it, I've gone ahead and created a
 branch with our live versions of MediaWiki & extensions on it:

 http://svn.wikimedia.org/svnroot/mediawiki/branches/wmf-deployment

 Misc coders: please don't commit to it or we break you. :)


 With Subversion 1.5 or later on the client, merging updates from trunk
 should be easier, as SVN now saves merge info into properties and can
 more intelligently figure out where you last left off. For instance to
 update the UsabilityInitiative extension to the current version from
 trunk:

 cd extensions/UsabilityInitiative
 svn merge \
   svn+ssh://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/UsabilityInitiative
 svn commit


 The theoretical workflow is:
 * Merge updates that need to go live into branches/wmf-deployment
 * Test it locally & on staging server (once we've got a nicer staging
 area set up)
 * Update deployment master: svn up
 * Poke live test on test.wikipedia.org
 * Deploy updates to all servers: scap


 For the moment I've applied the various 'svn up' and manually merged
 live patches from our live deployment, with the exception of any
 _manually added_ files. There's a lot of cruft and I'm not sure yet
 which of the extra files are ancient one-offs and which we really need. :)

 We'll want to test a fresh checkout on test.wikipedia for a bit before
 syncing live to all wikis.

 -- brion vibber (brion @ wikimedia.org)




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wmf-deployment branch

2009-07-14 Thread Chad
On Tue, Jul 14, 2009 at 7:30 AM, Gerard
Meijssengerard.meijs...@gmail.com wrote:
 Hoi,
 When the updates for the live Wikimedia sites have to go into this new branch, what
 is then the status of the trunk? Will all the updates *have* to go into
 trunk as well and will they ? Or will trunk suffer from bit rot and
 consequently will everyone work on this new branch ? What will be the status
 of the extensions NOT used by the WMF ?? When an update to a new release
 happens, will trunk and this new branch be merged and how will the effects
 on extensions be assessed ???

 How will this work ?
 Thanks,
     GerardM


 2009/7/14 Brion Vibber br...@wikimedia.org

 Since we seem to routinely have our deployment linger a bit behind trunk
 but still need to make updates on it, I've gone ahead and created a
 branch with our live versions of MediaWiki & extensions on it:

 http://svn.wikimedia.org/svnroot/mediawiki/branches/wmf-deployment

 Misc coders: please don't commit to it or we break you. :)


 With Subversion 1.5 or later on the client, merging updates from trunk
 should be easier, as SVN now saves merge info into properties and can
 more intelligently figure out where you last left off. For instance to
 update the UsabilityInitiative extension to the current version from trunk:

 cd extensions/UsabilityInitiative
 svn merge \
   svn+ssh://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/UsabilityInitiative
 svn commit


 The theoretical workflow is:
 * Merge updates that need to go live into branches/wmf-deployment
 * Test it locally & on staging server (once we've got a nicer staging
 area set up)
 * Update deployment master: svn up
 * Poke live test on test.wikipedia.org
 * Deploy updates to all servers: scap


 For the moment I've applied the various 'svn up' and manually merged
 live patches from our live deployment, with the exception of any
 _manually added_ files. There's a lot of cruft and I'm not sure yet
 which of the extra files are ancient one-offs and which we really need. :)

 We'll want to test a fresh checkout on test.wikipedia for a bit before
 syncing live to all wikis.

 -- brion vibber (brion @ wikimedia.org)



This branch isn't where main development goes. Brion clearly laid
out that non-WMF people (i.e. me and most of the other volunteer
devs) shouldn't touch this branch. All main development continues
on trunk; I don't know why you'd think it would rot.

The main purpose of this branch is to provide a clearly visible snapshot
of what is running on the WMF cluster. This is never HEAD from trunk,
it's always a bit behind. Some critical bugfixes and hacks get pushed
in. Now we can all see when this happens, instead of just sysadmins.

I would imagine this new branch would be updated A) whenever code
has been reviewed and needs merging for WMF deployment, and B) whenever
some hotfix is put in place.

Some WMF hacks make it into trunk code because they're generally
helpful. Some do not.

-Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] wmf-deployment branch

2009-07-14 Thread Gerard Meijssen
Hoi,
Almost every day the localisations from translatewiki.net go into trunk. This
new branch cannot just take these localisations from trunk SVN, because they
can be different. The key question for me is how this is going to work; I
do not see it. The LocalisationUpdate extension might do the trick ...
Thanks,
  GerardM

2009/7/14 Chad innocentkil...@gmail.com

 On Tue, Jul 14, 2009 at 7:30 AM, Gerard
 Meijssengerard.meijs...@gmail.com wrote:
  Hoi,
  When the updates for the live Wikimedia sites have to go into this new branch,
 what
  is then the status of the trunk? Will all the updates *have* to go into
  trunk as well and will they ? Or will trunk suffer from bit rot and
  consequently will everyone work on this new branch ? What will be the
 status
  of the extensions NOT used by the WMF ?? When an update to a new release
  happens, will trunk and this new branch be merged and how will the
 effects
  on extensions be assessed ???
 
  How will this work ?
  Thanks,
  GerardM
 
 
  2009/7/14 Brion Vibber br...@wikimedia.org
 
  Since we seem to routinely have our deployment linger a bit behind trunk
  but still need to make updates on it, I've gone ahead and created a
  branch with our live versions of MediaWiki & extensions on it:
 
  http://svn.wikimedia.org/svnroot/mediawiki/branches/wmf-deployment
 
  Misc coders: please don't commit to it or we break you. :)
 
 
  With Subversion 1.5 or later on the client, merging updates from trunk
  should be easier, as SVN now saves merge info into properties and can
  more intelligently figure out where you last left off. For instance to
  update the UsabilityInitiative extension to the current version from
 trunk:
 
  cd extensions/UsabilityInitiative
  svn merge \
    svn+ssh://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/UsabilityInitiative
  svn commit
 
 
  The theoretical workflow is:
  * Merge updates that need to go live into branches/wmf-deployment
  * Test it locally & on staging server (once we've got a nicer staging
  area set up)
  * Update deployment master: svn up
  * Poke live test on test.wikipedia.org
  * Deploy updates to all servers: scap
 
 
  For the moment I've applied the various 'svn up' and manually merged
  live patches from our live deployment, with the exception of any
  _manually added_ files. There's a lot of cruft and I'm not sure yet
  which of the extra files are ancient one-offs and which we really need.
 :)
 
  We'll want to test a fresh checkout on test.wikipedia for a bit before
  syncing live to all wikis.
 
  -- brion vibber (brion @ wikimedia.org)
 
 

 This branch isn't where main development goes. Brion clearly laid
 out that non-WMF people (ie me and most of the other volunteer
 devs) shouldn't touch this branch. All main development continues
 on trunk, I don't know why you'd think it would rot.

 The main purpose of this branch is to provide a clearly visible snapshot
 of what is running on the WMF cluster. This is never HEAD from trunk,
 it's always a bit behind. Some critical bugfixes and hacks get pushed
 in. Now we can all see when this happens, instead of just sysadmins.

 I would imagine this new branch would be updated whenever A) code
 has been reviewed and needs merging to WMF and B) whenever some
 hotfix was put in place.

 Some WMF hacks make it into trunk code because they're generally
 helpful. Some do not.

 -Chad


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wmf-deployment branch

2009-07-14 Thread Roan Kattouw
2009/7/14 Gerard Meijssen gerard.meijs...@gmail.com:
 Hoi,
 Almost every day the localisations from translatewiki.net into trunk. This
 new branch cannot just these localisations from the trunk SVN because they
 can be different. The key question for me is how is this going to work. I
 do not see it. The LocalisationUpdate extension might do the trick ...
Using LocalisationUpdate is probably best. Please note that the
intention of the new branch is to *reflect* the state of the code
running live and to make it easier for ops people to push stuff live
(mess around in the branch, commit, update on the servers). The policy
concerning pushing changes *will not change* unless a senior dev says
so; it is not affected by the introduction of this branch in any way.
The process is merely simplified.

Roan Kattouw (Catrope)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] wmf-deployment branch

2009-07-14 Thread Brion Vibber
Gerard Meijssen wrote:
 Hoi,
 When the updates for the live Wikimedia sites have to go into this new branch, what
 is then the status of the trunk? Will all the updates *have* to go into
 trunk as well and will they ? Or will trunk suffer from bit rot and
 consequently will everyone work on this new branch ? What will be the status
 of the extensions NOT used by the WMF ?? When an update to a new release
 happens, will trunk and this new branch be merged and how will the effects
 on extensions be assessed ???

There is *ABSOLUTELY NO* difference from what has been going on 
previously, except that the state of our deployment checkout will 
actually be reproducible and trackable instead of mysterious and opaque.

As this is information you frequently ask us to be more open about, I 
expect you should be happy about it. :)

Here's how it works:

1) ALL DEVELOPMENT IS DONE ON TRUNK

2) The wmf-deployment branch contains the snapshot of trunk (plus 
partial upgrades/downgrades or other temporary hacks) that we're running 
live.

Previously the state of 2) was hidden in our working directory on the
servers and could not easily be reproduced when we were in between
complete pure syncs from trunk.
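
As a rough sketch of that cycle (illustrative only; the revision number,
the phase3/ layout of the branch, and the exact commands on the deployment
master are assumptions, not a statement of how we actually run it):

  # a fix lands on trunk as, say, r52900; merge just that revision into
  # the deployment branch
  cd branches/wmf-deployment/phase3
  svn merge -c 52900 \
    svn+ssh://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3 .
  svn commit -m "Merge r52900 from trunk for deployment"

  # on the deployment master: pull the branch change, poke it on
  # test.wikipedia.org, then push everywhere
  svn up
  scap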

-- brion


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread dan nessett

Can anyone tell me which of the parser tests are supposed to fail? Also, is 
there a trunk version for which only these tests fail?

--- On Fri, 7/10/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 From: Aryeh Gregor simetrical+wikil...@gmail.com
 Subject: Re: [Wikitech-l] Is this the right list to ask questions about 
 parserTests
 To: Wikimedia developers wikitech-l@lists.wikimedia.org
 Date: Friday, July 10, 2009, 3:49 PM
 On Fri, Jul 10, 2009 at 6:35 PM, dan nessettdness...@yahoo.com wrote:
  I don't want to irritate people by asking inappropriate questions on this
  list. So please direct me to the right list if this is the wrong one for
  this question.
 
  I ran parserTests and 45 tests failed. The result was:
 
  Passed 559 of 604 tests (92.55%)... 45 tests failed!
 
  I expect this indicates a problem, but sometimes test suites are set up
  so certain tests fail. Is this result good or bad?
 
 We usually have about 14 failures.  We should really be able to mark them
 as expected, but our testing framework doesn't support that at the moment.
 The current workaround is to use --record and --compare, but that's a pain
 for a few reasons.
 
 I get 49 test failures.  It looks like someone broke a lot of stuff.
 It happens; frankly, we don't take testing too seriously right now.
 There are no real automated warnings.  Brion used to have a bot post
 parser test results daily to wikitech-l, but that was discontinued.
 So people tend to break parser tests without noticing.
 
 


  

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread Aryeh Gregor
On Tue, Jul 14, 2009 at 5:16 PM, dan nessettdness...@yahoo.com wrote:
 Can anyone tell me which of the parser tests are supposed to fail? Also, is 
 there a trunk version for which only these tests fail?

These are the perpetual failures:

  13 still FAILING test(s) :(
  * Table security: embedded pipes
    (http://lists.wikimedia.org/mailman/htdig/wikitech-l/2006-April/022293.html)
    [Has never passed]
  * Link containing double-single-quotes '' (bug 4598)  [Has never passed]
  * HTML bullet list, unclosed tags (bug 5497)  [Has never passed]
  * HTML ordered list, unclosed tags (bug 5497)  [Has never passed]
  * HTML nested bullet list, open tags (bug 5497)  [Has never passed]
  * HTML nested ordered list, open tags (bug 5497)  [Has never passed]
  * Inline HTML vs wiki block nesting  [Has never passed]
  * dt/dd/dl test  [Has never passed]
  * Images with the | character in the comment  [Has never passed]
  * Bug 6200: paragraphs inside blockquotes (no extra line breaks)  [Has never passed]
  * Bug 6200: paragraphs inside blockquotes (extra line break on open)  [Has never passed]
  * Bug 6200: paragraphs inside blockquotes (extra line break on close)  [Has never passed]
  * Bug 6200: paragraphs inside blockquotes (extra line break on open and close)  [Has never passed]

r51509 is a revision on which they're the only failures, but it's
pretty old (there's probably a somewhat more recent one).  The
breakage looks like it occurred in r52213 and r52726, according to

git bisect start trunk `git svn find-rev r51509` && git bisect run \
    php phase3/maintenance/parserTests.php --regex 'Section headings with TOC'
git bisect start trunk `git svn find-rev r51509` && git bisect run \
    php phase3/maintenance/parserTests.php --regex 'references after gallery'
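# (git bisect run marks each revision good or bad from the command's exit
# status, so this presumably relies on parserTests.php exiting non-zero
# when the named test fails.)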

(yay git!).

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread dan nessett

Thanks. From your response I'm not sure if these tests are supposed to fail 
(there are test suites that have tests like that) or they are supposed to 
succeed but there are bugs in the parser or other code that cause them to fail. 
Can you clarify?

--- On Tue, 7/14/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 From: Aryeh Gregor simetrical+wikil...@gmail.com
 Subject: Re: [Wikitech-l] Is this the right list to ask questions about 
 parserTests
 To: Wikimedia developers wikitech-l@lists.wikimedia.org
 Date: Tuesday, July 14, 2009, 3:40 PM
 On Tue, Jul 14, 2009 at 5:16 PM, dan nessettdness...@yahoo.com wrote:
  Can anyone tell me which of the parser tests are supposed to fail? Also,
  is there a trunk version for which only these tests fail?
 
 These are the perpetual failures:
 
   13 still FAILING test(s) :(
   * Table security: embedded pipes
     (http://lists.wikimedia.org/mailman/htdig/wikitech-l/2006-April/022293.html)
     [Has never passed]
   * Link containing double-single-quotes '' (bug 4598)  [Has never passed]
   * HTML bullet list, unclosed tags (bug 5497)  [Has never passed]
   * HTML ordered list, unclosed tags (bug 5497)  [Has never passed]
   * HTML nested bullet list, open tags (bug 5497)  [Has never passed]
   * HTML nested ordered list, open tags (bug 5497)  [Has never passed]
   * Inline HTML vs wiki block nesting  [Has never passed]
   * dt/dd/dl test  [Has never passed]
   * Images with the | character in the comment  [Has never passed]
   * Bug 6200: paragraphs inside blockquotes (no extra line breaks)  [Has never passed]
   * Bug 6200: paragraphs inside blockquotes (extra line break on open)  [Has never passed]
   * Bug 6200: paragraphs inside blockquotes (extra line break on close)  [Has never passed]
   * Bug 6200: paragraphs inside blockquotes (extra line break on open and close)  [Has never passed]
 
 r51509 is a revision on which they're the only failures, but it's pretty
 old (there's probably a somewhat more recent one).  The breakage looks
 like it occurred in r52213 and r52726, according to
 
 git bisect start trunk `git svn find-rev r51509` && git bisect run \
     php phase3/maintenance/parserTests.php --regex 'Section headings with TOC'
 git bisect start trunk `git svn find-rev r51509` && git bisect run \
     php phase3/maintenance/parserTests.php --regex 'references after gallery'
 
 (yay git!).
 
 


  

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread Aryeh Gregor
On Tue, Jul 14, 2009 at 6:50 PM, dan nessettdness...@yahoo.com wrote:
 Thanks. From your response I'm not sure if these tests are supposed to fail 
 (there are test suites that have tests like that) or they are supposed to 
 succeed but there are bugs in the parser or other code that cause them to 
 fail. Can you clarify?

They're supposed to pass, in theory, but never have.  Someone wrote
the tests and the expected output at some point as a sort of to-do
list.  I don't know why we keep them, since they just confuse
everything and make life difficult.  (Using the --record and --compare
options helps, but they're not that convenient.)  All of them would
require monkeying around with the parser that nobody's willing to do,
since the parser is a hideous mess that no one understands or wants to
deal with unless absolutely necessary.
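
For what it's worth, the --record / --compare workaround mentioned above
looks roughly like this (a sketch only; beyond the fact that the options
exist, their exact behaviour here is an assumption):

  cd phase3
  php maintenance/parserTests.php --record    # store the current results as a baseline
  # ... svn up / hack on the parser ...
  php maintenance/parserTests.php --compare   # report changes against that baseline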

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] The prospects for Theora on the iPhone or iPod Touch

2009-07-14 Thread David Gerard
Forwarded with permission of the author - he emailed this privately
after a comment on the whatwg list.

Summary: Theora video on iPhone is not going to be easy even with a
volunteer to write it - the first and second generation iPhone/iPod
Touch CPUs aren't up to the task. So, Theora fans have a new puzzle to
try: get an anaemic ARM to decode Theora fast enough to be useful!

When I asked if I could forward this here, he said to feel free and also noted:

I've been hanging out in IRC to improve ffmpeg's Theora decoder, but
the ARM11 is going to be especially hard for any Theora decoder due to
the lack of L2 cache and only 16 KB of L1 data cache.


- d.



-- Forwarded message --
From: David Conrad lesse...@gmail.com
Date: 2009/7/14
Subject: Re: [whatwg] HTML 5 video tag questions
To: dger...@gmail.com


Hi David,

On Jul 13, 2009, at 2:09 PM, David Gerard wrote:

 iPhone Safari users (does iPhone Safari support video yet?) are,
 unfortunately, out in the cold until someone writes a Wikimedia client
 app that does Theora for them. That won't be us unless a volunteer
 steps up.

First of all, iPhone Safari does indeed recognize the video tag, but
treats it essentially the same way as an object element, in that it uses
its own controls and plays the video completely separately from the web
page. Of course, not much else really makes sense on a small screen.

I recently investigated how feasible it would be to create a Theora
video player for the iPhone/iPod touch and found the following
shortcomings (targeting an iPod touch 1G):

- libtheora-thusnelda can only get 23 fps on the 640x272 Transformers
trailer used at Dailymotion's HTML5 demo, decoding to /dev/null. Given
the weak SIMD capabilities of the ARM11, I doubt that this could be
sped up by more than 20%, and I think 10% is a more likely upper
bound.
- The only iPhone API for displaying frames that is fast enough for
video is OpenGL ES 1.1, which requires each frame to be converted to
RGB, padded to power-of-two dimensions, and then copied to video memory
with a blocking call. All of this adds significant overhead.
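
(For reference, the per-pixel colour conversion implied by that step is,
in one common full-range BT.601 form -- which exact matrix a player would
use here is an assumption:

  R = Y + 1.402 (Cr - 128)
  G = Y - 0.344 (Cb - 128) - 0.714 (Cr - 128)
  B = Y + 1.772 (Cb - 128)

i.e. a few multiply-adds per pixel on the CPU under OpenGL ES 1.1, versus
work that can move into a fragment shader once ES 2.0 is available.)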

All in all, I think it may not be possible to play Theora much larger
than CIF on the current iPod touch or any iPhone other than the iPhone
3GS. The iPhone 3GS (and likely this year's iPod touch), however, has
a much more powerful Cortex-A8 and also supports OpenGL ES 2.0,
eliminating the need for a CPU YUV->RGB conversion and padding to
power-of-two dimensions. This should be more than sufficient for SD
Theora; the Transformers clip currently decodes at 52 fps to /dev/null
on a BeagleBoard with some NEON optimizations.

So, I'm shelving this for now. I might pick it back up once an iPod
touch with a Cortex-A8 is released this September, but I thought you
might be interested in my findings anyway.

-David

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread dan nessett

Hm. Sounds like an opportunity. How about MediaWiki issuing a grand challenge:
create a well-documented/structured (open source) parser that produces the same
results as the current parser on 98% of Wikipedia pages. The prize is bragging
rights and a letter of commendation from someone or other. I suspect there are
a bunch of graduate students out there who would find the challenge
interesting.

Rationalizing the parser would help the development process. For the 2% of the
pages that fail, challenge others to fix them. The key is not getting stuck in
the "we need a formal syntax" debate. If the challengers want to create a
formal syntax, that is up to them. MediaWiki should only be interested in the
final results.

--- On Tue, 7/14/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
 
 They're supposed to pass, in theory, but never have.  Someone wrote
 the tests and the expected output at some point as a sort of to-do
 list.  I don't know why we keep them, since they just confuse
 everything and make life difficult.  (Using the --record and --compare
 options helps, but they're not that convenient.)  All of them would
 require monkeying around with the parser that nobody's willing to do,
 since the parser is a hideous mess that no one understands or wants to
 deal with unless absolutely necessary.
 
 


  

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread Aryeh Gregor
On Tue, Jul 14, 2009 at 7:36 PM, dan nessettdness...@yahoo.com wrote:
 Hm. Sounds like an opportunity. How about Mediawiki issuing a grand 
 challenge. Create a well-documented/structured (open source) parser that 
 produces the same results as the current parser on 98% of Wikipedia pages. 
 The prize is bragging rights and a letter of commendation from someone or 
 other. I suspect there are a bunch of graduate students out there that would 
 find the challenge interesting.

I suspect nobody's going to stand a chance without funding.

$ cat includes/parser/*.php | wc -l
11064

That's not the kind of thing most people write for an interesting challenge.

Also, you realize that 2% of pages would mean 350,000 pages on the
English Wikipedia alone?  Probably a million pages across all
Wikimedia wikis?  And who knows how many if you include third-party
wikis?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The prospects for Theora on the iPhone or iPod Touch

2009-07-14 Thread Brion Vibber
David Gerard wrote:
 Forwarded with permission of the author - he emailed this privately
 after a comment on the whatwg list.
 
 Summary: Theora video on iPhone is not going to be easy even with a
 volunteer to write it - the first and second generation iPhone/iPodt
 Touch CPUs aren't up to the task. So, Theora fans have a new puzzle to
 try: get an anaemic ARM to decode Theora fast enough to be useful!
 
 When I asked if I could forward this here, he said to feel free and also 
 noted:
 
 I've been hanging out in irc to improve ffmpeg's Theora decoder, but
 arm11 is going to be especially hard for any Theora decoder due to the
 lack of L2 cache and only 16k of L1 data cache.

Thanks for the research & update! Would certainly be interesting to see
if it can be done, even if only on the newer devices.

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] How should I generate static OSM maps for Wikimedia with the SlippyMap extension?

2009-07-14 Thread Brion Vibber
Ævar Arnfjörð Bjarmason wrote:
 Hello there, long time no see:)
 
 In the last few days I've been working on the project of getting
 OpenStreetMap onto Wikimedia as outlined here:

Woohoo!

 Anyway, one thing standing between us and world domination is
 rendering those static maps, I'm going to implement this but first I'd
 like to get comments on *how* we'd like to do it, so I've written a
 plan for doing it:
 
 http://www.mediawiki.org/wiki/Extension:SlippyMap/Static_map_generation

I added a couple quickie notes on the talk page...

It might be good to use a Google Maps Static-like model here; the
MediaWiki end can just emit an img tag and stick all the relevant
info onto the URL, calling out to the static map server.

Having a basic time-based expiration on rendered images is probably fine 
for our needs (and easy to handle, especially if we let the squids deal 
with keeping the static images around!)
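
As a sketch of that model (everything here -- host, path, and parameter
names -- is hypothetical, just to show the shape of such a request):

  # MediaWiki emits an img URL along these lines; the static map server
  # renders it (or serves a cached copy), and the squids cache the response
  # until whatever expiry we choose.
  curl -sI 'http://static-maps.example.org/render?lat=52.52&lon=13.41&zoom=12&size=400x300'
  # expected: 200 OK, Content-Type: image/png, plus a Cache-Control /
  # Expires header long enough for the squids to keep it around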

-- brion

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread dan nessett

Well, it's just an idea. I'm not going to bet my house on its acceptance. But
here are some thoughts on why it might work.

MediaWiki powers an awful lot of wikis, some used by businesses that cannot
afford instability in their operations. It is in their interest to ensure it
remains maintainable. So, they might be willing to provide some funding. In
addition, I'm sure MediaWiki is used by some parts of government (both US
and other countries), so there might be some funding available through those
channels.

As to whether it is an interesting challenge, I agree writing a new parser in
and of itself isn't. But reengineering a heavily used software product that
has to keep working during the process is a significant software reengineering
headache. I once worked on a system that attempted to do that and we failed. It
took us 10 years to transition (we actually got it into production for a while),
and by that time everything had changed. They ultimately threw it away. The
grand challenge is to do rapid software reengineering.

In regards to the 2%, you could stipulate that the solution must provide tools
to automatically convert the 2% (or the vast majority of them).

Anyway, it's only an idea. I think the biggest impediment is that it requires
someone with both a commitment to it and significant juice to spearhead it. That
is probably why it wouldn't work.

--- On Tue, 7/14/09, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 From: Aryeh Gregor simetrical+wikil...@gmail.com
 Subject: Re: [Wikitech-l] Is this the right list to ask questions about 
 parserTests
 I suspect nobody's going to stand a chance without funding.
 
 $ cat includes/parser/*.php | wc -l
 11064
 
 That's not the kind of thing most people write for an interesting challenge.
 
 Also, you realize that 2% of pages would mean 350,000 pages on the
 English Wikipedia alone?  Probably a million pages across all
 Wikimedia wikis?  And who knows how many if you include third-party
 wikis?
 
 


  

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Is this the right list to ask questions about parserTests

2009-07-14 Thread Ævar Arnfjörð Bjarmason
On Wed, Jul 15, 2009 at 1:24 AM, dan nessettdness...@yahoo.com wrote:
 Mediawiki powers an awful lot of wikis, some used by businesses that cannot 
 afford instability in its operation. It is in their interest to ensure it 
 remains maintainable. So, they might be willing to provide some funding. In 
 addition I'm sure Mediawiki is used by some parts of the government (both US 
 and other countries), so there might be some funding available through those 
 channels.

Lots of people using your software does not translate into funds for
you, even if it's mission critical for those users.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Maps-l] How should I generate static OSM maps for Wikimedia with the SlippyMap extension?

2009-07-14 Thread Ævar Arnfjörð Bjarmason
On Wed, Jul 15, 2009 at 12:42 AM, Brion Vibberbr...@wikimedia.org wrote:
 Ævar Arnfjörð Bjarmason wrote:
 Hello there, long time no see:)

 In the last few days I've been working on the project of getting
 OpenStreetMap onto Wikimedia as outlined here:

 Woohoo!

 Anyway, one thing standing between us and world domination is
 rendering those static maps, I'm going to implement this but first I'd
 like to get comments on *how* we'd like to do it, so I've written a
 plan for doing it:

     http://www.mediawiki.org/wiki/Extension:SlippyMap/Static_map_generation

 I added a couple quickie notes on the talk page...

Ah, thanks, to reply to this:

 Static maps of arbitrary size presumably have much the same problem here as 
 the map tile images. How is it handled there?

The same way I want to do static map generation. Just put an arbitrary
expiry time on it & serve it to the client.

 It might be good to use a Google Maps Static-like model here; the
 MediaWiki end can just make itself an img and stick all the relevant
 info onto the URL, calling out to the static map server.

That's a very good idea, but I'd been assuming that I wouldn't be able
to do that -- that each Apache server was supposed to do things like
EasyTimeline generation / image rescaling and likewise static map
generation on its own & write it to the shared image NFS.

But just being able to call out to an existing server makes things a
whole lot easier.

Then we can just run dedicated rendering machines with the apaches
being dumb about all this crazy map stuff.

This also means that we can set up the Wikimedia Deutschland servers
to do tile rendering and then easily test it on a big wiki (like
dewiki) by just enabling the SlippyMap extension which won't do
anything more fancy than point static/tile URLs to the external
service.

So yay!

 Either storing it on local disk there or simply leaving it to be cached by 
 upper-level squids

Throwing it at the squids and making it their problem would be simpler
for *me* at least.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l