There are multiple readings of Assume Good Faith. I think pi zero was
pointing out that it can be used to justify 'violent' communications. Oh,
sure, it might seem like I just punched you in the nose, but you must AGF
and respond as if I were just trying to kill a mosquito that happened to have
On Tue, Feb 18, 2014 at 10:20 AM, Vito vituzzu.w...@gmail.com wrote:
In my experience 50% of people asking to AGF'em are actually in bad faith
:D
And I guess what I'm saying is that perhaps we should be focusing our
attention on WP:CIVIL and other rules which are supposed to protect against
An apropos link from daringfireball today:
http://www.jordanm.co.uk/tinytype
lists the available 'system fonts' on iOS/Android/Windows Phone/Blackberry.
For communication purposes, it would be great to fork it (
https://github.com/jordanmoore/tinytype) and add the available default
system fonts
Password hashing algorithms are not the same as general hash algorithms. I
would prefer we didn't use Whirlpool; it is recommended by NESSIE and ISO
as a hash function, but not as a password hash. CWE-916 recommends bcrypt,
scrypt, and PBKDF2 specifically for password hashing.
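For illustration, here's a minimal sketch of the PBKDF2 approach using
Python's standard library (the function names and iteration count are my
own choices for the example, not a vetted configuration):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=100_000):
    # PBKDF2-HMAC-SHA256: one of the constructions CWE-916 recommends.
    # A random per-user salt defeats precomputed (rainbow-table) attacks;
    # the iteration count makes brute force deliberately slow.
    if salt is None:
        salt = os.urandom(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return salt, iterations, dk

def verify_password(password, salt, iterations, expected):
    dk = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    # Constant-time comparison avoids leaking a timing side channel.
    return hmac.compare_digest(dk, expected)
```

Storing the salt and iteration count alongside the derived key lets you
raise the cost later without invalidating existing hashes all at once.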
To be clear, I have
I did ask the language group for a list of WMF fonts at the outset. I am
still very open to any guidance. That said, web fonts have very different
requirements than high quality typographic fonts. As one example, web
fonts need to have small file sizes. File size doesn't matter for
typographic
Answering my own questions, partially:
The canonical list of fonts available through ULS is:
https://git.wikimedia.org/tree/mediawiki%2Fextensions%2FUniversalLanguageSelector/master/data%2Ffontrepo%2Ffonts
Each directory has a font.ini file which specifies the canonical source of
the font, as
Yes, the prefix for the Hebrew wiki is actually hewiki, not he.
Other than that, your command line looks correct.
--scott
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
I don't happen to know the Fedora packages, but I'd be glad to add
them to the README when you get a set which works.
Unfortunately, the hard parts are identifying all the font packages,
as package names for non-latin fonts seem to be wildly inconsistent
even between different Ubuntu releases. :(
Yes, zhwiki is still an issue because of LanguageConverter. I will be
fixing that issue in both Parsoid and the PDF renderer. (As soon as I
fix some long-standing bugs in image handling for Parsoid/VE.)
It's a bit tough to test the renderer on-line at the moment, because
you have to import your
Will the default UI change after the upgrade? I've grown quite used
to the current way Gerrit looks. GerritHub (for instance) has the
"New Screen" as the default diff view, which I found very
disorienting. (If the server default does change, you can change it
back under Preferences → Change View
Amir, Gerard:
The easiest way to test locally at the moment is to use the standalone
'mw-ocg-bundler' and 'mw-ocg-latexer' node packages. There are good
installation instructions in the READMEs, see:
https://npmjs.org/package/mw-ocg-bundler
https://npmjs.org/package/mw-ocg-latexer
and let me
FWIW, I've published mw-ocg-bundler, mw-ocg-latexer, mw-ocg-texter
(from the upcoming PDF backend rewrite) under my own credentials as
well. npm allows multiple owners of a module, so if WMF ever does
decide to establish a generic WMF account for this, we can easily add
it.
--scott
Sure, I'd love to look at your code. Hopefully we can avoid
reinventing the wheel *too* many times. Is it available some where?
Or a written report?
--scott
On Mon, Nov 25, 2013 at 10:49 PM, Santhosh Thottingal
santhosh.thottin...@gmail.com wrote:
To support complex scripts
(https://en.wikipedia.org/wiki/Complex_text_layout) we need to use a TeX
system that can support Unicode and complex script rendering. XeTeX
On Tue, Dec 10, 2013 at 3:06 PM, C. Scott Ananian
canan...@wikimedia.org wrote:
Could you take a look at the attached PDF, generated from
https://ml.wikipedia.org/wiki/%E0%B4%AE%E0%B4%B2%E0%B4%AF%E0%B4%BE%E0%B4%B3%E0%B4%82
with our not-yet-deployed new software? Any Malayalam-specific feedback
Yes, I needed to turn off visual editor because it was altering the
section header 'edit' text which was causing parser tests to fail. I
fixed it like this: https://gerrit.wikimedia.org/r/84436
--scott
On Wed, Nov 27, 2013 at 9:52 AM, Bjoern Hassler bjohas...@gmail.com wrote:
could I check whether this new process would pick up formatting inserted via
css styles, e.g. attached to a span or div?
On our mediawiki (http://www.oer4schools.org) we use a handful of different
css styles to provide
The new PDF rendering pipeline does indeed use XeLaTeX. I haven't
used it to typeset non-latin scripts since a summer I spent at SIL in
1996 (and that might have been Omega, not XeLaTeX), so if you wanted
to pitch in and help out I'd greatly appreciate it. To start with,
short example LaTeX
Let me talk briefly about the bundle format:
* It is intended to be a complete copy of all wiki resources required to
make an offline dump, in any format. That means that all the articles are
spidered and template-expanded and all related images and other media are
fetched and
The new PDF rendering pipeline includes a new wikitext to latex
converter, based on the Parsoid parser. You might want to check out:
https://git.wikimedia.org/summary/mediawiki%2Fextensions%2FCollection%2FOfflineContentGenerator%2Fbundler
and
Let's see what sorts of bugs crop up? In my (limited) experience, the most
common issues are probably article content which renders poorly as a PDF
for some reason. Those bugs aren't easy to fix in a bug day sprint, since
they tend to crop up slowly over time as people use the service and
On Mon, Nov 11, 2013 at 1:51 AM, Tim Starling tstarl...@wikimedia.org wrote:
On 08/11/13 03:40, C. Scott Ananian wrote:
Basically, every major piece of WP should have a module owner.
[...]
Certain people 'own' larger collections of modules -- like there are
subsystem owners in the linux
And I'll add that there's another axis: gwicke (and others?) have been
arguing for a broader collection of services architecture for mw. This
would decouple some of the installability issues. Even if PDF rendering
(say) was a huge monster, Jimmy MediaWiki might still be able to simply
install
Yeah we've been running 0.10 in development for Parsoid for a while. So no
problems expected... other than unpredictable load gremlins or some such.
It sounds like gwicke's plan is to ramp up the load gradually to try to
head that off.
--scott
On Nov 13, 2013 6:02 PM, Matthew Walker
On Sat, Nov 9, 2013 at 1:24 AM, Matthew Flaschen mflasc...@wikimedia.org wrote:
Assignment is useful to me for multiple reasons, but the most important
are:
1. Marking things I'm definitely going to work on (while trying to avoid
cookie-licking). Importantly, I can unmark myself as assigned,
That is, perhaps there's a preference to see 'advanced fields' which
defaults to off for everyone (and 'on' for members of the Platform team?).
New users won't see fields which (a) they can't change and (b) don't
matter anyway.
--scott
On Sat, Nov 9, 2013 at 9:40 AM, C. Scott Ananian canan...@wikimedia.org wrote:
On Sat, Nov 9, 2013 at 1:24 AM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
Assignment is useful to me for multiple reasons, but the most important
are:
1. Marking things I'm definitely going to work
Again, as far as I'm concerned, we're solving a non-problem. Yes, in
theory it would be nice if everyone easily got all permissions to
everything. But bugzilla is not set up with the antivandal features needed
to make that work. Instead, we should aim to make it possible for
newcomers to easily
On Fri, Nov 8, 2013 at 12:38 AM, Erik Moeller e...@wikimedia.org wrote:
What might have some degree of traction, based on the
discussion, is to have some blessed delegation coming from the
original triumvirate of architects.
+1
I'd personally like to see this kept flexible, so that it's
I've used cache manifests for offline applications. It works reasonably
well in that situation. So I wouldn't dismiss it entirely for all purposes.
But it's not the right solution here.
--scott
What should I be aware of as a developer? That is, if I'm running a local
MW and hacking on a resources (an extension / JavaScript / etc) will things
Just Work? Or should I take active steps to disable Module Storage so that
I don't inadvertently run the 'old' version of something I'm hacking
On Wed, Nov 6, 2013 at 10:13 AM, Brion Vibber bvib...@wikimedia.org wrote:
* However, I would consider avoiding using the term "Architect" for its
members as it's easily conflated with existing WMF job titles. I think job
titles are pretty unreliable indicators at the best of times, and of
On Thu, Nov 7, 2013 at 11:44 AM, Quim Gil q...@wikimedia.org wrote:
Is your proposal different from
https://www.mediawiki.org/wiki/Developers/Maintainers ?
No, it builds on it. The current wiki page isn't official, nor complete.
I'm suggesting that we embrace it officially, and that we
I'm a little puzzled here: this whole discussion is because new owners want
to have the bug actually assigned to them, instead of just commenting
"I'm working on this" in the bug?
Let's look at the github model -- there's no assignment at all. I just
file a bug, maybe make some comments on it to
On Mon, Oct 21, 2013 at 4:36 PM, Jon Robson jdlrob...@gmail.com wrote:
Having mobile just joined it the only feedback I can give so far is it
is confusing knowing what is where but I'm not quite sure how to
improve that confusion yet other than having a gerrit page which tells
me what is
On Thu, Nov 7, 2013 at 12:22 PM, Jon Robson jdlrob...@gmail.com wrote:
I'm not sure why shift reloading would make the cache grow since the
same page should load exactly the same modules - if that's not the
case that points at a bug somewhere.
Seems like a bug, to me.
That said since
Even us technical folks are often ignorant of the deeper ways of ops. When
I last fixed a long-standing bug in the PHP parser with the potential to
cause regressions in existing wikitext, it was not exactly trivial to keep
track of where the code was currently live (and exactly when it went live)
Could we do an end run around to fix this? For example, add a
"Release-Notes: this text shows up in the release notes"
field to the commit. Before release branching, some slightly-fancier
variant of git log | egrep '^Release-Notes:' gets run to automatically
populate the release notes with the
On Thu, Nov 7, 2013 at 1:16 PM, Bartosz Dziewoński matma@gmail.com wrote:
On Thu, 07 Nov 2013 19:03:33 +0100, C. Scott Ananian
canan...@wikimedia.org wrote:
Could we do an end run around to fix this? For example, add a
"Release-Notes: this text shows up in the release notes"
field
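The trailer-scraping step in that proposal could be sketched as follows (a
hypothetical helper; the trailer name and format are just the ones
suggested above, and the anchor is relaxed to tolerate git's indented log
output):

```python
import re

def extract_release_notes(log_text):
    # Pull the text after every "Release-Notes:" trailer line out of
    # `git log` output, mirroring `git log | egrep '^Release-Notes:'`.
    pattern = re.compile(r"^\s*Release-Notes:\s*(.+)$", re.MULTILINE)
    return [m.group(1).strip() for m in pattern.finditer(log_text)]
```

In practice you'd feed it something like
`subprocess.check_output(["git", "log"], text=True)` at branch time.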
I think the issue isn't really "improve the existing packages" so much as
"package more extensions with useful configurations" so that, for example,
I could install mediawiki-visualeditor and have mediawiki-parsoid installed
and properly configured as well.
But point taken: this (part of the)
One intermediate position might be for WMF to distribute virtual machine
images which can easily be installed on any one of a number of different
hosting services. OpenStack/Glance appears to be one such system, although
ops probably knows better. The current efforts with puppet and vagrant are
On Tue, Oct 1, 2013 at 12:13 PM, David Gerard dger...@gmail.com wrote:
That would be lovely! So that stuff will be stable and present for the
next LTS, as is likely to be used for Debian's package?
I know you're trolling, but: presumably wikimedia would start by
maintaining their own apt
http://www.timeanddate.com/worldclock/fixedtime.html?iso=20131002T06
That's 2am EDT, 11pm PDT, etc.
--scott
I've also been writing the TogetherJS extension out of tree from VE, so I
have a little bit of experience there. A couple more APIs could be
stabilized to help extensions find VE instances on a page.
--scott
On Wed, Sep 25, 2013 at 5:55 PM, Roan Kattouw roan.katt...@gmail.com wrote:
On Tue,
Can we get the bot to pester people if patches sit too long unreviewed?
That's the other thing that might be off-putting to a new contributor.
--scott
On Sep 24, 2013 3:24 PM, Sumana Harihareswara suma...@wikimedia.org
wrote:
Public service reminder: consider adding yourself to
Our core developers get a lot of patches to review. Once something drops
down past the first page in gerrit, it's lost forever.
Backing up here, I'm assuming that our focus is on fostering new
developers. Old hands know who to add as reviewers, and know to pester
them if they don't get a review
Federico: providing information on the archive.org mirror (probably on
https://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps#Media)
along with any suggestions or etiquette for using these would
probably
be useful.
--scott
I added Jeremy's helpful tips to
https://en.wikipedia.org/wiki/Wikipedia:Database_download#Where_are_images_and_uploaded_files--
feel free to improve these/reference them from other appropriate
places,
etc.
--scott
FWIW, categories can also be very useful for generating offline wiki
snapshots. For example, for education I might want to ensure that my
wikislice has all the articles related to chemical elements. One
reasonable way to do this is to grab all the articles in
[[Category:Chemical_elements]]. In
On Fri, Sep 20, 2013 at 2:35 PM, Ori Livneh o...@wikimedia.org wrote:
I personally think it'd be unfortunate for this current effort to collapse
over such considerations, but I'm obviously biased.
Oh, I certainly agree. For my part, I'm satisfied that the
LESS/Sass/stylus issues have been
Some more questions for discussion:
- I'm concerned that some of the useful things people do with sass (i.e.,
robust cross-browser support with compass) are impossible with less.
- Has http://learnboost.github.io/stylus/ been considered? I've heard that
it's a good compromise between sass and less
@ori: You might want to look into the different @import options before
being so dogmatic. In particular, the media-query restrictions are
probably very useful to MW. The (less) option also allows overriding CSS
files, which can help prevent the "everything must be less!" problem. And
the
Perhaps we could use some Math here? Can we grab a list of the last, say,
100,000 edits reverted for vandalism, look at the diff, and compute a
frequency score based on that?
--scott
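A rough sketch of that scoring idea (the input format is hypothetical: one
string of inserted text per reverted edit):

```python
from collections import Counter

def vandalism_word_frequencies(reverted_diffs):
    # Tally how often each word appears in text added by edits that were
    # later reverted as vandalism, then normalize to relative frequencies.
    counts = Counter()
    for inserted_text in reverted_diffs:
        counts.update(word.lower() for word in inserted_text.split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}
```

In practice you'd want to compare against the background frequency of each
word in non-reverted edits, or common words like "the" will dominate the
score.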
It appears that YuviPanda is a magician (oops, am I not supposed to use
that word?) and comments are sync'ed bidirectionally:
https://git.wikimedia.org/blob/labs%2Ftools%2FSuchABot.git/62c8fcd6764db7d31da28f01f2a71108ce5b2168/README.md
--scott
On Wed, Sep 18, 2013 at 3:57 AM, Antoine Musso
Note also that zhwiki (and others?) profitably uses the first part of the
path to do variant selection.
https://zh.wikipedia.org/wiki/User:Cscott uses the wiki default variant (if
logged in, uses the variant from the user's preferences)
https://zh.wikipedia.org/zh-hans/User:Cscott
Since we already have github mirrors of wmf repos and a tool already
written to bridge github and gerrit, it seems like we should give that a go
before starting to work on any other alternatives. As I understand it,
projects/repos have to opt-in to the github bridge bot, and not many repos
have
New revelations on NSA capabilities yesterday in the New York Times: see
https://www.schneier.com/blog/archives/2013/09/the_nsa_is_brea.html for a
jumping off point.
The bottom line seems to be:
1) don't use RC4 (we're already working toward that goal, I believe)
2) don't use the Dual_EC_DRBG
RC4 has been deprecated for over a decade: the first flaws were found in
2001, and RC4 was fully broken in WEP in 2004. Yes, there has been
movement back to RC4 due to the BEAST attack, but the fact that it's the
best of a bad bunch should not fool us. As Schneier said before the
recent NSA
Stated more precisely: a non-GPL-compatible license for an extension means
that the extension can never be distributed with core.
The idea that deployment of software on a server entails license
obligations is a GPLv3 feature; mediawiki is licensed under the GPL v2 (or
later for theoretical
On Sat, Aug 24, 2013 at 6:38 PM, Maarten Dammers maar...@mdammers.nl wrote:
If you compare our current implementation to wheel of fortune [1]; all our
articles are evenly spread around.
Weighted would be putting bot articles closer to each other so you would
hit them less often. You just need
On Mon, Aug 26, 2013 at 3:35 PM, Derric Atzrott
datzr...@alizeepathology.com wrote:
I might have to look into licenses again and make sure what I use is GPL
compatible. The GPL is such a pain sometimes
https://fedoraproject.org/wiki/Licensing:Main is a useful guide.
--scott
--
This "make a second draw" approach would also let you tune how often you
saw the bad articles. That is, if it's a bad article, then flip a coin
to see if you should make a second draw. Repeat if the new article is bad,
but never make more than N draws. Someone with time on their hands and a
On Fri, Aug 23, 2013 at 9:24 AM, Lord_Farin lord_fa...@proofwiki.org wrote:
The probability of displaying a bad page would be:
B q ((p B)^N - 1) / (p B - 1) + B (p B)^N
(modulo errors), where B is the fraction of bad pages, p is the
probability of repeating, q is the probability of
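A Monte Carlo sketch of the scheme described above (all names are my own;
B is the bad-page fraction, p the redraw probability, N the cap on draws):

```python
import random

def draw_is_bad(B, p, N, rng):
    # One "random article" request: draw pages uniformly; on drawing a
    # bad page, redraw with probability p, never making more than N draws.
    for _ in range(N - 1):
        if rng.random() >= B:
            return False            # good page: show it
        if rng.random() >= p:
            return True             # bad page, coin says keep it
        # otherwise redraw
    return rng.random() < B         # final allowed draw stands

def estimate_bad_rate(B, p, N, trials=100_000, seed=42):
    rng = random.Random(seed)
    return sum(draw_is_bad(B, p, N, rng) for _ in range(trials)) / trials
```

With p = 1 this reduces to "keep drawing until good or out of draws", so
the bad rate falls to roughly B^N; with p = 0 it's a plain single draw.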
Upload your proposed patch to gerrit and add me (cscott
canan...@wikimedia.org) to the list of reviewers. I'll add the rest of
the usual suspects.
--scott
On Aug 16, 2013 4:56 PM, Lord_Farin lord_fa...@proofwiki.org wrote:
Dear devs,
A couple of months back, I discovered that the ToC of
And I really should have suggested reporting the feature request in
bugzilla as well. It is helpful when patches can be associated with a bug
number.
--scott
On Aug 16, 2013 7:17 PM, C. Scott Ananian canan...@wikimedia.org wrote:
Upload your proposed patch to gerrit and add me (cscott
canan
On Fri, Aug 16, 2013 at 8:17 PM, Tyler Romeo tylerro...@gmail.com wrote:
With that said, I think the real problem with Wikimedia's security right
now is a pretty big failure on the part of the operations team to inform
anybody as to what the hell is going on. Why hasn't the TLS cipher list
On Fri, Aug 16, 2013 at 8:04 PM, Zack Weinberg za...@cmu.edu wrote:
Wikipedia user handle. I realize how disruptive this would be, but I
think we need to consider changing the canonical Wikipedia URL format to
https://wikipedia.org/LANGUAGE/PAGENAME.
On Fri, Aug 16, 2013 at 9:47 PM, Tyler Romeo tylerro...@gmail.com wrote:
To be fair, I'm really only talking about non-restrictive changes. For
example, right now we *only* have RC4. Rather than disable RC4 (which would
have consequences), I'm saying why haven't other normal ciphers been
Please read the link I provided more carefully. Apple devices and browsers
are still vulnerable.
--scott
On Aug 16, 2013 10:14 PM, Tyler Romeo tylerro...@gmail.com wrote:
On Fri, Aug 16, 2013 at 9:59 PM, C. Scott Ananian canan...@wikimedia.org
wrote:
Because the other TLS 1.0 ciphers
I recommend posters re-read bawolff's message:
On Mon, Aug 5, 2013 at 2:51 PM, Brian Wolff bawo...@gmail.com wrote:
https://bugzilla.wikimedia.org/show_bug.cgi?id=52556 sounds related...
--bawolff
lists.wikipedia.org is missing an SPF record in DNS. That is most likely
what is causing the
On Mon, Aug 5, 2013 at 5:57 PM, Ken Snider ksni...@wikimedia.org wrote:
I believe Greg G. has reached out to the Office IT team, who have the
ability to open a ticket against GMail (since wikimedia uses GMail for org
email).
If Brian Wolff's diagnosis is correct (
Like dgerard said, let's not let the perfect distract us from the
better. It will be impossible to 100% secure our visitors' traffic
against an adversary with as many resources as the NSA. But we can
secure our users against adversaries with fewer resources, and we can
increase the cost of a
I think we're mostly agreed now. And I agree that rule-based systems can
provide valuable bootstrapping, if the requisite language experts can be
found. I suspect we will find that different language pairs will favor
different techniques.
--scott
--
(http://cscott.net)
Mark has answered this question pretty well. From my perspective it's
the "discovery and work together" part (chat, etc.) of TowTruck which
is most worth exploring in the short term (and possibly adopting).
I'm (personally) less concerned with how the editing gets done (for
initial demo purposes at
http://en.wikipedia.org/wiki/Placebo_button would be a better
reference. It also contains a correct description of the function (or
lack thereof) of the 'door close' button. ;)
--scott
--
(http://cscott.net)
That ssllabs link also shows that wikimedia has RC4 encryption enabled
on SSL connections, which offers no real security. This is apparently
related to the TLS 1.0 -vs- TLS 1.1/1.2 issue:
https://community.qualys.com/blogs/securitylabs/2013/03/19/rc4-in-tls-is-broken-now-what
--scott
--
Au contraire: I think WMF has a responsibility to ensure the safety and
security of its editors, who might be working on topics controversial in
their home regions.
--scott
On Jul 29, 2013 9:36 PM, Tyler Romeo tylerro...@gmail.com wrote:
Yeah, I think MediaWiki has much more important security
On Sat, Jul 27, 2013 at 10:18 AM, David Cuenca dacu...@gmail.com wrote:
Scott, "edit and maintain parallelism" sounds wonderful on paper, until you
want to implement it and then you realize that you have to freeze changes
both in the source text and in the target language for it to happen, which
Without fanning the flames unduly, I'll note that VE is already enabled by
default for all registered users on de.wikipedia.org, and that the linked
vote on the de wiki was just initiated today and has not yet even proceeded
past the "do we think the question is well-formed enough to vote on" stage
On Fri, Jul 26, 2013 at 5:59 PM, Yuvi Panda yuvipa...@gmail.com wrote:
Next step: Rewrite mediawiki to run on nodejs + go + mongodb...
Who told you about the parsoid/VE team's secret plans?
--scott
ps. with the exception of the 'go' part and the mongodb part
--
(http://cscott.net)
On Fri, Jul 26, 2013 at 3:25 PM, David Cuenca dacu...@gmail.com wrote:
This is the preliminary draft:
https://meta.wikimedia.org/wiki/Collaborative_Machine_Translation_for_Wikipedia
The linked page says:
For this kind of project it is preferred to use a rule-based machine
On Thu, Jul 25, 2013 at 2:19 PM, Subramanya Sastry ssas...@wikimedia.org wrote:
And, both Roan and Scott are correct. Pathway 2 would be a test of
external libraries (HTML5 and Domino, not just domino). And, we did have
bugs in the HTML5 parsing library we used (which I fixed based on
It seems like many of those issues could be worked around if mediawiki/core
kept a simple "uses math markup" boolean for each page. All the overhead
of MathJax could be eliminated unless it was actually needed. Further, the
javascript could be wrapped in a big if clause, so if the browser
On Wed, Jul 24, 2013 at 11:16 AM, Brad Jorsch (Anomie)
bjor...@wikimedia.org wrote:
On Wed, Jul 24, 2013 at 10:55 AM, C. Scott Ananian
canan...@wikimedia.org wrote:
It seems like many of those issues could be worked around if
mediawiki/core
kept a simple "uses math markup" boolean for each
On Wed, Jul 24, 2013 at 11:20 AM, Subramanya Sastry
ssas...@wikimedia.org wrote:
On 07/24/2013 09:58 AM, Roan Kattouw wrote:
There are a few things I wish it tested, but they're mostly about how it
tests things rather than what data is collected. For instance, it would be
nice if the
On Tue, Jul 23, 2013 at 5:20 AM, Derk-Jan Hartman
d.j.hartman+wmf...@gmail.com wrote:
So what I would actually propose for the short term (next few years) in
case we really want to go the direction of MathML is the following:
1: img tag + --data-math=formulaID in HTML
2: script to detect
On Tue, Jul 23, 2013 at 5:36 AM, Bináris wikipo...@gmail.com wrote:
Once you have a list of words which are used on the web (this must be got
from an outer source, nothing to do it within Wiktionary), the easiest way
is to run a bot, e.g. Pywikipedia.
Rather than writing a bot, you might
I am a Parsoid developer. I was participant #6 in the thread. As noted,
responses #14 and #16 were also from VE developers, and our contributions
have continued. From my perspective the devs involved have done a pretty
good job of picking out technical content from the discussion and acting on
On Tue, Jul 23, 2013 at 6:28 PM, John Vandenberg jay...@gmail.com wrote:
On Wed, Jul 24, 2013 at 2:06 AM, Subramanya Sastry
ssas...@wikimedia.org wrote:
Hi John and Risker,
First off, I do want to once again clarify that my intention in the
previous
post was not to claim that
On Tue, Jul 23, 2013 at 7:13 PM, John Vandenberg jay...@gmail.com wrote:
http://parsoid.wmflabs.org:8001/stats
This is the url for our round trip testing on 160K pages (20K each from 8
wikipedias).
Fantastic! How frequently are those tests re-run? Could you add a
last-run-date on
On Tue, Jul 23, 2013 at 7:24 PM, C. Scott Ananian canan...@wikimedia.org wrote:
Was a regression testsuite built using the issues encountered during
the last parser rewrite?
Yes, mediawiki/core/tests/parser/parserTests.txt (which predates parsoid)
has been continuously updated throughout
On Tue, Jul 23, 2013 at 7:55 PM, John Vandenberg jay...@gmail.com wrote:
On Wed, Jul 24, 2013 at 9:02 AM, Subramanya Sastry
ssas...@wikimedia.org wrote:
http://parsoid.wmflabs.org:8001/stats
This is the url for our round trip testing on 160K pages (20K each from 8
wikipedias).
Very
It seems that a package manager is a reasonable way to manage a growing set
of dependencies.
Composer seems to me (without knowing too much about it) to be a reasonable
package manager.
I've heard the objection expressed that this will require new users of
wikibase/etc to learn composer.
I've mentioned this informally to a few people, but it came up again in
discussion and I thought maybe I'd bring the idea to a wider audience.
TowTruck (https://towtruck.mozillalabs.com/) is a realtime collaboration
framework developed by Mozilla (but not firefox-specific). It provides the
As a(nother) user, I have been very pleased to see unicode-complete fonts
gradually make the use of images for non-roman orthography
disappear. When I see non-English text on a page, Greek letters, or simple
expressions with super- and sub-scripts, I can generally highlight, style,
and
On Mon, Jul 22, 2013 at 12:00 PM, Mark A. Hershberger m...@everybody.org wrote:
On 07/22/2013 11:43 AM, Chad wrote:
Telling me it's like cpan just brings back awful awful memories...
I apologize. I can't say my experience with cpan was all roses, but it
seems that my experience was better
It seems like there are also a bunch of hacky search-alike features built
into the mediawiki database. For example, "all pages linking to this
page", "my contributions", etc. From a code cleanup standpoint, it would
also be worthwhile if these were all unified and brought together under a
single
On Mon, Jul 22, 2013 at 12:15 PM, Tyler Romeo tylerro...@gmail.com wrote:
Just to clear some things up, composer is *not* a package manager. It is
actually pretty terrible at being a package manager, mainly because it's
not supposed to be one. The purpose of composer is solely dependency
On Mon, Jul 22, 2013 at 12:55 PM, Tyler Romeo tylerro...@gmail.com wrote:
Partially. The issue is that extensions are both packages and dependencies.
Scribunto is itself an independent package that a wiki can install, but at
the same time it can be a dependency for other extensions. That's
On Mon, Jul 22, 2013 at 12:37 PM, Bartosz Dziewoński matma@gmail.com wrote:
The bug for that patch was just WONTFIXed, synchronizing information.
https://bugzilla.wikimedia.org/show_bug.cgi?id=50929#c16
Just to accommodate people too