On 21/06/16 23:16, Greg Grossmeier wrote:
== tl;dr ==
On June 29th git.wikimedia.org (running Gitblit) will redirect all
requests to Phabricator. The vast majority of requests will be correctly
redirected.
== What is happening? ==
In an effort to reduce the maintenance burden of redundant
On 27/10/15 19:54, Antoine Musso wrote:
I think we standardized the MediaWiki core files at one point to
include the recommended GPL headers. The commit history should have
such trace.
We did. Copyright headers were added for files which lacked them, much to
my dismay. Actual descriptions of
On 15/09/15 01:34, wp mirror wrote:
Idea. I am thinking of piping the *pages-articles.xml.bz2 dump file
through an AWK script to write all unique [[File:*]] tags into a file. This
can be done quickly. The question then is: Given a file with all the media
tags, how can I generate all the thumbs.
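That extraction step can be sketched as follows — a minimal Python version of the AWK idea. This is only a rough sketch: the regex is naive, localized aliases such as [[Image:...]] are not handled, and it assumes a link is never split across lines.

```python
import bz2
import re

# Naive pattern for [[File:Name|...]] links; assumes the English
# "File:" prefix and that a link is not split across lines.
FILE_RE = re.compile(r"\[\[File:([^|\]]+)")

def unique_file_titles(dump_path):
    """Stream a pages-articles .bz2 dump and collect unique File: titles."""
    seen = set()
    with bz2.open(dump_path, "rt", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            for title in FILE_RE.findall(line):
                seen.add(title.strip())
    return seen
```

Generating the thumbs themselves would then be a separate step over the collected titles.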
On 13/09/15 18:20, Purodha Blissenbach wrote:
The idea is that third parties can publish texts, such as their
statutes, via an open or public wiki, and readers can be sure to read,
download, sign, and mail the originals. Another use would be to have
pledges and petitions signed by many people.
On 02/09/15 00:12, Quim Gil wrote:
Wikimedia Phabricator will be soon one year old!
On Tue, Sep 1, 2015 at 2:00 AM, wrote:
Number of accounts created in (2015-08): 288
Kind of surprised that we keep getting almost ten new
Phabricator users
Brad Jorsch (Anomie) wrote:
I wonder if it got lost in the move from Squid to Varnish, or something
along those lines.
That's likely, given that it was enforced by squid.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
Jamison Lofthouse wrote:
The subject sounds exactly like the reCAPTCHA
https://www.google.com/recaptcha/intro/index.html tagline. Not sure how
beneficial the project would be but I have seen it used. Maybe worth
looking into.
Thanks,
Negative24
I should note that the latest reCAPTCHA¹ is
On 07/04/15 18:54, Daniel Barrett wrote:
Will anything bad happen if entries in the MediaWiki logging table are not
inserted in chronological order?
Due to a bug, our logging table has incomplete data. I'd like to insert the
missing data using a script.
However, the log_id column is
, login is
case-sensitive (e.g., MZ and Mz can be different users). In your
experience, is MediaWiki's current authentication architecture following
common or best practices? I personally think there's a lot of work needed.
MZMcBride
Emails are case-sensitive as well. platonides@gmail is different
On 20/02/15 00:58, phoebe ayers wrote:
Hi all,
I'm the one who started that bug-now-task a while back, and for
context, it was based directly on user feedback. What MzM says above
is right. I was working with a casual (but quite good) editor who said
to me well, I'd edit that Wikipedia page,
On 09/02/15 20:37, Tyler Romeo wrote:
This entire conversation is a bit disappointing, mainly because I am a
supporter of the free software movement, and like to believe that users
should have a right to see the source code of software they use.
Obviously
not everybody feels this way and not
Max Semenik wrote:
I'm pretty sure most users technical enough to use IRC are able to solve
captchas well.
That the backend is IRC-based doesn't mean they would use an IRC frontend.
We routinely point to web irc, and plenty of noobs have proven able to
reach there
(sometimes even thinking we
On 09/11/14 06:21, Pine W wrote:
Discussing an option with the community to test replacing registration
CAPTCHAs with an email requirement makes sense to me. I would support a
small, carefully designed test. If someone is motivated to create a
Wikimedia account and they don't want to register an
On 07/11/14 02:52, Jon Harald Søby wrote:
The main concern is obviously that it is really hard to read, but there are
also some other issues, namely that all the fields in the user registration
form (except for the username) are wiped if you enter the CAPTCHA
incorrectly. So when you make a
On 09/11/14 17:19, Marc A. Pelletier wrote:
On 11/09/2014 10:20 AM, Brian Wolff wrote:
Does anyone have any attack scenario that is remotely plausible which
requiring a verified email would prevent?
Spambots (of which there are multitude, and that hammer any mediawiki
site constantly) have
On 20/01/14 08:26, Petr Bena wrote:
There is one more feature available on wm-bot.
First of all, that thing is smart enough to recognize who is irc
newbie and who is not. It is possible to direct this only to people
who are known to the bot (their cloak is trusted) so that it doesn't
bite the
On 30/12/13 23:21, Tyler Romeo wrote:
As the subject implies, there is a variable named $wgDBmysql5. I am
inclined to believe that the purpose of this variable is to use features
only available in MySQL 5.0 or later. The specific implementation affects
the encoding in the database.
On 23/11/13 02:03, Marc A. Pelletier wrote:
Interestingly enough, I /do/ have a project I'd be happy to put forward
that's quite in scope if we'd like to touch on something more
system-level than our usual UX fare: there is a serious hole in
reasonably self-contained distributed cron-like
On 21/06/13 12:32, Paul Selitskas wrote:
We should have stayed on PHP4 if this is a trouble. Performance may be a
problem (which is not much in 5.4 iirc), syntax flaws may be a problem (and
this one fixes one of the flaws). Yes, it will take us some time to upgrade
MW to 5.4 or 5.5, but
On 21/06/13 21:03, Antoine Musso wrote:
If you want a playground, we could get both backported packages on the
beta cluster (labs project: deployment-prep). That might help catch
some potential issues.
I have no idea how we could get different versions in production and
labs, but I am sure
On 06/06/13 07:21, Daniel Friesen wrote:
Side topic, anyone want to voice their bikeshed opinions on their
favorite of the different ways of disambiguating a / inside URLs for
various types of web UIs to repositories:
- Rejecting slash in repository names /.../mediawiki-core/... (ie:
GitHub :/)
-
On 06/06/13 18:17, Erik Moeller wrote:
I want to again take this opportunity to thank CT Woo for his tireless
operations leadership since December 2010. I’d also like to thank
everyone who’s participated in the Director of TechOps search process.
Since Dec 2010. Time flies!
Welcome Ken!
On 05/06/13 15:42, Brad Jorsch wrote:
There's nothing wrong with having a large list of fine-grained rights to
grant as long as you format them properly for the user.
In other words, implement another rights-grouping system just as
complicated and less clear than the approach currently
On 05/06/13 20:07, Greg Grossmeier wrote:
Hi all!
Tomorrow we start the one-week deploy cycle here at WMF!
You can see the general overview of the cycle here:
https://wikitech.wikimedia.org/wiki/Deployments/One_week
And the specific roadmap/plan for MediaWiki (with version numbers/dates)
On 05/06/13 01:17, Tyler Romeo wrote:
By saying you can only use OAuth if you're open source, it's the same as
saying if you're closed source you must use insecure authentication
methods. Because just saying OAuth must be open source isn't going to stop
closed source developers.
Yes, of
On 05/06/13 02:37, Tyler Romeo wrote:
On Tue, Jun 4, 2013 at 8:35 PM, Chris Steipp cste...@wikimedia.org wrote:
We initially were going to use your patch and limit based on module,
but there were a few places where that seemed too coarse. But then if
we just used user rights, then to edit a
Rob Lanphier wrote:
Hi everyone,
Many of you already know Brian Wolff, who has been a steady
contributor to MediaWiki in the past several years (User:Bawolff),
having gotten a start during Google Summer of Code 2010[1].
Brian is back for another summer working with us, working
On 06/05/13 18:12, Guillaume Paumier wrote:
Hi,
On Sun, Mar 10, 2013 at 2:11 AM, Rob Lanphier ro...@wikimedia.org wrote:
Short version: This mail is fishing for feedback on proposed work on
Gerrit-Bugzilla integration to replace code review tags.
I was wondering: has a decision been made
On 05/06/2013 01:12 PM, Siebrand Mazeland (WMF) wrote:
I would like to provide some feedback, too: The whole process of GSoC
was very confusing to me. Students communicated on melange,
mediawiki.org, and mailing lists. Some also
emailed me and others privately. This
On 04/05/13 20:57, Krinkle wrote:
PS: The public mediawiki-config.git only dates back to 24 Feb 2012. Before
that date the information was in a non-public svn repository inside the
wmf-production cluster.
Rather than using mediawiki-config, you may have more luck using
On 26/04/13 22:23, Kiran Mathew Koshy wrote:
Hi guys,
I have an own idea for my GSoC project that I'd like to share with you.
It's not a perfect one, so please forgive any mistakes.
The project is related to the existing GSoC project *Incremental Data
dumps*, but is in no way a
On 19/04/13 00:21, Steven Walling wrote:
Hi all,
This is a heads up that we've added a small new feature which hopefully
will make things less painful for users across the projects: the ability to
refresh the CAPTCHA you're presented without refreshing the entire page. It
should work
On 14/04/13 15:41, anubhav agarwal wrote:
I don't think we could take rollbacks into account for automated learning.
It is not necessarily the case that the person who edited the document,
then rolled it back, did so because it was spam.
Getting the right data to train from is hard, since wiki is so flexible.
On 15/04/13 21:40, ro...@rogerchrisman.com wrote:
Hi,
What is the best way to install, modify and configure the Universal
Language Selector extension so that I can see how it might work with
Wikivoyage?
I created an account at
https://wikitech.wikimedia.org/wiki/User:Rogerhc but it is
On 09/04/13 18:20, Quim Gil wrote:
Hi Anubhav,
I have done a first reality check with Chris Steipp, who oversees the
area of security and also spam prevention. Your idea is interesting and
it seems to be feasible. This is a very good first step!
It would require adding a hook to MediaWiki
Yes, you would have to change it in Parser.php. That would be the
appropriate point. However, given the large amount of already-posted
timestamps (and that some people may not want the spans in the wiki
source), why not simply use a regex to replace the dates in the page?
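The regex replacement suggested above might look roughly like this in Python, assuming the default English signature timestamp format; the span class name is purely illustrative, and other languages or date formats would need their own patterns.

```python
import re

# Default English MediaWiki signature timestamps look like
# "12:34, 5 March 2013 (UTC)".
TS_RE = re.compile(r"\d{2}:\d{2}, \d{1,2} [A-Z][a-z]+ \d{4} \(UTC\)")

def wrap_timestamps(wikitext):
    """Wrap each signature timestamp in a span that client-side JS
    could later localize (the class name is hypothetical)."""
    return TS_RE.sub(r'<span class="localize-ts">\g<0></span>', wikitext)
```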
On 25/03/13 23:19, Rahul Maliakkal wrote:
I installed the SMW extension in my local wiki yesterday and now when I
visit a page in my local wiki I get this message: A database query syntax
error has occurred. This may indicate a bug in the software. The last
attempted database query was:
(SQL
This should work:
WIKIMEDIA_REPOS=/path/where/you/have/your/clones
REPO=$1 # qa/browsertests
PULL=$2 # https://github.com/brainwane/qa-browsertests.git
TEMP=`mktemp --tmpdir -d pull-request.XXX`
git clone --reference=$WIKIMEDIA_REPOS/$REPO $PULL $TEMP
cd $TEMP
if [ ! -f .gitreview ]; then
On 25/03/13 23:35, Greg Grossmeier wrote:
Thanks for the link, but the reason I brought it up is because my first
week here I saw a removal of a function without an explicit @deprecated
warning.
:-)
Greg
Is it possible that it was a recently-introduced function that hadn't
been published
On 25/03/13 18:39, Greg Grossmeier wrote:
* Deprecations - SELF-TODO: We don't have any guarantee, that I can see,
that we deprecate for X releases before we remove
Not exactly a guarantee, but the general rule we use is to keep
deprecated for a couple releases before removing.
It's briefly
Trying to clarify:
APC can do two things:
1) Keep the compiled php opcodes, so php execution is faster.
2) Allow the application to store values in the web server memory (kept
across requests).
ZendOptimizer only does 1.
MediaWiki only needs to be changed for 2, since 1 is done automatically
On 21/03/13 08:05, Federico Leva (Nemo) wrote:
Only a handful of wikis are restrictive with captchas (plus pt.wiki, which
is in permanent emergency mode).
https://meta.wikimedia.org/wiki/Newly_registered_user
For them you could request confirmed flag at
https://meta.wikimedia.org/wiki/SRP
Is sending an email to wikitech-ambassadors enough to unblock it?
Though it should contain a timeframe expectation, which probably
only WMF can give.
On 20/03/13 11:59, Niklas Laxström wrote:
1) Why is Bug:43778 different from bug:43778 when searching?
2) Can we do the same for all things in the footer? I tried it but
bug seems to be a special case and nothing else works.
The stored things are set in gerrit config.
On 19/03/13 00:54, Sumana Harihareswara wrote:
Thomas, thank you for writing to us and mentioning that you're available
for work! I'd love for you to take a look at the Wikimedia Foundation's
job openings. For most of them, telecommuting is fine.
That's a bit misleading, as only a couple
On 19/03/13 14:38, Seb35 wrote:
Hello,
According to [1] and [2], Firefox 22 (release June 25, 2013) will change
the default third-party cookie policy: a third-party cookie will be
authorized only if there is already a cookie set on the third-party
website.
This would break most of the
On 19/03/13 17:41, Chris Steipp wrote:
On Tue, Mar 19, 2013 at 8:57 AM, Brion Vibber br...@pobox.com wrote:
On Tue, Mar 19, 2013 at 7:52 AM, Platonides platoni...@gmail.com wrote:
An idea to fix it would be to take advantage of the new certificate
which includes all projects, by having firefox
On 19/03/13 19:21, Jon Robson wrote:
Chris: On the latest iPhone cookies were not accepted from iframes
from sites that were not visited. You had to physically visit the site
by following a link or typing the url into the address bar first. We
are currently investigating whether meta refresh
On 12/03/13 18:47, Toni Hermoso Pulido wrote:
Hello,
I'm checking whether I can detect that a process is run from a
maintenance script in a parser function extension.
What would be the best / most recommendable way to detect it?
Thanks!
Why do you want to do it? It is probably a bad
Doing the tags in Gerrit is the right thing.
Who can change them is not a problem, just make a changetags log. (BTW,
you would have the same problem in bugzilla with people removing that
superimportant tag)
On 11/03/13 00:17, Yuri Astrakhan wrote:
Answered Inline. Also, I apologize as I think my email was slightly
off-topic to Ori's question.
On Sun, Mar 10, 2013 at 6:57 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:
PHPMyAdmin also has major security issues. It isn't allowed on
On 12/03/13 00:05, Paul Selitskas wrote:
Git review, of course. The log is here: http://pastebin.com/iC4N1am0
Everyone can send patches to all repositories. If you get an error doing
that, it's not that you are denied, it's some configuration problem.
In this case, it looks like you didn't have
On 10/03/13 15:50, Waldir Pimenta wrote:
The motivation for the tool came from a post by Niklas [1], specifically
the section «Coping with the proliferation of tools within your community».
In the comments section, Nemo announced his initiative to create a custom
google search to fit at least
On 10/03/13 22:39, Chad wrote:
Hi,
I've been thinking about this for the last week or so because it's becoming
incredibly clear to me that core isn't scaling. It's already taking up over
4GB on the Gerrit box, and this is the primary reason core operations are
slow.
4GB??
My not specially
Chris Steipp wrote:
csteipp. Feel free to ping me whenever.
And platonides :)
On 09/03/13 15:47, Waldir Pimenta wrote:
So mw-config can't be deleted after all? Or you mean the installer at
includes/installer?
If you mean the former, then how about run-installer instead of my
previous proposal of first-run?
Any of these would be clearer than mw-config, imo.
--Waldir
On 06/03/13 16:28, Jay Ashworth wrote:
To “convey” a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user
through a computer network, with no transfer of a copy, is not
conveying.
As javascript is executed in the client, it
On 07/03/13 21:03, anubhav agarwal wrote:
Hey Chris
I was exploring the SpamBlacklist extension. I have some doubts I hope you
could clear up.
Is there any place I can get documentation of
the SpamBlacklist class in the file SpamBlacklist_body.php?
In function filter what does the following
On 05/03/13 21:55, Matthew Flaschen wrote:
If it does turn out we legally *need* more license
preservation/disclosure, we should add more license preservation.
Getting a special get out of jail free card for WMF only is not
acceptable. Our sites run free software, software that anyone can
On 06/03/13 13:24, Platonides wrote:
I just checked and there are 73 authors of the resources of MediaWiki
core. More than I expected, but not unworkable. We could relicense our
css and javascript as MIT, MPL, GPL-with-explicit-exception...
I was going to provide the full list:
$ git log
On 05/03/13 14:07, Alexander Berntsen wrote:
On 05/03/13 13:18, Max Semenik wrote:
If you mean that we have to insert that huge chunk of comments from
[1] into every page, the answer is no because we'll have to
include several licenses here, making it ridiculously long.
Please see the
On 05/03/13 21:53, Matthew Flaschen wrote:
On 03/05/2013 12:29 PM, Luke Welling WMF wrote:
We should discuss them separately, but this core mediawiki JS is GPL2
https://github.com/wikimedia/mediawiki-core/tree/master/resources
I am referring to Isarra's comment:
The licensing information
On 01/03/13 23:59, Daniel Friesen wrote:
On Fri, 01 Mar 2013 14:45:14 -0800, Nischay Nahata wrote
I also prefer it in the header. The bug report is the best description :)
Is it not possible for Gerrit to search if it's in the header? Or make
it so
+1
Tools should be coded around people.
On 02/03/13 19:13, Bartosz Dziewoński wrote:
So you're volunteering to write release notes for my commits? By all
means, if so.
But I'm afraid this would end with simply no release notes being written.
Who would want to read and deeply understand 2000 commit messages per
release to note
On 28/02/13 18:48, Brion Vibber wrote:
On Thu, Feb 28, 2013 at 6:20 AM, Petr Bena benap...@gmail.com wrote:
it's blocked in my office as well, there are many ways to get through
the firewall... most simple is just to install a bouncer or use irssi
in a terminal of remote server if port 22 is
On 27/02/13 18:47, Tim Landscheidt wrote:
Petr Bena benap...@gmail.com wrote:
In addition to this I migrated labs nagios to icinga as well, few
minutes ago - http://nagios.wmflabs.org/icinga/
[...]
Interestingly, Google Chrome claims that this page is in
French and asks whether it
On 27/02/13 15:23, Tyler Romeo wrote:
Also, with the exception of asking for technical help, I don't really like
IRC for developer discussion, and it's not just because I don't go on IRC.
If you are coding/reviewing MW code, I recommend that you be available on
the IRC channel. That way we could
On 26/02/13 11:57, Petr Bena wrote:
Hi, this is more related to mediawiki rather than wikimedia, but this
list is being watched a bit more I guess.
Is there any extension that allows permanent removal of deleted pages
(or eventually selected deleted pages) from database and removal of
On 23/02/13 23:58, Mark A. Hershberger wrote:
That is, I think it is safe to say LQT will remain usable in its current
state on any coming MW versions for the foreseeable future.
Right now, though, all I'm looking for is a confirmation that it will
remain usable. I imagine one of the first
Another option is just to only keep old versions of the skins folder (1.4M).
Given that CSS and JS go through the RL, the few images we use and link
from the HTML could probably be squeezed into a single static path, though.
On 20/02/13 00:49, Luis Villa wrote:
Hah... I think calling me a developer is, at this point, a bit of a
stretch, but I look forward to working with the tech team :)
¡Welcome, Luis!
On 21/02/13 10:18, Denny Vrandečić wrote:
After evaluating different options, we want to use for generating
Wikidata's RDF export the EasyRDF library: http://www.easyrdf.org/
We only need a part of it -- whatever deals with serializers. We do not
need parsers, anything to do with SPARQL,
Platonides) the installer used to be
in the config folder, until the rewrite, which *moved the classes* to
includes/installer (emphasis mine). If the classes were moved to
includes/installer, why did those in overrides.php remain?
Read the beginning of overrides.php:
<?php
/**
* MediaWiki
On 19/02/13 13:56, Tyler Romeo wrote:
So unfortunately I don't have a clear idea of what the problem is,
primarily because I don't know anything about the Parser and its inner
workings, but as far as having all the data in one page, here's something.
Maybe this is a bad idea, but how about
Welcome Greg!
On 15/02/13 17:28, Waldir Pimenta wrote:
On Fri, Feb 15, 2013 at 11:58 AM, Platonides platoni...@gmail.com wrote:
On 15/02/13 09:16, Waldir Pimenta wrote:
1) should all access points be on the root directory of the wiki, for
consistency?
No. The installer is in its own folder on purpose, so
On 15/02/13 09:16, Waldir Pimenta wrote:
While trying to add some more information to
https://www.mediawiki.org/wiki/Manual:Code, I came across a slightly
peculiar issue regarding the entry points for MediaWiki:
Right now, among all the entry points that I know of (those are listed in
On 15/02/13 01:38, Brad Jorsch wrote:
I'd propose one more:
* Someone else gives +2, but Jenkins rejects it because it needs a
rebase that is not quite trivial enough for it to do it automatically.
For example, something in RELEASE-NOTES-1.21.
Seems a better example.
I'm not convinced that
On 15/02/13 18:51, Chris Steipp wrote:
This process is painful (no one like reviewing patches in bugzilla),
and the wrangling to get the right people to review patches in
bugzilla is slowing down our security releases. It would be much
better if we had a way to submit the patches in gerrit, go
On 14/02/13 18:26, Faidon Liambotis wrote:
Ubuntu has experimented in the past with the concept of automatically
generating and shipping symbols for *all* packages, packaged up in
ddebs (same format as .deb) and shipped via a different repository
that isn't mirrored by all of the downstream
Welcome Ed!
On 12/02/13 06:26, Brian Wolff wrote:
For subpages to really fill this use case I think the page title would have
to show only (or primarily emphasize) the subpage name instead of the full
page name.
I think it has been brought up in the past, there may be an extension
doing that.
Also it
On 08/02/13 21:51, Lee Worden wrote:
As an aside, you could almost certainly do this cheaper with
WorkingWiki. If you can write a make rule to retrieve the Excel file
from the network drive and make it into html and image files (and maybe
a little wikitext to format the page), you're done.
On 07/02/13 21:54, Chad wrote:
On Thu, Feb 7, 2013 at 3:50 PM, Platonides wrote:
Also worth mentioning, our SVN is now read-only.
This actually happened on Feb 1st :)
-Chad
I did check before sending.
«Marking all of SVN as read-only» sent Jan 24th.
Follow-up the next day saying
On 07/02/13 20:57, Guillaume Paumier wrote:
*Git conversion* https://www.mediawiki.org/wiki/Git/Conversion
The ExtensionDistributor
https://www.mediawiki.org/wiki/Extension:ExtensionDistributor was
rewritten in early January. While this was primarily done to support
the data center
I don't like its cryptic nature.
Someone looking at the headers sent to his browser would be very
confused about what's the point of «X-MF-Mode: b».
Instead something like this would be much more descriptive:
X-Mobile-Mode: stable
X-Mobile-Request: secondary
But that also means sending more
Your system does not seem to support the OGG audio format. Downloading this
file without OGG support may need a significant amount of space and
bandwidth. Downloading article.wav ... 20%
On 25/01/13 16:54, Yuri Astrakhan wrote:
Localization in v2 - all errors AND warnings are localized in default
language unless lang= is given, in which case you can get parameter
array or a non-default language. All standard translation magic
(plural/gender/etc) will be supported. Warnings
Please join me in welcoming Ram!
Rob
It's always good to get more ram! :)
Welcome!
Well, I would prefer to get a notice that I made a typo than having that
embarrassing typo in the commit log forever. That's the point of using a
gating system, right? :)
So yes, I do think they should be corrected. (And I have committed typos
in both commit messages and inside files, just as
On 15/01/13 14:44, Andreas Plank wrote:
Hi,
I'm using MW 1.20.2 and I want to get the content of a page for
further parsing in a PHP application. The PHP application is triggered
via a special page (Special:MobileKeyV1) and parses nature guides for
mobile devices.
I tried to get the
On 29/12/12 22:23, Alex Brollo wrote:
I'd like to use HTML comments in raw wikitext, as effective,
server-inexpensive data containers that could be read and parsed by a JS
script in view mode. But I see that HTML comments, written into raw wiki
text, are stripped away by parsing
On 28/12/12 18:29, Tilman Bayer wrote:
On Fri, Dec 28, 2012 at 1:26 AM, Sumana Harihareswara wrote:
I've floated this problem past Tor and privacy people, and here are a
few ideas:
1) Just use the existing mechanisms more leniently. Encourage the
communities (Wikimedia Tor) to use
On 19/12/12 20:48, Krinkle wrote:
This issue will be definitely solved by isolating tests in dedicated virtual
machines for each run. We are investigating Vagrant.
A VM seems overkill when it can be solved with standard user permissions
+ chroot (or even better, a BSD jail)
I found that I no longer can +1 to operations (in this case
operations/mediawiki-config, c38631).
The only options are 0 and -1.
On 12/12/12 09:09, Ryan Kaldari wrote:
I found a solution to the problem:
If a gerrit administrator declares the mimetypes of the files to be safe,
they will be displayed in-browser rather than downloaded as zip files:
On 12/12/12 21:23, Rob Lanphier wrote:
My 2c: if we add a new status, it should equate to deployed on the
cluster, along with judicious use of milestone so that people who are
just interested in the tarball can infer from our numbering what the
corresponding release will be.
On which wiki?
If you only want to run parser tests from a different file, there's no
need to create a new class.
You simply add in the main file:
$wgParserTestFiles[] = dirname( __FILE__ ) . '/lstParserTests.txt';
(already in lst.php)
It will be automagically picked when running the phpunit tests (if you
have
I don't think they are really a problem. core was broken (it shouldn't
contain submodules), thus all tests failed.
As c37975 was an automerge, jenkins didn't have the chance to stop it.
It would be interesting if jenkins would -2 whenever a change is sent to
master branch with a parent in a
On 12/12/12 00:44, bawolff wrote:
I'm actually quite curious to see if there are actually enough MW devs
in a single city (other than WMF's home town) to form a group.
To be honest though, I kind of feel that if such groups were going
to form, they probably would have already. Formality