Heya,
The Program Committee for the Architecture Summit has published a proposed
program: https://www.mediawiki.org/wiki/Architecture_Summit_2014
Highlights of the Program:
1) We have tried to incorporate flexibility into the program by allowing 4
unconference break-out slots, 1 open plenary
as a sign that it's not important; on the contrary, we are just trying to find the most
appropriate slot for it.
D
On Thu, Jan 16, 2014 at 10:47 AM, Diederik van Liere
dvanli...@wikimedia.org wrote:
Heya,
The Program Committee for the Architecture Summit has published a
proposed
program: https
Heya,
Today, January 8th, until 11:59 PM PST, you can vote in the straw poll for
the Architecture Summit. Please cast your votes here:
https://www.mediawiki.org/wiki/Architecture_Summit_2014/Straw_poll
Tomorrow I will start creating the program for the Summit and cannot
promise to include your
Heya,
If you haven't exercised your right to cast a vote in the Straw Poll for
the Architecture Summit, then this would be a really good time to do so.
I would like to close the poll by January 8th so we can start making the
final program.
You can find the straw poll here:
@Chad: should this be included in the straw poll for the architecture
summit or is that too soon?
D
On Tue, Dec 31, 2013 at 6:55 PM, Chad innocentkil...@gmail.com wrote:
I'm starting a new RFC to discuss ways we can improve our PHP profiling.
Hi everyone,
Best wishes for 2014! I hope participating in the straw poll for the Architecture
Summit is on your list of resolutions for the New Year --
https://www.mediawiki.org/wiki/Architecture_Summit_2014/Straw_poll
Quick refresher: we have created clusters of related RFCs (
Bumping my proposal. I am particularly looking forward to responses from
community members.
Have an awesome New Year's Eve!
Best,
Diederik
Heya,
I just posted an initial proposal to start running a biweekly showcase to
feature all the cool things that are happening on Labs. Please have a
Heya,
We are making good progress with creating clusters to group RFCs for the
upcoming Architecture Summit. Some clusters are still too big; in
particular, the following clusters can/should be split into smaller clusters
of 3-4 RFCs each.
* General Mediawiki Functionality
* Backend code
Hola,
We wanted to update you on our proposal for how to create a program for the
Architecture Summit this coming January. We started grouping the RFCs from
https://www.mediawiki.org/wiki/RFC into clusters at
and get this written in the next 2 weeks, but it would be good to
capture this even in a stub-like form (not sure if stubs are allowed
on the RFC page)
Hey Jon,
If there's anything I can do to help you with this RfC then please let me
know.
Best,
Diederik
On Tue, Nov 26, 2013 at 6:27 PM, Diederik van
Heya,
I would suggest running it for at least a 7-day period so you capture the
weekly time trends; increasing the sample size would also be
advisable. We can help set up a udp-filter for this purpose as long as
the data can be extracted from the user-agent string.
D
On Wed, Sep 4,
Hi Cheol!
Thanks for alerting us to this issue. We are looking into it right now.
Best,
Diederik
On Mon, Aug 5, 2013 at 4:24 PM, Ryu Cheol rch...@gmail.com wrote:
Hello guys,
http://dumps.wikimedia.org/other/pagecounts-raw/2013/2013-08/ has not been
updated for a few hours.
I don't know who
This bug has been fixed, see
https://bugzilla.wikimedia.org/show_bug.cgi?id=45178
I will post a message on the Village Pump as well.
Best,
Diederik
On Sun, Feb 3, 2013 at 3:44 PM, Brad Jorsch bjor...@wikimedia.org wrote:
On Fri, Jan 25, 2013 at 12:51 PM, Diederik van Liere
dvanli
Awesome news! Go team Mobile!
D
On Mon, Mar 18, 2013 at 1:59 PM, Rachel Farrand rfarr...@wikimedia.org wrote:
Welcome Adam and Yuri! Looking forward to working with both of you. :)
Rachel
On Mon, Mar 18, 2013 at 10:48 AM, Erik Moeller e...@wikimedia.org wrote:
On Mon, Mar 18, 2013 at 10:29
a new generic
MediaWiki HTTP response header dedicated to logging, containing key-value
pairs.
-Asher
On Tue, Feb 12, 2013 at 9:56 AM, Asher Feldman afeld...@wikimedia.org
wrote:
On Tuesday, February 12, 2013, Diederik van Liere wrote:
It does still seem to me that the data to determine
Hi all,
Lars, Rupert, thanks for flagging this, and you are quite right: the numbers
are too high because webstatscollector, the software that does the counts,
just counts every request as a hit, including bots, error pages, etc.
I am planning on running a sprint at the Amsterdam Hackathon to build
It does still seem to me that the data to determine secondary API requests
should already be present in the existing log line. If the value of the
page param in an action=mobileview API request matches the page in the
referrer (perhaps with normalization), it's a secondary request as per case
Analytics folks, is this workable from your perspective?
Yes, this works fine for us, and it's also no problem to set multiple
key/value pairs in the HTTP header that we are now using for the X-CS
header.
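For what it's worth, here is a minimal sketch (Python, purely illustrative) of
the matching described above; the /wiki/ path convention and the
underscore/space normalization are assumptions on my side, not the exact
production rule:

    from urllib.parse import urlparse, parse_qs, unquote

    def normalize_title(title):
        # Illustrative normalization: URL-decode and treat underscores as spaces.
        return unquote(title).replace('_', ' ').strip()

    def is_secondary_mobileview(request_url, referer):
        # True if an action=mobileview API request asks for the same page as its referrer.
        query = parse_qs(urlparse(request_url).query)
        if query.get('action') != ['mobileview'] or 'page' not in query:
            return False
        ref_path = urlparse(referer).path
        if not ref_path.startswith('/wiki/'):
            return False
        return normalize_title(query['page'][0]) == normalize_title(ref_path[len('/wiki/'):])

    # e.g. is_secondary_mobileview(
    #          'https://en.m.wikipedia.org/w/api.php?action=mobileview&page=San_Francisco',
    #          'https://en.m.wikipedia.org/wiki/San_Francisco')  ->  True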
Diederik
Thanks Ori, I was not aware of this
D
Sent from my iPhone
On 2013-02-02, at 16:55, Ori Livneh o...@wikimedia.org wrote:
On Saturday, February 2, 2013 at 1:36 PM, Platonides wrote:
I don't like its cryptic nature.
Someone looking at the headers sent to his browser would be very
(Apologies for cross-posting)
Heya,
The mobile team needs accurate pageviews for the alpha and beta mobile
site. Currently, this information is only stored in a cookie, but we don't
want to go down the route of starting to store this cookie because of cache
server performance, network performance
Yes, let's not change the filenames.
D
Sent from my iPhone
On 2013-01-31, at 18:45, Matthew Walker mwal...@wikimedia.org wrote:
We will most likely change the file names back to their original names in a
month or so
Please don't. It'll serve as a visible marker for the future for when we go
heya,
for all you Java junkies out there, oh wait there are very few within WMF
:) Anyways, if you do Java you can now use the Nexus Maven repo that is
installed on Labs at http://nexus.wmflabs.org/nexus/index.html#welcome
We are happy to give you an account, please poke us on IRC @
Apologies for crossposting
Heya,
The Analytics Team is planning to deploy tab as the field delimiter to
replace the current space as field delimiter on the varnish/squid/nginx
servers. We would like to do this on February 1st. The reason for this
change is that we need to have a consistent number of
the format
of that will probably break third party scripts.
--
-bawolff
On Fri, Jan 25, 2013 at 1:41 PM, Diederik van Liere
dvanli...@wikimedia.org wrote:
Apologies for crossposting
Heya,
The Analytics Team is planning to deploy tab as field delimiter to
replace the current space
Hey Quim
I also sent you this survey a week ago with the question whether we should
participate :)
D
On Fri, Nov 16, 2012 at 5:13 PM, Quim Gil q...@wikimedia.org wrote:
Hi, sorry for cross-replying.
On Wed, Nov 14, 2012 at 3:11 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On
On 2012-11-14, at 18:33, Platonides platoni...@gmail.com wrote:
On 13/11/12 23:42, MZMcBride wrote:
Please stop top-posting. If you don't understand what that means, please
read https://wiki.toolserver.org/view/Mailing_list_etiquette.
As I posted at
Dario has been proposing RevTagging to address exactly this need; see:
http://www.mediawiki.org/wiki/Revtagging
I really think we should put this on the roadmap for 2013 for MediaWiki; we
definitely need this more granular level of instrumentation for determining
the source of an edit.
Best
Hi,
I made the exact same argument a while back (Dropping the LATER resolution
in Bugzilla
http://wikimedia.7.n6.nabble.com/Dropping-the-LATER-resolution-in-Bugzilla-td743804.html
)
+1
D
On Mon, Nov 5, 2012 at 5:25 PM, Quim Gil quim...@gmail.com wrote:
I was a bit of a lazy child, specially
I'm not even sure where to find the code for http://gerrit-stats.wmflabs.org/ .
In gerrit I could only find
the /analytics/scorecard project.
The repo is available at:
https://gerrit.wikimedia.org/r/gitweb?p=analytics%2Fgerrit-stats.git;a=shortlog;h=HEAD
As
Question: what is the best approach to retrieve the number of existing
Gerrit accounts?
This number is already stored within gerrit-stats; it is just not being
written to a dataset.
D
On World IPv6 Day (June 6th, 2012), we had about 5,000 IPv6 hits;
however, for the first 17 days of September we had a total
of 1,000,032,000 hits coming from IPv6 addresses. This is based on the
sampled squid log data.
Best,
Diederik
On Mon, Sep 17, 2012 at 8:03 AM, David Gerard dger...@gmail.com
Here is the actual raw data since Jan. 1st, 2012 (multiply each observation
by 1000 to get the estimated number of hits for that day). The assumption
is that each hit has the same probability of showing up in the squid log
file.
As you can see, after World IPv6 Day, we started supporting way more
Is the slowness issue known?
-Niklas
Yes, this is known, and it is related to the fact that gerrit-stats is
currently hosted on a Labs instance. We are working on migrating it to
another server.
Best,
Diederik
Hi everybody,
The Analytics Team is happy to announce the first version of gerrit-stats.
Gerrit-stats keeps track of the code review backlog for individual Git
repositories.
The gerrit-stats dashboard is available at http://gerrit-stats.wmflabs.org
Currently, it has a few example charts but we can
There seems to be a 10-day lag (no data after August 21st). Is this a
bug or a feature?
Data hasn't been pushed to gerrit for 10 days; something is wrong with the
script. We will fix it today.
D
Hi Harry,
The change set numbers are accurate and the spikes are caused by translatewiki.
See my response to Siebrand on how to remove the outliers and create a smoother
chart.
Best
Diederik
Sent from my iPhone
On 2012-08-25, at 17:01, Harry Burt jarry1...@gmail.com wrote:
I realise that
On 2012-08-23, at 2:42 AM, Siebrand Mazeland (WMF) wrote:
The graph for new changesets fluctuates a lot. I would guess this is
due to change sets submitted by user l10n-bot. Maybe it's a good idea
to filter those out, to get a line that's a little easier to
interpret.
Hey Siebrand,
I
Anyone want me to go back through the specs and make a list of some of the
things that are wrong with both
Yes! I think that would be hugely helpful!
Diederik
Hi all,
The lead author of OAuth 2.0, Eran Hammer, has withdrawn his name from the
OAuth 2 spec:
http://hueniverse.com/2012/07/oauth-2-0-and-the-road-to-hell/
That's very sad news, IMHO, and it probably means we really should
reconsider what protocol we want to support: OAuth 1.0 / OAuth 2.0 /
Hey Arthur,
It seems that the redirection to the mobile donation site
(donate.m.wikimedia.org) does not work.
D
Sent from my iPhone
On 2012-07-19, at 19:35, Arthur Richards aricha...@wikimedia.org wrote:
PS big thanks to Asher Feldman for getting the change compiled and deployed.
On Thu,
Yes, I don't disagree that jshint should be run by Jenkins. AIUI
Timo's work to make jshint work on the command line is prep work for
exactly that.
Ah, I misunderstood you. Thought you meant so people can
run it before uploading which no one will ever do ;-)
Maybe we should create a
Roan Kattouw wrote:
Yes, ops essentially uses a post-commit workflow right now, and that
makes sense for them.
ops also uses pre-commit review for non-ops people :-]
Yeah, that's right. What I meant to say (and thought I had said in
some form later in that message) was that the puppet
Hey Chris,
Could you give us a ballpark estimate of how many search queries you expect per
day?
Best
Diederik
Sent from my iPhone
On 2012-06-19, at 13:51, Chris Peterson cpeter...@mozilla.com wrote:
Thanks, Ryan. When you guys would like Mozilla to make this switch to HTTPS,
you can just
Hi Antoine,
I really think we need to rethink how we are handing out non-admin Gerrit
rights to our engineers, both staff and volunteers. Create repo and create
branch rights should be handed out by default. There is absolutely zero
reason for being stingy in handing out these rights. The loss of
The analytics team has written a script to generate such reports, and we will
publish these results shortly, once we have enough data points.
Best
Diederik
Sent from my iPhone
On 2012-06-14, at 4:31, Sébastien Santoro dereck...@espace-win.org wrote:
Hi,
I saw this morning those reviewed but
Hi Ori,
I absolutely 100% agree and we really need to sort this out this week. The
lost productivity is unacceptable.
So far I have heard different arguments for why we cannot hand out 'create-repo
rights' to engineers:
The first reason was that only admins could do it, but that is no longer
true
On Tue, Jun 5, 2012 at 8:44 AM, Jeremy Baron jer...@tuxmachine.com wrote:
On Tue, Jun 5, 2012 at 2:25 AM, Diederik van Liere dvanli...@gmail.com
wrote:
Now a new argument has been unleashed, namely that we cannot delete
repos. The fact that we cannot delete repos is a non-argument. None of us
So the estimated maximum number of projects is 10,000, while the default
maximum is 1,000.
For contributors, the default maximum is 1,000 and the estimated maximum
number is 50,000.
Can we please tag this concern as addressed and start handing out the
rights?
Diederik
On Tue, Jun 5, 2012 at 11:32
Did anyone say, 'Ask about it'? I'm sure if you followed up with one
of the project creators (e.g. Chad) he would have been more than happy
to push things along.
I am sorry but I disagree. The question is not whether Chad or one of the
Gerrit admins will help us, because they are super
I've whipped up a quick tutorial for people who want to create new
repositories[0]. If people can read and make sure they understand
this page (with its various caveats), then yes, we can start handing
this out.
-Chad
[0] https://www.mediawiki.org/wiki/Git/Creating_new_repositories
Hi all,
Ryan Lane just showed me that in Gerrit there is a separate right for creating
repositories. I suggest we give this right to all WMF engineers. A repo is free
and fun and will prevent unnecessary delays.
Best,
Diederik
automate the icky backend stuff.
-Chad
On Jun 1, 2012 10:33 AM, Diederik van Liere dvanli...@gmail.com wrote:
Hi all,
Ryan Lane just showed me that in Gerrit there is a separate right for
creating repositories. I suggest we give this right to all WMF engineers. A
repo is free and fun
I don't think we should aim to cater to non-developers at all. The chances that
a non-developer finds a real bug are very, very small (in my previous life as an
academic I have done a lot of research on Bugzilla and developer productivity
and it's based on that experience that I am making this
Hey Lars,
You might be interested in the WMF Analytics mailing list at
https://lists.wikimedia.org/mailman/listinfo/analytics. There we discuss all
our analytics projects, usually a little bit less focused on MediaWiki
issues, but definitely focused on WMF data.
Hope to see you there!
Hi all,
In the last 24 hours I have found two new cases of spaces in log lines where
the space is not used as a delimiter.
Case 1:
There are mobile page requests that contain a space in the URL, for example:
ssl1002 2198871 2012-04-06T23:50:24.566 0.002 0.0.0.0 FAKE_CACHE_STATUS/301
1051 GET
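To make the delimiter problem concrete, here is a small sketch (Python; the URL
with a space is an invented example, and the field counts refer only to the
truncated sample line above) of why such lines break positional parsing, and
why the tab-delimiter change announced earlier avoids it:

    # Space-delimited: a URL containing a space inflates the field count, so
    # positional parsing ('the Nth field is the URL') silently breaks.
    space_line = ('ssl1002 2198871 2012-04-06T23:50:24.566 0.002 0.0.0.0 '
                  'FAKE_CACHE_STATUS/301 1051 GET http://example.org/wiki/Foo Bar')
    print(len(space_line.split(' ')))   # 10 fields instead of the expected 9

    # Tab-delimited: the embedded space no longer acts as a delimiter, so every
    # line yields the same number of fields.
    tab_line = '\t'.join(['ssl1002', '2198871', '2012-04-06T23:50:24.566', '0.002',
                          '0.0.0.0', 'FAKE_CACHE_STATUS/301', '1051', 'GET',
                          'http://example.org/wiki/Foo Bar'])
    print(len(tab_line.split('\t')))    # always 9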
The current version of http://www.mediawiki.org/wiki/OAuth was written by
me and Dario.
It's definitely a starting point and not a finished proposal. I am not sure
to what extent the OAuth 2
protocol has evolved since this was written but that definitely needs to be
checked.
Diederik
On Fri,
My suggestion for how to filter these bots efficiently in a C program (no
costly, nuanced regexps) before sending data to webstatscollector:
a) Find the 14th field in the space-delimited log line = user agent (but beware of
false delimiters in logs from varnish, if still applicable)
b) Search this
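The suggestion above targets a C filter; as a rough sketch of the same
field-based approach, here it is in Python, with invented bot markers rather
than the real filter list:

    BOT_MARKERS = ('bot', 'crawler', 'spider')   # illustrative markers, not the actual list

    def is_bot_request(log_line):
        # Step (a): take the 14th space-delimited field as the user agent.
        # Step (b): plain substring checks instead of costly regexps.
        fields = log_line.split(' ')
        if len(fields) < 14:
            return False          # malformed or truncated line: let it pass through
        user_agent = fields[13].lower()
        return any(marker in user_agent for marker in BOT_MARKERS)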
Hi all,
March 2nd, we had our 2nd WMF Analytics Day. We taped all the sessions and they
are now available on Commons:
http://commons.wikimedia.org/wiki/File:WMF_Analytics_Day_-_Cassandra.ogv
http://commons.wikimedia.org/wiki/File:WMF_Analytics_Day_-_HBase.ogv
Hi Srikanth,
Yes, we are looking into the growth percentages as they seem
unrealistically high.
Best,
Diederik
On Mon, Apr 9, 2012 at 3:30 AM, Srikanth Lakshmanan srik@gmail.com wrote:
On Mon, Apr 9, 2012 at 00:46, Erik Zachte ezac...@wikimedia.org wrote:
returns 20 lines from this
Thanks, I meant to ask which languages will initially be supported :)
D
On 2012-04-06, at 7:04 AM, Antoine Musso wrote:
On 05/04/12 20:20, Diederik van Liere wrote:
Which languages will Jenkins support?
Jenkins is just a bot; we can make it do whatever we want. The plan is
to have
Hi Chad,
On 2012-04-05, at 2:17 PM, Chad wrote:
Once we've got jenkins working reliably, I plan to remove the
verified permission so only the bots can set it.
-Chad
Which languages will Jenkins support?
Best,
Diederik
I am in touch with the core developers of the Apache Devicemap project
and we are exploring the possibility of collaborating. If something
comes out of this exploration then I will announce it here.
Best,
Diederik
On Wed, Mar 21, 2012 at 2:33 AM, Patrick Reilly prei...@wikimedia.org wrote:
I
Yes, ScientiaMobile has made some very important changes to the
license and it does mean (AFAIK) that you cannot store the wurfl.xml
in a repository.
This paragraph is particularly important:
You are not authorized to create a derivative work of or otherwise modify
this WURFL file, and you are
On 2012-03-07, at 6:01 AM, Chad wrote:
My main worry is that we are not spending enough time on getting all
engineers (both internal and in the community) up to speed with the
coming migration to Git and Gerrit and that we are going to blame the
tools (Gerrit and/or Git) instead of the
Hi all,
Some disclaimers before I start my thread:
1) I am a big believer in Git and dvcs and I think this is the right decision
2) I am a big believer in Gerrit and code-review and I think this is
the right decision
3) I might be wholly unaware of or inaccurate about certain things, apologies
in
Hi,
Andre Engels did some analysis of the type of API formats used. The
data is from a single random Sunday in late 2011:
1997267 application/json
314285 text/xml
171259 -
68358 application/vnd.php.serialized
55549 text/html
34680 text/javascript
8907 application/x-www-form-urlencoded
8882
Welcome Andrew!
Super excited to have you joining us!
Diederik
Sent from my iPhone
On 2012-01-06, at 13:13, Sumana Harihareswara suma...@wikimedia.org wrote:
On 01/06/2012 01:08 PM, Rob Lanphier wrote:
We're really excited to have Andrew on board to help bring some
systems rigor to our data
Hi Chad,
Reposurgeon (http://catb.org/~esr/reposurgeon/ ) might be a useful tool to
help fix the svn history.
Best,
Diederik
On Tue, Dec 13, 2011 at 11:47 AM, Chad innocentkil...@gmail.com wrote:
On Tue, Dec 13, 2011 at 11:44 AM, Chad innocentkil...@gmail.com wrote:
Couple of caveats
I think that the current version numbering system is confusing: incremental
version increases from 1.15 to 1.16 to 1.17 to 1.18, etc., suggest to most
people minor changes with no compatibility implications. This is not the
case with MW. The Chrome version numbering is the other extreme, releasing
-1,
Personally, I like them because they give me a quick overview of the
inter-dependencies and how methods relate to each other, and so I guess
that for other 'newbies' this helps in getting through the learning curve
faster.
Diederik
On Thu, Dec 8, 2011 at 12:51 PM, Yuvi Panda
, Diederik van Liere dvanli...@gmail.com
wrote:
Hi folks,
Currently, we have a 'LATER' resolution in Bugzilla; it contains 339 bug
reports across all the products, see:
https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&list_id=57731&resolution=LATER&product=CiviCRM&product
to be fixed.
It is about expectation management :)
On Tue, Nov 29, 2011 at 2:53 PM, Merlijn van Deen valhall...@arctus.nl wrote:
On 29 November 2011 19:45, Diederik van Liere dvanli...@gmail.com wrote:
The question is, when is LATER? Technically, these bugs are not open and
so
nobody will ever see
So today I have read about 100 LATER-marked bug reports and I do think we
need the LATER resolution, but I would suggest limiting its use to only
those bugs where an external constituent, either the Wikipedia community or a
third-party software developer, needs to take an action and
It works on Safari but it definitely gives a backtrace error on Firefox 7.
Diederik
Sent from my iPhone
On 2011-11-05, at 9:56, Amir E. Aharoni amir.ahar...@mail.huji.ac.il wrote:
2011/11/5 Andre Engels andreeng...@gmail.com:
There seems to be a Northern Soto Wikipedia at
I am running Firefox 7.01 on OSX Leopard 10.6.8 and it gives a backtrace
error.
Diederik
On Sat, Nov 5, 2011 at 10:15 AM, Ole Palnatoke Andersen palnat...@gmail.com
wrote:
On Sat, Nov 5, 2011 at 2:56 PM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
2011/11/5 Andre Engels
I think that is true unless you want to upload a fair-use image, which is not
allowed on Commons but is on some Wikipedia sites, like the English one.
Diederik
Sent from my iPhone
On 2011-10-24, at 8:23, Greg DeKoenigsberg greg.dekoenigsb...@gmail.com wrote:
This is a good question. Simone sent it
This is really cool! Thanks Ariel and team for making this available.
Best,
Diederik
On Thu, Sep 15, 2011 at 5:16 PM, MZMcBride z...@mzmcbride.com wrote:
Ariel T. Glenn wrote:
I think we finally have a complete copy from December 2007 through
August 2011 of the pageview stats scrounged from
Thanks for moving the page.
Diederik
On 2011-09-04, at 3:29 PM, Krinkle wrote:
2011/9/4 MZMcBride z...@mzmcbride.com
Diederik van Liere wrote:
I've suggested generating bulk checksums as well but both Brion and
Ariel see
the primary purpose of this field as checking the validity of the dump
Hi,
I've suggested generating bulk checksums as well, but both Brion and Ariel see
the primary purpose of this field as checking the validity of the dump-generating
process, and so they want to generate the checksums straight from the external
storage.
In a general sense, there are two use cases
Hi!
I am starting this thread because Brion reverted
r94289 [0], stating 'core schema change with no discussion' [1].
Bugs 21860 [2] and 25312 [3] advocate for the inclusion of a hash
column (either md5 or sha1) in the revision table. The primary use
case of this column will be to
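For reference, computing such a checksum over revision text is cheap; a small
sketch follows, assuming UTF-8-encoded text and leaving open which hash and
storage encoding the RFC actually settles on:

    import hashlib

    def revision_hashes(text):
        # Checksums of the revision text, as they might be stored in a
        # hash column on the revision table.
        data = text.encode('utf-8')
        return {
            'md5': hashlib.md5(data).hexdigest(),     # 32 hex characters
            'sha1': hashlib.sha1(data).hexdigest(),   # 40 hex characters
        }

    # Two revisions with identical text (e.g. an exact revert) get identical hashes,
    # which is what makes such a column handy for revert detection and dump validation.
    print(revision_hashes('Hello, world') == revision_hashes('Hello, world'))   # True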
Hi!
Over the last year, I have been using the Wikipedia XML dumps
extensively. I used them to conduct the Editor Trends Study [0], and the
Summer Research Fellows [1] and I have used them in the last three
months during the Summer of Research. I am proposing some changes to
the current XML schema
Dear Alec,
Maybe the Community Department can help you out with your question. We
are doing a number of research sprints this summer to map out
different aspects of the Wikipedia communities and this sounds like a
great question and we have some researchers available to help write
the queries.
So
I love this idea!
Diederik
On Wed, Apr 6, 2011 at 5:21 PM, Mark A. Hershberger
mhershber...@wikimedia.org wrote:
Starting with this coming Monday's bug triage, I want to try and make
sure the community's voice is heard. In order to do that, I've created
the “triage” keyword in Bugzilla.
The Python community recently switched to a DVCS and they have
documented their choice.
The write-up compares Git, Mercurial, and Bzr and shows the pluses and minuses of
each. In the end, they went for Mercurial.
Choosing a distributed VCS for the Python project:
http://www.python.org/dev/peps/pep-0374/
Please elaborate.
Diederik
Sent from my iPhone
On 2011-03-03, at 16:12, Dávid Tóth 90010...@gmail.com wrote:
Would it be useful to make a program that would create topic relations for
each wikipedia article based on the links and the distribution of semantic
structures?
...@gmail.com wrote:
On Mon, Feb 14, 2011 at 2:46 AM, Diederik van Liere dvanli...@gmail.com
wrote:
So maybe we can paste these 5 steps (or something similar) in the initial
form used to file a bug report.
This would increase the quality of bug reports and make it easier for bug
triaging.
Increase
If I am not mistaken, Mercurial has better support for highly
modularized open-source
software projects. You can use a Mercurial subrepository (which is
very similar to an svn external or a git submodule). According to their
manual:
Subrepositories is a feature that allows you to treat a
should add fields, we could add this text
as the default text in the textarea so people have a bit more guidance
when writing a bug report.
No hard checks, nothing is mandatory.
On Mon, Feb 14, 2011 at 10:22 AM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
2011/2/14 Diederik van Liere dvanli
+1 to migrate to a DVCS
On Sun, Feb 13, 2011 at 8:38 PM, Mark A. Hershberger
mhershber...@wikimedia.org wrote:
mhershber...@wikimedia.org (Mark A. Hershberger) writes:
The solution I'm proposing is that we branch 1.18 immediately after the
release of the 1.17 tarball.
I want to give
Maybe we can make the bugathon part of the Berlin hackathon?
On Sun, Feb 13, 2011 at 4:03 PM, Ashar Voultoiz hashar+...@free.fr wrote:
On 13/02/11 11:54, Roan Kattouw wrote:
Bugzilla patches are another matter, yes, but I think making sure
patches get reviewed can be a Bugmeister task. We
I think we can draw some inspiration from Mozilla's use of Bugzilla, and in
particular the format they encourage users to follow when submitting a bug report:
1) Steps to reproduce
2) Expected result
3) Actual result
4) Reproducible (by bug reporter): always / sometimes
5) Version information, extensions
://daniel.friesen.name]
On 11-02-13 07:53 PM, Diederik van Liere wrote:
Dear James, Amir, and fellow Wikimedia devs,
I understand your concern, and I am not suggesting that we should force a
user to enter all Bugzilla fields, but rather add those 5 questions as a
guideline
in the free-text form
For the last few months I have been going through Bugzilla, and what strikes me
is that we are not using it as efficiently as other communities do. In
particular, there is little follow-up to reported problems (as Leo mentioned
as well). In the short term, I think we can have a bugathon to clean up
I think one way that non-technical people can help is by trying to replicate
bugs: if they follow the steps as described in the bug report, do they get the
same malfunction or not? That would be a great help, as it weeds out invalid
bug reports.
Sent from my iPhone
On 2011-02-12, at 17:26, phoebe
Dear devs,
I am wondering whether the MediaWiki DB contains a foreign-key
relationship between a main-namespace article and the associated talk
page (if present).
Having this information would greatly simplify analytics projects to
monitor editor behaviour and understand revert behaviour
Yes, manually matching is fairly simple, but in the worst case you need
to iterate over n-1 talk pages (where n is the total number of talk
pages of a Wikipedia) to find the talk page that belongs to a user
page when using the dump files. Hence, if the dump file contained,
for each article, a tag
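There is no foreign key in the dumps, but the pairing is deterministic: a
subject page in namespace N has its talk page in namespace N+1 under the same
title. Here is a sketch of doing the match in one pass with a lookup table
instead of rescanning all talk pages per article (the tuples are invented and
assume titles are stored without their namespace prefix):

    def index_talk_pages(pages):
        # pages: iterable of (namespace, title, page_id) tuples parsed from the dump.
        # Returns {(talk_ns, title): page_id}, so each subject page's talk page
        # becomes an O(1) lookup instead of an O(n) scan.
        talk_index = {}
        for ns, title, page_id in pages:
            if ns % 2 == 1:                 # odd namespaces are talk namespaces
                talk_index[(ns, title)] = page_id
        return talk_index

    def talk_page_for(subject_ns, title, talk_index):
        # Main-namespace article (0, 'Foo') pairs with talk page (1, 'Foo');
        # user page (2, 'SomeUser') pairs with user talk page (3, 'SomeUser').
        return talk_index.get((subject_ns + 1, title))

    pages = [(0, 'Foo', 10), (1, 'Foo', 11), (2, 'SomeUser', 20), (3, 'SomeUser', 21)]
    idx = index_talk_pages(pages)
    print(talk_page_for(0, 'Foo', idx))     # -> 11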
The same error is given for:
* Russian
* Japanese
* Italian
* Arabic (ar is the language code)
Best,
Diederik
2011/1/7 Bryan Tong Minh bryan.tongm...@gmail.com:
On Fri, Jan 7, 2011 at 4:37 PM, Roan Kattouw roan.katt...@gmail.com wrote:
2011/1/7 Bryan Tong Minh bryan.tongm...@gmail.com:
Also
To continue the discussion on how to improve the performance: would it be
possible to distribute the dumps as a 7z / gz / other format archive containing
multiple smaller XML files? It's quite tricky to split a very large XML file into
smaller valid XML files, and if the dumping process is already
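If splitting ever gets picked up, here is a rough sketch of one way to do it
(Python; it assumes, as the current dumps appear to do, that <page> and </page>
start their own lines, and it repeats the dump header in every chunk so each
chunk stays a valid standalone XML file):

    def split_dump(in_path, pages_per_chunk=100000, prefix='chunk'):
        # Split a MediaWiki XML dump into files of at most pages_per_chunk <page> blocks.
        header = []          # everything before the first <page> (<mediawiki ...>, <siteinfo>, ...)
        out = None
        part = 0
        pages_in_part = 0
        with open(in_path, encoding='utf-8') as dump:
            for line in dump:
                stripped = line.lstrip()
                if out is None and not stripped.startswith('<page>'):
                    if not stripped.startswith('</mediawiki>'):
                        header.append(line)
                    continue
                if stripped.startswith('</mediawiki>'):
                    break
                if stripped.startswith('<page>') and (out is None or pages_in_part >= pages_per_chunk):
                    if out is not None:
                        out.write('</mediawiki>\n')
                        out.close()
                    part += 1
                    out = open('%s-%04d.xml' % (prefix, part), 'w', encoding='utf-8')
                    out.writelines(header)
                    pages_in_part = 0
                out.write(line)
                if stripped.startswith('</page>'):
                    pages_in_part += 1
        if out is not None:
            out.write('</mediawiki>\n')
            out.close()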
Which dump file is offered in smaller sub files?
On Sun, Dec 19, 2010 at 6:02 PM, Platonides platoni...@gmail.com wrote:
Diederik van Liere wrote:
To continue the discussion on how to improve the performance, would it be
possible to distribute the dumps as a 7z / gz / other format archive