Since thanking yourself would be weird, allow me to express my thanks for
all your work and shepherding during the logo process (and well beyond
that, but let's keep focus here).
Amir, thank you!
On Mon, Oct 26, 2020 at 8:05 AM Amir Sarabadani wrote:
> Hello people!
> So the voting period is
Thank you so much!
On Sat, Aug 8, 2020, 13:56 Amir Sarabadani wrote:
> Hey,
> Mailman, the software that powers our mailing lists, is extremely old; a
> look at https://lists.wikimedia.org/ gives you a sense of just how old it is.
>
> I would really like to upgrade it to Mailman 3, which has these
Thanks Victoria! I really enjoyed our interaction, and thank you for what
you did for the movement. I wish you all the best on your future path, and
hope you are not becoming a stranger!
Best wishes,
Denny
On Tue, Jan 15, 2019 at 10:25 AM Amir Sarabadani
wrote:
> One thing I want to point out
> e.g. http://mappings.dbpedia.org/index.php/OntologyProperty:BirthDate
>
> You could directly use the DBpedia-lemon lexicalisation for Wikidata.
>
> The mappings can be downloaded with
>
> git clone https://github.com/dbpedia/extraction-framework ; cd core ;
> ../run download-mappin
ding because this (ambitious!) proposal may be of interest to
> people
> > >> on other lists. I'm not endorsing the proposal at this time, but I'm
> > >> curious about it.
> > >>
> > >> Pine
> > >> ( https://meta.wikimedia.org/wiki/Use
Ah, that sounds good. I was thinking of a scenario where someone runs code
in, say, Labs, and gains access to memory while that machine generates my
temporary code to send it to me, and thus gains access to that code.
Or, alternatively, just attack my browser through a compromised site
running a
I often get emails that someone is trying to get into my accounts. I guess
there are just some trolls trying to log in to my Wikipedia account. So
far, these have been unsuccessful.
Now I got an email that someone asked for a temporary password for my
account.
So far so good. What I am
Surely it should be possible to have a more resilient access to Wikipedia
content in the app than most vanilla browsers provide?
On Sat, Apr 29, 2017, 13:40 Florian Schmidt <
florian.schmidt.wel...@t-online.de> wrote:
> Because the app uses the same base URL for the requests to Wikipedia as
>
Does it also affect the app?
If so, why, and can we circumvent that?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
/RedisConnectionPool.php:233
And I can't find a bool being given here. For fun, I cast $port to an int,
but that didn't change anything.
Grmpf.
Thanks for the pointer though.
On Mon, Apr 17, 2017 at 7:49 PM Chad <innocentkil...@gmail.com> wrote:
> On Mon, Apr 17, 2017 at 3:07 PM Denny Vrandeči
Indeed, it seems you're right: if I namespace the thing, the compiler
actually tells me that namespace/int is not the same as int. So yeah, it
probably assumes a type int here, which has no relation to the built-in
scalar int.
D'oh.
I didn't realize until now that one wasn't allowed to use type
That would be neat!
On Mon, Apr 17, 2017 at 7:51 PM Chad <innocentkil...@gmail.com> wrote:
> On Mon, Apr 17, 2017 at 2:07 PM Gergo Tisza <gti...@wikimedia.org> wrote:
>
> > On Mon, Apr 17, 2017 at 8:35 PM, Denny Vrandečić <vrande...@gmail.com>
> > wrote:
>
#29 ():
MWExceptionHandler::handleException()
Apr 17 21:48:57 mediawiki-vagrant hhvm[29575]: #30 {main}
On Mon, Apr 17, 2017 at 2:07 PM Gergo Tisza <gti...@wikimedia.org> wrote:
> On Mon, Apr 17, 2017 at 8:35 PM, Denny Vrandečić <vrande...@gmail.com>
> wrote:
>
The same works with bool too. I am glad I don't have to write (bool)true :)
On Mon, Apr 17, 2017 at 1:19 PM Denny Vrandečić <vrande...@gmail.com> wrote:
> Hm. If I try it in https://3v4l.org/WOTg0 I actually get good behavior
> for HHVM 3.12.14 - so it might be something problema
Hm. If I try it in https://3v4l.org/WOTg0 I actually get good behavior for
HHVM 3.12.14 - so it might be something problematic with our deployed HHVM
version?
On Mon, Apr 17, 2017 at 1:16 PM Denny Vrandečić <vrande...@gmail.com> wrote:
> Thanks for the suggestion, but I get exactly
Apr 17, 2017 21:36, "Denny Vrandečić" <vrande...@gmail.com> wrote:
>
> > I'm running into a weird problem, which made me reset my whole vagrant.
> >
> > I assume this is not strictly a MediaWiki issue, but probably an HHVM
> > problem, but maybe someone can h
I'm running into a weird problem, which made me reset my whole vagrant.
I assume this is not strictly a MediaWiki issue, but probably an HHVM
problem, but maybe someone can help me here.
So, if I start a fresh MediaWiki Vagrant installation, and then vagrant ssh
into the virtual machine, the
question!
On Mon, Apr 10, 2017 at 2:42 PM Bartosz Dziewoński <matma@gmail.com>
wrote:
> On 2017-04-10 06:17, Denny Vrandečić wrote:
> > On Sat, Apr 8, 2017 at 11:30 PM James Hare <jamesmh...@gmail.com> wrote:
> >
> >> Why, exactly, do you want a wik
andler for JSON in mediawiki). You can edit this in VE, although w/o
> any special support, and Parsoid will serialize it back to JSON.
>
> This could be turned into a very pleasant type-aware editor for JSON in VE
> pretty easily.
> --scott
>
> On Sun, Apr 9, 2017
similar activity around
XML in the MediaWiki development community.
Or, put differently, the same reason Wikibase is using JSON.
On Mon, Apr 10, 2017 at 11:06 AM Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:
> On 10.04.2017 at 06:17, Denny Vrandečić wrote:
> > Ah, good
wikitext
> without going through all the trouble of writing an extension and getting
> it deployed, you might write that code in Lua as a Scribunto module.
>
>
> On Mon, Apr 10, 2017 at 12:17 AM, Denny Vrandečić <vrande...@gmail.com>
> wrote:
>
> > On Sat, Apr 8, 2017 at
ot when the markup is provided directly by an extension. The long
> term
> >> plan is for CollaborationListContent to put out HTML, since it’s more
> >> straightforward than using a wikitext intermediary that the user does
> not
> >> see anyway.
> >>
> >>
On Sun, Apr 9, 2017 at 8:38 AM Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:
> On 09.04.2017 at 08:23, Denny Vrandečić wrote:
> > Here's my requirement:
> > - a wiki page is one JSON document
> > - when editing, the user edits the JSON directly
> >
On Sun, Apr 9, 2017 at 4:11 AM Gergo Tisza wrote:
> You probably want to subclass JsonContentHandler and add wikitext transform
>
What does it mean to add wikitext transform?
> and whatever else you need. For schemas, have a look
> at JsonSchemaContentHandler in
r the rationale in
particular.
>
> [0] https://www.mediawiki.org/wiki/Extension:CollaborationKit
>
> On April 8, 2017 at 11:23:38 PM, Denny Vrandečić (vrande...@gmail.com)
> wrote:
>
> Here's my requirement:
> - a wiki page is one JSON document
> - when editing, the use
Here's my requirement:
- a wiki page is one JSON document
- when editing, the user edits the JSON directly
- when viewing, I have a viewer that turns the JSON into wikitext, and that
wikitext gets rendered as wikitext and turned into HTML by MediaWiki
I have several options, including:
1) hook
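As a language-agnostic sketch of the viewing step described above (JSON in, wikitext out, with MediaWiki then rendering the wikitext to HTML), here is a minimal Python illustration; `json_to_wikitext` and the table layout are purely hypothetical, not a MediaWiki API:

```python
import json

def json_to_wikitext(doc_text: str) -> str:
    """Hypothetical viewer: render a JSON document as wikitext.

    MediaWiki would then parse the returned wikitext into HTML;
    this sketch only covers the JSON -> wikitext step.
    """
    doc = json.loads(doc_text)
    lines = ['{| class="wikitable"']
    for key, value in doc.items():
        lines.append("|-")
        lines.append(f"! {key}")
        lines.append(f"| {value}")
    lines.append("|}")
    return "\n".join(lines)

page = '{"name": "Q1", "label": "universe"}'
print(json_to_wikitext(page))
```

The point is only that editing happens on the raw JSON while viewing goes through an intermediate wikitext representation.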
Phabricator still has my wikimedia.de email address and tries to confirm
it's me by sending an email to that address.
Is there a way to reset the email?
(The weird thing is that I seem to receive those emails on my regular Gmail
account, but for logging in it still requires that I verify the
c features (e.g. collapsing
> > infoboxes).
> >
> >
> > -Dmitry
> >
> > On Thu, Oct 13, 2016 at 7:30 PM, Denny Vrandečić <vrande...@gmail.com>
> > wrote:
> >
> > > Two stupid questions, I couldn't find the answer on MediaWiki.org:
> &
Two stupid questions, I couldn't find the answer on MediaWiki.org:
1) is MediaWiki:common.js being loaded on the mobile web view? Or any other
js? What about User:Name/common.js?
2) same question but for the Wikipedia app. Does the app run any js?
Although I haven't touched MediaWiki code for a year or so, based on my
experience with large codebases with tons of contributors, I would be very
much PRO.
I understand it is a pain, but as Legoktm points out, it is a manageable
pain. Having a consistent and higher-quality code base is worth the
for insource:previous doesn't give me anything]. Most of the
> time, MediaWiki would convert that to a url with index.php in it.
>
> --bawolff
>
> On 11/16/15, Denny Vrandečić <vrande...@gmail.com> wrote:
> > An example here:
> >
> >
> https://he.wikipedia.org/wiki
Oh, my use case is simple: I just want to run my own instance of MediaWiki
for my personal wiki. I need to move it from my current provider.
I thought App Engine would be kinda simple and cheap to run MediaWiki on,
but it looks like it has a few rough edges, and I am wondering whether it
is worth
Did anyone manage to get MediaWiki running on Google App Engine? I am a bit
dense, it seems, and would appreciate a few pointers.
Cheers,
Denny
It's really in the tradeoffs, as others have mentioned.
It is obvious that we would love to get rid of tofu at all times. But what
is the effect on readership? As far as I know, faster delivery of a website
increases readership, and vice versa. So if there was any
way to estimate
CC-BY-SA/GFDL is in the same spirit as GPL, which is an obviously
acceptable license for WMF (since MediaWiki is licensed under it).
Can you explain the need to relicense it under MIT?
2013/9/26 Steven Walling swall...@wikimedia.org
Forwarding, with permission.
For background the AFC Helper
Tim, thanks, I found this a very interesting aspect that I have not
considered before.
2013/8/28 Tim Starling tstarl...@wikimedia.org
On 27/08/13 03:12, C. Scott Ananian wrote:
Stated more precisely: a non-GPL-compatible license for an extension
means
that the extension can never be
Hi all,
we really would like to deploy the URL datatype to Wikidata with the next
deployment, i.e. next week. But it would make a lot of sense to have the
SpamBlackList extension be aware of ContentHandler for that.
There's already a +1 on the changeset, we would really appreciate someone
No, neither table would have uniqueness constraints (besides the primary
keys).
2013/7/29 Sean Pringle sprin...@wikimedia.org
On Tue, Jul 23, 2013 at 1:42 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
* EntityUsage: one table per client. It has two columns, one
2013/7/23 MZMcBride z...@mzmcbride.com
What we need is for you and Erik to recognize that you're wrong and to
make this right. Is there anyone besides you and Erik who agree with the
position you're taking here?
They are not alone. I also agree with their position, and I sincerely hope
we
2013/7/19 Antoine Musso hashar+...@free.fr
And installing yoursite would be something like:
mkdir mysite
cd mysite
composer require mediawiki/core
composer require wikibase/wikibase
# which installs data-values/data-values ask/ask as well
Just curious, why is composer require
Ah, OK, understood. Thanks.
2013/7/22 Jeroen De Dauw jeroended...@gmail.com
Hey,
wikibase/wikibase does not list mediawiki/core as a dependency :)
Indeed. Right now this just allows you to install Wikibase into an existing
MW install. Before we can go all the way, we first need to be
Hi,
please, everyone calm down and honestly try harder to assume good faith and
respect the capabilities of each other. Respect includes avoiding
terminology like "to bitch", "to sneak in", or "stupid" when describing each
other's actions.
One good way is when you are angry about an email, step back, wait a
Hi,
sorry for another long Email today.
Currently, when you change a Wikidata item, its associated Wikipedia
articles get told to update, too. So your change to the IMDB ID of a movie
in Wikidata will be pushed to all language versions of that article on
Wikipedia. Yay!
There are two use cases
Small correction.
2013/7/22 Denny Vrandečić denny.vrande...@wikimedia.de
* Subscriptions: one table on the client. It has two columns, one with the
pageId and one with the siteId, indexed on both columns (and one column
with a pk, I guess, for OSC).
That's entityId - siteId, not pageId
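The corrected layout (entityId and siteId, both indexed, plus a surrogate primary key for online schema changes) can be sketched with SQLite for illustration; the table and column names here are assumptions, not the deployed Wikibase schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subscriptions (
    sub_id   INTEGER PRIMARY KEY,  -- surrogate pk, e.g. for online schema changes
    entityId TEXT NOT NULL,        -- e.g. 'Q42'
    siteId   TEXT NOT NULL         -- e.g. 'enwiki'
);
-- indexed on both columns, no uniqueness constraints besides the pk
CREATE INDEX idx_sub_entity ON subscriptions (entityId);
CREATE INDEX idx_sub_site   ON subscriptions (siteId);
""")
conn.execute("INSERT INTO subscriptions (entityId, siteId) VALUES (?, ?)",
             ("Q42", "enwiki"))
rows = conn.execute(
    "SELECT siteId FROM subscriptions WHERE entityId = ?", ("Q42",)
).fetchall()
print(rows)
```

Note that, as stated earlier in the thread, neither column is unique on its own; only the (entityId, siteId) pair is meaningful.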
Another correction, same line. Gosh, it's hot here. Brain not working. Me
off home.
2013/7/22 Denny Vrandečić denny.vrande...@wikimedia.de
2013/7/22 Denny Vrandečić denny.vrande...@wikimedia.de
* Subscriptions: one table on the client. It has two columns, one with
the pageId and one
2013/7/22 Antoine Musso hashar+...@free.fr
Given we migrated our community from
subversion to git, I am confident enough that using composer will be
very easy to the community.
:D
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
2013/7/22 Ryan Lane rlan...@gmail.com
For pure PHP libraries, they could be distributed like pure PHP libraries
usually are. They can be packaged for multiple distros and be available via
apt/yum/composer (or pear). Having them as MediaWiki extensions is somewhat
awkward.
Yes, agree on
2013/7/22 Tyler Romeo tylerro...@gmail.com
Architectural integrity of code is a design-level issue. Continuous
integration is a programming and quality assurance-level issue. They have
nothing to do with each other, and you can maintain architectural integrity
just fine without having to
Tomorrow at 2pm Berlin time, the Wikidata team will host a public hangout
on Travis. This is mostly meant to inform ourselves, but it might be a good
resource for others as well.
We might be late, as this is the first time we are doing such a thing, so
bring a bit of patience.
We will try to record
Awesome, that looks already pretty promising!
I am not completely sure I understand a few things:
1074167410
106
107215627
156
what do the two properties without a value mean here?
I would have expected:
1074167410
107215627
and now ask for suggested values for 31,
or for
Hey,
in order to sanity check the code we have written in the Wikidata project,
we have asked an external company to review our code and discuss it with
the team. The effort was very instructional for us.
We want to share the results with you. The report looks dauntingly big, but
this is mostly
I agree with most of the use cases, and I think they will be possible with
Wikidata.
My suggestion would be to wait for this year, and then see which of the use
cases are still open: I think that by the end of the year we should have
made all of them possible (besides the Searches might be more than
2013/5/9 Brian Wolff bawo...@gmail.com
From what I hear wikidata phase 3 is going to basically be support for
inline queries. Details are vague but if they support the typical types of
queries you associate with semantic networks - there is category
intersection right there.
Right.
If
That's awesome!
Two things:
* how set are you on a Java-based solution? We would prefer PHP in order to
make it more likely to be deployed.
* could you provide a link to a running demo?
Cheers,
Denny
2013/5/13 Nilesh Chakraborty nil...@nileshc.com
Hi everyone,
I'm working on a prototype
I am completely amazed by a particularly brilliant way that Wikipedia uses
Wikidata. Instead of simply displaying the data from Wikidata and removing
the local data, a template and workflow are proposed, which...
* grabs the relevant data from Wikidata
* compares it with the data given locally in
Just to add, ContentHandler is deployed on all Wikimedia projects.
2013/4/24 Bartosz Dziewoński matma@gmail.com
I think ContentHandler already theoretically has the ability to store
per-page language info, it's just not being used. (And of course it'd
have to be actually deployed
That looks like a cool idea.
I am trying to experiment with it on a few pages, and it seems to considerably
reduce the number of web requests (for
https://en.wikipedia.org/wiki/Vincent_van_Gogh it goes from 120 to under 40
requests).
But the pages get quite bigger, obviously. Also, it introduces a
You can get the data from here:
http://dumps.wikimedia.org/wikidatawiki/20130417/
All items with all properties and their values are inside the dump. The
questions would be, based on this data, could we make suggestions for:
* when I create a new statement, suggest a property. then suggest a
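One simple way to back such suggestions with the dump data is property co-occurrence counting; here is a toy Python sketch (the item data is made up, and a real suggester would use better statistics than raw counts):

```python
from collections import Counter
from itertools import combinations

# Toy stand-in for the property sets extracted per item from the dump.
items = [
    {"P31", "P21", "P569"},
    {"P31", "P21", "P19"},
    {"P31", "P569"},
]

# Count how often two properties appear on the same item.
cooc = Counter()
for props in items:
    for a, b in combinations(sorted(props), 2):
        cooc[(a, b)] += 1
        cooc[(b, a)] += 1

def suggest(existing, top=3):
    """Rank properties that most often co-occur with the ones already set."""
    scores = Counter()
    for (a, b), n in cooc.items():
        if a in existing and b not in existing:
            scores[b] += n
    return [p for p, _ in scores.most_common(top)]

print(suggest({"P31"}))
```

The same counting idea extends to suggesting values: instead of property pairs, count (property, value) pairs conditioned on the properties an item already has.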
the exact technology to use, i.e. whether
it would use the MobileFrontend or not, etc. We would help with setting it
up on labs.
Cheers,
Denny
On Tue, Apr 9, 2013 at 6:49 PM, Quim Gil q...@wikimedia.org wrote:
On 04/09/2013 02:39 AM, Denny Vrandečić wrote:
I would hope
It would also
One problem is that the necessity to link to a working test environment
makes it impossible to develop ideas with the community before they are
implemented.
2013/4/9 Amir E. Aharoni amir.ahar...@mail.huji.ac.il
2013/4/9 Steven Walling steven.wall...@gmail.com:
One system that I find a lot of
Those are very good points, both of them. Thanks.
2013/4/9 Matthew Flaschen mflasc...@wikimedia.org
On 04/09/2013 12:18 PM, Denny Vrandečić wrote:
I thought that in order to discuss these design decisions with the
community before hand, telling them on their respective village pump
Thank you, also for the explanation. I am glad we could defuse this.
On Apr 10, 2013 7:07 AM, Risker risker...@gmail.com wrote:
On 9 April 2013 12:15, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
Risker,
I find myself unconvinced by your argumentation as I perceive
2013/4/10 Risker risker...@gmail.com
On 9 April 2013 12:15, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
Risker,
I find myself unconvinced by your argumentation as I perceive it as
inconsistent.
On the one hand, you suggest that before we enable the option to access
data
I would hope that the widgets we have developed for the Wikidata desktop UI
- especially the Entity Selector widget - would be reusable on mobile.
It would also be extremely good to look in what the mobile team is doing
for Wikipedia. Since both are MediaWiki in the backend anyway, I would
assume
minor comments inline in your mail below.
2013/4/8 Risker risker...@gmail.com
On 6 April 2013 17:27, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
Or, put differently, the Wikidata proposal was published nearly two
years ago. We have communicated on all channels for more than
Technical changes on the Wikimedia projects can be hairy. We are currently
having a discussion about the Wikidata deployment to the Wikipedias, and
there have been many examples in the past of deployments that raised
discussions.
One of my statements in this discussion is that the a priori
I fully agree with Robert and Phoebe in this matter. Wikidata is an option.
Requiring first to come up with rules on how to use Wikidata before it is
switched on simply won't work, because there is not sufficient interest and
experience for this discussion.
Or, put differently, the Wikidata
end well.
I am not even sure it is much more complicated. But I am very worried it is
too different.
Cheers,
Denny
Petr Onderka
[[en:User:Svick]]
2013/3/28 Denny Vrandečić denny.vrande...@wikimedia.de:
We have a first write up of how we plan to support queries in Wikidata.
Comments
We have a first write up of how we plan to support queries in Wikidata.
Comments on our errors and requests for clarifications are more than
welcome.
https://meta.wikimedia.org/wiki/Wikidata/Development/Queries
Cheers,
Denny
P.S.: unfortunately, no easter eggs inside.
--
Project director
I would strongly support not lending support to the belief that everything
under the sun is copyrightable. We should, in my opinion, take the position
that trivial things like these are not copyrightable and should put a CC0
on it. We should not set an example and establish a practice that single
Hey,
as you remembered, we were asking about EasyRDF in order to use it in
Wikidata.
We have now cut off the pieces that we do not need, in order to simplify
the review. Most of the interesting parts of EasyRDF regarding security
issues -- parsing, serving, etc. -- have been removed.
Our code is
I certainly don't want to cookie-lick this usecase for Wikidata / Wikibase,
but I think that using the Wikibase extension for this might be easier than
it looks. The one major point would be a bit of coding to allow claims for
the media namespace. But indeed, right now we do not have the bandwidth
As you probably know, the search in Wikidata sucks big time.
Until we have created a proper Solr-based search and deployed on that
infrastructure, we would like to implement and set up a reasonable stopgap
solution.
The simplest and most obvious signal for sorting the items would be to
1) make a
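As an illustration only, a crude popularity signal such as the number of sitelinks could be used to rank label matches; everything in this sketch (the data, the signal, and the names) is an assumption, not the actual stopgap that was deployed:

```python
# Toy stopgap ranking: order prefix matches on labels by a popularity proxy.
items = {
    "Q64":  {"label": "Berlin", "sitelinks": 250},
    "Q614": {"label": "Berlin, New Hampshire", "sitelinks": 20},
    "Q821": {"label": "Berliner", "sitelinks": 5},
}

def search(prefix):
    """Return item ids whose label starts with the prefix, most popular first."""
    hits = [qid for qid, it in items.items()
            if it["label"].lower().startswith(prefix.lower())]
    return sorted(hits, key=lambda q: items[q]["sitelinks"], reverse=True)

print(search("berlin"))
```

The appeal of such a signal is that it is already available in the repo and needs no external search infrastructure while a proper Solr-based search is built.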
After evaluating different options, we want to use for generating
Wikidata's RDF export the EasyRDF library: http://www.easyrdf.org/
We only need a part of it -- whatever deals with serializers. We do not
need parsers, anything to do with SPARQL, etc.
In order to minimize reviewing and potential
2013/2/19 Tim Starling tstarl...@wikimedia.org
On 19/02/13 21:11, MZMcBride wrote:
Has any thought been given to what to do about this? Will it require
manually paginating the data over collections of wiki pages? Will this be
something to use Wikidata for?
Ultimately, I would like it to
For later.
As discussed before, access via HTTP is probably hardly an option for the
Wikimedia wikis (and they are our priority), but for other wikis that will
be crucial.
Cheers,
Denny
2013/2/18 Yuri Astrakhan yuriastrak...@gmail.com
How useful would it be for Lua to access to the
There will be (actually, there is already) a web API offering the kind of
data required, and for client wikis not running on WMF infrastructure this
will eventually be the way to access the data.
For WMF clients, like the Wikipedias, our decision was not to use HTTP web
requests, but to
Thank you for the Roadmap, Gabriel! It is some exciting and interesting
stuff inside.
I am really happy that the roadmap would allow us this year to highly
optimize Wikidata-related changes on the Wikipedias, i.e. we would not need
to reparse the whole page when some data in Wikidata changes, and
(not sure if this is the right list)
There was the question raised about adding Wikidata pagecount data to the
dumps. As far as I can tell Wikivoyage is already inside the data (although
not described on the description page), but I could not find Wikidata
(maybe I am just missing the right
I am mildly surprised by Jenkins giving a +1 Code Review (not Verification)
here.
https://gerrit.wikimedia.org/r/#/c/33505/
Just wondering if there is maybe a setup issue.
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219
Ah, thanks, that explains it.
2013/1/9 Brad Jorsch bjor...@wikimedia.org
2013/1/9 Denny Vrandečić denny.vrande...@wikimedia.de:
I am mildly surprised by Jenkins giving a +1 Code Review (not
Verification)
here.
https://gerrit.wikimedia.org/r/#/c/33505/
Just wondering
Just curious -- I probably missed it in a previous mail -- why are the
tests switched off?
To preserve processing power?
To speed up tests for the whitelisted?
Security concerns when running tests with arbitrary code?
Other?
Cheers,
Denny
2012/12/19 Antoine Musso hashar+...@free.fr
Hello,
This looks very good for us. Just one thing: Phase 4 of 1.21wmf7 should
probably be Monday, January 14, not January 11.
Cheers,
Denny
2012/12/17 Rob Lanphier ro...@wikimedia.org
Hi everyone,
Because a number of people are planning to take time off for the
holidays, I'd like to postpone the
I get a "Too many redirects" error when trying to open this attachment
https://bug-attachment.wikimedia.org/attachment.cgi?id=11489
from this bug
https://bugzilla.wikimedia.org/show_bug.cgi?id=42955
Does anyone have an idea?
Cheers,
Denny
P.S.: I just saw that Daniel posted a bug on this
Hi all,
we have here a number of bugfixes and testfixes and other stuff where
some reviewing input would be very appreciated.
== Bugfixes ==
* There was some vanishing content
https://bugzilla.wikimedia.org/show_bug.cgi?id=41352 , and a
temporary fix for it. Here is a proper fix for the problem,
that this is the preferred solution and that this is better than the
solution we suggested.
Anyone having any comments, questions, or insights?
Cheers,
Denny
2012/11/5 Tim Starling tstarl...@wikimedia.org:
On 02/11/12 22:35, Denny Vrandečić wrote:
* For re-rendering the page, the wiki needs access
Hi all,
Wikidata is planned as a multilingual resource, and we are using ULS
for switching languages. ULS is pretty cool, if you have not tried it
out yet, you definitely should.
ULS works great if you are a logged in user.
ULS on Wikidata does not work so well right now if you are not logged
Hi all,
Wikidata aims to centralize structured data from the Wikipedias in
one central wiki, starting with the language links. The main technical
challenge that we will face is to implement the data flow on the WMF
infrastructure efficiently. We invite peer-review on our design.
I am trying to
For now, we have no plans for Wikidata to create articles. This would,
in my opinion, meddle too much with the autonomy of the Wikipedia
language projects.
What will be possible is to facilitate the creation of such bots, as
some data that might be used for the article might be taken from and
Dear all,
let me say it like this:
WOH!!!
All of our patchsets have been merged into core. You are awesome!
Thank you so much!
We won't leave you without new work, though. Three points:
First, the merge of the ContentHandler branch is, now that it is being
deployed, revealing issues in
That is great to hear. Thanks for tying us together, Asher.
For Wikidata, we have not uploaded our Solr extension yet (mostly
because we are waiting for the repository to be set up), but we will
then upload it soon once it is there. I would be especially interested
in sharing schema and config
Hi all,
here is our weekly report on changesets to core relevant for Wikidata
development.
* Great news! The Wikidata branch got merged, possibly the biggest
single changeset to MediaWiki. Thanks to everyone for their input, I
am afraid to try to list them all because I would fail. Special
Hi all,
here's a status update of the reviews on the Wikidata related changes to core:
* ContentHandler: a lot has happened here in the last few days. We
created a faux commit that squashed the changes in order to gather
comments, and have responded to most of them, implementing the
suggested
Hi,
I have tried to create one changeset with the whole content handler
for your convenience:
https://gerrit.wikimedia.org/r/#/c/25736/
Note that is based on 7ccc77a and needs to get rebased before merge,
but the major parts should be there.
Since I am not a Git expert, I may have made it all
Hi all,
here's our weekly mail on core related stuff for Wikidata. Thanks to
everyone giving feedback and reviews, most notably Chris, Chad, Tim,
Matmarex, DJ, and Krinkle!
* the ContentHandler branch is being reviewed to land in Core next
week, as Rob said. There is a separate thread on that.
2012/9/26 Tim Starling tstarl...@wikimedia.org:
On 26/09/12 03:54, Tyler Romeo wrote:
This looks pretty interesting. Is there a reason we don't just put this in
the core?
It has about 50 lines of useful code wrapped in 1600 lines of
abstraction. I don't think it is the sort of style we want
2012/9/21 Strainu strain...@gmail.com:
Well, you said something about Wikidata. But even if the client Wiki
would not need to load the full census, can it be avoided on Wikidata?
Talking about the template that Tim listed:
#Technical_requirements_and_rationales_3).
Cheers,
Denny
2012/9/21 Denny Vrandečić denny.vrande...@wikimedia.de:
2012/9/21 Strainu strain...@gmail.com:
Well, you said something about Wikidata. But even if the client Wiki
would not need to load the full census, can it be avoided on Wikidata?
Talking about the template
currently a sub-part of the sites management topic. I would
suggest that this gets started in its own RFC, what do you think?
Cheers,
Denny
2012/9/21 Daniel Friesen dan...@nadir-seen-fire.com:
On Thu, 20 Sep 2012 05:54:02 -0700, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
2012/9/20
:
On Wed, 19 Sep 2012 12:53:54 -0700, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hi all,
here's our weekly list of Wikidata review items. Due to the hands-on
meeting last week we refrained from sending it earlier. Now that most
should be back home, I wanted to give an overview