For now, Wikidata does not plan to cover scraping data automatically
from the Web, but only to provide a place where such data can
be edited, stored, and re-published, including references. My assumption is
that the community might create bots to perform such scraping, and maybe
I guess the closest are these documents:
http://meta.wikimedia.org/wiki/Wikidata/Introduction
http://meta.wikimedia.org/wiki/Wikidata/Technical_proposal
and the articles linked here:
http://meta.wikimedia.org/wiki/Wikidata/Notes
which are still under development. But you have found these
How much more formal can it get? :)
Feel free to spread and forward.
2012/3/30 John Erling Blad jeb...@gmail.com
Can the link be forwarded now or should I await some formal announcement?
John
On 30 March 2012 at 10:49, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
Hi all,
Wikimedia
Since some claim that Wikidata's model of representing statements with
their references and the diversity of knowledge in the world outside is
just a way to bring Wikipedia's values to the world of data, Conservapedia
has launched an alternative approach to the challenges we are tackling. In
good
Yep, so far we have always been discussing this and are strongly leaning
towards CC-0 for the data content, and CC-BY-SA for the textual content of
Wikidata, i.e. the project pages, discussion pages, etc.
We are fully aware that most CC licenses are not adequate for data.
Cheers,
Denny
2012/4/2
The label and the description together are meant to be identifying.
E.g. Georgia - a country in the Caucasus, or Frankfurt - a city in
Hesse, Germany, etc.
Additionally, the Wikipedia links provide quite some guidance to it.
Cheers,
Denny
2012/4/5 Gregor Hagedorn g.m.haged...@gmail.com
Dear Martynas,
if you try to model the following statement in RDF
The population density of France, as of a 2012 estimate, is 116 per
square kilometer, according to the Bilan demographique 2010.
you might notice that RDF requires a reification of the statement. The data
model that you have
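For illustration, a minimal sketch of that reification using Python's rdflib. The example.org namespace and the property names (populationDensity, asOf, statedIn) are invented for the example; this is plain rdf:Statement reification, not Wikidata's eventual export format.

from rdflib import Graph, Namespace, Literal, BNode
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/")  # invented namespace, purely illustrative
g = Graph()
g.bind("ex", EX)

# Reify the triple (France, populationDensity, 116) so the estimate date
# and the reference can be attached to the statement itself.
stmt = BNode()
g.add((stmt, RDF.type, RDF.Statement))
g.add((stmt, RDF.subject, EX.France))
g.add((stmt, RDF.predicate, EX.populationDensity))
g.add((stmt, RDF.object, Literal(116, datatype=XSD.integer)))
g.add((stmt, EX.asOf, Literal("2012", datatype=XSD.gYear)))
g.add((stmt, EX.statedIn, EX.BilanDemographique2010))

print(g.serialize(format="turtle"))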
Hi John,
no, you have seen correctly that there is no separation between classes and
instances. If this brings our model closer to topic maps, then this is
convenient.
I have to admit that my knowledge of topic maps is quite limited. As far as
I understand it, they are an ISO standard and can be
John,
your suggestion has two requirements that I think are hard to achieve:
* first, we need an agreement on the set of (non-overlapping but complete)
types that exist in the world
* second, we would need to assume that the Wikidata editors would agree on
one and exactly one type for every
to be more
ambiguity.
Just my two cents.
- Brent
Brent Hecht
Ph.D. Candidate in Computer Science
CollabLab: The Collaborative Technology Laboratory
Northwestern University
w: http://www.brenthecht.com
e: br...@u.northwestern.edu
On Apr 5, 2012, at 4:50 PM, Denny Vrandečić wrote
John,
thanks! I fully agree.
And this is indeed pretty much what we have in our data model.*
I think that we really need to get our draft mapping to RDF done, in order
to show that we align pretty much with this suggestion.
Cheers,
Denny
* well, we also add density, but I think that is merely
2012/5/9 Lydia Pintscher lydia.pintsc...@wikimedia.de
On Wed, May 9, 2012 at 10:52 AM, Chris Tophe kipmas...@gmail.com
wrote: Concerning collaboration between the two projects,
I am not sure why the new-Wikidata is starting from scratch, and not from
the
old-Wikidata, but there are
I was silent on this thread mostly due to the following two points:
1. as mentioned several times, a standard for us to be considered must be
free. Free as in: everyone can get it without having to pay or register for
it, I can give it to anyone legally without any restrictions, and it is free of
patents.
We are re-working the workflow for this use case. We are aiming to post
something this week, as the old workflow will not be used.
https://meta.wikimedia.org/wiki/Wikidata/Development/Storyboard_for_linking_Wikipedia_articles
- the old one!
Thanks for the question, this is indeed one of the
Hi JMC,
thank you for the explanations, I understand much better now.
Re 1, I regard it as pretty obvious. But here's the relevant sentence taken
from the Wikimedia values: We believe that this mission requires thriving
open formats and open standards on the web to allow the creation of content
We are working on the RDF export draft. We will not extend RDF in any
way, but use it to express the model in its entirety. We will post to
this list once this is drafted.
Cheers,
Denny
2012/6/22 Paul A. Houle p...@ontology2.com:
On 6/20/2012 6:39 AM, Lydia Pintscher wrote:
Heya folks :)
Hi all,
the first phase of Wikidata will help to centralize many of the
Wikipedia language links. We did a small analysis to figure out the
possible impact of this step.
Here are a few highlights:
* there are more than 240 million language links in the Wikipedias
* they are responsible for about
Snaevar,
thanks, that was really helpful!
Cheers,
Denny
2012/6/25 Snaevar snaevar-w...@gmx.com:
Language links are commented out to stop bots from consistently adding the
same link, while also preventing it from appearing under In other languages in
the sidebar.
This is essentially an improper
be useful for you.
Cheers,
Christopher
On Mon, Jun 25, 2012 at 5:29 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
I'll maybe... I shouldn't... other stuff to do... gnah...
Let's see. I may well do a new run in the next few days...
(you do realize that some of them wikis
Hello Michael,
thank you for your input, this is extremely valuable.
In general I expect that Wikidata will serve your needs better than an
extraction from Wikipedia could. First, yes, we will have more stable
identifiers. Second, it should be better at identifying items of
interest. Some of the
Yes, we are planning to do both in parallel, as this page explains:
https://meta.wikimedia.org/wiki/Wikidata/Notes/URI_scheme
Cheers,
Denny
2012/7/5 Gregor Hagedorn g.m.haged...@gmail.com:
I don't mean to spin this out into a tangent about Drupal.
Me neither, my discussion point here is:
Hi Sumana,
yes, we should have more data about it by tomorrow. Some logistic
details are still being decided.
Cheers,
Denny
2012/7/10 Sumana Harihareswara suma...@wikimedia.org:
From your last weekly digest:
we will have a talk at the New York Times, open to the public, on Thursday,
July
Hi Michael,
answers inline.
2012/7/26 Michael Smethurst michael.smethu...@bbc.co.uk:
Very delayed reply but think I'm still confused on this. Made a picture to
clear my mind but not sure it works:
http://smethur.st/wikidata
The bit I think I get:
If I request
Just done.
2012/8/6 Yury Katkov katkov.ju...@gmail.com:
Hi Denny!
This news is probably worth spreading to the Semantic Web and LOD mailing
lists.
-
Yury Katkov
On Mon, Aug 6, 2012 at 1:42 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hello all,
we finally have
2012/8/5 Scott MacLeod worlduniversityandsch...@gmail.com:
What's the deployment plan, for languages 10 through 285 (number of
Wikipedia languages) through 7,358 (per The Ethnologue), particularly in
terms of developing community email lists (graduate students?)
Beyond the first few Wikipedias
And I forgot to add: many, many thanks for organizing and holding the
discussion on the it.wp! This is extremely appreciated. Once we go to
Phase II testing we certainly will keep that in mind!
You people are awesome!
Cheers,
Denny
2012/8/6 Denny Vrandečić denny.vrande...@wikimedia.de:
Ciao
2012/8/13 Lydia Pintscher lydia.pintsc...@wikimedia.de:
On Mon, Aug 13, 2012 at 5:58 AM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
2012/8/13 Snaevar snaevar-w...@gmx.com:
Is this http://meta.wikimedia.org/wiki/User:MerlIwBot/WikiData what you are
looking for?
Well, probably.
Hi Amir,
thanks for the bug report! We will go and implement it along the lines
you suggested.
I have one question still, maybe you can help me:
About 10% of the titles in the Hebrew Wikipedia are in the Latin alphabet
(very rough estimate, may be completely off, based on a glance at
Hi Tgr,
thank you for the input!
2012/8/12 Tgr gti...@gmail.com:
These mockups seem focused on displaying data, with very little focus on
search/organization (there is only a reliability filter in the mockups and a
text search in some of them). If Wikidata is intended only as an infobox
it.
--
Amir
2012/8/13 Denny Vrandečić denny.vrande...@wikimedia.de:
Hi Amir,
thanks for the bug report! We will go and implement it along the lines
you suggested.
I have one question still, maybe you can help me:
About 10% of the titles in the Hebrew Wikipedia are in the Latin alphabet
(very rough
It is not. Wikidata would know if an article on Eva exists in the
Wikipedia of a given language (if it is appropriately connected), and
thus would be able to automatically offer the link or not.
Cheers,
Denny
2012/8/14 dr.cueppers.g...@arcor.de:
In cases of changing an old name by a new name
Let's see. Joancreaus on IRC I think mentioned that he was interested
in pywikipedia + wikidata.
I hope that there are enough people interested in that so that the
pywp community won't have to wait too long to use that beautiful
framework.
Cheers,
Denny
2012/8/14 Snaevar snaevar-w...@gmx.com:
The editors would need to make the connections between the articles
explicitly. The software would not be guessing them, if that is what you
are assuming. And the editors should be good at getting the correct
one.
Together with Marco's answer, does it make it clearer?
2012/8/14
The idea is that the mayor would not be represented as a String value,
even for smaller cities, but always by an item. This would possibly
lead to items that have no Wikipedia articles associated with them, but
there is no problem with that.
But humans (and other entities) should not be represented
Hi Gregor,
I mostly disagree with you on the question of using strings or items
for persons, but incidentally, that does not matter. I agree with you
fully on the following point:
2012/8/14 Gregor Hagedorn g.m.haged...@gmail.com:
I would prefer if the decision whether entity-identity is known
Theoretically multitext could be replaced, but I would not like to do
that. A property like Tagline for a movie or motto for a country
might make sense as a multitext. Yes, you could make the tagline of
a movie an item -- but do we really want to require it to be an
intermediary item? The
2012/8/17 Gregor Hagedorn g.m.haged...@gmail.com:
Monotext is irreplaceable, though, and it means a simple string
without a language designation. Something like Chemical symbol, I
guess, would be a monotext, or an ISO 3166 code. An intermediary item
could not do the job in that case.
I think
The idea of installing the extensions that are running on the
Wikipedias on the client is to see if we encounter any weird
interactions that shouldn't be there, i.e. it is some mild form of
integration testing. Thus installing Oversight makes sense, as it runs
on the Wikipedias.
Cheers,
Denny
Because we ARE using standards like RDF or OWL (or HTML or URIs) which
are W3C and IETF standards, and which in turn have a well documented
policy regarding patents and copyrights, see e.g.
http://www.w3.org/Consortium/Patent-Policy-20040205/ for W3C
standards.
I hope that answers that question.
Nadja,
John's question was:
So, the consequent question I asked then was, if you're not going to
use any (ISO or national) standard then how can you assure the WP
community that Wikidata is not violating someone's copyright(s)?
My answer to that question was that we are using standards. And
Nadja wrote:
Hello Denny Vrandecic
I hope you have a lawyer who checks this; on a first one-minute glance at
this page it doesn't
look obvious to me that a collision (like when creating a
classification scheme as described in
my previous email and as John asked) is excluded, it looks more as the
2012/9/5 Nadja Kutz na...@daytar.de:
Is it planned by the Wikidata team that someone phones these people in
Geneva and asks whether Wikidata could at least base its ontology (here I
mean in particular the overall classification scheme, like a hammer is a
tool and so on) on
the ISO Standard
2012/9/5 Marco Fleckinger marco.fleckin...@gmail.com:
Hi,
On 05.09.2012 10:39, John Erling Blad wrote:
Please note that this is a breaking change for bots!
It has been decided that the module wbsetitem will change from the present
short form in the JSON structure to a long form. Exactly how the
2012/9/17 Jeroen De Dauw jeroended...@gmail.com:
if two items have the same description, can one of them use an alias that
is the title of the other?
Good question. Right now this is not enforced. Then again, right now aliases
are not used anywhere for lookups except in the fulltext search
Re: keys for properties
For now the following solution seems to be the simplest:
* make labels for properties be unique for a given language
In that case they can be used as keys. Every wiki has one (and exactly
one) site language. If the label is unique, a property can be
addressed by
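A toy sketch of how a client wiki could then resolve properties by label; the registry, the helper, and the property IDs shown here are invented for illustration, not real assignments.

# Hypothetical per-language registry; labels are assumed unique per language.
properties_en = {
    "capital": "p1",
    "population": "p2",
}

def property_id(label, registry=properties_en):
    # With unique labels, the label itself works as the key.
    return registry[label]

print(property_id("capital"))  # -> "p1"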
That sounds very good to me.
+1 (or am I allowed to say +2 ? ;)
2012/9/16 Jeroen De Dauw jeroended...@gmail.com:
Hey,
There is some disagreement regarding the interface to access statements in
various entities.
== Current implementation ==
This is not fully implemented yet, but it's what
2012/10/12 Lydia Pintscher lydia.pintsc...@wikimedia.de:
On Fri, Oct 12, 2012 at 4:45 PM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
5. Somebody complained that it's too easy to remove a link from a repo
- clicking the remove link is enough. I mentioned it in a bug
report:
Wikidata properties will be translated like items, i.e. they will have
wiki pages on Wikidata with labels (actually, you can see that already
on our demo version).
Properties are not specific to a Wikibase-Instance, which is why it
would not make sense to translate them using TranslateWiki.
Not contradictory, merely confusing :)
Whether you should, is the community's decision. I was merely mentioning
that I would be happy if there was a bit to do for humans still, because
they write better bug reports.
On the other hand, considering we already have over 100k items, it is
probably
Yes, there is something wrong with the divs. Need to investigate.
2012/11/27 Lukas Benedix bene...@zedat.fu-berlin.de
Am 27.11.2012 12:53, schrieb Anja Jentzsch:
Hi,
On 27/11/2012 10:59 Marco Fleckinger wrote:
Btw: I have one note on the HTML in the Repository [2]:
* there is a
As said, this is correct with regard to deployment. We will not, though,
halt developing phase 2 until phase 1 is fully deployed. So development of
phase 1 is finished, its deployment is not yet. And we cannot promise or
predict when deployment of phase 1 will be completely finished.
Development
Can you give me a reference for the statement that CC-BY-SA and ODbL are
compatible? I understand that they have certain similarities, but I want to
understand if I can take ODbL content and relicense it under CC-BY-SA and the
other way around.
Or else, what do you mean by compatible?
2012/11/28
Ooghe-Tabanou from Regards Citoyens (France)
On Wed, Nov 28, 2012 at 5:23 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Can you give me a reference for the statement that CC-BY-SA and ODbL are
compatible? I understand that they have certain similarities, but I want
to
understand if I
There is some confusion in this thread.
A single data item is not licensable. The fact that Berlin is the capital
of Germany, or that the population of the Seychelles is 84,000 is not
licensable. Let us hope that this will never change.
Copyright covers a specific expression, e.g. a concrete
From http://www.wikidata.org/wiki/Wikidata:Project_chat#Deployment_schedule
:
If everything goes well, we will deploy a new version of the Wikibase
extension today to Wikidata, probably around 7-9pm UTC (i.e. in the next
few hours). Things might go wrong, deployment might stall, or we might need
Hopefully today.
2012/12/2 Amir E. Aharoni amir.ahar...@mail.huji.ac.il
Hello,
There are a few RTL bugs that were already fixed in the Wikibase code a
while ago, but don't seem to be deployed yet. In particular, these two are
quite disruptive:
*
Due to a DB schema update issue we did not deploy today. The wiki was
read-only for a short while, as we started deployment, but we rolled it
back.
Nevertheless big thanks to everyone involved. We will reassess the schedule.
Cheers,
Denny
2012/12/3 Denny Vrandečić denny.vrande...@wikimedia.de
We have talked informally to some lawyers, some of them specialists in the
relevant fields. Basically the understanding I was previously describing
was confirmed, but more importantly there has been only a small number of
cases testing out the implementation of the European database directive
(and
, this
could be viable.
(Non-linear transformations -- most notoriously temperature -- will get their
own implementation anyway)
Opinions?
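A two-line sketch of the distinction being drawn here: conversions that are a simple factor compose multiplicatively, while temperature also needs an offset, which is why it would get its own handling. The functions are invented for illustration.

def km_to_miles(km):
    # Factor-only conversion: value * factor.
    return km * 0.621371

def celsius_to_fahrenheit(c):
    # Not a bare factor: needs an offset as well (the "non-linear" case in the thread's sense).
    return c * 9 / 5 + 32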
2012/12/17 Denny Vrandečić denny.vrande...@wikimedia.de
As Phase 2 is progressing, we have to decide on how to represent data
values.
I have created a draft
Thank you for your comments, Marco.
2012/12/18 Marco Fleckinger marco.fleckin...@wikipedia.at
On 2012-12-18 15:29, Denny Vrandečić wrote:
* Time: right now the data model assumes that the precision is given at
the level of decade / year / month etc., which means you can enter a date
of birth
at Noon, but not the specific day.
Friedrich
On Tue, Dec 18, 2012 at 3:29 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Thanks for the input so far. Here are a few explicit questions that I
have:
* Time: right now the data model assumes that the precision is given
Martynas,
could you please let me know where RDF or any of the W3C standards covers
topics like units, uncertainty, and their conversion. I would be very much
interested in that.
Cheers,
Denny
2012/12/19 Martynas Jusevičius marty...@graphity.org
Hey wikidatians,
occasionally checking
I am still trying to catch up with the whole discussion and to distill the
results, both here and on the wiki.
In the meantime, I have tried to create a prototype of how a complex model
can still be entered in a simple fashion. A simple demo can be found here:
http://simia.net/valueparser/
The
Hi all,
wow! Thanks for all the input. I read it all through, and am trying to
digest it currently into a new draft of the data model for the discussed
data values. I will try to address some questions here. Please be kind if I
refer to the wrong person at one place or the other.
Whenever I refer to
...)
2013/1/3 Andy Mabbett a...@pigsonthewing.org.uk
On 3 January 2013 16:11, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
What it currently does not do is:
* enable dates like Date of birth: 437-436 BC
That's a complex issue, but some interesting work on a draft standard
has been done
/Ficheiro:Bahia_Municip_Salvador.svg
[2] https://scruzwiki.org/map/City_of_Santa_Cruz/_edit
2013/1/3 Denny Vrandečić denny.vrande...@wikimedia.de
Hi all,
continuing from last week's data values discussion, I would like to invite
comments on the following prototypes for understanding points
).aspx
[2] http://en.wikipedia.org/wiki/ISO_8601#Calendar_dates
On Thu, Jan 3, 2013 at 1:11 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hi all,
continuing from last week's data values discussion, I would like to invite
comments on the following prototypes for understanding
)
Sk!d
On Thu, Jan 3, 2013 at 5:11 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hi all,
continuing from last week's data values discussion, I would like to invite
comments on the following prototypes for understanding points of time and
points on Earth.
http://simia.net
year.
On Fri, Jan 4, 2013 at 12:53 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
If you enter 15. Jan 2013 it will know the difference. The goal of the
datatype is to understand a point in time within a given precision. 15th
of
January is not a point in time, it is a recurring date
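A minimal sketch of such a value: an ISO-style timestamp padded to a full date, plus an explicit precision that says how much of it is meaningful. The field names here are illustrative, not the final Wikibase format.

from dataclasses import dataclass

@dataclass
class TimeValue:
    time: str        # e.g. "+2013-01-15T00:00:00Z", padded to a full timestamp
    precision: str   # "day", "month", "year", "decade", ...

# "15 Jan 2013" would parse to day precision; a plain "2013" to year precision.
birthday = TimeValue(time="+2013-01-15T00:00:00Z", precision="day")
print(birthday)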
Should the coordinates 456,-234 be understood as 84, 126, or should they
just be an error?
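A sketch of the lenient reading of that question, normalizing each axis independently (reflecting latitude past the poles, wrapping longitude into [-180, 180)); whether to do this or to reject the input is exactly what is being asked.

def normalize_lenient(lat, lon):
    # Reflect latitude values that run past a pole.
    lat = lat % 360
    if lat > 180:
        lat -= 360
    if lat > 90:
        lat = 180 - lat
    elif lat < -90:
        lat = -180 - lat
    # Wrap longitude into [-180, 180).
    lon = (lon + 180) % 360 - 180
    return lat, lon

print(normalize_lenient(456, -234))  # -> (84, 126); a strict validator would raise instead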
2013/1/3 Andy Mabbett a...@pigsonthewing.org.uk
On 3 January 2013 16:11, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
What it currently does not do is:
* enable dates like Date of birth
2013/1/8 Gregor Hagedorn g.m.haged...@gmail.com
ON COORDINATES:
a) what you describe is more specific than a geolocation (which may be
expressed by other means than coordinates). I suggest giving the data
type the more specific name:
geocoordinates
Yep, agreed. Or just coordinates.
. but otherwise it seems like the same thing. (And they can
be transformed from one to the other using a simple factor.)
2013/1/8 Katie Filbert katie.filb...@wikimedia.de
On Tue, Jan 8, 2013 at 1:54 PM, Nikola Smolenski smole...@eunet.rswrote:
On 08/01/13 12:36, Denny Vrandečić wrote
Hi all,
We did yet another version of the inclusion syntax (admittedly, the last
one is a few months old).
We decided to very much simplify the syntax and also the capabilities of the
inclusion syntax, and to rely on Lua for anything more complex than what
will be possible with that syntax.
I am
Exactly. This is about the backend and the API. The user would rather use a
widget, maybe similar to this one:
http://localhost/~denny_WMDE/valueparser/time.html
2013/1/17 Luca Martinelli martinellil...@gmail.com
2013/1/17 Denny Vrandečić denny.vrande...@wikimedia.de:
Based on the feedback
I heard this URL might be better than the previous one on most setups:
http://simia.net/valueparser/time.html
2013/1/17 Denny Vrandečić denny.vrande...@wikimedia.de
Exactly. This is about the backend and the API. The user would rather use a
widget, maybe similar to this one:
http://localhost
2013/1/25 Daniel Kinzler daniel.kinz...@wikimedia.de
Hi!
I thought about the RDF export a bit, and I think we should break this up
into
several steps for better tracking. Here is what I think needs to be done:
Daniel,
I am replying to Wikidata-l, and adding Tpt (since he started working
No, not by design.
The design would be to have
http://en.wikidata.org/wiki/Breck
be an alias for
http://www.wikidata.org/wiki/Q123803
but we haven't yet had the time to set this up properly.
If anyone knows Apache well and has some time on their hands, please ping
me on IRC.
Right now, they
We are reluctant, but open, to renaming it. But not to Fact. Statement
has the nice ambiguous quality regarding its correctness which Fact lacks.
On the other hand, the similarity to rdf:Statement is not merely syntactic,
so I do not see too much of an issue here.
2013/2/1 Nicholas Humfrey
I assume you mean the inability to remove the second Yen.
This is... interesting. We have right now no idea what is going on.
Investigating.
Thank you for reporting,
Denny
2013/2/6 Napoleon Tan napoleon@gmail.com
I think the wikidata Item entry for Japan is corrupted. I cannot remove
It is a feature. The reason for that is that in the Wikipedias when you
access the data you can't use the property names otherwise - they have to
be unique. So in order to be able to write {{#property:capital}}, the
capital property needs to be unique (and a {{#property:p25}} we considered
to be
Sven,
thank you for your honest opinion, and I know that you are not alone with
it - but I also heard a lot of people express excitement and joy about the
deployment, and based on the activity it seems that a lot of people like
it. We consider ourselves happy to be part of an intelligent and
We are still working on a postmortem.
As of now, it seems there have been some serious memcached failures and
some interplay with another software deployment.
2013/2/13 Jan Kučera kozuc...@gmail.com
What were the issues in detail?
2013/2/12 Lydia Pintscher lydia.pintsc...@wikimedia.de
You have examples of that? Did not happen to my edits (so far).
2013/2/13 Denny Vrandečić denny.vrande...@wikimedia.de
Block them until they behave?
2013/2/13 Katie Chan k...@ktchan.info
On 13/02/2013 21:01, Lydia Pintscher wrote:
Heya :)
Third time's a charm, right? We're live
Yes, every Wikidata page is about one and exactly one entity. There cannot
be two entities on one page.
Bonny and Clyde is one entity, designating the pair of people.
Bonny and Clyde might each also be an entity, and there could be
relevant connections between the three entities Bonny,
There are currently a number of things going on re the future of Wiktionary.
There is, for example, the suggestion to adopt OmegaWiki, which could
potentially complicate a Wikibase solution in the future (but then again,
structured data is often rather easy to transform):
Hi Dario,
two or three features are still missing to enable that (sorted in order we
are probably going to deploy them):
* qualifiers
* the time datatype
* statement ranks
As soon as they are available, this can be modeled in a way that it can be
useful for projects accessing the data.
So,
That is a tough question. We are pretty sure that we technically scale
quite well, and there is no reason that the community should restrict
itself for technical reasons. If the number of items suddenly increases
by one or two orders of magnitude, we would probably hit a few hiccups on
the
We do have strong types, but only a few of them: item, commons media, string,
time, geo, URL. Government leader would not be a supported type.
The exact list and details are here:
http://meta.wikimedia.org/wiki/Wikidata/Data_model#Datatypes_and_their_Values
Cheers,
Denny
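For a rough idea of what values of these types look like, here is a sketch with shapes loosely modeled on today's Wikibase JSON; the field names and example numbers are illustrative and may not match the draft discussed here.

# Illustrative value shapes for the supported datatypes.
example_values = {
    "item": {"entity-type": "item", "numeric-id": 64},           # a pointer to another item
    "commons media": "Example.jpg",                               # a file name on Commons
    "string": "H2O",
    "time": {"time": "+2013-03-21T00:00:00Z", "precision": 11},   # 11 = day precision
    "geo": {"latitude": 52.52, "longitude": 13.41, "precision": 0.01},
    "URL": "https://www.wikidata.org/",
}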
2013/3/21 Michael
It really depends on your definitions :)
Items are strongly typed as items. Any item can have any property. And only
items can have properties. Time or geocoordinates, e.g., can not have
properties.
But yes, there is no forcing of properties onto any item, nor any
restriction of usage of every
Oh, I would ask you to please wait another week or two, for us to have
qualifiers. Maybe they can deal with some of these cases. We just got them
demoed today, and they really look neat, so I am very convinced they will
be there with the next update.
2013/3/27 Michael Hale hale.michael...@live.com
We have a first write up of how we plan to support queries in Wikidata.
Comments on our errors and requests for clarifications are more than
welcome.
https://meta.wikimedia.org/wiki/Wikidata/Development/Queries
Cheers,
Denny
P.S.: unfortunately, no easter eggs inside.
--
Project director
) can be added to
what links here.
Feel free to include the distribution list back in your reply if you see
merit in this suggestion.
Best Regards,
Alex
On Apr 2, 2013, at 9:54 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hi Janyong,
as Michael said, Wikidata does
end well.
I am not even sure it is much more complicated. But I am very worried it is
too different.
Cheers,
Denny
Petr Onderka
[[en:User:Svick]]
2013/3/28 Denny Vrandečić denny.vrande...@wikimedia.de:
We have a first write up of how we plan to support queries in Wikidata.
Comments
This is in my opinion an upstream issue for MediaWiki proper. I do not
think that templates and images from Commons are that different. Take this
image for example:
https://en.wikipedia.org/wiki/File:Treaty_of_Accession_2011_Ratification_Map.svg
It always reflects the current state of
Hey Dario,
there is one simple fix we want to apply sooner rather than later, which is
to use the number of language links for ranking. This should work rather
well. The thing is that this is kinda hard to implement in MySQL, I
figured, and that we would need to use something Lucene based
2013/4/7 Jianyong Zhang zhjy...@gmail.com
On Tue, Apr 2, 2013 at 9:54 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
2013/4/1 Jianyong Zhang zhjy...@gmail.com
1) If it becomes a redirect to another article, will Qx be changed in this
scenario?
I expect that if a Wikipedia article
Hey all,
I just got a warning from Ops that our log table is growing extremely fast.
A write-up of this is here:
https://bugzilla.wikimedia.org/show_bug.cgi?id=47415
Basically, the vast majority of edits on Wikidata are written to the log
table as they are autopatrolled. And since we have a lot
I am completely amazed by a particularly brilliant way that Wikipedia uses
Wikidata. Instead of simply displaying the data from Wikidata and removing
the local data, a template and workflow is proposed, which...
* grabs the relevant data from Wikidata
* compares it with the data given locally in
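The on-wiki template does this with parser functions and Lua; as an outside approximation of the same fetch-and-compare idea, here is a sketch against today's wbgetentities web API. The item ID, property ID, and local value are parameters; nothing specific is assumed, and the JSON shape may have differed at the time of this mail.

import requests

API = "https://www.wikidata.org/w/api.php"

def central_values(qid, pid):
    # Fetch the values Wikidata currently holds for one property of one item.
    r = requests.get(API, params={"action": "wbgetentities", "ids": qid,
                                  "props": "claims", "format": "json"})
    r.raise_for_status()
    claims = r.json()["entities"][qid].get("claims", {}).get(pid, [])
    return [c["mainsnak"]["datavalue"]["value"]
            for c in claims if c["mainsnak"]["snaktype"] == "value"]

def matches_central(local_value, qid, pid):
    # Compare the locally given value with the central data and flag discrepancies.
    return local_value in central_values(qid, pid)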
Hello,
I would like all interested in the interaction of Wikidata and Wiktionary
to take a look at the following proposal. It is trying to serve all use
cases mentioned so far, and still remain fairly simple to implement.
http://www.wikidata.org/wiki/Wikidata:Wiktionary
To the best of our
Can I have a statement about how much easier it would have been with
Wikidata? :)
2013/6/13 Brent Hecht bhe...@cs.umn.edu
Hi all,
In my (recently finished) thesis, I looked at a lot of different
properties (e.g. topic, centrality, popularity via pageviews) of common
and unique concepts