Tom,
how do we know whether anything is the truth? I would argue that for
completeness statements, as discussed by James Heald above, we should use
pretty much the same criteria we use for anything else - i.e. not truth,
but whether the sources support that statement.
I.e. I don't see much
My mindreading skills tell me that you forgot to add this link:
http://www.telegraph.co.uk/news/worldnews/12180516/Geography-of-violence-Map-records-every-battle-ever-fought.html
On Wed, Mar 2, 2016 at 10:58 AM Gerard Meijssen
wrote:
> Hoi,
> A mix of maps Wikidata
Hi all,
thank you for the interest in the primary sources tool!
I wanted to make sure that there are no false expectations. Google has
committed to deliver the initial tool. Thanks to Thomas P’s internship and
support from Thomas S and Sebastian, and with the release of the data, the
code, the
s a pretty amazing world - all you need is a click away. So go ahead and
do what you want to get done.
On Tue, Sep 29, 2015 at 1:07 AM Federico Leva (Nemo) <nemow...@gmail.com>
wrote:
> Denny Vrandečić, 28/09/2015 23:27:
> > Actually, my suggestion would be to
> Hi Denny,
>
> The following R script (
> https://gist.github.com/andrawaag/2b8c831ab4dd70b16cf2) plots wikidata
> content on a worldmap in R.
>
>
> Andra
>
>
> On Tue, Sep 8, 2015 at 10:19 PM, Denny Vrandečić <vrande...@gmail.com>
> wrote:
>
>> Is the
Daniel's answer fits exactly with the proposal (which is unsurprising,
because he reviewed and certainly influenced it).
To make it clear again: the proposal on
https://www.wikidata.org/wiki/Wikidata:Wiktionary/Development/Proposals/2015-05
is a proposal for the tasks that need to be performed.
is clear now. Thanks for your engagement!
Denny
On Sun, May 17, 2015 at 12:20 PM Denny Vrandečić vrande...@gmail.com
wrote:
Daniel's answer fits exactly with the proposal (which is unsurprising,
because he reviewed and certainly influenced it).
To make it clear again: the proposal on
https
this. It is the granddaddy of Wikidata and it does
combine everything you would want as far as lexical data is concerned.
Thanks,
GerardM
On 8 May 2015 at 18:18, Denny Vrandečić vrande...@gmail.com wrote:
I very much agree with Lydia and Nemo that there should not be a separate
Wikibase
on the level of the label, it
does not make sense to have Wiktionary included in Wikidata.
Thanks,
GerardM
On 8 May 2015 at 06:19, Denny Vrandečić vrande...@gmail.com wrote:
I would disagree with requiring the Wiktionary communities to change
their ways. Instead we should adapt our plans
I mean, the lexical data in Wikidata according to the proposal would allow
for statements on Lexemes and Forms. I slipped into the future for a moment
;)
On Thu, May 7, 2015 at 9:32 PM Denny Vrandečić vrande...@gmail.com wrote:
I am not sure I understand what you are saying. The lexical data
I would disagree with requiring the Wiktionary communities to change their
ways. Instead we should adapt our plans to fit into the way they are set up.
Even if the English Wiktionary community would change to have per-language
pages instead of the current system, it would be rather unlikely that
, Denny Vrandečić vrande...@gmail.com wrote:
It is rather clear that everyone wants Wikidata to also support
Wiktionary, and there have been plenty of proposals in the last few years.
I think that the latest proposals are sufficiently similar to go for the
next step: a break down of the tasks needed
Actually I think that having no value for the end date qualifier probably
means that it has not ended yet. There is no other way to express whether
this information is currently merely incomplete (i.e. it has ended, but no
one bothered to fill it in) or not (i.e. it has not ended yet). This is
This is seriously awesome! Thank you!
On Mon, Apr 20, 2015 at 1:18 PM Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
Hi all,
For many years, Denny and I have been giving talks about why we need to
improve the data management in Wikipedia. To explain and motivate this,
we have often
I am happy to let you know about the initial release of the primary sources
tool. More info is available here:
https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
The release is meant to facilitate your feedback. There are probably plenty
of things that should be fixed before the tool
Any time property or the birth date property, specifically?
On Tue Feb 24 2015 at 10:58:09 AM Maximilian Klein isa...@gmail.com wrote:
Next research question:
{q | instance_of(i, q) and has_time_property(i) and has_geo_property(i)}
In this case we know humans (q5) are things that have time
hurtful and dismissive of each other? We have a
great project, riding an amazing wave, and there's too much for each one of
us to do to afford to hurt each other and make this a place less nice than
it could be.
On Fri Feb 20 2015 at 1:44:53 PM Denny Vrandečić vrande...@google.com
wrote
Regarding Paul's comment:
I first heard about Wikidata at SemTech in San Francisco and I was told
very directly that they were not interested in working with anybody who was
experienced with putting data from a generic database in front of users
because they had worked so hard to get academic
Also, the problem most SPARQL backend developers worried about was not
Wikidata's size, but its dynamicity. Not the number of triples, but the
frequency of edits. And we did talk to many of those people.
On Thu, Feb 19, 2015, 07:05 Markus Krötzsch mar...@semantic-mediawiki.org
wrote:
Hi Paul,
by
Denny Vrandečić and Markus Krötzsch. This is a very helpful article
that in my opinion should be featured on the Wikidata main page.
Glad you liked it. Checking the Wikidata item, I notice that it is
actually Open Access and not all rights reserved. It is available for
free (forever) from
Knowledgebase article [1] written by
Denny Vrandečić and Markus Krötzsch. This is a very helpful article
that in my opinion should be featured on the Wikidata main page.
[1] http://cacm.acm.org/magazines/2014/10/178785-wikidata/fulltext
Regards,
James Weaver
On Wed, Jan 7, 2015, at 05:14 PM
Actually, since Wikidata allows now properties on properties, one might
easily create an item Disambiguating property and then make a claim
instance of - Disambiguating property on the relevant property. There is
no need for any extra implementation work.
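As a sketch of what that claim could look like in wbeditentity-style JSON: P31 is "instance of", but Q99999999 is a made-up placeholder for the community-created "Disambiguating property" item (no such item ID is given in the mail).

```python
import json

# Illustrative assumption: Q99999999 stands in for a community-created
# "Disambiguating property" item; P31 is "instance of".
Q_DISAMBIG = "Q99999999"

def instance_of_claim(item_id):
    """Build a wbeditentity-style claim stating `instance of -> item_id`."""
    return {
        "mainsnak": {
            "snaktype": "value",
            "property": "P31",
            "datavalue": {
                "value": {"entity-type": "item",
                          "numeric-id": int(item_id.lstrip("Q"))},
                "type": "wikibase-entityid",
            },
        },
        "type": "statement",
        "rank": "normal",
    }

claim = instance_of_claim(Q_DISAMBIG)
payload = json.dumps({"claims": [claim]})
```

The claim would simply live on the property page itself, which is why no extra implementation work is needed.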
On Wed Jan 07 2015 at 9:48:32 AM Thad
I found out the other day that there's an item about myself, and I wanted
to edit it, and got a weird feeling about it. So I raised the question on
the project chat
https://www.wikidata.org/wiki/Wikidata:Project_chat#COI_and_editing
and got told that an RFC would be a good idea. So I tried one.
In OWL this is done through instance of (i.e. rdf:type) pointing at a
Transitive Property class (owl:TransitiveProperty). So the most similar
representation of that in Wikidata would be to have an item for transitive
property, and make an instance of: transitive property statement on the
Hi Gerard,
I very much agree. It would be very good to have a discussion on which kind
of data can be integrated in which way.
One way or the other, one of the most frequent criticisms of Wikidata is a
lack of references, which this tool will tackle on the way as well. And at
the same time it
wohoo. that's pretty awesome! congrats.
Are they going to use the soon-to-be-available property mapping properties?
On Tue Dec 02 2014 at 1:33:38 PM Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey folks :)
The student team working on data quality and trust is hard at work and
just
On Tue Nov 11 2014 at 1:51:08 PM Denny Vrandečić vrande...@google.com
wrote:
+1 for removing the blacklist from the code.
On Tue Nov 11 2014 at 12:28:05 AM John Erling Blad jeb...@gmail.com
wrote:
What did I say, etc, etc, etc... It feels good to be right. I was
right. Me. I and myself
On Tue Nov 11 2014 at 1:51:32 PM Denny Vrandečić vrande...@google.com
wrote:
On Tue Nov 11 2014 at 1:51:08 PM Denny Vrandečić vrande...@google.com
wrote:
+1 for removing the blacklist from the code.
On Tue Nov 11 2014 at 12:28:05 AM John Erling Blad jeb...@gmail.com
wrote:
What did I
Folks,
as you know, many Googlers are huge fans of Wikipedia. So here’s a little
gift for Wikidata’s second birthday.
Some of my smart colleagues at Google have run a few heuristics and
algorithms in order to discover Wikipedia articles in different languages
about the same topic which are
Sure, you can keep all your todos with Google ;)
https://www.gmail.com/mail/help/tasks/
Cheers,
Denny
On Wed Oct 29 2014 at 2:58:03 PM Jeroen De Dauw jeroended...@gmail.com
wrote:
Hey,
Does this mean we can also shoot a TODO list in the direction of Google? :)
Cheers
--
Jeroen De Dauw
That's a great idea!
Just curious, for such a specific use case, why did you go for an App
instead of a Website?
On Tue Oct 28 2014 at 7:29:22 AM Sjoerd de Bruin sjoerddebr...@me.com
wrote:
Not available in the Dutch iTunes Store...
On 28 Oct 2014 at 15:26, Pierre-Yves Beaudouin
Yay! Congratulations!
On Mon Oct 27 2014 at 4:55:51 PM John Lewis johnflewi...@gmail.com wrote:
Hi everyone,
Some exciting news here. The Open Data Awards' finalists lists were
recently published on their website. Wikidata has been listed as a finalist
in two different categories which are
Wow! That's pretty cool work!
Do you have any plans to keep the data fresh?
On Mon Oct 06 2014 at 1:22:12 PM Benjamin Good ben.mcgee.g...@gmail.com
wrote:
I thought folks might like to know that every human gene (according to the
United States National Center for Biotechnology Information)
That's very cool! To get an idea, how big is your dataset?
On Tue Sep 30 2014 at 12:06:56 PM Daniel Kinzler
daniel.kinz...@wikimedia.de wrote:
What makes it so slow?
Note that you can use wbeditentity to perform complex edits with a single
api
call. It's not as straightforward to use as,
Fully agree with Markus' beautifully written explanation, although I am not
completely convinced of the level theory - but it seems to work in the
given examples, and a few other examples I was thinking through.
Note that Porsche 356 could very much be an instance of car model - but
not of car.
On Sep 13, 2014 3:20 PM, P. Blissenbach pu...@web.de wrote:
Regarding purely factual data comprising a less than significant portion
of a
database - which is certainly true for all ISBNs in Google's database
Btw. if a statement about an ISBN is sourced, among others, with Source:
Google,
that
Hey Marieke,
You can either use the Wikidata toolkit by Markus Krötzsch, if you want to
work on the dump, or the Wikidata web API, if you only need a few such
mappings at a time.
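For the web API route, a minimal sketch of building such a mapping request with the standard wbgetentities module (titles and site are examples; no error handling or actual network call shown):

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def mapping_request_url(titles, site="enwiki"):
    """Build a wbgetentities request mapping Wikipedia titles to item IDs."""
    params = {
        "action": "wbgetentities",
        "sites": site,
        "titles": "|".join(titles),
        "props": "sitelinks",
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = mapping_request_url(["Berlin", "Amsterdam"])
```

The response's `entities` object is keyed by item ID, which gives the title-to-item mapping directly.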
On Jul 17, 2014 9:24 AM, Erp, M.G.J. van marieke.van@vu.nl wrote:
Hi there,
I was wondering how to get the
Hi Markus,
On Wed Apr 09 2014 at 4:18:50 AM, Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
Change to the directory of the example module (wdtk-examples), then run:
mvn exec:java
-Dexec.mainClass=org.wikidata.wdtk.examples.DumpProcessingExample
Thanks, that is exactly what I needed!
I was trying to use this, but my Java is a bit rusty. How do I run the
DumpProcessingExample?
I did the following steps:
git clone https://github.com/Wikidata/Wikidata-Toolkit
cd Wikidata-Toolkit
mvn install
mvn test
Now, how do I start DumpProcessingExample?
Sorry for being a bit dense here.
That's a toughie. Looking forward to seeing that one resolved :)
On Wed, Apr 2, 2014 at 2:14 AM, Andy Mabbett a...@pigsonthewing.org.uk wrote:
On 1 April 2014 20:01, Denny Vrandečić vrande...@google.com wrote:
a bug on the github project
I've raised another, about the use of adjectives
visitors of a
given page
in a given language would hit the same cache entry. That seems workable.
Anyway, we are not there quite yet, just something to ponder :)
-- daniel
On 01.04.2014 at 20:14, Denny Vrandečić wrote:
I just published qLabel, an Open Source jQuery plugin that allows
I would very strongly recommend to use Semantic MediaWiki for this use
case. It is more powerful, we use SMW in other WMF contexts already, and
supporting the data inside Meta (instead of inside Wikidata and then
transcluding it) allows us also to generate workflows in Meta involving
local
it is worth starting to present the concepts/options
now. Besides, ideas and a common understanding take time to develop, and
the RFC was started, so I thought it was worth giving it some attention.
Cheers,
Micru
On Fri, Mar 7, 2014 at 12:18 AM, Denny Vrandečić vrande...@gmail.com wrote:
Since
Wikidata labels are simple. This is due to the necessities of the project.
We need one single label to display. Having Wikidata labels with ranks,
qualifiers, sources, etc. simply would not work in the UI.
Labels and names in reality are indeed extremely complex. But as already
pointed out, this
Welcome to Wikidata! I am very much looking forward to seeing the results of
your work. The demo looks very promising and the results are already so
much better than what we currently have. And also the answers are very
fast, which is promising.
Awesome work, and welcome!
On Thu Jan 09 2014 at
The main reason why Queries are not done yet is because in the beginning of
2013 I deprioritized them compared to the original plan. Only a single
developer kept working on them, instead of a major part of the team, as was
originally planned.
I made this decision because it became clear to me
This would require a locally installable Lua library, which currently does
not exist. It would be a great project to do something like this!
The same is true for other languages, but I understand the particular use
case you have in mind.
On Thu, Jan 2, 2014 at 1:43 PM, Voß, Jakob
Thanks for reviving this thread, Luiz. I also wanted to ask whether we
should be updating parts of DNB and similar data. Maybe not create new
entries, but for those that we already have, add some of the available data
and point to the DNB dataset?
On Fri, Dec 6, 2013 at 3:24 PM, Luiz Augusto
It is either obvious that they should be entering only integers or positive
numbers, in which case such feedback isn't helpful, or it might end up
being too restrictive again. Who tells me that a system like this won't get
used in order to force cities to have a population of an integer bigger
Fri 22.11.2013 21:56, Denny Vrandečić wrote:
It is either obvious that they should be entering only integers or
positive numbers, in which case such feedback isn't helpful, or it might
end up being too restrictive again. Who tells me that a system like this
won't get used in order to force
Hello Antoine,
just to add to what was already said:
a Qualifier in Wikidata is not a statement about a statement. In RDF
semantics, the pattern that we follow is not the reification of the triple
and then to make triples with the reified triple as a subject, as per
This is completely up to the community, whether they want this data and the
necessary structures for it. It really depends on the scope of the dataset.
But here it is the same: there is no way to use this data in short-term for
the metadata in Commons. This will be possible in a few months, if you
I would be surprised if that theory held true. I expect that both very
abstract (fruit) and extremely specific (golden delicious) items would have
a lower sitelink count than the golden layer of most useful terms (apple)
in the hierarchy (I am reminded of the theory of word length and term
Not good, but just to make a suggestion...
DataValues
DataServices
DataValuesExt
2013/9/5 Jeroen De Dauw jeroended...@gmail.com
Hey,
I don't really like this. If I see DataValues and DataValuesInterfaces
side by side, i'd assume that DataValuesInterfaces contains the interfaces
and
https://test.wikidata.org/wiki/Q133
cheers
Finn Årup Nielsen
On 09/06/2013 12:10 PM, Denny Vrandečić wrote:
Hello all,
in preparation of next week's deployment to Wikidata.org,
test.wikidata.org http://test.wikidata.org now has the new datatype
URL deployed.
If you have
be, in non edit mode show the star or whatever icon
represents a specific badge, and in edit mode just provide a drop down
where the badge can be selected from.
Cheers, tobi
On Sep 4, 2013 7:05 PM, Michał Łazowik mlazo...@me.com wrote:
Message written by Denny Vrandečić denny.vrande
Message written by Denny Vrandečić denny.vrande...@wikimedia.de on
5 Sep 2013 at 13:09:
full agreement.
2013/9/4 Tobi Gritschacher tobias.gritschac...@wikimedia.de
Hi,
changing the badge should not be possible by just clicking the star. The
user has to hit the edit button first
OK, based on the discussion so far, we will add the data type to the snak
in the external export, and keep the string data value for the URL data
type. That should satisfy all use cases that have been brought up.
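In other words, an exported URL snak would carry an explicit datatype while the datavalue stays a plain string. A minimal sketch of that shape (P856, "official website", is used purely for illustration):

```python
# Sketch of the agreed export shape: the snak states its datatype ("url")
# explicitly, while the datavalue itself remains a plain string.
snak = {
    "snaktype": "value",
    "property": "P856",  # "official website" (illustrative choice)
    "datatype": "url",
    "datavalue": {
        "type": "string",
        "value": "https://www.wikidata.org/",
    },
}
```

Consumers that care about URL semantics can check the datatype; everyone else can keep treating the value as a string.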
2013/9/2 Daniel Kinzler daniel.kinz...@wikimedia.de
On 02.09.2013 00:57,
Just following up on some discussion I had with DanielK and Jeroen today on
this, and summarizing it for the mailing list.
I still fail to see what the advantage would be to use the IRI datavalue -
especially when it is basically stripped down to be a string datavalue, as
Jeroen suggests in the
Hi Rob,
sorry for the longer wait on this one.
2013/8/28 Rob Lanphier ro...@wikimedia.org
On Tue, Aug 27, 2013 at 4:27 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
This was already discussed here:
http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg69983.html
Hi Rob,
responses inline.
2013/8/26 Rob Lanphier ro...@wikimedia.org
On Fri, Aug 23, 2013 at 8:03 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
for Semantic MediaWiki we had this already listed for a while here:
https://meta.wikimedia.org/wiki/Wikidata/Notes/SMW_and_Wikidata
Just a few corrections to the historical dates given by Tom.
2013/8/23 Tom Morris tfmor...@gmail.com
In a word, no. Google acquired Metaweb, the company that built Freebase,
which forms the core of the Knowledge Graph, in 2010. Metaweb was founded
in 2005 (interesting Google search: Metaweb
Oh, that's a clear and loud I have no idea :)
2013/8/23 Tom Morris tfmor...@gmail.com
On Fri, Aug 23, 2013 at 10:10 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
I understand Michael's question to be much more concrete: does the
progress in Wikidata have anything to do
Hi Maarten,
thanks. That's the best proposal I have seen so far in how to proceed with
Phase 1 on Commons. I usually had pushed Commons support further to the
back, but with this I think we would indeed create some real value with a
small change. I will bounce Commons Phase 1 client support up on
This doesn't really work yet in the UI. Basically, you could only enter
something like 6th c. BC, which in this case would not be correct.
1st Millennium BC would be possible and correct, but it is a bit too wide.
That's the only thing supported right now.
We will be working on improving this situation.
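For background, a sketch of how such coarse dates look in the data model, assuming the documented precision codes (6 = millennium, 7 = century, 9 = year, 11 = day) and using the proleptic Gregorian calendar model item for illustration:

```python
# Precision codes from the Wikibase time data model: smaller numbers mean
# coarser dates.
PRECISION = {"millennium": 6, "century": 7, "decade": 8, "year": 9, "day": 11}

def time_value(time, precision):
    """Build a Wikibase-style time datavalue (calendar model illustrative)."""
    return {
        "time": time,
        "precision": PRECISION[precision],
        "calendarmodel": "http://www.wikidata.org/entity/Q1985727",  # Gregorian
    }

# "6th century BC": century precision is the best the UI supports here.
sixth_c_bc = time_value("-0550-00-00T00:00:00Z", "century")
```

With only century (7) and millennium (6) on offer, a date known to a narrower range than a century simply cannot be expressed yet, which is the limitation described above.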
Hi Jan,
we currently assume that we will have a first querying capability available
this fall. The implementation has progressed very well in the last few
months and weeks, including special pages to access it, API modules, etc.
Indeed querying will be available later than originally anticipated
[Sorry for cross-posting]
Yes, I agree that the OmegaWiki community should be involved in the
discussions, and I pointed GerardM to our proposals and discussions,
using him as a liaison. We also looked and keep looking at the
OmegaWiki data model to see what we are missing.
Our latest
Following numerous discussions and after input from many people, we are
happy to present the new version of the proposal that would lead to
Wikidata supporting structured data for the Wiktionaries.
http://www.wikidata.org/wiki/Wikidata:Wiktionary
I am very thankful to all those that provided
Due to Wikimania and general unbearableness of the weather, the next
deployment and branching will be a bit later.
The next branching is planned for August 21st (that will be branch
1.22wmf16, as it is currently called).
This will be deployed on Thu, August 22 to test,
on Mon, August 26 to
That's amazing! And so fast! Any idea how many links there have been? (Just
curious).
Thanks for reporting!
Denny
2013/7/29 Romaine Wiki romaine_w...@yahoo.com
Hello all,
I am happy to announce that all interwikis from all articles, templates,
project pages (except some archive pages) have
Hi Jacobo,
I hope you don't mind that I share the answer with the list. I think the
answer to this question might be of general interest.
The JavaScript creating the visualization in the browser is here:
https://dl.dropboxusercontent.com/u/172199972/map/map.js
As you can see it is just a
Small correction.
2013/7/22 Denny Vrandečić denny.vrande...@wikimedia.de
* Subscriptions: one table on the client. It has two columns, one with the
pageId and one with the siteId, indexed on both columns (and one column
with a pk, I guess, for OSC).
That's entityId - siteId, not pageId
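A minimal sketch of that corrected schema, using SQLite for illustration (the table and column names are assumptions based on the description above, not the actual Wikibase schema):

```python
import sqlite3

# Two payload columns, entityId and siteId, each indexed, plus a surrogate
# primary key as mentioned for OSC.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE subscriptions (
        row_id   INTEGER PRIMARY KEY,
        entityId TEXT NOT NULL,
        siteId   TEXT NOT NULL
    )
""")
conn.execute("CREATE INDEX idx_entity ON subscriptions (entityId)")
conn.execute("CREATE INDEX idx_site   ON subscriptions (siteId)")

# A client wiki (siteId) subscribing to changes on one entity (entityId):
conn.execute("INSERT INTO subscriptions (entityId, siteId) VALUES (?, ?)",
             ("Q64", "enwiki"))
rows = conn.execute("SELECT entityId, siteId FROM subscriptions").fetchall()
```

The two indexes cover both lookup directions: "which sites subscribe to this entity" and "which entities does this site subscribe to".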
The AAAI awards the Feigenbaum Prize to the Watson team, which decides to
donate the prize money to the Wikimedia Foundation, explicitly listing
Wikidata as a reason.
When asked for a comment, Wikidata said:
Q2013 P3 Q12253 .
Congratulations to the Watson team and their stunning results!
More
In June 2012 I ran an analysis to discover how many language links were on
Wikipedia. Last week, I reran the analysis - and the results are
stunning.
Of the 240 Million language links, 239.2 Million have been removed so far.
This is an amazing result by the community. Congratulations.
Last
copied from
http://www.wikidata.org/wiki/Wikidata:Project_chat#Propagation_of_changes_to_the_Wikipedias_currently_lagging
Changes to Wikidata are currently propagated to the Wikipedias with a lag
of several hours, but this should be fixed during the next few hours.
The Dispatcher, who is
to continue to effectively
further our goals towards a world where everyone has access to the sum of
all knowledge.
Sincerely,
Denny Vrandečić
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia
Hi Hady,
use the MediaWiki API, like this:
http://www.wikidata.org/w/api.php?action=query&list=allpages&format=json&apnamespace=120&aplimit=10
You can list through all the results using
http://www.wikidata.org/w/api.php?action=query&list=allpages&format=json&apnamespace=120&aplimit=10&apcontinue=P110
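The same continuation pattern can be sketched programmatically, building the request URLs with the parameters from the links above (URL construction only, no actual network call):

```python
from urllib.parse import urlencode

API = "http://www.wikidata.org/w/api.php"
BASE = {
    "action": "query",
    "list": "allpages",
    "format": "json",
    "apnamespace": "120",  # the Property namespace on Wikidata
    "aplimit": "10",
}

def page_url(apcontinue=None):
    """Build one allpages request; pass the previous response's
    continuation value to fetch the next batch."""
    params = dict(BASE)
    if apcontinue is not None:
        params["apcontinue"] = apcontinue
    return API + "?" + urlencode(params)

first = page_url()
second = page_url("P110")
```

Each response carries the `apcontinue` value to feed into the next request until no continuation is returned.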
I just wanted to say thank you! That's truly amazing work.
As far as I can tell, more than 200 Million lines of wikitext have so far
been removed from the Wikipedias. That's 200 Million lines that do not have
to be maintained anymore.
(I have not run the actual analysis yet, I have been waiting for
done this change.
2013/6/20 Denny Vrandečić denny.vrande...@wikimedia.de
Thinking about it again, and discussing it internally, maybe we should
replace word with expression and meaning with sense?
Any +1's or differing opinions?
2013/6/20 Denny Vrandečić denny.vrande...@wikimedia.de
It was never intended to create a Wiktionary Database separate from
Wikidata, but have it being a part of Wikidata.
2013/6/21 Gerard Meijssen gerard.meijs...@gmail.com
Hoi,
Denny, when you look at the data currently in Wikidata, you find what is
in essence more than a basis for a
Thank you, Sundar!
2013/6/20 BalaSundaraRaman sundarbe...@yahoo.com
Hi Denny,
I've left a message at the Tamil Wiktionary Village Pump.
for Wordnet, especially how it connects synsets to Freebase topics:
https://www.freebase.com/base/wordnet/synset?schema=
Tom
On Wed, Jun 19, 2013 at 9:57 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hello,
I would like all interested in the interaction of Wikidata and Wiktionary
languages don't follow the same logic.
On the other hand, do you think it would be possible to accommodate
grammar rules too?
I have added some people from Apertium that might have some insights about
it.
Cheers,
Micru
On Wed, Jun 19, 2013 at 9:57 AM, Denny Vrandečić
denny.vrande
Thinking about it again, and discussing it internally, maybe we should
replace word with expression and meaning with sense?
Any +1's or differing opinions?
2013/6/20 Denny Vrandečić denny.vrande...@wikimedia.de
The current proposal does not cover grammar rules explicitly. If at all, I
would
Tomorrow at 2pm Berlin time, the Wikidata team will host a public hangout
on Travis. This is mostly meant to inform ourselves, but it might be a good
resource for others as well.
We might be late, as this is the first time we are doing such a thing, so
bring a bit of patience.
We will try to record
Hello,
I would like all interested in the interaction of Wikidata and Wiktionary
to take a look at the following proposal. It is trying to serve all use
cases mentioned so far, and remain still fairly simple to implement.
http://www.wikidata.org/wiki/Wikidata:Wiktionary
To the best of our
Can I have a statement about how much easier it would have been with
Wikidata? :)
2013/6/13 Brent Hecht bhe...@cs.umn.edu
Hi all,
In my (recently finished) thesis, I looked at a lot of different
properties (e.g. topic, centrality, popularity via pageviews) of common
and unique concepts
with WordNet's license?
Neil
On Wed, Jun 19, 2013 at 9:57 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hello,
I would like all interested in the interaction of Wikidata and Wiktionary
to take a look at the following proposal. It is trying to serve all use
cases mentioned so far
I am completely amazed by a particularly brilliant way that Wikipedia uses
Wikidata. Instead of simply displaying the data from Wikidata and removing
the local data, a template and workflow is proposed, which...
* grabs the relevant data from Wikidata
* compares it with the data given locally in
2013/4/7 Jianyong Zhang zhjy...@gmail.com
On Tue, Apr 2, 2013 at 9:54 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
2013/4/1 Jianyong Zhang zhjy...@gmail.com
1) It becomes redirect to another article, will Qx be changed in this
scenario?
I expect that if a Wikipedia article
Hey all,
I just got a warning from Ops that our log table is growing extremely fast.
One write-up of this is here:
https://bugzilla.wikimedia.org/show_bug.cgi?id=47415
Basically, a vast majority of edits on Wikidata are written to the log
table as they are autopatrolled. And since we have a lot
Hey Dario,
there is one simple fix we want to apply sooner rather than later, which is
to use the number of language links for ranking. This should work rather
well. The thing is that this is kinda hard to implement in MySQL, I
figured, and that we would need to use something Lucene based
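The proposed ranking itself is easy to sketch; the items and sitelink counts below are made up purely for illustration:

```python
# Toy data: item IDs with labels and (invented) sitelink counts.
items = {
    "Q64":  {"label": "Berlin", "sitelinks": 180},
    "Q5":   {"label": "human",  "sitelinks": 250},
    "Q111": {"label": "Mars",   "sitelinks": 140},
}

def ranked(items):
    """Order search results by language-link count, most-linked first."""
    return sorted(items, key=lambda q: items[q]["sitelinks"], reverse=True)

order = ranked(items)
```

The hard part is not the sort itself but keeping such a score in the search index, which is why something Lucene-based is mentioned.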
This is in my opinion an upstream issue for MediaWiki proper. I do not
think that templates and images from Commons are that different. Take this
image for example:
https://en.wikipedia.org/wiki/File:Treaty_of_Accession_2011_Ratification_Map.svg
It always reflects the current state of
end well.
I am not even sure it is much more complicated. But I am very worried it is
too different.
Cheers,
Denny
Petr Onderka
[[en:User:Svick]]
2013/3/28 Denny Vrandečić denny.vrande...@wikimedia.de:
We have a first write up of how we plan to support queries in Wikidata.
Comments
) can be added to
what links here.
Feel free to include back the distribution list in your reply if you see
merits in this suggestion.
Best Regards,
Alex
On Apr 2, 2013, at 9:54 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Hi Janyong,
as Michael said, Wikidata does
We have a first write up of how we plan to support queries in Wikidata.
Comments on our errors and requests for clarifications are more than
welcome.
https://meta.wikimedia.org/wiki/Wikidata/Development/Queries
Cheers,
Denny
P.S.: unfortunately, no easter eggs inside.
--
Project director
Oh, I would please ask to wait another week or two, for us to have
qualifiers. Maybe they can deal with some of these cases. We just got them
demoed today, and they really look neat, so I am very convinced they will
be there with the next update.
2013/3/27 Michael Hale hale.michael...@live.com
We do have strong types, but only a few of them: item, commons media, string,
time, geo, URL. Government leader would not be a supported type.
The exact list and details are here:
http://meta.wikimedia.org/wiki/Wikidata/Data_model#Datatypes_and_their_Values
Cheers,
Denny
2013/3/21 Michael
It really depends on your definitions :)
Items are strongly typed as items. Any item can have any property. And only
items can have properties. Time or geocoordinates, e.g., can not have
properties.
But yes, there is no forcing of properties onto any item, nor any
restriction of usage of every
Hi Dario,
two or three features are still missing to enable that (sorted in order we
are probably going to deploy them):
* qualifiers
* the time datatype
* statement ranks
As soon as they are available, this can be modeled in a way that it can be
useful for projects accessing the data.
So,