Hi,
On 08.05.2015 09:40, Stas Malyshev wrote:
Hi!
Other technical solutions can be found for keeping content apart when
needed (e.g., separate dumps by entity types).
It's not only dumps, it's also searches, APIs, special pages, etc. Of
course, everything can be solved with enough time and
On 08.05.2015 08:50, Lydia Pintscher wrote:
On Fri, May 8, 2015 at 7:15 AM, Stas Malyshev smalys...@wikimedia.org wrote:
I am worried that having two different data sets within the same
instance would be a problem for tools working with the data, and for
humans too. And frankly, I don't see too
Spam. The salutation "Dear Colleague" sent to this mailing list is giving
it away, I know, but for less obvious cases, a good general guideline is
to avoid research conferences that ask you to pay for each paper you
publish there. Legit research events decouple paper selection from
financial
On 08.05.2015 11:30, Thomas Douillard wrote:
I don't get this, is this really a technical issue or just an interface
one? It can be pretty clear to users that the semantic entity pages are
very different from lexical entities in the same instance just by
tweaking the UI. Or with separate
On 29/04/2015 21:44, Markus Krötzsch wrote:
On 29.04.2015 20:56, Luca Martinelli wrote:
Dear all,
I need to know about the possibility of making queries on a Wikibase
are naturally a good way to deal with
definitions.
2015-04-29 19:35 GMT+02:00 Markus Krötzsch
mar...@semantic-mediawiki.org:
Hi,
General case first: Many statements depend on time and have an end
date (e.g., population numbers). The general
On 29.04.2015 20:56, Luca Martinelli wrote:
Dear all,
I need to know about the possibility of making queries on a Wikibase
instance. I think it is possible to make queries on data on a
particular instance only with external tools at the moment, right?
Yes, this is correct. The SPARQL query
Hi,
General case first: Many statements depend on time and have an end date
(e.g., population numbers). The general approach there is to (1) have a
qualifier that clarifies the restricted temporal validity and (2) make
the current statement preferred. So your idea with the ranks was a
good
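The rank-plus-qualifier pattern described above can be sketched in a few lines of Python. The statement dicts below are simplified, made-up data for illustration, not the exact dump layout:

```python
# Sketch: pick the "current" value from a ranked statement group,
# assuming the general Wikidata approach of ranks plus an end-time
# qualifier. The sample data is invented for illustration.

statements = [
    {"rank": "normal",    "value": 1185000, "qualifiers": {"P582": "1990-12-31"}},
    {"rank": "preferred", "value": 1353000, "qualifiers": {}},  # current value
]

def current_value(stmts):
    """Prefer 'preferred'-rank statements; fall back to 'normal' ones."""
    preferred = [s for s in stmts if s["rank"] == "preferred"]
    candidates = preferred or [s for s in stmts if s["rank"] == "normal"]
    return [s["value"] for s in candidates]

print(current_value(statements))  # [1353000]
```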
On 26.04.2015 22:16, Gerard Meijssen wrote:
Hoi,
I regularly query for, for instance, claim[31], i.e. any instance of
whatever... I would also query for the existence of a date of death in a
similar way. For me, a claim that says there
is no value would be a positive result
On 26.04.2015 22:28, Gerard Meijssen wrote:
Hoi,
It is a matter of perspective. From my perspective, a value either exists or
it does not, and depending on that I may or may not want to process it. When
you state novalue, there is a value of novalue, and that is not the same as
there not being a value in the first place.
Ah, I
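Gerard's distinction maps directly onto snak types in the Wikidata JSON model, where an explicit novalue snak differs from the property being absent altogether. A minimal Python sketch (the claims structure is heavily simplified for illustration; real dumps nest this deeper):

```python
# Three distinct situations for, say, date of death (P570),
# using simplified claims dicts.

alive    = {"P570": [{"snaktype": "novalue"}]}   # asserted: no date of death
deceased = {"P570": [{"snaktype": "value", "datavalue": "1955-04-18"}]}
unknown  = {}                                     # nothing asserted at all

def death_status(claims):
    snaks = claims.get("P570")
    if snaks is None:
        return "unknown"                          # no claim: open world
    if any(s["snaktype"] == "novalue" for s in snaks):
        return "asserted alive"                   # explicit novalue is a positive result
    return "deceased"

print(death_status(alive))  # asserted alive
```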
Quick reply to Denny and Gerard:
@Denny: I think it makes sense to treat qualifiers under a closed-world
semantics. That is: what is not there can safely be assumed to be false.
In this I agree with Gerard. OTOH, I don't think it hurts very much to
add them anyway.
@Gerard: Please note that
On 23.04.2015 12:25, Thomas Douillard wrote:
This is a question of point of view and of how to resolve conflicting
declarations, which is far larger than this. There could be disputes over who
is really the father of something; this would be the same.
such a statement in Wikidata means:
* This source says
On 22.04.2015 22:10, Stas Malyshev wrote:
Hi!
...
While letters like ч and щ can indeed
generate some long combinations which are not very visually appealing,
Tell me about it! -- M. Kroetzsch
___
Wikidata-l mailing list
,
so the query will have to change accordingly in the future.
Markus
On Mon, Apr 20, 2015 at 3:50 PM, Markus Krötzsch
mar...@semantic-mediawiki.org
wrote:
On 20.04.2015 23:47, Daniel Kinzler wrote:
Something seems to be wrong with the order
On 21.04.2015 11:27, Daniel Kinzler wrote:
On 21.04.2015 at 00:50, Markus Krötzsch wrote:
On 20.04.2015 23:47, Daniel Kinzler wrote:
Something seems to be wrong with the order, though. Munich (pop 1m in all
statements) is listed way after Chemnitz (pop 300k in all statements). Any
idea why
? They should be xsd:decimal...
They are.
Markus
On 20.04.2015 at 22:18, Markus Krötzsch wrote:
Hi all,
For many years, Denny and I have been giving talks about why we need to improve
the data management in Wikipedia. To explain and motivate this, we have often
asked the simple question: What
On 20.04.2015 22:29, Nicola Vitucci wrote:
...
I hope this is inspiring to some of you. One could also look for the
world's youngest or oldest current mayors with similar queries, for
example.
Markus, this is really cool! Can I reuse it as an example on WikiSPARQL? :-)
Yes, of course.
On 20.04.2015 22:51, Stas Malyshev wrote:
Hi!
is as follows (with some explaining comments inline):
This is very nice, thanks! Will use this as a test case for the query
engine (btw yes it works on my test machine just fine :).
more than one match per city then, even with DISTINCT).
Hi all,
For many years, Denny and I have been giving talks about why we need to
improve the data management in Wikipedia. To explain and motivate this,
we have often asked the simple question: What are the world's largest
cities with a female mayor? The information to answer this is clearly
rdfs:label ?label .
FILTER ( LANG(?label) = "en" )
}
} GROUP BY ?country ?label ORDER BY DESC(?count)
There seems to be a great imbalance here, which could indicate some
bias/incompleteness of our data -- or, possibly, of the world.
Cheers,
Markus
On Mon, Apr 20, 2015 at 1:18 PM Markus Krötzsch
Hi Matthew,
You can use our experimental SPARQL endpoint
http://milenio.dcc.uchile.cl/sparql. It has direct relations for all
statements that have no qualifiers, and two-step relations for all
statements (with or without qualifiers), which are a bit more complex
but give you more power over
Hi Alan,
The SitelinksExample shows how to get the basic language-links data. In
Wikidata, sites are encoded by IDs such as enwiki or frwikivoyage.
To find out what they mean in terms of URLs, you need to get the
interlanguage information first. The example shows you how to do this.
The
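A minimal sketch of the second step, once the interlanguage information is available: mapping a site ID and page title to a URL. The site table excerpt below is hand-made for illustration; real code would obtain it first, as the SitelinksExample does.

```python
# Sketch: turning a sitelink entry (site ID + title) into a page URL,
# assuming a tiny hand-made excerpt of the site table.

site_urls = {
    "enwiki": "https://en.wikipedia.org/wiki/$1",
    "frwikivoyage": "https://fr.wikivoyage.org/wiki/$1",
}

def sitelink_url(site_id, title):
    """Fill the site's URL pattern with the (underscore-normalized) title."""
    pattern = site_urls[site_id]
    return pattern.replace("$1", title.replace(" ", "_"))

print(sitelink_url("enwiki", "Douglas Adams"))
# https://en.wikipedia.org/wiki/Douglas_Adams
```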
On 09.04.2015 01:11, Nicola Vitucci wrote:
...
Indeed. I made this temporary change on WikiSPARQL, so that links like
in Jean-Baptiste's examples may work properly. If you try this:
http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/entity/Q18335803%3E
and then click on
testing, but if your main interest is in the UI and not the
backend, this might be a nice cooperation.
Cheers,
Markus
On 09.04.2015 22:16, Markus Krötzsch wrote:
On 09.04.2015 01:11, Nicola Vitucci wrote:
...
Indeed. I made this temporary change on WikiSPARQL, so that links like
in Jean
On 08.04.2015 17:24, Nicola Vitucci wrote:
On 08/04/2015 16:36, Markus Krötzsch wrote:
On 08.04.2015 15:07, Nicola Vitucci wrote:
Hi Markus,
would you recommend to add some sort of patch until the new dumps are
out, either in the data (by adding some triples to a temporary graph) or
just
Hi Jean-Baptiste,
Your observation is correct. This is because a single Wikidata statement
(with one Wikidata property) does not translate into a single triple
(with one RDF property) in RDF. Rather, several RDF triples are used,
they need to use more than one property, and these properties
first,
e.g., ranks in RDF).
Cheers
Markus
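A rough Python sketch of why one statement yields several triples: the statement gets its own node, to which value, rank, and qualifier triples attach. The prefixes and property naming here are illustrative, not the exact export vocabulary:

```python
# Sketch: expanding one Wikidata statement into several RDF triples via
# an intermediate statement node. Names like p:/ps:/wds: are illustrative.

def statement_triples(item, prop, stmt_id, value, rank):
    stmt = f"wds:{stmt_id}"
    return [
        (f"wd:{item}", f"p:{prop}", stmt),   # item -> statement node
        (stmt, f"ps:{prop}", value),         # statement node -> plain value
        (stmt, "wikibase:rank", rank),       # plus rank, qualifiers, references...
    ]

for t in statement_triples("Q64", "P1082", "Q64-abc123",
                           '"3500000"^^xsd:decimal', "wikibase:PreferredRank"):
    print(t)
```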
Cheers,
Nicola
(wikisparql.org)
On 08/04/2015 14:49, Markus Krötzsch wrote:
Hi Jean-Baptiste,
Your observation is correct. This is because a single Wikidata statement
(with one Wikidata property) does not translate into a single triple
On 08.04.2015 16:02, Jean-Baptiste Pressac wrote:
Thank you, I will have a look at your publication to get a better
understanding of the mechanism of RDFisation.
Are you also going to solve the problem with the links to the Wikidata
ontology? For instance, on this page
On 06.04.2015 22:02, Markus Krötzsch wrote:
Dear Sebastian,
Using OWL is surely a nice idea when the semantics is appropriate (i.e.,
where you want Open-World entailment, not constraints) and here the
Possibly misleading typo: I meant where, not here ;-) -- Markus
expressiveness is enough
Dear Sebastian,
Using OWL is surely a nice idea when the semantics is appropriate (i.e.,
where you want Open-World entailment, not constraints) and here the
expressiveness is enough. This is much more difficult, however, than one
might at first think it is. For a simple example, the common
and effective model for the community to build upon.
-Ben
On Mon, Apr 6, 2015 at 1:03 PM, Markus Krötzsch
mar...@semantic-mediawiki.org
wrote:
On 06.04.2015 22:02, Markus Krötzsch wrote:
Dear Sebastian,
Using OWL is surely a nice idea when
Hi Erik, hi all,
Aren't those properties already distinguished by the classification
statements we now have on property pages? For example:
https://www.wikidata.org/wiki/Property:P214
Defines the VIAF id to be a unique identifier (yes, this is somewhat
questionable modelling, since a
Brilliant, we should set up a page with a list of SPARQL endpoints for
Wikidata! For production usage, it is great to have a variety to choose from.
==WARNING==
The RDF format is currently in flux. The purpose of the Chilean endpoint
http://milenio.dcc.uchile.cl/sparql is to gather feedback
).
Markus
On Wed, Mar 11, 2015 at 2:09 PM Markus Krötzsch
mar...@semantic-mediawiki.org
wrote:
Hi Andrew,
This is a great idea! It would help data consumers to know what to
expect and community members to know what to put in (or where help
Hi Serge,
The short answer to this is that the purpose of aliases in Wikidata is
to help searching for items, and nothing more. Aliases may include
nicknames that are in no way official, and abbreviations that are not
valid if used in another context. Therefore, they seem to be a poor
source
On 11.03.2015 05:40, Tom Morris wrote:
On Tue, Mar 10, 2015 at 6:41 PM, Markus Krötzsch
mar...@semantic-mediawiki.org
wrote:
For example, you can see that Portugal has a lot of lighthouses
while Spain has almost none -- maybe we need to look at our
On 10.03.2015 17:09, Daniel Kinzler wrote:
Am 10.03.2015 um 16:55 schrieb Markus Krötzsch:
Hi Serge,
The short answer to this is that the purpose of aliases in Wikidata is to help
searching for items, and nothing more. Aliases may include nicknames that are in
no way official
Awesome work :-). I love your use of Google Docs as a UI prototyping
tool. We could really use a few more special-purpose querying tools.
Markus
On 09.03.2015 22:03, Navino Evans wrote:
Hi all,
We've been using WDQ queries a lot recently to update timelines in the
Histropedia directory and,
Thanks, Maarten, for the info.
On 08.03.2015 02:03, Jeroen De Dauw wrote:
Hey,
And to answer your second question: Maximum number of values is 50
(500 for bots) (from
https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities)
That seems a bit much to me. Considering an entity can
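Assuming the 50-id limit reported above (500 for bots), a client-side sketch for batching wbgetentities requests might look like this; no network calls are made, it just builds the parameter sets:

```python
# Sketch: split a long list of entity IDs into wbgetentities batches,
# respecting the reported limit of 50 ids per request (500 for bots).

def chunks(ids, limit=50):
    for i in range(0, len(ids), limit):
        yield ids[i:i + limit]

ids = [f"Q{n}" for n in range(1, 121)]  # 120 made-up entity IDs
requests = [{"action": "wbgetentities", "ids": "|".join(batch)}
            for batch in chunks(ids)]
print(len(requests))  # 3 requests for 120 ids
```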
On 07.03.2015 16:39, Federico Leva (Nemo) wrote:
https://www.mediawiki.org/wiki/API:Etiquette is probably the best
reference document here...
Thanks, this answers my first question (no worries about request rate as
long as requests are serialized).
Markus
Hi Amir,
In spite of all due enthusiasm, please evaluate your results (with
humans!) before making automated edits. In fact, I would contradict
Magnus here and say that such an approach would best be suited to
provide meaningful (pre-filtered) *input* to people who play a Wikidata
game,
On 07.03.2015 18:21, Magnus Manske wrote:
Congratulations for this bold step towards the Singularity :-)
Lol. The word neural in the name of the algorithm is infinitely more
attractive and inspiring than something abstract like Support Vector
Machine, isn't it? -- although we know that both
Hi,
Quick question about the wbgetentities API: are there any general rules
that clients should obey when making requests?
* Maximal hit rate?
* How many entities can you actually get in one request? Is this
documented anywhere? Is it possible for a tool to find this number or is
it just
On 27.02.2015 17:47, Lydia Pintscher wrote:
On Thu, Feb 26, 2015 at 2:52 PM, Markus Kroetzsch
markus.kroetz...@tu-dresden.de wrote:
Hi,
It's that time of the year again when I am sending a reminder that we still
have broken JSON in the dump files ;-). As usual, the problem is that empty
maps
Hi Paul,
Re RDF*/SPARQL*: could you send a link? Someone has really made an
effort to find the least googleable terminology here ;-)
Re relying on standards: I think this argument is missing the point. If
you look at what developers in Wikidata are concerned with, it is +90%
interface and
On 12.02.2015 07:17, Gerard Meijssen wrote:
Hoi,
It is pointless to include automated descriptions when they are then
saved in a fixed form. The point of automated descriptions is exactly
that they change as new statements are made. This is one reason why they
are superior to manual
On 20.01.2015 23:27, Hydriz Scholz wrote:
Hi all,
All the Wikidata JSON dumps are available and archived on Archive.org.
See this search query [1] for a full list of them. For Labs users, the
latest 10 dumps are available at /data/scratch/wikidata.
Ah, interesting. I did not know that the
Also, as this seem to be taking longer than expected, I have now also
re-published the Jan 12 and Jan 5 JSON dumps on labs now for your
convenience:
http://tools.wmflabs.org/wikidata-exports/tmp/
Users of Wikidata Toolkit can manually download the file
20150112.json.gz to a subdirectory
Dear Wikidata JSON export team,
There seems to be a syntax error in the 20150112 JSON file that (I
think) has already been there in the previous dump. So I guess it makes
sense to report it.
In line 9374899, around column 2648 of the 20150112 JSON dump, we find
snaks:[]
Of course, {} would
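A possible client-side workaround, assuming the cause is empty maps being serialized as [] rather than {} in the dump: normalize empty lists back to empty dicts for known map-valued keys before processing. The key list here is an assumption for illustration:

```python
import json

# Sketch: repair map-valued fields that arrive as [] instead of {}.
# The set of map-valued keys below is illustrative, not exhaustive.

def normalize_maps(obj, map_keys=("snaks", "qualifiers", "claims")):
    if isinstance(obj, dict):
        return {k: ({} if k in map_keys and v == [] else normalize_maps(v))
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [normalize_maps(x) for x in obj]
    return obj

broken = json.loads('{"references": [{"snaks": []}]}')
print(normalize_maps(broken))  # {'references': [{'snaks': {}}]}
```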
On 20.01.2015 19:20, Jeroen De Dauw wrote:
Hey,
I seemed to recall this being reported earlier, it being discussed, and
a fix being created.
Yes. And in spite of your analysis, the problem seems to have almost
disappeared after that. It used to be all over the dataset, now it is
just in one
Hi (esp. WMF people),
The JSON dumps used to be at
http://dumps.wikimedia.org/other/wikidata/
Now this directory is empty. Any hints at what is going on?
Cheers,
Markus
The issue was fixed in master now. I also added some more INFO-type
messages that will report about the dump files found online and locally.
Cheers,
Markus
On 18.01.2015 14:26, Markus Krötzsch wrote:
On 18.01.2015 10:58, Egon Willighagen wrote:
On Sat, Jan 17, 2015 at 11:04 PM, Markus
On 18.01.2015 10:58, Egon Willighagen wrote:
On Sat, Jan 17, 2015 at 11:04 PM, Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
It is easy to fix this (though I will not fix it tonight, but tomorrow) by
just adjusting the HTML strings we parse for.
Sure! I have subscribed to the bug
Hi Egon,
WDTK 0.3.0 is rather old and we are about to prepare a new release
(there are other issues with 0.3.0: the JSON format has changed since
its release and it won't read the files anyway). Could you check whether
the problem occurs with the current development code on GitHub?
Cheers,
Markus
On 17.01.2015 22:43, Egon Willighagen wrote:
This last test from the cmd line is already with master from GitHub...
Thanks, we will investigate. I created a bug report at
https://github.com/Wikidata/Wikidata-Toolkit/issues/114
Markus
Egon
On 17 Jan 2015 22:40, Markus Krötzsch mar
On 17 Jan 2015 22:50, Markus Krötzsch mar...@semantic-mediawiki.org
wrote:
On 17.01.2015 22:43, Egon Willighagen wrote:
This last test from the cmd line is already with master from
GitHub...
Thanks, we will investigate. I created a bug
On 17.01.2015 23:04, Markus Krötzsch wrote:
...
Question to the MW folks: Is there any machine-readable API to get the
list of available dump files?
I mean: WMF folks, of course -- Markus
+1 to this.
While property re-use is desirable in general, we need to make some
basic distinctions. The realm of geographic relations (a special kind of
part-whole relations) is particularly clear and specific, and it would
be worthwhile to distinguish them from more vague notions of
, apply the same default for invalid values,
and change all of its coordinate data to floating point numbers
(currently using long as fixed precision decimal numbers).
Cheers,
Markus
On 11.01.2015 02:15, Markus Krötzsch wrote:
Hi,
Does anybody know the current documentation of the precision
On 11.01.2015 14:53, Maarten Dammers wrote:
Hi Markus,
Markus Krötzsch wrote on 11-1-2015 at 2:15:
Hi,
Does anybody know the current documentation of the precision of the
globe coordinate datatype? This precision was introduced after the
original datamodel discussions.
No clue, I do know
Hey Lydia,
* We just crossed 14000 active users (over the last month). Thank you all!
Could you clarify?
http://stats.wikimedia.org/wikispecial/EN/TablesWikipediaWIKIDATA.htm
shows around 5k active users (=users with 5 edits) each month, but more
than 18K users with an edit in November
show contributor numbers below 14k for all months up to Nov
2014. Overall, the two counts seem to agree though :-)
Markus
Andrew.
On 11 January 2015 at 22:35, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Sun, Jan 11, 2015 at 11:31 PM, Markus Krötzsch
mar...@semantic-mediawiki.org
Hi,
Does anybody know the current documentation of the precision of the
globe coordinate datatype? This precision was introduced after the
original datamodel discussions.
I used to believe that it was a rough, informal indication of a
precision based on an (easy-to-process but necessarily
On 09.01.2015 19:43, Joe Filceolaire wrote:
There is a proposal to create a 'subproperty of' property but it is on
hold until we can have a property as a datatype
Yes, that's also important. But I was talking about a property that
would be used to establish a subpropertyOf relation between a
On 09.01.2015 17:25, Thad Guidry wrote:
https://www.wikidata.org/wiki/Property:P279 aka the superclass ...
seems to have an equivalent property that refers to
http://www.w3.org/2000/01/rdf-schema#subClassOf ???
Basically yes, this was the informal design intention when the community
On 09.01.2015 00:53, Lydia Pintscher wrote:
...
I like the property-centric approach of wikidata, but is there a notion of
subproperties for contextual refinement? I only found this:
https://www.wikidata.org/wiki/Property:P1647
Yes that is all there is. For a usage example see
On 08.01.2015 22:52, Peter F. Patel-Schneider wrote:
What then is P17 supposed to be used for?
Could, I, for example, use P17 on the address of the Swiss embassy in
Germany and have Switzerland as the value?
"Associated" is generally too weak a word to use in describing properties.
We have to
On 08.01.2015 18:38, Denny Vrandečić wrote:
Yes, CC-BY is great.
Good. I have officially released the article text under this license now:
https://korrekt.org/page/Wikidata:_A_Free_Collaborative_Knowledgebase
Cheers,
Markus
On Thu Jan 08 2015 at 7:01:12 AM Markus Krötzsch
mar...@semantic
On 08.01.2015 15:10, ja...@j1w.xyz wrote:
Prior to viewing Markus Krötzsch's Wikidata page, I was unaware of the
Wikidata: A Free Collaborative Knowledgebase article [1] written by
Denny Vrandečić and Markus Krötzsch. This is a very helpful article
that in my opinion should be featured
On 08.01.2015 21:29, Thad Guidry wrote:
Hi Marcus!
Yes, you and I are on the same page.
I do indeed get this impression ;-)
Yes, I know about the
Property-first view of Wikidata. No quibbles. But there is still an
issue with Assumptions for Country P17 being used for an instance of
Dear Thad,
The second part of your email has good points in it, too. As you say,
one must allow for adjustments in the intended meaning of a property in
real life, and adjusting too much could be dangerous. The method you
suggest (creating a new property and deprecating the old one, rather
On 08.01.2015 20:37, Thad Guidry wrote:
...
Right, Freebase would not stick a Property called Country right on an
instance of a Music Band. We would put Country under the Musical Group
type, and give it a better definition like The nation or territory that
this item originated from.
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit
their own data right now? Entering wrong claims deliberately would maybe
not be the biggest issue here (since it is already in conflict with
other general policies -- we do not want
on should point to Wikidata or to Wikimedia or something else. But
besides this minor point this seems to be a nice way to have COI
declarations in the data (would also be interesting to know which living
people have official Wikimedia accounts).
Cheers,
Markus
On 07.01.2015 15:25, Markus Krötzsch
P.S. I also should declare a COI on this discussion: I am Q18618630. --
Markus
On 07.01.2015 15:25, Markus Krötzsch wrote:
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit
their own data right now? Entering wrong claims deliberately
On 04.11.2014 18:18, Cristian Consonni wrote:
Hi Markus,
2014-11-01 0:29 GMT+01:00 Markus Krötzsch mar...@semantic-mediawiki.org:
Nice. We are running the RDF generation on a shared cloud environment and I
am not sure we can really use a lot of RAM there. Do you have any guess how
much RAM you
On 31.10.2014 14:51, Cristian Consonni wrote:
2014-10-30 22:40 GMT+01:00 Cristian Consonni kikkocrist...@gmail.com:
Ok, now I have managed to add the Wikidata statements dump too.
And I have added a wikidata.hdt combined dump of all of the above.
Nice. We are running the RDF generation on
curious if there is any formal collaboration
(in-place|proposed|possible) between dbpedia and wikidata?
Phil
This message optimized for indexing by NSA PRISM
On Wed, Oct 29, 2014 at 2:34 PM, Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
Martynas,
Denny is right. You could set up
On 30.10.2014 11:49, Cristian Consonni wrote:
2014-10-29 22:59 GMT+01:00 Lydia Pintscher lydia.pintsc...@wikimedia.de:
Help with this would be awesome and totally welcome. The tracking bug
is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143
Speaking of totally awesome (aehm :D):
* see:
Hi Christian,
Awesome :-) Small note: I just got a Bad Gateway when trying
http://data.wikidataldf.com/ but it now seems to work.
It also seems that some of your post answers the question from my
previous email. That sounds as if it is pretty hard to create HDT
exports (not much surprise
Martynas,
Denny is right. You could set up a Virtuoso endpoint based on our RDF
exports. This would be quite nice to have. That's one important reason
why we created the exports, and I really hope we will soon see this
happening. We are dealing here with a very large project, and the
Hi Cristian,
As Daniel said, the live export is currently somewhat limited. However,
we provide RDF dumps that contain all the data:
http://tools.wmflabs.org/wikidata-exports/rdf/
This shows how the final live exports should also look (more or less),
and it could be a blueprint for somebody
Dear all:
Those of you active in research may be interested in submitting to a
recently announced special issue of the Journal of Web Semantics that
explicitly refers to Wikidata in its call:
JWS Special Issue on Knowledge Graphs
Hi,
I fully agree with Thomas and the other replies given here. Let me give
some other views on these topics (partly overlapping with what was said
before). It's important to understand these things to get the subclass
of/instance of thing right -- and it would be extremely useful if we
On 13.09.2014 21:25, Jeremy Baron wrote:
On Sat, Sep 13, 2014 at 7:23 PM, Denny Vrandečić vrande...@gmail.com wrote:
I am not a lawyer, but if I remember correctly, copyright covers expression,
not content. Since the Wikidata data model and its representation in JSON is
rather unique, an ISBN
On 09.09.2014 11:33, Daniel Kinzler wrote:
On 09.09.2014 at 01:40, Denny Vrandečić wrote:
Create a third item in Wikidata, and use that for the language links. Any
Wikipedia that has two separate articles can link to the separate items, any
Wikipedia that has only one article can link to the
On 09.09.2014 11:47, Thomas Douillard wrote:
The composite item seems to be a sort of composite geographical/human
system, like an ecosystem (community of living organisms together with
the nonliving components of their environment)
https://www.wikidata.org/wiki/Q37813 a special kind of
Hi,
I just updated the data for the Wikidata classes and properties browser
[1] -- was about time -- and added some improvements on the way:
(1) Classes and properties are now always ordered by usage (most used
first), which was not possible to do before. Examples:
** properties related
On 08.09.2014 14:27, Jeroen De Dauw wrote:
Hey,
\o/
Where are the source code and issue tracker for this? Probably good if
those were linked from the tool.
True, but it's not quite in our master branch yet: the code is part of
the extended WDTK examples module, see
On 08.09.2014 14:53, Markus Krötzsch wrote:
...
http://tools.wmflabs.org/wikidata-exports/miga/#_item=1204
That first shows population. When then clicking on the link, you see
the data type is quantity, not string.
Yes, I think this is a bug in how we use IRIs and labels for datatypes
function in the code
that I tweaked to adjust this until it seemed right, but there is no
deeper principle behind this.
Very cool...
Thanks :-)
Markus
-Ben
On Mon, Sep 8, 2014 at 9:24 AM, Markus Krötzsch
mar...@semantic-mediawiki.org
wrote
Hi all,
I'd like to share a little tool with you that has been created at a
recent hackathon. Not my work but a nice idea that might inspire others:
AnnoT is a manual text annotation tool where you can, while typing
text into an HTML form, select Wikidata items for some of the words. You
On 22.08.2014 09:10, Lydia Pintscher wrote:
On Fri, Aug 22, 2014 at 3:35 AM, Legoktm legoktm.wikipe...@gmail.com wrote:
* I feel uncomfortable linking to Facebook/Twitter/etc. on the main page.
Fair enough. How about we make
https://www.wikidata.org/wiki/Wikidata:Social_media prettier and
On 19.08.2014 22:23, David Cuenca wrote:
...
Actually I have one last question :) At the moment Gerard is using "is
a list of": value on category item pages, which has the effect of being
the inverse of "instance of". And then he adds further conditions as
qualifiers, see:
On 19.08.2014 16:13, Lydia Pintscher wrote:
On Tue, Aug 19, 2014 at 11:19 AM, David Cuenca dacu...@gmail.com wrote:
Thanks for the stats, Gerard. Two thoughts:
- With so many items without description I wonder why we don't have the
automatic descriptions gadget enabled by default.
I am a bit
Hi all,
We have a lot of statements saying that something is an instance of a
Wikipedia disambiguation page (Q4167410). Unfortunately, this kind of
information says something about a particular Wikipedia article in a
particular language, and often is not true for other languages.
Moreover,
On 19.08.2014 12:20, Gerard Meijssen wrote:
Hoi,
I cannot parse this ..
What Thomas is saying is that classification (putting things into
categories) and querying (finding things based on certain properties)
can be combined in a natural way. In ontology languages like OWL, you
can make
I guess (many categories could be expressed by
queries to improve results; a gentle, community-led transition will be
possible and preferred; categories won't be switched off just because
Wikidata is switched on).
Cheers,
Markus
Cheers,
Micru
On Tue, Aug 19, 2014 at 12:30 PM, Markus
On 12.08.2014 23:49, Andre Engels wrote:
...
In my opinion, manual descriptions should be kept rather than
deleted completely. Instead, automatically generated descriptions should be
provided when there is no manual description, and people should be asked to
override automatically
On 25/07/14 15:28, David Cuenca wrote:
Worried about the harshness that lately has been developing, I have
started a new initiative to counter that by promoting more dialogue,
civility, and friendliness
https://www.wikidata.org/wiki/Wikidata:Shelter
When you are with friends you don't need to