be raised significantly (and
also performance would be better) once we get it to production.
--
Stas Malyshev
smalys...@wikimedia.org
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
://tools.wmflabs.org/wdq2sparql/w2s.php)
- once the labs outage ends of course - to convert between WDQ syntax
and SPARQL. Also check out other links on the WDQS beta page for short
intros about how things are done with SPARQL and examples of which
queries you can run.
--
Stas Malyshev
smalys...@wikimedia.org
yet
both operationally and data-model-wise, so please be aware of this. It
also has timeout limits that, for now, won't let you run queries that
are too complex. But if you want to check it out and see if that fits
your use case you are most welcome.
[1] http://wdqs-beta.wmflabs.org/
--
Stas
Hi!
The links would point to the standard export URLs:
* https://www.wikidata.org/wiki/Special:EntityData/Q423111.json
* https://www.wikidata.org/wiki/Special:EntityData/Q423111.rdf
Speaking about these, shouldn't we also have link rel=alternate for
export formats in the header?
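For illustration, such headers could look something like this (a hypothetical sketch; the exact media type values would need to be checked against what Special:EntityData actually serves):

```html
<link rel="alternate" type="application/json"
      href="https://www.wikidata.org/wiki/Special:EntityData/Q423111.json">
<link rel="alternate" type="application/rdf+xml"
      href="https://www.wikidata.org/wiki/Special:EntityData/Q423111.rdf">
```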
--
Stas
individual
entity actually is - I'm sure you have one that matches your
application, but some other application may have a completely different one.
Generally, this can be solved by better classification I think, but so
far I'm not sure what to base this classification on.
--
Stas Malyshev
smalys...@wikimedia.org
. Some of them are tough to classify or link to
anything, but some are rather obvious.
--
Stas Malyshev
smalys...@wikimedia.org
.
--
Stas Malyshev
smalys...@wikimedia.org
ble. I don't know yet why, I'll look at it further,
probably next week.
So the problem is not "handling cycles" in general, it is handling some
specific data set, and most probably is a consequence of some bug. I'll
report when I have more data about what exactly triggers the bug.
--
Stas
nts or needs the
> same tooling..
We have export formats - CSV, TSV, JSON, etc. (look for "Download
results" link on the right side). If you would like to see any other
format that is not supported now, please ask (best in the form of a
Phabricator ticket, but writing on the feedback page or m
efinitely an
> error. ;-)
Right. So I think we need to mark properties that should not form cycles
with https://www.wikidata.org/wiki/Q18647519 (asymmetric property) and
have constraint-checking scripts/bots detect such cases and alert about them.
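As a sketch of what such a check could look like on the query service (using P279 purely as a stand-in; the real property list would come from the asymmetric-property markings):

```sparql
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?a ?b WHERE {
  ?a wdt:P279 ?b .
  ?b wdt:P279 ?a .
  FILTER(STR(?a) < STR(?b))   # report each 2-cycle only once
} LIMIT 100
```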
--
Stas Malyshev
smalys...@wikimedia.org
o do:
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?book ?isbn WHERE {
  VALUES ?isbn { "2-7071-1620-3" "2-7071-1620-4" "2-7071-1620-5" ... }
  ?book wdt:P957 ?isbn .
}
Unless I misunderstand what you mean here.
--
Stas Malyshev
smalys...@wikimedia.org
whose bot
it is please ask that person to seek advice and guidance (which I would
be glad to provide) on how to make it work properly :)
--
Stas Malyshev
smalys...@wikimedia.org
his query and the browser runs out of memory. Try this query with LIMIT 10
first and see what happens.
As for the bot activities affecting other users, the effect seems to be
negligible, so if this query is slow, it is slow on its own merits :)
--
Stas Malyshev
smalys...@wikimedia.org
any
instances. So my question is - what is the use of modeling something as
a class if there won't be ever any instances of the class modeled?
--
Stas Malyshev
smalys...@wikimedia.org
s: Could you have a look?
Yes, looks like there's a large volume of updates, so the service is
several hours behind, but it seems to be catching up now. What is the
bot doing?
--
Stas Malyshev
smalys...@wikimedia.org
ote on Succu's talk
page to discuss it.
--
Stas Malyshev
smalys...@wikimedia.org
un queries against quantities with units (and I think we do, don't
we?) then we would need to figure out the common basis at least for
common units. I wonder if it's tracked somewhere?
--
Stas Malyshev
smalys...@wikimedia.org
l be no dependency on the current content of Wikidata's Q199.
We already have such dependencies - e.g. in calendars and globes - so it
won't be anything new. But let's see what the Wikidata team thinks about
it :)
--
Stas Malyshev
smalys...@wikimedia.org
ou intended to write a query that asks for 10 records but accidentally
wrote one that returns 10 million, it's much nicer to discover that with
a suitable limit than to wait for the query to time out and then try to
figure out why it happened.
Yes, I realize all this has to go to some page i
people now to main endpoint as it
is much better at handling the load.
--
Stas Malyshev
smalys...@wikimedia.org
inging
it to my attention.
--
Stas Malyshev
smalys...@wikimedia.org
vice to unbound variables. If you drop ?headLabel
then it works. It is a downside of the label service; I'm not sure yet how
to fix it (feel free to submit a Phabricator issue - maybe I or
somebody else will have an idea later).
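For comparison, the pattern that works is one where every *Label variable the service fills corresponds to a variable that is actually bound; a minimal sketch:

```sparql
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q5 .    # ?item is bound, so ?itemLabel resolves
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
} LIMIT 10
```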
--
Stas Malyshev
smalys...@wikimedia.org
bly would be the way to fix it.
--
Stas Malyshev
smalys...@wikimedia.org
it timed out for me (I was using a very common reference
> though ;-). For rarer references, live queries are definitely the better
> approach.
Works for me for Q216047, didn't check others though. For popular
references, the labs one may be too slow, indeed. A faster one is coming
"real
ch should be enough to merge it.
Not sure if those one-off things are worth bothering with, just putting it
out there to consider.
--
Stas Malyshev
smalys...@wikimedia.org
es, in theory it should be fast, so I suspect some kind of bug.
--
Stas Malyshev
smalys...@wikimedia.org
s
then Preferred ones, otherwise Normal ones but never Deprecated ones.
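When you need the rank explicitly rather than the truthy wdt: view, it is exposed on the statement node; a sketch, assuming population (P1082) on some item:

```sparql
SELECT ?value ?rank WHERE {
  wd:Q64 p:P1082 ?st .
  ?st ps:P1082 ?value ;
      wikibase:rank ?rank .   # PreferredRank, NormalRank or DeprecatedRank
}
```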
--
Stas Malyshev
smalys...@wikimedia.org
er.
Should probably be https://query.wikidata.org/bigdata/namespace/wdq/sparql
That works for me.
--
Stas Malyshev
smalys...@wikimedia.org
is omitted from the database for performance
reasons. You could still match statements by URL by converting them to
str() and then using substr() function, but that probably wouldn't help
much since there's a lot of statements so the filtering would not be
very selective.
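A sketch of that str()-based matching (the statement URI prefix here is an assumption; and as noted, the FILTER has to scan candidates rather than use an index):

```sparql
SELECT ?st ?p ?o WHERE {
  ?st ?p ?o .
  FILTER(STRSTARTS(STR(?st), "http://www.wikidata.org/entity/statement/Q42-"))
} LIMIT 10
```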
--
Stas Malyshev
smalys...@wikimedia.org
may still have data and people may have to
use it)
* http://queryr.wmflabs.org/api/items/Q42/data/occupation returns only
one value, shouldn't it return multiple ones?
--
Stas Malyshev
smalys...@wikimedia.org
in Wikidata in about 2 weeks, which seems to be
OK for most tasks. But it is a limitation and as I said, I'll work to
eventually get rid of it.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
h
URL shortener to Wikimedia's own one, which is supposed to be coming up
eventually, but before that, we plan to use existing ones. We might
change a provider if it turns out there is a better one, but we do not
plan to remove the functionality.
--
Stas Malyshev
smalys...@wikimedia.org
alism
> can have far-reaching consequences, it also is much harder to hide if
> the community has the right tools at hand.
> (9) Better data importing infrastructures (some problems mentioned in
> this thread seem to be caused by a multi-stage data import approach that
> only works
e new qualifier that didn't exist before) or remove
one (if we deprecate it, for example). So while the value is not likely
to change, other components of the claim very well might.
--
Stas Malyshev
smalys...@wikimedia.org
Maybe some of the use cases might be better served by TPF server
(https://github.com/blazegraph/BlazegraphBasedTPFServer) - this does not
enable federated queries per se but makes it possible to produce content that can
be queried externally more easily.
--
Stas Malyshev
smalys...@wikimedia.org
n the special case.
>
> Would a patch allowing limited known remote sparql endpoints to
> org.wikidata.query.rdf.blazegraph.WikibaseContextListener
> be possible?
Technically, it is possible and shouldn't be very hard to do. We need to
figure out which endpoints we want to allow.
--
Stas Malyshev
wait until it's deployed, and
maybe gently prod responsible people from time to time :)
--
Stas Malyshev
smalys...@wikimedia.org
which can be created via
automatic means in great volumes, which make URLs shorter and which are
aimed at storing, at least for a while, each URL as an individual data piece.
So, for our purposes w3id would not be very useful.
--
Stas Malyshev
smalys...@wikimedia.org
a.se/ontology-1.0.owl
The properties can be found in the general dump (
https://dumps.wikimedia.org/wikidatawiki/entities/ )
described as outlined here:
https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Properties
There's no separate file, RDF, OWL or otherwise, with only proper
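Assuming the RDF format described on that page, the same property descriptions can also be queried from the SPARQL endpoint instead of a separate file; a sketch:

```sparql
SELECT ?prop ?type WHERE {
  ?prop a wikibase:Property ;
        wikibase:propertyType ?type .
} LIMIT 20
```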
guided fear to somehow benefit
somebody "wrong".
--
Stas Malyshev
smalys...@wikimedia.org
hing" but these
things are very culture-specific (including changing within the culture
with a passage of time) so I'm not sure it would be easy to represent
this in Wikidata.
--
Stas Malyshev
smalys...@wikimedia.org
to be truncated in the middle of a statement, e.g.
It may be some kind of timeout because of the quantity of the data being
sent. How long does such request take?
--
Stas Malyshev
smalys...@wikimedia.org
when it happens.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
nits fully, and we may have to add stuff for
geo-coordinates too - but one can argue it's good enough to be 1.0 and
I'd agree with it. But we need to make a decision on this. Please feel
free to also comment on the task.
--
Stas Malyshev
smalys...@wikimedia.org
e able to do later, and some things we
probably would not be able to offer with any adequate quality.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> 5.44s empty result
> 8.60s 2090 triples
> 5.44s empty result
> 22.70s 27352 triples
That looks weirdly random. I'll check out what is going on there.
--
Stas Malyshev
smalys...@wikimedia.org
e type stays string. So depending on
> what they use they might need to be adapted.
RDF export seems to be fine, except that we need to update OWL and docs
for new types, I'll check pywikibot a bit later.
--
Stas Malyshev
smalys...@wikimedia.org
llions of data items,
which would probably time out anyway. Add something like "LIMIT 10" to it.
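For example (hypothetical shape of such a query):

```sparql
SELECT ?item WHERE {
  ?item wdt:P31 wd:Q5 .   # millions of results without a limit
} LIMIT 10
```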
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
ite about that additionally
before we do it; depending on how development goes, it could be the end of
this month or sometime next month.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
ecause it only checks very basic queries and those are still under
timeout.
--
Stas Malyshev
smalys...@wikimedia.org
described here:
https://wiki.blazegraph.com/wiki/index.php/RDF_GAS_API and it's a
service implementing basic graph algorithms such as BFS, shortest path,
PageRank, etc. I personally didn't use it too much but it may be very
useful for tasks which are naturally expressed as graph traversals.
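A sketch of what calling it looks like, based on the Blazegraph documentation (a BFS over an assumed link type; parameter names should be double-checked against that page):

```sparql
PREFIX gas: <http://www.bigdata.com/rdf/gas#>
SELECT ?out ?depth WHERE {
  SERVICE gas:service {
    gas:program gas:gasClass "com.bigdata.rdf.graph.analytics.BFS" ;
                gas:in wd:Q5 ;            # start node
                gas:linkType wdt:P279 ;   # traverse only this predicate
                gas:out ?out ;            # reached node
                gas:out1 ?depth ;         # BFS depth
                gas:maxIterations 3 .
  }
}
```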
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> is it me or is the SPARQL service very slow right now?
I've upgraded it yesterday to Blazegraph 2.0 and it looks like there was
some glitch there. I've restarted it and now it seems to be fine. I'll
be watching it and see if it repeats.
--
Stas Malyshev
smalys...@wikimedia.org
have two so the other one would just take over).
However, if something bad happens, there might be a brief disruption of
service.
I'll send a message when it's done, and if you notice anything weird
after the upgrade, please ping me or submit an issue in Phabricator.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
his index. So there's still work to do, see
https://phabricator.wikimedia.org/T123565
There are some interesting challenges due to the fact that our
coordinates include globes, which is not a very common thing, but we're
working on supporting it.
--
Stas Malyshev
smalys...@wikimedia.org
m Wikidata P840 and
> presently colored according to P136 using the Leaflet marker. Text is
> from the P1683 qualifier under P840:
>
> http://fnielsen.github.io/littar/
Congratulations, very nice project!
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> Wikidata SPARQL aficionados,
>
> This SPARQL query worked for several weeks, but quit working a few days
> ago:
No idea what happened, I'll look into it.
--
Stas Malyshev
smalys...@wikimedia.org
or now and will investigate
why it broke later.
Thanks for the report!
--
Stas Malyshev
smalys...@wikimedia.org
actual content up-to-date, not the cached version.
I also created a poll: https://phabricator.wikimedia.org/V8
so please feel free to vote for your favorite option.
OK, this letter is long enough already so I'll stop here and wait to
hear what everybody's thinking.
Thanks in advance,
--
Stas Malyshev
I want to hear
opinions on this :)
--
Stas Malyshev
smalys...@wikimedia.org
50.000 to 50.001, it seems less critical
> somehow.
That sounds like a good idea, we'll need to check if Varnish allows us
to do tricks like this...
--
Stas Malyshev
smalys...@wikimedia.org
y even there would be a good
idea. Unfortunately, in the internet environment of today there is no
lack of players that would want to abuse such a thing for nefarious purposes.
We will keep looking for a solution to this, but so far we haven't found one.
Thanks,
--
Stas
en go to
history and restore pre-merge version.
--
Stas Malyshev
smalys...@wikimedia.org
ess
> processes related to authority data often take weeks or month ...)
I think we'll always have some way to run un-cached query. The question
is only how easy it would be - i.e. whether you would need to add a
parameter, click a checkbox, etc.
--
Stas Malyshev
smalys...@wikimedia.org
o robots.txt is needed there.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> you may want to check out the Linked Data Fragment server in Blazegraph:
> https://github.com/blazegraph/BlazegraphBasedTPFServer
Thanks, I will check it out!
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> Couldn't you use P460 when there is doubt?
>
> https://www.wikidata.org/wiki/Property:P460
P460's type is Item, which means it is a relation between two Wikidata
items. An external ID is a relation between a Wikidata item and something
outside Wikidata.
--
Stas Malyshev
smalys...@wikimedia.org
70k
> resources.
Well, the SPARQL data store is not supposed to contain any deleted
entries... But it looks like there's some bug there. If you give me the
list of the "bad" entries, it's easy to update them. Considerably harder
is finding *why* they weren't updated in the first place. I
Hi!
> Is there a property for WordnetId?
The list of properties is here:
https://www.wikidata.org/wiki/Wikidata:List_of_properties
I don't see anything for WordNet there.
--
Stas Malyshev
smalys...@wikimedia.org
ly, no. Longer answer in
https://phabricator.wikimedia.org/T128947#2104017
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
FYI to those it may concern - we plan to institute regular WDQS
deployments on Mondays for both code and GUI. Not much is going to change,
except that regular deployments will happen at a predictable time instead of
"whenever I feel like it" :) Does not preclude emergency deployments in
case something
could have another instance that is auto-deployed from the deployment
repo daily by scripts, if necessary.
--
Stas Malyshev
smalys...@wikimedia.org
labels) but you gain a lot of speed.
In any case, I'll raise this issue with Blazegraph, and it may also be
worth submitting a Phabricator issue about it.
--
Stas Malyshev
smalys...@wikimedia.org
"good to convert"? Should I run it through some checklist first? Should
I ask somebody?
What are the rules for "disputed" - is some process for review planned?
I think a more definite statement would help, especially for people
willing to contribute.
even seek to participate in the discussion (though I
don't think WMF employment would disqualify me from contributing in a
volunteer capacity, given my affiliations - as they are - are clearly
stated) - but only to know the results so I could contribute in an editor
capacity, following whatever rules are there.
's pretty clear from the context but if
needed, I will clarify.
--
Stas Malyshev
smalys...@wikimedia.org
re details in: https://phabricator.wikimedia.org/T130049
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
box
- Support for different globes
What is missing but will be added soon?
- Distances as search output and as separate function
- Documentation
- You tell me
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
why use P31 or P279 in a
reference?
--
Stas Malyshev
smalys...@wikimedia.org
downloading a dump, then filtering the entities by
> claim, but are there better/less resource-intensive ways?
Probably not currently without some outside tools. When we get LDF
support, then that may be the way :)
--
Stas Malyshev
smalys...@wikimedia.org
be patient until then.
I apologize for the inconvenience caused, and will continue to research
the cause of the missing data and then fix it. I'll update the ticket
when we have new info.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> Very nice! From where is the map-data? Open street map?
Yes, see: https://www.mediawiki.org/wiki/Maps
--
Stas Malyshev
smalys...@wikimedia.org
are not
currently supported (maybe in the future).
Please tell me if you notice anything wrong or have any
comments/suggestions.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
rg/T134238 'Query service fails with
> "Too many open files"' should be resolved as of 9 May. A new one
No, that's a different one, and it is fixed.
--
Stas Malyshev
smalys...@wikimedia.org
; from
> WDTK dump files site [1], there is <http://wikidata.org/ontology#> used
> everywhere. So... is the WB ontology somehow translated to WD ontology?
Hmm, not sure about that one, Markus should know more about it.
--
Stas Malyshev
smalys...@wikimedia.org
E {
?st ?pred wd:Q2354820 .
?p wikibase:qualifier ?pred .
} LIMIT 10
but no result is produced, so I assume Q2354820 is not used as a
qualifier value (unless I'm missing something).
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> Hoi,
> Thanks Stas :) This is one example where Q2354820 is used.
>
> http://tools.wmflabs.org/reasonator/?=24013782
OK, this looks like a new one and the query now returns it.
--
Stas Malyshev
smalys...@wikimedia.org
quired in some
situations) - that's why it was not announced yet.
But for many applications geof:distance already works. The rest will be
implemented and announced soon, probably sometime next week or the week
after that.
--
Stas Malyshev
smalys...@wikimedia.org
[4] https://phabricator.wikimedia.org/T123565
--
Stas Malyshev
smalys...@wikimedia.org
y ... (I recently had to IP-block an RDF crawler
> from one of my sites after it had ignored robots.txt completely).
We don't have any blocks or throttle mechanisms right now. But if we see
somebody making serious negative impact on the service, we may have to
change that.
--
Stas Malyshev
eries - then by the time the error happens
part of the response has been sent already, so there's no way to set an
error HTTP code etc. Thus such responses are not distinguishable from
valid replies, at least not without looking into the content.
--
Stas Malyshev
https://www.wikidata.org/wiki/Q1787424
https://www.wikidata.org/wiki/Q166542
Some days I have a feeling those should be P460... ;)
--
Stas Malyshev
smalys...@wikimedia.org
B5%D0%B4%D0%B8%D0%B8:%D0%9E%D1%84%D0%BE%D1%80%D0%BC%D0%BB%D0%B5%D0%BD%D0%B8%D0%B5_%D1%81%D1%82%D0%B0%D1%82%D0%B5%D0%B9#.D0.A1.D0.BA.D0.BB.D0.BE.D0.BD.D0.B5.D0.BD.D0.B8.D0.B5_.D0.B5.D0.B4.D0.B8.D0.BD.D0.B8.D1.86_.D0.B8.D0.B7.D0.BC.D0.B5.D1.80.D0.B5.D0.BD.D0.B8.D1.8F
[3] https://phabricator.wikimedia
re now, I could probably make a
> quite demo patch to show how it can be done.
I don't think we can put grammar rules in labels, that's why I proposed
a special property as an option.
--
Stas Malyshev
smalys...@wikimedia.org
would be
comprehensible :) The gist of it is that Russian, like many other
inflected languages, changes nouns by grammatical case, and uses
different cases for different numbers of items (i.e. 1, 2, and 5 will
use three different cases). Labels are of course in singular nominative
case, which is wrong for ma
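To make the case rule concrete, here is a minimal sketch (my own illustration, not anything Wikidata ships) of how the form is chosen for counted nouns in Russian:

```python
def russian_plural_form(n: int) -> str:
    """Pick the grammatical form for a counted Russian noun.

    1, 21, 31...      -> nominative singular ("1 metr")
    2-4, 22-24...     -> genitive singular   ("2 metra")
    0, 5-20, 25-30... -> genitive plural     ("5 metrov")
    """
    n = abs(n)
    if n % 10 == 1 and n % 100 != 11:
        return "nominative singular"
    if 2 <= n % 10 <= 4 and not 12 <= n % 100 <= 14:
        return "genitive singular"
    return "genitive plural"
```

So a label stored only in nominative singular gives the right form for 1 item but the wrong one for 2 or 5.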
ike me
would know what the community consensus has arrived to.
--
Stas Malyshev
smalys...@wikimedia.org
ipedia administrative data and should be there.
External service can combine data from these sources but I don't think
it falls under WDQS tasks.
--
Stas Malyshev
smalys...@wikimedia.org
re of course other ways to achieve the same, so
I'll look into various options, but so far page props doesn't sound like
that bad an idea, to me.
--
Stas Malyshev
smalys...@wikimedia.org
t; plurals for free.
That may work, but downside of this is that it is linked to unit ID - so
if we wanted to use it for, say, Commons data, we'd have to somehow link
between "metre" on Wikidata and "metre" on Commons.
--
Stas Malyshev
smalys...@wikimedia.org
preference.
What it doesn't currently do is to verify that the preferred one refers
to the latest date. It probably shouldn't fix these cases (because there
may be a valid reason why the latest is not the best, e.g. some population
estimates are more precise than others) but it can alert about it.
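An alerting query along those lines could look like this sketch, assuming population (P1082) with a point-in-time (P585) qualifier:

```sparql
SELECT ?item ?preferredDate ?laterDate WHERE {
  ?item p:P1082 ?pref .
  ?pref wikibase:rank wikibase:PreferredRank ;
        pq:P585 ?preferredDate .
  ?item p:P1082 ?other .
  ?other pq:P585 ?laterDate .
  FILTER(?laterDate > ?preferredDate)
} LIMIT 10
```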
s in the dump from those
> that can. Consuming tools can then continue to function without problems
> for the former kind of change.
As I said, format versioning. Maybe even semver or some suitable
modification of it. RDF exports BTW alread