Hi!
> How about adding the RDF to query.wikidata.org so we can get a current
> list?
We could probably load the RDF we have now into Blazegraph relatively
easily. Updating may be a bit tricky (should we delete historical
items?) but it's possible to figure out. I'll look into it.
--
suggestions, comments or experience any
problems with it.
--
Stas Malyshev
smalys...@wikimedia.org
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
> could
True, the dumps run weekly. A "more or less" situation can arise only if
one of the dumps fails (either due to a bug or some sort of external
force majeure).
--
Stas Malyshev
smalys...@wikimedia.org
some
queries to get stuck or be rejected, see
https://wikitech.wikimedia.org/wiki/Incident_documentation/20171018-wdqs
and https://wikitech.wikimedia.org/wiki/Incident_documentation/20171130-wdqs
These do not take the whole service down, so I am not sure how they
qualify uptime-wise.
--
Stas Malyshev
ting and helping to make the Query Service
better!
--
Stas Malyshev
smalys...@wikimedia.org
documented, what exactly the process is, and what you mean by
"upload the Wikidata properties as OWL"? More detailed information would
be hugely helpful.
Thanks in advance,
--
Stas Malyshev
smalys...@wikimedia.org
Having said that, I am curious - what exactly are you doing with this
data set? Why do you need a list of all humans - how is this list going
to be used? Knowing that may help us devise a better specialized
strategy for achieving the same goal.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> i want to inform you that Repology has your "repository" of software
> versions included and can list problems or outdated versions that way.
What does this list actually include? Is this the list of software and
versions present in Wikidata as items?
--
Stas Malyshev
can look into it.
--
Stas Malyshev
smalys...@wikimedia.org
t have more of
this. Thanks for bringing it to my attention!
--
Stas Malyshev
smalys...@wikimedia.org
RIs, probably more if they contain a ton of
text data.
--
Stas Malyshev
smalys...@wikimedia.org
/ontology-beta#geoPrecision>
> "1.0E-6"^^<http://www.w3.org/2001/XMLSchema#decimal> .
> ```
Could you submit a Phabricator task (phabricator.wikimedia.org) about
this? If it's against the standard, it certainly should not be encoded
area, not yet sure how
to do it though.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> The top 1000
> is:
> https://docs.google.com/spreadsheets/d/1E58W_t_o6vTNUAx_TG3ifW6-eZE4KJ2VGEaBX_74YkY/edit?usp=sharing
This one is pretty interesting, how do I extract this data? It may be
useful independently of what we're discussing here.
--
Stas Malyshev
smalys...@wikimedia.org
Indexing "cast member" would
get you a step closer, but only a tiny step, and there are a number of
other steps to take before that can work.
--
Stas Malyshev
smalys...@wikimedia.org
ad to hear thoughts on the matter.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> * I would really like dates (mainly, born/died), especially if they work
> for "greater units", that is, I search for a year and get an item back,
> even though the statament is month- or day-precise
What would be the use case for this?
--
Stas Malyshev
smalys...@wikimedia.org
tain random
> text (esp. natural language) since they are prone to be unique and
> impossible to search.
Yes, we definitely should not do that. I tried to exclude such
properties, but if you notice more of them, let's add them to the
exclusion config.
--
Stas Malyshev
this: https://phabricator.wikimedia.org/T163642 ?
This is the task to make strings searchable _without_ the haswbstatement
keyword.
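For illustration, here is a minimal sketch (in Python; the function name and the P31=Q5 example pair are mine) of composing a full-text search request that uses the haswbstatement keyword:

```python
from urllib.parse import urlencode

# Sketch: build a wikidata.org search API URL whose query uses the
# haswbstatement keyword (matches items having a given statement).
# The function name and the P31=Q5 example are illustrative only.
def haswbstatement_search_url(prop, value, limit=10):
    params = {
        "action": "query",
        "list": "search",
        "srsearch": f"haswbstatement:{prop}={value}",
        "srlimit": limit,
        "format": "json",
    }
    return "https://www.wikidata.org/w/api.php?" + urlencode(params)

# Items that are instances of human (P31=Q5)
url = haswbstatement_search_url("P31", "Q5")
```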
--
Stas Malyshev
smalys...@wikimedia.org
etter to
search engines using well-known metadata vocabularies, I think it would
be a very welcome effort.
--
Stas Malyshev
smalys...@wikimedia.org
der "Wikidata" + "Discovery-Search".
There are multiple tasks for it, but if you want to add any, please feel
welcome to browse and add.
--
Stas Malyshev
smalys...@wikimedia.org
If you think it's a dataset others may want to reuse, tabular data on
Commons may be a venue: https://www.mediawiki.org/wiki/Help:Tabular_Data
--
Stas Malyshev
smalys...@wikimedia.org
rwise (ccing Markus in case he knows more on the topic).
--
Stas Malyshev
smalys...@wikimedia.org
ery.wikidata.org/sparql.
Thank you! I will take care of it in the next update (next week).
--
Stas Malyshev
smalys...@wikimedia.org
ed
> can't be managed, at least with Wikimedia current ressources.
It's not Wikimedia that will be shouldering the burden, it's every user
of Wikimedia data sets.
--
Stas Malyshev
smalys...@wikimedia.org
we accepted any that have anything stricter than that)
Category namespace - since it comes from Wikipedias, it's CC-BY-SA I
assume.
--
Stas Malyshev
smalys...@wikimedia.org
t subject to any license. A specific arrangement (collection) of facts
can be copyrighted and licensed, though, and this specific one is
Wikidata, which is licensed under CC0.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> No. There is no such thing as "category namespace" in Wikidata. There
You are correct. I was talking about category namespace in Wikidata
Query Service. It is documented here:
https://www.mediawiki.org/wiki/Wikidata_query_service/Categories
--
Stas Malyshev
smalys...@wikimedia.org
rco_Rubio
Not sure however if it's structured or available in API form. It also
has state level politicians, e.g.: https://ballotpedia.org/Bill_Monning
- but it seems it's even harder to parse there.
--
Stas Malyshev
smalys...@wikimedia.org
class, so we'd have to constantly
update the hierarchy. But this is more of a technical challenge, which
will come after we have some solution for the above.
--
Stas Malyshev
smalys...@wikimedia.org
h for all of them, so you may never get
a chance to find the basset horn. Also, of course, querying big
downstream hierarchies takes time too, which means a performance hit.
--
Stas Malyshev
smalys...@wikimedia.org
at would be a bad thing. But I don't think anything we are
discussing here would lead to that happening.
--
Stas Malyshev
smalys...@wikimedia.org
omment in the task) if you have any questions or
concerns.
[1] https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Header
--
Stas Malyshev
smalys...@wikimedia.org
essing them would
not be very useful.
--
Stas Malyshev
smalys...@wikimedia.org
are
experienced in ontology creation and maintenance.
> to be chosen that then need to be applied consistently? Is this
> something the community can do, or is some more active direction going
> to need to be applied?
I think this is very much something that the
o, as I typed this email.>
> and now it does appear.
Yes, this is how it should work. There were no changes lately, AFAIK,
but it is possible that you hit some glitch or maintenance on your
previous search. If that happens again, please tell me when and with
which search string/URL, and I'll try
Hi!
> When will stemming be supported in Search ?
In general, I think it already should be, for fields and contexts that
use appropriate analyzers, but I'd like to hear more details:
1. Which search?
2. What are you looking for, i.e. the search string?
3. What do you expect to find?
--
Stas Malyshev
ot
really enforce any of the rules with regard to classes, property
domain/ranges, etc. and have frequent and numerous exceptions to those.
--
Stas Malyshev
smalys...@wikimedia.org
g to the users as reusing the Q.
No, that would be confusing. If OSM wants its own data type, because a Q
item does not fit - e.g. OSM doesn't want descriptions and sitelinks -
then it should use a separate letter, like MediaInfo uses M. But using L
would not be smart, since then this data would not integrate well with
Hi!
> Also given that it uses oresscores, we recently fixed some performance
> issues caused by it. Do you still have issues with it?
Yes, the issues I have listed still happen. My API calls do not use
ORES. E.g. see:
https://logstash.wikimedia.org/goto/63db4ce68fb5da3cdc7828150de10c59
--
rectly there.
--
Stas Malyshev
smalys...@wikimedia.org
he
same result. Which depends on the query. So I'd suggest providing some
information about the queries and the specific issues you're having, and
then we can see if it's possible to improve it.
--
Stas Malyshev
smalys...@wikimedia.org
but
> reads are unaffected as far as we can tell.
The incident report for this issue is here:
https://wikitech.wikimedia.org/wiki/Incident_documentation/20190110-WDQS
It will be updated if we have any new developments or new information.
As of now, all servers are working normally.
--
Stas Malyshev
filter(lang(?label)="fr")
> }
Could you describe in a bit more detail what you're trying to do here?
Doing two service calls is not a pattern one would commonly use... It
can be slow if the query optimizer misunderstands such a query, too. I
feel I'd have a bit more insight if I understoo
it, I see it
every time I run it on Labs (where the Kafka stream is not available).
So I think the issues with the RC API on Wikidata are still alive.
There's also a parallel issue of
https://phabricator.wikimedia.org/T207718 with RDF fetching, which also
still happens.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
There is a discussion going on in the W3C SPARQL 1.2 Community Group
about improvements to the SPARQL language. It may be interesting to
people who are using SPARQL and those who have ideas on how to improve it.
-- Forwarded message -
From: *Andy Seaborne*
that having a
> tutorial that explains and teaches the Query Service will help expand
> Wikidata to new audiences worldwide.
This sounds great, thank you!
--
Stas Malyshev
smalys...@wikimedia.org
re not included.
Could you provide specific properties and preferably also some Q-ids for
which you expected to find direct-normalized props but didn't?
--
Stas Malyshev
smalys...@wikimedia.org
. So the request above applies to the search parts of the
WikibaseLexeme code also.
If you have any questions/comments, please feel free to ask me, on the
lists or on the IRC.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> Yes, the api is
> at https://www.wikidata.org/w/api.php?action=query&list=search&srsearch=Bush
There's also
https://www.wikidata.org/w/api.php?action=wbsearchentities&search=Bush&language=en&format=json
This is what completion search in Wikidata is using.
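As a sketch (Python; the function names are mine), the two endpoints differ only in their parameters. These builders show the shape of each request without performing it:

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

# Full-text search over items (list=search), as in the first URL above.
def fulltext_search_url(term):
    return API + "?" + urlencode(
        {"action": "query", "list": "search", "srsearch": term, "format": "json"})

# Prefix/completion entity search (wbsearchentities), as in the second.
def entity_search_url(term, language="en"):
    return API + "?" + urlencode(
        {"action": "wbsearchentities", "search": term,
         "language": language, "format": "json"})

full_url = fulltext_search_url("Bush")
entity_url = entity_search_url("Bush")
```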
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> and if I enable any of the FILTER lines, it returns 0 results.
> What changed / Why ?
Thanks for reporting, I'll check into it.
--
Stas Malyshev
smalys...@wikimedia.org
We use a separate data store for search (ElasticSearch) and will
probably have to have a separate one for queries, whatever the mechanism
would be.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
interested if somebody took it upon themselves to model Wikidata in
terms of ArangoDB documents, loaded the whole data set, and saw what the
resulting performance would be, I am not sure it would be wise for us to
invest our team's - currently very limited - resources into that.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
Maybe getting
some numbers might be useful.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
According to
https://meta.wikimedia.org/wiki/User-Agent_policy, all clients should
identify themselves with a valid user agent. We've started enforcing
this recently, so maybe this tool has this issue. If not, please provide
the data above.
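A minimal sketch of what the policy asks for, using Python's standard library; the tool name and contact details below are placeholders, not real values:

```python
from urllib.request import Request

# A descriptive User-Agent per the policy above: tool name, version,
# and contact information. All values here are placeholders.
USER_AGENT = "MyWikidataBot/0.1 (https://example.org/mybot; mybot@example.org)"

def api_request(url):
    # Attach the User-Agent header to every request we make.
    return Request(url, headers={"User-Agent": USER_AGENT})

req = api_request("https://query.wikidata.org/sparql?query=ASK%20%7B%7D")
```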
--
Stas Malyshev
smalys...@wikimedia.org
efer to help than to ban). Otherwise, we'd be forced to put
more limitations on it that will affect everyone.
--
Stas Malyshev
smalys...@wikimedia.org
e queries to ensure they are fast and produce proper
results on the setup you propose, then it can be done. Good luck!
--
Stas Malyshev
smalys...@wikimedia.org
backends - KV, document, relational, column store,
whatever you have. The tricky part starts when you need to run millions
of queries on a 10B-triple database. If your backend is not optimal for
that task, it's not going to perform.
--
Stas Malyshev
smalys...@wikimedia.org
st majority of the Wikidata Query Service case.
Would be interesting to see if we can apply anything from the article.
Thanks for the link!
--
Stas Malyshev
smalys...@wikimedia.org
hat.
Replication could certainly be useful, I think, if it's faster to update
a single server and then replicate than to simultaneously update all
servers (that's what is happening now).
--
Stas Malyshev
smalys...@wikimedia.org
ies/latest-all.nt.gz would
still be pointing to the right files, and if all you care about is
downloading the latest dump, using these links is always recommended.
We will send another message once the change has been implemented and
deployed.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
te further. You can also watch
https://phabricator.wikimedia.org/T225996 for final resolution of this.
[1] https://phabricator.wikimedia.org/T225996
[2] https://www.w3.org/TR/rdf11-concepts/#section-Graph-Literal
--
Stas Malyshev
smalys...@wikimedia.org
rowse/JENA-1077
I will adjust the code in Blazegraph accordingly, so WDQS will comply
with this practice (i.e. the result format will be as it was before).
This will be implemented in the coming days.
Sorry again for the disruption.
--
Stas Malyshev
smalys...@wikimedia.org
database and be hosted on the same hardware. This is
especially important for services like Wikidata Query Service where all
data (at least currently) occupies a shared space and can not be easily
separated.
Any thoughts on this?
--
Stas Malyshev
smalys...@wikimedia.org
ut how to do.
--
Stas Malyshev
smalys...@wikimedia.org
rmation in smaller chunks using LIMIT/OFFSET clauses.
Note that this doesn't speed up the query itself.
4. Use the LDF server:
https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#Linked_Data_Fragments_endpoint
Depending on what data you need, one of these would probably be an option.
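The LIMIT/OFFSET suggestion can be sketched as follows (Python; the helper name and the example query are mine). Each chunk re-runs the full query, so this spreads the transfer out but does not make the query itself cheaper; an ORDER BY keeps pages stable across runs:

```python
# Generate a sequence of paged SPARQL queries using LIMIT/OFFSET.
def chunked_queries(base_query, chunk_size, chunks):
    for i in range(chunks):
        yield (f"{base_query}\nORDER BY ?item"
               f"\nLIMIT {chunk_size} OFFSET {i * chunk_size}")

base = "SELECT ?item WHERE { ?item wdt:P31 wd:Q5 }"
queries = list(chunked_queries(base, 1000, 3))
```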
--
Stas Malyshev
CCing Ariel to take a look. Probably needs to be
re-run or we can just wait for the next one.
--
Stas Malyshev
smalys...@wikimedia.org
media.org/wikidatawiki/entities/20190617/)
But looking at it now, I see wikidata-20190617-all.json.gz is
comparable with last week's, so it looks like it's fine now?
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
On 6/25/19 11:17 PM, Ariel Glenn WMF wrote:
> I think the issue is with the 0624 json dumps, which do seem a lot
> smaller than previous weeks' runs.
Ah, true, I didn't realize that. I think this may be because of that
dumpJson.php issue, which is now fixed. Maybe rerun the dump?
--
it (logs suggest there's virtually no usage
now, but that can of course change), please use the endpoint above.
[1]
https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#DCAT-AP
--
Stas Malyshev
smalys...@wikimedia.org
or plan to use it and what for. Please either answer here
or even better in the task[2] on Phabricator.
[1]
https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#DCAT-AP
[2] https://phabricator.wikimedia.org/T228297
--
Stas Malyshev
smalys...@wikimedia.org
Hello all!
Here is (at last!) an update on what we are doing to protect the
stability of Wikidata Query Service.
For 4 years we have been offering to Wikidata users the Query Service, a
powerful tool that allows anyone to query the content of Wikidata,
without any identification needed. This
it of course are
welcome.
--
Stas Malyshev
smalys...@wikimedia.org
one of them every time is
unfeasible. Not to mention this JSON is not an accurate representation
of the RDF data model. So I don't think it is worth spending time in
this direction... I just don't see how any query engine could work with
that storage.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
The best place for this kind of question would be the wikidata-tech mailing
list
wikidata-tech@lists.wikimedia.org. It would probably be a good idea if you
(and whoever else deals with wikidata on the technical level) were subscribed
there. It's pretty low traffic.
Thanks, I've sent
-scale resolution. I'm not sure what coordinates can even be
known with such resolution, let alone be needed for anything practical.
[1]
https://en.wikipedia.org/wiki/Wikipedia:WikiProject_Geographical_coordinates#Precision_guidelines
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
me. Other
comments/thoughts/suggestions also welcome.
--
Stas Malyshev
smalys...@wikimedia.org
___
Wikidata-tech mailing list
Wikidata-tech@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
in
SPARQL which may take some time, since these languages are a bit
different (esp. in tree traversal aspects). But I think WDQ language
support should be on the agenda, I'm just not sure it should be the
first item.
--
Stas Malyshev
smalys...@wikimedia.org
soon and see how it
behaves. Some of the WDQ features - such as wide branching with OR
options - may be quite inefficient in SPARQL, but we can generate it
anyway. I'll update when I have something interesting (probably next week).
--
Stas Malyshev
smalys...@wikimedia.org
name may have completely different semantics. In Wikidata,
however, properties are generic, so I wonder if it would be possible to
keep context. DBpedia obviously does have context, but I'm not sure
where it would be in Wikidata.
--
Stas Malyshev
smalys...@wikimedia.org
once ranks are used more widely to
mark it.
I think this is solved pretty nicely by the concept of preferred ranks
and truthy statements. So people should start using ranks to separate
current data from historical data.
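To illustrate the distinction (the property P39 "position held" and item Q76 here are just examples, not from the discussion): truthy wdt: triples carry only best-rank values, while the full p:/ps: statement view exposes every rank.

```python
# Truthy view: only best-rank (e.g. preferred) values appear.
TRUTHY_QUERY = """
SELECT ?office WHERE {
  wd:Q76 wdt:P39 ?office .
}
"""

# Full statement view: all statements, with their rank exposed.
FULL_QUERY = """
SELECT ?office ?rank WHERE {
  wd:Q76 p:P39 ?statement .
  ?statement ps:P39 ?office ;
             wikibase:rank ?rank .   # Preferred / Normal / Deprecated
}
"""
```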
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> What about people who were born in the 18th-century? We know they are
> dead, but their death is not recorded and we only know when they were
> last active. How do you set that end date?
That's what somevalue/unknown is for.
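In the WDQS RDF mapping, such "unknown value" (somevalue) snaks surface as blank nodes, so a query can detect them with standard SPARQL isBlank(). A sketch, using P570 (date of death) as the example property:

```python
# Find people whose date of death (P570) is recorded as "unknown value".
# In the WDQS RDF mapping somevalue snaks appear as blank nodes,
# so isBlank() detects them.
UNKNOWN_DEATH_QUERY = """
SELECT ?person WHERE {
  ?person wdt:P570 ?dateOfDeath .
  FILTER(isBlank(?dateOfDeath))
}
"""
```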
--
Stas Malyshev
smalys...@wikimedia.org
there might be cases where it is useful, still not on references. But
maybe instead of forbidding it as such, we could have a guideline on
what is considered the Right Thing, and then if there's an exception,
the editors can use their own judgement.
--
Stas Malyshev
smalys...@wikimedia.org
it does make sense.
--
Stas Malyshev
smalys...@wikimedia.org
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Hi!
> Hi, there are items about the Wikibase data model in Wikidata (created
> by me, but not only).
> If I understand correctly, they could be cited in the semantic web as
> https://www.wikidata.org/entity/Q19798647
What would be the purpose of these items? I.e., what is the intended usage?
--
Stas Malyshev
://www.wikidata.org/wiki/Q1187613's is Уотсон. And I have no idea
what is the correct romanization of
https://www.wikidata.org/wiki/Q4105300's name.
--
Stas Malyshev
smalys...@wikimedia.org
.
https://www.wikidata.org/wiki/Q16354757 is not Wikibase Data model :)
Which reminds me of https://www.wikidata.org/wiki/Q1061035 ...
--
Stas Malyshev
smalys...@wikimedia.org
people, neither experienced nor new.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
use ISO standards. One of the reasons is their impartiality (in the
sense that they are not tied to one specific language).
Wikidata labels, however, *are* related to a specific language. Every
label is associated with a language.
--
Stas Malyshev
smalys...@wikimedia.org
some missing parts (see https://phabricator.wikimedia.org/T50143),
though dumpJson should be safe to use if you want JSON. We're actively
working on the RDF part so it will be ready for use soon too.
--
Stas Malyshev
smalys...@wikimedia.org
structure would produce?
--
Stas Malyshev
smalys...@wikimedia.org
WMF nor WMDE maintains. It's one thing to
make a new service (which btw I think is an awesome idea - just wanted
to say that so it would be clear that I am not criticizing the whole
idea, just this aspect of it) and another to add subtle changes to an
existing one.
--
Stas Malyshev
smalys
server with only one database and only one
table - why not use the separation that already comes for free with
another instance? We can still reuse any code we like.
--
Stas Malyshev
smalys...@wikimedia.org
https://www.wikidata.org/wiki/Property:P1207
Q42 is a treasure trove for these :) Will see if there is more.
--
Stas Malyshev
smalys...@wikimedia.org
of encyclopaedic knowledge.
It's indeed important, so I think the idea is not to diminish their
importance somehow, but to give them a separate space, which may lead to
improved workflows for both data properties and external ID properties.
--
Stas Malyshev
smalys...@wikimedia.org
Hi!
> I agree this would be a nice idea. I believe it would be relatively
> easy to do, if only properties could have properties of their own.
AFAIK they can, e.g. https://www.wikidata.org/wiki/Property:P35
--
Stas Malyshev
smalys...@wikimedia.org
https://www.wikidata.org/wiki/Property:P1628 with
Freebase full URIs.
I think using https://www.wikidata.org/wiki/Property:P1628 is a good
idea. It may also be useful to add qualifier of (P642) to
https://www.wikidata.org/wiki/Q1453477 (Freebase).
--
Stas Malyshev
smalys...@wikimedia.org
wikipedia.org/wiki/Getty%20Thesaurus%20of%20Geographic%20Names
Not sure if it's a problem or not.
But the sitelinks should be updated now.
--
Stas Malyshev
smalys...@wikimedia.org
/tree/master/gui may also
be the way to do it ;)
--
Stas Malyshev
smalys...@wikimedia.org
gYear to be completely
different types and as such some queries between them would not work.
--
Stas Malyshev
smalys...@wikimedia.org