may take some time due to some other
issues, so for now please just tell me about them.
Thanks,
--
Stas Malyshev
smalys...@wikimedia.org
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
.wikidata.org/wiki/Wikidata:List_of_properties/Summary_table
This page is rather hard to use (and huge, which is probably why Lua is
unhappy). I'd recommend using a tool like SQID mentioned above
(https://tools.wmflabs.org/sqid) - a much nicer way to approach it, IMO.
That's the promise from Wikidata (at least
excepting cases of specially announced BC-breaking changes). Perhaps
inform the user that some information is not understood and thus may be
unavailable, but don't refuse to function completely.
just for Blazegraph to work well on this data set. It might also
improve results for other engines, so I'm not sure how it influences the
comparison between the engines.
l and structured place to refer
> specific objects.
Some specific objects, e.g. works of art, buildings, ships, etc., are
already well represented in Wikidata. I imagine there are notability
rules, of course. So would your proposed one have relaxed notability
rules, or no notability rules at all?
not
possible, because relational engines are not a good match for the
hierarchical, graph-like structures in Wikidata.
It would be interesting to look at the Postgres implementation of the
data model and queries to see whether your conclusions were different in
this case.
evaluated on about 50 criteria. Of course, some of
them were hard to formalize and some numbers were a bit arbitrary, but
that's what we did, and Blazegraph came out with the best score.
I also think it is the right thing to do, and it enables us to do many
things much more easily, but I wanted to explore how best to address this
particular issue and to solicit ideas.
maybe not). If we make it
talk to Elastic, I imagine anything available to Elastic would be
usable, but no promises, since I haven't researched it yet.
wikibase:timePrecision to
know whether the date is accurate or just a placeholder.
[1] https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Time
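A sketch of what that looks like in practice (standard WDQS prefixes assumed; Q42/P569, i.e. Douglas Adams / date of birth, are just an example):

```sparql
# Sketch: fetch date-of-birth values together with their precision.
# Per the RDF dump format [1], precision 9 = year, 11 = day.
SELECT ?date ?precision WHERE {
  wd:Q42 p:P569/psv:P569 ?valueNode .
  ?valueNode wikibase:timeValue ?date ;
             wikibase:timePrecision ?precision .
}
```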
keytool.
There are a number of guides on how to do it available on Stack Overflow
and elsewhere, though I don't have the link right now (I did it a week or
so ago). If you don't find one, ping me and I'll try to dig it up.
It doesn't look like it should take more
than a year (famous last words, I know :), so 2017 looks like a sane
estimate.
the service for that; the
service is for when you have a specific set of languages you're
interested in.
soon,
please watch the announcements.
in the specified language
be returned?
Hi!
> I bet wikibase:label has to be reimplemented in some other way to prove
> efficient...
Yes, the label service can be inefficient sometimes. I'll look into how
it can be improved.
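For reference, the usual shape of a label-service query (a sketch using the standard WDQS prefixes; Q146 is "house cat"):

```sparql
# Typical label service usage: ?itemLabel is filled in automatically,
# preferring English labels.
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q146 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 10
```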
Where from now on I will use the
> SPARQL2 template by default. Is it possible to change the background
> color to, for example, green?
I've removed the background; it looks like nobody was really happy about
it being there :)
work. It needs some code to be able to properly build and consume
queries, but it's not impossible.
"subclass of" and "category's main topic" but not
sure that'd capture all. E.g.: http://tinyurl.com/h7qpcdn but that only
captures one subcategory, since other items don't have the same
hierarchy in Wikidata.
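A sketch of one such traversal, assuming P301 ("category's main topic") and P279 ("subclass of"); the starting category Q-id below is a hypothetical placeholder, not a real item:

```sparql
# Follow "category's main topic" from a (placeholder) category item,
# then walk the subclass tree of that topic via P279*.
SELECT ?sub WHERE {
  wd:Q999999999 wdt:P301 ?topic .   # hypothetical category item
  ?sub wdt:P279* ?topic .
}
```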
to 2K would change anything.
wanted to make a public announcement /
request for feedback before doing anything big. So please comment if you
see any problem with the proposed plan, or raise any objections. If none
are raised, I currently plan to do it sometime around Wed-Thu this week.
Thanks,
reference with the same properties - i.e. one URL to the same
address.
These nodes do not have their own documents, since in Wikibase and
Wikidata it's not possible to address individual values/references, but
they are also not linked to a single entity.
Hi!
Looks like the feedback to the idea has been positive (thanks to
everybody that participated!) so I've made a task to track it:
https://phabricator.wikimedia.org/T144103
Of course, if
you merge all these documents in a dump, the triple would appear only
once (we have special deduplication code to take care of that), but then
it's impossible to track it back to a specific document. So I understand
the idea, and see how it may be useful, but I don't see a real way to
implement it.
extract all
data regarding one entity from the dump. You can do it via export, e.g.:
http://www.wikidata.org/entity/Q42?flavor=full - but that doesn't
extract it from the dump; it just generates it.
and surprises.
So, what do you think: would having an N-Triples RDF dump for Wikidata
help things?
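For illustration (the Q42 triple here is just an example), the same statement in the current Turtle style and in N-Triples; the latter spells out full IRIs, so every line is self-contained and the file can be split and streamed trivially:

```turtle
# Turtle (prefixes declared elsewhere in the dump):
wd:Q42 wdt:P31 wd:Q5 .

# N-Triples (fully expanded, one independent triple per line):
<http://www.wikidata.org/entity/Q42> <http://www.wikidata.org/prop/direct/P31> <http://www.wikidata.org/entity/Q5> .
```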
you create a bot which does a large amount of
processing, it is good practice to send a distinct User-Agent header, so
that people on the other side know what's going on.
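A minimal sketch of what that might look like in Python; the bot name and contact details below are hypothetical placeholders, not a mandated format - the point is just that the string identifies your bot and gives a way to reach you.

```python
# Sketch: attach a distinct, identifying User-Agent to bot requests.
# The bot name, URL and e-mail here are placeholders.
import urllib.request

USER_AGENT = "MyWikidataBot/1.0 (https://example.org/mybot; mybot@example.org)"

def make_request(url: str) -> urllib.request.Request:
    """Build a request that identifies the bot via its User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

req = make_request("https://query.wikidata.org/sparql?query=ASK%20%7B%7D")
print(req.get_header("User-agent"))
```

Constructing the Request does not hit the network; the header is sent when the request is actually opened.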
r than extracting comments from query
field e.g. when doing Hive data processing.
nd out who sent a particular query. 3b may
be superior to 3a, but I admit I don't know enough about it :)
ate baseline
> demand, in terms of what complexity is required.
Can I get the list too? There could be some interesting info there for
me too :)
in creating a
phabricator ticket, then I or somebody else will make a patch that I
approve, etc.
Hi!
> Dario, I would love for the WDQS to support federated queries. Since we
It is technically possible, but we need a whitelist of the servers that
are allowed. Any ideas on how to produce such a list?
are what the data in the SPARQL service is
loaded from - and very inefficient; we don't really need a SPARQL server
for that, as it can be done with much simpler code.
may be some weird structures, including even outright errors
such as cycles in subclass graph, etc. And, of course, it changes all
the time :)
Hi!
> https://twitter.com/jamesinealing/status/813830785472012288
This is great - both this and buzzfeed one :)
Happy new year!
is to shift the workload to the client; if we
shift it back to the server, we're back to the regular SPARQL endpoint
with all the limitations we must put on it. However, I'm pretty sure
something like Client.js or Client.java mentioned above can do SPARQL
queries - that's how the demo works.
.Java/issues/41
Thank you, and thank you for responding so quickly! I'll keep an eye on
it and will report any oddities.
on how big DBs behave under sharding vs. just distributing requests
across servers.
> 2. Fast multi-threaded instances (no sharding but via replication
> topologies) behind proxies (functioning as cops, so to speak).
That's basically what we're doing now.
and federation. The
first use case, of course, would benefit from a good SPARQL engine
running on a machine with decent connectivity, so server-side JS or Java
should probably fare better.
d we can see if it improves things.
difference is the
SPARQL parser, which doesn't take much in the overall scheme of things.
With TPF, all queries are simple; with SPARQL, decidedly not so :)
which gets @context and which gets a prefix.
The algorithm for parsing it would be: check whether the key contains
":"; if it does, resolve against the prefix table, otherwise resolve
against @context.
very brief
interruption. Will update once it's done.
uses web
workers, but I haven't found how the parallelism is controlled.
?river wdt:P30 wd:Q51; wdt:P31/wdt:P279* wd:Q355304 .
OPTIONAL { ?river wdt:P625 ?location. }
}
It takes about 1000(!) seconds to find the first river, and it only found
two that are directly P31, so I suspect path queries are not going to
work that well. I wonder whether it does breadth-first search somehow.
r development team, but that will take its time.
Interesting, glad to hear this.
abs.org/ which
should be a little more stable.
backend stuff; frontend caching is handled by the
Varnish cluster, which is kind of complex. The docs are here:
https://wikitech.wikimedia.org/wiki/Caching_overview
If you see anything that can be improved, please tell us.
something like CloudFlare would be when
we're talking about billions of triples in various combinations. TPF
requests are pretty elementary, AFAIK, and wouldn't that mean that CF
would have to pretty much load every view of the graph engine index and
store it in every format (assuming content negotiation issue
is some network/delivery problem, but I don't know
yet what to think about the 3 extra results.
I get 128 and 131 results intermittently, so I suspect some kind of bug;
not sure where yet.
near the max.
So these are the parameters so far (remember that's for one server, so
three servers should ideally do 3x of that).
to do anything additional
with it; it's up to the users now to see whether it's useful. We'll be
watching to make sure it is not overtaxing resources or dragging the
SPARQL part down, but otherwise that's all the investment we're making
for now. If we come up with some use case helpful for
This quick test was done on 150 parallel threads.
://labs.europeana.eu/api/linked-open-data-sparql-endpoint
[2] http://nomisma.org/
[3] http://data.cervantesvirtual.com/about
[4] http://datos.bne.es/sparql?help=intro
[5]
https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual#Federation
UniProt docs
that say this case is OK (within some kind of FAQ, probably? Maybe
http://www.uniprot.org/help/license ), or some way to obtain an
official permission (some kind of letter to le...@wikimedia.org? I don't
know, I don't have much experience with this) - that would make it much
easier.
out. After
we're done with the CC-BY ones, I'll probably clean up that page a bit
and update it with a more long-term structure, but it seems something
like that would work.
extend it. I plan to see how it fares and whether there's anything
wrong that we've missed (doesn't look like it so far, which is good),
and then add the next batch next week - which means it would be deployed
in the April 10th deployment window (we usually deploy WDQS stuff on
Mondays).
closest to it.
If you need to see which of the full statements have the same data as
the direct claims, look for statements with type wikibase:BestRank. This
way you get the same data, but with qualifiers, references and full
values.
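A sketch of what that looks like (standard WDQS prefixes assumed; Q64/P1082, i.e. Berlin / population, are just an example):

```sparql
# Population statements of Berlin that carry BestRank - the same values
# that also appear as truthy wdt:P1082 triples, but with the full
# statement node available for qualifiers and references.
SELECT ?population WHERE {
  wd:Q64 p:P1082 ?statement .
  ?statement a wikibase:BestRank ;
             ps:P1082 ?population .
}
```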
orresponding subtag. This subtag SHOULD NOT be used.
Because the addition of other codes in the future can render
its application invalid, it is inherently unstable and hence
incompatible with the stability goals of BCP 47. It is always
preferable to use other subtags: either 'und' or (with prior
ag
icient to
> distinguish the different variants we will need.
I think it can be, using BCP 47 extensions, but the Wikidata team should
not be the one taking care of it; instead, Wikidata editors should do it
by assigning language-tag properties to specific Wikidata items.
a between Wikidata and SPARQL database/
://query.wikidata.org/copyright.html
[3] https://www.wikidata.org/wiki/Wikidata:SPARQL_federation_input
Thanks,
be OK too. Please add the descriptions to
https://www.wikidata.org/wiki/Wikidata:SPARQL_federation_input
Hi!
> I wanted to test the use of the queries below to list named graphs (if
> any) in wikidata service [a]. I've tried them without success:
We do not use any named graphs in WDQS, so I don't think that query
would produce a useful result.
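For reference, the standard way to enumerate named graphs in SPARQL, which - per the above - should not return anything useful on WDQS, since all the data lives in the default graph:

```sparql
# Enumerate named graphs; on WDQS this comes back empty.
SELECT DISTINCT ?g WHERE {
  GRAPH ?g { ?s ?p ?o }
}
```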
[2] https://www.mediawiki.org/wiki/Wikidata_Toolkit
correctly.
Aren't we limiting it right now this way in Wikidata?
as I read it, is for "this is in a specific language, but we
don't have a code for it".
See https://en.wikipedia.org/wiki/ISO_639
is meant to reduce
the effect of such workloads and to keep the service available to all
clients. It is not our purpose to block any legitimate workload; if that
happens, please tell us and we'll look into adjusting the limits.
Thanks,
same thing, in fact; that's the point of
federation, as I understand it). I think P4 should be
http://federated-wikidata.wmflabs.org.wmflabs.org/prop/direct/P4 in this
case (or, in the case of real Wikidata federation, it would use the
actual Wikidata URL, of course).
:) Suggestions on improving it are welcome, though.
More details available in: https://phabricator.wikimedia.org/T125500
Thanks,
it fits your use case or not, but I'm offering it as one
more option.
will make an announcement (hopefully the final one on this topic :) when
it is all done.
Thanks,
Service URI
> http://wikiba.se/ontology#mwapi is not allowed
Your local installation seems to be an old version. Please check out the
newest one from GitHub; the MWAPI service should be whitelisted
automatically in the new code.
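For anyone hitting this on an up-to-date install, a minimal MWAPI query of the shape documented in the user manual (the search string is arbitrary):

```sparql
# Sketch: entity search through the MWAPI service.
SELECT ?item WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:api "EntitySearch" .
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam mwapi:search "Douglas Adams" .
    bd:serviceParam mwapi:language "en" .
    ?item wikibase:apiOutputItem mwapi:item .
  }
}
```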
sure P31 and P279 currently represent a hierarchy as such -
i.e. loops have been known to exist in them (maybe already fixed, but
not 100% sure). So one needs to be aware of that too.
me. Thanks so much for developing it!
Now, is there a place to submit bug reports? :) I have noticed this one:
https://www.wikidata.org/wiki/Q5066005#P569
shows 2 violations, though the date appears to be completely fine.
?
Other comments and ideas on the matter are of course always welcome.
Please comment on the talk page[2] or reply to this message.
[1] https://phabricator.wikimedia.org/T148245
[2]
https://www.wikidata.org/w/index.php?title=Wikidata_talk:WDQS_and_Mediawiki_API=edit
Thanks,
Wikipedia and other wikis right now,
which is in much better shape, but then we need to do some work to get
back to Wikidata IDs. Doable, but we need to think about which option is
better. Maybe there will be several of them. But yes, this is definitely
something that is on the agenda.
f he
> still needs it – and if it can’t be removed from the P569 talk page for
> some reason, we’ll probably filter it out in the user script).
The checks seem to be against P184/P185, neither of which is on
Q5066005. So I suspect there's still a bug somewhere.
take an extremely naive approach,
in which case performance will probably be horrible.
SERVICE, on the other hand, is pretty easy to implement, and it's
flexible enough to allow talking to many external APIs. The current
proposal seems OK, though it may be changed if more user-friendly ideas
come up.
_page
[2] https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual/MWAPI
[3] https://www.wikidata.org/wiki/Wikidata:Contact_the_development_team
[1] https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format
community preferences.
I think right now it should be a string. That can be changed later if we
want a specific datatype for queries; right now we don't have one.
o "latest-all.json.bz2")?
The dumps are indeed in
https://dumps.wikimedia.org/wikidatawiki/entities/, e.g.
https://dumps.wikimedia.org/wikidatawiki/entities/20170501/wikidata-20170501-all-BETA.ttl.gz
No links for latest for TTL dumps, for some reason. This probably needs
to be fixed.
[3] https://phabricator.wikimedia.org/T165982
Thanks,
t exists now. Is anybody actually using this value, and if so, how? If
not, I'll probably change how it works.
has changed?
This should be fine. The data should not be different. The difference
might be because some servers are out of sync. I'll check it.
/User_Manual/MWAPI
[2] https://wikitech.wikimedia.org/wiki/Deployments
it? I don't use it, but it seems to be part of the
standard for dataset descriptions, so I wonder if the issues can be
fixed. I don't know much about it, but from the description it seems
to be very automatable.
double, but there are ways around it. So technically we can keep 9
digits or however many we need, if we wanted to. I just wanted to see if
we should.
not sure 0.3
is that useful. What would one do with it, especially in SPARQL?
k millimeter accuracy would ever be
relevant for Wikidata.
> Having it all queryable in Wikidata ? hmmm... not for me, other data
> catalogs and GIS systems handle that job.
Exactly.
but from what I understood (my Czech is kinda weak ;) the grids
there are also because of some data representation artifact.
here might actually make
things a bit harder to work with (such as "are these two things
actually the same thing?" or "are they located in the same place?").
Of course, all those problems are solvable, but why not make it easier?
e/Indexing/RDF_Dump_Format#Globe_coordinate
[4]
https://gis.stackexchange.com/questions/8650/measuring-accuracy-of-latitude-and-longitude
but inefficient for further data sharing without URIs.
The question of querying data like "GDB by state.tab" is an interesting
one. I'm not sure a triple store would be a good medium, but maybe it
could be... The idea needs some research.
clarify
> things, I suppose.
The current (as in, latest/best available for now) population of London
would be found as the "truthy" value (wdt:); all other population
figures - e.g. historical figures - will be under the "all" statement
prefixes (p/ps/psv).
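A sketch contrasting the two (standard WDQS prefixes; Q84 is London, P1082 population, and P585 "point in time" is the usual qualifier on historical figures):

```sparql
# The truthy (current) value comes from wdt:; the full statements
# expose every population figure plus its point-in-time qualifier.
SELECT ?pop ?when WHERE {
  wd:Q84 p:P1082 ?st .
  ?st ps:P1082 ?pop .
  OPTIONAL { ?st pq:P585 ?when . }
}
```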
ith references to your comments.
r/repo/config/ElasticSearchRescoreFunctions.php;4c6aa54e56c68ebd3543b23c88f52ae6f176a079$25
Basically it's a combination of the match score (how well the string
matches the query), incoming-link count, sitelink count, and special
boosts like demoting disambiguation pages.
> Didn't see
> https://www.wikidata.org/wiki/Property:P4653
Yes, too recent :)
.org/wiki/User_talk:Smalyshev_(WMF)
- Talk to us on IRC: #wikimedia-discovery
Thanks!