Well done! Absolutely love it! I'm already using it to build SPARQL
queries for the wikidata visualizations [1].
[1]: http://en.wikipedia.beta.wmflabs.org/wiki/Sparql
On Sun, Feb 14, 2016 at 2:44 PM, Hay (Husky) wrote:
> Hey everyone,
> it seems we're getting new properties
>> Markus
>>
>> On 14.02.2016 15:11, Jane Darnell wrote:
>>
>>> Wow Hay, this is super useful
>>>
>>> On Sun, Feb 14, 2016 at 8:50 AM, Hay (Husky) <hus...@gmail.com> wrote:
>>>
>>>
https://commons.wikimedia.org/w/api.php?action=query&titles=File:Python-Foot.png&prop=imageinfo&iiprop=url&iiurlwidth=100
On Wed, Apr 6, 2016 at 1:29 AM, wrote:
> From the following image URL returned from a SPARQL query, what would be
> the best way to generate a thumbnail 100 pixels wide?
>
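The API request above can be built programmatically. A minimal sketch (the function name `thumb_api_url` is mine; the `imageinfo` parameters are the standard MediaWiki API ones):

```python
from urllib.parse import urlencode

def thumb_api_url(file_title: str, width: int = 100) -> str:
    # Ask the imageinfo module for a scaled thumbnail URL;
    # iiurlwidth makes the API compute a thumbnail of that width.
    params = {
        "action": "query",
        "titles": file_title,
        "prop": "imageinfo",
        "iiprop": "url",
        "iiurlwidth": width,
        "format": "json",
    }
    return "https://commons.wikimedia.org/w/api.php?" + urlencode(params)
```

The response's `imageinfo.thumburl` field then contains the 100px-wide thumbnail URL.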
Any person in Wikidata is "famous" - otherwise they wouldn't be notable and
therefore wouldn't be there. :)
If you prefer the stricter notability requirement (as used by Wikipedia),
search only for those that have a Wikipedia page.
On Aug 2, 2016 1:44 PM, "Ghislain ATEMEZING"
Jane, now we are really going into the field of elastic search's relevancy
calculation. When searching, things like popularity (pageviews), incoming
links, number of different language wiki articles, article size, article
quality (good/selected), and many other aspects could be used to better the
Erika, would building a better wikidata UI help alleviate your concern?
For example, it used to be that to add a link to the same article in
another language, one had to edit raw wiki markup and add a weird language
link. Now with Wikidata it is far more intuitive, with an edit button
right
You might also use page views for the fame estimates. E.g. US election
candidate pageviews:
https://meta.wikimedia.org/wiki/User:Yurik/US_Politics_Real_Time
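Pageview counts like those come from the Wikimedia REST API's per-article endpoint. A sketch of building the request URL (the function name is mine; the endpoint path segments are the documented ones):

```python
def pageviews_url(project: str, article: str, start: str, end: str) -> str:
    # Wikimedia REST API: daily per-article pageview counts.
    # Dates are YYYYMMDD; "user" excludes spider/bot traffic,
    # "all-access" combines desktop and mobile.
    return ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
            f"{project}/all-access/user/{article}/daily/{start}/{end}")
```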
On Wed, Aug 17, 2016, 11:42 Felipe Hoffa wrote:
> I've been playing with Wikipedia (to extract list of links),
I guess I qualify for #2 several times:
* The <maplink> tag supports access to the geoshapes service, which
in turn can make requests to WDQS. For example, see
https://en.wikipedia.org/wiki/User:Yurik/maplink (click on "governor's
link")
* The <mapframe> wiki tag supports the same geoshapes service, as well as
I would highly recommend using the X-Analytics header for this, and
establishing a "well-known" key name (or names). X-Analytics gets parsed into
key-value pairs (object field) by our varnish/hadoop infrastructure,
whereas the user agent is basically a semi-free form text string. Also,
user agent cannot be
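The key-value parsing mentioned above is straightforward; a minimal sketch, assuming the documented `k1=v1;k2=v2` layout of X-Analytics (the function name is mine):

```python
def parse_x_analytics(header: str) -> dict:
    # X-Analytics is a semicolon-separated list of key=value pairs,
    # e.g. "ns=0;page_id=42"; parts without "=" are skipped.
    pairs = {}
    for part in header.split(";"):
        key, sep, value = part.partition("=")
        if sep:
            pairs[key.strip()] = value.strip()
    return pairs
```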
I would like to propose that we add how popular each sitelink is to WDQS.
This would allow queries that order results by wiki article popularity. For
example, this query lists Wikidata items without French labels but with
French articles, ordered by how often they get viewed in frwiki.
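The filtering part of that query is expressible on WDQS today; only the popularity ordering needs the proposed data. A sketch, with the query held as a Python string (the `ORDER BY` on pageviews is hypothetical, so it is left as a comment):

```python
# Items with a frwiki sitelink but no French label. Runnable on WDQS as-is;
# ordering by view counts would need the sitelink-popularity data proposed
# in the message above, so no ORDER BY clause is included here.
QUERY = """
SELECT ?item ?article WHERE {
  ?article schema:about ?item ;
           schema:isPartOf <https://fr.wikipedia.org/> .
  FILTER NOT EXISTS {
    ?item rdfs:label ?label .
    FILTER(LANG(?label) = "fr")
  }
}
LIMIT 100
"""
```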
> when you say "wikidata is not well suited for lists data", you refer
> to wikibase or WDQS here?
>
Wikibase, per Daniel K.
>
> the data:Bea.gov/GDP by state.tab above is certainly a good
> representation for efficient delivery (via json) and display of data.
> but inefficient for further
There is a better alternative to storing lists -
https://www.mediawiki.org/wiki/Help:Tabular_Data -- it allows you to store
a CSV-like table of data on Commons, with localized columns, and access it
from all other wikis via the <graph> tag and Lua scripts.
A good example of it -- "per state GDP" page --
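A Commons `.tab` page is JSON with a `schema` describing the columns plus a `data` array of rows. A minimal sketch of reading that layout (the sample values here are illustrative, not real GDP figures):

```python
import json

# Illustrative stand-in for the JSON content of a Data:*.tab page:
# a column schema plus row data, as stored by the tabular-data feature.
sample_tab = json.dumps({
    "license": "CC0-1.0",
    "schema": {"fields": [
        {"name": "state", "type": "string"},
        {"name": "gdp", "type": "number"},
    ]},
    "data": [["California", 3.6], ["Texas", 2.4]],
})

def tab_rows(tab_json: str) -> list:
    # Return the rows of a .tab page as dicts keyed by column name.
    page = json.loads(tab_json)
    names = [f["name"] for f in page["schema"]["fields"]]
    return [dict(zip(names, row)) for row in page["data"]]
```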
ier of a traditional property?
>
> I find the feature very promising, but for now it is still in its
> infancy. I don't see how I could use it for edits like this one:
>
> https://www.wikidata.org/w/index.php?title=Q37461404&diff=578074181&oldid=578071885
>
> Antonin
>
> On 18/10/2017
Something I wish was available is the voting record, at least at a
country/state level. Knowing the politician's time in office is a great
start, but how that person voted is what really makes democracy work.
On Sun, Mar 11, 2018 at 5:16 AM, Gerard Meijssen
wrote:
>
Seems like they simply store it as wiki markup -
https://ballotpedia.org/wiki/index.php?title=Marco+Rubio&action=edit,
unless they generate it from some other internal database.
On Mon, Mar 12, 2018 at 8:10 PM, Stas Malyshev
wrote:
> Hi!
>
> > Something I wish was available is
Thanks Stas. How does this affect non-WMF clones of Wikidata QS?
On Tue, Mar 6, 2018 at 2:51 PM, Stas Malyshev
wrote:
> Hi!
>
> This morning we have switched the polling mechanism for Wikidata Query
> Service from using Recent Changes API to using Kafka events
>
Awesome news, congratulations!
See live demo at https://wikidata-lexeme.wmflabs.org/
On Wed, Mar 7, 2018 at 11:49 AM, Léa Lacroix
wrote:
> Hello all,
>
> First version of Lexicographical Data will be released in April. You can
> read the detailed announcement here:
P.S. Is there a list of values we want to introduce with the well-known
numbers?
E.g. peace - L1
On Wed, Mar 7, 2018 at 12:04 PM, Yuri Astrakhan <yuriastrak...@gmail.com>
wrote:
> Awesome news, congratulations!
>
> See live demo at https://wikidata-lexeme.wmflabs.org/
Amir, importing data from Wikidata to OSM has been discussed a number of
times. There are a number of active OSM community members who strongly
oppose it because they feel Wikidata is not sufficiently safe from the
legal perspective. E.g. Wikipedia allows users to look up things in Google
There is a difference between the name "translation" and "transliteration".
A place's translation should always take precedence, e.g. Köln vs. Cologne.
This mostly applies to cities/countries, but not street-level naming.
Transliterations are trickier. Should we simply transliterate everything
into
Daniel,
> P and Q indicate the *type* of the entity ("P" = "Property", "Q" = "Item"
> for
> arcane reasons), "L" = Lexeme, "F" = Form, "S" = Sense, "M" = MediaInfo).
> As you
> can tell, we'd quickly run out of letters and cause confusion if this
> became
> configurable.
>
I don't think this
On Thu, Nov 29, 2018 at 1:03 PM Daniel Kinzler
wrote:
> This doesn't fix the hard-coded prefix in the RDF output generated by
> Wikibase.
>
> See my previous email - my patch fixes that too. Here's an example query
http://tinyurl.com/yav76uof in Sophox -- it calls out to Wikidata to get a
list
Olaf, Andra, Lydia,
On Thu, Nov 29, 2018 at 4:01 AM Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:
> Are we talking about https://phabricator.wikimedia.org/T194180? I'm
> happy to push that into one of the next sprints if so.
>
> I think my yesterday's patch fixes this issue on the
Daniel, it is not so clear cut. Most users will not be exposed to a
"zoo". Case in point - Open Street Map. In OSM, the entire user base of
tens of thousands of people know the meaning of Q123. The "Q" prefix has a
strong identity in itself. Anyone will instantly say - yes, it's a
Wikidata
On Thu, Nov 29, 2018 at 12:51 AM Federico Leva (Nemo)
wrote:
> Yuri Astrakhan, 29/11/18 04:14:
> > The "Q" prefix has a strong identity in itself. Anyone will instantly
> > say - yes, it's a Wikidata identifier
>
> But that's because most people only
I would add another very important aspect - query prefixes - to build some
cohesion within Wikibase community.
Currently, WDQS hardcodes prefixes like "wd:" and "wdt:" to be based on the
"conceptUri" parameter. Which means that any Wikibase installation that
has its own data would still use
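The hardcoding described above could instead be derived from the installation's own concept URI. A sketch under the assumption that the Wikibase follows the same entity/property URI layout as wikidata.org (the function name is mine):

```python
def wikibase_prefixes(concept_base: str) -> str:
    # Derive the usual WDQS prefix declarations from a Wikibase
    # "conceptUri" base, mirroring the URI layout of wikidata.org.
    base = concept_base.rstrip("/")
    return "\n".join([
        f"PREFIX wd: <{base}/entity/>",
        f"PREFIX wdt: <{base}/prop/direct/>",
        f"PREFIX p: <{base}/prop/>",
    ])
```

With this, `wikibase_prefixes("http://www.wikidata.org")` reproduces the stock Wikidata prefixes, and any third-party Wikibase gets prefixes rooted at its own domain.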
currently I think
> this is not possible.
>
>-- James
>
> On 28/11/2018 16:32, Yuri Astrakhan wrote:
> > I would add another very important aspect - query prefixes - to build
> some
> > cohesion within Wikibase community.
> >
> > Currently, WDQS h
I actually already implemented support in SPARQL for that, but it needs a
bit more work to get it properly merged with the Blazegraph code. I had it
working for a while as part of Sophox (OSM Sparql).
* docs: https://wiki.openstreetmap.org/wiki/Sophox#External_Data_Sources
* code:
:38 PM Kingsley Idehen
wrote:
> On 5/31/19 11:28 AM, Yuri Astrakhan wrote:
>
> I actually already implemented support in SPARQL for that, but it needs a
> bit more work to get it properly merged with the Blazegraph code. I had it
> working for a while as part of Sophox (OSM Spar
Sounds interesting, is there a github repo?
On Fri, May 3, 2019 at 8:19 PM Amirouche Boubekki <
amirouche.boube...@gmail.com> wrote:
> GerardM post triggered my interest to post to the mailing list. As you
> might know I am working on functional quadstore that is quadstore that
> keeps around
There have been a number of discussions about translations. At the moment,
the whole situation is very similar to the original interwiki (sitelink)
issue -- a lexeme in each language has to point to corresponding lexemes in
all other languages. This issue was actually what started Wikidata in the
Hi, I would like to implement a new property type for my project. Are there
any examples of extensions that add new prop types to wikibase?
I already implemented most of what I need by changing wikibase code, but I
doubt a property to store multiline code snippets will be accepted into
wikibase
iki/extensions/Kartographer>, I
> believe.
>
> I’m not aware of any extensions that add new datatypes and are
> specifically intended to be used as examples or building blocks for your
> own extensions.
>
> Cheers,
> Lucas
> On 30.05.20 17:52, Yuri Astrakhan wrote:
>
> Hi,
We really ought to change it to dynamic:
https://en.wikipedia.org/w/api.php?action=query&meta=wikibase
(thanks duh)
If there is no error, wikibase is present, disable. This way no update to
the blacklist is needed. The query should be re-issued every 30 min to make
sure it hasn't changed.
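The detection described above is a single API probe. A sketch of building it (the function name is mine; `meta=wikibase` is the standard query module exposed by wikis running the Wikibase client):

```python
from urllib.parse import urlencode

def wikibase_probe_url(api_base: str) -> str:
    # meta=wikibase only succeeds on wikis where Wikibase is installed,
    # so an error response means "no Wikibase here" - no blacklist needed.
    return api_base + "?" + urlencode(
        {"action": "query", "meta": "wikibase", "format": "json"})
```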
The
Best
On Thu, Feb 14, 2013 at 9:44 AM, Yuri Astrakhan
yuriastrak...@gmail.com wrote:
We really ought to change it to dynamic:
https://en.wikipedia.org/w/api.php?action=query&meta=wikibase
(thanks duh)
If there is no error, wikibase is present, disable. This way no update to
the blacklist
but for the future it's the only option so I'll work
on it but not now
Best
On Thu, Feb 14, 2013 at 9:44 AM, Yuri Astrakhan yuriastrak...@gmail.com wrote:
We really ought to change it to dynamic:
https://en.wikipedia.org/w
their logic is substantially changed.
On Thu, Feb 14, 2013 at 6:10 PM, Marco Fleckinger
marco.fleckin...@wikipedia.at wrote:
What about a bot to observe bot changes periodically and checking those
for unwanted interwiki-operations?
On 02/14/2013 11:57 PM, Yuri Astrakhan wrote:
We really
In another maillist, Yuri pointed me to the Wikidata API roadmap:
https://www.mediawiki.org/wiki/Requests_for_comment/Wikidata_API.
I don't know with those changes, if my current use
(action=query&prop=revisions and action=query&list=recentchanges) will
be supported?
We don't plan to break stuff just
On Wed, Mar 6, 2013 at 3:35 PM, Yuri Astrakhan yuriastrak...@gmail.com wrote:
During an IRC discussion, I was told that a page in namespace 0 like
Q219937 (http://www.wikidata.org/wiki/Q219937) does not necessarily have
a one-to-one relationship with an entity like Bonnie and Clyde
I also plan to be in Amsterdam, would love to work closely with the
wikidata team.
--yurik
On Thu, Mar 21, 2013 at 11:53 AM, Andy Mabbett a...@pigsonthewing.org.uk wrote:
On 21 March 2013 14:26, Lydia Pintscher lydia.pintsc...@wikimedia.de
wrote:
Some of the Wikidata team will be at the
Petr, I have begun speccing out a migration to action=query in
http://www.mediawiki.org/wiki/Requests_for_comment/Wikidata_API , but that
RFC is not yet complete. Will probably work on it when I am in Amsterdam to
get that in sync with the Wikidata team... (and convince Denny that the
path is
This can already be done by changing JsonConfig configuration. I propose we
add a Data namespace to Commons (https://commons.wikimedia.org/).
Moreover, with the recent work on the Graph extension
(https://www.mediawiki.org/wiki/Extension:Graph), I was thinking
of storing graphing-related data there
at the moment, but hopefully it will get done soon.
On May 9, 2015 21:05, Jan Ainali jan.ain...@wikimedia.se wrote:
2015-05-09 19:54 GMT+02:00 Yuri Astrakhan yastrak...@wikimedia.org:
Lydia, can visualization challenge use freshly launched graphs? I haven't
enabled them on Wikidata just to be safe
Lydia, can visualization challenge use freshly launched graphs? I haven't
enabled them on Wikidata just to be safe, but it can be done very quickly.
See https://www.mediawiki.org/wiki/Extension:Graph
Hey folks :)
Here's your summary of what happened around Wikidata over the past week.
Enjoy!