Re: [Wikidata] Wikimedia Sverige receives $65k+ in funding for our Library Data project

2018-08-28 Thread Osma Suominen

This is excellent news! Congratulations, and best of luck to your project!

I think this will really help establish the role of library data within 
Wikidata, already well grounded by the work of WikiCite, Inventaire.io, 
various vocabulary and authority file mappings on Mix'n'Match and many 
other projects.


I look forward to following your progress and I would very much like to 
do something similar with Finnish bibliographic data :)


-Osma

Alicia Fagerving wrote on 27.08.2018 at 17:17:

Wikimedia Sverige is proud to be the recipient of $65,500 in support
from the Swedish National Library for our project Library Data.

We will work in collaboration with the Swedish National Library to
incorporate a number of datasets into Wikidata, such as data about
authors, libraries and various specialized bibliographic databases[1].
This is a pilot project in which we aim to discuss with the community
what to include and what to exclude. Based on those discussions and the
requests from the community, we will design a continuation of this
project (if this first part is deemed successful, continued funding is
possible for 3-4 more years).

We started investigating a possible long-term partnership with the
National Library in 2017, when Wikimedia Sverige delivered input to the
new National Strategy for the Library Sector on how Sweden's libraries
can work with Wikimedia for mutual benefit.[2] The National Library has
just made history as the world's first national library to fully
transition to Linked Open Data (BIBFRAME 2.0),[3] so the timing could
not have been better; we are now in a position to examine how this move
can benefit Wikidata and other Wikimedia projects.

Please contact the project manager André Costa (andre.co...@wikimedia.se)
or the developer Alicia Fagerving (alicia.fagerv...@wikimedia.se) if you
have any questions.

As always, you can find the full application on our wiki (in
Swedish):

* 
https://se.wikimedia.org/wiki/Projekt:Strategisk_inkludering_av_biblioteksdata_p%C3%A5_Wikidata_2018/Ans%C3%B6kan


[1]
[2] (in Swedish)
[3]




Kind regards,

~*~
Alicia Fagerving
Developer
Wikimedia Sverige (WMSE)

e-mail: alicia.fagerv...@wikimedia.se 
phone: +46 73 950 09 56


___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata




--
Osma Suominen
D.Sc. (Tech), Information Systems Specialist
National Library of Finland
P.O. Box 26 (Kaikukatu 4)
00014 HELSINGIN YLIOPISTO
Tel. +358 50 3199529
osma.suomi...@helsinki.fi
http://www.nationallibrary.fi



Re: [Wikidata] Better suggestions for constraint values on properties

2018-08-28 Thread James Heald
Interesting, but why not simply suggest the most frequently used 
qualifiers for the given property?


The list of "allowed qualifiers" is often wildly incomplete, or no such 
constraint is specified at all.


  -- James.


On 28/08/2018 14:05, Léa Lacroix wrote:

Hello all,

Currently, when adding a new value in the constraints section of a
property, there is no suggestion to fill in the value or the qualifier. We’ve
been improving this with a few changes that will be deployed this
week.

- When adding a new value in the property constraint statement, a list
of suggestions will be displayed, and all the relevant constraint items
will be shown first. They will be selected from the list of qualifiers
present in the statement “property constraint -> one of constraint” of
property constraint (P2302).
- Of course, you can still type anything you want in the field to find a
value. The full-text search has been improved (when typing “none constr”
you will also see “none of constraint” in the suggester).

In the near future, we will also make the following happen:

- Same behavior for qualifiers inside the constraint statements. The
suggester will pick up the values from “allowed qualifiers constraint”.
- When clicking on “add value” in the property constraint statement, a
suggester menu will appear directly (without having to click on the value
field).

The first two changes will appear on wikidata.org on August 30th, and the
following ones in the coming weeks. Feel free to test them out, and let us
know if you find a bug or something that doesn’t behave as expected.

Related tickets: phab:T199672, phab:T201288.

Thanks,











Re: [Wikidata] image re-use information available via a SPARQL query?

2018-08-28 Thread Jan Macura
Hello Bob,

As far as I know, I'm afraid this is not possible, and it won't be
possible until Structured Data on Commons comes to life.
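Until then, one workaround outside SPARQL is the Commons imageinfo API, whose
extmetadata output includes the license. A minimal Python sketch (the helper
names are mine; the API parameters are standard MediaWiki):

```python
import json
import urllib.parse
import urllib.request

API = 'https://commons.wikimedia.org/w/api.php'

def license_query_url(file_title):
    """Build an imageinfo query asking for the extended metadata of a Commons file."""
    params = {
        'action': 'query',
        'titles': file_title,
        'prop': 'imageinfo',
        'iiprop': 'extmetadata',
        'format': 'json',
    }
    return API + '?' + urllib.parse.urlencode(params)

def fetch_license(file_title):
    """Return the short license name reported by Commons for the given file."""
    with urllib.request.urlopen(license_query_url(file_title)) as resp:
        data = json.load(resp)
    page = next(iter(data['query']['pages'].values()))
    return page['imageinfo'][0]['extmetadata']['LicenseShortName']['value']
```

For Bob's example one would call
fetch_license('File:Empire_State_Building_from_the_Top_of_the_Rock.jpg').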

Regards,
 Jan


[Wikidata] several job openings in or around the Wikidata dev team

2018-08-28 Thread Lydia Pintscher
Hey folks :)

We have a number of open positions at WMDE at the moment that are
relevant to Wikidata, including a program manager for Wikidata who
will work closely with me. I'd love to see many applications from
people who are already part of the community and understand and love
what makes Wikidata special.
Find the full list, including job descriptions, here:
https://wikimedia-deutschland.softgarden.io/de/vacancies
If you have any questions, please let me know.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Registered in the Register of Associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.



[Wikidata] image re-use information available via a SPARQL query?

2018-08-28 Thread Bob DuCharme
I know how to run SPARQL queries to find out various properties of the 
Empire State Building as shown on https://www.wikidata.org/wiki/Q9188, 
including the URL of the image shown there.


When I look at the 
https://commons.wikimedia.org/wiki/File:Empire_State_Building_from_the_Top_of_the_Rock.jpg 
web page, I see that the image has a CC BY-SA 2.0 license, but I can't 
work out how to get to that in a query.


Can anyone show me a SPARQL query about this image where the query 
result would show me the license type?


Thanks,

Bob




[Wikidata] Better suggestions for constraint values on properties

2018-08-28 Thread Léa Lacroix
Hello all,

Currently, when adding a new value in the constraints section of a
property, there is no suggestion to fill in the value or the qualifier. We’ve
been improving this with a few changes that will be deployed this
week.

   - When adding a new value in the property constraint statement, a list
   of suggestions will be displayed, and all the relevant constraint items
   will be shown first. They will be selected from the list of qualifiers
   present in the statement “property constraint -> one of constraint” of
   property constraint (P2302).
   - Of course, you can still type anything you want in the field to find a
   value. The full-text search has been improved (when typing “none constr”
   you will also see “none of constraint” in the suggester).

In the near future, we will also make the following happen:

   - Same behavior for qualifiers inside the constraint statements. The
   suggester will pick up the values from “allowed qualifiers constraint”.
   - When clicking on “add value” in the property constraint statement, a
   suggester menu will appear directly (without having to click on the value
   field).

The first two changes will appear on wikidata.org on August 30th, and the
following ones in the coming weeks. Feel free to test them out, and let us
know if you find a bug or something that doesn’t behave as expected.

Related tickets: phab:T199672, phab:T201288.

Thanks,
-- 
Léa Lacroix
Project Manager Community Communication for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Registered in the Register of Associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.


Re: [Wikidata] VIAF date scraping errors

2018-08-28 Thread Jane Darnell
Yes I reached the same conclusion. In fact it's even worse than 1950, and
many people have been given the incorrect birthdate of "1900" for the same
reason. I have started going through women items with 1900-1-1 birthdates,
correcting the birthdate where possible, and where impossible (e.g. simply
not enough information), I have been moving the incorrect date to "1960s"
or "1970s", just to offload the 1900 errors and carve out any chance of
PD-work (died before 1948). A big mess indeed :(

On Tue, Aug 28, 2018 at 9:35 AM Renée Bagslint wrote:

> If you Google "Emily Riehl", you will find that Google tells you that she
> was born in 1950: this is certainly complete nonsense. Her date of birth
> isn't in her Wikipedia article, which is where Google gets its text from,
> but turns out to be in her Wikidata entry, having been added by
> Reinheitsgebot, operated by Magnus Manske. It seems that in May when this
> and many other dates were being added by the bot, it was scraping VIAF
> files and incorrectly parsing the XML for the MARC 21 field 997, which gives
> everyone born in the 20th century an arbitrary "floruit" date of 1950 if not
> otherwise available. I have to say that I'm not completely sure about every
> detail of this diagnosis -- 997 is not even a standard Marc field, it's
> reserved for local use: significant dates such as birth and death are
> encoded in field 046.  But it is clear that the dates being inserted by the
> bot can be completely fictitious. This was reported in May but it seems
> that it has not been fixed. As a result Wikidata and hence Google are
> delivering an unknown number of incorrect dates of birth.


Re: [Wikidata] Wikidata SPARQL query logs available

2018-08-28 Thread Finn Aarup Nielsen

I was wondering why our research section was number 8!?

Then I recalled our dashboard running at 
http://people.compute.dtu.dk/faan/cognitivesystemswikidata1.html. It 
updates roughly every 3 minutes, all day long :)


/Finn

On 08/23/2018 09:57 PM, Daniel Mietchen wrote:

I just ran Max's one-liner over one of the dump files, and it worked
smoothly. Not sure where the best place would be to store such things,
so I simply put it in my sandbox for now:
https://www.wikidata.org/w/index.php?title=User:Daniel_Mietchen/sandbox&oldid=732396160
.
d.
On Tue, Aug 7, 2018 at 6:06 PM David Cuenca Tudela  wrote:


If someone could post the 10 (or 50!) more popular items, I would really 
appreciate it :-)

Cheers,
Micru

On Tue, Aug 7, 2018 at 5:59 PM Maximilian Marx  
wrote:



Hi,

On Tue, 7 Aug 2018 17:37:34 +0200, Markus Kroetzsch 
 said:

If you want a sorted list of "most popular" items, this is a bit more
work and would require at least some Python script, or some less
obvious combination of sed (extracting all URLs of entities), and
sort.


   zgrep -Eoe '%3Chttp%3A%2F%2Fwww.wikidata.org%2Fentity%2FQ[1-9][0-9]+%3E' dump.gz \
     | cut -d 'Q' -f 2 | cut -d '%' -f 1 | sort | uniq -c | sort -nr

should do the trick.
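If you'd rather avoid the shell, the same counting can be sketched in Python
(assuming the decompressed log lines contain the same percent-encoded entity
URLs; the helper name is mine):

```python
import re
from collections import Counter

# Percent-encoded <http://www.wikidata.org/entity/Q...> as it appears in the logs
ENTITY = re.compile(
    r'%3Chttp%3A%2F%2Fwww\.wikidata\.org%2Fentity%2F(Q[1-9][0-9]*)%3E')

def most_popular(lines, n=10):
    """Count entity mentions across query-log lines; return the n most common Q-ids."""
    counts = Counter()
    for line in lines:
        counts.update(ENTITY.findall(line))
    return counts.most_common(n)
```

Feeding it gzip.open('dump.gz', 'rt') mirrors the zgrep variant above.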

Best,

Maximilian
--
Dipl.-Math. Maximilian Marx
Knowledge-Based Systems Group
Faculty of Computer Science
TU Dresden
+49 351 463 43510
https://kbs.inf.tu-dresden.de/max





--
Etiamsi omnes, ego non


[Wikidata] VIAF date scraping errors

2018-08-28 Thread Renée Bagslint
If you Google "Emily Riehl", you will find that Google tells you that she
was born in 1950: this is certainly complete nonsense. Her date of birth
isn't in her Wikipedia article, which is where Google gets its text from,
but turns out to be in her Wikidata entry, having been added by
Reinheitsgebot, operated by Magnus Manske. It seems that in May, when this
and many other dates were being added by the bot, it was scraping VIAF
files and incorrectly parsing the XML for the MARC 21 field 997, which gives
everyone born in the 20th century an arbitrary "floruit" date of 1950 if not
otherwise available. I have to say that I'm not completely sure about every
detail of this diagnosis -- 997 is not even a standard MARC field; it's
reserved for local use: significant dates such as birth and death are
encoded in field 046.  But it is clear that the dates being inserted by the
bot can be completely fictitious. This was reported in May but it seems
that it has not been fixed. As a result Wikidata and hence Google are
delivering an unknown number of incorrect dates of birth.
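For what it's worth, the fix implied here can be illustrated in a few lines:
read birth dates from the standard field 046 (subfield $f) and ignore
locally-defined fields such as 997. A sketch against a simplified,
hypothetical MARCXML snippet (real MARCXML adds a namespace):

```python
import xml.etree.ElementTree as ET

# Hypothetical record: a local 997 "floruit" plus a standard 046 birth date
RECORD = """<record>
  <datafield tag="997"><subfield code="a">1950</subfield></datafield>
  <datafield tag="046"><subfield code="f">1984</subfield></datafield>
</record>"""

def birth_date(marcxml):
    """Return the birth date from MARC field 046 $f; None if absent (never guess)."""
    root = ET.fromstring(marcxml)
    for field in root.iter('datafield'):
        if field.get('tag') == '046':
            for sub in field.iter('subfield'):
                if sub.get('code') == 'f':
                    return sub.text
    return None
```

A scraper built this way would add no date at all for records lacking 046,
rather than a fictitious 1950.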


[Wikidata] Wikimedian-in-Residence position open at Stanford University / LD4P project

2018-08-28 Thread Michelle Futornick
When you hear the word metadata, do you think of cell phone records, or of a 
whole world of knowledge waiting to be linked and discovered? Stanford 
University Libraries and the Linked Data for Production (LD4P) 
project are seeking someone who is passionate about 
libraries and about bringing library data to the world, to serve as our 
Wikimedian-in-Residence.


As a widely used source of structured data, Wikidata has the potential to be a 
platform for publishing, linking, and enriching library metadata. The 
Wikimedian-in-Residence will play a key role in enabling the Wikimedia 
community and the library community to enrich each other’s pool of knowledge.


Alex Stinson, GLAM-Wiki Strategist at the Wikimedia Foundation, explains the 
importance of this role: "We are seeing a growing body of interest among 
different library communities, both in North America, Europe, and other parts 
of the world in figuring out how libraries should be involved in using 
Wikidata. The LD4P project and this role provide a well-supported environment 
for creating experiments within the actual central work of research libraries, 
to figure out what works and doesn't and how technology can be improved to 
support such collaboration."


Read more and apply for the position here: 
https://careersearch.stanford.edu/jobs/wikimedian-in-residence-3711




Michelle Futornick

Linked Data for Production (LD4P) Program Manager

Stanford University

Lathrop Library

Stanford, CA 94305