Re: [Wikidata-l] Classes of things in Wikidata (Was: OpenStreetMap + Wikidata for light houses)
On 11.03.2015 15:31, Magnus Manske wrote: I can offer https://tools.wmflabs.org/mix-n-match/ This (or a future successor) could serve as the list keeper for, say, lists of lighthouses or graded buildings in the UK. One thing that IMHO would be required for this to work would be a (semi-)automatic sync of the list keeper with authoritative external lists (unless we want to sync those to Wikidata directly).

Yes, that's a beautiful tool. This would definitely be a place to link to from wherever the Things on Wikidata are documented (I guess this would then be somewhere on-wiki).

Markus

On Wed, Mar 11, 2015 at 2:09 PM Markus Krötzsch mar...@semantic-mediawiki.org wrote: Hi Andrew, This is a great idea! It would help data consumers to know what to expect and community members to know what to put in (or where help with imports would be appreciated). Moreover, the discussion about this list would be a great way to structure our work in general (have documented discussions about our goals for certain types of data). I feel that the bot right approval process is not the best place to decide whether we strive to have all streets or all lighthouses in. For things that are not complete in Wikidata (yet or ever), it would further help to provide pointers to other, more complete data sources (and the properties we might have to link to them). The question is how best to organise this list. Your initial example setup already shows that this tends to become very diverse (not to say: chaotic). One could link this from the related class items (e.g., lighthouses or paintings), but having this as another extra load on the talk page would maybe not be so ideal either. After all, this could be one of the first things that newbies to Wikidata want to get an idea about. Cheers, Markus

On 11.03.2015 14:07, Andrew Gray wrote: ... I wonder if it would be useful to have a centralised list of classes of things in Wikidata.
For example:

Things entirely in Wikidata
* MEPs
* County-level administrative divisions of all countries
* All artworks by the following people (list)
* Cultural heritage sites in the following countries (list)
* All people listed in the following biographical databases (list)
* (etc)

Things not yet entirely in Wikidata (but probably will be eventually)
* All national-level elected representatives
* All species
* Lighthouses
* All artworks by the following people (list)
* Cultural heritage sites in the following countries (list)
* All people listed in the following biographical databases (list)

Things which will never be complete in Wikidata
* All local politicians
* Streets worldwide
* All businesses

This would be a very useful adjunct to the notability page, as it would give concrete examples to work from for the sort of things we feel are appropriate.

___ Wikidata-l mailing list Wikidata-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Re: [Wikidata-l] OpenStreetMap + Wikidata for light houses
On 11 Mar 2015, at 09:37, Magnus Manske magnusman...@googlemail.com wrote: On Wed, Mar 11, 2015 at 9:24 AM Markus Krötzsch mar...@semantic-mediawiki.org wrote: No, you are right: this is of course an issue in the completeness of our data. If you zoom in to Europe, you can see that some countries have coasts full of lighthouses, while others seem to lack them almost completely. I think it clearly shows that a lot of our data comes from Wikipedias (in some specific language).

In this instance, the issue appears to be that the existing lists on Wikipedia have not been touched, e.g.: https://en.wikipedia.org/wiki/List_of_lighthouses_in_Spain These, including redlinks, could be imported into Wikidata rather easily. Some already have images. Ideally, we'd want some official (e.g. national, UN) source to cross-check.

This is probably the best source: NGA List of Lights. The List of Lights, Radio Aids and Fog Signals is published in seven volumes, as Publication numbers 110 through 116. Each volume contains lights and other aids to navigation that are maintained by or under the authority of foreign governments.
http://msi.nga.mil/NGAPortal/MSI.portal?_nfpb=true&_st=&_pageLabel=msi_portal_page_62&pubCode=0007

The US also has a separate light list for US waters: http://msi.nga.mil/NGAPortal/MSI.portal?_nfpb=true&_pageLabel=msi_portal_page_62&pubCode=0014

For the US, NOAA has publicly accessible ENC marine charts which show 'lights': http://www.nauticalcharts.noaa.gov/ENCOnline/enconline.html

The US also has sailing guides: the regions are shown at http://msi.nga.mil/MSISiteContent/StaticFiles/Images/SDLIMITS.jpg and the PDFs are available from http://msi.nga.mil/NGAPortal/MSI.portal?_nfpb=true&_st=&_pageLabel=msi_portal_page_62&pubCode=0011

Also Sailing Directions (Enroute): http://msi.nga.mil/NGAPortal/MSI.portal?_nfpb=true&_st=&_pageLabel=msi_portal_page_62&pubCode=0010 Here is a 10MB example: http://msi.nga.mil/MSISiteContent/StaticFiles/NAV_PUBS/SD/Pub132/Pub132bk.pdf

There is also a crowdsourced project here: http://wikimapia.org/#lang=en&lat=38.315801&lon=-4.954834&z=7&m=b&tag=782

Also The Lighthouse Directory (University of North Carolina at Chapel Hill): http://www.unc.edu/~rowlett/lighthouse/
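The (semi-)automatic sync with authoritative external lists that Magnus suggested for the list keeper comes down to matching two name lists against each other. Here is a minimal sketch in Python using the standard library's difflib for fuzzy matching; the lighthouse names and the similarity threshold are invented for illustration, and a real mix'n'match-style sync would of course also compare coordinates and identifiers, not just names:

```python
from difflib import SequenceMatcher

def best_match(name, candidates, threshold=0.85):
    """Return (candidate, similarity) for the closest candidate name,
    or None if nothing clears the threshold."""
    scored = [(SequenceMatcher(None, name.lower(), c.lower()).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return (match, score) if score >= threshold else None

# Hypothetical sample data: entries from a Wikipedia list page and an
# authoritative external register (names chosen for illustration only).
wikipedia_list = ["Cabo Mayor Lighthouse", "Chipiona Lighthouse"]
external_register = ["Faro de Cabo Mayor",
                     "Lighthouse of Chipiona",
                     "Tower of Hercules"]

for entry in wikipedia_list:
    # A low threshold here, since the two sources word their names differently.
    print(entry, "->", best_match(entry, external_register, threshold=0.4))
```

Unmatched entries on either side would then be the candidates for import (external item missing from Wikidata) or review (Wikidata item missing from the authoritative list).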
Re: [Wikidata-l] OpenStreetMap + Wikidata
On 10.03.2015 at 16:32, Yaroslav M. Blanter wrote: Hi Amir, anything which can be remotely considered a tourist attraction, as well as shops, hotels, restaurants and such, is within the scope of Wikivoyage and thus of Wikidata. For streets, we now have an approved bot task adding all Dutch streets to Wikidata, and I do not see why any other country should be different - provided we have good sources.

I fear doing this is going to kill Wikidata. Neither the software nor the community scales to managing entries for every street in the world.

--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
[Wikidata-l] input for improving integration of Wikidata in watchlists needed
Hey folks :) Data quality and trust are what we're currently concentrating on in the development around Wikidata. A big part of that is improving the integration of Wikidata into the watchlists of Wikipedia and the other sister projects. I have just opened a page to collect input on how we can improve it. Please help me with spreading this to the Wikipedias and other sister projects. https://www.wikidata.org/wiki/Wikidata:Watchlist_integration_improvement_input

Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V. Tempelhofer Ufer 23-24 10963 Berlin www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V. Registered in the register of associations at the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Re: [Wikidata-l] Names, Aliases, Copyright (and a little OpenStreetMap)
Mechanically generated names by transliteration bots or by direct word-for-word translation (depending on the custom generally used in the target language) may well be appropriate in many cases

Remember that I'm talking about place names, rather than other types of names. Within that scope, I don't understand how a made-up name can be appropriate, and moreover I don't understand how a made-up name can be given equal footing with the correct name. Imagine a situation where someone transliterates a name into, say, French, but the French name for the place is different. How are we to distinguish between the two?

- Serge
Re: [Wikidata-l] Classes of things in Wikidata (Was: OpenStreetMap + Wikidata for light houses)
Thanks both. I'll try hacking out an aggregated list tonight and send the link around, but for now if there's anything you know for sure we cover comprehensively, fire away with your suggestions :-)

Mix-and-match was in the back of my head, as it has a nice example of all three groups:
* Already completely included - MEPs (at least until the next election), ODNB
* Going to be included but not there yet - Dictionary of Welsh Biography
* Will never be completely included - ACAD

Andrew.

On 11 March 2015 at 14:31, Magnus Manske magnusman...@googlemail.com wrote: ...

On Wed, Mar 11, 2015 at 2:09 PM Markus Krötzsch mar...@semantic-mediawiki.org wrote: ...

On 11.03.2015 14:07, Andrew Gray wrote: ...

--
- Andrew Gray andrew.g...@dunelm.org.uk
Re: [Wikidata-l] Classes of things in Wikidata (Was: OpenStreetMap + Wikidata for light houses)
Here's a very rough first attempt: https://www.wikidata.org/wiki/Wikidata:Comprehensive_groups_of_items

The main problem is that other than the areas I've been working in, I really don't know what's out there :-). Does the Wiki Loves Monuments work mean that we have complete monument coverage in certain countries, for example?

Andrew.

On 11 March 2015 at 15:01, Andrew Gray andrew.g...@dunelm.org.uk wrote: ...

--
- Andrew Gray andrew.g...@dunelm.org.uk
Re: [Wikidata-l] Names, Aliases, Copyright (and a little OpenStreetMap)
On 12.03.2015 at 10:03, Gerard Meijssen wrote: Hoi, What would you do with the many, many Chinese place names in Wikidata where we have nothing but Chinese? It is completely useless to me in this way. A good transliteration works for me. Like most people, beyond that I do not care much about it being official or sourced.

Decent automatic transliteration is fine, I think. Automatic word-for-word *translation*, however, seems rather problematic.

--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
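To make the transliteration-versus-translation distinction in this thread concrete: transliteration maps characters (or small character groups) through a fixed table, so it is deterministic and traceable, while word-for-word translation invents a new name that may not exist in the target language. A toy sketch in Python, with a hand-rolled Cyrillic-to-Latin table that covers only the sample word and follows no official romanization standard:

```python
# Hand-rolled character table (illustration only, not an official scheme).
# Some entries map one character to several, e.g. a digraph like "ch".
CYRILLIC_TO_LATIN = {
    "М": "M", "о": "o", "с": "s", "к": "k", "в": "v", "а": "a",
    "ч": "ch",
}

def transliterate(text):
    """Map each character through the table, passing unknown ones through."""
    return "".join(CYRILLIC_TO_LATIN.get(ch, ch) for ch in text)

print(transliterate("Москва"))  # prints Moskva
```

The point of the sketch is that the output is mechanically derived from the input, so it can always be regenerated or checked; a word-for-word translation has no such property, which is why it is the more problematic of the two for place names.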
Re: [Wikidata-l] Kian: The first neural network to serve Wikidata
Sure, tonight it will be done. Best

On Thu, Mar 12, 2015 at 2:08 AM, Sjoerd de Bruin sjoerddebr...@me.com wrote: I'm ready for it! All existing humans on nlwiki have a gender now, so it's easy to review this batch. Bring it on.

On 11 Mar 2015, at 22:14, Maarten Dammers maar...@mdammers.nl wrote: Hi Amir, Amir Ladsgroup wrote on 9-3-2015 at 22:40: Result for English Wikipedia (6366 articles classified as human) https://tools.wmflabs.org/dexbot/kian_res_en.txt Sounds like fun! Can you run it on the Dutch Wikipedia too? On https://tools.wmflabs.org/multichill/queries/wikidata/noclaims_nlwiki.txt I have a list of items without claims (linking them to other items). Maarten

--
Amir
Re: [Wikidata-l] [Dbpedia-discussion] [Dbpedia-developers] DBpedia-based RDF dumps for Wikidata
Your description sounds quite close to what we had in mind. The high-level group is manifesting quite well; the domain groups are planned as pilots for selected domains (e.g. Law or Mobility). I have lost the overview on the data classification a bit. We might auto-link or crowdsource; I would need to ask others, however. We are aiming to create a structure that allows stability and innovation in an economic way - I see this as the real challenge...

Jolly good show,
Sebastian

On 11 March 2015 20:53:55 CET, John Flynn jflyn...@verizon.net wrote: ...
Re: [Wikidata-l] [Dbpedia-discussion] [Dbpedia-developers] DBpedia-based RDF dumps for Wikidata
This is a very ambitious, but commendable, goal. To map all data on the web to the DBpedia ontology is a huge undertaking that will take many years of effort. However, if it can be accomplished, the potential payoff is also huge and could result in the realization of a true Semantic Web.

Just as with any very large and complex software development effort, there needs to be a structured approach to achieving the desired results. That structured approach probably involves a clear requirements analysis and resulting requirements documentation. It also requires a design document and an implementation document, as well as risk assessment and risk mitigation. While there is no bigger believer in the "build a little, test a little" rapid-prototyping approach to development, I don't think that is appropriate for a project of this size and complexity.

Also, the size and complexity suggest the final product will likely be beyond the scope of any individual to fully comprehend the overall ontological structure. Therefore, a reasonable approach might be to break the effort into smaller, comprehensible segments. Since this is a large ontology development effort, segmenting the ontology into domains of interest and creating working groups to focus on each domain might be a workable approach. There would also need to be a working group that focuses on the top levels of the ontology and monitors the domain working groups, to ensure overall compatibility, reduce the likelihood of duplicate or overlapping concepts in the upper levels of the ontology, and treat universal concepts such as space and time consistently.

There also needs to be a clear, and hopefully simple, approach to mapping data on the web to the DBpedia ontology that will accommodate both large data developers and web site developers. It would be wonderful to see the worldwide web community get behind such an initiative and make rapid progress in realizing this commendable goal.
However, just as special interests defeated the goal of having a universal software development approach (Ada), I fear the same sorts of special interests will likely result in a continuation of the current myriad development efforts. I understand the "one size doesn't fit all" arguments, but I also think "one size could fit a whole lot" could be the case here.

Respectfully,
John Flynn
http://semanticsimulations.com

From: Sebastian Hellmann [mailto:hellm...@informatik.uni-leipzig.de] Sent: Wednesday, March 11, 2015 3:12 AM To: Tom Morris; Dimitris Kontokostas Cc: Wikidata Discussion List; dbpedia-ontology; dbpedia-discuss...@lists.sourceforge.net; DBpedia-Developers Subject: Re: [Dbpedia-discussion] [Dbpedia-developers] DBpedia-based RDF dumps for Wikidata

Dear Tom, let me try to answer this question in a more general way. In the future, we are honestly considering mapping all data on the web to the DBpedia ontology (extending it where it makes sense). We hope that this will enable you to query many data sets on the Web using the same queries. As a convenience measure, we will get a huge download server that provides all data from a single point, in consistent formats and with consistent metadata, classified by the DBpedia Ontology. Wikidata is just one example; there are also Commons, Wiktionary (hopefully via DBnary), data from companies, DBpedia members and EU projects. all the best, Sebastian

On 11.03.2015 06:11, Tom Morris wrote: Dimitris, Soren, and DBpedia team, That sounds like an interesting project, but I got lost between the statement of intent, below, and the practical consequences: On Tue, Mar 10, 2015 at 5:05 PM, Dimitris Kontokostas kontokos...@informatik.uni-leipzig.de wrote: we made some different design choices and map wikidata data directly into the DBpedia ontology. What, from your point of view, is the practical consequence of these different design choices? How do the end results manifest themselves to the consumers?
Tom

___ Dbpedia-developers mailing list dbpedia-develop...@lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/dbpedia-developers

--
Sebastian Hellmann
AKSW/NLP2RDF research group
Institute for Applied Informatics (InfAI) and DBpedia Association
Events:
* Feb 9th, 2015 3rd DBpedia Community Meeting in Dublin http://wiki.dbpedia.org/meetings/Dublin2015
* May 29th, 2015 Submission deadline SEMANTiCS 2015
* Sept 15th-17th, 2015 SEMANTiCS 2015 (formerly i-SEMANTICS), Vienna
Re: [Wikidata-l] Names, Aliases, Copyright (and a little OpenStreetMap)
I agree that word-for-word translations are not appropriate for English. If there are languages which traditionally do use word-for-word names, then that might be appropriate for those languages.

On 12 Mar 2015 10:24, Daniel Kinzler daniel.kinz...@wikimedia.de wrote: ...
Re: [Wikidata-l] Names, Aliases, Copyright (and a little OpenStreetMap)
I completely forgot we already had the excellent transliteration gadget https://www.wikidata.org/wiki/MediaWiki:Gadget-SimpleTransliterate.js by Ebraminio https://www.wikidata.org/wiki/User:Ebraminio. Just made a rough patch https://www.wikidata.org/w/index.php?title=MediaWiki:Gadget-SimpleTransliterate.js&diff=20370&oldid=155995810 to make it work with the new UI. Enjoy!

On 12/03/2015 11:24, Daniel Kinzler wrote: ...
Re: [Wikidata-l] Names, Aliases, Copyright (and a little OpenStreetMap)
Hi Serge, The short answer to this is that the purpose of aliases in Wikidata is to help searching for items, and nothing more. Aliases may include nicknames that are in no way official, and abbreviations that are not valid if used in another context. Therefore, they seem to be a poor source of data to import into other projects. Wikidata has properties such as "birth name" (https://www.wikidata.org/wiki/Property:P1477) that are used to provide properly sourced multilingual text data for items. Cheers, Markus

On 10.03.2015 16:09, Serge Wroclawski wrote: Hi all, I'm a relative newcomer to Wikidata but a long-time OpenStreetMap contributor. Recently OpenStreetMap has had a situation where large numbers of translated names have been added to OSM objects. When asked about the origin of these names, I've been told a number of places, one of which is Wikidata. What appears to be happening, from what I've seen, is that a small number of users copy data from other (not so reliable) sources, put that data into Wikidata as aliases for place names, and then, when asked where the place names are from, say "Wikidata". The problem here is that many of these translations are simply made up. They're either transliterations or word-for-word translations, rather than genuine names in another language. Unfortunately, it's my understanding that Wikidata aliases can't be sourced (i.e. they can't be validated or invalidated like other facts). If this is the case, it's a problem for both our projects. I'd planned my own implementation of using Wikidata names for places in OSM to create custom renderings, but we need to be able to know that the place names are something we can trace back and source properly. - Serge
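Markus's distinction between aliases and sourced properties is visible directly in the structure of Wikidata's entity JSON: aliases are plain language-tagged strings with no slot for references, while a statement such as P1477 (birth name) carries a typed datavalue and a references array. A minimal sketch with an invented entity (the names are made up; the JSON shape follows the Wikibase data model, trimmed to the fields used here):

```python
# Invented example entity in (abbreviated) Wikibase entity-JSON shape.
entity = {
    "aliases": {
        "en": [{"language": "en", "value": "Bobby"}],  # no place for a source
    },
    "claims": {
        "P1477": [{  # "birth name", a monolingual-text statement
            "mainsnak": {
                "snaktype": "value",
                "datavalue": {
                    "type": "monolingualtext",
                    "value": {"text": "Robert Example", "language": "en"},
                },
            },
            "references": [{"snaks": {}}],  # sourcing attaches here
        }],
    },
}

def aliases_for(entity, lang):
    """Aliases are bare strings per language -- nothing to source."""
    return [a["value"] for a in entity.get("aliases", {}).get(lang, [])]

def birth_names(entity):
    """Return (text, language, is_referenced) for each P1477 statement."""
    out = []
    for claim in entity.get("claims", {}).get("P1477", []):
        snak = claim["mainsnak"]
        if snak["snaktype"] == "value":
            value = snak["datavalue"]["value"]
            out.append((value["text"], value["language"],
                        bool(claim.get("references"))))
    return out

print(aliases_for(entity, "en"))
print(birth_names(entity))
```

A consumer such as OSM could therefore import only statement values whose `references` list is non-empty, which is exactly the traceability that aliases cannot provide.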
Re: [Wikidata-l] OpenStreetMap + Wikidata
On 10.03.2015 at 16:54, Luca Martinelli wrote: 2015-03-10 16:28 GMT+01:00 Janko Mihelić jan...@gmail.com: What would this new Wikibase have that OpenStreetMap doesn't already have? The possibility of talking with WMF projects, as Wikidata talks with all the other projects...

Only if it's also hosted on the WMF cluster. Or we implement HTTP-based federation (planned, but a lot of work, and waaay down there on the prio list).

--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.