[Wikidata] Re: Can no longer login (with or without VPN)
Hi Thad,

Ah, I checked with and without a VPN. I see this without a VPN: https://i.imgur.com/GbspDOE.png (it includes the missing line), which links to Special:PasswordReset. However, that page is *not usable* from a blocked VPN, to prevent abuse.

Sidenote: Your second screenshot has your email address in the "username" field, which would definitely not work.

*For requesting an exception*, see the three options at https://meta.wikimedia.org/wiki/No_open_proxies#Global_exceptions_and_appeals

Hope that helps,
Quiddity

___
Wikidata mailing list -- wikidata@lists.wikimedia.org
Public archives at https://lists.wikimedia.org/hyperkitty/list/wikidata@lists.wikimedia.org/message/H3WDZMGFHNJU45KO4MBDZWRFEXS3V3ES/
To unsubscribe send an email to wikidata-le...@lists.wikimedia.org
Re: [Wikidata] Community resources needed for focus languages for lexicographical data and Abstract Wikipedia campaign
Hi KuboF,

Re: "how much involvement" - As you note, this would have to vary and scale depending on the community. Essentially, it means a few editors being willing to spend some time each week on tasks such as: testing out new features; discussing potential improvements or bugs with those features; discussing, explaining, and reading related topics with the rest of their local language community; helping to improve the central documentation; and of course contributing content to Lexemes and Wikifunctions. The intensity of these tasks would vary over time, depending on where the projects are in their development cycle. A main task would be communication with the wider community in the given language, and it is hard to predict how much time that would take. ContentTranslation started way back in 2014, so a direct comparison is complicated, but based on my fuzzy memory it is probably a fair one.

Also, thank you for creating an application for Esperanto already! [1]

Best,
Quiddity/Nick

[1] https://www.wikidata.org/wiki/Wikidata:Lexicographical_data/Focus_languages/Form/Esperanto

On Sun, Mar 7, 2021 at 9:12 AM Michal Matúšov wrote:
> Hi there.
>
> About the new development of the lexicographical data and Abstract
> Wikipedia [1]: Can you estimate how much involvement (time / effort /
> energy ...) is expected from the language community that will be selected
> for focus? The description mentions "particularly active feedback
> channels", but different language communities can interpret that
> differently :) E.g. can you compare it to the situation for the first
> languages that received ContentTranslation?
>
> In the Esperanto community we are discussing a possible application, but
> the topic of needed resources is important for us.
>
> Thanks!
>
> [1] https://www.wikidata.org/wiki/Wikidata:Lexicographical_data/Focus_languages
>
> KuboF Hromoslav

--
Nick "Quiddity" Wilson (he/him)
Community Relations Specialist
Wikimedia Foundation

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
Re: [Wikidata] Coordinate precision in Wikidata, RDF & query service
On Tue, Aug 29, 2017 at 2:13 PM, Stas Malyshev wrote:
> [...] Would four decimals
> after the dot be enough? According to [4] this is what a commercial GPS
> device can provide. If not, why, and which accuracy would be appropriate?

I think that should be 5 decimals for commercial GPS, per that link? It also suggests that "The sixth decimal place is worth up to 0.11 m: you can use this for laying out structures in detail, for designing landscapes, building roads. It should be more than good enough for tracking movements of glaciers and rivers. This can be achieved by taking painstaking measures with GPS, such as differentially corrected GPS."

Do we hope to store datasets around glacier movement? It seems possible. (We don't currently seem to: https://www.wikidata.org/wiki/Q770424 )

I skimmed a few search results and found 7 (or 15) decimals given in one standard, but the details are beyond my understanding:
http://resources.esri.com/help/9.3/arcgisengine/java/gp_toolref/geoprocessing_environments/about_coverage_precision.htm
https://stackoverflow.com/questions/1947481/how-many-significant-digits-should-i-store-in-my-database-for-a-gps-coordinate
https://stackoverflow.com/questions/7167604/how-accurately-should-i-store-latitude-and-longitude

> [4] https://gis.stackexchange.com/questions/8650/measuring-accuracy-of-latitude-and-longitude

--
Nick Wilson (Quiddity)
volunteer hat
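As a rough sanity check on those figures: one degree of latitude spans about 111.32 km anywhere on Earth, so each extra decimal place divides the ground resolution by ten. A minimal sketch (the constant and function names here are my own, purely for illustration):

```python
# One degree of latitude is roughly 111.32 km; each additional decimal
# place in a coordinate divides the ground resolution by ten.
# (Longitude resolution shrinks further with cos(latitude).)
METERS_PER_DEGREE_LAT = 111_320  # approximate

def decimal_place_resolution_m(decimals: int) -> float:
    """Approximate ground distance represented by the n-th decimal place."""
    return METERS_PER_DEGREE_LAT / 10 ** decimals

# 5 decimals ~ 1.1 m (commercial GPS); 6 decimals ~ 0.11 m, matching the quote.
for d in (4, 5, 6, 7):
    print(f"{d} decimals ~ {decimal_place_resolution_m(d):.4f} m")
```

This reproduces both numbers in the thread: the fifth decimal place is about a metre (roughly commercial GPS accuracy), and the sixth is about 0.11 m.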
Re: [Wikidata] Tool for consuming left-over data from import
On Fri, Aug 4, 2017 at 10:57 AM, André Costa wrote:
> Hi all!
>
> As part of the Connected Open Heritage project, Wikimedia Sverige has been
> migrating Wiki Loves Monuments datasets from Wikipedias to Wikidata.
>
> In the course of doing this we keep a note of the data which we fail to
> migrate. For each of these left-over bits we know which item and which
> property it belongs to, as well as the source field and language from the
> Wikipedia list. An example would be a "type of building" field where
> we could not match the text to an item on Wikidata but know that the target
> property is P31.
>
> We have created dumps of these (such as
> https://tools.wmflabs.org/coh/_total_se-ship_new.json - don't worry, this
> one is tiny) but are now looking for an easy way for users to consume them.
>
> Does anyone know of a tool which could do this today? The Wikidata Game
> only allows (AFAIK) for yes/no/skip, whereas here you would want something
> like /invalid/skip. And if not, are there any tools which, with a bit
> of forking, could be made to do it?

(IANADeveloper, but) I believe Wikidata Game might handle this? E.g. the "Date" game has fields for dates:
http://storage8.static.itmages.com/i/17/0807/h_1502126752_6195952_63d5e0e3da.png
http://storage5.static.itmages.com/i/17/0807/h_1502126720_7252323_c4174b3da6.png
https://tools.wmflabs.org/wikidata-game/#mode=no_date

I forget whether any of the Distributed Games have similar functionality (and I have no time to check now). Hope that helps!

> We have only published a few dumps but there are more to come. I would also
> imagine that this, or a similar, format could be useful for other
> imports/template harvests where some fields are more easily handled by
> humans.
>
> Any thoughts and suggestions are welcome.
> Cheers,
> André
>
> André Costa | Senior Developer, Wikimedia Sverige | andre.co...@wikimedia.se
> | +46 (0)733-964574
>
> Support free knowledge - become a member of Wikimedia Sverige.
> Read more at blimedlem.wikimedia.se

--
Nick Wilson (Quiddity)
Community Liaison, Wikimedia Foundation
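For anyone prototyping a consumer for such dumps, a first step is simply grouping the left-over entries by their target property so each batch can be routed to an appropriate review game. A rough sketch - note that the field names below (`item`, `property`, `text`, `lang`) are my own guess at the dump schema, not the actual structure of `_total_se-ship_new.json`:

```python
# Hypothetical reader for the left-over dumps described above. The entry
# field names are an assumption for illustration: each entry is taken to
# record the item, the target property, and the unmatched source text
# with its language.

def group_by_property(entries):
    """Group left-over entries by their target property (e.g. 'P31')."""
    grouped = {}
    for entry in entries:
        grouped.setdefault(entry["property"], []).append(entry)
    return grouped

sample = [
    {"item": "Q123", "property": "P31", "text": "ångfartyg", "lang": "sv"},
    {"item": "Q456", "property": "P31", "text": "skonert", "lang": "sv"},
    {"item": "Q789", "property": "P17", "text": "Sverige", "lang": "sv"},
]
groups = group_by_property(sample)  # {'P31': [...2 entries...], 'P17': [...]}
```

Each group could then be presented to a reviewer as "does this text match item X for property Y?", which is roughly the match/invalid/skip flow the thread asks about.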
Re: [Wikidata] Label gaps on Wikidata
On Mon, Feb 20, 2017 at 9:59 PM, Smolenski Nikola wrote:
> Citiranje "Nick Wilson (Quiddity)" :
>> 2) Translation
>> I also agree that a machine-translation /suggestion/ or /hint/ would be a
>> nice option. The main concern is users who don't understand the limitations
>> of machine translation and who must resist the urge to just copy&paste.
>
> It should be possible, perhaps even preferable, to show translations of the
> most common descriptions, done on translatewiki. Thus all the descriptions
> like "Wikipedia disambiguation page", "Wikimedia category" etc. could be
> visible in all languages.

I think this (good) example is for a slightly different feature, which means there are 2 distinct feature requests:

1) For unique item descriptions (the main focus of this mailing list thread), we want to find a way to "suggest" descriptions to editors, based on machine translations of existing descriptions in other languages.

1a) This could be a new task in Phabricator? (per discussion in this thread)

1b) (Probably a very long-term goal?) This could also perhaps be https://phabricator.wikimedia.org/T64695 "Draft a computer-assisted translation system for Wikidata labels/descriptions", which discusses the scaling problems and suggests that we might EVENTUALLY want semi-automated description updates, at least in some items, similar to how Reasonator works.

I suspect it would be best to keep those 2 ideas separate, hence I suggest filing a new task for (1a).

2) A way for generic description translations to be automatically added to some items.

2a) For very common, Wikimedia-focused descriptions, this seems to be /periodically/ handled by bots. E.g. for disambiguation items, it looks like User:MilanBot currently handles this task, for example:
* https://www.wikidata.org/w/index.php?title=Q260478&action=history
* https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/MilanBot
E.g. for category items, it looks like ValterVBot currently handles this task, for example:
* https://www.wikidata.org/w/index.php?title=Q6939670&diff=198113824&oldid=197219107
* https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/ValterVBot

This task, https://phabricator.wikimedia.org/T139912 , seems to track the idea of properly automating it all, and it links to an onwiki discussion that has many more details. I don't understand the technical discussions, or the current state of development, well enough to even attempt a summary.

2b) For other common descriptions, these translations all seem to be manually added? E.g. for items with the description "scientific journal article" or "scientific article":
* https://www.wikidata.org/wiki/Q28510879 and https://www.wikidata.org/wiki/Q28579322 and https://www.wikidata.org/wiki/Q28298612 and, I think, thousands more?

However, these are probably not a best practice that we want to encourage, per https://www.wikidata.org/wiki/Help:Description and because some of the descriptions in other languages are more precise (e.g. "vedecký článok (publikovaný 2009-01)" ). Therefore, this (2b) cluster probably belongs more with the (1a/1b) set of feature requests, and should not be mass-replicated across Wikidata.

I hope that's mostly accurate...
Quiddity
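To make (2a) concrete, the core of what such a bot does can be sketched as: given an item's entity JSON (the shape returned by the Wikibase `wbgetentities` API) and a table of translations for one generic description, report which languages still lack it. The translation strings below are illustrative placeholders, not the exact translatewiki/bot wordings:

```python
# Illustrative translation table for one generic description. The non-English
# strings here are assumptions for the sketch, not the canonical wordings.
GENERIC_DESCRIPTIONS = {
    "en": "Wikimedia disambiguation page",
    "fr": "page d'homonymie de Wikimedia",
    "de": "Wikimedia-Begriffsklärungsseite",
}

def missing_generic_descriptions(entity: dict) -> dict:
    """Return {lang: description} pairs the entity does not have yet.

    `entity` is assumed to have the wbgetentities shape, i.e.
    entity["descriptions"] maps language codes to {"language": ..., "value": ...}.
    """
    existing = entity.get("descriptions", {})
    return {
        lang: text
        for lang, text in GENERIC_DESCRIPTIONS.items()
        if lang not in existing
    }

item = {"descriptions": {"en": {"language": "en",
                                "value": "Wikimedia disambiguation page"}}}
todo = missing_generic_descriptions(item)  # languages still to fill: fr, de
```

A real bot like MilanBot or ValterVBot presumably adds safeguards (only touching items that are instances of the right class, not overwriting existing descriptions), which is exactly why the existence check above only ever fills gaps.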
Re: [Wikidata] Label gaps on Wikidata
1) Gap:
I do agree it would be good to promote these backlogs, as two of the easiest ones for newcomers to work on. (Although there are guidelines and best practices, and any backlog promotion should clearly point to those documentation pages, so that newcomers have a ready reference.)

2) Translation:
I also agree that a machine-translation /suggestion/ or /hint/ would be a nice option. The main concern is users who don't understand the limitations of machine translation and who must resist the urge to just copy&paste. (This goes for both language fluency and technical-vocabulary fluency; e.g. I could not give a confident description of most chemistry or physics articles, even with numerous machine-translation-based suggestions or the article itself!)

I can't see anything specifically about this in Phabricator, so it's probably worth filing a feature request, unless someone else points out a task I missed, or raises an overwhelming concern. [Note: a semi-related task to link in the SeeAlso of the new one: T71345]

3) Tools:
Is it currently possible to get a list of items without a label/description in language X? I tried a few weeks ago, and the onwiki Special pages were broken. I filed https://phabricator.wikimedia.org/T157884 "Nothing loads on Special:EntitiesWithoutDescription or Special:EntitiesWithoutLabel results" to cover this problem.

Ah, I now see https://tools.wmflabs.org/wikidata-terminator/? which works for missing descriptions. However, the "with missing labels" set of links seems to be broken for most languages. Sjoerd filed https://bitbucket.org/magnusmanske/wikidata-todo/issues/45/terminator-top-1000-linked-items-with and I've added some example links.

The other set of links that are listed are all outdated ( https://www.wikidata.org/wiki/Wikidata:WikiProject_Labels_and_descriptions#List_of_items_without_labels_and.2For_descriptions and below).

I wonder if we should add a link to https://tools.wmflabs.org/wikidata-game/distributed/#game=23 ("Kaspar's Persondata game: Descriptions") in that list? AFAIK it only contains English suggestions, though.

Are there any other tools which help with listing or processing these particular backlogs?

Quiddity
(Volunteer hat. This is just the address I use to subscribe to this list.)
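As an alternative to the Special pages and Terminator, such lists can also be pulled from the Wikidata Query Service with a small SPARQL query. A rough sketch, wrapped in Python only for illustration - the choice of class (humans, Q5) and the limit are placeholder examples, and the resulting query string is meant to be run at https://query.wikidata.org/ or through any SPARQL client:

```python
# Build a SPARQL query for items of a given class that have no label in a
# given language. Class QID and limit are illustrative defaults.
def missing_label_query(lang: str, class_qid: str = "Q5", limit: int = 100) -> str:
    return (
        "SELECT ?item WHERE {\n"
        f"  ?item wdt:P31 wd:{class_qid} .\n"
        "  FILTER NOT EXISTS {\n"
        "    ?item rdfs:label ?label .\n"
        f'    FILTER(LANG(?label) = "{lang}")\n'
        "  }\n"
        f"}} LIMIT {limit}"
    )

query = missing_label_query("eo")  # e.g. humans without an Esperanto label
```

Swapping `rdfs:label` for `schema:description` gives the equivalent missing-descriptions list, which covers the two backlogs discussed in this thread.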