Hoi,
The short answer is no, this is not the way we go. When you want specific
languages added that have an ISO 639-3 code, you can request it. In
principle the language committee is in favour; however, it is not done
automatically. Wikidata is not a stamp collection.
Thanks,
      GerardM

On 6 October 2014 01:20, P. Blissenbach <pu...@web.de> wrote:

> Hoi Gerard,
>
> I believe there is a misconception here. "It only makes sense to add a
> language when there is a use case" comes with the presuppositions that a
> language needs to be added, and that it might be added without a use case.
>
> My suggestion is that a language would not be used without a use case. So
> unless someone enters a label in a language, that language is virtually not
> there (in WikiData), even though it might exist in the IANA registry of
> language subtags, and thus in ISO 639, and thus would be acceptable to WikiData.
>
> My other suggestion is that a language is "added" automatically when it
> exists in the IANA registry and someone enters label data for it. I
> doubt it makes sense to bother anyone, including the Language Committee, with
> a request to "add" a language that undoubtedly exists anyway and thus
> shall be added anyway. After all, we have some 8888+
> language/script/varieties left in the IANA registry that we neither
> support nor allow to be used in WikiData at the moment.
>
> Of course, you can view requests and permissions as a way to exercise
> some control over what is happening, but would it scale?
>
> Background info:
> Since I am capable of adding (some) labels in half a dozen more languages
> than I currently routinely do, and WikiData would not let me (producing
> some sort of error when I tried), I have already left almost half a dozen
> hints, error reports, etc. over time at various places. No remedy was
> visible to me. My current suggestion is an outcome of my corresponding
> question to the developers of Wikidata present at WikiCon 2014 in
> Cologne, and the hints and suggestions I got.
>
> In addition to the above, I have a growing collection of dictionary data
> files in various languages waiting to be uploaded semi-automatically, if I
> could.
>
> Purodha
>
>
> "Gerard Meijssen" <gerard.meijs...@gmail.com> writes:
>
> Hoi,
> Any ISO 639-3 language is admissible. HOWEVER, it only makes sense to add
> a language when there is a use case. When someone is interested in adding
> content in a particular language, the language committee is happy to allow
> for this. There is one proviso: when it becomes clear that content in a
> specific language is not representative of that language, all the content
> will be removed.
> Thanks,
>      GerardM
>
>
>
>
> On 5 October 2014 15:03, P. Blissenbach <pu...@web.de> wrote:
>
> Hi everyone,
>
> When entering labels in WikiData, any world language should be allowed.
>
> Technical language/script/variety marking for internet resources is
> currently defined in the IANA language subtag registry.
>
> Thus the above suggestion boils down to marking language selections for
> labels with a valid code as per the IANA language subtag registry, and
> allowing each tag to be used (referred to) by editors entering labels.
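[As an illustration of what "a valid code as per the IANA language subtag registry" means in practice, here is a minimal sketch of a well-formedness check for the basic language[-script][-region] tag shape. This is not part of the original proposal and covers only a simplified subset of BCP 47 (no variants, extensions, private-use, or grandfathered tags); actual validity would additionally require matching each subtag against the registry itself.]

```python
import re

# Simplified well-formedness check for tags of the shape
# language[-script][-region], e.g. "ksh", "de-DE", "sr-Latn-RS".
# This is a sketch only: full BCP 47 also allows variant, extension,
# and private-use subtags, and validity (as opposed to mere
# well-formedness) requires looking up each subtag in the IANA
# language subtag registry.
TAG_RE = re.compile(
    r"^(?P<language>[a-z]{2,8})"          # primary language subtag
    r"(-(?P<script>[A-Za-z]{4}))?"        # optional script subtag
    r"(-(?P<region>[A-Za-z]{2}|[0-9]{3}))?$"  # optional region subtag
)

def is_well_formed(tag: str) -> bool:
    """Return True if the tag matches the simplified shape above."""
    return TAG_RE.match(tag) is not None

print(is_well_formed("ksh"))         # True
print(is_well_formed("sr-Latn-RS"))  # True
print(is_well_formed("x-"))          # False
```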
>
> I created bug 71664 so as to overcome the current limitation.
>
> Purodha
>
> References:
> * http://www.iana.org/assignments/language-subtag-registry/language-subtag-registry
>   - IANA language subtag registry
> ** http://rishida.net/utils/subtags/
>   - Interactive query of the IANA language subtag registry
> * http://www.rfc-editor.org/rfc/bcp/bcp47.txt
>   - BCP 47 (Best Current Practice): Tags for Identifying Languages
> * http://www.w3.org/International/articles/language-tags/
>   - Language tags in HTML and XML (by the W3C)
> * https://bugzilla.wikimedia.org/show_bug.cgi?id=71664
>
> --
> (this e-mail probably scanned by NSA and GCHQ and others)
>
_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
