re: Languages "urn:lang:xxx:" or "urn:lang:xx:"

I'm not sure what the official UN languages are.  The US Library of Congress is 
the registration authority for ISO 639 (Part 1: two-letter codes; Part 2: 
three-letter codes; Part 5: three-letter codes for language groups).  They run 
an ID server that responds to URIs built from the standard and the code.  The 
ISO 639-5 groups have their "local" components identified by LCSH (Library of 
Congress Subject Headings).  For your purposes, it would be best to assign 
these local languages to the large block of 520 user-defined codes (qaa 
through qtz) on an ad hoc basis.  This will enable you to create small, custom 
triple-stores (for validating forms in the field) and merge the records 
afterwards.  In this way you can always be sure you are using the most current 
version of the standards, and navigation issues for field work are minimized.
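To make the ad hoc approach concrete, here is a minimal Python sketch of the lookup described above.  It assumes the identifier pattern from the subject line ("urn:lang:xx:" / "urn:lang:xxx:") and uses hypothetical local-language assignments drawn from ISO 639-2's user-defined block (qaa-qtz); the specific names and assignments below are illustrative, not registered codes.

```python
# Mint urn:lang: identifiers, preferring standard ISO 639 codes and
# falling back to ad hoc codes from the user-defined block (qaa-qtz).

# A few standard ISO 639-1 codes (illustrative subset).
ISO_639_1 = {"english": "en", "french": "fr", "albanian": "sq"}

# Hypothetical ad hoc assignments for local languages, chosen in the
# field and reconciled when the custom triple-stores are merged later.
LOCAL_CODES = {"example-dialect-a": "qaa", "example-dialect-b": "qab"}

def lang_urn(name: str) -> str:
    """Return a urn:lang: identifier for a language name."""
    key = name.lower()
    if key in ISO_639_1:
        return f"urn:lang:{ISO_639_1[key]}:"
    if key in LOCAL_CODES:
        return f"urn:lang:{LOCAL_CODES[key]}:"
    raise KeyError(f"no code assigned for {name!r}")

print(lang_urn("English"))            # urn:lang:en:
print(lang_urn("example-dialect-a"))  # urn:lang:qaa:
```

Because the local assignments live in a small table of their own, each field team can validate forms against its own copy and the tables can be merged (and collisions in the qaa-qtz block renumbered) afterwards.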


re: Nations, Subdivisions and Currency "urn:lex:xx;sub.div;sub.div:"

Brazil uses "urn:lex:" syntax identifiers [1,2] for legislation and 
jurisdiction.  This is a way to generate partial identifiers for geography, 
again without having to memorize any code system.  As with the language codes, 
an ad hoc system of urn:lex: identifiers works much better than reliance on 
existing codes.  Kosovo, for example, does not have an ISO 3166 country code, 
but a user-defined country code, once chosen, can nonetheless identify the 
seven regions of UNMIK.
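A short sketch of composing such partial identifiers, following the "urn:lex:xx;sub.div;sub.div:" pattern in the heading above.  The "xk" country code for Kosovo and the region name are assumptions for illustration only; any user-defined code would work the same way.

```python
# Compose partial urn:lex: jurisdiction identifiers: subdivisions are
# separated by ";", words within a subdivision joined by ".", and the
# trailing ":" marks the identifier as partial (more levels may follow).

def lex_urn(country: str, *subdivisions: str) -> str:
    """Build a partial urn:lex: geographic identifier."""
    parts = [country.lower()]
    parts += [s.lower().replace(" ", ".") for s in subdivisions]
    return f"urn:lex:{';'.join(parts)}:"

print(lex_urn("xk"))             # urn:lex:xk:
print(lex_urn("xk", "pristina")) # urn:lex:xk;pristina:
```

The point is that "xk" never needs to appear in ISO 3166: it only has to be unique within your own system, and records minted against it can be re-mapped if an official code is ever assigned.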

I'll prepare two "how to" examples: 1) Kosovo, for the reasons mentioned 
above, and 2) the Holy See, because websites in the Queen's English are a dime 
a dozen, but websites in Emperor Constantine's Latin are a bit more rare [3].



[1] http://www.lexml.gov.br/

[2] http://tools.ietf.org/html/draft-spinosa-urn-lex-04

[3] Don't try this at home, kids, I'm Jesuit Educated.  s/Jesuit Educated/going 
to hell, anyway/g;


________________________________
 From: Carsten Keßler <carsten.kess...@uni-muenster.de>
To: public-lod@w3.org 
Cc: Chad Hendrix <hend...@un.org> 
Sent: Wednesday, February 22, 2012 11:10 AM
Subject: Re: Metadata about single triples
 
Hi Gannon,

> I agree with the comments below.  I think what Bob is suggesting is that you
> include a non-ontology Core (DCMI for example).

I am not sure whether I get that correctly, but it is pretty clear to
us that we need something like this. We are already using some of the
DCMI properties, and we'll need at least those codes for the official
UN languages and the local languages of the regions that we are
reporting on. However, my question was more about the 'how' – how can
we technically attach these different kinds of metadata in a way that
works in practice?

I absolutely agree that governance and the establishment of processes (or
bureaucracy, as you call it) are a major aspect of making HXL a success.

Carsten
