I think a key thing is to determine to what extent any definition of 
'completeness' actually represents 'quality'. As Peter says, making sure not 
just that metadata is present but also checking that it conforms to rules is a 
big step towards this. I would extend this to assessing the level of accuracy 
at which things have been set, for example dates (a rough range vs a precise 
day) and geotags (coordinates representing the centre of Paris vs the exact 
position a photograph was taken from). These sorts of things can make a big 
difference to both the discoverability and the practical reusability of 
records by end users.
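To make the date and geotag examples concrete, here is a minimal sketch of how one might grade the precision of those two fields. The record layout, function names, and thresholds are all invented for illustration; real records would of course need schema-specific handling.

```python
import re

def date_precision(value):
    """Return 'day', 'month', 'year', or 'unknown' for an ISO-style date string."""
    if re.fullmatch(r"\d{4}-\d{2}-\d{2}", value):
        return "day"
    if re.fullmatch(r"\d{4}-\d{2}", value):
        return "month"
    if re.fullmatch(r"\d{4}", value):
        return "year"
    return "unknown"

def coord_precision(lat, lon):
    """Count decimal places as a rough proxy for geotag precision.
    A city-centre tag like (48.85, 2.35) tends to have few decimals."""
    def places(x):
        s = str(x)
        return len(s.split(".")[1]) if "." in s else 0
    return min(places(lat), places(lon))

print(date_precision("1923-07-04"))      # day
print(date_precision("1923"))            # year
print(coord_precision(48.8566, 2.3522))  # 4
```

Decimal-place counting is crude (trailing zeros get stripped, and precise-looking coordinates can still be wrong), but even a rough grade like this lets you flag records whose metadata is technically present yet too coarse to be reusable.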

Best, James



________________________________________
From: Code for Libraries [CODE4LIB@LISTSERV.ND.EDU] on behalf of Esmé Cowles 
[escow...@ticklefish.org]
Sent: 06 May 2015 13:51
To: CODE4LIB@LISTSERV.ND.EDU
Subject: Re: [CODE4LIB] How to measure quality of a record

Sergio-

Mark Phillips has a related blog post outlining a system for scoring how 
complete a record is, which I think is an excellent place to start:

http://vphill.com/journal/post/4075
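A minimal sketch of what a field-presence completeness score might look like. The weights and field names below are invented for illustration and are not taken from the linked post:

```python
# Weighted fraction of non-empty fields, in [0.0, 1.0].
# Field names and weights are hypothetical, not any particular scheme.
WEIGHTS = {"title": 3, "creator": 2, "date": 2, "subject": 1, "rights": 1}

def completeness(record):
    """Score a record dict: sum of weights for non-empty fields / total weight."""
    total = sum(WEIGHTS.values())
    got = sum(w for field, w in WEIGHTS.items() if record.get(field))
    return got / total

record = {"title": "Paris street scene", "date": "1923", "subject": ""}
print(round(completeness(record), 2))  # 0.56
```

Averaging such scores across a collection gives a single number you can track over time, which is one way to get from "quality of a record" to "quality of a database".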

There was some discussion on Twitter recently about this, which you can look 
up under the #metadataquality hashtag: https://twitter.com/hashtag/metadataquality

I think there was a move to set up a mailing list for this topic or something 
like that, but I'm not sure where that stands now.

-Esme

> On 05/06/15, at 7:21 AM, Sergio Letuche <code4libus...@gmail.com> wrote:
>
> Hello community,
>
> is there a way, any statistical approach that you are aware of, that allows
> one to get an idea of how "complete" a record is? Or what actions do you
> take in order to get an idea of the quality of a record, and eventually of
> a whole database?
>
> Thank you in advance
