2009/6/17 Francis Davey <[email protected]>:
> 2009/6/17 Tony Bowden <[email protected]>:
>> On Wed, Jun 17, 2009 at 5:23 PM,
>> CountCulture<[email protected]> wrote:
>>> The problem with a wikipedia model (apart from the usual ones -- bad
>>> edits etc) is that it doesn't give you structured data (hence dbpedia
>>> etc), and one of the goals of theyworkforyoulocal is to extract
>>> structured data and provide it in a unified interface, rather than the
>>> mishmash that's around at the moment.
>>
>> When I was doing this for Belfast City Council I used Semantic
>> MediaWiki to get around that problem. It's not especially
>> lay-friendly[1], but it's a pretty good way to get all the wiki
>> goodness whilst still also having structured data.
>
> It's very easy to parse anything structured in a MediaWiki (just a
> few regular expressions and a bit of code), so anything in a table or
> a template can be parsed even if there's nothing cleverer there.
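For anyone curious, here's a minimal sketch of that regex approach: pulling key/value pairs out of a MediaWiki template such as an infobox. The sample wikitext and field names below are made up for illustration, not taken from any real council page.

```python
import re

# Hypothetical infobox wikitext, as it might appear in a page's source.
wikitext = """{{Infobox council
| name        = Belfast City Council
| website     = http://www.belfastcity.gov.uk
| councillors = 51
}}"""

def parse_template_fields(text):
    """Extract '| key = value' pairs from a template body."""
    fields = {}
    # Each field sits on its own line: a pipe, a key, '=', then the value.
    for match in re.finditer(r'^\|\s*(\w+)\s*=\s*(.+?)\s*$', text, re.MULTILINE):
        fields[match.group(1)] = match.group(2)
    return fields

print(parse_template_fields(wikitext))
```

That's the whole trick for simple templates; nested templates or multi-line values would need a real parser, but for flat infobox-style data a few expressions like this go a long way.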

In fact, isn't dbpedia simply scraped & parsed straight off wikipedia,
and presented in a standardised format...?

It's an impressive project.  People are doing some nice things with
it.  I like this map-of-where-footballers-come-from example:
http://www.cems.uwe.ac.uk/xmlwiki/RDF/clubIndex.xq

It just goes to show what can be achieved by good curation of data,
even in a context where data integrity is not enforced at all within
the application.

Seb

_______________________________________________
Mailing list [email protected]
Archive, settings, or unsubscribe:
https://secure.mysociety.org/admin/lists/mailman/listinfo/developers-public