Paul Houle wrote:
> Kingsley Idehen wrote:
>>
>> Note, DBpedia-Live has been in hot staging for a few months now. 
>> Thus, apropos Peter, if you fix Wikipedia, the Linked Data View in 
>> DBpedia will be fixed too, especially with the Live Edition in play.
>>
>> We are syncing Live with Wikipedia now that the 3.5.1 cut is out.
>>
>    I'll look into what it takes to fix wikipedia.
>
>    Methodologically, however, the last kind of query that I want to 
> run over DBpedia Live (or Freebase) is a query like "give me a list 
> of all cities"; that kind of query almost certainly overruns the 
> result limit of either the SPARQL or MQL implementation.

Build a Data Window using OFFSET and LIMIT.
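
For example (a minimal sketch using the SPARQLWrapper Python library 
against the public endpoint; the page size and class URI are just 
illustrative choices):

    # Page through "all cities" with a LIMIT/OFFSET data window.
    from SPARQLWrapper import SPARQLWrapper, JSON

    PAGE_SIZE = 1000
    endpoint = SPARQLWrapper("http://dbpedia.org/sparql")
    endpoint.setReturnFormat(JSON)

    offset = 0
    while True:
        endpoint.setQuery("""
            SELECT ?city WHERE {
                ?city a <http://dbpedia.org/ontology/City> .
            }
            ORDER BY ?city
            LIMIT %d OFFSET %d
        """ % (PAGE_SIZE, offset))
        bindings = endpoint.query().convert()["results"]["bindings"]
        if not bindings:
            break  # ran off the end of the result set
        for row in bindings:
            print(row["city"]["value"])
        offset += PAGE_SIZE

The ORDER BY is what keeps the window stable between requests; without 
it the endpoint is free to hand back rows in a different order each time.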
>
>    If I end up rebuilding 'Isidore', my framework for combing through 
> lots of records in generic databases, it's going to be built around a 
> 'taxonomic core' that keeps track of identifying information for 
> objects and of the major classes that organize them. That's something 
> that I like building from a general dump, because I feel more 
> confident that I didn't run into 'data holes' hidden by an API.
Of course :-)
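
For what it's worth, pulling a taxonomic core out of a dump can start 
as simply as filtering the rdf:type triples. A rough sketch in Python, 
assuming an N-Triples file along the lines of the ones DBpedia 
publishes (the filename is a placeholder):

    # Build a map from each object to its classes by keeping only the
    # rdf:type triples in an N-Triples dump. Real dumps may need a
    # proper N-Triples parser; naive whitespace splitting is enough
    # for the common <s> <p> <o> . case.
    from collections import defaultdict

    RDF_TYPE = "<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>"
    classes_of = defaultdict(set)

    with open("instance_types_en.nt") as dump:
        for line in dump:
            parts = line.split(None, 2)
            if len(parts) == 3 and parts[1] == RDF_TYPE:
                subject, _, obj = parts
                classes_of[subject].add(obj.rstrip(" .\n"))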
>
>    Practically, a complete taxonomic core for DBpedia and Freebase is 
> reasonably small and easy to handle. Once I've got it, I can (in 
> principle) identify classes of objects that I'm interested in and then 
> use the SPARQL or MQL APIs to fill in details about particular ones.
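
Filling in the details for one such object is then a small, targeted 
query. A sketch (the resource URI is just an example):

    # Fetch all properties of a single resource picked out of the core.
    from SPARQLWrapper import SPARQLWrapper, JSON

    endpoint = SPARQLWrapper("http://dbpedia.org/sparql")
    endpoint.setReturnFormat(JSON)
    endpoint.setQuery("""
        SELECT ?p ?o WHERE {
            <http://dbpedia.org/resource/New_York_City> ?p ?o .
        }
    """)
    for row in endpoint.query().convert()["results"]["bindings"]:
        print(row["p"]["value"], row["o"]["value"])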
>
>    For instance, for ny-pictures, I found about 10,000 'things' 
> related to New York City and loaded those into a system that's pretty 
> heavyweight and expensive. I found both DBpedia Live and live access 
> to Freebase very useful for filling in information that was missing 
> from the DBpedia dump that I started ny-pictures from; in particular, 
> a lot of geographical coordinates have been added in the last few 
> months.
>
>    Along these lines, I really like the topic dumps that Metaweb 
> publishes; these are a good 'map' of Freebase, which makes it possible 
> to selectively pick out stuff to look at with MQL, without having to 
> figure out how to interpret those big FB dump files.

Yes. 
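
For the Freebase side, an MQL read along these lines can pull the 
instances of a type once the topic dump has pointed you at it. A rough 
sketch; the endpoint and query envelope follow the mqlread service as 
I remember it, so treat those details as assumptions:

    # Ask mqlread for instances of one type, a few properties each.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    mql = [{"type": "/location/citytown", "id": None, "name": None,
            "limit": 100}]
    url = ("http://api.freebase.com/api/service/mqlread?"
           + urlencode({"query": json.dumps({"query": mql})}))
    response = json.loads(urlopen(url).read())
    for topic in response["result"]:
        print(topic["id"], topic["name"])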

Would be nice if the end product made its way back into the LOD cloud.

Hopefully, when we are set with RDF delta shipping (which basically 
revives the DDE of yore, but for Linked Data, via PubSubHubbub), 
issues like this will be easier to handle.



-- 

Regards,

Kingsley Idehen       
President & CEO 
OpenLink Software     
Web: http://www.openlinksw.com
Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca: kidehen 





