Hi Kasun,

Synchronizing the available DBpedia dumps and Wikipedia page_edit_history 
datasets shouldn't be a problem at all. In the Wikipedia revision dumps, 
each page carries a sequence of revisions, each with a timestamp. You 
just extract everything you need from revisions with timestamps <= (i.e., 
no later than) the DBpedia last release date, and then your data is 
synchronized with the DBpedia dumps.
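
If it helps, here is a rough Python sketch of that filtering step. It's 
a minimal sketch, assuming a pages-meta-history XML dump; the namespace 
URI and the cutoff date below are placeholders you'd have to adjust to 
your dump's schema version and the actual DBpedia extraction date:

import xml.etree.ElementTree as ET

CUTOFF = "2012-08-01T00:00:00Z"  # placeholder: DBpedia release extraction date
NS = "{http://www.mediawiki.org/xml/export-0.8/}"  # placeholder: check your dump's xmlns

def revisions_before_cutoff(dump_path, cutoff=CUTOFF):
    """Yield (page_title, revision_id, timestamp) for revisions up to the cutoff."""
    title = None
    for event, elem in ET.iterparse(dump_path, events=("end",)):
        if elem.tag == NS + "title":
            title = elem.text
        elif elem.tag == NS + "revision":
            ts = elem.findtext(NS + "timestamp")
            rev_id = elem.findtext(NS + "id")
            # MediaWiki timestamps are ISO 8601 in UTC, so plain string
            # comparison orders them correctly
            if ts is not None and ts <= cutoff:
                yield title, rev_id, ts
        elif elem.tag == NS + "page":
            elem.clear()  # free finished pages; full-history dumps are huge

Streaming with iterparse (rather than loading the whole tree) matters 
here, since the full-history dumps run to hundreds of GB uncompressed.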

Best,
Volha


On 7/5/2013 2:17 PM, kasun perera wrote:
>
> Reason 2
> We need to be concerned about data freshness when dealing with tasks 
> related to knowledge representation. The latest DBpedia dumps (3.8) 
> are nearly one year old. This work also needs to deal with other 
> datasets such as Wikipedia page_edit_history, interlanguage links, 
> etc. So all the datasets need to be in sync with each other, i.e. 
> they must have the same dates. If I use the DBpedia dumps, there is 
> the problem of finding synchronized datasets.
>


