Hi everybody,

I just came across DBpedia and it looks like a really awesome project.
I noticed that the latest dump from the English Wikipedia is quite old
(January 2008; by comparison, Freebase does a new release every three
months), so I am interested in building a more up-to-date DBpedia from
a recent download of the Wikipedia pages-articles dump (which I have).
Is this possible?

I am most interested in knowing which Wikipedia pages describe people
or companies, and which are disambiguation pages. If Wikipedia also has
URLs for a photo of the person (or the company logo), those would be
good to have too.

I checked out the SVN repository from SourceForge, but I'm afraid I am
a little lost, as I was unable to find any documentation. I suspect
that the entry point is the extraction/extract.php file, but I have no
idea where to put the bz2 Wikipedia dump file.
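In case it helps clarify what I am trying to do, here is roughly the
workflow I am guessing at (the dump filename is hypothetical, and
extract.php may well expect the still-compressed dump or a path set in
some config file instead):

```shell
# Guessed workflow -- the dump filename below is made up, and the real
# entry point / input location may be defined in the framework's config.
bunzip2 -k enwiki-pages-articles.xml.bz2   # -k keeps the original .bz2
php extraction/extract.php                 # suspected entry point (unverified)
```

If someone could point me at the actual expected layout, that would be
great.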

Additionally, I noticed that the PersonData preview link is broken.
This is disappointing, as it is the dataset I am most interested in
(so I had to download the full dataset instead). Is there a reason why
it is only created from the German data?

Shug

_______________________________________________
Dbpedia-discussion mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/dbpedia-discussion
