On 10/4/07, Olivier Grisel <[EMAIL PROTECTED]> wrote:
>
> I guess you will need a proper RDBMS-based backend and then find a way to
> load the data in chunks, doing a database commit every N added triplets.


Any suggestions on an RDBMS backend that would be appropriate? I usually use
Postgres, but I don't see it listed in the rdflib API, so I'm assuming it's
not supported.  Would something like SQLite do the job?  I don't anticipate
heavy usage; just one user running one query at a time against the dbpedia
dataset.

> This can be a really long process, so having a ready-to-load,
> low-level-formatted store might help a lot.


Please elaborate - what's a "low level-formatted store"?  :-)
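For what it's worth, the commit-every-N-triplets idea from above can be sketched with nothing but Python's built-in sqlite3 module. This is just an illustration of batched commits, not rdflib's actual storage schema; the table layout and function name are made up for the example:

```python
import sqlite3

def load_triples(db_path, triples, batch_size=1000):
    """Insert (s, p, o) triples into a SQLite table, committing every batch_size rows.

    Batching the commits keeps the pending transaction small, which is the
    point of the "commit every N added triplets" suggestion for big dumps
    like the dbpedia dataset.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS triples "
        "(subject TEXT, predicate TEXT, object TEXT)"
    )
    count = 0
    for s, p, o in triples:
        conn.execute("INSERT INTO triples VALUES (?, ?, ?)", (s, p, o))
        count += 1
        if count % batch_size == 0:
            conn.commit()  # flush a full batch
    conn.commit()  # commit the final partial batch
    conn.close()
    return count
```

A real loader would stream the triples from an N-Triples file rather than hold them in memory, but the commit pattern would be the same.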

> I did not know about the DBpedia.org project: this is a really cool project.
> What would be a really nice killer app is a python-based natural language
> client to dbpedia (along with wordnet maybe). Using nltk-lite might help a
> lot there:
>
> http://nltk.sourceforge.net/index.php/Main_Page


Yes, something like that would really make a difference, as a non-programmer
would choke on SPARQL.

Thanks for the help and feedback!
_______________________________________________
Dev mailing list
Dev@rdflib.net
http://rdflib.net/mailman/listinfo/dev
