On 5/25/11 3:43 PM, Fabrizio Orlandi wrote:
> 7 billion triples in the resulting dataset, after running the provenance
> extraction process on the full Wikipedia dump, can be a problem indeed.
> However, there are some working implementations of triplestores serving
> more than 10 billion triples [2]. In addition, we should consider advances
> in RDF storage, with new triple stores that may handle such amounts of
> data, and clustering architectures for doing so.

The LOD Cloud cache at <http://lod.openlinksw.com> has 23 billion triples 
and counting. It exists to showcase what's achievable today re. the state 
of the art in the big-data realm.
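For illustration, a minimal sketch of the kind of query such demos run against a public SPARQL endpoint like the cache instance above (the query itself is my example, not from the post; counting all triples this way may time out on very large stores):

```sparql
# Hypothetical example: count the triples held by a SPARQL endpoint.
# ?s ?p ?o matches every subject/predicate/object triple in the store.
SELECT (COUNT(*) AS ?triples)
WHERE { ?s ?p ?o }
```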

Links:

1. http://www.delicious.com/kidehen/linked_data_demo -- various query 
demos that leverage the LOD Cloud cache instance.

-- 

Regards,

Kingsley Idehen 
President & CEO
OpenLink Software
Web: http://www.openlinksw.com
Weblog: http://www.openlinksw.com/blog/~kidehen
Twitter/Identi.ca: kidehen






_______________________________________________
Dbpedia-discussion mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/dbpedia-discussion
