On Tue, Jun 4, 2019 at 12:18 PM Adam Sanchez <[email protected]> wrote:
>
> Hello,
>
> Does somebody know the minimal hardware requirements (disk size and
> RAM) for loading wikidata dump in Blazegraph?

The actual hardware requirements will depend on your use case, but for
comparison, our production servers are:

* 16 cores (hyper-threaded, 32 threads)
* 128G RAM
* 1.5T of SSD storage

> The downloaded dump file wikidata-20190513-all-BETA.ttl is 379G.
> The bigdata.jnl file which stores all the triples data in Blazegraph
> is 478G but still growing.
> I had a 1T disk, but it is almost full now.

The current size of our jnl file in production is ~670G.
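If you want to keep an eye on the journal while a dump is loading, one simple approach is to sample the file size periodically. A minimal sketch (the path below is hypothetical; point JNL at your actual bigdata.jnl location):

```shell
#!/bin/sh
# JNL is an assumed default path; override it for your own install,
# e.g. JNL=/data/bigdata.jnl sh sample-jnl.sh
JNL="${JNL:-/srv/blazegraph/bigdata.jnl}"

# Print one timestamped size sample, e.g. "2019-06-04T12:18:00Z 478G".
# Run from cron or a shell loop to watch the journal grow over time.
printf '%s %s\n' "$(date -u '+%FT%TZ')" "$(du -h "$JNL" | cut -f1)"
```

Running it in a loop such as `while sleep 600; do sh sample-jnl.sh; done` prints a size sample every 10 minutes, which makes it easy to spot when you are about to run out of disk.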

Hope that helps!

    Guillaume

> Thanks,
>
> Adam
>
> _______________________________________________
> Wikidata mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/wikidata



-- 
Guillaume Lederrey
Engineering Manager, Search Platform
Wikimedia Foundation
UTC+2 / CEST
