Did you see this?

https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/

On Wed, Jun 10, 2020, 12:51 Leandro Tabares Martín <
[email protected]> wrote:

> Dear all,
>
> I'm loading the whole Wikidata dataset into Blazegraph on a high-performance
> computer. I gave the job 120 GB of RAM and 3 processing cores. After almost
> 24 hours of loading, the "wikidata.jnl" file is only 28 GB. Initially the
> process was fast, but the loading speed has decreased as the file has grown.
> I notice that only 14 GB of RAM are actually being used. I have already
> implemented the recommendations given in
> https://github.com/blazegraph/database/wiki/IOOptimization. Do you have any
> other recommendations to increase the loading speed?
>
> Leandro
> _______________________________________________
> Wikidata-tech mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/wikidata-tech
>
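One thing worth checking in the quoted message: Blazegraph runs inside a JVM, and the JVM will not grow its heap past the `-Xmx` ceiling regardless of how much RAM the node has, which could explain why only 14 GB of the 120 GB are in use. Below is a minimal sketch of a bulk-load invocation with an enlarged heap; the jar name, properties file, and dump file are hypothetical placeholders, not taken from this thread, so adapt them to your setup.

```shell
# Sketch only: blazegraph.jar, fastload.properties and the dump path
# are placeholder names. -Xmx raises the JVM heap ceiling; without it,
# Blazegraph may use only a fraction of the machine's RAM.
java -server -Xmx100g \
     -cp blazegraph.jar \
     com.bigdata.rdf.store.DataLoader \
     fastload.properties \
     wikidata-dump.ttl.gz
```

Leaving some headroom below the physical 120 GB (rather than `-Xmx120g`) keeps memory free for the OS page cache, which the journal file benefits from during large loads.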
