On Thu, Jun 11, 2020 at 11:13 AM, David Causse wrote:
>
> Hi,
>
> Did you "munge"[0] the dumps prior to loading them?
> As a comparison, loading the munged dump on a WMF production machine
> (128 GB RAM, 32 cores, SSD drives) takes around 8 days.
>
> 0: https://wikitech.wikimedia.org/wiki/Wikidata_query_service#Data_preparation
Hi,
Please find attached a picture of Blazegraph's performance during the load
of the Wikidata dataset. This is after increasing the resources assigned to
the job to 24 cores and 240 GB of RAM. Do you think this is normal behaviour?
Thanks,
Leandro
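For reference, the numbers behind such a picture can also be sampled
directly from Blazegraph while the load runs. A minimal sketch, assuming
the NanoSparqlServer defaults (port 9999, context path /bigdata) and the
"wdq" namespace created by the WDQS scripts:

    # Built-in status page with journal and query statistics
    curl -s http://localhost:9999/bigdata/status

    # Rough triple count via SPARQL (can be slow while a load is in progress)
    curl -s http://localhost:9999/bigdata/namespace/wdq/sparql \
         --data-urlencode 'query=SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }'

    # Watch the journal file grow
    watch -n 60 ls -lh wikidata.jnl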
On Thu, Jun 11, 2020 at 2:33 PM Leandro Tabares
It's generally advised to reply to the replies to your original mailing
list post, rather than creating a very similar post a few days later...
On Thu, 11 Jun 2020 at 13:33, Leandro Tabares Martín <
leandro.tabaresmar...@uhasselt.be> wrote:
> Hi,
>
> I have downloaded the precompiled Blazegraph
Hi,
I have downloaded the precompiled Blazegraph from [1]. I also applied the
optimizations described at [2].
For the loading process I'm following the instructions in the
"getting-started.md" file that ships in the "docs" folder of the compiled
distribution [1]. That means (a rough sketch of the full sequence follows
below):
1- Munge the
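The truncated list above corresponds roughly to the following sequence from
those docs. A sketch, assuming the stock scripts shipped with the service
distribution and a gzipped TTL dump in the working directory; exact script
names and flags may differ between releases:

    # 0. Start Blazegraph with the WDQS settings (in a separate shell/screen)
    ./runBlazegraph.sh

    # 1. Munge: rewrite the raw dump into load-ready chunk files
    ./munge.sh -f latest-all.ttl.gz -d data/split

    # 2. Load the munged chunks into the "wdq" namespace over the REST API
    ./loadRestAPI.sh -n wdq -d `pwd`/data/split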
Hi,
Did you "munge"[0] the dumps prior to loading them?
As a comparison, loading the munged dump on a WMF production machine
(128 GB RAM, 32 cores, SSD drives) takes around 8 days.
0:
https://wikitech.wikimedia.org/wiki/Wikidata_query_service#Data_preparation
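Worth noting: besides rewriting the dump into load-ready chunks, the munge
step can also filter out data you don't need, which shrinks both the load
time and the resulting journal. A sketch based on the page above — the -l
(keep only the listed label languages) and -s (skip sitelinks) flags are
described there:

    # Munge and filter: keep only English labels, drop sitelinks
    ./munge.sh -f data/latest-all.ttl.gz -d data/split -l en -s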
On Thu, Jun 11, 2020 at 12:37 AM Denny
Did you see this?
https://addshore.com/2019/10/your-own-wikidata-query-service-with-no-limits-part-1/
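That guide works from the full Wikidata TTL dump, the same input the munge
step expects. For completeness, it can be fetched like this (the file is
very large — well over 50 GB gzipped):

    # Full Wikidata dump in Turtle, gzipped
    wget https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.ttl.gz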
On Wed, Jun 10, 2020, 12:51 Leandro Tabares Martín <
leandro.tabaresmar...@uhasselt.be> wrote:
> Dear all,
>
> I'm loading the whole Wikidata dataset into Blazegraph using a
> high-performance
Dear all,
I'm loading the whole Wikidata dataset into Blazegraph using a
high-performance computer. I gave the job 120 GB of RAM and 3 processing
cores. After almost 24 hours of loading, the "wikidata.jnl" file is only
28 GB in size. Initially the process was fast, but as the file grew