Hi Calle
Depending on the configuration of the events (number of org units, data
elements, etc.), yes, event import can still feel a bit slow. We made some
updates in 2.25/2.26, but as you are already on 2.26 those didn't help your
case.
Yes. When we did the metadata importer, we managed to speed it up.
David,
I'm importing into 2.26, PostgreSQL is tuned as far as possible (the test
was on my laptop), and I have 12 GB RAM. I was using CSV, but there should
not be any significant difference between JSON and CSV.
I will import the 2.5 million events directly and see how long that takes,
comparatively.
Hi Calle
When some conditions are met, the event import can be really speedy
(imports of ~100,000 events in under 30 minutes):
- Recent version (2.25+)
- PostgreSQL properly tuned (this is important)
- Enough RAM (8 GB+)
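The "properly tuned" condition above might look something like the snippet
below; these values are illustrative for a 12 GB machine, not settings from
this thread, so adjust them to your own hardware:

```
# Illustrative postgresql.conf values for bulk imports on a ~12 GB machine
shared_buffers = 3GB                 # roughly 25% of RAM
work_mem = 20MB
maintenance_work_mem = 512MB
effective_cache_size = 8GB
checkpoint_completion_target = 0.8
wal_buffers = 16MB
synchronous_commit = off             # faster bulk loads; risks losing the
                                     # most recent transactions on a crash
```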
I usually use JSON files and post them against /api/events.
David
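A minimal sketch of the kind of JSON payload David describes posting to
/api/events. All UIDs and values below are illustrative placeholders, not
taken from this thread; replace them with identifiers from your own
instance:

```python
import json

# Hypothetical event payload for POST /api/events (DHIS2 Web API).
# Every UID here (program, orgUnit, dataElement) is a placeholder.
payload = {
    "events": [
        {
            "program": "eBAyeGv0exc",       # placeholder program UID
            "orgUnit": "DiszpKrYNg8",       # placeholder org unit UID
            "eventDate": "2017-06-28",
            "status": "COMPLETED",
            "dataValues": [
                {"dataElement": "qrur9Dvnyt5", "value": "22"},
            ],
        }
    ]
}

# Write the payload to a file for posting
with open("events.json", "w") as f:
    json.dump(payload, f, indent=2)
```

A file like this could then be posted with, for example,
`curl -u user:pass -H "Content-Type: application/json" -d @events.json
https://your-server/api/events` (server URL and credentials are
placeholders).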
On Fri, Jun 30,
Ime,
I was using the UI; I have not looked at the API endpoint.
For now I will use SQL scripts as I've done before. I'm just trying to
find out if anybody is working on improving what is (now) an import
function that in reality only caters for small data sets. There is no
question about the core
Hi Calle,
Thanks for your question.
Just curious (I don't have an answer), as we are planning to do this soon.
Are you doing this through UI or API?
If API, which endpoint enables this?
Thanks
Ime
On Jun 28, 2017 19:15, "Calle Hedberg" wrote:
Hi
I started importing around 700,000 events 2 days ago (about 5 million
individual values), and the import is still running, 48 hours later.
The import is slowing down: it seemed to be importing around 7-8 values per
second, but is now down to 2-3 per second. At that rate the import might
take around 200 hours.