Hi,

I am using PostgreSQL 12.7 on Ubuntu as a data warehouse, which is then queried by Qlik Sense to produce business analytics.

One of my dataloaders, which runs multiple queries, sometimes takes about 3 hours to feed Qlik with the relevant records (about 10M of them), but sometimes goes crazy and times out (Qlik stops it when it takes more than 480 minutes).

The point is that Qlik uses a CURSOR to retrieve the data. I'm not familiar with cursors, and the PostgreSQL documentation mainly cites functions as the use case. I don't really know how Qlik creates these cursors when executing my queries...
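For what it's worth, my rough guess at what Qlik does under the hood is something like the following (the cursor name is the one from my log below; the table name is just a placeholder, and I haven't confirmed the actual statements it sends):

BEGIN;
DECLARE "SQL_CUR4" CURSOR FOR
    SELECT * FROM my_big_table;  -- placeholder for one of my dataloader queries
FETCH 100000 IN "SQL_CUR4";      -- repeated until no more rows come back
CLOSE "SQL_CUR4";
COMMIT;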

I tried log_min_duration_statement to pinpoint the problem, but it only shows things like this:

...
LOG:  duration: 294774.600 ms  statement: fetch 100000 in "SQL_CUR4"
LOG:  duration: 282867.279 ms  statement: fetch 100000 in "SQL_CUR4"
...

So I don't know exactly which of my queries is hiding behind "SQL_CUR4"...
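In case it matters, this is roughly how I enabled the duration logging (the threshold here is just an example value, not necessarily what I run in production):

ALTER SYSTEM SET log_min_duration_statement = '60s';  -- log anything slower than 60 seconds
SELECT pg_reload_conf();                              -- pick up the change without a restart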

Is there a way to log the actual query?
Is using a CURSOR a best practice for retrieving big datasets? (It seems Qlik uses one for every connection to PostgreSQL.) Does each FETCH re-run the query, or is the result somehow cached (on disk?)?

Thanks for any insight on cursors and/or Qlik queries on PostgreSQL!

Best regards,
Franck

