Working with a rather large dataset (16M time-based points), I have trouble handling it in QGIS, using either the PostGIS provider or the WFS provider.

As the data is properly indexed, retrieving a selected area is fine.

BUT as soon as you try to use the attribute table, the identify tool, or for example TimeManager, QGIS stalls.

I think(!) this is because both providers apparently request all the data again from their data source, sometimes even ignoring the current extent (WFS).
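
To illustrate the kind of request I would expect those tools to issue instead: a sketch from the Python console, restricted to the current canvas extent (QGIS 3.x API assumed, and the PostGIS URI below is just a placeholder):

# Run in the QGIS Python console; the connection details are placeholders.
from qgis.core import QgsVectorLayer, QgsFeatureRequest
from qgis.utils import iface

uri = ("dbname='gis' host=localhost port=5432 user='gis' "
       "key='id' table=\"public\".\"points\" (geom)")
layer = QgsVectorLayer(uri, "big_points", "postgres")

# Ask the provider only for features intersecting the current canvas extent,
# instead of iterating over all 16M rows.
request = QgsFeatureRequest().setFilterRect(iface.mapCanvas().extent())

count = 0
for feature in layer.getFeatures(request):
    count += 1
print(count, "features in the current canvas extent")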

It seems more appropriate to 'just use what you already have': the data provider/QGIS already has all the features (both geometry and attributes, in the WFS case for example), so WHY should it request all the data again (data freshness?)?

Are others having this problem too?

Could adding a 'caching option' to QgsDataProvider be a reasonable/viable option?

- if 'data caching' is enabled, the data is retrieved only once (per extent, probably), so showing the attribute table, identify results, or using TimeManager/queries should be relatively easy/fast, as the features are cached locally (or in memory)
- as soon as you want 'fresh' data, you temporarily disable the caching and new data is requested (as it apparently does now)
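
There already seems to be a building block for this in the API: QgsVectorLayerCache. A rough sketch of the behaviour I have in mind (QGIS 3.x API assumed, cache size arbitrary):

from qgis.core import QgsFeatureRequest, QgsVectorLayerCache
from qgis.utils import iface

layer = iface.activeLayer()   # the big PostGIS/WFS layer

# Client-side cache of up to 500000 features; the number is just an example.
cache = QgsVectorLayerCache(layer, 500000)
cache.setCacheGeometry(True)

# First pass goes to the provider...
extent = iface.mapCanvas().extent()
first = list(cache.getFeatures(QgsFeatureRequest().setFilterRect(extent)))

# ...a repeated request for the same area could then be answered from the
# cache instead of going back to PostGIS/WFS (depending on what the cache
# already covers).
again = list(cache.getFeatures(QgsFeatureRequest().setFilterRect(extent)))
print(len(first), len(again))

But of course that only helps once the attribute table / identify / TimeManager code paths actually go through such a cache instead of straight back to the provider.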

Or am I missing something (I see there is an option for attribute table caching...)?
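
As a workaround for now I could of course copy the features for the current extent into a memory layer once and point the tools at that; something like the following sketch (geometry type, field handling and QGIS 3.x API are assumptions):

from qgis.core import QgsFeatureRequest, QgsProject, QgsVectorLayer
from qgis.utils import iface

src = iface.activeLayer()   # the big remote layer

# Memory layer with the same fields and CRS; Point geometry assumed here.
mem = QgsVectorLayer("Point?crs=" + src.crs().authid(),
                     "cached_extract", "memory")
mem.dataProvider().addAttributes(src.fields().toList())
mem.updateFields()

# Pull the current extent from the remote provider once...
request = QgsFeatureRequest().setFilterRect(iface.mapCanvas().extent())
mem.dataProvider().addFeatures(list(src.getFeatures(request)))
mem.updateExtents()

# ...and let the attribute table / identify / TimeManager work on the local copy.
QgsProject.instance().addMapLayer(mem)

But that is more of a manual detour than a real fix at the provider level.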

Not sure about other countries, but here in NL governmental institutes are serving huge datasets more and more, either as WMS/WCS/WFS services or, increasingly, as REST/GeoJSON services.

Does anybody have some insights/ideas about this?

Regards,

Richard Duivenvoorde
