On Thu, 6 Dec 2018 at 19:05, Bo Victor Thomsen <[email protected]> wrote:
>
> Hi list -
>
> I've done some experiments with a dataset consisting of 440000 rows and
> uploaded this to two database servers: Postgres and SQLServer. Both tables
> have indexes on the primary key and the spatial column.
>
> I then connected to both tables in QGIS. SQL Server is 3 times slower
> in retrieving the dataset than Postgres in QGIS!
>
It's probably the extra validity checks which were added.

SQL Server itself is broken by design when it comes to spatial data handling: if it encounters an invalid geometry, it will silently abort the request and you'll be missing features from the layer. But there's *no way* for QGIS to detect when this occurs! Accordingly, QGIS takes the "safer is better" approach and forces a validity check and a make-valid step as part of the queries sent to SQL Server. This avoids the potentially missing features, but comes at a large cost.

If you're 100% sure that your tables have no invalid geometries (and never will have any!), you *can* switch this check off. But be warned... if you ever introduce invalid geometries into your tables, you'll get data loss. The setting is under the SQL Server connection's properties -- "skip invalid geometry handling".

Let me know if this helps at all,

Nyall

_______________________________________________
QGIS-Developer mailing list
[email protected]
List info: https://lists.osgeo.org/mailman/listinfo/qgis-developer
Unsubscribe: https://lists.osgeo.org/mailman/listinfo/qgis-developer
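The tradeoff described above can be sketched in plain Python. This is a hypothetical simulation, not the actual QGIS provider code: the feature data, the `valid` flag, and both function names are made up to illustrate why silently aborting on an invalid geometry loses data, while a forced repair step returns everything at extra cost.

```python
def fetch_unsafe(features):
    """Mimics SQL Server's behaviour: hitting an invalid geometry
    silently aborts the request, and the remaining features are lost."""
    out = []
    for f in features:
        if not f["valid"]:
            break  # request aborts here; no error is reported to the client
        out.append(f["id"])
    return out


def fetch_safe(features):
    """Mimics QGIS's default: repair each invalid geometry first
    (a stand-in for the make-valid step added to the query)."""
    out = []
    for f in features:
        if not f["valid"]:
            f = {**f, "valid": True}  # hypothetical repair step
        out.append(f["id"])
    return out


features = [
    {"id": 1, "valid": True},
    {"id": 2, "valid": False},  # one bad geometry in the table
    {"id": 3, "valid": True},
]

print(fetch_unsafe(features))  # [1] -- silent data loss
print(fetch_safe(features))    # [1, 2, 3] -- complete, but slower
```

The repair step here is trivial, but in the real queries it means extra per-row work on the server, which is consistent with the slowdown reported above.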
