
Load Huge Features From SQL

I have 8M geometries in a SQL Server table, and I'm sure a spatial index exists for the table.
We know the issue appears at the outermost zoom levels: ThinkGeo finds the intersection of the features with the CurrentExtent, so it has to load all of the features, I mean all 8M…
The one idea that came to my mind is to load only part of the features using the WhereClause in SQLServerFeatureSource, so I change the WhereClause based on the ZoomLevel.
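As a rough sketch of what I mean (the `priority` column and table name here are hypothetical, just to illustrate the zoom-dependent predicates):

```sql
-- Hypothetical sketch: "priority" is an assumed column that ranks features,
-- so outer zoom levels only touch a small fraction of the 8M rows.

-- Far-out zoom levels: WhereClause = "priority <= 1"
SELECT id, geom FROM dbo.Features WHERE priority <= 1;

-- Mid zoom levels: WhereClause = "priority <= 3"
SELECT id, geom FROM dbo.Features WHERE priority <= 3;

-- Close-in zoom levels: no WhereClause, the extent is small enough on its own
SELECT id, geom FROM dbo.Features;
```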

But at this point, I cannot reach the performance I expect. Do you have any ideas to improve my implementation?

I measured the query time: it takes 8 seconds on average for each tile, and we have 4 tiles in the CurrentExtent that contain features.

I think GetFeatures and the drawing are both executed on the same thread.

Hey @Mahdi_Rastegari,

If the actual SQL queries are taking 8 seconds to complete, you will probably want to improve your query performance by using filtered indexes that match up with the WhereClause that you are using in your SQLServerFeatureSource.
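For example, a filtered index whose predicate matches the zoom-dependent WhereClause lets SQL Server read only the small matching subset instead of all 8M rows (the table and column names below are assumptions, not from your schema):

```sql
-- Assumed schema: dbo.Features(id, priority, geom).
-- The WHERE predicate of the filtered index must match the WhereClause
-- used by the feature source, so the optimizer can substitute the small
-- filtered index for a scan of the full table.
CREATE NONCLUSTERED INDEX IX_Features_LowZoom
    ON dbo.Features (id)
    INCLUDE (geom)          -- cover the geometry so no key lookup is needed
    WHERE priority <= 1;    -- same predicate as the low-zoom WhereClause
```

You would create one such index per distinct WhereClause predicate that your zoom levels use.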

We actually did this when building our ThinkGeo Cloud maps with 255 million rows in our polygon table and we were able to speed up the queries from 30 seconds to sub-second. The main difference is that we are using PostgreSQL instead of SQL Server, but it should apply all the same.

You may also want to make sure that you are spatially clustering your indexes, as that improved query performance for us as well, though the gain was not nearly as large as with filtered indexes.
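In PostgreSQL, which is what we used, spatially clustering the table means physically reordering its rows to match a GiST index so that features that are close in space are also close on disk. A minimal sketch, with illustrative names:

```sql
-- PostgreSQL sketch (illustrative table/index names):
-- build a GiST index on the geometry column, then rewrite the table
-- in that index's order so nearby features share disk pages.
CREATE INDEX features_geom_gist ON features USING GIST (geom);
CLUSTER features USING features_geom_gist;
ANALYZE features;  -- refresh planner statistics after the rewrite
```

Note that CLUSTER is a one-time rewrite; rows inserted afterwards are not kept in spatial order, so it needs to be re-run periodically on tables with heavy writes.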



Thanks for sharing your information. If I execute the raw query (I mean without any WhereClause) that includes only the STIntersects, it takes around 8 seconds, so I think the bottleneck is getting the spatial data for a specific bounding box. Don't you think PostgreSQL has better performance than SQL Server at analysing spatial data?

Hey @Mahdi_Rastegari,

If it’s taking 8 seconds on a normal STIntersects query, then I see 3 potential issues in your table:

  1. Your table is not spatially clustered
  2. The intersecting bounding box is large enough that SQL Server decides using the spatial index is not worth it (i.e. the bounding box covers something like 70% of all 8M rows). Take a look at SQL Server’s performance tools to see how the query is being executed and confirm that the spatial index is actually being used.
  3. The database server needs to be configured for processing large amounts of data and that is wildly dependent on your server and dataset.
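One way to check point 2 is to run the query with the actual execution plan enabled, then run it again forcing the spatial index with a hint and compare timings (the table name, index name, and bounding box below are assumptions for illustration):

```sql
-- Assumed names: dbo.Features with a spatial index SIX_Features_geom.
DECLARE @bbox geometry = geometry::STGeomFromText(
    'POLYGON((-98 29, -96 29, -96 31, -98 31, -98 29))', 4326);

-- Let the optimizer choose, then inspect the actual execution plan:
SELECT COUNT(*)
FROM dbo.Features
WHERE geom.STIntersects(@bbox) = 1;

-- Force the spatial index; if this is much faster, the optimizer was
-- skipping the spatial index for the large bounding box.
SELECT COUNT(*)
FROM dbo.Features WITH (INDEX(SIX_Features_geom))
WHERE geom.STIntersects(@bbox) = 1;
```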

I would also generally agree that PostgreSQL is probably going to handle spatial data a little bit better than SQL Server due to the maturity of the PostGIS plugin. And it has been proven to be performant for large datasets like OpenStreetMap.