The prototype I’m working on must be able to deal with a moderately large number of layers (30 to 100 or so), mostly loaded from shape files. Our external data provides for the concept of display thresholds, making the related layers visible only within the designated scale range. I won’t know the final full extent until after all the shape files (and others, but mainly the shape files) are loaded. After everything is loaded, I take the full extent and reset the 20 default zoom levels so that they are appropriate for the desired experience. As I load the layers, I create the various styles and, as demonstrated in the samples, bind them to ZoomLevel01 and apply them via ApplyUntilZoomLevel through 20. As the scale changes, I set IsVisible on the layers based on our thresholds.
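To make the current setup concrete, here is a rough sketch of the per-layer loading code (a sketch only; the style call follows the samples, and the `layerInfos`/`MinScale`/`MaxScale` metadata names are placeholders rather than our actual code):

```csharp
// Sketch of the current approach: each layer gets its style bound at
// ZoomLevel01 and applied through Level20, with visibility toggled per
// layer as the scale changes. Metadata names here are hypothetical.
ShapeFileFeatureLayer layer = new ShapeFileFeatureLayer(info.ShapeFilePath);
layer.ZoomLevelSet.ZoomLevel01.DefaultPointStyle =
    PointStyles.CreateSimpleCircleStyle(GeoColor.StandardColors.Red, 6);
layer.ZoomLevelSet.ZoomLevel01.ApplyUntilZoomLevel = ApplyUntilZoomLevel.Level20;

// Called whenever the map's current scale changes.
void ApplyThresholds(double currentScale)
{
    foreach (LayerInfo pair in layerInfos)  // layerInfos: hypothetical metadata list
    {
        pair.Layer.IsVisible =
            currentScale <= pair.MaxScale && currentScale >= pair.MinScale;
    }
}
```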
Unfortunately, particularly with the PointStyle layers having lots of points, performance quickly becomes unacceptable when zooming/panning (I do have a memory cache attached). I’ve found that if I limit the point styles to avoid the most zoomed-out levels, those levels perform MUCH better. So I’m thinking I need to rethink how I load the styles. Perhaps a better way would be to wait until all layers are loaded to determine the full extent, set the zoom level scales as appropriate, and only THEN go back and set the styles only on the levels where the threshold allows them to be visible. That would eliminate the code looping through to show/hide layers on each scale change (though eliminating this alone didn’t improve things noticeably), and also provide the benefits I saw when hard-coding the avoidance of point styles on the lower-numbered zoom levels.
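What I have in mind would look something like this (a sketch; I believe the ZoomLevelSet exposes a `GetZoomLevels()` enumeration, and the threshold fields are placeholders from our metadata):

```csharp
// Sketch of the proposed change: after the map's 20 zoom level scales are
// reset from the final full extent, apply each layer's style only to the
// levels whose scale falls inside that layer's visibility threshold.
foreach (ZoomLevel level in layer.ZoomLevelSet.GetZoomLevels())
{
    if (level.Scale <= info.MaxScale && level.Scale >= info.MinScale)
    {
        // Style only the levels where the layer should be visible; the
        // remaining levels get no style, so nothing is drawn there.
        level.DefaultPointStyle =
            PointStyles.CreateSimpleCircleStyle(GeoColor.StandardColors.Red, 6);
    }
}
```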
So, is that the right path? Or is there something else I should be looking at to improve zoom/pan performance? The cache makes it really fast once you have already “been there”. It’s the initial pan/zoom that I need to improve in a very substantial way.
Performance tuning with a large number of shape layers
Hi Russ,
I’m sorry, I’m a little confused by your second paragraph, about how you plan to handle the slow point styles.
We do have some experience with improving render speed, though, so I’d like to share it and hope it’s helpful.
1. If the cache works for you, you can try our cache generator to pregenerate the cache: thinkgeo.com/forums/MapSuite/tabid/143/aft/10446/Default.aspx
2. Generally, if one layer shows too many shapes at a single zoom level and becomes very slow, that layer needs to be optimized. For point data you can do something like this:
a. Use a ClusterPointStyle at the more zoomed-out levels, and switch to the original point style as the user zooms in.
b. Examine the data and build separate index files for subsets of it by type, then render them at different zoom levels. For example, with a streets dataset we can group the roads into highways and local roads, and build a different index file for each group over the same shape file. At the zoomed-out levels we render only the highways; after zooming in we render all the roads. This ensures that at any given zoom level the current extent won’t contain too much data, so the speed stays acceptable.
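For suggestion a, a rough sketch of what I mean (the exact ClusterPointStyle constructor and members may differ between versions, so please check against your assembly):

```csharp
// Cluster the points at the zoomed-out levels (01-10), then fall back to
// the plain point style once the user zooms in (11-20).
ClusterPointStyle clusterStyle = new ClusterPointStyle(
    PointStyles.CreateSimpleCircleStyle(GeoColor.StandardColors.Red, 10));
pointLayer.ZoomLevelSet.ZoomLevel01.CustomStyles.Add(clusterStyle);
pointLayer.ZoomLevelSet.ZoomLevel01.ApplyUntilZoomLevel = ApplyUntilZoomLevel.Level10;

pointLayer.ZoomLevelSet.ZoomLevel11.DefaultPointStyle =
    PointStyles.CreateSimpleCircleStyle(GeoColor.StandardColors.Red, 6);
pointLayer.ZoomLevelSet.ZoomLevel11.ApplyUntilZoomLevel = ApplyUntilZoomLevel.Level20;
```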
I hope that’s helpful for your scenario.
Regards,
Don
Thank you.
The way our shape data is organized, your suggestions under #2 do not seem to apply. Basically, each shapefile/layer represents a different type of “thing”. If the user configures our metadata to indicate that a layer is visible for a scale range, then it is always styled the same within that range, and not visible when the scale is outside it. So ClusterPointStyle and multiple indexes don’t really help.
I’ve looked into #1 a bit further. Our shape files are different for each customer and may be updated over time, so we would have to generate the cache dynamically as the need was determined. I downloaded the cache generator sample, looked at how it works, and tried to implement dynamic generation in our prototype: basically, if the cache folder is empty (or out of date), run the generation for our layers. But with 20 zoom levels, that would be a LOT of tiles to generate. Still, I tried using only the first 10 level scales. This did not produce what I expected. Using scales from 219938 to 12385 (10, evenly spaced) and the “full extent” from the map (adjusted to an ideal fit after loading the shape files), I got only one folder, for 164954.239237129, which is our initial view at zoom level 2. I suspect that folder was simply created by the FileBitmapTileCache attached to the shape layer overlay (in place of the InMemoryBitmapTileCache), and that the cache pre-gen did not actually work. I’ve run out of time for the day and may fool with this some more tomorrow, but just from running the prototype through zoom/pan, the tremendous number of files generated makes this less than ideal, I think.
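For reference, the dynamic-cache wiring I tried is essentially this (the folder path, cache id, and the staleness-check/pregeneration helpers are placeholders; the pregeneration loop itself followed the downloaded sample):

```csharp
// Attach a file-based tile cache to the shape layer overlay; if the cache
// folder is missing or stale, run the pregeneration pass (per the cache
// generator sample) for our layers before the first draw.
LayerOverlay shapeOverlay = new LayerOverlay();
shapeOverlay.TileCache = new FileBitmapTileCache(@"C:\MapCache", "customerLayers");

if (!Directory.Exists(@"C:\MapCache") || CacheIsOutOfDate(@"C:\MapCache"))
{
    PregenerateTiles(shapeOverlay);  // hypothetical helper wrapping the sample's loop
}
```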
My other idea also did not pan out. My initial approach was to set styles (based on our product metadata) for each layer across all zoom levels and to control threshold visibility for each layer explicitly by changing Layer.IsVisible in response to the scale changing. In an earlier test, I thought I had noticed that when I did not set styles for the lower zoom levels (for instance, setting styles only for ZoomLevel15-20), performance improved substantially. So I changed my code, which initially loaded the shape files into layers and set styles from 01-20 as it went. Now it loads all shape files into layers, determines the full extent, and sets all map zoom levels beginning with the full extent and zooming 25% per level. Only after that is done do I look at each layer’s thresholds, map them to the adjusted zoom levels, and set the styles only on the zoom levels where needed. Unfortunately, this also did not provide any noticeable improvement, though I’ll need to look at it a bit closer to see whether it’s working the way I think it is.
Another question: if I manipulate the map’s zoom level set, how does that relate to the layers and their zoom level sets? As described, I’m adjusting the scales and setting the names AFTER the layers are loaded, so that I can find the total extent and set the zoom levels as desired. I then go back and load each layer’s required styles, assigned to the zoom level range indicated by our metadata visibility threshold (scale range). But I noticed that each layer’s zoom levels remained the same as when the layer was created, even though I never modified them (which I thought meant they would use the common set from the map?). So now I have to modify each layer’s zoom level set to match the map’s zoom level scales while also setting the styles on the zoom levels. Is this how it’s supposed to work, or am I missing something?
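Concretely, what I’ve ended up doing for each layer is roughly this (a sketch only; `info` is our metadata record and `styleFromMetadata` is a placeholder for the style we build from it):

```csharp
// Rebuild each layer's zoom levels from the map's adjusted scales, since
// each layer kept its original default scales, and attach styles only on
// the levels allowed by the metadata threshold.
layer.ZoomLevelSet.CustomZoomLevels.Clear();
foreach (ZoomLevel mapLevel in winformsMap1.ZoomLevelSet.GetZoomLevels())
{
    ZoomLevel layerLevel = new ZoomLevel(mapLevel.Scale);
    if (mapLevel.Scale <= info.MaxScale && mapLevel.Scale >= info.MinScale)
    {
        layerLevel.DefaultPointStyle = styleFromMetadata; // placeholder
    }
    layer.ZoomLevelSet.CustomZoomLevels.Add(layerLevel);
}
```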
Hi Russ,
It looks like in most cases your data needs to be rendered at run time. For shape files, we already avoid loading unnecessary data when the extent changes, thanks to the index file, so it may be hard to improve much further on that side. If we cannot generate a cache ahead of time, what we can do is make sure there aren’t too many shapes in any one extent at the same time. For example, if we have data for all the roads in the USA but render it at the world level, the program will be very slow or appear to hang. So we need to avoid rendering very large datasets at the most zoomed-out levels; any solution needs to be built around this point.
By the way, do you think it’s possible to split the big data into multiple smaller files?
Regards,
Don
Thanks again, that’s pretty much the same conclusion I was coming to. And splitting the data differently isn’t likely to be practical given the ecosystem that produces and uses it.
Also, any comment on the last question about the zoom levels?
Hi Russ,
I don’t know the details of your implementation, but I think you should add the zoom levels to the map’s CustomZoomLevels, and then set the styles on your layer’s CustomZoomLevels.
I did a test and found that when we modify the CustomZoomLevels and then modify the layer’s styles, it still works.
Please see my test code below and let me know if I misunderstood your question.
private void SelectFeatures_Load(object sender, EventArgs e)
{
    winformsMap1.MapUnit = GeographyUnit.DecimalDegree;
    winformsMap1.BackgroundOverlay.BackgroundBrush = new GeoSolidBrush(GeoColor.GeographicColors.ShallowOcean);

    winformsMap1.ZoomLevelSet.CustomZoomLevels.Add(winformsMap1.ZoomLevelSet.ZoomLevel01);
    winformsMap1.ZoomLevelSet.CustomZoomLevels.Add(winformsMap1.ZoomLevelSet.ZoomLevel02);

    ShapeFileFeatureLayer worldLayer = new ShapeFileFeatureLayer(@"…\SampleData\Data\Countries02.shp");
    // Share the map's zoom level set with the layer, then style its custom levels.
    worldLayer.ZoomLevelSet = winformsMap1.ZoomLevelSet;
    worldLayer.ZoomLevelSet.CustomZoomLevels[0].DefaultAreaStyle = AreaStyles.CreateSimpleAreaStyle(GeoColor.SimpleColors.Transparent, GeoColor.FromArgb(100, GeoColor.SimpleColors.Green));
    worldLayer.ZoomLevelSet.CustomZoomLevels[1].DefaultAreaStyle = AreaStyles.CreateSimpleAreaStyle(GeoColor.SimpleColors.Transparent, GeoColor.FromArgb(100, GeoColor.SimpleColors.Blue));

    LayerOverlay staticOverlay = new LayerOverlay();
    staticOverlay.Layers.Add("WorldLayer", worldLayer);
    winformsMap1.Overlays.Add(staticOverlay);

    winformsMap1.MapClick += new EventHandler<MapClickWinformsMapEventArgs>(winformsMap1_MapClick);
    winformsMap1.CurrentExtent = new RectangleShape(-139.2, 92.4, 120.9, -93.2);
    winformsMap1.Refresh();
}

void winformsMap1_MapClick(object sender, MapClickWinformsMapEventArgs e)
{
    // Add two more zoom levels at run time, then style them on the layer.
    winformsMap1.ZoomLevelSet.CustomZoomLevels.Add(winformsMap1.ZoomLevelSet.ZoomLevel03);
    winformsMap1.ZoomLevelSet.CustomZoomLevels.Add(winformsMap1.ZoomLevelSet.ZoomLevel04);

    ShapeFileFeatureLayer worldLayer = (ShapeFileFeatureLayer)((LayerOverlay)winformsMap1.Overlays[0]).Layers[0];
    worldLayer.ZoomLevelSet.CustomZoomLevels[2].DefaultAreaStyle = AreaStyles.CreateSimpleAreaStyle(GeoColor.SimpleColors.Transparent, GeoColor.FromArgb(100, GeoColor.SimpleColors.Yellow));
    worldLayer.ZoomLevelSet.CustomZoomLevels[3].DefaultAreaStyle = AreaStyles.CreateSimpleAreaStyle(GeoColor.SimpleColors.Transparent, GeoColor.FromArgb(100, GeoColor.SimpleColors.Red));
}
Regards,
Don