
Upgrading serialized maps

One of the more difficult tasks we face in upgrading our application from TG 10 to TG 14 is allowing our users to open previously saved maps in the new application without having to rebuild them from scratch.

We use ThinkGeo’s GeoSerializer to save maps that users have created, some of which are quite complex and can take hours to rebuild. When we upgraded from TG 9 to TG 10 we were able to ‘upgrade’ the old serialization files with a find-and-replace on class names for the namespaces that had changed. The upgrade from TG 10 to TG 14 is much more difficult, since many class names and property names have changed. In addition, although the XML is fairly easy to read, the property names of TG 14 classes do not necessarily reflect the actual property names; they appear to have been obfuscated. To make matters worse, even after I had managed to figure out some of the obfuscated property names, some of them changed again with the release of 14.5.0.

Do you have any suggestions for how I might solve this problem?

Thanks!

Steve

Hi Steve,

At this point, I don’t think there is a reliable silver bullet for upgrading serialized maps from v10 to v14.

One reason is simply the size of the change between those versions. Compared with the v9-to-v10 upgrade, where the API changes were relatively limited, v10 to v14 is a much bigger jump. We introduced major changes such as async support, embedded WPF, and .NET Core/.NET support, so there is no simple one-to-one mapping between the old and new APIs.

Another key issue is the serialization model itself. The current serializer is still based on the traditional [Serializable] approach, which depends on internal private fields. That makes it inherently fragile across versions, because those fields are implementation details and may change over time. So even if you were able to convert a v10 XML file into something that works in v14, it would still be difficult to make that solution reliable long term. More broadly, this style of serialization has been falling out of favor for exactly this reason.

We plan to move away from this traditional model and toward an explicitly defined contract-based approach. That will no longer depend on private fields, and it should make future versioning and compatibility much easier to manage.

For your current case, I would not recommend spending too much time trying to map old private fields to the current ones. It would likely be fragile and expensive to maintain. I think the more practical options are:

  1. Define your own serialization contract and build your own save/load logic around it. This requires more work up front, but it gives you full control and a more stable path going forward.

  2. Wait for the new contract-based serialization approach to mature further. This would reduce effort, but it depends on our timeline. At the moment, we are targeting v15.1, around November this year.
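To illustrate what option 1 might look like, here is a minimal sketch of a user-defined contract using System.Text.Json. The `LayerContract` record and its properties are hypothetical, chosen only to show the pattern; your real contract would carry whatever map and layer settings you actually need to restore.

```csharp
using System;
using System.Text.Json;

// A hypothetical, explicitly defined contract: only public properties,
// no private fields, so it stays stable even if library internals change.
public record LayerContract(string ShapeFilePath, string FillColor, double Opacity);

public static class WorkingSetSerializer
{
    private static readonly JsonSerializerOptions Options = new() { WriteIndented = true };

    // Save the contract as plain JSON you fully control.
    public static string Save(LayerContract contract) =>
        JsonSerializer.Serialize(contract, Options);

    // Restore the contract from that JSON.
    public static LayerContract Load(string json) =>
        JsonSerializer.Deserialize<LayerContract>(json, Options)!;
}
```

Because the contract is an explicit public type, renamed internals in a future TG release cannot break these files; only changes you deliberately make to the contract itself matter.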

We already have some early work in this direction. If you’re interested, let me know and we can share it with you to help you build your own serialization contract. By the way, are you on .NET Framework or .NET 8+? I ask because the new serialization we are working on is for .NET 8+ only, at least for now.

Even after the new serialization approach is in place, there would still be some work needed to convert from v10 to that new format. However, that should be more of a one-time migration. Once the serialization contract is clearly defined and well maintained, this kind of issue should be much less likely to happen again in the future.

Best regards,
Ben

Hi Steve,

It went smoother than I thought. Here’s a quick follow-up:

We went ahead and built the contract-based serialization layer I mentioned, and it now lives as a separate set of packages under https://gitlab.com/thinkgeo/public/thinkgeo-desktop-maps/-/tree/develop/tools/ThinkGeo.Serialization.

  1. With this you can do serialization / deserialization like this:

     using ThinkGeo.UI.WinForms.Serialization;   // or .Wpf

     string json = MapSerializer.Serialize(mapView);
     MapSerializer.Deserialize(mapView, json);

  2. It uses plain DTO records with System.Text.Json: no [Serializable] attribute, no private fields involved. The contracts are all public, and you can change them as you want.
  3. We’ve tested it against all 148 WPF and 137 WinForms HowDoI samples; the fixed-point round-trip passes on every one. Those test projects are in the same repo.
  4. For your v10 → v14 migration: write a one-time tool that exports your v10 projects to the new JSON. From that point onward you are free of the old format, and v14, v15, v16 and beyond will all read the same JSON, since you control the contracts.
  5. Notes: there are some known caveats, such as cloud ClientSecret / API keys (don’t persist these to untrusted storage), but you can always change that in the source if you want.

We were thinking of integrating this into the product, but keeping it as a separate project might actually be better — it gives users more power to change the serialization contracts. Anyway, give it a try and let me know if you have any questions.

Thanks,
Ben

Ben,

Wow, that was quick!

I’m not sure I’m following all the details, so let me ask a question or two.

We have users with hundreds of ‘working sets’ serialized in the TG 10 format. The way we have accomplished this is to serialize each layer in its own file and then zip all the files up into a single file. We don’t serialize the whole WinformsMap, only the properties we are interested in.

So it looks like we would need to use our existing TG 10 application to deserialize the existing ‘working sets’, then use the new serialization method to re-serialize all the layers and the map properties. Then we would be able to seamlessly deserialize them in the TG 14 and later projects using the new serialization method.

Am I following this correctly?

EDIT: OK, so now, looking at the code, it looks like the only class you serialize using the new method is the MapView. So now I’m a little more confused about how this would help with converting the old format to the new.

Thanks!

Steve

Hi Steve,

Yes, that is essentially the migration path. There are three steps:

  1. Deserialize the existing working sets using your v10 application.
  2. Serialize those restored v10 objects into the new JSON contract format.
  3. In your v14 application, deserialize that JSON using the new serialization project.

The serialization project only demonstrates the v14 side, so for the migration itself you would still need to implement the equivalent export step on the v10 side. The advantage is that the new serialization source is open, so you can control exactly what gets serialized and keep that contract stable for future versions.

We show MapView serialization in the sample project only because it is the simplest end-to-end example. It is not the only way to use the new serializer. You can use the lower-level contracts/mappers instead of serializing the whole MapView. For example, a single ShapeFileFeatureLayer can be serialized/deserialized like this:

var dto = LayerMapper.ToDescriptor(new ShapeFileFeatureLayer(@"roads.shp"))!;  // map the layer to its public DTO
var json = JsonSerializer.Serialize(dto, jsonOptions);                         // persist it as plain JSON

var descriptor = JsonSerializer.Deserialize<LayerDescriptor>(json, jsonOptions)!;  // read the DTO back
var layer = (ShapeFileFeatureLayer)LayerMapper.FromDescriptor(descriptor);         // rebuild the layer from it

So if your current format is “one file per layer”, you can keep that same pattern and migrate each layer into the new JSON format one at a time.
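To sketch the zip side of that migration: the loop below copies each per-layer entry from an old working-set archive into a new one, running a caller-supplied converter over the contents. `WorkingSetMigrator`, `MigrateWorkingSet`, and `convertLayer` are hypothetical names of mine, not part of the serialization project; the converter delegate is where your v10-deserialize / new-JSON-serialize logic from the three steps above would plug in.

```csharp
using System;
using System.IO;
using System.IO.Compression;

public static class WorkingSetMigrator
{
    // Rewrites every entry of a zipped working set with a caller-supplied
    // per-layer converter (e.g. old v10 content in, new contract JSON out).
    public static void MigrateWorkingSet(string inPath, string outPath,
                                         Func<string, string> convertLayer)
    {
        using var input = ZipFile.OpenRead(inPath);
        using var output = ZipFile.Open(outPath, ZipArchiveMode.Create);

        foreach (var entry in input.Entries)
        {
            string oldContent;
            using (var reader = new StreamReader(entry.Open()))
                oldContent = reader.ReadToEnd();

            // One file per layer, same as the old format; only the
            // contents change from the old representation to the new JSON.
            var newEntry = output.CreateEntry(
                Path.ChangeExtension(entry.FullName, ".json"));
            using var writer = new StreamWriter(newEntry.Open());
            writer.Write(convertLayer(oldContent));
        }
    }
}
```

This keeps your “one file per layer, zipped into a single file” layout intact, so the surrounding application code barely changes.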

Thanks,
Ben

Ben,

Thanks for the clarification. Since our current application is based on .NET Framework, we would definitely have to make this a standalone project, which is OK. We may have some more questions for you in the future.

Thanks!

Steve

Steve,

Just FYI, this project currently uses some System.Text.Json features that are not available in .NET Framework. It should still be possible to use the same general approach in a .NET Framework project if you switch the JSON handling to Newtonsoft.Json and update the related code accordingly.

Thanks,
Ben