
Creating a ShapeFile does not produce a valid file

We are trying to export data from the map to a ShapeFile. We have it creating the shapefile, but it is missing data from some columns, and when we try to import it into other applications (ArcMap, for example) they complain that the files are corrupt.

When I try to look at the .dbf file in Access (opened as a dBase III file), it complains as well. If I output only one or two fields, I can open the .dbf file, but there seems to be no rhyme or reason to it.

Here is an example of the code that I pulled out into a simple test.

        // Choose the geometry type for the new shapefile.
        ShapeFileType shapeFileType = ShapeFileType.Point;

        // Define the columns for the shapefile's attribute table.
        IList<DbfColumn> columns = new List<DbfColumn>();
        columns.Add(new DbfColumn("Name", DbfColumnType.Character, 50, 0));
        columns.Add(new DbfColumn("Description", DbfColumnType.Character, 100, 0));
        columns.Add(new DbfColumn("SomeNumber", DbfColumnType.Numeric, 10, 2));
        columns.Add(new DbfColumn("SpudDate", DbfColumnType.Date, 8, 0));

        // Create the shapefile
        string shapeFilePath = $@".\Output\My{shapeFileType}ShapeFile.shp";
        ShapeFileFeatureSource.CreateShapeFile(shapeFileType, shapeFilePath, columns);
        ShapeFileFeatureSource shapeFileFeatureSource = new ShapeFileFeatureSource(shapeFilePath, FileAccess.ReadWrite);

        // Going to write some features to the shapefile, prepare to do so.
        shapeFileFeatureSource.Open();
        shapeFileFeatureSource.BeginTransaction();

        // Add a sample feature to the shapefile
        var feature = new Feature(new PointShape(-10000000, 4000000));
        feature.ColumnValues.Add("Name", "Sample Point");
        feature.ColumnValues.Add("Description", "This is a sample point feature.");
        feature.ColumnValues.Add("SomeNumber", "123.45");
        feature.ColumnValues.Add("SpudDate", "20240101");
        shapeFileFeatureSource.AddFeature(feature);

        // Commit the transaction and close the shapefile.
        shapeFileFeatureSource.CommitTransaction();
        shapeFileFeatureSource.Close();

It does create the file, but it can’t be opened in other apps, and Access claims the .dbf file is corrupt when I try to open it. If I output just the Name field, it works. I have also tried outputting everything as character fields; no joy.

I did go over the other posts that talk about this and used them as a starting point, but they are old and, well, didn’t help.

Hi Chris,

Standard shapefiles use the dBASE III format for their attribute table. That standard has a strict limit of 10 bytes for field (column) names. “Description” is 11 bytes long, which causes strict readers to reject the file as corrupt. If you rename it to “Descript” (or anything 10 characters or shorter), it will work in ArcMap and Access.
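To see why, here is a small sketch of how dBASE III stores field names. It is written in Python rather than C# because it concerns the raw file format, not the ThinkGeo API: each name lives in a fixed 11-byte slot (10 characters plus a terminating NUL), so an 11-character name like "Description" simply cannot be stored.

```python
import struct

# dBASE III reserves a fixed 11-byte slot per field name:
# up to 10 ASCII characters plus a terminating NUL byte.
DBF_NAME_SLOT = 11

def pack_field_name(name: str) -> bytes:
    """Pack a column name into a dBASE III field-descriptor name slot.

    Raises ValueError if the name exceeds 10 bytes, which is exactly
    the condition that makes strict readers reject the file.
    """
    raw = name.encode("ascii")
    if len(raw) > DBF_NAME_SLOT - 1:
        raise ValueError(f"{name!r} is {len(raw)} bytes; dBASE III allows at most 10")
    return raw.ljust(DBF_NAME_SLOT, b"\x00")

print(pack_field_name("Descript"))   # b'Descript\x00\x00\x00' -- fits
try:
    pack_field_name("Description")   # 11 characters: cannot fit in the slot
except ValueError as e:
    print(e)                         # 'Description' is 11 bytes; dBASE III allows at most 10
```

So "corrupt" here really means "a field name overflowed its fixed-size slot, so a strict reader can no longer parse the header."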

ThinkGeo supports the standard format but also allows exceeding these limits for flexibility. A quick way to tell: if you see a *.dbc file alongside the *.dbf, that’s a sign the file has a column name longer than 10 bytes and won’t be readable by strict readers.
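If you want to check for that condition programmatically, here is a minimal sketch (Python, for brevity), assuming the *.dbc sidecar sits next to the shapefile with the same base name:

```python
from pathlib import Path

def has_nonstandard_columns(shp_path: str) -> bool:
    """Heuristic: a sibling *.dbc file indicates the shapefile was written
    with column names (or widths) beyond the strict dBASE III limits."""
    return Path(shp_path).with_suffix(".dbc").exists()
```

You could run this right after `CreateShapeFile` in your export pipeline and log a warning before handing the file to ArcMap or Access.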

This is a great point that many users might run into. In future versions, we will:

  1. Add more comments to the corresponding APIs to clarify these standard limitations.
  2. Add a warning to the output window when a user creates a Shapefile that is incompatible with third-party standards.

Let me know if you have any other suggestions.

Thanks,
Ben

That sounds like the one “clue” that was missing. I will give that a try; it sounds reasonable to me, as the fields that did seem to work were all under 10 characters long.

Thanks again Ben.

No problem, Chris!

Here’s another trick: the DBF (dBase III) file inside a shapefile limits a single character field to 255 bytes. ThinkGeo can exceed this limit for flexibility, but similarly, a shapefile created by ThinkGeo with a character column wider than 255 bytes will not be compatible with strict readers like ArcMap.
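Putting both limits together, here is a small, hypothetical pre-flight check you could run over your column definitions before creating the shapefile. It is sketched in Python with plain tuples standing in for `DbfColumn`; the names and structure are illustrative, not part of any API:

```python
def validate_dbf_columns(columns):
    """Return a list of compatibility problems for strict dBASE III readers.

    columns: iterable of (name, dbf_type, length) tuples, where dbf_type
    is a single character such as 'C' (character), 'N' (numeric), 'D' (date).
    """
    problems = []
    for name, dbf_type, length in columns:
        # Field names are stored in a 10-byte slot (plus a NUL terminator).
        if len(name.encode("ascii")) > 10:
            problems.append(f"column {name!r}: name longer than 10 bytes")
        # Character fields use a single length byte, capping the width at 255.
        if dbf_type == "C" and length > 255:
            problems.append(f"column {name!r}: character width {length} > 255")
    return problems

# The columns from the original example: only "Description" is flagged.
columns = [
    ("Name", "C", 50),
    ("Description", "C", 100),
    ("SomeNumber", "N", 10),
    ("SpudDate", "D", 8),
]
print(validate_dbf_columns(columns))  # ["column 'Description': name longer than 10 bytes"]
```

Running a check like this before `CreateShapeFile` would have caught the "Description" problem up front instead of surfacing it as a "corrupt file" error in ArcMap.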