Hi All,
I am wondering if there is any way to add a column of type Blob to a geodatabase table using the FileGeoDatabaseFeatureLayer class. The idea is that I would like to load the blob (for example, a PDF file) when the user clicks a feature on the map.
According to ArcGIS (link below), Blob is a supported type for a geodatabase table; however, I can't seem to add anything other than string-type columns to FeatureSourceColumns or to the geodatabase table itself.
webhelp.esri.com/arcgisdeskt…ta%20types
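To make it concrete, here is roughly the pattern I am hoping for (the "Document" column, the "blob" typeName and the base64 read-back are just my guesses at how it might look, not working code; assumes System.IO and ThinkGeo.MapSuite.Core):
// Hypothetical sketch only: a column that can hold binary data...
FeatureSourceColumn blobColumn = new FeatureSourceColumn("Document", "blob", 0);

// ...and, when the user clicks a feature, pulling the stored bytes back out
// (assuming the value comes back as a base64 string) and saving them as a PDF to open.
string base64 = clickedFeature.ColumnValues["Document"];
File.WriteAllBytes(@"C:\temp\clicked.pdf", Convert.FromBase64String(base64));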
Thanks,
Damian
Hi Damian,
Thanks for your post. I agree this would be a good new feature, and I have added it to our internal issue system. Unfortunately, we don't have enough resources to work on it right now; we will pick it up when resources free up. If you need this functionality immediately, your account rep can contact you about professional services.
Best Regards
Summer
Thanks Summer, I will stay tuned for an update.
While you are at it, you should probably go ahead and cater for all of the allowable field types. I've already come across a need to aggregate numeric fields, which would be easier if I could pass the data directly to LINQ.
Regards,
Damian
Hi Damian,
Thanks for your advice. We will take it into consideration, and anything new will be posted here as soon as it is available.
If you have any more questions, please feel free to let us know.
Best Regards
Summer
Hi Damian,
Blob support is now available in 7.0.64.0 (this is a new feature, so it is only in 7.0.64.0; the build is still running and will be available in a few hours). Please install the "unmanaged dependencies msi" from the 7.0.64.0 dll packages. Attached is some sample code to test it; would you please try it?
Hope it helps
Summer
001_blobsample.txt (1.53 KB)
Hi Damian,
There has been a change in the build process; the blob support will now be available in 7.0.65.0. Would you please wait one more day?
Best Regards
Summer
Hi Summer,
I downloaded 7.0.0.65, and the problem is that there is no overload of CreateTable with a signature of string, string, IEnumerable<FeatureSourceColumn>; it still expects all string values. Maybe the update didn't make it into that version.
Also, I notice that the unmanaged assemblies installer in this package is still 7.0.0.59. Did it miss the update?
Does this mean other column types are now supported because it takes FeatureSourceColumn? If so, I would love it if the typeName of FeatureSourceColumn could be changed into an enum, so I know what types I can put in the field instead of guessing what the name of a type might be.
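For clarity, this is roughly the call I was expecting to be able to make, and where an enum for the type names would help (the class exposing CreateTable and the "blob" typeName are assumptions on my part):
// Roughly the overload I expected: CreateTable(string, string, IEnumerable<FeatureSourceColumn>)
Collection<FeatureSourceColumn> columns = new Collection<FeatureSourceColumn>();
columns.Add(new FeatureSourceColumn("Name", "string", 50));
columns.Add(new FeatureSourceColumn("Document", "blob", 0));   // guessing at the typeName here

// Wherever CreateTable lives in this build (shown on the layer class only as an assumption):
FileGeoDatabaseFeatureLayer.CreateTable(@"C:\temp\test.gdb", "MyTable", columns);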
Looking forward to trying this. Should be great!
Thanks,
Damian
I just re-read what you wrote and realized I had downloaded the 7.0.65.0 dev version instead of production.
This time there is an overload that accepts a FeatureSourceColumn collection, but when I execute CreateTable I get the following error:
Object of type 'System.Collections.ObjectModel.Collection`1[ThinkGeo.MapSuite.Core.FeatureSourceColumn]' cannot be converted to type 'System.Collections.Generic.IEnumerable`1[System.String]'.
I am guessing the build of the unmanaged assemblies needs to be updated.
Regards,
Damian
Hi Damian,
Sorry for the mistake about the build version. The change will actually take effect in 7.0.66.0 (still building; it will be available in a few hours) rather than 7.0.65.0. Would you please install the "Map Suite Unmanaged Dependencies 7.0.66.0" from the dll packages?
Hope it helps
Summer
Hi Summer,
It looks like there isn't going to be an update on the site today for dev version 7.0.66.0; it hasn't been updated since yesterday. I also note that the production build is 7.0.0.66, but it does not contain your updates.
Regards,
Damian
Hi Damian,
Sorry for the inconvenience. The msi build had a bug, which we have now fixed. Would you please get the "Map Suite Unmanaged Dependencies 7.0.67.0" from the 7.0.67.0 dll packages?
Sorry again
Summer
Hi Summer,
It looks like the blob type is working in the development version 7.0.69.0. This is really cool, thanks! When will you make it available in production?
I have also done a bit of experimenting with the typeName property of FeatureSourceColumn. It looks like I can now specify "int" or "integer" and get a long integer field in the database (verified through ArcCatalog), and I can specify "double" and output values with decimals. But there seems to be more going on behind the scenes than is obvious, because I can also type in some nonsense typeNames and it will still create the database and make the field Text.
There also seem to be some strange issues with the maxLength parameter. If I am writing out an integer or a double, it doesn't seem to matter what I put in this field.
Also, I tried using a field called "datetime" with column values of "02-OCT-1969" and "10/02/1969". It said it created the record in the database and no errors were given, but inspecting the database showed that no records had actually been created. Only after I set the maxLength parameter to 0 would it write the record, and it ends up as text in the database.
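For reference, here is roughly what I am passing in (column names and values are illustrative, trimmed down from my test code):
// Sketch of my typeName/maxLength experiments (illustrative names and values):
Collection<FeatureSourceColumn> columns = new Collection<FeatureSourceColumn>();
columns.Add(new FeatureSourceColumn("Count", "integer", 4));    // comes out as a long integer field
columns.Add(new FeatureSourceColumn("Measure", "double", 8));   // decimals come through fine; maxLength seems ignored
columns.Add(new FeatureSourceColumn("Junk", "foo", 10));        // nonsense typeName still creates the db, field becomes Text
columns.Add(new FeatureSourceColumn("When", "datetime", 0));    // only writes when maxLength is 0, and ends up as text

Feature feature = new Feature(0, 0);
feature.ColumnValues.Add("Count", "42");
feature.ColumnValues.Add("Measure", "3.14");
feature.ColumnValues.Add("When", "02-OCT-1969");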
It would be great if I could get some rules for what field types and values I am able to write to and read from the database reliably.
Finally, I ran a test and created 100 records in the db with the blob set to a 5 MB image. Unfortunately, trying to read the db back gave me an Out of Memory exception. The database, of course, was 500 MB, which isn't great, so I will look at downsampling and compression options for the file stream before committing it to the blob field. It would be good if you could reproduce the out-of-memory error on your side and see what can be done about it. It's not completely obvious to me why I can write, but not read, once I get past a certain size.
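The write side of my test is essentially the following (paths are placeholders; this is a trimmed-down version of my test, not the full project):
// Trimmed-down version of my test: the same ~5 MB image goes into 100 records,
// and reading everything back is where the Out of Memory exception appears.
byte[] imageBytes = File.ReadAllBytes(@"C:\temp\photo.jpg");   // ~5 MB file
string base64Image = Convert.ToBase64String(imageBytes);

Collection<Feature> features = new Collection<Feature>();
for (int i = 0; i < 100; i++)
{
    Feature feature = new Feature(0, 0);
    feature.ColumnValues.Add("myblob", base64Image);
    features.Add(feature);
}
// The features are then committed to the gdb; the read-back that fails is a
// plain "get all features with all columns" call on the layer/feature source.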
Thanks,
Damian
Hi Damian,
Great to hear that the blob type is working. As for "When will you make it available in production?": there are some API changes involved in this functionality, and to keep production stable it will not be added to the production build until the next release, in May of 2014.
About "nonsense typeNames and it will create the database and make the field into Text": nonsense typeNames now throw exceptions as of "Map Suite Unmanaged Dependencies 7.0.71.0.msi".
About the "strange issues with the maxLength": we tested with Esri.FileGDB.Table.AddField(…). If we add an integer field or a double field, Esri.FileGDB.Table.AddField(…) ignores the length, and the integer field's max length is still 4 bytes, so whether a maxLength takes effect is decided inside Esri.FileGDB.Table.AddField().
About using a field called "datetime": this functionality has been added in "Map Suite Unmanaged Dependencies 7.0.71.0.msi". The attached "AddDateTime.txt" is sample code for it.
About creating 100 records in the db with the blob set to a 5 MB image: we tried to recreate this problem in a loop with 40 MB+ of data, but the problem still didn't show up. The attached "insert200bigblob.txt" is the test code on our end; would you please check it and tell us what we missed?
"Map Suite Unmanaged Dependencies 7.0.71.0.msi" is still building and will be available in a few hours.
Thanks,
Summer
001_AddDateTime.txt (1009 Bytes)
insert200bigblob.txt (1.26 KB)
Hi Damian,
Would you please get "Map Suite Unmanaged Dependencies 7.0.72.0.msi"? The change will take effect in 7.0.72.0.
Best Regards
Summer
Hi Summer,
Regarding the blob size: I could be mistaken, but I think you must use .ColumnValues.Add in order to get the blob string column included with the feature. If you try to set it using ColumnValues[] and that column doesn't exist, I believe it is just ignored.
Try the following and see if your database size is bigger.
Feature feature = new Feature(0, 0);
feature.ColumnValues.Add("myblob", newString);
Regards,
Damian
Hi Damian,
Thanks for the further information. We tried both feature.ColumnValues.Add("myblob", newString) and feature.ColumnValues["myblob"] = newString, but the created database is still the same size. Would you please tell me whether there is any difference between using feature.ColumnValues.Add("myblob", newString) and feature.ColumnValues["myblob"] = newString?
Looking forward to your further information,
Summer
Hi Summer,
Attached is the project I have which reproduces this behavior. Get yourself a 5 MB image or other file and replace the paths in the creatDb button event. Afterwards, zip the test.gdb directory and check the size.
The picture I have is 5,261 KB and I add 50 records to the db. The final db size after compression is 262,681 KB. With the db at this size, the readback event fails with an out-of-memory error.
Regards,
Damian
GeoDbMetaDataLimit.zip (51.5 KB)
Hi Damian,
The OutOfMemory exception is caused by calling the GetAllFeatures function twice.
Your image is 5 MB and it is saved to the database 50 times. When GetAllFeatures is called, we read the image from the database and store it in a feature, 50 times over.
We store the value as a Base64 string. Base64 expands 5 MB of bytes to roughly 6.7 MB of characters, and a .NET string is UTF-16 (2 bytes per character), so each value takes about 14 MB of memory (see the testMemory function below, which shows the Base64 string taking about 14 MB). That puts one call at about 14 x 50 = 700 MB, and since GetAllFeatures is called twice, the total reaches about 1.4 GB, which is beyond the roughly 1.2 GB that a 32-bit .NET process can use by default, so .NET throws an "out of memory" exception.
private static void testMemory()
{
    FileStream sr = new FileStream(@"D:\test.JPG", FileMode.Open);
    BinaryReader br = new BinaryReader(sr);
    byte[] bs = br.ReadBytes((int)sr.Length);

    Collection<Feature> allFeatures = new Collection<Feature>();
    for (int i = 0; i < 50; i++)
    {
        string fieldValue = Convert.ToBase64String(bs);
        Feature feature = new Feature();
        feature.ColumnValues.Add("test", fieldValue);
        allFeatures.Add(feature);
    }

    Console.ReadLine();
}
However, this only shows up on x86. If you build for x64, the limit is much higher, so the test project won't throw the exception there.
To avoid the issue, either target x64 or keep the data's memory usage under roughly 1.2 GB when running as x86.
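As a rough rule of thumb (a back-of-the-envelope estimate on our side, not an exact measurement), you can estimate the managed memory one GetAllFeatures call will need before deciding whether x86 is workable:
// Rough estimate only: Base64 expands binary data by about 4/3, and .NET strings
// are UTF-16 (2 bytes per character), so each blob becomes roughly
// blobBytes * 4/3 * 2 bytes of managed memory in the returned features.
static long EstimateGetAllFeaturesMemory(long blobSizeInBytes, int featureCount)
{
    long base64Chars = (blobSizeInBytes + 2) / 3 * 4;   // Base64 expansion
    long bytesPerFeature = base64Chars * 2;             // UTF-16 storage
    return bytesPerFeature * featureCount;
}

// Example: a 5 MB image and 50 features gives roughly 700 MB per GetAllFeatures call.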
By the way, while researching this issue we found and fixed another problem that will also reduce memory usage in GetAllFeatures; you can get that fix in any package with a version higher than 7.0.81.0.
Regards,
Don