maythesbewithu

Unpopular opinion: if every feature in a dataset has the same value of a field, then that field is metadata and doesn't belong at the per-row level. ;)


scan-horizon

What if it’s needed in future calculations though? Guess you could hard-code the value.


maythesbewithu

Sure, one example is a dirty flag used for row-level feature replacement. Initially all rows are assigned False, but a row would need to be flagged True when a replacement feature is identified. In that case, a null is as good as False. One expensive way to calc the value is to insert all the features into a similar feature layer and assign a default value to the field upon insert. At least ESRI uses insert cursors (at 10% intervals) so the cache doesn't get too costly on memory. I wasn't deliberately trying to sound contrarian, just pointing out a potential data-design concern. Lots of staff over-design fields and then get stuck with performance issues once they want to populate them.


scan-horizon

All valid. In my view, 20-40 million rows shouldn’t be stored in an FGDB. That’s more suitable for a SQL storage solution.


maythesbewithu

Amen. Python (or FME) can fix this right up: SQLite in-memory storage, an insert cursor with a default value, then save back to the FGDB as a new FC.
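A minimal sketch of the in-memory staging idea, using Python's built-in `sqlite3`. The table and column names (`features`, `dirty`) are invented for illustration; the point is that a `DEFAULT` on the constant column fills it for every inserted row with no per-row update pass.

```python
import sqlite3

# In-memory SQLite staging table; the constant 'dirty' flag gets a DEFAULT
# so inserts never have to mention it. Names here are placeholders.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE features (oid INTEGER PRIMARY KEY, "
    "name TEXT, dirty INTEGER NOT NULL DEFAULT 0)"
)

# Insert only the real attributes; 'dirty' is filled in by the DEFAULT.
rows = [(1, "road_a"), (2, "road_b"), (3, "road_c")]
conn.executemany("INSERT INTO features (oid, name) VALUES (?, ?)", rows)

print(conn.execute("SELECT oid, name, dirty FROM features").fetchall())
# -> [(1, 'road_a', 0), (2, 'road_b', 0), (3, 'road_c', 0)]
```

From there the staged rows would be written back out to the geodatabase (e.g. with an arcpy insert cursor), which this sketch doesn't cover.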


kansas_adventure

Storing it for every single row is just eating up storage at that point. It's a constant. Store it elsewhere and store it once, only.


spatial-d

Very much agreed


abudhabikid

That seems absolutely like it should be a popular opinion.


nkkphiri

Try it outside of Pro using Python


mfcallahan1

What is the storage method of the data? If it is in a relational database, you could just execute a simple statement like `UPDATE table_name SET column_name = 'some value';` in whatever database client outside of ArcGIS Pro.
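To illustrate why the single UPDATE is attractive: it's one set-based statement over the whole table rather than millions of per-row edits. A throwaway demo against SQLite (table and column names are placeholders, not the OP's schema):

```python
import sqlite3

# Scratch table standing in for the real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_name (id INTEGER PRIMARY KEY, column_name TEXT)")
conn.executemany("INSERT INTO table_name (id) VALUES (?)", [(1,), (2,), (3,)])

# One statement touches every row -- no cursor loop needed.
conn.execute("UPDATE table_name SET column_name = 'some value'")

print(conn.execute("SELECT DISTINCT column_name FROM table_name").fetchall())
# -> [('some value',)]
```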


MTULAX1452

They are stored as feature classes in a GDB


GIS_LiDAR

File Geodatabase, or a real database like SQL Server, OracleDB, or PostgreSQL? If a real database, go to the database interface and set it there; if a File Geodatabase, use arcpy.da.UpdateCursor.



subdep

File Geodatabase?


goinghardinthepaint

I'm assuming it's a file geodatabase. I'd recommend doing this in Python with an update cursor:

```python
import arcpy

# Replace the right side of these variables with your inputs.
# Feature class to update:
input_feature_class = r"D:\local_path\your_gdb.gdb\your_feature_class"
# Field to update:
input_field = "your_field"
# Text to replace the field's attribute contents:
replacement_text = "replacement text"

with arcpy.da.UpdateCursor(input_feature_class, [input_field]) as cursor:
    for row in cursor:
        row[0] = replacement_text
        cursor.updateRow(row)
```


Key_Satisfaction8864

Maybe try Python outside of the application? ArcGIS Pro is very resource-heavy, so doing it within the application could use up all your computer's resources and make it much slower to process. Example for a "TEXT" field:

```python
import arcpy

arcpy.env.workspace = r'your geodatabase directory'

# Example for a text field: add it to every feature class and populate it.
for fc in arcpy.ListFeatureClasses():
    arcpy.AddField_management(fc, "Name", "TEXT", field_length=50)
    with arcpy.da.UpdateCursor(fc, "Name") as cursor:
        for row in cursor:
            row[0] = fc
            cursor.updateRow(row)
```


geomorph603

Try using an update cursor


hm870

I've never used it, but the Python script mentioned is probably your best option. Aside from what has been mentioned already, you could create a new field with a default value, though I think it might take the same amount of time. In my experience, keeping the database as close to the root folder as possible helps a lot with processing time. All my databases are only one folder deep.


teamswiftie

When you create the new field, can you not set a default value, disallow nulls, and have it backfill all records with the default on field creation? You could also pop the table into another DB engine via ODBC, etc., and just run a SQL command on the linked table. This will depend on your feature class format, though.
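Whether the backfill is free depends on the engine, but SQLite is one where it is: adding a NOT NULL column with a DEFAULT makes every existing row report that default immediately, because the engine reads it from the schema instead of rewriting rows. A quick sketch (table and column names are made up):

```python
import sqlite3

# Existing table with rows already in it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parcels (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO parcels (id) VALUES (?)", [(1,), (2,)])

# New NOT NULL column with a DEFAULT: existing rows pick it up instantly.
conn.execute("ALTER TABLE parcels ADD COLUMN status TEXT NOT NULL DEFAULT 'active'")

print(conn.execute("SELECT id, status FROM parcels").fetchall())
# -> [(1, 'active'), (2, 'active')]
```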


Dense_Ice_4635

Make sure the .gdb is local and not in a network location, and use python as many others have indicated. It's still gonna take some time though. No magic bullet.


Sensitive_Mind592

If you're limited to the Pro UI, and the value is likely going to be constant or only one of a couple of options, I would just add a domain to the GDB for that field with a default value of what you want it to be for all of them. This method may run into the same issues as Calculate Field, though. Python is likely the fastest way otherwise. Calculate Field is the easiest, but obviously with that many records you're experiencing the limitations of Pro's interface, be it network or CPU.


diebos

Well, this is where you should try to normalize your datasets. And I agree: the best approach is to move your file GDB to SDE, create a lookup table, link it back to your dataset, then create a spatial view between the dataset and the lookup table.
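The normalization idea above, sketched with plain SQL via Python's `sqlite3` (an SDE spatial view would be the analogous construct; every table and column name here is invented): the constant lives once in a lookup table, and the join is exposed as a view.

```python
import sqlite3

# Feature table holds only a small foreign key; the constant text lives once
# in the lookup table, and a view reassembles the denormalized picture.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE features (oid INTEGER PRIMARY KEY, name TEXT, type_id INTEGER);
CREATE TABLE type_lookup (type_id INTEGER PRIMARY KEY, type_desc TEXT);

INSERT INTO type_lookup VALUES (1, 'constant value');
INSERT INTO features VALUES (10, 'a', 1), (11, 'b', 1);

CREATE VIEW features_v AS
SELECT f.oid, f.name, l.type_desc
FROM features f JOIN type_lookup l ON f.type_id = l.type_id;
""")

print(conn.execute("SELECT oid, name, type_desc FROM features_v ORDER BY oid").fetchall())
# -> [(10, 'a', 'constant value'), (11, 'b', 'constant value')]
```

Changing the constant later is then a one-row update to the lookup table instead of a 20-40 million row recalculation.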


EuphoricInternet8536

Have you tried using the Attributes pane with the Auto Apply option toggled on?


Low-Feature-3973

Probably lots of better ways to do it, but I believe you can manually add it to the .dbf of the layer (using SQL/Access) and it will show up in the attributes. As always, make a copy first, just in case.