9.1.13

Funky Arcpy Cursors - functional programming examples

Python has great functional programming features with lambda expressions and first-class functions. There are many ways to cook a turkey, but here are some examples which rely heavily on a functional approach.

Map Functions to Feature Class Rows 

This example simply wraps a search cursor and executes functions. Not super helpful on its own, but it shows the functional mindset and the use of functions as input parameters.

def mapFunctionsToFeatureClassRows(featureClass, functionList, where=None):
    rows = arcpy.SearchCursor(featureClass, where)
    for r in rows:
        for f in functionList:
            f(r)
    del rows
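
Outside of arcpy, this is just function application over an iterable. A minimal sketch, with plain dicts standing in for cursor rows (the row contents here are made up):

```python
# arcpy-free sketch: dicts stand in for cursor rows, and every
# function in the list is applied to every row in order
def map_functions_to_rows(rows, function_list):
    for row in rows:
        for f in function_list:
            f(row)

rows = [{'name': 'Lincoln HS'}, {'name': 'Travis MS'}]
seen = []
map_functions_to_rows(rows, [lambda r: seen.append(r['name'])])
# seen is now ['Lincoln HS', 'Travis MS']
```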

Group Rows

def groupRows(inputFeatureClass, where, keyFunction, valueFunction):
    from collections import defaultdict

    groupings = defaultdict(list)
    rows = arcpy.SearchCursor(inputFeatureClass, where)
    for r in rows:
        groupings[keyFunction(r)].append(valueFunction(r))
    del rows
    return groupings

With 10.1+, you can use the arcpy.da module for a significant performance increase. Note that da cursors yield plain tuples rather than row objects, so keyFunction and valueFunction must index into the tuple instead of calling getValue:

def groupRows2(inputFeatureClass, fields, where, keyFunction, valueFunction):
    from collections import defaultdict
    groupings = defaultdict(list)
    with arcpy.da.SearchCursor(inputFeatureClass, fields, where) as cursor:
        for r in cursor:
            groupings[keyFunction(r)].append(valueFunction(r))
    return groupings

By convention, groupRows calls keyFunction(row) and valueFunction(row) with the current arcpy row object. The result is a dictionary mapping keyFunction outputs to lists of valueFunction outputs:

if __name__ == '__main__':
    featureClass = arcpy.GetParameterAsText(0) or r'C:\python_working_directory\schools.gdb\school_points'
    stateSchoolId = arcpy.GetParameterAsText(1) or 'stateSchoolId'
    whereClause = None
    keyFunction = lambda r: r.getValue(stateSchoolId)[:2]
    valueFunction = lambda r: r.getValue(stateSchoolId)

    groupsByState = groupRows(featureClass, whereClause, keyFunction, valueFunction)

>>> {'TX': ['TX201', 'TX202', 'TX203']}
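
Since da cursor rows are plain tuples, the key and value functions there just index by position. Here is an arcpy-free sketch of the same grouping (the school IDs are made up):

```python
from collections import defaultdict

def group_rows(rows, key_function, value_function):
    # same shape as groupRows, minus the cursor: collect
    # valueFunction outputs under keyFunction outputs
    groupings = defaultdict(list)
    for r in rows:
        groupings[key_function(r)].append(value_function(r))
    return groupings

# single-field tuples standing in for arcpy.da cursor rows
school_rows = [('TX201',), ('TX202',), ('OK301',)]
groups = group_rows(school_rows,
                    lambda r: r[0][:2],   # state prefix as the key
                    lambda r: r[0])       # full school id as the value
# groups['TX'] -> ['TX201', 'TX202']
```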

 

Translate Append

Do you need to append to an existing feature class? Basically the same data, but conflicting attribute tables? Here's a functional solution:

def translateAppend(targetFeatureClass, appendFeatureClass, where, fieldTranslations, outputFeatureClass):
    arcpy.CopyFeatures_management(targetFeatureClass, outputFeatureClass)
    insertCursor = arcpy.InsertCursor(outputFeatureClass)
    appendRows = arcpy.SearchCursor(appendFeatureClass, where)

    for r in appendRows:
        newRow = insertCursor.newRow()
        for k, v in fieldTranslations.items():
            # callables compute the value; strings name a field to copy
            value = v(r) if callable(v) else r.getValue(v)
            if value is not None:
                newRow.setValue(k, value)
        insertCursor.insertRow(newRow)

    del appendRows
    del insertCursor

You supply the function with a dictionary whose keys are the fields of your existing feature class. The values of the dict are functions which compute the value for that field from the append data; they can be lambda expressions or standard functions (def statements). By convention, if a value is a string, that field from the append feature class is copied directly:

def createGradeList(row):
    low_grade = row.getValue('low_grade')
    high_grade = row.getValue('high_grade')
    return ','.join(str(g) for g in range(low_grade, high_grade + 1))
   
#Map fields in input feature class to strings or functions
translations = {}

#Straight field-to-field name copy
translations['Shape'] = 'Shape'

#generate value using lambda expression
translations['state'] = lambda row: row.getValue('stateSchoolId')[:2] 

#lookup value in dictionary
translations['StateAbbrev'] = lambda row: state_abbrevs_index[row.getValue('FIPS')]

# generate value using function
translations['GradesCommaDelimited'] = createGradeList

translateAppend(feature_class, new_features, "1=1", translations, output_feature_class)
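
Stripped of the cursors, the heart of translateAppend is the dispatch on each translation value: callables compute the output, strings name a source field to copy. A minimal sketch, with a dict standing in for a row:

```python
def translate_row(row, field_translations):
    # row is a plain dict standing in for an arcpy row object
    out = {}
    for target_field, v in field_translations.items():
        value = v(row) if callable(v) else row.get(v)
        if value is not None:
            out[target_field] = value
    return out

source = {'low_grade': 1, 'high_grade': 3, 'stateSchoolId': 'TX201'}
translations = {
    'state': lambda r: r['stateSchoolId'][:2],
    'GradesCommaDelimited': lambda r: ','.join(
        str(g) for g in range(r['low_grade'], r['high_grade'] + 1)),
    'schoolId': 'stateSchoolId',  # plain field-to-field copy
}
row_out = translate_row(source, translations)
# row_out['state'] -> 'TX'; row_out['GradesCommaDelimited'] -> '1,2,3'
```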
