pyspark.sql.functions.transform_keys
- pyspark.sql.functions.transform_keys(col, f)
- Applies a function to every key-value pair in a map and returns a map with the results of those applications as the new keys for the pairs.
- New in version 3.1.0.
- Changed in version 3.4.0: Supports Spark Connect.
- Parameters
- col : Column or str
- name of column or expression 
- f : function
- a binary function (k: Column, v: Column) -> Column... Can use methods of Column, functions defined in pyspark.sql.functions, and Scala UserDefinedFunctions. Python UserDefinedFunctions are not supported (SPARK-27052). Because the function receives both the key and the value, the new key may be derived from either; see the sketch after the Returns section.
 
- Returns
- Column
- a new map of entries whose keys are computed by applying the given function to each key-value pair.
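
As the f parameter notes, the lambda is passed both the key and the value, so a new key can be built from either argument. A minimal sketch of that pattern (the "metrics" column name is hypothetical; like the doctest below, it assumes a running SparkSession bound to spark):

>>> from pyspark.sql.functions import concat, lit, transform_keys
>>> df = spark.createDataFrame([(1, {"a": 1, "b": 2})], ("id", "metrics"))  # hypothetical "metrics" map column
>>> row = df.select(transform_keys(
...     "metrics", lambda k, v: concat(k, lit("_"), v.cast("string"))
... ).alias("renamed")).head()
>>> sorted(row["renamed"].items())
[('a_1', 1), ('b_2', 2)]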
 
- Examples

>>> df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))
>>> row = df.select(transform_keys(
...     "data", lambda k, _: upper(k)).alias("data_upper")
... ).head()
>>> sorted(row["data_upper"].items())
[('BAR', 2.0), ('FOO', -2.0)]
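
The docstring does not mention it, but transformed keys remain subject to Spark's usual map-key rules: if the function maps two distinct keys to the same new key, the default spark.sql.mapKeyDedupPolicy (EXCEPTION) raises a duplicate-map-key error. A hedged sketch under the LAST_WIN policy, which instead keeps a single entry per duplicated key:

>>> from pyspark.sql.functions import lower, transform_keys
>>> spark.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")
>>> df = spark.createDataFrame([(1, {"FOO": 1.0, "foo": 2.0})], ("id", "data"))
>>> row = df.select(transform_keys("data", lambda k, _: lower(k)).alias("d")).head()
>>> len(row["d"])  # both source keys collapse to 'foo'; one entry survives
1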