pyspark.sql.functions.transform_keys

pyspark.sql.functions.transform_keys(col, f)[source]

Applies a function to every key-value pair in a map and returns a map with the results of those applications as the new keys for the pairs.

Parameters
  • col – name of column or expression

  • f – a binary function (k: Column, v: Column) -> Column that produces the new key for each pair. Can use methods of pyspark.sql.Column, functions defined in pyspark.sql.functions and Scala UserDefinedFunctions. Python UserDefinedFunctions are not supported (SPARK-27052).

Returns

a pyspark.sql.Column of map type with transformed keys and the original values

>>> from pyspark.sql.functions import transform_keys, upper
>>> df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))
>>> df.select(transform_keys(
...     "data", lambda k, _: upper(k)).alias("data_upper")
... ).show(truncate=False)
+-------------------------+
|data_upper               |
+-------------------------+
|{BAR -> 2.0, FOO -> -2.0}|
+-------------------------+
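The semantics above can be sketched in plain Python. This is an illustrative analogue only, not how Spark executes it: the real transform_keys compiles the lambda into a Catalyst expression that runs on map columns inside the JVM, and the helper name below is hypothetical.

```python
# Hypothetical pure-Python analogue of transform_keys semantics:
# apply f(key, value) to compute the new key, keep the value unchanged.
def transform_keys_sketch(mapping, f):
    out = {}
    for k, v in mapping.items():
        out[f(k, v)] = v
    return out

data = {"foo": -2.0, "bar": 2.0}
result = transform_keys_sketch(data, lambda k, _: k.upper())
print(result)  # {'FOO': -2.0, 'BAR': 2.0}
```

One caveat the sketch glosses over: a plain dict silently keeps the last value when transformed keys collide, whereas Spark's behavior for duplicate map keys is governed by the spark.sql.mapKeyDedupPolicy configuration (EXCEPTION by default).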

New in version 3.1.