pyspark.sql.functions.map_concat

pyspark.sql.functions.map_concat(*cols)

Returns the union of all the given maps.

Parameters

cols – list of column names (string) or list of Column expressions

>>> from pyspark.sql.functions import map_concat
>>> df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map(3, 'c') as map2")
>>> df.select(map_concat("map1", "map2").alias("map3")).show(truncate=False)
+------------------------+
|map3                    |
+------------------------+
|[1 -> a, 2 -> b, 3 -> c]|
+------------------------+
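The function also accepts Column expressions instead of column name strings. A minimal sketch, reusing the df and spark session from the example above:

>>> from pyspark.sql.functions import col, map_concat
>>> df.select(map_concat(col("map1"), col("map2")).alias("map3")).show(truncate=False)
+------------------------+
|map3                    |
+------------------------+
|[1 -> a, 2 -> b, 3 -> c]|
+------------------------+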

New in version 2.4.