pyspark.sql.types.StructType

Struct type, consisting of a list of StructField. This is the data type representing a Row.

Iterating a StructType will iterate over its StructFields. A contained StructField can be accessed by its name or position.

>>> struct1 = StructType([StructField("f1", StringType(), True)])
>>> struct1["f1"]
StructField(f1,StringType,true)
>>> struct1[0]
StructField(f1,StringType,true)
>>> struct1 = StructType([StructField("f1", StringType(), True)])
>>> struct2 = StructType([StructField("f1", StringType(), True)])
>>> struct1 == struct2
True
>>> struct1 = StructType([StructField("f1", StringType(), True)])
>>> struct2 = StructType([StructField("f1", StringType(), True),
...     StructField("f2", IntegerType(), False)])
>>> struct1 == struct2
False
Methods
__init__([fields])
    >>> struct1 = StructType([StructField("f1", StringType(), True)])
add(field[, data_type, nullable, metadata])
    Construct a StructType by adding new elements to it, to define the schema.
fieldNames()
    Returns all field names in a list.
fromInternal(obj)
    Converts an internal SQL object into a native Python object.
fromJson(json)

json()

jsonValue()
needConversion()
    Does this type need conversion between a Python object and an internal SQL object.
simpleString()
toInternal(obj)
    Converts a Python object into an internal SQL object.
typeName()