pyspark.sql.types.DecimalType

class pyspark.sql.types.DecimalType(precision=10, scale=0)[source]

Decimal (decimal.Decimal) data type.

A DecimalType has a fixed precision (the maximum total number of digits) and scale (the number of digits to the right of the decimal point). For example, DecimalType(5, 2) can represent values from -999.99 to 999.99.
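The representable range follows directly from precision and scale. A minimal sketch using only Python's standard decimal module (no Spark required; the helper name decimal_range is illustrative, not part of the PySpark API):

```python
from decimal import Decimal

def decimal_range(precision, scale):
    """Largest magnitude representable by a DecimalType(precision, scale).

    precision - scale digits before the decimal point, scale digits after.
    """
    limit = Decimal(10) ** (precision - scale) - Decimal(10) ** -scale
    return -limit, limit

lo, hi = decimal_range(5, 2)
print(lo, hi)  # -999.99 999.99
```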

The precision can be up to 38; the scale must be less than or equal to the precision.

When creating a DecimalType, the default precision and scale are (10, 0). When inferring a schema from decimal.Decimal objects, the inferred type is DecimalType(38, 18).

Parameters
  • precision – the maximum (i.e. total) number of digits (default: 10)

  • scale – the number of digits to the right of the decimal point (default: 0)

__init__(precision=10, scale=0)[source]

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__([precision, scale])

Initialize self.

fromInternal(obj)

Converts an internal SQL object into a native Python object.

json()

Returns the JSON-encoded string representation of this type.

jsonValue()

Returns the value used to represent this type in JSON (for DecimalType, the string "decimal(precision,scale)").

needConversion()

Does this type need conversion between Python objects and internal SQL objects?

simpleString()

Returns a compact string representation of this type, e.g. decimal(10,0).

toInternal(obj)

Converts a Python object into an internal SQL object.

typeName()

Returns the name of this type ("decimal").