The data type for collections of multiple values.
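A hedged sketch of constructing this type with Spark's `ArrayType` factory (the element type and `containsNull` flag here are illustrative choices, not prescribed by the source):

```scala
import org.apache.spark.sql.types._

// An array column whose elements are integers; the second argument,
// containsNull, records whether elements of the array may be null.
val intArray: ArrayType = ArrayType(IntegerType, containsNull = false)
```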
An internal type used to represent everything that is not null, a UDT, an array, a struct, or a map.
The data type representing Array[Byte] values.
The data type representing Boolean values.
The data type representing Byte values.
The data type representing calendar time intervals.
The base type of all Spark SQL data types.
A date type, supporting "0001-01-01" through "9999-12-31".
A mutable implementation of BigDecimal that can hold a Long if values are small enough.
The data type representing java.math.BigDecimal values.
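A brief sketch of declaring a fixed-precision decimal with Spark's `DecimalType(precision, scale)` factory; the particular precision and scale chosen here are illustrative assumptions:

```scala
import org.apache.spark.sql.types.DecimalType

// precision is the total number of digits, scale the number of digits
// after the decimal point; DecimalType(10, 2) can hold up to 99999999.99.
val money: DecimalType = DecimalType(10, 2)
```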
The data type representing Double values.
The data type representing Float values.
The data type representing Int values.
The data type representing Long values.
The data type for Maps.
Metadata is a wrapper over Map[String, Any] that limits the value type to simple ones: Boolean, Long, Double, String, Metadata, Array[Boolean], Array[Long], Array[Double], Array[String], and Array[Metadata].
Builder for Metadata.
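A short sketch of building immutable Metadata with `MetadataBuilder`; the key names and values are illustrative assumptions:

```scala
import org.apache.spark.sql.types.{Metadata, MetadataBuilder}

// Accumulate simple-typed entries, then freeze them into an immutable
// Metadata instance with build().
val meta: Metadata = new MetadataBuilder()
  .putString("comment", "user id column")
  .putLong("maxLength", 64L)
  .build()
```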
The data type representing NULL values.
Numeric data types.
Represents a JVM object that is passing through Spark SQL expression evaluation.
The data type representing Short values.
The data type representing String values.
A field inside a StructType.
A StructType object can be constructed from a sequence of StructField values.
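A hedged sketch of assembling a schema from StructField values; the field names and types here are illustrative, not taken from the source:

```scala
import org.apache.spark.sql.types._

// A schema with a non-nullable id, a nullable name, and a nullable
// map of string tags. Each StructField carries a name, a DataType,
// and a nullability flag.
val schema: StructType = StructType(Seq(
  StructField("id", LongType, nullable = false),
  StructField("name", StringType, nullable = true),
  StructField("tags", MapType(StringType, StringType), nullable = true)
))
```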
The data type representing java.sql.Timestamp values.
An AbstractDataType that matches any concrete data types.
Companion object for ArrayType.
Extra factory methods and pattern matchers for Decimals.
Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.