Spark's abs function does not always return the same type as its input.
When averaging, Spark doesn't change these types:
- BigDecimal -> BigDecimal
- Double -> Double
But it changes these types:
- Int -> Double
- Short -> Double
- Long -> Double
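A quick way to observe this widening, assuming Spark is on the classpath (the local session setup below is illustrative, not part of the original doc):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.avg
import org.apache.spark.sql.types.DoubleType

val spark = SparkSession.builder().master("local[1]").appName("avg-types").getOrCreate()
import spark.implicits._

// Averaging an Int column: Catalyst widens the result to Double.
val intAvg = Seq(1, 2, 3).toDF("i").agg(avg($"i")).schema.head.dataType
println(intAvg) // DoubleType

// Averaging a Double column: the type is preserved.
val dblAvg = Seq(1.0, 2.0).toDF("d").agg(avg($"d")).schema.head.dataType
println(dblAvg) // DoubleType

spark.stop()
```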
Types that can be bitwise ORed, ANDed, or XORed by Catalyst. Note that Catalyst requires both operands of a column-to-column bitwise operation to have the same type, so in some cases casting is necessary.
Spark divides everything as Double, except that a BigDecimal is divided into another BigDecimal, benefiting from some added precision.
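The precision difference is easy to see in plain Scala; this only illustrates why dividing decimals as BigDecimal is preferable and is not Spark code:

```scala
// Double division carries roughly 15-17 significant digits.
val asDouble: Double = 1.0 / 3.0

// Scala's BigDecimal division uses MathContext.DECIMAL128 by default,
// keeping 34 significant digits.
val asBigDecimal: BigDecimal = BigDecimal(1) / BigDecimal(3)

println(asDouble)     // 0.3333333333333333
println(asBigDecimal) // 0.3333333333333333333333333333333333
```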
Types whose values can be tested for set membership with isin.
Types that can be added, subtracted and multiplied by Catalyst.
Types that can be ordered/compared by Catalyst.
When summing, Spark doesn't change these types:
- Long -> Long
- BigDecimal -> BigDecimal
- Double -> Double
For other types there are conversions:
- Int -> Long
- Short -> Long
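This widening can be checked against a local Spark session (assuming Spark is on the classpath; the session setup below is illustrative):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum
import org.apache.spark.sql.types.LongType

val spark = SparkSession.builder().master("local[1]").appName("sum-types").getOrCreate()
import spark.implicits._

// Summing an Int column: Catalyst widens the result to Long.
val intSum = Seq(1, 2, 3).toDF("i").agg(sum($"i")).schema.head.dataType
println(intSum) // LongType

spark.stop()
```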
Spark's variance and stddev functions always return Double.
An Injection[A, B] is a reversible function from A to B. It must obey forAll { a: A => invert(apply(a)) == a }.
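A minimal sketch of this contract in plain Scala; the instance below is illustrative, not an actual instance shipped with the library:

```scala
// A reversible mapping: invert must undo apply for every A.
trait Injection[A, B] {
  def apply(a: A): B
  def invert(b: B): A
}

import java.util.Date

// Example instance: store a java.util.Date as its epoch-millis Long.
val dateToLong: Injection[Date, Long] = new Injection[Date, Long] {
  def apply(d: Date): Long = d.getTime
  def invert(l: Long): Date = new Date(l)
}

// Round-trip law: invert(apply(a)) == a
val d = new Date(1234567890L)
assert(dateToLong.invert(dateToLong(d)) == d)
```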
Type for the internal Spark representation of a SQL date. If the spark.sql.functions were typed, [date_add][1] would for instance be defined as def date_add(d: SQLDate, i: Int): SQLDate.

[1]: https://spark.apache.org/docs/2.0.2/api/java/org/apache/spark/sql/functions.html#add_months(org.apache.spark.sql.Column,%20int)
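Internally, Spark stores a DateType value as the number of days since the Unix epoch, which is what such a typed SQLDate would wrap. A plain-Scala illustration:

```scala
import java.time.LocalDate

// Spark's internal DateType value: days since 1970-01-01, as an Int.
val sqlDate: Int = LocalDate.of(2017, 1, 1).toEpochDay.toInt
println(sqlDate) // 17167

// A typed date_add(d, i) would then simply add i days to that count.
val plusTen: Int = sqlDate + 10
```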
Type for the internal Spark representation of a timestamp. If the spark.sql.functions were typed, [current_timestamp][1] would for instance be defined as def current_timestamp(): SQLTimestamp.

[1]: https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/sql/functions.html#current_timestamp()
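Internally, Spark stores a TimestampType value as the number of microseconds since the Unix epoch; that is what such a typed SQLTimestamp would wrap. In plain Scala:

```scala
import java.time.Instant

// Spark's internal TimestampType value: microseconds since the epoch, as a Long.
val sqlTimestamp: Long = Instant.parse("1970-01-01T00:00:01Z").toEpochMilli * 1000L
println(sqlTimestamp) // 1000000
```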