sealed class TypedAggregate[T, U] extends AbstractTypedColumn[T, U]
Expression used in agg-like constructions.
- TypedAggregate
- AbstractTypedColumn
- UntypedExpression
- AnyRef
- Any
Instance Constructors
- new TypedAggregate(column: Column)(implicit uencoder: TypedEncoder[U])
- new TypedAggregate(expr: Expression)(implicit uenc: TypedEncoder[U])
Type Members
-
trait
Mapper[X] extends AnyRef
A helper class to simplify working with Optional fields.
val x: TypedColumn[Option[Int]] = _
x.opt.map(_*2) // This only compiles if the type of x is Option[X] (in this example X is of type Int)
- Definition Classes
- AbstractTypedColumn
- Note
Known issue: map() will NOT work when the applied function is a udf(). It will compile and then throw a runtime error.
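The Mapper returned by opt mirrors plain Scala's Option.map; a minimal plain-Scala sketch of the intended semantics (not using frameless):

```scala
// Option.map doubles the value when present and leaves None untouched,
// which is the column-level behaviour x.opt.map(_*2) emulates.
val present: Option[Int] = Some(21)
val absent: Option[Int]  = None

println(present.map(_ * 2)) // Some(42)
println(absent.map(_ * 2))  // None
```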
-
type
ThisType[A, B] = TypedAggregate[A, B]
- Definition Classes
- TypedAggregate → AbstractTypedColumn
Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
def
%(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]
Modulo (a.k.a. remainder) expression.
- Definition Classes
- AbstractTypedColumn
-
def
%[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Modulo (a.k.a. remainder) expression.
- Definition Classes
- AbstractTypedColumn
-
def
&[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Bitwise AND this expression and another expression.
df.select(df.col('colA) & (df.col('colB)))
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
&(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]
Bitwise AND this expression and another expression (of same type).
df.select(df.col('colA).cast[Int] & -1)
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
&&[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Boolean AND.
df.filter ( df.col('a) === 1 && df.col('b) > 5)
- Definition Classes
- AbstractTypedColumn
-
def
*(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]
Multiplication of this expression and a constant.
// The following multiplies a person's height by a constant.
people.select( people('height) * 2 )
- Definition Classes
- AbstractTypedColumn
-
def
*[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W], t: ClassTag[U]): ThisType[W, U]
Multiplication of this expression and another expression.
// The following multiplies a person's height by their weight.
people.select( people.col('height) * people.col('weight) )
- Definition Classes
- AbstractTypedColumn
-
def
+(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]
Sum of this expression (column) with a constant.
// The following adds a constant to a person's height.
people.select( people('height) + 2 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
+[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Sum of this expression and another expression.
// The following selects the sum of a person's height and weight.
people.select( people.col('height) + people.col('weight) )
- Definition Classes
- AbstractTypedColumn
-
def
-(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, U]
Subtraction. Subtract the other expression from this expression.
// The following subtracts a constant from a person's height.
people.select( people('height) - 1 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
-[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Subtraction. Subtract the other expression from this expression.
// The following selects the difference between people's height and their weight.
people.select( people.col('height) - people.col('weight) )
- Definition Classes
- AbstractTypedColumn
-
def
/(u: U)(implicit n: CatalystNumeric[U]): ThisType[T, Double]
Division of this expression by a constant.
// The following divides a person's height by 2.
people.select( people('height) / 2 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
/[Out, TT, W](other: ThisType[TT, U])(implicit n: CatalystDivisible[U, Out], e: TypedEncoder[Out], w: With.Aux[T, TT, W]): ThisType[W, Out]
Division of this expression by another expression.
// The following divides a person's height by their weight.
people.select( people('height) / people('weight) )
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
<(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]
Less than.
// The following selects people younger than 21.
df.select( df('age) < 21 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
<[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Less than.
// The following selects people younger than the maxAge column.
df.select( df('age) < df('maxAge) )
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
<=(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]
Less than or equal to.
// The following selects people aged 21 or younger.
df.select( df('age) <= 21 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
<=[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Less than or equal to.
// The following selects people no older than the maxAge column.
df.select( df('age) <= df('maxAge) )
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
=!=(u: U): ThisType[T, Boolean]
Inequality test.
df.filter(df.col('a) =!= "a")
- Definition Classes
- AbstractTypedColumn
-
def
=!=[TT, W](other: ThisType[TT, U])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Inequality test.
df.filter(df.col('a) =!= df.col('b))
- Definition Classes
- AbstractTypedColumn
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
===[TT, W](other: ThisType[TT, U])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Equality test.
df.filter( df.col('a) === df.col('b) )
- Definition Classes
- AbstractTypedColumn
-
def
===(u: U): ThisType[T, Boolean]
Equality test.
df.filter( df.col('a) === 1 )
- Definition Classes
- AbstractTypedColumn
-
def
>(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]
Greater than.
// The following selects people older than 21.
df.select( df('age) > 21 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
>[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Greater than.
// The following selects people older than the maxAge column.
df.select( df('age) > df('maxAge) )
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
>=(u: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]
Greater than or equal.
// The following selects people aged 21 or older.
df.select( df('age) >= 21 )
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
>=[TT, W](other: ThisType[TT, U])(implicit i0: CatalystOrdered[U], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Greater than or equal.
// The following selects people at least as old as the maxAge column.
df.select( df('age) >= df('maxAge) )
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
^[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Bitwise XOR this expression and another expression.
df.select(df.col('colA) ^ (df.col('colB)))
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
^(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]
Bitwise XOR this expression and another expression (of same type).
df.select(df.col('colA).cast[Long] ^ 1L)
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
and[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Boolean AND.
df.filter ( (df.col('a) === 1).and(df.col('b) > 5) )
- Definition Classes
- AbstractTypedColumn
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
def
asc(implicit catalystOrdered: CatalystOrdered[U]): SortedTypedColumn[T, U]
Returns an ascending ordering used in sorting.
- Definition Classes
- AbstractTypedColumn
-
def
between[TT1, TT2, W1, W2](lowerBound: ThisType[TT1, U], upperBound: ThisType[TT2, U])(implicit i0: CatalystOrdered[U], w0: With.Aux[T, TT1, W1], w1: With.Aux[TT2, W1, W2]): ThisType[W2, Boolean]
True if the current column is between the lower bound and upper bound, inclusive.
- lowerBound
another column of the same type
- upperBound
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
between(lowerBound: U, upperBound: U)(implicit i0: CatalystOrdered[U]): ThisType[T, Boolean]
True if the current column is between the lower bound and upper bound, inclusive.
- lowerBound
a constant of the same type
- upperBound
a constant of the same type
- Definition Classes
- AbstractTypedColumn
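Since both bounds are inclusive, the predicate between evaluates can be sketched in plain Scala (a hypothetical helper, not the library call):

```scala
// Inclusive-bounds check, mirroring column.between(lowerBound, upperBound).
def between(x: Int, lower: Int, upper: Int): Boolean =
  lower <= x && x <= upper

println(between(21, 18, 21)) // true: the upper bound is included
println(between(22, 18, 21)) // false
```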
-
def
bitwiseAND[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Bitwise AND this expression and another expression.
df.select(df.col('colA) bitwiseAND (df.col('colB)))
- Definition Classes
- AbstractTypedColumn
-
def
bitwiseAND(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]
Bitwise AND this expression and a constant (of same type).
df.select(df.col('colA).cast[Int] bitwiseAND -1)
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
bitwiseOR[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Bitwise OR this expression and another expression.
df.select(df.col('colA) bitwiseOR (df.col('colB)))
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
bitwiseOR(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]
Bitwise OR this expression and a constant (of same type).
df.select(df.col('colA).cast[Long] bitwiseOR 1L)
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
bitwiseXOR[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Bitwise XOR this expression and another expression.
df.select(df.col('colA) bitwiseXOR (df.col('colB)))
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
bitwiseXOR(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]
Bitwise XOR this expression and a constant (of same type).
df.select(df.col('colA).cast[Long] bitwiseXOR 1L)
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
cast[A](implicit arg0: TypedEncoder[A], c: CatalystCast[U, A]): ThisType[T, A]
Casts the column to a different type.
df.select(df('a).cast[Int])
- Definition Classes
- AbstractTypedColumn
-
def
clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
def
contains[TT, W](other: ThisType[TT, U])(implicit ev: =:=[U, String], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
String contains.
df.filter ( df.col('a).contains(df.col('b)) )
- other
a column whose values are used as the string being tested against.
- Definition Classes
- AbstractTypedColumn
-
def
contains(other: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]
String contains another string literal.
df.filter ( df.col('a).contains("foo") )
- other
a string that is being tested against.
- Definition Classes
- AbstractTypedColumn
-
def
desc(implicit catalystOrdered: CatalystOrdered[U]): SortedTypedColumn[T, U]
Returns a descending ordering used in sorting.
- Definition Classes
- AbstractTypedColumn
-
def
divide[Out, TT, W](other: ThisType[TT, U])(implicit arg0: TypedEncoder[Out], n: CatalystDivisible[U, Out], w: With.Aux[T, TT, W]): ThisType[W, Out]
Division of this expression by another expression.
// The following divides a person's height by their weight.
people.select( people('height) / people('weight) )
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
endsWith[TT, W](other: ThisType[TT, U])(implicit ev: =:=[U, String], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
String ends with.
df.filter ( df.col('a).endsWith(df.col('b)) )
- other
a column whose values are used as the suffix being tested against.
- Definition Classes
- AbstractTypedColumn
-
def
endsWith(other: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]
String ends with another string literal.
df.filter ( df.col('a).endsWith("foo") )
- other
a suffix that is being tested against.
- Definition Classes
- AbstractTypedColumn
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
val
expr: Expression
- Definition Classes
- AbstractTypedColumn → UntypedExpression
-
def
field[V](symbol: Lt[Symbol])(implicit i0: Exists[U, (symbol)#T, V], i1: TypedEncoder[V]): ThisType[T, V]
Returns a nested column matching the field symbol.
- V
the type of the nested field
- symbol
the field symbol
- Definition Classes
- AbstractTypedColumn
-
def
finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
getOrElse[Out](default: Out)(implicit arg0: TypedEncoder[Out], i0: =:=[U, Option[Out]]): ThisType[T, Out]
Convert an Optional column by providing a default value.
df( df('opt).getOrElse(defaultConstant) )
- Definition Classes
- AbstractTypedColumn
-
def
getOrElse[TT, W, Out](default: ThisType[TT, Out])(implicit i0: =:=[U, Option[Out]], i1: With.Aux[T, TT, W]): ThisType[W, Out]
Convert an Optional column by providing a default value.
df(df('opt).getOrElse(df('defaultValue)))
- Definition Classes
- AbstractTypedColumn
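Both getOrElse overloads follow ordinary Option.getOrElse semantics; a plain-Scala sketch (not using frameless):

```scala
// getOrElse unwraps Some, or substitutes the default when the value is None.
val opt:  Option[Int] = Some(5)
val none: Option[Int] = None

println(opt.getOrElse(0))  // 5
println(none.getOrElse(0)) // 0
```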
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
-
def
isNaN(implicit n: CatalystNaN[U]): ThisType[T, Boolean]
True if the current expression is a fractional number and is not NaN.
- Definition Classes
- AbstractTypedColumn
-
def
isNone(implicit i0: <:<[U, Option[_]]): ThisType[T, Boolean]
True if the current expression is an Option and it's None.
- Definition Classes
- AbstractTypedColumn
-
def
isNotNone(implicit i0: <:<[U, Option[_]]): ThisType[T, Boolean]
True if the current expression is an Option and it's not None.
- Definition Classes
- AbstractTypedColumn
-
def
isSome[V](exists: (ThisType[T, V]) ⇒ ThisType[T, Boolean])(implicit i0: <:<[U, Option[V]]): ThisType[T, Boolean]
True if the value for this optional column exists as expected (see Option.exists).
df.col('opt).isSome(_ === someOtherCol)
- Definition Classes
- AbstractTypedColumn
-
def
isSomeOrNone[V](exists: (ThisType[T, V]) ⇒ ThisType[T, Boolean])(implicit i0: <:<[U, Option[V]]): ThisType[T, Boolean]
True if the value for this optional column exists as expected, or is None (see Option.forall).
df.col('opt).isSomeOrNone(_ === someOtherCol)
- Definition Classes
- AbstractTypedColumn
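isSome and isSomeOrNone correspond to Option.exists and Option.forall; a plain-Scala sketch of how the two differ on a None value:

```scala
val some: Option[Int] = Some(3)
val none: Option[Int] = None

// isSome ~ Option.exists: false when the value is None.
println(some.exists(_ > 2)) // true
println(none.exists(_ > 2)) // false

// isSomeOrNone ~ Option.forall: vacuously true when the value is None.
println(some.forall(_ > 2)) // true
println(none.forall(_ > 2)) // true
```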
-
def
isin(values: U*)(implicit e: CatalystIsin[U]): ThisType[T, Boolean]
Returns true if the value of this column is contained in any of the arguments.
// The following selects people with age 15, 20, or 30.
df.select( df('age).isin(15, 20, 30) )
- values
constants of the same type
- Definition Classes
- AbstractTypedColumn
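The isin test is plain membership in the set of given constants; sketched in plain Scala:

```scala
// isin(15, 20, 30) behaves like membership in a constant set.
val ages = Set(15, 20, 30)
println(ages.contains(20)) // true
println(ages.contains(21)) // false
```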
-
def
like(literal: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]
SQL like expression. Returns a boolean column based on a SQL LIKE match.
val ds = TypedDataset.create(X2("foo", "bar") :: Nil)
// true
ds.select(ds('a).like("foo"))
// Selected column has value "bar"
ds.select(when(ds('a).like("f"), ds('a)).otherwise(ds('b)))
- Definition Classes
- AbstractTypedColumn
-
def
lit[U1](c: U1)(implicit arg0: TypedEncoder[U1]): TypedAggregate[T, U1]
Creates a typed column of either TypedColumn or TypedAggregate.
- Definition Classes
- TypedAggregate → AbstractTypedColumn
-
def
minus[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Subtraction. Subtract the other expression from this expression.
// The following selects the difference between people's height and their weight.
people.select( people.col('height) minus people.col('weight) )
- Definition Classes
- AbstractTypedColumn
-
def
mod[Out, TT, W](other: ThisType[TT, U])(implicit arg0: TypedEncoder[Out], n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, Out]
Modulo (a.k.a. remainder) expression.
- Definition Classes
- AbstractTypedColumn
-
def
multiply[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W], t: ClassTag[U]): ThisType[W, U]
Multiplication of this expression and another expression.
// The following multiplies a person's height by their weight.
people.select( people.col('height) multiply people.col('weight) )
- Definition Classes
- AbstractTypedColumn
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
def
opt[X](implicit x: <:<[U, Option[X]]): Mapper[X]
Makes it easier to work with Optional columns. It returns an instance of Mapper[X] where X is the type of the unwrapped Optional. E.g., in the case of Option[Long], X is of type Long.
val x: TypedColumn[Option[Int]] = _
x.opt.map(_*2)
- Definition Classes
- AbstractTypedColumn
-
def
or[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Boolean OR.
df.filter ( (df.col('a) === 1).or(df.col('b) > 5) )
- Definition Classes
- AbstractTypedColumn
-
def
plus[TT, W](other: ThisType[TT, U])(implicit n: CatalystNumeric[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Sum of this expression and another expression.
// The following selects the sum of a person's height and weight.
people.select( people.col('height) plus people.col('weight) )
- Definition Classes
- AbstractTypedColumn
-
def
rlike(literal: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]
SQL RLIKE expression (LIKE with Regex). Returns a boolean column based on a regex match.
val ds = TypedDataset.create(X1("foo") :: Nil)
// true
ds.select(ds('a).rlike("foo"))
// true
ds.select(ds('a).rlike(".*"))
- Definition Classes
- AbstractTypedColumn
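RLIKE matches when the regex is found anywhere in the string, rather than requiring a full match; a plain-Scala sketch of that behaviour using java.util.regex (a hypothetical helper, not the library call):

```scala
import java.util.regex.Pattern

// RLIKE-style test: true if the pattern occurs anywhere in the value.
def rlikeSketch(value: String, regex: String): Boolean =
  Pattern.compile(regex).matcher(value).find()

println(rlikeSketch("foobar", "foo"))  // true
println(rlikeSketch("foobar", "^bar")) // false: "bar" is not at the start
```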
-
def
startsWith[TT, W](other: ThisType[TT, U])(implicit ev: =:=[U, String], w: With.Aux[T, TT, W]): ThisType[W, Boolean]
String starts with.
df.filter ( df.col('a).startsWith(df.col('b)) )
- other
a column whose values are used as the prefix being tested against.
- Definition Classes
- AbstractTypedColumn
-
def
startsWith(other: String)(implicit ev: =:=[U, String]): ThisType[T, Boolean]
String starts with another string literal.
df.filter ( df.col('a).startsWith("foo") )
- other
a prefix that is being tested against.
- Definition Classes
- AbstractTypedColumn
-
def
substr[TT1, TT2, W1, W2](startPos: ThisType[TT1, Int], len: ThisType[TT2, Int])(implicit ev: =:=[U, String], w1: With.Aux[T, TT1, W1], w2: With.Aux[W1, TT2, W2]): ThisType[W2, String]
An expression that returns a substring.
df.select(df('a).substr(df('b), df('c)))
- startPos
expression for the starting position
- len
expression for the length of the substring
- Definition Classes
- AbstractTypedColumn
-
def
substr(startPos: Int, len: Int)(implicit ev: =:=[U, String]): ThisType[T, String]
An expression that returns a substring.
df.select(df('a).substr(0, 5))
- startPos
starting position
- len
length of the substring
- Definition Classes
- AbstractTypedColumn
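SQL's substring convention uses a 1-based start position; a plain-Scala sketch of that convention (a hypothetical helper, not the library call):

```scala
// 1-based substring, mirroring the SQL substring(str, pos, len) convention;
// a start position of 0 is treated the same as 1.
def substrSketch(s: String, startPos: Int, len: Int): String =
  s.drop(math.max(startPos - 1, 0)).take(len)

println(substrSketch("frameless", 1, 5)) // frame
println(substrSketch("frameless", 6, 4)) // less
```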
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
def
toString(): String
- Definition Classes
- UntypedExpression → AnyRef → Any
-
def
typed[W, U1](c: Column)(implicit arg0: TypedEncoder[U1]): TypedAggregate[W, U1]
Creates a typed column of either TypedColumn or TypedAggregate.
- Definition Classes
- TypedAggregate → AbstractTypedColumn
-
def
typed[W, U1](e: Expression)(implicit arg0: TypedEncoder[U1]): ThisType[W, U1]
Creates a typed column of either TypedColumn or TypedAggregate from an expression.
- Attributes
- protected
- Definition Classes
- AbstractTypedColumn
- implicit val uenc: TypedEncoder[U]
-
implicit
val
uencoder: TypedEncoder[U]
- Definition Classes
- AbstractTypedColumn → UntypedExpression
-
def
unary_!(implicit i0: <:<[U, Boolean]): ThisType[T, Boolean]
Inversion of boolean expression, i.e. NOT.
// Select rows that are not active (isActive === false)
df.filter( !df('isActive) )
- Definition Classes
- AbstractTypedColumn
-
def
unary_-(implicit n: CatalystNumeric[U]): ThisType[T, U]
Unary minus, i.e. negate the expression.
// The following negates all values in the amount column.
df.select( -df('amount) )
- Definition Classes
- AbstractTypedColumn
-
def
untyped: Column
Fall back to an untyped Column.
- Definition Classes
- AbstractTypedColumn
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
def
|[TT, W](other: ThisType[TT, U])(implicit n: CatalystBitwise[U], w: With.Aux[T, TT, W]): ThisType[W, U]
Bitwise OR this expression and another expression.
df.select(df.col('colA) | (df.col('colB)))
- other
another column of the same type
- Definition Classes
- AbstractTypedColumn
-
def
|(u: U)(implicit n: CatalystBitwise[U]): ThisType[T, U]
Bitwise OR this expression and another expression (of same type).
df.select(df.col('colA).cast[Long] | 1L)
- u
a constant of the same type
- Definition Classes
- AbstractTypedColumn
-
def
||[TT, W](other: ThisType[TT, Boolean])(implicit w: With.Aux[T, TT, W]): ThisType[W, Boolean]
Boolean OR.
df.filter ( df.col('a) === 1 || df.col('b) > 5)
- Definition Classes
- AbstractTypedColumn