org.apache.spark.sql.crossdata

XDDataFrame

class XDDataFrame extends DataFrame with SparkLoggerComponent

Extends a DataFrame to provide native access to datasources when performing Spark actions.
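
Usage sketch (illustrative, not part of the generated API docs). It assumes a SparkContext named sc, that XDContext from the same package can be built from it like a regular SQLContext, and that a table named students is registered on a native datasource; all of these names are placeholders.

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.crossdata.XDContext

    // Assumption: an existing SparkContext `sc` and a registered table "students".
    val xdContext = new XDContext(sc)

    // Queries issued through the XDContext are expected to be backed by XDDataFrame,
    // so Spark actions such as collect() or count() may be resolved natively by the
    // underlying datasource instead of launching a Spark job.
    val df = xdContext.sql("SELECT id, name FROM students WHERE age > 21")
    val rows  = df.collect()
    val total = df.count()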

Linear Supertypes
SparkLoggerComponent, Logging, LoggerComponent, DataFrame, Serializable, Serializable, Queryable, AnyRef, Any

Instance Constructors

  1. new XDDataFrame(sqlContext: SQLContext, logicalPlan: LogicalPlan)

Type Members

  1. trait Logger extends AnyRef

    Definition Classes
    LoggerComponent
  2. class SparkLogger extends com.stratio.common.utils.components.logger.impl.SparkLoggerComponent.Logger

    Definition Classes
    SparkLoggerComponent

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def agg(expr: Column, exprs: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  7. def agg(exprs: java.util.Map[String, String]): DataFrame

    Definition Classes
    DataFrame
  8. def agg(exprs: Map[String, String]): DataFrame

    Definition Classes
    DataFrame
  9. def agg(aggExpr: (String, String), aggExprs: (String, String)*): DataFrame

    Definition Classes
    DataFrame
  10. def alias(alias: Symbol): DataFrame

    Definition Classes
    DataFrame
  11. def alias(alias: String): DataFrame

    Definition Classes
    DataFrame
  12. def apply(colName: String): Column

    Definition Classes
    DataFrame
  13. def as(alias: Symbol): DataFrame

    Definition Classes
    DataFrame
  14. def as(alias: String): DataFrame

    Definition Classes
    DataFrame
  15. def as[U](implicit arg0: Encoder[U]): Dataset[U]

    Definition Classes
    DataFrame
    Annotations
    @Experimental()
  16. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  17. def cache(): XDDataFrame.this.type

    Definition Classes
    DataFrame
  18. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. def coalesce(numPartitions: Int): DataFrame

    Definition Classes
    DataFrame
  20. def col(colName: String): Column

    Definition Classes
    DataFrame
  21. def collect(executionType: ExecutionType): Array[Row]

    Collect using a specific ExecutionType.

    Collect using a specific ExecutionType. So far this is intended for testing purposes only. When using the Security Manager, this method has to be invoked with the parameter ExecutionType.Default in order to ensure that the execution workflow reaches the point where authorization is performed.

    executionType

    one of the ExecutionType values

    returns

    the query result

    Annotations
    @DeveloperApi()
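
    An illustrative call (a sketch; the import path and any ExecutionType values other than Default are assumptions):

        import org.apache.spark.sql.Row
        import org.apache.spark.sql.crossdata.ExecutionType

        // Assumption: `df` is an XDDataFrame, e.g. obtained from an XDContext query.
        // With the Security Manager enabled, the documentation above requires
        // ExecutionType.Default so that authorization is applied before execution.
        val rows: Array[Row] = df.collect(ExecutionType.Default)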
  22. def collect(): Array[Row]

    Definition Classes
    XDDataFrame → DataFrame
  23. def collectAsList(): List[Row]

    Definition Classes
    XDDataFrame → DataFrame
  24. def collectToPython(): Int

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    DataFrame
  25. def columns: Array[String]

    Definition Classes
    DataFrame
  26. def count(): Long

    Definition Classes
    XDDataFrame → DataFrame
  27. def cube(col1: String, cols: String*): GroupedData

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  28. def cube(cols: Column*): GroupedData

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  29. def describe(cols: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  30. def distinct(): DataFrame

    Definition Classes
    DataFrame
  31. def drop(col: Column): DataFrame

    Definition Classes
    DataFrame
  32. def drop(colName: String): DataFrame

    Definition Classes
    DataFrame
  33. def dropDuplicates(colNames: Array[String]): DataFrame

    Definition Classes
    DataFrame
  34. def dropDuplicates(colNames: Seq[String]): DataFrame

    Definition Classes
    DataFrame
  35. def dropDuplicates(): DataFrame

    Definition Classes
    DataFrame
  36. def dtypes: Array[(String, String)]

    Definition Classes
    DataFrame
  37. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  38. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  39. def except(other: DataFrame): DataFrame

    Definition Classes
    DataFrame
  40. def explain(): Unit

    Definition Classes
    DataFrame → Queryable
  41. def explain(extended: Boolean): Unit

    Definition Classes
    DataFrame → Queryable
  42. def explode[A, B](inputColumn: String, outputColumn: String)(f: (A) ⇒ TraversableOnce[B])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[B]): DataFrame

    Definition Classes
    DataFrame
  43. def explode[A <: Product](input: Column*)(f: (Row) ⇒ TraversableOnce[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    DataFrame
  44. def filter(conditionExpr: String): DataFrame

    Definition Classes
    DataFrame
  45. def filter(condition: Column): DataFrame

    Definition Classes
    DataFrame
  46. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  47. def first(): Row

    Definition Classes
    DataFrame
  48. def flatMap[R](f: (Row) ⇒ TraversableOnce[R])(implicit arg0: ClassTag[R]): RDD[R]

    Definition Classes
    DataFrame
  49. def flattenedCollect(): Array[Row]

  50. def foreach(f: (Row) ⇒ Unit): Unit

    Definition Classes
    DataFrame
  51. def foreachPartition(f: (Iterator[Row]) ⇒ Unit): Unit

    Definition Classes
    DataFrame
  52. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  53. def groupBy(col1: String, cols: String*): GroupedData

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  54. def groupBy(cols: Column*): GroupedData

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  55. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  56. def head(): Row

    Definition Classes
    DataFrame
  57. def head(n: Int): Array[Row]

    Definition Classes
    DataFrame
  58. def inputFiles: Array[String]

    Definition Classes
    DataFrame
  59. def intersect(other: DataFrame): DataFrame

    Definition Classes
    DataFrame
  60. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  61. def isLocal: Boolean

    Definition Classes
    DataFrame
  62. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  63. def javaRDD: JavaRDD[Row]

    Definition Classes
    DataFrame
  64. def javaToPython: JavaRDD[Array[Byte]]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    DataFrame
  65. def join(right: DataFrame, joinExprs: Column, joinType: String): DataFrame

    Definition Classes
    DataFrame
  66. def join(right: DataFrame, joinExprs: Column): DataFrame

    Definition Classes
    DataFrame
  67. def join(right: DataFrame, usingColumns: Seq[String], joinType: String): DataFrame

    Definition Classes
    DataFrame
  68. def join(right: DataFrame, usingColumns: Seq[String]): DataFrame

    Definition Classes
    DataFrame
  69. def join(right: DataFrame, usingColumn: String): DataFrame

    Definition Classes
    DataFrame
  70. def join(right: DataFrame): DataFrame

    Definition Classes
    DataFrame
  71. def limit(n: Int): DataFrame

    Definition Classes
    XDDataFrame → DataFrame
  72. def log: slf4j.Logger

    Attributes
    protected
    Definition Classes
    Logging
  73. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  74. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  75. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  76. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  77. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  78. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  79. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  80. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  81. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  82. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  83. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  84. val logger: Logger

    Definition Classes
    SparkLoggerComponent → LoggerComponent
  85. val logicalPlan: LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    DataFrame
  86. def map[R](f: (Row) ⇒ R)(implicit arg0: ClassTag[R]): RDD[R]

    Definition Classes
    DataFrame
  87. def mapPartitions[R](f: (Iterator[Row]) ⇒ Iterator[R])(implicit arg0: ClassTag[R]): RDD[R]

    Definition Classes
    DataFrame
  88. def na: DataFrameNaFunctions

    Definition Classes
    DataFrame
  89. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  90. final def notify(): Unit

    Definition Classes
    AnyRef
  91. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  92. def numericColumns: Seq[Expression]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    DataFrame
  93. def orderBy(sortExprs: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  94. def orderBy(sortCol: String, sortCols: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  95. def persist(newLevel: StorageLevel): XDDataFrame.this.type

    Definition Classes
    DataFrame
  96. def persist(): XDDataFrame.this.type

    Definition Classes
    DataFrame
  97. def printSchema(): Unit

    Definition Classes
    DataFrame → Queryable
  98. val queryExecution: QueryExecution

    Definition Classes
    XDDataFrame → DataFrame → Queryable
  99. def randomSplit(weights: Array[Double]): Array[DataFrame]

    Definition Classes
    DataFrame
  100. def randomSplit(weights: Array[Double], seed: Long): Array[DataFrame]

    Definition Classes
    DataFrame
  101. lazy val rdd: RDD[Row]

    Definition Classes
    DataFrame
  102. def registerTempTable(tableName: String): Unit

    Definition Classes
    DataFrame
  103. def repartition(partitionExprs: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  104. def repartition(numPartitions: Int, partitionExprs: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  105. def repartition(numPartitions: Int): DataFrame

    Definition Classes
    DataFrame
  106. def resolve(colName: String): NamedExpression

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    DataFrame
  107. def rollup(col1: String, cols: String*): GroupedData

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  108. def rollup(cols: Column*): GroupedData

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  109. def sample(withReplacement: Boolean, fraction: Double): DataFrame

    Definition Classes
    DataFrame
  110. def sample(withReplacement: Boolean, fraction: Double, seed: Long): DataFrame

    Definition Classes
    DataFrame
  111. def schema: StructType

    Definition Classes
    DataFrame → Queryable
  112. def select(col: String, cols: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  113. def select(cols: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  114. def selectExpr(exprs: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  115. def show(numRows: Int, truncate: Boolean): Unit

    Definition Classes
    DataFrame
  116. def show(truncate: Boolean): Unit

    Definition Classes
    DataFrame
  117. def show(): Unit

    Definition Classes
    DataFrame
  118. def show(numRows: Int): Unit

    Definition Classes
    DataFrame
  119. def sort(sortExprs: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  120. def sort(sortCol: String, sortCols: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  121. def sortWithinPartitions(sortExprs: Column*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  122. def sortWithinPartitions(sortCol: String, sortCols: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  123. val sqlContext: SQLContext

    Definition Classes
    XDDataFrame → DataFrame → Queryable
  124. def stat: DataFrameStatFunctions

    Definition Classes
    DataFrame
  125. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  126. def take(n: Int): Array[Row]

    Definition Classes
    DataFrame
  127. def takeAsList(n: Int): List[Row]

    Definition Classes
    DataFrame
  128. def toDF(colNames: String*): DataFrame

    Definition Classes
    DataFrame
    Annotations
    @varargs()
  129. def toDF(): DataFrame

    Definition Classes
    DataFrame
  130. def toJSON: RDD[String]

    Definition Classes
    DataFrame
  131. def toJavaRDD: JavaRDD[Row]

    Definition Classes
    DataFrame
  132. def toString(): String

    Definition Classes
    Queryable → AnyRef → Any
  133. def transform[U](t: (DataFrame) ⇒ DataFrame): DataFrame

    Definition Classes
    DataFrame
  134. def unionAll(other: DataFrame): DataFrame

    Definition Classes
    DataFrame
  135. def unpersist(): XDDataFrame.this.type

    Definition Classes
    DataFrame
  136. def unpersist(blocking: Boolean): XDDataFrame.this.type

    Definition Classes
    DataFrame
  137. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  138. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  139. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  140. def where(conditionExpr: String): DataFrame

    Definition Classes
    DataFrame
  141. def where(condition: Column): DataFrame

    Definition Classes
    DataFrame
  142. def withColumn(colName: String, col: Column): DataFrame

    Definition Classes
    DataFrame
  143. def withColumnRenamed(existingName: String, newName: String): DataFrame

    Definition Classes
    DataFrame
  144. def write: DataFrameWriter

    Definition Classes
    DataFrame
    Annotations
    @Experimental()

Deprecated Value Members

  1. def createJDBCTable(url: String, table: String, allowExisting: Boolean): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.jdbc(). This will be removed in Spark 2.0.

  2. def insertInto(tableName: String): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.mode(SaveMode.Append).saveAsTable(tableName). This will be removed in Spark 2.0.

  3. def insertInto(tableName: String, overwrite: Boolean): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.mode(SaveMode.Append|SaveMode.Overwrite).saveAsTable(tableName). This will be removed in Spark 2.0.

  4. def insertIntoJDBC(url: String, table: String, overwrite: Boolean): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.jdbc(). This will be removed in Spark 2.0.

  5. def save(source: String, mode: SaveMode, options: java.util.Map[String, String]): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).mode(mode).options(options).save(). This will be removed in Spark 2.0.

  6. def save(source: String, mode: SaveMode, options: Map[String, String]): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).mode(mode).options(options).save(). This will be removed in Spark 2.0.

  7. def save(path: String, source: String, mode: SaveMode): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).mode(mode).save(path). This will be removed in Spark 2.0.

  8. def save(path: String, source: String): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).save(path). This will be removed in Spark 2.0.

  9. def save(path: String, mode: SaveMode): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.mode(mode).save(path). This will be removed in Spark 2.0.

  10. def save(path: String): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.save(path). This will be removed in Spark 2.0.

  11. def saveAsParquetFile(path: String): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.parquet(path). This will be removed in Spark 2.0.

  12. def saveAsTable(tableName: String, source: String, mode: SaveMode, options: java.util.Map[String, String]): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).mode(mode).options(options).saveAsTable(tableName). This will be removed in Spark 2.0.

  13. def saveAsTable(tableName: String, source: String, mode: SaveMode, options: Map[String, String]): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).mode(mode).options(options).saveAsTable(tableName). This will be removed in Spark 2.0.

  14. def saveAsTable(tableName: String, source: String, mode: SaveMode): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).mode(mode).saveAsTable(tableName). This will be removed in Spark 2.0.

  15. def saveAsTable(tableName: String, source: String): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.format(source).saveAsTable(tableName). This will be removed in Spark 2.0.

  16. def saveAsTable(tableName: String, mode: SaveMode): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.mode(mode).saveAsTable(tableName). This will be removed in Spark 2.0.

  17. def saveAsTable(tableName: String): Unit

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use write.saveAsTable(tableName). This will be removed in Spark 2.0.

  18. def toSchemaRDD: DataFrame

    Definition Classes
    DataFrame
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use toDF. This will be removed in Spark 2.0.
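
Migration sketch for the deprecated writers above (illustrative; df, the paths and table names are placeholders, following the replacements named in the deprecation notes):

    import java.util.Properties
    import org.apache.spark.sql.SaveMode

    // Instead of df.save(path, source, mode):
    df.write.format("parquet").mode(SaveMode.Overwrite).save("/tmp/out")

    // Instead of df.insertInto(tableName):
    df.write.mode(SaveMode.Append).saveAsTable("target_table")

    // Instead of df.createJDBCTable(url, table, allowExisting) or df.insertIntoJDBC(url, table, overwrite):
    df.write.mode(SaveMode.Overwrite).jdbc("jdbc:postgresql://host/db", "target_table", new Properties())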
