com.holdenkarau.spark.testing

JavaDataFrameSuiteBase

class JavaDataFrameSuiteBase extends SharedJavaSparkContext with DataFrameSuiteBaseLike with JavaTestSuite

Linear Supertypes
  JavaTestSuite, DataFrameSuiteBaseLike, Serializable, Serializable, TestSuiteLike, SharedJavaSparkContext, SparkContextProvider, AnyRef, Any

Instance Constructors

  1. new JavaDataFrameSuiteBase()
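
    A minimal JUnit 4 sketch of how a Java test might extend this base class (the test class and its contents are illustrative, not part of the API):

        import org.apache.spark.sql.Dataset;
        import org.apache.spark.sql.Row;
        import org.junit.Test;

        import com.holdenkarau.spark.testing.JavaDataFrameSuiteBase;

        // Hypothetical test: a select("*") projection should leave a DataFrame unchanged.
        public class SimpleDataFrameTest extends JavaDataFrameSuiteBase {
          @Test
          public void selectStarIsIdentity() {
            Dataset<Row> input = spark().range(10).toDF();
            assertDataFrameEquals(input, input.select("*"));
          }
        }

    SharedJavaSparkContext is designed to create the shared context before each test runs (runBefore), so the SparkContext, SQLContext, and SparkSession members listed below are available inside test methods.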

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def appID(): String

  7. def approxEquals(r1: Row, r2: Row, tol: Double): Boolean

    Definition Classes
    DataFrameSuiteBaseLike
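
    For example, a row-level comparison with a tolerance on floating-point fields might look like this (values illustrative; the snippet assumes it runs inside a test method of a class extending JavaDataFrameSuiteBase):

        import org.apache.spark.sql.Row;
        import org.apache.spark.sql.RowFactory;

        Row expected = RowFactory.create("a", 1.0);
        Row actual = RowFactory.create("a", 1.0 + 1e-7);
        assertTrue(approxEquals(expected, actual, 1e-6));   // difference within tol
        assertTrue(!approxEquals(expected, actual, 1e-8));  // difference exceeds tol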
  8. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  9. def assert[U](message: String, expected: U, actual: U)(implicit CT: ClassTag[U]): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  10. def assert[U](expected: U, actual: U)(implicit CT: ClassTag[U]): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  11. def assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double): Unit

    Compares two DataFrames for equality, checking first that the schemas are the same. When comparing inexact (floating-point) fields, the comparison uses tol.

    tol

    max acceptable tolerance; should be less than 1.

    Definition Classes
    DataFrameSuiteBaseLike
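
    A sketch of a tolerance-based DataFrame comparison (column contents hypothetical; assumes a test method in a class extending JavaDataFrameSuiteBase):

        import java.util.Arrays;

        import org.apache.spark.sql.Dataset;
        import org.apache.spark.sql.Encoders;
        import org.apache.spark.sql.Row;

        Dataset<Row> expected = spark().createDataset(
            Arrays.asList(1.0, 2.0), Encoders.DOUBLE()).toDF();
        Dataset<Row> result = spark().createDataset(
            Arrays.asList(1.0 + 1e-7, 2.0), Encoders.DOUBLE()).toDF();
        assertDataFrameApproximateEquals(expected, result, 1e-6);  // within tol, passes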
  12. def assertDataFrameEquals(expected: DataFrame, result: DataFrame): Unit

    Compares two DataFrames for equality: checks the schema and, if it matches, checks whether the rows are equal.

    Definition Classes
    DataFrameSuiteBaseLike
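
    Because the schema is checked before the rows, two DataFrames with the same data but different column names will not compare equal. A small illustration (column names hypothetical; inside a test method):

        Dataset<Row> left = spark().range(3).toDF("id");
        Dataset<Row> right = spark().range(3).toDF("id");
        assertDataFrameEquals(left, right);  // same schema and rows: passes
        // spark().range(3).toDF("n") has a different schema, so comparing it
        // against left would fail on the schema check before any rows are compared.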
  13. def assertEmpty[U](arr: Array[U])(implicit CT: ClassTag[U]): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  14. def assertTrue(expected: Boolean): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
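
    These JavaTestSuite helpers behave like conventional xUnit assertions. Note that the generic assert and assertEmpty overloads take an implicit scala.reflect.ClassTag, which surfaces as an explicit trailing argument when called from Java (and assert also collides with the Java keyword of the same name), so assertTrue is the most convenient from Java source. A minimal sketch (inside a test method):

        import java.util.Arrays;

        assertTrue(jsc().parallelize(Arrays.asList(1, 2, 3)).count() == 3);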
  15. def beforeAllTestCasesHook(): Unit

  16. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  17. def conf(): SparkConf
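
    conf() supplies the SparkConf used for the shared context. Assuming it can be overridden, as is the usual spark-testing-base pattern, a hypothetical customization might look like:

        import org.apache.spark.SparkConf;

        import com.holdenkarau.spark.testing.JavaDataFrameSuiteBase;

        // Hypothetical subclass: fewer shuffle partitions for faster local tests.
        public class TunedDataFrameTest extends JavaDataFrameSuiteBase {
          @Override
          public SparkConf conf() {
            return super.conf().set("spark.sql.shuffle.partitions", "4");
          }
        }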

  18. implicit def enableHiveSupport: Boolean

    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  19. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  20. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  21. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  22. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  23. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  24. implicit def impSqlContext: SQLContext

    Attributes
    protected
    Definition Classes
    DataFrameSuiteBaseLike
  25. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  26. def jsc(): JavaSparkContext

    Definition Classes
    SharedJavaSparkContext
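
    jsc() exposes the shared JavaSparkContext for building RDD-based fixtures, for instance (data illustrative; inside a test method):

        import java.util.Arrays;

        import org.apache.spark.api.java.JavaRDD;

        JavaRDD<String> words = jsc().parallelize(Arrays.asList("spark", "testing"));
        assertTrue(words.count() == 2);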
  27. val maxUnequalRowsToShow: Int

    Definition Classes
    DataFrameSuiteBaseLike
  28. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  29. final def notify(): Unit

    Definition Classes
    AnyRef
  30. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  31. def runBefore(): Unit

    Definition Classes
    SharedJavaSparkContext
  32. def sc(): SparkContext

  33. def setup(sc: SparkContext): Unit

    Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory.

    This _should_ be called by the context provider automatically.

    Definition Classes
    SharedJavaSparkContext → SparkContextProvider
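
    A subclass that needs extra per-context initialization would typically override setup and delegate to the default implementation; a sketch (the log-level tweak is purely illustrative):

        import org.apache.spark.SparkContext;

        import com.holdenkarau.spark.testing.JavaDataFrameSuiteBase;

        public class QuietDataFrameTest extends JavaDataFrameSuiteBase {
          @Override
          public void setup(SparkContext sc) {
            super.setup(sc);        // keep the default checkpoint-directory setup
            sc.setLogLevel("WARN"); // hypothetical extra initialization
          }
        }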
  34. lazy val spark: SparkSession

    Definition Classes
    DataFrameSuiteBaseLike
  35. def sqlBeforeAllTestCases(): Unit

    Definition Classes
    DataFrameSuiteBaseLike
  36. lazy val sqlContext: SQLContext

    Definition Classes
    DataFrameSuiteBaseLike
  37. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  38. def toString(): String

    Definition Classes
    AnyRef → Any
  39. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
