Class com.holdenkarau.spark.testing.JavaDataFrameSuiteBase

class JavaDataFrameSuiteBase extends SharedJavaSparkContext with DataFrameSuiteBaseLike with JavaTestSuite
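
A minimal usage sketch (the test class name, JUnit 4 setup, and Spark 1.x imports below are assumptions for illustration, not part of this API page): a Java test extends this class and picks up the shared JavaSparkContext via jsc() and the HiveContext via sqlContext().

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.hive.HiveContext;
    import org.junit.Test;

    import com.holdenkarau.spark.testing.JavaDataFrameSuiteBase;

    // Hypothetical test class: the base class wires up the shared SparkContext
    // and SQL context before the test cases run.
    public class ExampleDataFrameTest extends JavaDataFrameSuiteBase {

      @Test
      public void contextsAreProvidedByTheBaseClass() {
        JavaSparkContext context = jsc();   // from SharedJavaSparkContext
        HiveContext sql = sqlContext();     // lazy HiveContext from DataFrameSuiteBaseLike
        assertTrue(context != null && sql != null);
      }
    }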

Linear Supertypes

JavaTestSuite, DataFrameSuiteBaseLike, Serializable, Serializable, TestSuiteLike, SharedJavaSparkContext, SparkContextProvider, AnyRef, Any

Instance Constructors

  1. new JavaDataFrameSuiteBase()


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def appID(): String
  5. def approxEquals(r1: Row, r2: Row, tol: Double): Boolean

    Definition Classes
    DataFrameSuiteBaseLike
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def assert[U](message: String, expected: U, actual: U)(implicit CT: ClassTag[U]): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  8. def assert[U](expected: U, actual: U)(implicit CT: ClassTag[U]): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  9. def assertDataFrameApproximateEquals(expected: DataFrame, result: DataFrame, tol: Double): Unit

    Compares two DataFrames for equality, checking that the schemas are the same. When comparing inexact fields, uses tol. See the usage sketch after this member list.

    tol

    maximum acceptable tolerance; should be less than 1.

    Definition Classes
    DataFrameSuiteBaseLike
  10. def assertDataFrameEquals(expected: DataFrame, result: DataFrame): Unit

    Compares two DataFrames for equality: checks the schema and, if that matches, checks whether the rows are equal. See the usage sketch after this member list.

    Definition Classes
    DataFrameSuiteBaseLike
  11. def assertEmpty[U](arr: Array[U])(implicit CT: ClassTag[U]): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  12. def assertTrue(expected: Boolean): Unit

    Definition Classes
    JavaTestSuite → TestSuiteLike
  13. def beforeAllTestCasesHook(): Unit
  14. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  15. def conf(): SparkConf
  16. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  18. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  19. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  20. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  21. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  22. def jsc(): JavaSparkContext

    Definition Classes
    SharedJavaSparkContext
  23. val maxUnequalRowsToShow: Int

    Definition Classes
    DataFrameSuiteBaseLike
  24. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  25. final def notify(): Unit

    Definition Classes
    AnyRef
  26. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  27. def runBefore(): Unit

    Definition Classes
    SharedJavaSparkContext
  28. def sc(): SparkContext
  29. def setup(sc: SparkContext): Unit

    Setup work to be called when creating a new SparkContext. The default implementation currently sets a checkpoint directory. This should be called by the context provider automatically.

    Definition Classes
    SharedJavaSparkContext → SparkContextProvider
  30. def sqlBeforeAllTestCases(): Unit

    Definition Classes
    DataFrameSuiteBaseLike
  31. lazy val sqlContext: HiveContext

    Definition Classes
    DataFrameSuiteBaseLike
  32. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  33. def toString(): String

    Definition Classes
    AnyRef → Any
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
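
Continuing the hypothetical Java test class sketched after the class signature above, the sketch below illustrates the two DataFrame assertions from entries 9 and 10 (the column names, data, and the 0.1 tolerance are made up; the Spark 1.x DataFrame API is assumed).

    import java.util.Arrays;
    import java.util.List;

    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.RowFactory;
    import org.apache.spark.sql.types.DataTypes;
    import org.apache.spark.sql.types.StructField;
    import org.apache.spark.sql.types.StructType;
    import org.junit.Test;

    import com.holdenkarau.spark.testing.JavaDataFrameSuiteBase;

    public class ExampleDataFrameAssertionsTest extends JavaDataFrameSuiteBase {

      // Hypothetical helper: builds a two-column (name, score) DataFrame.
      private DataFrame scores(List<Row> rows) {
        StructType schema = DataTypes.createStructType(new StructField[] {
            DataTypes.createStructField("name", DataTypes.StringType, false),
            DataTypes.createStructField("score", DataTypes.DoubleType, false) });
        return sqlContext().createDataFrame(jsc().parallelize(rows), schema);
      }

      @Test
      public void identicalDataFramesCompareEqual() {
        DataFrame expected = scores(Arrays.asList(RowFactory.create("a", 1.0)));
        DataFrame result = scores(Arrays.asList(RowFactory.create("a", 1.0)));
        // Checks the schema first, then the rows.
        assertDataFrameEquals(expected, result);
      }

      @Test
      public void smallNumericDriftIsAcceptedWithinTolerance() {
        DataFrame expected = scores(Arrays.asList(RowFactory.create("a", 1.0)));
        DataFrame result = scores(Arrays.asList(RowFactory.create("a", 1.05)));
        // Inexact (floating point) fields are compared within tol = 0.1.
        assertDataFrameApproximateEquals(expected, result, 0.1);
      }
    }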
