Object spark.jobserver.HiveLoaderJob

object HiveLoaderJob extends SparkHiveJob

A test job that accepts a HiveContext, as opposed to the regular SparkContext. It initializes some dummy data into a table, reads it back out, and returns a count. (A Hive metastore will be created at job-server/metastore_db if Hive isn't configured.)
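The members listed below show how such a job is assembled: the CREATE TABLE statement is pieced together from the tableCreate, tableArgs, tableRowFormat, tableColFormat, tableMapFormat and tableAs fragments, the data file is referenced by loadPath, and runJob/validate fulfil the SparkJobBase contract with C = HiveContext. The following is a minimal sketch of a job written in this style; the object name, table name, DDL fragments and file path are illustrative placeholders, not the exact values used by HiveLoaderJob.

  import com.typesafe.config.Config
  import org.apache.spark.sql.hive.HiveContext
  import spark.jobserver.{SparkHiveJob, SparkJobValid, SparkJobValidation}

  object DummyHiveLoaderJob extends SparkHiveJob {
    // Illustrative DDL fragments; HiveLoaderJob builds its CREATE TABLE
    // statement from similar pieces (tableCreate, tableArgs, tableRowFormat, ...).
    val tableCreate    = "CREATE TABLE `default`.`dummy_addresses`"
    val tableArgs      = "(`firstName` STRING, `lastName` STRING, `city` STRING)"
    val tableRowFormat = "ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'"
    val tableAs        = "STORED AS TEXTFILE"

    // Hypothetical local file containing '|'-delimited rows.
    val loadPath = "'/tmp/dummy_addresses.txt'"

    // This test job takes no input, so every request is accepted.
    def validate(hive: HiveContext, config: Config): SparkJobValidation = SparkJobValid

    def runJob(hive: HiveContext, config: Config): Any = {
      // Recreate the table, load the dummy data, read it back and return a count.
      hive.sql("DROP TABLE IF EXISTS `default`.`dummy_addresses`")
      hive.sql(s"$tableCreate $tableArgs $tableRowFormat $tableAs")
      hive.sql(s"LOAD DATA LOCAL INPATH $loadPath OVERWRITE INTO TABLE `default`.`dummy_addresses`")
      hive.sql("SELECT * FROM `default`.`dummy_addresses`").count()
    }
  }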

Linear Supertypes

SparkHiveJob, SparkJobBase, AnyRef, Any

Type Members

  1. type C = HiveContext

    Definition Classes
    SparkHiveJob → SparkJobBase

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. val loadPath: String

  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. final def notify(): Unit

    Definition Classes
    AnyRef
  15. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  16. def runJob(hive: HiveContext, config: Config): Any

    This is the entry point for a Spark Job Server to execute Spark jobs. This function should create or reuse RDDs and return the result at the end, which the Job Server will cache or display. An implementation in this style is sketched under the class description at the top of this page.

    returns

    the job result

    Definition Classes
    HiveLoaderJob → SparkJobBase
  17. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  18. val tableArgs: String

  19. val tableAs: String

  20. val tableColFormat: String

  21. val tableCreate: String

  22. val tableMapFormat: String

  23. val tableRowFormat: String

  24. def toString(): String

    Definition Classes
    AnyRef → Any
  25. def validate(hive: HiveContext, config: Config): SparkJobValidation

    This method is called by the job server to allow jobs to validate their input and reject invalid job requests. If SparkJobInvalid is returned, then the job server returns 400 to the user. NOTE: this method should return very quickly; if it responds slowly, the job server may time out trying to start this job. (A sketch of this validation pattern appears after this member list.)

    returns

    either SparkJobValid or SparkJobInvalid

    Definition Classes
    HiveLoaderJob → SparkJobBase
  26. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
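As noted in the validate entry above, validation runs before runJob and must be quick; returning SparkJobInvalid makes the job server answer the request with HTTP 400 instead of starting the job. HiveLoaderJob itself needs no input, so its validate can simply accept every request, but a job that does take input would typically check its Config there. The following sketch illustrates that pattern; the object name CountTableJob and the input.table setting are hypothetical, not part of HiveLoaderJob.

  import com.typesafe.config.Config
  import org.apache.spark.sql.hive.HiveContext
  import spark.jobserver.{SparkHiveJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

  import scala.util.Try

  object CountTableJob extends SparkHiveJob {
    // Reject the request up front if the required setting is missing; the
    // check is cheap, so the job server gets its answer immediately.
    def validate(hive: HiveContext, config: Config): SparkJobValidation =
      Try(config.getString("input.table"))
        .map(_ => SparkJobValid)
        .getOrElse(SparkJobInvalid("No input.table config param"))

    // Only runs if validate returned SparkJobValid.
    def runJob(hive: HiveContext, config: Config): Any =
      hive.sql(s"SELECT * FROM ${config.getString("input.table")}").count()
  }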
