spark.jobserver.python

Class PythonJob

case class PythonJob[X <: PythonContextLike](eggPath: String, modulePath: String, py4JImports: Seq[String]) extends api.SparkJobBase with Product with Serializable

Linear Supertypes
Serializable, Serializable, Product, Equals, api.SparkJobBase, AnyRef, Any

Instance Constructors

  1. new PythonJob(eggPath: String, modulePath: String, py4JImports: Seq[String])

Type Members

  1. type C = X

    Definition Classes
    PythonJob → SparkJobBase
  2. type JobData = Config

    Definition Classes
    PythonJob → SparkJobBase
  3. type JobOutput = Any

    Definition Classes
    PythonJob → SparkJobBase

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. val eggPath: String
  7. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  11. val logger: Logger
  12. val modulePath: String
  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. final def notify(): Unit

    Definition Classes
    AnyRef
  15. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  16. val py4JImports: Seq[String]
  17. def runJob(sc: X, runtime: JobEnvironment, data: Config): Any

    This is the entry point for a Spark Job Server to execute Python jobs. It calls a Python subprocess to execute the relevant Python job class.

    sc

    a SparkContext or similar for the job. May be reused across jobs.

    runtime

    the JobEnvironment containing runtime information pertaining to the job and context.

    data

    not used for Python jobs

    returns

    the job result

    Definition Classes
    PythonJob → SparkJobBase
  18. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  19. def validate(sc: X, runtime: JobEnvironment, config: Config): Or[Config, Every[ValidationProblem]]

    To support a useful validate method for Python jobs, we would have to call two Python processes, one for validate and one for runJob. However, this would be inefficient, and it would mean converting JobData into a Java object and then back into a Python object for runJob.

    So for Python jobs this method simply reports the job as valid. Validation by the underlying Python class is performed within the subprocess called during runJob.

    sc

    a SparkContext or similar for the job. May be reused across jobs.

    runtime

    the JobEnvironment containing runtime information pertaining to the job and context.

    config

    the Typesafe Config object passed into the job request

    returns

    Always returns the jobConfig, so it will be passed on to runJob as the job data.

    Definition Classes
    PythonJob → SparkJobBase
  20. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  21. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
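The runJob method above delegates to a Python subprocess that runs the user's job class. As a rough, self-contained sketch of what such a Python job class might look like (the SparkJob base class, the run_job name, and the argument order are illustrative assumptions modeled on the job server's Python API, stubbed here so the example runs without Spark or a job server):

```python
# Hypothetical sketch of the Python side of a PythonJob. The base class,
# method names, and argument order are assumptions for illustration only;
# they are stubbed so the example runs without Spark or a job server.

class SparkJob:
    """Stand-in for the job server's Python job base class (assumed)."""
    def run_job(self, context, runtime, data):
        raise NotImplementedError

class WordCountJob(SparkJob):
    def run_job(self, context, runtime, data):
        # `context` corresponds to the PythonContextLike handed over from
        # Scala; here a plain list stands in for an RDD-like collection.
        return len(context)

# Simulate what the subprocess would do when invoked by runJob:
result = WordCountJob().run_job(["a", "b", "c"], runtime=None, data={})
print(result)  # 3
```

The result returned by the Python class is what surfaces as the `Any` return value of runJob on the Scala side.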
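Because validate is effectively a no-op on the Scala side, validation for Python jobs happens inside the Python subprocess during runJob. A hedged sketch of how a Python job class might perform its own validation (the validate/run_job names and the list-of-problem-strings convention are assumptions for illustration, not the actual job server Python API):

```python
# Hypothetical sketch: validation performed inside the Python subprocess.
# The validate/run_job names and the "list of problem strings" convention
# are illustrative assumptions, not the actual job server Python API.

class WordCountJob:
    def validate(self, context, runtime, config):
        # Return validated job data, or a list of problems if invalid.
        if "input.string" not in config:
            return ["config must contain input.string"]
        return config["input.string"].split()

    def run_job(self, context, runtime, data):
        return len(data)

job = WordCountJob()
problems = job.validate(None, None, {})                     # invalid config
data = job.validate(None, None, {"input.string": "a b c"})  # valid config
print(problems)                       # ['config must contain input.string']
print(job.run_job(None, None, data))  # 3
```

This keeps validation in one place (the Python class) and avoids the double subprocess round-trip the Scala comment warns about.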
