ammonite

Spark

class Spark extends Serializable

The Spark entry point from an Ammonite session.

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new Spark(ttl: Duration = Spark.defaultTtl, yarnVersion: String = "2.6.0")(implicit interpreter: Interpreter, classpath: Classpath)
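A hypothetical construction sketch, as it might look in an Ammonite session. It assumes the artifact providing ammonite.Spark is already on the session classpath, and that the session supplies the implicit Interpreter and Classpath the constructor requires; the value names are illustrative.

```scala
// Sketch only: assumes an Ammonite session with ammonite.Spark on the classpath,
// which provides the implicit Interpreter and Classpath parameters.
@ import ammonite.Spark

@ val spark = new Spark()  // defaults: ttl = Spark.defaultTtl, yarnVersion = "2.6.0"

@ import scala.concurrent.duration._

// Both parameters can be overridden explicitly:
@ val shortLived = new Spark(ttl = 30.minutes, yarnVersion = "2.6.0")
```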

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def cancelTtl(): Unit

  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  14. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  15. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  16. final def notify(): Unit

    Definition Classes
    AnyRef
  17. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  18. def sc: SparkContext

    The SparkContext associated with this handle. Lazily initialized on first call.

    Its configuration can be customized prior to initialization through sparkConf or with the withConf method.

    Initializing it also launches a web server that serves the REPL build products.

    It is stopped automatically when the host interpreter stops, and can also be stopped manually with the stop method. If stopped through the stop method, calling sc again triggers the creation of a new SparkContext.

  19. def setConfDefaults(conf: SparkConf): Unit

    Called before creation of the SparkContext to set up the SparkConf.

  20. def sparkConf: SparkConf

    The SparkConf associated with this handle.

  21. def sparkContext: SparkContext

    Alias for sc.

  22. lazy val sqlContext: SQLContext

    The SparkSQL context associated with this handle. Lazily initialized on first call.

  23. def start(): Unit

    Triggers the initialization of the SparkContext, if not already started.

  24. def stop(): Unit

    Stops the SparkContext associated with this handle. The context previously returned should no longer be considered valid after that. The web server launched along with the context is stopped too.

    Calling sc again will trigger the creation of a new SparkContext.

  25. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  26. def toString(): String

    Definition Classes
    Spark → AnyRef → Any
  27. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. def withConf(f: (SparkConf) ⇒ SparkConf): Unit

    Helper function to add custom settings to the SparkConf associated with this handle.
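Taken together, a typical lifecycle for this handle might look like the following sketch. It assumes a Spark instance named spark is in scope in an Ammonite session; the configuration keys are standard Spark settings chosen for illustration.

```scala
// Sketch only: assumes `spark: ammonite.Spark` is in scope in an Ammonite session.

// Customize the SparkConf before the context is created (withConf takes
// SparkConf => SparkConf, and SparkConf.set returns the conf, so calls chain):
spark.withConf(_
  .set("spark.executor.memory", "2g")
  .set("spark.executor.instances", "4"))

// First access lazily creates the SparkContext (and starts the web server
// serving the REPL build products):
val sc = spark.sc
val rdd = sc.parallelize(1 to 100)

// sqlContext is likewise lazily initialized on first call:
spark.sqlContext

// Stop the context (and its web server); the previous `sc` is no longer valid:
spark.stop()

// Accessing sc again triggers the creation of a fresh SparkContext:
val sc2 = spark.sc
```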

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
