ammonite.spark

Spark

class Spark extends Serializable

The Spark entry point from an Ammonite session.
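
A minimal usage sketch in an Ammonite session, assuming a handle named spark of this class is already in scope (for instance created by the session's predef, which supplies the implicit Interpreter and Classpath) and that a local master is acceptable:

    // Hypothetical session code; `spark` is assumed to be an ammonite.spark.Spark
    // handle provided by the session's predef.
    spark.withConf(_.setMaster("local[*]").setAppName("ammonite-demo"))

    val sc = spark.sc                        // lazily creates the SparkContext
    val count = sc.parallelize(1 to 1000).filter(_ % 3 == 0).count()

    spark.stop()                             // a later call to spark.sc creates a fresh context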

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new Spark()(implicit interpreter: Interpreter, classpath: Classpath)

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. def sc: SparkContext

    The SparkContext associated with this handle.

    Lazily initialized on first call. Its configuration can be customized prior to initialization through sparkConf or with the withConf method.

    Creating it also starts a web server that serves the REPL build products.

    The context is stopped automatically when the host interpreter stops, and can also be stopped manually with the stop method. If stopped through stop, calling sc again will trigger the creation of a new SparkContext. (A lifecycle sketch follows the member list below.)

  18. def setConfDefaults(conf: SparkConf): Unit

    Called before creation of the SparkContext to set up the SparkConf.

  19. def sparkConf: SparkConf

    The SparkConf associated with this handle.

  20. def sparkContext: SparkContext

    Alias for sc

  21. lazy val sqlContext: SQLContext

    The SparkSQL context associated with this handle. Lazily initialized on first call. (See the SparkSQL sketch after the member list.)

  22. def start(): Unit

    Triggers the initialization of the SparkContext, if not already started.

  23. def stop(): Unit

    Stops the SparkContext associated with this handle. Any context previously returned should not be considered valid after this call, and the web server launched along with the context is stopped as well.

    Calling sc again will trigger the creation of a new SparkContext.

  24. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  25. def toString(): String

    Definition Classes
    Spark → AnyRef → Any
  26. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  29. def withConf(f: (SparkConf) ⇒ SparkConf): Unit

    Helper function to add custom settings to the SparkConf associated with this handle. (See the configuration sketch after this list.)
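
Usage sketches

A configuration sketch for sparkConf and withConf, assuming a handle named spark is in scope. The master URL and memory setting are illustrative, and settings added through withConf may only take effect when the SparkContext is created:

    // Illustrative settings; apply them before the first call to sc.
    spark.withConf { conf =>
      conf
        .setMaster("local[4]")
        .set("spark.executor.memory", "2g")
    }

    // Inspect the handle's base SparkConf.
    spark.sparkConf.getAll.foreach { case (k, v) => println(s"$k = $v") }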
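
A lifecycle sketch for start, sc and stop, following the behaviour described above: stopping the handle invalidates the previously returned context, and the next call to sc builds a new one.

    spark.start()                 // force creation now; spark.sc alone would also trigger it
    val first = spark.sc

    spark.stop()                  // `first` must not be used past this point

    val second = spark.sc         // a brand-new SparkContext
    assert(first ne second)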
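
A SparkSQL sketch for sqlContext. The data and table name are made up, and the snippet assumes the Spark 1.x-style SQLContext API exposed by this handle:

    val sqlc = spark.sqlContext
    val df = sqlc.createDataFrame(Seq((1, "a"), (2, "b"), (3, "c"))).toDF("id", "label")

    df.registerTempTable("items")                       // temporary table, Spark 1.x style
    sqlc.sql("SELECT label FROM items WHERE id > 1").show()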
