Class

polynote.kernel.interpreter.python

PySparkInterpreter

Related Docs: object PySparkInterpreter | package python

class PySparkInterpreter extends PythonInterpreter

Linear Supertypes
PythonInterpreter, Interpreter, AnyRef, Any

Instance Constructors

  1. new PySparkInterpreter(_compiler: ScalaCompiler, jepInstance: Jep, jepExecutor: Executor, jepThread: AtomicReference[Thread], jepBlockingService: Blocking, runtime: Runtime[Any], pyApi: PythonAPI, venvPath: Option[Path])


Type Members

  1. case class PythonState extends State with Product with Serializable

    Definition Classes
    PythonInterpreter

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def compile(parsed: PyObject, cell: String): Task[PyObject]

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  7. val compiler: ScalaCompiler

    Definition Classes
    PythonInterpreter
  8. def completionsAt(code: String, pos: Int, state: State): Task[List[Completion]]

    Definition Classes
    PythonInterpreter → Interpreter
  9. def convertFromPython(jep: Jep): PartialFunction[(String, PyObject), (scala.tools.nsc.interactive.Global.Type, Any)]

    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  10. def convertToPython(jep: Jep): PartialFunction[(String, Any), AnyRef]

    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  11. def defaultConvertToPython(nv: (String, Any)): AnyRef

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. def errorCause(get: PyCallable): Option[Throwable]

    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  15. def eval[T](code: String)(implicit arg0: ClassTag[T]): Task[T]

    Attributes
    protected[polynote.kernel.interpreter.python]
    Definition Classes
    PythonInterpreter
  16. def exec(code: String): Task[Unit]

    Attributes
    protected[polynote.kernel.interpreter.python]
    Definition Classes
    PythonInterpreter
  17. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  18. val gatewayRef: AtomicReference[GatewayServer]

  19. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  20. def getValue(name: String): Task[PyObject]

    Attributes
    protected[polynote.kernel.interpreter.python]
    Definition Classes
    PythonInterpreter
  21. def handlePyError(get: PyCallable, trace: ArrayList[AnyRef]): Throwable

    Definition Classes
    PythonInterpreter
  22. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  23. def importFutureAnnotations: String

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  24. def init(state: State): RIO[InterpreterEnv, State]

    Definition Classes
    PySparkInterpreter → PythonInterpreter → Interpreter
  25. def injectGlobals(globals: PyObject): RIO[CurrentRuntime, Unit]

    Attributes
    protected
    Definition Classes
    PySparkInterpreter → PythonInterpreter
  26. def isFutureAnnotationsSupported: Task[Boolean]

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  27. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  28. def jep[T](fn: (Jep) ⇒ T): Task[T]

    Attributes
    protected[polynote.kernel.interpreter.python]
    Definition Classes
    PythonInterpreter
  29. def matplotlib: String

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  30. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  31. final def notify(): Unit

    Definition Classes
    AnyRef
  32. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  33. def parametersAt(code: String, pos: Int, state: State): Task[Option[Signatures]]

    Definition Classes
    PythonInterpreter → Interpreter
  34. def parse(code: String, cell: String): Task[PyObject]

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  35. def populateGlobals(state: State): Task[PyObject]

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  36. def pysparkImports: String


    Handle setting up PySpark.

    First, we need to pick the Python interpreter. Unfortunately, this means re-implementing Spark's interpreter configuration logic, because it is implemented only inside SparkSubmit (and, in fact, only when pyspark-shell is used).

    The order followed for the driver Python executable (from org.apache.spark.launcher.SparkSubmitCommandBuilder) is:

    1. conf spark.pyspark.driver.python
    2. conf spark.pyspark.python
    3. environment variable PYSPARK_DRIVER_PYTHON
    4. environment variable PYSPARK_PYTHON

    For the executors, the driver-specific setting is omitted, so the order is just:

    1. conf spark.pyspark.python
    2. environment variable PYSPARK_PYTHON

    A sketch of this resolution order is shown after this entry.

    Additionally, to load pyspark itself, its location is taken from the Spark distribution, which ensures that all the versions match up.

    WARNING: Using pyspark from pip install pyspark could break things; don't use it!

    Attributes
    protected
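
    A minimal sketch of the resolution order described above, written as plain Scala. The names PythonExecutableResolution, sparkConf, and env are hypothetical stand-ins for illustration; they are not members of PySparkInterpreter, which performs these lookups internally against the Spark configuration and the process environment.

        // Illustrative sketch only, not PySparkInterpreter's actual implementation.
        // sparkConf stands in for the Spark configuration; env for the process environment.
        object PythonExecutableResolution {
          def resolve(
            sparkConf: Map[String, String],
            env: Map[String, String]
          ): (Option[String], Option[String]) = {
            // Driver: spark.pyspark.driver.python, then spark.pyspark.python,
            // then PYSPARK_DRIVER_PYTHON, then PYSPARK_PYTHON.
            val driverPython =
              sparkConf.get("spark.pyspark.driver.python")
                .orElse(sparkConf.get("spark.pyspark.python"))
                .orElse(env.get("PYSPARK_DRIVER_PYTHON"))
                .orElse(env.get("PYSPARK_PYTHON"))

            // Executors: spark.pyspark.python, then PYSPARK_PYTHON.
            val executorPython =
              sparkConf.get("spark.pyspark.python")
                .orElse(env.get("PYSPARK_PYTHON"))

            (driverPython, executorPython)
          }
        }

    For example, resolve(Map("spark.pyspark.python" -> "/usr/bin/python3"), sys.env) picks /usr/bin/python3 for both driver and executors when no driver-specific setting or environment override is present.
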
  37. def run(compiled: PyObject, globals: PyObject, state: State): RIO[CurrentRuntime, State]

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  38. def run(code: String, state: State): RIO[InterpreterEnv, State]

    Definition Classes
    PythonInterpreter → Interpreter
  39. def setValue(name: String, value: AnyRef): Task[Unit]

    Attributes
    protected[polynote.kernel.interpreter.python]
    Definition Classes
    PythonInterpreter
  40. def setup(isFutureAnnotationsSupported: Boolean): String

    Attributes
    protected
    Definition Classes
    PythonInterpreter
  41. def shutdown(): Task[Unit]

    Definition Classes
    PythonInterpreter → Interpreter
  42. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  43. def toString(): String

    Definition Classes
    AnyRef → Any
  44. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  45. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  46. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from PythonInterpreter

Inherited from Interpreter

Inherited from AnyRef

Inherited from Any
