org.apache.spark.sql.execution.streaming

MemoryStream

case class MemoryStream[A](id: Int, sqlContext: SQLContext)(implicit evidence$2: Encoder[A]) extends Source with Logging with Product with Serializable

A Source that produces values stored in memory as they are added by the user. This Source is primarily intended for use in unit tests, as it can only replay data while the object is still available.
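A minimal sketch of the intended test usage, assuming a local SparkSession named spark and the companion object's apply (which assigns an id and picks up the implicit SQLContext):

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.execution.streaming.MemoryStream

  val spark = SparkSession.builder().master("local[2]").appName("MemoryStreamDemo").getOrCreate()
  import spark.implicits._                      // provides Encoder[Int]
  implicit val sqlContext = spark.sqlContext    // picked up by MemoryStream.apply

  val stream = MemoryStream[Int]                // typed in-memory source
  val query = stream.toDS().writeStream
    .format("memory")                           // results land in an in-memory table
    .queryName("demo")
    .outputMode("append")
    .start()

  stream.addData(1, 2, 3)                       // enqueue one batch
  query.processAllAvailable()                   // block until it has been processed
  spark.table("demo").show()                    // prints 1, 2, 3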

Linear Supertypes
Serializable, Serializable, Product, Equals, Logging, Source, AnyRef, Any

Instance Constructors

  1. new MemoryStream(id: Int, sqlContext: SQLContext)(implicit arg0: Encoder[A])

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def addData(data: TraversableOnce[A]): Offset

  7. def addData(data: A*): Offset
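    Both overloads append one batch to the stream and return the Offset at which it was stored. A short sketch, continuing the example in the class description:

      val off1 = stream.addData(4, 5, 6)       // varargs overload; returns the batch's Offset
      val off2 = stream.addData(Seq(7, 8, 9))  // TraversableOnce overload: any Scala collection
      // Offsets grow monotonically as batches are added.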

  8. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  9. val batches: ListBuffer[Dataset[A]]

    All batches from lastCommittedOffset + 1 to currentOffset, inclusive. Stored in a ListBuffer to facilitate removing committed batches.

    Attributes
    protected
  10. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. def commit(end: Offset): Unit

    Informs the source that Spark has completed processing all data for offsets less than or equal to end and will only request offsets greater than end in the future.

    Definition Classes
    MemoryStream → Source
  12. var currentOffset: LongOffset

    Attributes
    protected
  13. val encoder: ExpressionEncoder[A]

    Attributes
    protected
  14. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. def getBatch(start: Option[Offset], end: Offset): DataFrame

    Returns the data that is between the offsets (start, end]. When start is None, the batch should begin with the first record. This method must always return the same data for a particular start and end pair, even after the Source has been restarted on a different node.

    Higher layers will always call this method with a value of start greater than or equal to the last value passed to commit, and a value of end less than or equal to the last value returned by getOffset.

    It is possible for the Offset type to be a SerializedOffset when it was obtained from the log. Moreover, StreamExecution compares only the Offset JSON representation to determine whether two objects are equal. This could have ramifications when upgrading Offset JSON formats, i.e., two equivalent Offset objects could differ between versions. Consequently, StreamExecution may call this method with two such equivalent Offset objects, in which case the Source should return an empty DataFrame.

    Definition Classes
    MemoryStream → Source
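
    Together with getOffset and commit, the contract above can be driven by hand, which is occasionally useful in tests. A sketch, continuing the class example (in normal operation StreamExecution issues these calls, not user code):

      stream.addData(1, 2, 3)
      val end = stream.getOffset.get              // highest available offset
      val first = stream.getBatch(None, end)      // start = None: from the first record
      stream.commit(end)                          // data up to `end` may now be discarded

      stream.addData(4, 5)
      val end2 = stream.getOffset.get
      val next = stream.getBatch(Some(end), end2) // only the rows in (end, end2]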
  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. def getOffset: Option[Offset]

    Returns the maximum available offset for this source. Returns None if this source has never received any data.

    Definition Classes
    MemoryStream → Source
  19. val id: Int

  20. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  21. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  22. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  23. var lastOffsetCommitted: LongOffset

    Last offset that was discarded, or -1 if no commits have occurred. Note that the value -1 is used in calculations below and isn't just an arbitrary constant.

    Attributes
    protected
  24. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  25. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  27. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  28. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  29. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  30. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  32. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  33. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  34. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  35. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  36. val logicalPlan: StreamingExecutionRelation

    Attributes
    protected
  37. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  38. final def notify(): Unit

    Definition Classes
    AnyRef
  39. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  40. val output: Seq[Attribute]

    Attributes
    protected
  41. def reset(): Unit

  42. def schema: StructType

    Returns the schema of the data from this source.

    Definition Classes
    MemoryStream → Source
  43. val sqlContext: SQLContext

  44. def stop(): Unit

    Stop this source and free any resources it has allocated.

    Definition Classes
    MemoryStream → Source
  45. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  46. def toDF(): DataFrame

  47. def toDS(): Dataset[A]
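    toDF and toDS expose the source as an untyped or typed streaming Dataset respectively. A brief sketch, reusing the stream from the class example:

      val typed = stream.toDS()    // Dataset[Int]: typed transformations
      val untyped = stream.toDF()  // DataFrame: SQL and untyped APIs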

  48. def toString(): String

    Definition Classes
    MemoryStream → AnyRef → Any
  49. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  50. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  51. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
