io.smartdatalake.definitions

SparkStreamingMode

case class SparkStreamingMode(checkpointLocation: String, triggerType: String = "Once", triggerTime: Option[String] = None, inputOptions: Map[String, String] = Map(), outputOptions: Map[String, String] = Map(), outputMode: OutputMode = OutputMode.Append) extends ExecutionMode with Product with Serializable

Spark streaming execution mode uses Spark Structured Streaming to incrementally execute data loads and keep track of processed data. It needs a DataObject implementing CanCreateStreamingDataFrame and works only with SparkSubFeeds. The mode can be executed synchronously in the DAG by using triggerType=Once, or asynchronously as a streaming query with triggerType=ProcessingTime or Continuous.
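
A minimal construction sketch, not taken from the Scaladoc above: the checkpoint path, the Kafka-style input option and the trigger interval are illustrative values, and OutputMode is assumed to be Spark's org.apache.spark.sql.streaming.OutputMode.

import io.smartdatalake.definitions.SparkStreamingMode
import org.apache.spark.sql.streaming.OutputMode

// Asynchronous streaming: trigger a micro-batch every 10 seconds.
// The checkpoint path and the Kafka-style input option are hypothetical examples.
val streamingMode = SparkStreamingMode(
  checkpointLocation = "hdfs:///tmp/checkpoints/myAction",
  triggerType = "ProcessingTime",
  triggerTime = Some("10 seconds"),
  inputOptions = Map("maxOffsetsPerTrigger" -> "10000"),
  outputMode = OutputMode.Append()
)

// Synchronous execution in the DAG (the default): process available data once, then stop.
val onceMode = SparkStreamingMode(checkpointLocation = "hdfs:///tmp/checkpoints/myAction")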

checkpointLocation

Location for checkpoints of the streaming query to keep its state.

triggerType

Defines the execution interval of the Spark streaming query. Possible values are Once (default), ProcessingTime and Continuous; see Spark's Trigger for details. Note that this is only applied if SDL is executed in streaming mode. If SDL is executed in normal mode, triggerType=Once is always used. If triggerType=Once, the action is repeated with Trigger.Once in SDL streaming mode.

triggerTime

Trigger interval as a String, used when triggerType is ProcessingTime or Continuous; see Spark's Trigger for details.

inputOptions

Additional options to apply when reading from the streaming source. These overwrite options set by the DataObjects.

outputOptions

Additional options to apply when writing to the streaming sink. These overwrite options set by the DataObjects.
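
The triggerType and triggerTime parameters correspond to Spark's org.apache.spark.sql.streaming.Trigger. The following is a hedged sketch of how such a mapping could look; the helper toSparkTrigger and the fallback intervals are illustrative assumptions, not SDL code.

import org.apache.spark.sql.streaming.Trigger

// Illustrative mapping from (triggerType, triggerTime) to a Spark Trigger.
def toSparkTrigger(triggerType: String, triggerTime: Option[String]): Trigger =
  triggerType match {
    case "Once"           => Trigger.Once()
    case "ProcessingTime" => Trigger.ProcessingTime(triggerTime.getOrElse("0 seconds"))
    case "Continuous"     => Trigger.Continuous(triggerTime.getOrElse("1 second"))
    case other            => throw new IllegalArgumentException(s"Unknown triggerType: $other")
  }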

Linear Supertypes
Serializable, Serializable, Product, Equals, ExecutionMode, SmartDataLakeLogger, AnyRef, Any
Ordering
  1. Alphabetic
  2. By Inheritance
Inherited
  1. SparkStreamingMode
  2. Serializable
  3. Serializable
  4. Product
  5. Equals
  6. ExecutionMode
  7. SmartDataLakeLogger
  8. AnyRef
  9. Any
  1. Hide All
  2. Show All
Visibility
  1. Public
  2. All

Instance Constructors

  1. new SparkStreamingMode(checkpointLocation: String, triggerType: String = "Once", triggerTime: Option[String] = None, inputOptions: Map[String, String] = Map(), outputOptions: Map[String, String] = Map(), outputMode: OutputMode = OutputMode.Append)

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. val checkpointLocation: String
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  10. val inputOptions: Map[String, String]
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. lazy val logger: Logger
    Attributes
    protected
    Definition Classes
    SmartDataLakeLogger
    Annotations
    @transient()
  13. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  14. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  15. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  16. val outputMode: OutputMode
  17. val outputOptions: Map[String, String]
  18. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  19. val triggerTime: Option[String]
  20. val triggerType: String
  21. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
