case class SparkStreamingMode(checkpointLocation: String, triggerType: String = "Once", triggerTime: Option[String] = None, inputOptions: Map[String, String] = Map(), outputOptions: Map[String, String] = Map(), outputMode: OutputMode = OutputMode.Append) extends ExecutionMode with Product with Serializable
Spark streaming execution mode uses Spark Structured Streaming to execute data loads incrementally and keep track of processed data. This mode requires a DataObject implementing CanCreateStreamingDataFrame and works only with SparkSubFeeds. It can be executed synchronously in the DAG by using triggerType=Once, or asynchronously as a streaming query with triggerType=ProcessingTime or Continuous.
- checkpointLocation
location for checkpoints of streaming query to keep state
- triggerType
Defines the execution interval of the Spark streaming query. Possible values are Once (default), ProcessingTime and Continuous. See Trigger for details. Note that this is only applied if SDL is executed in streaming mode. If SDL is executed in normal mode, triggerType=Once is always used. If triggerType=Once, the action is repeated with Trigger.Once in SDL streaming mode.
- triggerTime
Trigger interval as a String; used if triggerType is ProcessingTime or Continuous. See Trigger for details.
- inputOptions
Additional options to apply when reading from the streaming source. These override options set by the DataObjects.
- outputOptions
Additional options to apply when writing to the streaming sink. These override options set by the DataObjects.
- outputMode
Output mode of the streaming query (default: OutputMode.Append).
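As a minimal self-contained sketch of how the two flavors of this mode could be constructed (the OutputMode and ExecutionMode definitions below are simplified stand-ins so the example compiles without Spark or SDL on the classpath; the checkpoint paths are hypothetical):

```scala
// Simplified stand-ins for the real Spark/SDL types (hypothetical):
sealed trait OutputMode
object OutputMode { case object Append extends OutputMode }
trait ExecutionMode

// Mirrors the case class signature shown above:
case class SparkStreamingMode(
  checkpointLocation: String,
  triggerType: String = "Once",
  triggerTime: Option[String] = None,
  inputOptions: Map[String, String] = Map(),
  outputOptions: Map[String, String] = Map(),
  outputMode: OutputMode = OutputMode.Append
) extends ExecutionMode

// Synchronous execution in the DAG: triggerType = Once (the default).
val syncMode = SparkStreamingMode(checkpointLocation = "/tmp/checkpoints/actionA")

// Asynchronous streaming query: triggerType = ProcessingTime with an interval.
val asyncMode = SparkStreamingMode(
  checkpointLocation = "/tmp/checkpoints/actionA",
  triggerType = "ProcessingTime",
  triggerTime = Some("10 seconds")
)
```

Because all parameters except checkpointLocation have defaults, the synchronous form only needs the checkpoint path.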
Linear Supertypes
Serializable, Serializable, Product, Equals, ExecutionMode, SmartDataLakeLogger, AnyRef, Any
Instance Constructors
- new SparkStreamingMode(checkpointLocation: String, triggerType: String = "Once", triggerTime: Option[String] = None, inputOptions: Map[String, String] = Map(), outputOptions: Map[String, String] = Map(), outputMode: OutputMode = OutputMode.Append)
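The triggerType/triggerTime semantics described above could be sketched as the following mapping (a hypothetical illustration with stand-in Trigger types, not SDL's or Spark's actual implementation):

```scala
// Stand-ins for Spark's org.apache.spark.sql.streaming.Trigger variants,
// so the sketch compiles on its own (hypothetical):
sealed trait Trigger
case object OnceTrigger extends Trigger
case class ProcessingTimeTrigger(interval: String) extends Trigger
case class ContinuousTrigger(interval: String) extends Trigger

// Assumed mapping implied by the parameter docs above:
// triggerTime is only consulted for ProcessingTime and Continuous.
def toTrigger(triggerType: String, triggerTime: Option[String]): Trigger =
  triggerType match {
    case "Once" => OnceTrigger
    case "ProcessingTime" =>
      ProcessingTimeTrigger(triggerTime.getOrElse(
        throw new IllegalArgumentException("triggerTime required for ProcessingTime")))
    case "Continuous" =>
      ContinuousTrigger(triggerTime.getOrElse(
        throw new IllegalArgumentException("triggerTime required for Continuous")))
    case other =>
      throw new IllegalArgumentException(s"unknown triggerType: $other")
  }
```

This also shows why triggerTime is an Option: it has no meaning for Trigger.Once, but must be provided for the interval-based trigger types.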
Value Members
- val checkpointLocation: String
- val inputOptions: Map[String, String]
- protected lazy val logger: Logger (Definition Classes: SmartDataLakeLogger; Annotations: @transient())
- val outputMode: OutputMode
- val outputOptions: Map[String, String]
- val triggerTime: Option[String]
- val triggerType: String

Methods inherited from AnyRef and Any: !=, ##, ==, asInstanceOf, clone, eq, finalize, getClass, isInstanceOf, ne, notify, notifyAll, synchronized, wait