
io.smartdatalake.workflow.action.sparktransformer

SQLDfsTransformer


case class SQLDfsTransformer(name: String = "sqlTransform", description: Option[String] = None, code: Map[DataObjectId, String], options: Map[String, String] = Map(), runtimeOptions: Map[String, String] = Map()) extends OptionsDfsTransformer with Product with Serializable

Configuration of a custom Spark DataFrame transformation between many inputs and many outputs (n:m), defined as SQL code. The input data is made available as temporary views in SQL; the name of each temporary view is the input DataObjectId with special characters replaced by underscores. An illustrative configuration sketch follows the parameter list below.

name

name of the transformer

description

Optional description of the transformer

code

SQL code for the transformation as a map of DataObjectId to SQL statement. Use tokens %{<key>} to substitute evaluated runtimeOptions into the SQL code, e.g. "select * from test where run = %{runId}". The special token %{inputViewName} can be used to insert the temporary view name.

options

Options to pass to the transformation

runtimeOptions

optional tuples of [key, Spark SQL expression] to be added as additional options when executing the transformation. The Spark SQL expressions are evaluated against an instance of DefaultExpressionData.
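
The following Scala sketch illustrates how such a transformer could be constructed. It is not taken from the SmartDataLake documentation: the import path for DataObjectId, the DataObject ids and the SQL statement are illustrative assumptions, the code map is assumed to be keyed by output DataObjectId (as suggested by the return type of transform), and runId is assumed to be a field of DefaultExpressionData. Adapt names and ids to your own configuration.

  import io.smartdatalake.config.SdlConfigObject.DataObjectId   // assumed import path
  import io.smartdatalake.workflow.action.sparktransformer.SQLDfsTransformer

  val transformer = SQLDfsTransformer(
    name = "sqlTransform",
    description = Some("join staging data into a report table"),
    // one SQL statement per (assumed) output DataObjectId; inputs are referenced by the
    // names of their temporary views, i.e. the input DataObjectId with special
    // characters replaced by underscores ("stg-table" becomes view "stg_table")
    code = Map(
      DataObjectId("btl-report") ->
        "select s.id, s.value from stg_table s where s.run_id = %{runId}"
    ),
    // %{runId} is substituted with the evaluated runtimeOption below;
    // "runId" is assumed to be a field of DefaultExpressionData
    runtimeOptions = Map("runId" -> "runId")
  )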

Linear Supertypes
Serializable, Serializable, Product, Equals, OptionsDfsTransformer, ParsableDfsTransformer, ParsableFromConfig[ParsableDfsTransformer], DfsTransformer, PartitionValueTransformer, AnyRef, Any

Instance Constructors

  1. new SQLDfsTransformer(name: String = "sqlTransform", description: Option[String] = None, code: Map[DataObjectId, String], options: Map[String, String] = Map(), runtimeOptions: Map[String, String] = Map())

    name

    name of the transformer

    description

    Optional description of the transformer

    code

    SQL code for the transformation as a map of DataObjectId to SQL statement. Use tokens %{<key>} to substitute evaluated runtimeOptions into the SQL code, e.g. "select * from test where run = %{runId}". The special token %{inputViewName} can be used to insert the temporary view name.

    options

    Options to pass to the transformation

    runtimeOptions

    optional tuples of [key, Spark SQL expression] to be added as additional options when executing the transformation. The Spark SQL expressions are evaluated against an instance of DefaultExpressionData.

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. val code: Map[DataObjectId, String]

    SQL code for the transformation as a map of DataObjectId to SQL statement. Use tokens %{<key>} to substitute evaluated runtimeOptions into the SQL code, e.g. "select * from test where run = %{runId}". The special token %{inputViewName} can be used to insert the temporary view name.

  7. val description: Option[String]

    Optional description of the transformer

    Definition Classes
    SQLDfsTransformer → DfsTransformer
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def factory: FromConfigFactory[ParsableDfsTransformer]

    Returns the factory that can parse this type (that is, type CO).

    Typically, implementations of this method should return the companion object of the implementing class. The companion object in turn should implement FromConfigFactory.

    returns

    the factory (object) for this class.

    Definition Classes
    SQLDfsTransformer → ParsableFromConfig
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  13. val name: String

    name of the transformer

    Definition Classes
    SQLDfsTransformer → DfsTransformer
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. val options: Map[String, String]

    Options to pass to the transformation

    Definition Classes
    SQLDfsTransformer → OptionsDfsTransformer
  18. def prepare(actionId: ActionId)(implicit session: SparkSession, context: ActionPipelineContext): Unit

    Optional function to implement validations in the prepare phase.

    Definition Classes
    DfsTransformer
  19. val runtimeOptions: Map[String, String]

    optional tuples of [key, Spark SQL expression] to be added as additional options when executing the transformation. The Spark SQL expressions are evaluated against an instance of DefaultExpressionData.

    Definition Classes
    SQLDfsTransformer → OptionsDfsTransformer
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def transform(actionId: ActionId, partitionValues: Seq[PartitionValues], dfs: Map[String, DataFrame])(implicit session: SparkSession, context: ActionPipelineContext): Map[String, DataFrame]

    Function to be implemented to define the transformation between many inputs and many outputs (n:m).

    actionId

    id of the action which executes this transformation. This is mainly used to prefix error messages.

    partitionValues

    partition values to transform

    dfs

    Map of (dataObjectId, DataFrame) tuples available as input

    returns

    Map of transformed (dataObjectId, DataFrame) tuples

    Definition Classes
    OptionsDfsTransformer → DfsTransformer
  22. def transformPartitionValues(actionId: ActionId, partitionValues: Seq[PartitionValues])(implicit session: SparkSession, context: ActionPipelineContext): Option[Map[PartitionValues, PartitionValues]]

    Optional function to define the transformation of input to output partition values. For example, this enables implementing aggregations where multiple input partitions are combined into one output partition. Note that the default is input = output partition values, which should be correct for most use cases.

    actionId

    id of the action which executes this transformation. This is mainly used to prefix error messages.

    partitionValues

    partition values to transform

    returns

    Map of input to output partition values. This allows mapping partition values forward and backward, which is needed by execution modes. Return None if the mapping is 1:1.

    Definition Classes
    OptionsDfsTransformer → PartitionValueTransformer
  23. def transformPartitionValuesWithOptions(actionId: ActionId, partitionValues: Seq[PartitionValues], options: Map[String, String])(implicit session: SparkSession): Option[Map[PartitionValues, PartitionValues]]

    Optional function to define the transformation of input to output partition values. For example, this enables implementing aggregations where multiple input partitions are combined into one output partition. Note that the default is input = output partition values, which should be correct for most use cases. See also DfsTransformer.transformPartitionValues().

    options

    Options specified in the configuration for this transformation, including evaluated runtimeOptions

    Definition Classes
    OptionsDfsTransformer
  24. def transformWithOptions(actionId: ActionId, partitionValues: Seq[PartitionValues], dfs: Map[String, DataFrame], options: Map[String, String])(implicit session: SparkSession): Map[String, DataFrame]

    Function to be implemented to define the transformation between many inputs and many outputs (n:m). See also DfsTransformer.transform() and the conceptual sketch after this member list.

    options

    Options specified in the configuration for this transformation, including evaluated runtimeOptions

    Definition Classes
    SQLDfsTransformer → OptionsDfsTransformer
  25. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
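
The sketch below, referenced from the transformWithOptions entry above, illustrates the behaviour described by the class documentation. It is a conceptual sketch and not the library implementation: the method name sketchTransform and its signature are hypothetical; only the documented steps (registering inputs as temporary views named after their sanitized DataObjectId, substituting %{key} tokens from the evaluated options, and executing one SQL statement per output) are taken from the description above.

  import org.apache.spark.sql.{DataFrame, SparkSession}

  // Conceptual sketch only (hypothetical helper, not the library code)
  def sketchTransform(
      session: SparkSession,
      dfs: Map[String, DataFrame],      // input dataObjectId -> DataFrame
      code: Map[String, String],        // output dataObjectId -> SQL statement
      options: Map[String, String]      // options including evaluated runtimeOptions
  ): Map[String, DataFrame] = {
    // register each input DataFrame as a temporary view; the view name is the
    // input DataObjectId with special characters replaced by underscores
    dfs.foreach { case (id, df) =>
      df.createOrReplaceTempView(id.replaceAll("[^a-zA-Z0-9_]", "_"))
    }
    // substitute %{key} tokens and execute one SQL statement per output
    code.map { case (outputId, sql) =>
      val resolvedSql = options.foldLeft(sql) { case (acc, (key, value)) =>
        acc.replace(s"%{$key}", value)
      }
      outputId -> session.sql(resolvedSql)
    }
  }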
