io.smartdatalake.workflow.action.customlogic
Configuration of a custom Spark-DataFrame transformation between many inputs and many outputs (n:m). Define a transform function that receives a map of input DataObjectIds with DataFrames and a map of options, and must return a map of output DataObjectIds with DataFrames; see also the trait CustomDfsTransformer.
Optional class name implementing the trait CustomDfsTransformer.
Optional file from which the Scala code for the transformation is loaded. The Scala code in the file needs to be a function of type fnTransformType.
Optional Scala code for the transformation. The Scala code needs to be a function of type fnTransformType.
Optional map of DataObjectId to the corresponding SQL code. Use tokens %{<key>} in the SQL code to be replaced with runtimeOptions. Example: "select * from test where run = %{runId}"
Options to pass to the transformation.
Optional tuples of [key, Spark SQL expression] to be added as additional options when executing the transformation. The Spark SQL expressions are evaluated against an instance of DefaultExpressionData.
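The %{&lt;key&gt;} token substitution described for the SQL-code parameter can be illustrated with a small, self-contained Scala sketch. Note that resolveTokens is a hypothetical helper written for illustration, not SDLB's actual implementation:

```scala
// Hypothetical sketch of %{<key>} token substitution in SQL code,
// as described for the SQL-code parameter. Not SDLB's actual implementation.
object TokenSubstitution {
  // Matches tokens of the form %{key}; group 1 captures the key name.
  private val TokenPattern = """%\{([^}]+)\}""".r

  // Replace every %{key} in the SQL string with its value from runtimeOptions.
  // Unknown keys are left untouched rather than failing.
  def resolveTokens(sql: String, runtimeOptions: Map[String, String]): String =
    TokenPattern.replaceAllIn(sql, m =>
      scala.util.matching.Regex.quoteReplacement(
        runtimeOptions.getOrElse(m.group(1), m.matched)))

  def main(args: Array[String]): Unit = {
    val sql = "select * from test where run = %{runId}"
    println(resolveTokens(sql, Map("runId" -> "42")))
    // prints: select * from test where run = 42
  }
}
```

The helper falls back to the original token text when a key is missing from runtimeOptions, which makes unresolved tokens visible in the resulting SQL instead of silently dropping them.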
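Taken together, these parameters might appear in a HOCON action configuration roughly as sketched below. This is an illustration only: the action name, data object ids, class name, and the exact key names (className, options, runtimeOptions, and the assumed action type) are assumptions based on the parameter descriptions above, not confirmed by this excerpt.

```hocon
// Hypothetical SDLB action using a custom n:m DataFrame transformation.
// Exactly one of the class-name / Scala-file / Scala-code / SQL-code
// parameters would typically be set.
transformData {
  type = CustomDfsAction                 // assumed action type
  inputIds = [stg-orders, stg-customers]
  outputIds = [btl-order-report]
  transformer {
    // Class implementing the trait CustomDfsTransformer
    className = com.example.OrderReportTransformer
    options {
      reportType = "daily"
    }
    runtimeOptions {
      // Spark SQL expression, evaluated against DefaultExpressionData;
      // available in SQL code via the token %{runId}
      runId = "runId"
    }
  }
}
```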