Converts a Spark DecimalType with the given precision and scale to an Exasol type.
For example:
Spark DecimalType(5,2) -> "DECIMAL(5,2)"
Exasol has a maximum precision and scale of 36; a Spark precision or scale greater than 36 is truncated to 36.
A Spark DecimalType with precision and scale
The equivalent Exasol type
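A minimal sketch of this conversion, assuming a hypothetical helper name decimalTypeToExasol (the connector's actual method name may differ):

import org.apache.spark.sql.types.DecimalType

// Hypothetical helper: Exasol's maximum DECIMAL precision and scale is 36,
// so larger Spark values are capped before rendering the type string.
def decimalTypeToExasol(decimalType: DecimalType): String = {
  val maxExasolPrecision = 36
  val precision = math.min(decimalType.precision, maxExasolPrecision)
  val scale = math.min(decimalType.scale, maxExasolPrecision)
  s"DECIMAL($precision,$scale)"
}

For example, decimalTypeToExasol(DecimalType(5, 2)) yields "DECIMAL(5,2)".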
Given a java.sql.ResultSetMetaData, returns a Spark org.apache.spark.sql.types.StructType schema.
A result set metadata
A StructType matching result set types
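A minimal sketch of building such a schema, assuming a hypothetical helper name createSparkStructType and a simplified inline JDBC-to-Spark type mapping (the full mapping is described next):

import java.sql.{ResultSetMetaData, Types}
import org.apache.spark.sql.types._

// Hypothetical helper: walks the metadata column by column and builds a
// StructField per column, including its nullability.
def createSparkStructType(rsmd: ResultSetMetaData): StructType = {
  val fields = (1 to rsmd.getColumnCount).map { idx =>
    val name = rsmd.getColumnLabel(idx)
    // Simplified type mapping, for illustration only
    val dataType = rsmd.getColumnType(idx) match {
      case Types.INTEGER                 => IntegerType
      case Types.BIGINT                  => LongType
      case Types.DOUBLE | Types.FLOAT    => DoubleType
      case Types.DECIMAL | Types.NUMERIC =>
        DecimalType(rsmd.getPrecision(idx), rsmd.getScale(idx))
      case Types.BOOLEAN                 => BooleanType
      case Types.DATE                    => DateType
      case Types.TIMESTAMP               => TimestampType
      case _                             => StringType
    }
    val nullable = rsmd.isNullable(idx) != ResultSetMetaData.columnNoNulls
    StructField(name, dataType, nullable)
  }
  StructType(fields)
}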
Maps a JDBC type java.sql.Types$ to a Spark SQL org.apache.spark.sql.types.DataType.
A JDBC type obtained from the java.sql.ResultSetMetaData column type
A precision value obtained from ResultSetMetaData, rsmd.getPrecision(index)
A scale value obtained from ResultSetMetaData, rsmd.getScale(index)
An isSigned value obtained from ResultSetMetaData, rsmd.isSigned(index)
A Spark SQL DataType corresponding to the JDBC SQL type
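A minimal sketch of such a mapping, assuming a hypothetical helper name jdbcTypeToSparkType; the connector's actual mapping may cover more JDBC types:

import java.sql.Types
import org.apache.spark.sql.types._

// Hypothetical helper: the precision, scale and signedness reported by
// ResultSetMetaData refine the chosen Spark type.
def jdbcTypeToSparkType(sqlType: Int, precision: Int, scale: Int, isSigned: Boolean): DataType =
  sqlType match {
    case Types.TINYINT | Types.SMALLINT => IntegerType
    case Types.INTEGER                  => if (isSigned) IntegerType else LongType
    case Types.BIGINT                   => if (isSigned) LongType else DecimalType(20, 0)
    case Types.REAL                     => FloatType
    case Types.FLOAT | Types.DOUBLE     => DoubleType
    case Types.DECIMAL | Types.NUMERIC  =>
      if (precision != 0 || scale != 0) DecimalType(precision, scale)
      else DecimalType.SYSTEM_DEFAULT
    case Types.BOOLEAN | Types.BIT      => BooleanType
    case Types.DATE                     => DateType
    case Types.TIMESTAMP                => TimestampType
    case _                              => StringType
  }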
Returns comma-separated column names and column types for an Exasol table from a Spark schema.
It skips the NOT NULL constraint if the Spark DataFrame schema type is an org.apache.spark.sql.types.StringType$ type.
A Spark org.apache.spark.sql.types.StructType schema
Comma-separated column names and their types
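A minimal sketch of rendering the column list, assuming a hypothetical helper name createTableSchemaString; the Exasol type strings used here are illustrative assumptions (a fuller type mapping is described next):

import org.apache.spark.sql.types._

// Hypothetical helper: renders "<name> <EXASOL_TYPE>" for each field and appends
// NOT NULL for non-nullable fields, except when the field type is StringType.
def createTableSchemaString(schema: StructType): String =
  schema.fields.map { field =>
    // Illustrative Exasol type strings; the connector's actual mapping may differ
    val exasolType = field.dataType match {
      case ShortType | IntegerType => "INTEGER"
      case LongType                => "DECIMAL(36,0)"
      case FloatType | DoubleType  => "DOUBLE"
      case d: DecimalType          => s"DECIMAL(${d.precision},${d.scale})"
      case StringType              => "CLOB"
      case BooleanType             => "BOOLEAN"
      case DateType                => "DATE"
      case TimestampType           => "TIMESTAMP"
      case other                   => other.sql
    }
    val notNull =
      if (!field.nullable && field.dataType != StringType) " NOT NULL" else ""
    s"${field.name} $exasolType$notNull"
  }.mkString(", ")

For a schema with a non-nullable integer column id and a nullable string column name, this sketch would produce something like "id INTEGER NOT NULL, name CLOB".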
Returns the corresponding Exasol type as a string for a given Spark org.apache.spark.sql.types.DataType type.
A Spark DataType (e.g. org.apache.spark.sql.types.StringType$)
A default Exasol type as a string for this DataType
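A minimal sketch of this mapping, assuming a hypothetical helper name sparkTypeToExasolType; the Exasol type strings are assumptions rather than the connector's authoritative choices:

import org.apache.spark.sql.types._

// Hypothetical helper mapping a Spark DataType to a default Exasol type string.
def sparkTypeToExasolType(dataType: DataType): String = dataType match {
  case ByteType | ShortType | IntegerType => "INTEGER"
  case LongType                           => "DECIMAL(36,0)"
  case FloatType                          => "FLOAT"
  case DoubleType                         => "DOUBLE"
  case d: DecimalType                     => s"DECIMAL(${d.precision},${d.scale})"
  case StringType                         => "CLOB"
  case BooleanType                        => "BOOLEAN"
  case DateType                           => "DATE"
  case TimestampType                      => "TIMESTAMP"
  case other =>
    throw new IllegalArgumentException(s"Unsupported Spark type: ${other.simpleString}")
}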
Returns the corresponding JDBC java.sql.Types$ type for a given Spark org.apache.spark.sql.types.DataType type.
A Spark DataType (e.g. org.apache.spark.sql.types.StringType$)
A default JdbcType for this DataType
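A minimal sketch of this mapping, assuming a hypothetical helper name sparkTypeToJdbcType; the actual connector may handle more types or return a richer JdbcType value:

import java.sql.Types
import org.apache.spark.sql.types._

// Hypothetical helper returning the java.sql.Types constant for a Spark DataType.
def sparkTypeToJdbcType(dataType: DataType): Int = dataType match {
  case ShortType      => Types.SMALLINT
  case IntegerType    => Types.INTEGER
  case LongType       => Types.BIGINT
  case FloatType      => Types.FLOAT
  case DoubleType     => Types.DOUBLE
  case _: DecimalType => Types.DECIMAL
  case StringType     => Types.VARCHAR
  case BooleanType    => Types.BOOLEAN
  case DateType       => Types.DATE
  case TimestampType  => Types.TIMESTAMP
  case other =>
    throw new IllegalArgumentException(s"Unsupported Spark type: ${other.simpleString}")
}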
Selects only the required columns from a Spark SQL schema.
Adapted from the Spark JDBCRDD private function pruneSchema.
A list of required columns
A Spark SQL schema
A new Spark SQL schema containing only the required columns, in the order of the given column names
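A minimal sketch of such pruning, adapted in spirit from Spark's private JDBCRDD.pruneSchema; the helper name pruneSchema here is an assumption:

import org.apache.spark.sql.types.{StructField, StructType}

// Keeps only the requested columns, preserving the order given in `columns`.
def pruneSchema(columns: Array[String], schema: StructType): StructType = {
  val fieldsByName: Map[String, StructField] =
    schema.fields.map(field => field.name -> field).toMap
  StructType(columns.map(name => fieldsByName(name)))
}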
A helper class with mapping functions between Exasol JDBC types and Spark SQL types.