Class MySqlValueConverters

java.lang.Object
io.debezium.jdbc.JdbcValueConverters
io.debezium.connector.mysql.MySqlValueConverters
All Implemented Interfaces:
ValueConverterProvider

@Immutable public class MySqlValueConverters extends JdbcValueConverters
MySQL-specific customization of the conversions from JDBC values obtained from the MySQL binlog client library.

This class always uses UTC for the default time zone when converting values without timezone information to values that require timezones. This is because MySQL TIMESTAMP values are always stored in UTC (unlike DATETIME values) and are replicated in this form. Meanwhile, the MySQL binlog client library deserializes them as Timestamp values that have no timezone and are therefore presumed to be in UTC. When the column is properly marked with a Types.TIMESTAMP_WITH_TIMEZONE type, the converters need to convert that Timestamp value into an OffsetDateTime using the default time zone, which is always UTC.
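
For illustration only (this sketch is not part of the Debezium code), the conversion described above amounts to interpreting a zone-less java.sql.Timestamp as UTC and exposing it as an OffsetDateTime:

import java.sql.Timestamp;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public class TimestampUtcExample {
    // Interpret a Timestamp that carries no timezone information as UTC
    // and expose it as an OffsetDateTime, mirroring the behavior described above.
    static OffsetDateTime toOffsetDateTime(Timestamp ts) {
        return ts.toLocalDateTime().atOffset(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        Timestamp ts = Timestamp.valueOf("2021-06-01 12:30:00");
        System.out.println(toOffsetDateTime(ts)); // prints 2021-06-01T12:30Z
    }
}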

Author:
Randall Hauch
See Also:
  • AbstractRowsEventDataDeserializer

  • Field Details

    • LOGGER

      private static final org.slf4j.Logger LOGGER
    • TIME_FIELD_PATTERN

      private static final Pattern TIME_FIELD_PATTERN
      Used to parse values of TIME columns. Format: 000:00:00.000000.
    • DATE_FIELD_PATTERN

      private static final Pattern DATE_FIELD_PATTERN
      Used to parse values of DATE columns. Format: 000-00-00.
    • TIMESTAMP_FIELD_PATTERN

      private static final Pattern TIMESTAMP_FIELD_PATTERN
      Used to parse values of TIMESTAMP columns. Format: 000-00-00 00:00:00.000.
    • parsingErrorHandler

      private final MySqlValueConverters.ParsingErrorHandler parsingErrorHandler
  • Constructor Details

  • Method Details

    • adjustTemporal

      public static Temporal adjustTemporal(Temporal temporal)
      A utility method that adjusts ambiguous 2-digit year values of DATETIME, DATE, and TIMESTAMP types using these MySQL-specific rules:
      • Year values in the range 00-69 are converted to 2000-2069.
      • Year values in the range 70-99 are converted to 1970-1999.
      Parameters:
      temporal - the temporal instance to adjust; may not be null
      Returns:
      the possibly adjusted temporal instance; never null
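
      As an illustration of these rules (a minimal sketch, not the actual implementation), a 2-digit year can be adjusted like this:

      import java.time.LocalDate;
      import java.time.temporal.ChronoField;
      import java.time.temporal.Temporal;

      public class TwoDigitYearExample {
          // Apply the MySQL rules: years 00-69 map to 2000-2069, years 70-99 map to 1970-1999.
          static Temporal adjustTwoDigitYear(Temporal temporal) {
              int year = temporal.get(ChronoField.YEAR);
              if (year >= 0 && year <= 69) {
                  return temporal.with(ChronoField.YEAR, year + 2000);
              }
              if (year >= 70 && year <= 99) {
                  return temporal.with(ChronoField.YEAR, year + 1900);
              }
              return temporal; // already a four-digit year
          }

          public static void main(String[] args) {
              System.out.println(adjustTwoDigitYear(LocalDate.of(69, 1, 1))); // 2069-01-01
              System.out.println(adjustTwoDigitYear(LocalDate.of(70, 1, 1))); // 1970-01-01
          }
      }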
    • byteOrderOfBitType

      protected ByteOrder byteOrderOfBitType()
      Overrides:
      byteOrderOfBitType in class JdbcValueConverters
    • schemaBuilder

      public org.apache.kafka.connect.data.SchemaBuilder schemaBuilder(Column column)
      Specified by:
      schemaBuilder in interface ValueConverterProvider
      Overrides:
      schemaBuilder in class JdbcValueConverters
    • converter

      public ValueConverter converter(Column column, org.apache.kafka.connect.data.Field fieldDefn)
      Specified by:
      converter in interface ValueConverterProvider
      Overrides:
      converter in class JdbcValueConverters
    • charsetFor

      protected Charset charsetFor(Column column)
      Return the Java Charset instance corresponding to the MySQL-specific character set name used by the given column.
      Parameters:
      column - the column in which the character set is used; never null
      Returns:
      the Java Charset, or null if there is no mapping
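
      A simplified, illustrative sketch of such a lookup (the real implementation resolves the name from column metadata, and the mapping below covers only a few common cases):

      import java.nio.charset.Charset;
      import java.nio.charset.StandardCharsets;

      public class CharsetLookupExample {
          // Map a few common MySQL character set names to Java Charsets; return null if unknown.
          static Charset charsetForMySqlName(String mysqlCharsetName) {
              if (mysqlCharsetName == null) {
                  return null;
              }
              switch (mysqlCharsetName.toLowerCase()) {
                  case "utf8":
                  case "utf8mb3":
                  case "utf8mb4":
                      return StandardCharsets.UTF_8;
                  case "latin1":
                      return Charset.forName("windows-1252"); // MySQL's latin1 is cp1252
                  case "ascii":
                      return StandardCharsets.US_ASCII;
                  default:
                      return Charset.isSupported(mysqlCharsetName) ? Charset.forName(mysqlCharsetName) : null;
              }
          }

          public static void main(String[] args) {
              System.out.println(charsetForMySqlName("utf8mb4")); // UTF-8
              System.out.println(charsetForMySqlName("latin1"));  // windows-1252
          }
      }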
    • convertJson

      protected Object convertJson(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert the String or byte[] value to a string value used in a SourceRecord.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • convertSmallInt

      protected Object convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Overrides:
      convertSmallInt in class JdbcValueConverters
    • convertInteger

      protected Object convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Overrides:
      convertInteger in class JdbcValueConverters
    • convertBigInt

      protected Object convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Overrides:
      convertBigInt in class JdbcValueConverters
    • convertFloat

      protected Object convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      MySQL reports FLOAT(p) values as either FLOAT or DOUBLE: a precision from 0 to 23 results in a 4-byte single-precision FLOAT column, while a precision from 24 to 53 results in an 8-byte double-precision DOUBLE column. As of MySQL 8.0.17, the nonstandard FLOAT(M,D) and DOUBLE(M,D) syntax is deprecated, and support for it is expected to be removed in a future version of MySQL; for that reason this case is not handled here.
      Overrides:
      convertFloat in class JdbcValueConverters
    • convertString

      protected Object convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Charset columnCharset, Object data)
      Convert the String or byte[] value to a string value used in a SourceRecord.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      columnCharset - the Java character set in which column byte[] values are encoded; may not be null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
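
      Conceptually, the byte[] case amounts to decoding the raw bytes with the column's character set, roughly like this sketch (helper name is illustrative):

      import java.nio.charset.Charset;
      import java.nio.charset.StandardCharsets;

      public class StringDecodeExample {
          // Decode raw column bytes with the column's charset; pass String values through unchanged.
          static String toStringValue(Object data, Charset columnCharset) {
              if (data instanceof byte[]) {
                  return new String((byte[]) data, columnCharset);
              }
              if (data instanceof String) {
                  return (String) data;
              }
              return data == null ? null : data.toString();
          }

          public static void main(String[] args) {
              byte[] raw = "héllo".getBytes(StandardCharsets.UTF_8);
              System.out.println(toStringValue(raw, StandardCharsets.UTF_8)); // héllo
          }
      }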
    • convertYearToInt

      protected Object convertYearToInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Converts a value object for a MySQL YEAR, which appears in the binlog as an integer but is returned by the MySQL JDBC driver as either a short or a Date.
      Parameters:
      column - the column definition describing the data value; never null
      fieldDefn - the field definition; never null
      data - the data object to be converted into a year literal integer value; never null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
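
      A hedged sketch of the idea (names are illustrative, not the actual implementation):

      import java.sql.Date;
      import java.time.LocalDate;

      public class YearConversionExample {
          // Turn a YEAR value (a Number from the binlog, or a java.sql.Date from JDBC) into an int year.
          static int toYear(Object data) {
              if (data instanceof Date) {
                  return ((Date) data).toLocalDate().getYear();
              }
              if (data instanceof Number) {
                  int year = ((Number) data).intValue();
                  // Apply the same 2-digit-year rules described for adjustTemporal.
                  if (year >= 0 && year <= 69) {
                      return year + 2000;
                  }
                  if (year >= 70 && year <= 99) {
                      return year + 1900;
                  }
                  return year;
              }
              throw new IllegalArgumentException("Unexpected YEAR value: " + data);
          }

          public static void main(String[] args) {
              System.out.println(toYear((short) 99));                              // 1999
              System.out.println(toYear(Date.valueOf(LocalDate.of(2021, 1, 1))));  // 2021
          }
      }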
    • convertEnumToString

      protected Object convertEnumToString(List<String> options, Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Converts a value object for a MySQL ENUM, which is represented in the binlog events as an integer value containing the index of the enum option. The MySQL JDBC driver returns a string containing the option, so this method computes the same string.
      Parameters:
      options - the ENUM options, in the same order as defined in the column; may not be null
      column - the column definition describing the data value; never null
      fieldDefn - the field definition; never null
      data - the data object to be converted into an ENUM literal String value
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
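
      A minimal sketch of the index-to-option lookup (MySQL ENUM indexes are 1-based, with 0 denoting the empty/invalid value; names are illustrative):

      import java.util.Arrays;
      import java.util.List;

      public class EnumLookupExample {
          // Resolve a 1-based ENUM index into the corresponding option string.
          static String enumOptionFor(int index, List<String> options) {
              if (index == 0) {
                  return ""; // MySQL stores 0 for the empty/invalid ENUM value
              }
              return options.get(index - 1);
          }

          public static void main(String[] args) {
              List<String> options = Arrays.asList("small", "medium", "large");
              System.out.println(enumOptionFor(2, options)); // medium
          }
      }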
    • convertSetToString

      protected Object convertSetToString(List<String> options, Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Converts a value object for a MySQL SET, which is represented in the binlog events as a long number in which every bit corresponds to a different option. The MySQL JDBC driver returns a string containing the comma-separated options, so this method computes the same string.
      Parameters:
      options - the SET options, in the same order as defined in the column; may not be null
      column - the column definition describing the data value; never null
      fieldDefn - the field definition; never null
      data - the data object to be converted into a SET literal String value; never null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
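
      A minimal sketch of the bit-to-option expansion (bit i set means option i is included; names are illustrative):

      import java.util.Arrays;
      import java.util.List;
      import java.util.StringJoiner;

      public class SetLookupExample {
          // Expand a SET bitmask into the comma-separated option string, as the JDBC driver does.
          static String setOptionsFor(long indexes, List<String> options) {
              StringJoiner joiner = new StringJoiner(",");
              for (int i = 0; i < options.size(); i++) {
                  if ((indexes & (1L << i)) != 0) {
                      joiner.add(options.get(i));
                  }
              }
              return joiner.toString();
          }

          public static void main(String[] args) {
              List<String> options = Arrays.asList("a", "b", "c");
              System.out.println(setOptionsFor(0b101, options)); // a,c
          }
      }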
    • matches

      protected boolean matches(String upperCaseTypeName, String upperCaseMatch)
      Determine whether the uppercase form of a column's type exactly matches or begins with the specified prefix. Note that this logic also works when the column's type contains the type name followed by parentheses.
      Parameters:
      upperCaseTypeName - the upper case form of the column's type name
      upperCaseMatch - the upper case form of the expected type or prefix of the type; may not be null
      Returns:
      true if the type matches the specified type, or false otherwise
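
      For example, a check like this satisfies the description above (a sketch, not the actual code):

      public class TypeNameMatchExample {
          // True if the column's type name exactly matches or begins with the expected prefix,
          // which also covers type names followed by parentheses, e.g. "ENUM('A','B')".
          static boolean matchesType(String upperCaseTypeName, String upperCaseMatch) {
              return upperCaseTypeName != null && upperCaseTypeName.startsWith(upperCaseMatch);
          }

          public static void main(String[] args) {
              System.out.println(matchesType("ENUM('A','B')", "ENUM")); // true
              System.out.println(matchesType("POLYGON", "POINT"));      // false
          }
      }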
    • isGeometryCollection

      protected boolean isGeometryCollection(String upperCaseTypeName)
      Determine if the uppercase form of a column's type is a geometry collection, independent of the JDBC driver or server version.
      Parameters:
      upperCaseTypeName - the upper case form of the column's type name
      Returns:
      true if the type is geometry collection
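
      A sketch of a version-independent check, assuming the only spellings in play are GEOMETRYCOLLECTION and its newer GEOMCOLLECTION alias:

      public class GeometryCollectionCheckExample {
          // Accept both spellings of the geometry-collection type name.
          static boolean isGeometryCollectionType(String upperCaseTypeName) {
              return "GEOMETRYCOLLECTION".equals(upperCaseTypeName)
                      || "GEOMCOLLECTION".equals(upperCaseTypeName);
          }

          public static void main(String[] args) {
              System.out.println(isGeometryCollectionType("GEOMCOLLECTION"));     // true
              System.out.println(isGeometryCollectionType("GEOMETRYCOLLECTION")); // true
          }
      }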
    • extractEnumAndSetOptions

      protected List<String> extractEnumAndSetOptions(Column column)
    • extractEnumAndSetOptionsAsString

      protected String extractEnumAndSetOptionsAsString(Column column)
    • convertSetValue

      protected String convertSetValue(Column column, long indexes, List<String> options)
    • convertPoint

      protected Object convertPoint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing a POINT byte[] to a Point value used in a SourceRecord.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • convertGeometry

      protected Object convertGeometry(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing a GEOMETRY byte[] to a Geometry value used in a SourceRecord.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • normalizeBinaryData

      protected byte[] normalizeBinaryData(Column column, byte[] data)
      Overrides:
      normalizeBinaryData in class JdbcValueConverters
    • convertUnsignedTinyint

      protected Object convertUnsignedTinyint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing an unsigned TINYINT to the correct unsigned TINYINT representation.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
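
      The general idea behind these unsigned conversions, sketched under the assumption that the binlog client hands back the raw bits as a signed Java value, is to add the type's full range when the signed value is negative:

      public class UnsignedTinyintExample {
          // Recover the unsigned TINYINT range 0-255 from a signed byte by adding 256 when negative.
          static short toUnsignedTinyint(byte signedValue) {
              return (short) (signedValue < 0 ? signedValue + 256 : signedValue);
          }

          public static void main(String[] args) {
              System.out.println(toUnsignedTinyint((byte) -1));  // 255
              System.out.println(toUnsignedTinyint((byte) 127)); // 127
          }
      }

      The same pattern scales to the other unsigned types below, with offsets of 2^16 for SMALLINT, 2^24 for MEDIUMINT, 2^32 for INT, and 2^64 (requiring BigDecimal arithmetic) for BIGINT.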
    • convertUnsignedSmallint

      protected Object convertUnsignedSmallint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing an unsigned SMALLINT to the correct unsigned SMALLINT representation.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • convertUnsignedMediumint

      protected Object convertUnsignedMediumint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing an unsigned MEDIUMINT to the correct unsigned MEDIUMINT representation.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • convertUnsignedInt

      protected Object convertUnsignedInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing an unsigned INT to the correct unsigned INT representation.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • convertUnsignedBigint

      protected Object convertUnsignedBigint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Convert a value representing an unsigned BIGINT to the correct unsigned BIGINT representation.
      Parameters:
      column - the column in which the value appears
      fieldDefn - the field definition for the SourceRecord's Schema; never null
      data - the data; may be null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
    • convertDurationToMicroseconds

      protected Object convertDurationToMicroseconds(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
      Converts a value object for an expected Duration type to a Long value that represents the time in microseconds.

      Per the JDBC specification, databases should return Time instances, but that does not work here because Time can only represent times of day in the range 00:00:00-23:59:59. We use Duration instead, which can handle the full range of a MySQL TIME type (-838:59:59.000000 to 838:59:59.000000), and transfer the data as a signed INT64 that reflects the database value converted to microseconds.

      Parameters:
      column - the column definition describing the data value; never null
      fieldDefn - the field definition; never null
      data - the data object to be converted into a Duration type; never null
      Returns:
      the converted value, or null if the conversion could not be made and the column allows nulls
      Throws:
      IllegalArgumentException - if the value could not be converted but the column does not allow nulls
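
      A sketch of the Duration-to-microseconds step itself (the surrounding parsing and validation is omitted; names are illustrative):

      import java.time.Duration;

      public class DurationMicrosExample {
          // Express a (possibly negative) Duration as a signed number of microseconds.
          static long toMicroseconds(Duration duration) {
              return duration.toNanos() / 1_000;
          }

          public static void main(String[] args) {
              // -838:59:59 is the lower bound of the MySQL TIME range.
              Duration minTime = Duration.ofHours(-838).minusMinutes(59).minusSeconds(59);
              System.out.println(toMicroseconds(minTime)); // -3020399000000
          }
      }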
    • convertTimestampToLocalDateTime

      protected Object convertTimestampToLocalDateTime(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
    • stringToDuration

      public static Duration stringToDuration(String timeString)
    • stringToLocalDate

      public static LocalDate stringToLocalDate(String dateString, Column column, Table table)
    • containsZeroValuesInDatePart

      public static boolean containsZeroValuesInDatePart(String timestampString, Column column, Table table)
    • defaultParsingErrorHandler

      public static void defaultParsingErrorHandler(String message, Exception exception)