Package io.debezium.relational

Class TableSchemaBuilder

java.lang.Object
    io.debezium.relational.TableSchemaBuilder

Builder that constructs TableSchema instances for Table definitions. This builder is responsible for mapping table columns to fields in Kafka Connect Schemas, and this is necessarily dependent upon the database's supported types. Although mappings are defined for standard types, this class may need to be subclassed for each DBMS to add support for DBMS-specific types by overriding any of the "add*Field" methods.

See the Java SE Mapping SQL and Java Types documentation for details about how JDBC types map to Java value types.
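The subclassing pattern described above can be sketched in miniature with plain Java: a base class maps standard JDBC type codes to logical field types, and a DBMS-specific subclass extends the mapping. The class names, method names, and the vendor type code below are hypothetical stand-ins, not Debezium's actual API; only the JDBC `java.sql.Types` constants are real.

```java
import java.sql.Types;
import java.util.HashMap;
import java.util.Map;

// Simplified illustration of the extension pattern: the base builder knows
// standard JDBC types, and a vendor-specific subclass registers extra ones.
class BaseFieldMapper {
    protected final Map<Integer, String> fieldTypes = new HashMap<>();

    BaseFieldMapper() {
        // Mappings for a few standard JDBC types (illustrative subset).
        fieldTypes.put(Types.VARCHAR, "STRING");
        fieldTypes.put(Types.INTEGER, "INT32");
        fieldTypes.put(Types.TIMESTAMP, "INT64"); // e.g. epoch millis
    }

    String fieldTypeFor(int jdbcType) {
        String t = fieldTypes.get(jdbcType);
        if (t == null) {
            throw new IllegalArgumentException("Unsupported JDBC type: " + jdbcType);
        }
        return t;
    }
}

// A hypothetical PostgreSQL-flavored subclass adding a vendor-specific type.
class PostgresFieldMapper extends BaseFieldMapper {
    static final int PG_UUID = 2950; // illustrative vendor type code

    PostgresFieldMapper() {
        fieldTypes.put(PG_UUID, "STRING");
    }
}
```

A subclass of the real TableSchemaBuilder would instead override one of the "add*Field" methods to register the DBMS-specific schema for a column.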
Author:
- Randall Hauch
Field Summary

private final CustomConverterRegistry customConverterRegistry
private final DefaultValueConverter defaultValueConverter
private final FieldNameSelector.FieldNamer<Column> fieldNamer
private static final org.slf4j.Logger LOGGER
private final boolean multiPartitionMode
private final SchemaNameAdjuster schemaNameAdjuster
private final org.apache.kafka.connect.data.Schema sourceInfoSchema
private final ValueConverterProvider valueConverterProvider
Constructor Summary

TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
    Create a new instance of the builder.

TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
    Create a new instance of the builder.
Method Summary

protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
    Add to the supplied SchemaBuilder a field for the column with the given information.

protected ValueConverter[] convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
    Obtain the array of converters for each column in a row.

public TableSchema create(TopicNamingStrategy topicNamingStrategy, Table table, Tables.ColumnNameFilter filter, ColumnMappers mappers, Key.KeyMapper keysMapper)
    Create a TableSchema from the given table definition.

protected StructGenerator createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns, TopicNamingStrategy topicNamingStrategy)
    Creates the function that produces a Kafka Connect key object for a row of data.

protected ValueConverter createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
    Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition.

protected StructGenerator createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
    Creates the function that produces a Kafka Connect value object for a row of data.

protected org.apache.kafka.connect.data.Field[] fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)

protected int[] indexesForColumns(List<Column> columns)

public boolean isMultiPartitionMode()

private void validateIncomingRowToInternalMetadata(int[] recordIndexes, org.apache.kafka.connect.data.Field[] fields, ValueConverter[] converters, Object[] row, int position)

private ValueConverter wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)
Field Details

LOGGER
    private static final org.slf4j.Logger LOGGER

schemaNameAdjuster
    private final SchemaNameAdjuster schemaNameAdjuster

valueConverterProvider
    private final ValueConverterProvider valueConverterProvider

defaultValueConverter
    private final DefaultValueConverter defaultValueConverter

sourceInfoSchema
    private final org.apache.kafka.connect.data.Schema sourceInfoSchema

fieldNamer
    private final FieldNameSelector.FieldNamer<Column> fieldNamer

customConverterRegistry
    private final CustomConverterRegistry customConverterRegistry

multiPartitionMode
    private final boolean multiPartitionMode
Constructor Details

TableSchemaBuilder
public TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
Create a new instance of the builder.
Parameters:
    valueConverterProvider - the provider for obtaining ValueConverters and SchemaBuilders; may not be null
    schemaNameAdjuster - the adjuster for schema names; may not be null

TableSchemaBuilder
public TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
Create a new instance of the builder.
Parameters:
    valueConverterProvider - the provider for obtaining ValueConverters and SchemaBuilders; may not be null
    defaultValueConverter - used to convert default value literals to Java types recognized by the value converters, for a subset of types; may be null
    schemaNameAdjuster - the adjuster for schema names; may not be null
Method Details

create
public TableSchema create(TopicNamingStrategy topicNamingStrategy, Table table, Tables.ColumnNameFilter filter, ColumnMappers mappers, Key.KeyMapper keysMapper)
Create a TableSchema from the given table definition. The resulting TableSchema will have a key schema that contains all of the columns that make up the table's primary key, and a value schema that contains only those columns that are not in the table's primary key.
This is equivalent to calling create(table, false).
Parameters:
    topicNamingStrategy - the topic naming strategy
    table - the table definition; may not be null
    filter - the filter that specifies whether columns in the table should be included; may be null if all columns are to be included
    mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
Returns:
    the table schema that can be used for sending rows of data for this table to Kafka Connect; never null
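The key/value split performed by create() (key schema = the primary-key columns, value schema = the remaining columns) can be sketched with plain Java. The column names and helper class below are made up for illustration; the real method works with Column definitions and Kafka Connect Schemas.

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Sketch of how a table's columns divide into key fields and value fields:
// key fields are exactly the primary-key columns, value fields are the rest.
class SchemaSplit {
    static List<String> keyFields(List<String> columns, Set<String> primaryKey) {
        return columns.stream()
                .filter(primaryKey::contains)
                .collect(Collectors.toList());
    }

    static List<String> valueFields(List<String> columns, Set<String> primaryKey) {
        return columns.stream()
                .filter(c -> !primaryKey.contains(c))
                .collect(Collectors.toList());
    }
}
```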
isMultiPartitionMode
public boolean isMultiPartitionMode()
createKeyGenerator
protected StructGenerator createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns, TopicNamingStrategy topicNamingStrategy)
Creates the function that produces a Kafka Connect key object for a row of data.
Parameters:
    schema - the Kafka Connect schema for the key; may be null if there is no known schema, in which case the generator will be null
    columnSetName - the name for the set of columns, used in error messages; may not be null
    columns - the column definitions for the table that defines the row; may not be null
    topicNamingStrategy - the topic naming strategy
Returns:
    the key-generating function, or null if there is no key schema
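The idea behind createKeyGenerator - a function that, given a row, extracts the key values - can be sketched in stdlib Java. The real generator produces a Kafka Connect Struct against the key schema; here a plain Object[] stands in, and all names are illustrative.

```java
import java.util.function.Function;

// Sketch of a key-generating function: given the positions of the
// primary-key columns within a row array, it produces the key values
// for that row.
class KeyGen {
    static Function<Object[], Object[]> keyGenerator(int[] keyIndexes) {
        return row -> {
            Object[] key = new Object[keyIndexes.length];
            for (int i = 0; i < keyIndexes.length; i++) {
                key[i] = row[keyIndexes[i]];
            }
            return key;
        };
    }
}
```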
validateIncomingRowToInternalMetadata
private void validateIncomingRowToInternalMetadata(int[] recordIndexes, org.apache.kafka.connect.data.Field[] fields, ValueConverter[] converters, Object[] row, int position)
createValueGenerator
protected StructGenerator createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
Creates the function that produces a Kafka Connect value object for a row of data.
Parameters:
    schema - the Kafka Connect schema for the value; may be null if there is no known schema, in which case the generator will be null
    tableId - the table identifier; may not be null
    columns - the column definitions for the table that defines the row; may not be null
    filter - the filter that specifies whether columns in the table should be included; may be null if all columns are to be included
    mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
Returns:
    the value-generating function, or null if there is no value schema
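A value generator of the kind createValueGenerator builds can be sketched as a function that runs each retained column's converter over the raw row and collects the results, skipping filtered-out columns (whose converter slot is null). The real method produces a Kafka Connect Struct; the names below are illustrative stand-ins.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;
import java.util.function.UnaryOperator;

// Sketch of a value-generating function: for each column with a converter,
// convert the raw value and collect it; columns with a null converter
// (filtered out) are skipped.
class ValueGen {
    static Function<Object[], List<Object>> valueGenerator(UnaryOperator<Object>[] converters) {
        return row -> {
            List<Object> out = new ArrayList<>();
            for (int i = 0; i < converters.length; i++) {
                if (converters[i] != null) {
                    out.add(converters[i].apply(row[i]));
                }
            }
            return out;
        };
    }
}
```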
indexesForColumns
protected int[] indexesForColumns(List<Column> columns)

fieldsForColumns
protected org.apache.kafka.connect.data.Field[] fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)
convertersForColumns
protected ValueConverter[] convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
Obtain the array of converters for each column in a row. A converter might be null if the column is not to be included in the records.
Parameters:
    schema - the schema; may not be null
    tableId - the identifier of the table that contains the columns
    columns - the columns in the row; may not be null
    mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
Returns:
    the converters for each column in the rows; never null
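The contract of convertersForColumns - a never-null array with a null slot for each excluded column - can be sketched with stdlib Java. The trivial toString conversion and all names below are illustrative, not Debezium's ValueConverter API.

```java
import java.util.List;
import java.util.function.UnaryOperator;

// Sketch of the per-column converter array: included columns get a
// converter, excluded columns get a null slot so positions still line up
// with the row.
class ConverterTable {
    static UnaryOperator<Object>[] convertersFor(List<String> columns, List<String> included) {
        @SuppressWarnings("unchecked")
        UnaryOperator<Object>[] converters = new UnaryOperator[columns.size()];
        for (int i = 0; i < columns.size(); i++) {
            converters[i] = included.contains(columns.get(i))
                    ? v -> v == null ? null : v.toString() // trivial stand-in conversion
                    : null; // column filtered out: no converter
        }
        return converters;
    }
}
```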
wrapInMappingConverterIfNeeded
private ValueConverter wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)
addField
protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
Add to the supplied SchemaBuilder a field for the column with the given information.
Parameters:
    builder - the schema builder; never null
    table - the table definition; never null
    column - the column definition
    mapper - the mapping function for the column; may be null if the column is not to be mapped to different values
createValueConverterFor
protected ValueConverter createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition. This uses the supplied ValueConverterProvider object.
Parameters:
    tableId - the id of the table containing the column; never null
    column - the column describing the input values; never null
    fieldDefn - the definition for the field in a Kafka Connect Schema describing the output of the function; never null
Returns:
    the value conversion function; never null