Package io.debezium.relational
Class TableSchemaBuilder
- java.lang.Object
  - io.debezium.relational.TableSchemaBuilder
@ThreadSafe @Immutable public class TableSchemaBuilder extends Object

Builder that constructs TableSchema instances for Table definitions. This builder is responsible for mapping table columns to fields in Kafka Connect Schemas, and this mapping is necessarily dependent upon the database's supported types. Although mappings are defined for standard types, this class may need to be subclassed for each DBMS to add support for DBMS-specific types by overriding any of the "add*Field" methods. See the Java SE Mapping SQL and Java Types guide for details about how JDBC types map to Java value types.

- Author:
  - Randall Hauch
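The subclassing pattern described above can be pictured with a simplified, self-contained analog (plain Java, not Debezium code; the class names, SQL type strings, and field type strings below are illustrative assumptions): a base builder maps standard column types, and a DBMS-specific subclass overrides the add*Field hook for types the base class does not know about.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Base builder: maps standard SQL types to schema field types.
class SimpleSchemaBuilder {
    private final Map<String, String> fields = new LinkedHashMap<>();

    protected void addField(String column, String sqlType) {
        if ("INTEGER".equals(sqlType)) {
            fields.put(column, "int32");
        } else if ("VARCHAR".equals(sqlType)) {
            fields.put(column, "string");
        } else {
            fields.put(column, "bytes"); // fallback for unrecognized types
        }
    }

    public Map<String, String> fields() {
        return fields;
    }
}

// DBMS-specific subclass: overrides the hook to support an extra type.
class PostgresSchemaBuilder extends SimpleSchemaBuilder {
    @Override
    protected void addField(String column, String sqlType) {
        if ("JSONB".equals(sqlType)) {      // hypothetical DBMS-specific type
            fields().put(column, "string"); // represent JSONB as a string field
        } else {
            super.addField(column, sqlType);
        }
    }
}
```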
Field Summary
Fields
- private CustomConverterRegistry customConverterRegistry
- private FieldNameSelector.FieldNamer<Column> fieldNamer
- private static org.slf4j.Logger LOGGER
- private SchemaNameAdjuster schemaNameAdjuster
- private org.apache.kafka.connect.data.Schema sourceInfoSchema
- private ValueConverterProvider valueConverterProvider
-
Constructor Summary
Constructors
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, boolean sanitizeFieldNames)
  Create a new instance of the builder.
-
Method Summary
All Methods, Instance Methods, Concrete Methods
- protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
  Add to the supplied SchemaBuilder a field for the column with the given information.
- protected ValueConverter[] convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
  Obtain the array of converters for each column in a row.
- TableSchema create(String schemaPrefix, String envelopSchemaName, Table table, Tables.ColumnNameFilter filter, ColumnMappers mappers, Key.KeyMapper keysMapper)
  Create a TableSchema from the given table definition.
- protected StructGenerator createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns)
  Creates the function that produces a Kafka Connect key object for a row of data.
- protected ValueConverter createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
  Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition.
- protected StructGenerator createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
  Creates the function that produces a Kafka Connect value object for a row of data.
- protected org.apache.kafka.connect.data.Field[] fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)
- protected int[] indexesForColumns(List<Column> columns)
- private String tableSchemaName(TableId tableId)
  Returns the type schema name for the given table.
- private void validateIncomingRowToInternalMetadata(int[] recordIndexes, org.apache.kafka.connect.data.Field[] fields, ValueConverter[] converters, Object[] row, int position)
- private ValueConverter wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)
-
-
-
Field Detail
-
LOGGER
private static final org.slf4j.Logger LOGGER
-
schemaNameAdjuster
private final SchemaNameAdjuster schemaNameAdjuster
-
valueConverterProvider
private final ValueConverterProvider valueConverterProvider
-
sourceInfoSchema
private final org.apache.kafka.connect.data.Schema sourceInfoSchema
-
fieldNamer
private final FieldNameSelector.FieldNamer<Column> fieldNamer
-
customConverterRegistry
private final CustomConverterRegistry customConverterRegistry
-
-
Constructor Detail
-
TableSchemaBuilder
public TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, boolean sanitizeFieldNames)
Create a new instance of the builder.
- Parameters:
  - valueConverterProvider - the provider for obtaining ValueConverters and SchemaBuilders; may not be null
  - schemaNameAdjuster - the adjuster for schema names; may not be null
-
-
Method Detail
-
create
public TableSchema create(String schemaPrefix, String envelopSchemaName, Table table, Tables.ColumnNameFilter filter, ColumnMappers mappers, Key.KeyMapper keysMapper)
Create a TableSchema from the given table definition. The resulting TableSchema will have a key schema that contains all of the columns that make up the table's primary key, and a value schema that contains only those columns that are not in the table's primary key.
This is equivalent to calling create(table, false).
- Parameters:
  - schemaPrefix - the prefix added to the table identifier to construct the schema names; may be null if there is no prefix
  - envelopSchemaName - the name of the schema of the built table's envelope
  - table - the table definition; may not be null
  - filter - the filter that specifies whether columns in the table should be included; may be null if all columns are to be included
  - mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
- Returns:
  - the table schema that can be used for sending rows of data for this table to Kafka Connect; never null
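The key/value split that create(...) performs can be sketched in self-contained form (plain Java, not Debezium internals; the method and class names here are illustrative assumptions): primary-key columns land in the key schema, all remaining columns in the value schema.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

// Conceptual sketch of the key/value schema split: partition the table's
// columns by membership in the primary key, preserving column order.
public class KeyValueSplit {
    public static Map<String, List<String>> split(List<String> columns, Set<String> primaryKey) {
        Map<Boolean, List<String>> parts = columns.stream()
                .collect(Collectors.partitioningBy(primaryKey::contains));
        return Map.of("key", parts.get(true), "value", parts.get(false));
    }
}
```

For a table (id, name, email) with primary key (id), the key side holds only id and the value side holds name and email, mirroring the behavior documented above.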
-
tableSchemaName
private String tableSchemaName(TableId tableId)
Returns the type schema name for the given table.
-
createKeyGenerator
protected StructGenerator createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns)
Creates the function that produces a Kafka Connect key object for a row of data.
- Parameters:
  - schema - the Kafka Connect schema for the key; may be null if there is no known schema, in which case the generator will be null
  - columnSetName - the name for the set of columns, used in error messages; may not be null
  - columns - the column definitions for the table that defines the row; may not be null
- Returns:
  - the key-generating function, or null if there is no key schema
-
validateIncomingRowToInternalMetadata
private void validateIncomingRowToInternalMetadata(int[] recordIndexes, org.apache.kafka.connect.data.Field[] fields, ValueConverter[] converters, Object[] row, int position)
-
createValueGenerator
protected StructGenerator createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
Creates the function that produces a Kafka Connect value object for a row of data.
- Parameters:
  - schema - the Kafka Connect schema for the value; may be null if there is no known schema, in which case the generator will be null
  - tableId - the table identifier; may not be null
  - columns - the column definitions for the table that defines the row; may not be null
  - filter - the filter that specifies whether columns in the table should be included; may be null if all columns are to be included
  - mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
- Returns:
  - the value-generating function, or null if there is no value schema
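The generated key and value functions can be pictured with a self-contained sketch (plain Java, not Debezium internals; the class and parameter names are illustrative assumptions): each output field reads its raw value from the row array by column index and applies the per-column converter.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Conceptual sketch of a row-to-object generator: for each schema field,
// pick the raw value from the row by column index and run it through the
// column's converter before it lands in the output structure.
public class RowGenerator {
    public static Map<String, Object> generate(List<String> fieldNames,
                                               int[] columnIndexes,
                                               List<Function<Object, Object>> converters,
                                               Object[] row) {
        Map<String, Object> struct = new LinkedHashMap<>();
        for (int i = 0; i < fieldNames.size(); i++) {
            Object raw = row[columnIndexes[i]];
            struct.put(fieldNames.get(i), converters.get(i).apply(raw));
        }
        return struct;
    }
}
```

A key generator, in this picture, is simply such a function restricted to the primary-key fields and indexes, while a value generator covers the non-key fields.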
-
fieldsForColumns
protected org.apache.kafka.connect.data.Field[] fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)
-
convertersForColumns
protected ValueConverter[] convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
Obtain the array of converters for each column in a row. A converter might be null if the column is not to be included in the records.
- Parameters:
  - schema - the schema; may not be null
  - tableId - the identifier of the table that contains the columns
  - columns - the columns in the row; may not be null
  - mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
- Returns:
  - the converters for each column in the rows; never null
-
wrapInMappingConverterIfNeeded
private ValueConverter wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)
-
addField
protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
Add to the supplied SchemaBuilder a field for the column with the given information.
- Parameters:
  - builder - the schema builder; never null
  - table - the table definition; never null
  - column - the column definition
  - mapper - the mapping function for the column; may be null if the column is not to be mapped to different values
-
createValueConverterFor
protected ValueConverter createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition. This uses the supplied ValueConverterProvider object.
- Parameters:
  - tableId - the id of the table containing the column; never null
  - column - the column describing the input values; never null
  - fieldDefn - the definition for the field in a Kafka Connect Schema describing the output of the function; never null
- Returns:
  - the value conversion function; may not be null