Interfaces

- org.apache.flink.legacy.table.connector.source.SourceFunctionProvider: This interface is based on the SourceFunction API, which is due to be removed. Use SourceProvider instead.
- org.apache.flink.legacy.table.factories.StreamTableSinkFactory: This interface has been replaced by DynamicTableSinkFactory. The new interface creates instances of DynamicTableSink. See FLIP-95 for more information.
- org.apache.flink.legacy.table.factories.StreamTableSourceFactory: This interface has been replaced by DynamicTableSourceFactory. The new interface creates instances of DynamicTableSource. See FLIP-95 for more information.
- org.apache.flink.legacy.table.sinks.AppendStreamTableSink: This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.legacy.table.sinks.RetractStreamTableSink: This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.legacy.table.sinks.StreamTableSink: This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.legacy.table.sinks.UpsertStreamTableSink: This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.legacy.table.sources.StreamTableSource: This interface has been replaced by DynamicTableSource. The new interface produces internal data structures. See FLIP-95 for more information.
- org.apache.flink.table.connector.sink.legacy.SinkFunctionProvider: This interface is based on the SinkFunction API, which is due to be removed. Use SinkV2Provider instead.
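To illustrate the SourceFunctionProvider replacement, here is a minimal sketch of a ScanTableSource whose runtime provider wraps a FLIP-27 Source via SourceProvider. The class name and the buildSource() helper are hypothetical, not part of Flink's API:

```java
import org.apache.flink.api.connector.source.Source;
import org.apache.flink.table.connector.ChangelogMode;
import org.apache.flink.table.connector.source.DynamicTableSource;
import org.apache.flink.table.connector.source.ScanTableSource;
import org.apache.flink.table.connector.source.SourceProvider;
import org.apache.flink.table.data.RowData;

/** Hypothetical table source that returns a SourceProvider instead of a SourceFunctionProvider. */
public abstract class ExampleScanTableSource implements ScanTableSource {

    /** Builds the FLIP-27 Source that replaces the legacy SourceFunction (assumed helper). */
    protected abstract Source<RowData, ?, ?> buildSource();

    @Override
    public ChangelogMode getChangelogMode() {
        return ChangelogMode.insertOnly();
    }

    @Override
    public ScanRuntimeProvider getScanRuntimeProvider(ScanContext context) {
        // SourceProvider wraps the unified Source API; the deprecated
        // SourceFunctionProvider wrapped the SourceFunction API instead.
        return SourceProvider.of(buildSource());
    }

    @Override
    public abstract DynamicTableSource copy();

    @Override
    public String asSummaryString() {
        return "example-scan-source";
    }
}
```

The rest of the DynamicTableSource contract (changelog mode, copy, summary string) is unchanged by the provider swap; only getScanRuntimeProvider differs from a SourceFunction-based implementation.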
Classes

- org.apache.flink.legacy.table.descriptors.RowtimeValidator: See Rowtime for details.
- org.apache.flink.legacy.table.descriptors.SchemaValidator: See Schema for details.
- org.apache.flink.legacy.table.sinks.OutputFormatTableSink: This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- org.apache.flink.legacy.table.sources.InputFormatTableSource: This interface has been replaced by DynamicTableSource. The new interface produces internal data structures. See FLIP-95 for more information.
- org.apache.flink.table.descriptors.OldCsvValidator: Use the RFC-compliant Csv format in the dedicated flink-formats/flink-csv module instead.
- org.apache.flink.table.sinks.CsvAppendTableSinkFactory: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sinks.CsvBatchTableSinkFactory: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sinks.CsvTableSink: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sinks.CsvTableSinkFactoryBase: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sinks.LegacyCsvDynamicTableSinkFactory: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sinks.LegacyCsvDynamicTableSinkFactory.LegacyCsvDynamicTableSink: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sinks.LegacyCsvDynamicTableSinkOptions: The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sources.CsvAppendTableSourceFactory: The legacy CSV connector has been replaced by FileSource. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sources.CsvBatchTableSourceFactory: The legacy CSV connector has been replaced by FileSource. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sources.CsvTableSource: The legacy CSV connector has been replaced by FileSource. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sources.CsvTableSourceFactoryBase: The legacy CSV connector has been replaced by FileSource. It is kept only to support tests for the legacy connector stack.
- org.apache.flink.table.sources.format.CsvInputFormat
- org.apache.flink.table.sources.format.RowCsvInputFormat: All Flink DataSet APIs are deprecated since Flink 1.18 and will be removed in a future Flink major version. You can still build your application on DataSet, but you should move to the DataStream and/or Table API.
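As a migration sketch for the legacy CSV classes above, a filesystem table using the csv format covers the same use case through the unified connector stack (the table name, columns, and path below are made up for illustration):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CsvMigrationExample {

    /** Declares a filesystem table with the 'csv' format in place of CsvTableSource/CsvTableSink. */
    public static TableEnvironment createCsvTable() {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Replaces the legacy CSV connector: the filesystem connector reads via
        // FileSource and writes via FileSink, using the RFC-compliant 'csv'
        // format from the flink-formats/flink-csv module.
        tEnv.executeSql(
                "CREATE TABLE people (\n"
                        + "  name STRING,\n"
                        + "  age INT\n"
                        + ") WITH (\n"
                        + "  'connector' = 'filesystem',\n"
                        + "  'path' = '/tmp/people',\n"
                        + "  'format' = 'csv'\n"
                        + ")");
        return tEnv;
    }

    public static void main(String[] args) {
        createCsvTable();
        // Reads and writes then go through SQL, e.g.:
        // tEnv.executeSql("INSERT INTO people VALUES ('alice', 12)");
    }
}
```

Running this requires flink-table and flink-csv on the classpath; the DDL itself only registers the catalog table, so no files are touched until a query executes.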
Methods

- org.apache.flink.legacy.table.descriptors.SchemaValidator.deriveTableSinkSchema(DescriptorProperties): This method combines two separate concepts: table schema and field mapping. It should be split into two methods once we have support for the corresponding interfaces (see FLINK-9870).
- org.apache.flink.legacy.table.factories.StreamTableSinkFactory.createStreamTableSink(Map<String, String>): TableSinkFactory.Context contains more information, including the table schema. Please use TableSinkFactory.createTableSink(Context) instead.
- org.apache.flink.legacy.table.factories.StreamTableSourceFactory.createStreamTableSource(Map<String, String>): TableSourceFactory.Context contains more information, including the table schema. Please use TableSourceFactory.createTableSource(Context) instead.
- org.apache.flink.table.api.bridge.java.StreamTableEnvironment.createTemporaryView(String, DataStream<T>, Expression...): Use StreamTableEnvironment.createTemporaryView(String, DataStream, Schema) instead. In most cases, StreamTableEnvironment.createTemporaryView(String, DataStream) should already be sufficient. It integrates with the new type system and supports all kinds of DataTypes that the table runtime can consume. The semantics might be slightly different for raw and structured types.
- org.apache.flink.table.api.bridge.java.StreamTableEnvironment.fromDataStream(DataStream<T>, Expression...): Use StreamTableEnvironment.fromDataStream(DataStream, Schema) instead. In most cases, StreamTableEnvironment.fromDataStream(DataStream) should already be sufficient. It integrates with the new type system and supports all kinds of DataTypes that the table runtime can consume. The semantics might be slightly different for raw and structured types.
- org.apache.flink.table.api.bridge.java.StreamTableEnvironment.toAppendStream(Table, Class<T>): Use StreamTableEnvironment.toDataStream(Table, Class) instead. It integrates with the new type system and supports all kinds of DataTypes that the table runtime can produce. The semantics might be slightly different for raw and structured types. Use toDataStream(DataTypes.of(TypeInformation.of(Class))) if TypeInformation should be used as the source of truth.
- org.apache.flink.table.api.bridge.java.StreamTableEnvironment.toRetractStream(Table, Class<T>): Use StreamTableEnvironment.toChangelogStream(Table, Schema) instead. It integrates with the new type system and supports all kinds of DataTypes and every ChangelogMode that the table runtime can produce.
- org.apache.flink.table.sources.CsvTableSource.Builder.field(String, TypeInformation<?>): This method will be removed in future versions as it uses the old type system. It is recommended to use CsvTableSource.Builder.field(String, DataType) instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
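The fromDataStream/toAppendStream migration described above can be sketched as a round trip through the new bridge methods. The field names and values are illustrative only:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.apache.flink.util.CloseableIterator;

public class BridgeMigrationExample {

    /** Converts a DataStream to a Table and back using the non-deprecated bridge methods. */
    public static List<Row> roundTrip() throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        DataStream<Row> input =
                env.fromElements(Row.of("alice", 12), Row.of("bob", 10))
                        .returns(Types.ROW_NAMED(
                                new String[] {"name", "score"}, Types.STRING, Types.INT));

        // Old: tEnv.fromDataStream(input, $("name"), $("score"));
        // New: the schema is derived through the new type system.
        Table table = tEnv.fromDataStream(input);

        // Old: tEnv.toAppendStream(table, Row.class);
        // New: toDataStream produces insert-only rows with DataTypes semantics.
        DataStream<Row> output = tEnv.toDataStream(table);

        List<Row> results = new ArrayList<>();
        try (CloseableIterator<Row> it = output.executeAndCollect()) {
            it.forEachRemaining(results::add);
        }
        return results;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip());
    }
}
```

Note the explicit returns(Types.ROW_NAMED(...)) hint: without it, Flink's type extraction cannot infer the row fields from Row.of(...), and fromDataStream would see a generic type.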