Deprecated Interfaces

- This interface has been replaced by DynamicTableSinkFactory. The new interface creates instances of DynamicTableSink. See FLIP-95 for more information.
- This interface has been replaced by DynamicTableSourceFactory. The new interface creates instances of DynamicTableSource. See FLIP-95 for more information.
- This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- This interface has been replaced by DynamicTableSource. The new interface produces internal data structures. See FLIP-95 for more information.
- This interface is based on the SinkFunction API, which is due to be removed. Use SinkV2Provider instead.
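Most of these entries point to the FLIP-95 factory stack: a DynamicTableSourceFactory is discovered by its identifier and builds a DynamicTableSource from the table's options. The following is a minimal sketch of that pattern using simplified stand-in interfaces so the example is self-contained; the real types live in org.apache.flink.table.factories and org.apache.flink.table.connector.source, and the real createDynamicTableSource takes a Context object rather than the plain options map assumed here.

```java
import java.util.Map;

// Simplified stand-ins for Flink's factory SPI (not the real Flink interfaces).
interface DynamicTableSource {
    DynamicTableSource copy();
    String asSummaryString();
}

interface DynamicTableSourceFactory {
    // Matched against the 'connector' option in the table DDL.
    String factoryIdentifier();

    // The real method receives a Context; a bare options map stands in here.
    DynamicTableSource createDynamicTableSource(Map<String, String> options);
}

// A toy connector: the factory reads its options and builds the source.
class SocketSourceFactory implements DynamicTableSourceFactory {
    @Override
    public String factoryIdentifier() {
        return "socket";
    }

    @Override
    public DynamicTableSource createDynamicTableSource(Map<String, String> options) {
        String host = options.getOrDefault("hostname", "localhost");
        int port = Integer.parseInt(options.getOrDefault("port", "9999"));
        return new SocketSource(host, port);
    }
}

class SocketSource implements DynamicTableSource {
    private final String host;
    private final int port;

    SocketSource(String host, int port) {
        this.host = host;
        this.port = port;
    }

    @Override
    public DynamicTableSource copy() {
        // Planner-driven copies must not share mutable state; this source is immutable.
        return new SocketSource(host, port);
    }

    @Override
    public String asSummaryString() {
        return "Socket(" + host + ":" + port + ")";
    }
}
```

In real Flink code the factory is additionally registered via META-INF/services so the planner can discover it by its identifier.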
Deprecated Classes

- See Rowtime for details.
- See Schema for details.
- This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
- This interface has been replaced by DynamicTableSource. The new interface produces internal data structures. See FLIP-95 for more information.
- Use the RFC-compliant Csv format in the dedicated flink-formats/flink-csv module instead.
- The legacy CSV connector has been replaced by FileSink. It is kept only to support tests for the legacy connector stack.
- The legacy CSV connector has been replaced by FileSource. It is kept only to support tests for the legacy connector stack.
- All Flink DataSet APIs are deprecated since Flink 1.18 and will be removed in a future Flink major version. You can still build your application on the DataSet API, but you should migrate to the DataStream and/or Table API.
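The replacement FileSink named above is configured through a row-format builder rather than the constructor-style setup of the legacy CSV classes. Below is a rough, self-contained sketch of that builder shape using a hypothetical stand-in class; real code would call org.apache.flink.connector.file.sink.FileSink.forRowFormat with a Path and an Encoder, and the real builder also configures rolling policies and bucket assignment.

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Hypothetical stand-in mirroring FileSink.forRowFormat(path, encoder).build().
final class FileSinkSketch<T> {
    final String path;
    final Function<T, byte[]> encoder;

    private FileSinkSketch(String path, Function<T, byte[]> encoder) {
        this.path = path;
        this.encoder = encoder;
    }

    static <T> RowFormatBuilder<T> forRowFormat(String path, Function<T, byte[]> encoder) {
        return new RowFormatBuilder<>(path, encoder);
    }

    static final class RowFormatBuilder<T> {
        private final String path;
        private final Function<T, byte[]> encoder;

        RowFormatBuilder(String path, Function<T, byte[]> encoder) {
            this.path = path;
            this.encoder = encoder;
        }

        FileSinkSketch<T> build() {
            return new FileSinkSketch<>(path, encoder);
        }
    }
}

class FileSinkSketchDemo {
    // Shape of the equivalent real call:
    // FileSink.forRowFormat(new Path("/tmp/out"), new SimpleStringEncoder<String>()).build()
    static FileSinkSketch<String> csvReplacement() {
        return FileSinkSketch
                .<String>forRowFormat("/tmp/out", s -> s.getBytes(StandardCharsets.UTF_8))
                .build();
    }
}
```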
Deprecated Methods

- This method combines two separate concepts of table schema and field mapping. This should be split into two methods once we have support for the corresponding interfaces (see FLINK-9870).
- TableSinkFactory.Context contains more information and already contains the table schema too. Please use TableSinkFactory.createTableSink(Context) instead.
- TableSourceFactory.Context contains more information and already contains the table schema too. Please use TableSourceFactory.createTableSource(Context) instead.
- Use StreamTableEnvironment.createTemporaryView(String, DataStream, Schema) instead. In most cases, StreamTableEnvironment.createTemporaryView(String, DataStream) should already be sufficient. It integrates with the new type system and supports all kinds of DataTypes that the table runtime can consume. The semantics might be slightly different for raw and structured types.
- Use StreamTableEnvironment.fromDataStream(DataStream, Schema) instead. In most cases, StreamTableEnvironment.fromDataStream(DataStream) should already be sufficient. It integrates with the new type system and supports all kinds of DataTypes that the table runtime can consume. The semantics might be slightly different for raw and structured types.
- Use StreamTableEnvironment.toDataStream(Table, Class) instead. It integrates with the new type system and supports all kinds of DataTypes that the table runtime can produce. The semantics might be slightly different for raw and structured types. Use toDataStream(DataTypes.of(TypeInformation.of(Class))) if TypeInformation should be used as the source of truth.
- Use StreamTableEnvironment.toChangelogStream(Table, Schema) instead. It integrates with the new type system and supports all kinds of DataTypes and every ChangelogMode that the table runtime can produce.
- This method will be removed in future versions as it uses the old type system. It is recommended to use CsvTableSource.Builder.field(String, DataType) instead, which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
- This interface is based on the SourceFunction API, which is due to be removed. Use SourceProvider instead.
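The SourceFunction-based provider mentioned last is superseded by SourceProvider, which hands the planner a FLIP-27 Source instead of a function. The following simplified stand-in sketches that provider shape; the real classes live in org.apache.flink.table.connector, and the explicit boundedness flag here is an assumption of the sketch (the real provider derives boundedness from the source itself).

```java
// Stand-in for a FLIP-27 source (org.apache.flink.api.connector.source.Source).
interface Source {
    String describe();
}

// Stand-in for the runtime-provider contract returned by
// ScanTableSource.getScanRuntimeProvider(...).
interface ScanRuntimeProvider {
    boolean isBounded();
}

// Stand-in for org.apache.flink.table.connector.source.SourceProvider:
// wraps a Source for the planner, replacing SourceFunctionProvider.
final class SourceProvider implements ScanRuntimeProvider {
    private final Source source;
    private final boolean bounded; // assumption of the sketch; not in the real API

    private SourceProvider(Source source, boolean bounded) {
        this.source = source;
        this.bounded = bounded;
    }

    static SourceProvider of(Source source, boolean bounded) {
        return new SourceProvider(source, bounded);
    }

    Source createSource() {
        return source;
    }

    @Override
    public boolean isBounded() {
        return bounded;
    }
}
```

A table source would build its runtime implementation as SourceProvider.of(...) rather than wrapping a deprecated SourceFunction; SinkV2Provider plays the same role on the sink side.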