
Deprecated API

Contents

  • Interfaces
  • Classes
  • Exceptions
  • Fields
  • Methods
  • Constructors
  • Annotation Type Elements
  • Deprecated Interfaces
    Interface
    Description
    org.apache.flink.table.descriptors.DescriptorValidator
    See Descriptor for details.
    org.apache.flink.table.legacy.api.constraints.Constraint
    See ResolvedSchema and Constraint.
    org.apache.flink.table.legacy.connector.source.AsyncTableFunctionProvider
    Please use AsyncLookupFunctionProvider to implement an asynchronous lookup table.
    org.apache.flink.table.legacy.connector.source.TableFunctionProvider
    Please use LookupFunctionProvider to implement a synchronous lookup table.
    org.apache.flink.table.legacy.descriptors.Descriptor
    Descriptor was primarily used for the legacy connector stack and has been deprecated. Use TableDescriptor for creating sources and sinks from the Table API.
    org.apache.flink.table.legacy.factories.TableFactory
    This interface has been replaced by Factory.
    org.apache.flink.table.legacy.factories.TableSinkFactory
    This interface has been replaced by DynamicTableSinkFactory. The new interface consumes internal data structures. See FLIP-95 for more information.
    org.apache.flink.table.legacy.factories.TableSourceFactory
    This interface has been replaced by DynamicTableSourceFactory. The new interface produces internal data structures. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sinks.OverwritableTableSink
    This interface will not be supported in the new sink design around DynamicTableSink. Use SupportsOverwrite instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sinks.PartitionableTableSink
    This interface will not be supported in the new sink design around DynamicTableSink. Use SupportsPartitioning instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sinks.TableSink
    This interface has been replaced by DynamicTableSink. The new interface consumes internal data structures. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.DefinedFieldMapping
    This interface will not be supported in the new source design around DynamicTableSource. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.DefinedProctimeAttribute
    This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.DefinedRowtimeAttributes
    This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.FieldComputer
    This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.FilterableTableSource
    This interface will not be supported in the new source design around DynamicTableSource. Use SupportsFilterPushDown instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.LimitableTableSource
    This interface will not be supported in the new source design around DynamicTableSource. Use SupportsLimitPushDown instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.LookupableTableSource
    This interface will not be supported in the new source design around DynamicTableSource. Use LookupTableSource instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.NestedFieldsProjectableTableSource
    This interface will not be supported in the new source design around DynamicTableSource. Use SupportsProjectionPushDown instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.PartitionableTableSource
    This interface will not be supported in the new source design around DynamicTableSource. Use SupportsPartitionPushDown instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.ProjectableTableSource
    This interface will not be supported in the new source design around DynamicTableSource. Use SupportsProjectionPushDown instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.TableSource
    This interface has been replaced by DynamicTableSource. The new interface produces internal data structures. See FLIP-95 for more information.
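    The FLIP-95 entries above share one shape: a legacy TableSource/TableSink becomes a DynamicTableSource/DynamicTableSink that produces or consumes internal data structures, with push-down abilities split into Supports* interfaces. As a hedged sketch only (the class name MyDynamicSource is hypothetical, and the runtime provider is deliberately omitted), a minimal insert-only scan source might look like:

    ```java
    import org.apache.flink.table.connector.ChangelogMode;
    import org.apache.flink.table.connector.source.DynamicTableSource;
    import org.apache.flink.table.connector.source.ScanTableSource;

    /** Hypothetical skeleton of a FLIP-95 scan source replacing a legacy TableSource. */
    public class MyDynamicSource implements ScanTableSource {

        @Override
        public ChangelogMode getChangelogMode() {
            // The new stack is changelog-aware; a plain append source is insert-only.
            return ChangelogMode.insertOnly();
        }

        @Override
        public ScanRuntimeProvider getScanRuntimeProvider(ScanContext context) {
            // Return a runtime provider that produces internal data structures
            // (RowData) instead of external Row types; omitted in this sketch.
            throw new UnsupportedOperationException("runtime implementation omitted");
        }

        @Override
        public DynamicTableSource copy() {
            return new MyDynamicSource();
        }

        @Override
        public String asSummaryString() {
            return "MyDynamicSource";
        }
    }
    ```

    Optional abilities such as SupportsFilterPushDown or SupportsProjectionPushDown are added by implementing the corresponding interface on the same class, rather than by extending a second base interface as in the legacy stack.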
  • Deprecated Classes
    Class
    Description
    org.apache.flink.table.dataview.ListViewSerializer
    org.apache.flink.table.dataview.ListViewSerializerSnapshot
    org.apache.flink.table.dataview.ListViewTypeInfo
    org.apache.flink.table.dataview.MapViewSerializer
    org.apache.flink.table.dataview.MapViewSerializerSnapshot
    org.apache.flink.table.dataview.MapViewTypeInfo
    org.apache.flink.table.dataview.NullAwareMapSerializer
    org.apache.flink.table.dataview.NullAwareMapSerializerSnapshot
    org.apache.flink.table.descriptors.ConnectorDescriptorValidator
    org.apache.flink.table.descriptors.DescriptorProperties
    This utility will be dropped soon. DynamicTableFactory is based on ConfigOption, and catalogs use CatalogPropertiesUtil.
    org.apache.flink.table.descriptors.FileSystemValidator
    The legacy CSV connector has been replaced by FileSource / FileSink. It is kept only to support tests for the legacy connector stack.
    org.apache.flink.table.factories.TableFactoryService
    org.apache.flink.table.factories.TableSinkFactoryContextImpl
    org.apache.flink.table.factories.TableSourceFactoryContextImpl
    org.apache.flink.table.functions.AggregateFunctionDefinition
    Non-legacy functions can simply omit this wrapper for declarations.
    org.apache.flink.table.functions.LegacyUserDefinedFunctionInference
    org.apache.flink.table.functions.ScalarFunctionDefinition
    Non-legacy functions can simply omit this wrapper for declarations.
    org.apache.flink.table.functions.TableAggregateFunctionDefinition
    Non-legacy functions can simply omit this wrapper for declarations.
    org.apache.flink.table.functions.TableFunctionDefinition
    Non-legacy functions can simply omit this wrapper for declarations.
    org.apache.flink.table.legacy.api.constraints.UniqueConstraint
    See ResolvedSchema and UniqueConstraint.
    org.apache.flink.table.legacy.api.TableColumn
    See ResolvedSchema and Column.
    org.apache.flink.table.legacy.api.TableSchema
    This class has been deprecated as part of FLIP-164. It has been replaced by two more dedicated classes Schema and ResolvedSchema. Use Schema for declaration in APIs. ResolvedSchema is offered by the framework after resolution and validation.
    org.apache.flink.table.legacy.api.Types
    This class will be removed in future versions as it uses the old type system. It is recommended to use DataTypes instead which uses the new type system based on instances of DataType. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.api.WatermarkSpec
    See ResolvedSchema and WatermarkSpec.
    org.apache.flink.table.legacy.descriptors.Rowtime
    This class was used for legacy connectors using Descriptor.
    org.apache.flink.table.legacy.descriptors.Schema
    This class was used for legacy connectors using Descriptor.
    org.apache.flink.table.legacy.sources.RowtimeAttributeDescriptor
    This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.sources.tsextractors.TimestampExtractor
    This interface will not be supported in the new source design around DynamicTableSource. Use the concept of computed columns instead. See FLIP-95 for more information.
    org.apache.flink.table.legacy.types.logical.TypeInformationRawType
    Use RawType instead.
    org.apache.flink.table.legacy.utils.TypeStringUtils
    This utility is based on TypeInformation. However, the Table & SQL API is currently updated to use DataTypes based on LogicalTypes. Use LogicalTypeParser instead.
    org.apache.flink.table.sinks.TableSinkBase
    This class is implementing the deprecated TableSink interface. Implement DynamicTableSink directly instead.
    org.apache.flink.table.types.utils.LegacyTypeInfoDataTypeConverter
    Use DataTypeFactory.createDataType(TypeInformation) instead. Note that this method will not create legacy types anymore. It fully uses the new type system available only in the planner.
    org.apache.flink.table.typeutils.TimeIndicatorTypeInfo
    This class will be removed in future versions as it is used for the old type system. It is recommended to use DataTypes instead. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.typeutils.TimeIntervalTypeInfo
    This class will be removed in future versions as it is used for the old type system. It is recommended to use DataTypes instead. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
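    The TableSchema deprecation above (FLIP-164) splits one hybrid class into an unresolved declaration (Schema) and a framework-resolved result (ResolvedSchema). As an illustrative sketch (column names are made up for the example), the declaration side of the migration looks roughly like:

    ```java
    import org.apache.flink.table.api.DataTypes;
    import org.apache.flink.table.api.Schema;

    // Before (deprecated):
    //   TableSchema.builder().field("id", DataTypes.BIGINT()).build();
    // After (FLIP-164): declare an unresolved Schema; the framework resolves
    // and validates it into a ResolvedSchema later.
    Schema schema =
            Schema.newBuilder()
                    .column("id", DataTypes.BIGINT())
                    .column("name", DataTypes.STRING())
                    .build();
    ```

    ResolvedSchema is not constructed by user code in the common case; it is offered by the framework after resolution and validation, as the TableSchema note states.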
  • Deprecated Exceptions
    Exception
    Description
    org.apache.flink.table.api.AmbiguousTableFactoryException
    This exception is considered internal and has been erroneously placed in the *.api package. It is replaced by AmbiguousTableFactoryException and should not be used directly anymore.
    org.apache.flink.table.api.ExpressionParserException
    This exception is considered internal and has been erroneously placed in the *.api package. It is replaced by ExpressionParserException and should not be used directly anymore.
    org.apache.flink.table.api.NoMatchingTableFactoryException
    This exception is considered internal and has been erroneously placed in the *.api package. It is replaced by NoMatchingTableFactoryException and should not be used directly anymore.
  • Deprecated Fields
    Field
    Description
    org.apache.flink.table.legacy.descriptors.Schema.SCHEMA_TYPE
    Prior to v1.9, Schema used the legacy type key (e.g. schema.0.type = LONG) to store type information. Since v1.10, Schema uses the data type key (e.g. schema.0.data-type = BIGINT) to store types.
    org.apache.flink.table.module.CommonModuleOptions.MODULE_TYPE
    This is only required for the legacy factory stack.
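    The Schema.SCHEMA_TYPE note above can be illustrated with the serialized properties of a single column (the column name user_id is hypothetical):

    ```properties
    # prior to v1.9: legacy type key
    schema.0.name=user_id
    schema.0.type=LONG

    # since v1.10: data type key
    schema.0.name=user_id
    schema.0.data-type=BIGINT
    ```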
  • Deprecated Methods
    Method
    Description
    org.apache.flink.table.annotation.FunctionHint.argument()
    Use FunctionHint.arguments() instead.
    org.apache.flink.table.annotation.FunctionHint.argumentNames()
    Use FunctionHint.arguments() instead.
    org.apache.flink.table.annotation.ProcedureHint.argument()
    Use ProcedureHint.arguments() instead.
    org.apache.flink.table.annotation.ProcedureHint.argumentNames()
    Use ProcedureHint.arguments() instead.
    org.apache.flink.table.catalog.CatalogBaseTable.getSchema()
    This method returns the deprecated TableSchema class. The old class was a hybrid of resolved and unresolved schema information. It has been replaced by the new Schema which is always unresolved and will be resolved by the framework later.
    org.apache.flink.table.catalog.ResolvedCatalogBaseTable.getSchema()
    This method returns the deprecated TableSchema class. The old class was a hybrid of resolved and unresolved schema information. It has been replaced by the new ResolvedSchema which is resolved by the framework and accessible via ResolvedCatalogBaseTable.getResolvedSchema().
    org.apache.flink.table.functions.BuiltInFunctionDefinition.Builder.namedArguments(String...)
    Use BuiltInFunctionDefinition.Builder.staticArguments(StaticArgument...) instead.
    org.apache.flink.table.functions.BuiltInFunctionDefinition.Builder.typedArguments(DataType...)
    Use BuiltInFunctionDefinition.Builder.staticArguments(StaticArgument...) instead.
    org.apache.flink.table.functions.ImperativeAggregateFunction.getAccumulatorType()
    This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
    org.apache.flink.table.functions.ImperativeAggregateFunction.getResultType()
    This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
    org.apache.flink.table.functions.ScalarFunction.getParameterTypes(Class<?>[])
    This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
    org.apache.flink.table.functions.ScalarFunction.getResultType(Class<?>[])
    This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
    org.apache.flink.table.functions.TableFunction.getParameterTypes(Class<?>[])
    This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
    org.apache.flink.table.functions.TableFunction.getResultType()
    This method uses the old type system and is based on the old reflective extraction logic. The method will be removed in future versions and is only called when using the deprecated TableEnvironment.registerFunction(...) method. The new reflective extraction logic (possibly enriched with DataTypeHint and FunctionHint) should be powerful enough to cover most use cases. For advanced users, it is possible to override UserDefinedFunction.getTypeInference(DataTypeFactory).
    org.apache.flink.table.legacy.api.TableColumn.of(String, DataType)
    Use TableColumn.physical(String, DataType) instead.
    org.apache.flink.table.legacy.api.TableColumn.of(String, DataType, String)
    Use TableColumn.computed(String, DataType, String) instead.
    org.apache.flink.table.legacy.api.TableSchema.Builder.field(String, TypeInformation<?>)
    This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.Builder.field(String, DataType) instead which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.api.TableSchema.fromTypeInfo(TypeInformation<?>)
    This method will be removed soon. Use DataTypes to declare types.
    org.apache.flink.table.legacy.api.TableSchema.getFieldType(int)
    This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.getFieldDataType(int) instead which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.api.TableSchema.getFieldType(String)
    This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.getFieldDataType(String) instead which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.api.TableSchema.getFieldTypes()
    This method will be removed in future versions as it uses the old type system. It is recommended to use TableSchema.getFieldDataTypes() instead which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.api.TableSchema.toRowType()
    Use TableSchema.toRowDataType() instead.
    org.apache.flink.table.legacy.descriptors.Schema.field(String, TypeInformation<?>)
    This method will be removed in future versions as it uses the old type system. Please use Schema.field(String, DataType) instead.
    org.apache.flink.table.legacy.factories.TableSinkFactory.createTableSink(Map<String, String>)
    TableSinkFactory.Context contains more information and already includes the table schema. Please use TableSinkFactory.createTableSink(Context) instead.
    org.apache.flink.table.legacy.factories.TableSinkFactory.createTableSink(ObjectPath, CatalogTable)
    TableSinkFactory.Context contains more information and already includes the table schema. Please use TableSinkFactory.createTableSink(Context) instead.
    org.apache.flink.table.legacy.factories.TableSourceFactory.createTableSource(Map<String, String>)
    TableSourceFactory.Context contains more information and already includes the table schema. Please use TableSourceFactory.createTableSource(Context) instead.
    org.apache.flink.table.legacy.factories.TableSourceFactory.createTableSource(ObjectPath, CatalogTable)
    TableSourceFactory.Context contains more information and already includes the table schema. Please use TableSourceFactory.createTableSource(Context) instead.
    org.apache.flink.table.legacy.sinks.TableSink.configure(String[], TypeInformation<?>[])
    This method will be dropped in future versions. It is recommended to pass a static schema when instantiating the sink instead.
    org.apache.flink.table.legacy.sinks.TableSink.getFieldNames()
    Use the field names of TableSink.getTableSchema() instead.
    org.apache.flink.table.legacy.sinks.TableSink.getFieldTypes()
    Use the field types of TableSink.getTableSchema() instead.
    org.apache.flink.table.legacy.sinks.TableSink.getOutputType()
    This method will be removed in future versions as it uses the old type system. It is recommended to use TableSink.getConsumedDataType() instead which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.sources.TableSource.getReturnType()
    This method will be removed in future versions as it uses the old type system. It is recommended to use TableSource.getProducedDataType() instead which uses the new type system based on DataTypes. Please make sure to use either the old or the new type system consistently to avoid unintended behavior. See the website documentation for more information.
    org.apache.flink.table.legacy.sources.TableSource.getTableSchema()
    Table schema is a logical description of a table and should not be part of the physical TableSource. Define the schema when registering a table, either in DDL or via TableEnvironment#connect(...).
    org.apache.flink.table.types.inference.TypeInference.Builder.namedArguments(String...)
    Use TypeInference.Builder.staticArguments(StaticArgument...) instead.
    org.apache.flink.table.types.inference.TypeInference.Builder.namedArguments(List<String>)
    Use TypeInference.Builder.staticArguments(List) instead.
    org.apache.flink.table.types.inference.TypeInference.Builder.optionalArguments(List<Boolean>)
    Use TypeInference.Builder.staticArguments(List) instead.
    org.apache.flink.table.types.inference.TypeInference.Builder.typedArguments(List<DataType>)
    Use TypeInference.Builder.staticArguments(List) instead.
    org.apache.flink.table.types.inference.TypeInference.Builder.typedArguments(DataType...)
    Use TypeInference.Builder.staticArguments(StaticArgument...) instead.
    org.apache.flink.table.types.inference.TypeInference.getAccumulatorTypeStrategy()
    Use TypeInference.getStateTypeStrategies() instead.
    org.apache.flink.table.types.inference.TypeInference.getNamedArguments()
    Use TypeInference.getStaticArguments() instead.
    org.apache.flink.table.types.inference.TypeInference.getOptionalArguments()
    Use TypeInference.getStaticArguments() instead.
    org.apache.flink.table.types.inference.TypeInference.getTypedArguments()
    Use TypeInference.getStaticArguments() instead.
    org.apache.flink.table.types.utils.DataTypeUtils.projectRow(DataType, int[])
    Use the Projection type instead.
    org.apache.flink.table.types.utils.DataTypeUtils.projectRow(DataType, int[][])
    Use the Projection type instead.
    org.apache.flink.table.types.utils.TypeConversions.fromDataTypeToLegacyInfo(DataType)
    This method will be removed soon; please don't use it anymore. Sources and sinks should use the conversion methods available in their context; within the planner, use either InternalTypeInfo or ExternalTypeInfo, depending on the use case.
    org.apache.flink.table.types.utils.TypeConversions.fromDataTypeToLegacyInfo(DataType[])
    This method will be removed soon; please don't use it anymore. Sources and sinks should use the conversion methods available in their context; within the planner, use either InternalTypeInfo or ExternalTypeInfo, depending on the use case.
    org.apache.flink.table.types.utils.TypeConversions.fromLegacyInfoToDataType(TypeInformation<?>)
    This method will be removed soon; please don't use it anymore. Sources and sinks should use the conversion methods available in their context; within the planner, use either InternalTypeInfo or ExternalTypeInfo, depending on the use case.
    org.apache.flink.table.types.utils.TypeConversions.fromLegacyInfoToDataType(TypeInformation<?>[])
    This method will be removed soon; please don't use it anymore. Sources and sinks should use the conversion methods available in their context; within the planner, use either InternalTypeInfo or ExternalTypeInfo, depending on the use case.
    org.apache.flink.table.utils.EncodingUtils.loadClass(String)
    Use EncodingUtils.loadClass(String, ClassLoader) instead, in order to explicitly provide the correct classloader.
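    The EncodingUtils.loadClass note above mirrors plain Java class loading: passing a ClassLoader explicitly pins which loader resolves the name, instead of relying on whichever loader is implicit at the call site. A minimal stdlib-only sketch (the class name java.util.ArrayList is just an example):

    ```java
    public class ExplicitClassLoaderDemo {
        public static void main(String[] args) throws Exception {
            // Resolve a class through an explicitly chosen classloader rather
            // than the caller's defining loader; this matters in environments
            // with multiple classloaders, such as plugin or user-code loaders.
            ClassLoader cl = Thread.currentThread().getContextClassLoader();
            Class<?> clazz = Class.forName("java.util.ArrayList", true, cl);
            System.out.println(clazz.getName()); // prints "java.util.ArrayList"
        }
    }
    ```
    
    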
  • Deprecated Constructors
    Constructor
    Description
    org.apache.flink.table.legacy.api.TableSchema(String[], TypeInformation<?>[])
    Use the TableSchema.Builder instead.
  • Deprecated Annotation Type Elements
    Annotation Type Element
    Description
    org.apache.flink.table.annotation.FunctionHint.argument()
    Use FunctionHint.arguments() instead.
    org.apache.flink.table.annotation.FunctionHint.argumentNames()
    Use FunctionHint.arguments() instead.
    org.apache.flink.table.annotation.ProcedureHint.argument()
    Use ProcedureHint.arguments() instead.
    org.apache.flink.table.annotation.ProcedureHint.argumentNames()
    Use ProcedureHint.arguments() instead.
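    The FunctionHint/ProcedureHint migration above consolidates the separate argument() and argumentNames() elements into a single arguments() element. A hedged sketch (the function and argument names are illustrative, and ArgumentHint usage assumes a Flink version that provides it):

    ```java
    import org.apache.flink.table.annotation.ArgumentHint;
    import org.apache.flink.table.annotation.DataTypeHint;
    import org.apache.flink.table.annotation.FunctionHint;
    import org.apache.flink.table.functions.ScalarFunction;

    // Name and type of each argument are declared together via arguments(),
    // instead of spreading them across argument() and argumentNames().
    @FunctionHint(
            arguments = {
                @ArgumentHint(name = "count", type = @DataTypeHint("INT")),
                @ArgumentHint(name = "label", type = @DataTypeHint("STRING"))
            },
            output = @DataTypeHint("STRING"))
    public class RepeatLabelFunction extends ScalarFunction {
        public String eval(Integer count, String label) {
            return label + ":" + count;
        }
    }
    ```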

Copyright © 2014–2025 The Apache Software Foundation. All rights reserved.