Interface TableEnvironment

All Known Subinterfaces:
TableEnvironmentInternal
All Known Implementing Classes:
TableEnvironmentImpl

@PublicEvolving public interface TableEnvironment
A table environment is the base class, entry point, and central context for creating Table and SQL API programs.

It is unified both on a language level for all JVM-based languages (i.e. there is no distinction between Scala and Java API) and for bounded and unbounded data processing.

A table environment is responsible for:

  • Connecting to external systems.
  • Registering and retrieving Tables and other meta objects from a catalog.
  • Executing SQL statements.
  • Offering further configuration options.

The syntax for path in methods such as createTemporaryView(String, Table) follows [[catalog-name.]database-name.]object-name, where the catalog name and database name are optional. For path resolution, see useCatalog(String) and useDatabase(String).

Example: `cat.1`.`db`.`Table` resolves to an object named 'Table' in a catalog named 'cat.1' and database named 'db'.
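For example, relative paths are completed with the current catalog and database; a minimal sketch (the catalog and table names here are assumptions for illustration):

```java
// Assumed: a catalog "cat" with a database "db" has already been registered.
tEnv.useCatalog("cat");    // the current catalog becomes "cat"
tEnv.useDatabase("db");    // the current database becomes "db"

// A partial path is resolved against the current catalog and database,
// so "MyView" below refers to the object cat.db.MyView.
tEnv.createTemporaryView("MyView", tEnv.from("SomeTable"));
```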

Note: This environment is meant for pure table programs. If you would like to convert from or to other Flink APIs, it might be necessary to use one of the available language-specific table environments in the corresponding bridging modules.

  • Method Details

    • create

      static TableEnvironment create(EnvironmentSettings settings)
      Creates a table environment that is the entry point and central context for creating Table and SQL API programs.

      It is unified both on a language level for all JVM-based languages (i.e. there is no distinction between Scala and Java API) and for bounded and unbounded data processing.

      A table environment is responsible for:

      • Connecting to external systems.
      • Registering and retrieving Tables and other meta objects from a catalog.
      • Executing SQL statements.
      • Offering further configuration options.

      Note: This environment is meant for pure table programs. If you would like to convert from or to other Flink APIs, it might be necessary to use one of the available language-specific table environments in the corresponding bridging modules.

      Parameters:
      settings - The environment settings used to instantiate the TableEnvironment.
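A minimal instantiation sketch (the choice of streaming mode is an example, not a requirement):

```java
EnvironmentSettings settings = EnvironmentSettings.newInstance()
        .inStreamingMode()    // use .inBatchMode() for bounded processing
        .build();
TableEnvironment tEnv = TableEnvironment.create(settings);
```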
    • create

      static TableEnvironment create(org.apache.flink.configuration.Configuration configuration)
      Creates a table environment that is the entry point and central context for creating Table and SQL API programs.

      It is unified both on a language level for all JVM-based languages (i.e. there is no distinction between Scala and Java API) and for bounded and unbounded data processing.

      A table environment is responsible for:

      • Connecting to external systems.
      • Registering and retrieving Tables and other meta objects from a catalog.
      • Executing SQL statements.
      • Offering further configuration options.

      Note: This environment is meant for pure table programs. If you would like to convert from or to other Flink APIs, it might be necessary to use one of the available language-specific table environments in the corresponding bridging modules.

      Parameters:
      configuration - The specified options are used to instantiate the TableEnvironment.
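A sketch of configuration-based creation (the shown option key follows Flink's configuration documentation and is an assumption here):

```java
Configuration configuration = new Configuration();
configuration.setString("execution.runtime-mode", "batch"); // assumed option key
TableEnvironment tEnv = TableEnvironment.create(configuration);
```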
    • fromValues

      default Table fromValues(Object... values)
      Creates a Table from given values.

      Examples:

      You can use a row(...) expression to create composite rows:

      
       tEnv.fromValues(
           row(1, "ABC"),
           row(2L, "ABCDE")
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- f0: BIGINT NOT NULL     // original types INT and BIGINT are generalized to BIGINT
       |-- f1: VARCHAR(5) NOT NULL // original types CHAR(3) and CHAR(5) are generalized to VARCHAR(5)
                                   // it uses VARCHAR instead of CHAR so that no padding is applied
       

      The method will derive the types automatically from the input expressions. If types at a certain position differ, the method will try to find a common super type for all types. If a common super type does not exist, an exception will be thrown. If you want to specify the requested type explicitly see fromValues(AbstractDataType, Object...).

      It is also possible to use Row objects instead of row expressions.

      ROWs that are the result of e.g. a function call are not flattened:

      
       public class RowFunction extends ScalarFunction {
           @DataTypeHint("ROW<f0 BIGINT, f1 VARCHAR(5)>")
           public Row eval() { /* ... */ }
       }
      
       tEnv.fromValues(
           call(new RowFunction()),
           call(new RowFunction())
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- f0: ROW<`f0` BIGINT, `f1` VARCHAR(5)>
       

      The row constructor can be dropped to create a table with a single column:

      
       tEnv.fromValues(
           1,
           2L,
           3
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- f0: BIGINT NOT NULL
       
      Parameters:
      values - Expressions for constructing rows of the VALUES table.
    • fromValues

      default Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, Object... values)
      Creates a Table from a given collection of objects with a given row type.

      The difference between this method and fromValues(Object...) is that the schema can be manually adjusted. This can be helpful for assigning more generic types, such as DECIMAL, or for naming the columns.

      Examples:

      
       tEnv.fromValues(
           DataTypes.ROW(
               DataTypes.FIELD("id", DataTypes.DECIMAL(10, 2)),
               DataTypes.FIELD("name", DataTypes.STRING())
           ),
           row(1, "ABC"),
           row(2L, "ABCDE")
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- id: DECIMAL(10, 2)
       |-- name: STRING
       

      For more examples see fromValues(Object...).

      Parameters:
      rowType - Expected row type for the values.
      values - Expressions for constructing rows of the VALUES table.
    • fromValues

      Table fromValues(org.apache.flink.table.expressions.Expression... values)
      Creates a Table from given values.

      Examples:

      You can use a row(...) expression to create composite rows:

      
       tEnv.fromValues(
           row(1, "ABC"),
           row(2L, "ABCDE")
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- f0: BIGINT NOT NULL     // original types INT and BIGINT are generalized to BIGINT
       |-- f1: VARCHAR(5) NOT NULL // original types CHAR(3) and CHAR(5) are generalized to VARCHAR(5)
                                   // it uses VARCHAR instead of CHAR so that no padding is applied
       

      The method will derive the types automatically from the input expressions. If types at a certain position differ, the method will try to find a common super type for all types. If a common super type does not exist, an exception will be thrown. If you want to specify the requested type explicitly see fromValues(AbstractDataType, Expression...).

      It is also possible to use Row objects instead of row expressions.

      ROWs that are the result of e.g. a function call are not flattened:

      
       public class RowFunction extends ScalarFunction {
           @DataTypeHint("ROW<f0 BIGINT, f1 VARCHAR(5)>")
           public Row eval() { /* ... */ }
       }
      
       tEnv.fromValues(
           call(new RowFunction()),
           call(new RowFunction())
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- f0: ROW<`f0` BIGINT, `f1` VARCHAR(5)>
       

      The row constructor can be dropped to create a table with a single column:

      
       tEnv.fromValues(
           lit(1).plus(2),
           lit(2L),
           lit(3)
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- f0: BIGINT NOT NULL
       
      Parameters:
      values - Expressions for constructing rows of the VALUES table.
    • fromValues

      Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, org.apache.flink.table.expressions.Expression... values)
      Creates a Table from a given collection of objects with a given row type.

      The difference between this method and fromValues(Expression...) is that the schema can be manually adjusted. This can be helpful for assigning more generic types, such as DECIMAL, or for naming the columns.

      Examples:

      
       tEnv.fromValues(
           DataTypes.ROW(
               DataTypes.FIELD("id", DataTypes.DECIMAL(10, 2)),
               DataTypes.FIELD("name", DataTypes.STRING())
           ),
           row(1, "ABC"),
           row(2L, "ABCDE")
       )
       

      will produce a Table with a schema as follows:

      
       root
       |-- id: DECIMAL(10, 2)
       |-- name: STRING
       

      For more examples see fromValues(Expression...).

      Parameters:
      rowType - Expected row type for the values.
      values - Expressions for constructing rows of the VALUES table.
    • fromValues

      Table fromValues(Iterable<?> values)
      Creates a Table from a given collection of objects.

      See fromValues(Object...) for more explanation.

      Parameters:
      values - Expressions for constructing rows of the VALUES table.
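A sketch of the Iterable variant with a plain Java list:

```java
// The element types INT and BIGINT are generalized to BIGINT,
// as described for fromValues(Object...).
Table tab = tEnv.fromValues(Arrays.asList(1, 2L, 3));
```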
    • fromValues

      Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, Iterable<?> values)
      Creates a Table from a given collection of objects with a given row type.

      See fromValues(AbstractDataType, Object...) for more explanation.

      Parameters:
      rowType - Expected row type for the values.
      values - Expressions for constructing rows of the VALUES table.
    • registerCatalog

      @Deprecated void registerCatalog(String catalogName, org.apache.flink.table.catalog.Catalog catalog)
      Deprecated.
      Use createCatalog(String, CatalogDescriptor) instead. The new method uses a CatalogDescriptor to initialize the catalog instance and store the CatalogDescriptor to the CatalogStore.
      Registers a Catalog under a unique name. All tables registered in the Catalog can be accessed.
      Parameters:
      catalogName - The name under which the catalog will be registered.
      catalog - The catalog to register.
    • createCatalog

      void createCatalog(String catalogName, org.apache.flink.table.catalog.CatalogDescriptor catalogDescriptor)
      Creates a Catalog using the provided CatalogDescriptor. All tables registered in the Catalog can be accessed. The CatalogDescriptor will be persisted into the CatalogStore.
      Parameters:
      catalogName - The name under which the catalog will be created.
      catalogDescriptor - The catalog descriptor for creating the catalog.
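A sketch of creating an in-memory catalog (the option key and value are assumptions based on Flink's generic in-memory catalog factory):

```java
Configuration options = new Configuration();
options.setString("type", "generic_in_memory"); // assumed catalog factory identifier
tEnv.createCatalog("my_catalog", CatalogDescriptor.of("my_catalog", options));
tEnv.useCatalog("my_catalog");
```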
    • getCatalog

      Optional<org.apache.flink.table.catalog.Catalog> getCatalog(String catalogName)
      Gets a registered Catalog by name.
      Parameters:
      catalogName - The name to look up the Catalog.
      Returns:
      The requested catalog, empty if there is no registered catalog with given name.
    • loadModule

      void loadModule(String moduleName, org.apache.flink.table.module.Module module)
      Loads a Module under a unique name. Modules will be kept in the loaded order. ValidationException is thrown when there is already a module with the same name.
      Parameters:
      moduleName - name of the Module
      module - the module instance
    • useModules

      void useModules(String... moduleNames)
      Enables the modules to use, in the declared name order. Loaded modules that do not appear in the given names become unused.
      Parameters:
      moduleNames - module names to be used
    • unloadModule

      void unloadModule(String moduleName)
      Unloads a Module with given name. ValidationException is thrown when there is no module with the given name.
      Parameters:
      moduleName - name of the Module
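A sketch of the module lifecycle (MyModule is a hypothetical Module implementation):

```java
tEnv.loadModule("myModule", new MyModule()); // throws ValidationException if "myModule" is already loaded
tEnv.useModules("myModule", "core");         // function resolution order: myModule first, then core
tEnv.unloadModule("myModule");               // throws ValidationException if "myModule" is not loaded
```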
    • registerFunction

      @Deprecated void registerFunction(String name, org.apache.flink.table.functions.ScalarFunction function)
      Deprecated.
      Use createTemporarySystemFunction(String, UserDefinedFunction) instead. Please note that the new method also uses the new type system and reflective extraction logic. It might be necessary to update the function implementation as well. See the documentation of ScalarFunction for more information on the new function design.
      Registers a ScalarFunction under a unique name. Replaces already existing user-defined functions under this name.
    • createTemporarySystemFunction

      void createTemporarySystemFunction(String name, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass)
      Registers a UserDefinedFunction class as a temporary system function.

      Compared to createTemporaryFunction(String, Class), system functions are identified by a global name that is independent of the current catalog and current database. Thus, this method allows extending the set of built-in system functions like TRIM, ABS, etc.

      Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary system function.

      Parameters:
      name - The name under which the function will be registered globally.
      functionClass - The function class containing the implementation.
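A sketch of registering and using a temporary system function (ReverseFunction is a hypothetical user-defined class):

```java
// A simple user-defined scalar function.
public static class ReverseFunction extends ScalarFunction {
    public String eval(String s) {
        return new StringBuilder(s).reverse().toString();
    }
}

// Registered under a global name, independent of the current catalog and database:
tEnv.createTemporarySystemFunction("REVERSE_STR", ReverseFunction.class);
// The function is now usable in SQL, e.g. SELECT REVERSE_STR(name) FROM MyTable.
```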
    • createTemporarySystemFunction

      void createTemporarySystemFunction(String name, org.apache.flink.table.functions.UserDefinedFunction functionInstance)
      Registers a UserDefinedFunction instance as a temporary system function.

      Compared to createTemporarySystemFunction(String, Class), this method takes a function instance that might have been parameterized before (e.g. through its constructor). This might be useful for more interactive sessions. Make sure that the instance is Serializable.

      Compared to createTemporaryFunction(String, UserDefinedFunction), system functions are identified by a global name that is independent of the current catalog and current database. Thus, this method allows extending the set of built-in system functions like TRIM, ABS, etc.

      Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary system function.

      Parameters:
      name - The name under which the function will be registered globally.
      functionInstance - The (possibly pre-configured) function instance containing the implementation.
    • dropTemporarySystemFunction

      boolean dropTemporarySystemFunction(String name)
      Drops a temporary system function registered under the given name.

      If a permanent function with the given name exists, it will be used from now on for any queries that reference this name.

      Parameters:
      name - The name under which the function has been registered globally.
      Returns:
      true if a function existed under the given name and was removed
    • createFunction

      void createFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass)
      Registers a UserDefinedFunction class as a catalog function in the given path.

      Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      There must not be another function (temporary or permanent) registered under the same path.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      functionClass - The function class containing the implementation.
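A sketch (MyScalarFunction is a hypothetical user-defined ScalarFunction subclass):

```java
// Without an explicit catalog in the path, resolution uses the current catalog,
// so this registers the function as <current-catalog>.my_db.my_func.
tEnv.createFunction("my_db.my_func", MyScalarFunction.class);
```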
    • createFunction

      void createFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass, boolean ignoreIfExists)
      Registers a UserDefinedFunction class as a catalog function in the given path.

      Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      functionClass - The function class containing the implementation.
      ignoreIfExists - If a function exists under the given path and this flag is set, no operation is executed. If the flag is not set, an exception is thrown.
    • createFunction

      void createFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris)
      Registers a UserDefinedFunction class as a catalog function in the given path, using the specified class name and user-defined resource URIs.

      Compared to createFunction(String, Class), this method allows registering a user-defined function by providing only a fully qualified class name and a list of resources that contain the implementation of the function along with its dependencies. Users do not need to initialize the function instance in advance. The resource files can be local or remote JAR files.

      Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      There must not be another function (temporary or permanent) registered under the same path.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      className - The class name of the UDF to be registered.
      resourceUris - The list of UDF resource URIs (local or remote).
    • createFunction

      void createFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris, boolean ignoreIfExists)
      Registers a UserDefinedFunction class as a catalog function in the given path, using the specified class name and user-defined resource URIs.

      Compared to createFunction(String, Class), this method allows registering a user-defined function by providing only a fully qualified class name and a list of resources that contain the implementation of the function along with its dependencies. Users do not need to initialize the function instance in advance. The resource files can be local or remote JAR files.

      Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      There must not be another function (temporary or permanent) registered under the same path.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      className - The class name of the UDF to be registered.
      resourceUris - The list of UDF resource URIs (local or remote).
      ignoreIfExists - If a function exists under the given path and this flag is set, no operation is executed. If the flag is not set, an exception is thrown.
    • createTemporaryFunction

      void createTemporaryFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass)
      Registers a UserDefinedFunction class as a temporary catalog function.

      Compared to createTemporarySystemFunction(String, Class) with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary function.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      functionClass - The function class containing the implementation.
    • createTemporaryFunction

      void createTemporaryFunction(String path, org.apache.flink.table.functions.UserDefinedFunction functionInstance)
      Registers a UserDefinedFunction instance as a temporary catalog function.

      Compared to createTemporaryFunction(String, Class), this method takes a function instance that might have been parameterized before (e.g. through its constructor). This might be useful for more interactive sessions. Make sure that the instance is Serializable.

      Compared to createTemporarySystemFunction(String, UserDefinedFunction) with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary function.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      functionInstance - The (possibly pre-configured) function instance containing the implementation.
    • createTemporaryFunction

      void createTemporaryFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris)
      Registers a UserDefinedFunction class as a temporary catalog function in the given path, using the specified class name and user-defined resource URIs.

      Compared to createTemporaryFunction(String, Class), this method allows registering a user-defined function by providing only a fully qualified class name and a list of resources that contain the implementation of the function along with its dependencies. Users do not need to initialize the function instance in advance. The resource files can be local or remote JAR files.

      Compared to createTemporarySystemFunction(String, String, List) with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.

      Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary function.

      Parameters:
      path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
      className - The class name of the UDF to be registered.
      resourceUris - The list of UDF resource URIs (local or remote).
    • createTemporarySystemFunction

      void createTemporarySystemFunction(String name, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris)
      Registers a UserDefinedFunction class as a temporary system function, using the specified class name and user-defined resource URIs.

      Compared to createTemporaryFunction(String, Class), this method allows registering a user-defined function by providing only a fully qualified class name and a list of resources that contain the implementation of the function along with its dependencies. Users do not need to initialize the function instance in advance. The resource files can be local or remote JAR files.

      Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary system function.

      Parameters:
      name - The name under which the function will be registered globally.
      className - The class name of the UDF to be registered.
      resourceUris - The list of UDF resource URIs (local or remote).
    • dropFunction

      boolean dropFunction(String path)
      Drops a catalog function registered in the given path.
      Parameters:
      path - The path under which the function has been registered. See also the TableEnvironment class description for the format of the path.
      Returns:
      true if a function existed in the given path and was removed
    • dropTemporaryFunction

      boolean dropTemporaryFunction(String path)
      Drops a temporary catalog function registered in the given path.

      If a permanent function with the given path exists, it will be used from now on for any queries that reference this path.

      Parameters:
      path - The path under which the function has been registered. See also the TableEnvironment class description for the format of the path.
      Returns:
      true if a function existed in the given path and was removed
    • createTemporaryTable

      void createTemporaryTable(String path, TableDescriptor descriptor)
      Registers the given TableDescriptor as a temporary catalog table.

      The descriptor is converted into a CatalogTable and stored in the catalog.

      Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.

      Examples:

      
       tEnv.createTemporaryTable("MyTable", TableDescriptor.forConnector("datagen")
         .schema(Schema.newBuilder()
           .column("f0", DataTypes.STRING())
           .build())
         .option(DataGenOptions.ROWS_PER_SECOND, 10)
         .option("fields.f0.kind", "random")
         .build());
       
      Parameters:
      path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
      descriptor - Template for creating a CatalogTable instance.
    • createTemporaryTable

      void createTemporaryTable(String path, TableDescriptor descriptor, boolean ignoreIfExists)
      Registers the given TableDescriptor as a temporary catalog table.

      The descriptor is converted into a CatalogTable and stored in the catalog.

      Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.

      Examples:

      
       tEnv.createTemporaryTable("MyTable", TableDescriptor.forConnector("datagen")
         .schema(Schema.newBuilder()
           .column("f0", DataTypes.STRING())
           .build())
         .option(DataGenOptions.ROWS_PER_SECOND, 10)
         .option("fields.f0.kind", "random")
         .build(),
        true);
       
      Parameters:
      path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
      descriptor - Template for creating a CatalogTable instance.
      ignoreIfExists - If a table exists under the given path and this flag is set, no operation is executed. If the flag is not set, an exception is thrown.
    • createTable

      void createTable(String path, TableDescriptor descriptor)
      Registers the given TableDescriptor as a catalog table.

      The descriptor is converted into a CatalogTable and stored in the catalog.

      If the table should not be permanently stored in a catalog, use createTemporaryTable(String, TableDescriptor) instead.

      Examples:

      
       tEnv.createTable("MyTable", TableDescriptor.forConnector("datagen")
         .schema(Schema.newBuilder()
           .column("f0", DataTypes.STRING())
           .build())
         .option(DataGenOptions.ROWS_PER_SECOND, 10)
         .option("fields.f0.kind", "random")
         .build());
       
      Parameters:
      path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
      descriptor - Template for creating a CatalogTable instance.
    • createTable

      boolean createTable(String path, TableDescriptor descriptor, boolean ignoreIfExists)
      Registers the given TableDescriptor as a catalog table.

      The descriptor is converted into a CatalogTable and stored in the catalog.

      If the table should not be permanently stored in a catalog, use createTemporaryTable(String, TableDescriptor, boolean) instead.

      Examples:

      
       tEnv.createTable("MyTable", TableDescriptor.forConnector("datagen")
         .schema(Schema.newBuilder()
           .column("f0", DataTypes.STRING())
           .build())
         .option(DataGenOptions.ROWS_PER_SECOND, 10)
         .option("fields.f0.kind", "random")
         .build(),
        true);
       
      Parameters:
      path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
      descriptor - Template for creating a CatalogTable instance.
      ignoreIfExists - If a table exists under the given path and this flag is set, no operation is executed. If the flag is not set, an exception is thrown.
      Returns:
      true if table was created in the given path, false if a permanent object already exists in the given path.
    • registerTable

      @Deprecated void registerTable(String name, Table table)
      Registers a Table under a unique name in the TableEnvironment's catalog. Registered tables can be referenced in SQL queries.

      Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.

      Parameters:
      name - The name under which the table will be registered.
      table - The table to register.
    • createTemporaryView

      void createTemporaryView(String path, Table view)
      Registers a Table API object as a temporary view similar to SQL temporary views.

      Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.

      Parameters:
      path - The path under which the view will be registered. See also the TableEnvironment class description for the format of the path.
      view - The view to register.
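A sketch of registering a temporary view and referencing it from SQL:

```java
Table table = tEnv.fromValues(row(1, "ABC"), row(2, "DEF"));
tEnv.createTemporaryView("MyView", table);   // registered under the current catalog and database
tEnv.executeSql("SELECT * FROM MyView");     // the view can now be referenced in SQL
```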
    • createView

      void createView(String path, Table view)
      Registers a Table API object as a view similar to SQL views.

      Temporary objects can shadow permanent ones. If a temporary object in a given path exists, the permanent one will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.

      Parameters:
      path - The path under which the view will be registered. See also the TableEnvironment class description for the format of the path.
      view - The view to register.
    • createView

      boolean createView(String path, Table view, boolean ignoreIfExists)
      Registers a Table API object as a view similar to SQL views.

      Temporary objects can shadow permanent ones. If a temporary object in a given path exists, the permanent one will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.

      Parameters:
      path - The path under which the view will be registered. See also the TableEnvironment class description for the format of the path.
      view - The view to register.
      ignoreIfExists - If a view or table already exists in the given path and this flag is set to true, no operation is executed. If the flag is false, an exception is thrown instead.
      Returns:
      true if the view was created in the given path, false if a permanent object already exists in the given path.
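      A hedged sketch of the ignoreIfExists variant; the catalog, database, and table names below are hypothetical:

      
       // Register a permanent view; with ignoreIfExists = true no exception
       // is thrown if an object already occupies the path.
       Table highValue = tEnv.sqlQuery("SELECT * FROM Orders WHERE amount > 100");
       boolean created = tEnv.createView("my_catalog.my_db.HighValueOrders", highValue, true);
       if (!created) {
           // A permanent object already exists under the path; nothing was changed.
       }
       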
    • scan

      @Deprecated Table scan(String... tablePath)
      Deprecated.
      Scans a registered table and returns the resulting Table.

      A table to scan must be registered in the TableEnvironment. It can either be directly registered or be an external member of a Catalog.

      See the documentation of useDatabase(String) or useCatalog(String) for the rules on the path resolution.

      Examples:

      Scanning a directly registered table.

      
       Table tab = tableEnv.scan("tableName");
       

      Scanning a table from a registered catalog.

      
       Table tab = tableEnv.scan("catalogName", "dbName", "tableName");
       
      Parameters:
      tablePath - The path of the table to scan.
      Returns:
      The resulting Table.
      See Also:
    • from

      Table from(String path)
      Reads a registered table and returns the resulting Table.

      The table to read must be registered in the TableEnvironment.

      See the documentation of useDatabase(String) or useCatalog(String) for the rules on the path resolution.

      Examples:

      Reading a table from default catalog and database.

      
       Table tab = tableEnv.from("tableName");
       

      Reading a table from a registered catalog.

      
       Table tab = tableEnv.from("catalogName.dbName.tableName");
       

      Reading a table from a registered catalog with escaping. Dots in e.g. a database name must be escaped.

      
       Table tab = tableEnv.from("catalogName.`db.Name`.Table");
       

      Note that the returned Table is an API object and only contains a pipeline description. It actually corresponds to a view in SQL terms. Call Executable.execute() to trigger an execution.

      Parameters:
      path - The path of a table API object to scan.
      Returns:
      The Table object describing the pipeline for further transformations.
      See Also:
    • from

      Table from(TableDescriptor descriptor)
      Returns a Table backed by the given descriptor.

      The descriptor won't be registered in the catalog, but it will be propagated directly in the operation tree. Note that calling this method multiple times, even with the same descriptor, results in multiple temporary tables. In such cases, it is recommended to register it under a name using createTemporaryTable(String, TableDescriptor) and reference it via from(String).

      Examples:

      
       Table table = tEnv.from(TableDescriptor.forConnector("datagen")
         .schema(Schema.newBuilder()
           .column("f0", DataTypes.STRING())
           .build())
         .build());
       

      Note that the returned Table is an API object and only contains a pipeline description. It actually corresponds to a view in SQL terms. Call Executable.execute() to trigger an execution.

      Returns:
      The Table object describing the pipeline for further transformations.
    • listCatalogs

      String[] listCatalogs()
      Gets the names of all catalogs registered in this environment.
      Returns:
      A list of the names of all registered catalogs.
    • listModules

      String[] listModules()
      Gets an array of names of all used modules in this environment in resolution order.
      Returns:
      A list of the names of used modules in resolution order.
    • listFullModules

      ModuleEntry[] listFullModules()
      Gets an array of all loaded modules with use status in this environment. Used modules are kept in resolution order.
      Returns:
      A list of name and use status entries of all loaded modules.
    • listDatabases

      String[] listDatabases()
      Gets the names of all databases registered in the current catalog.
      Returns:
      A list of the names of all registered databases in the current catalog.
    • listTables

      String[] listTables()
      Gets the names of all tables available in the current namespace (the current database of the current catalog). It returns both temporary and permanent tables and views.
      Returns:
      A list of the names of all registered tables in the current database of the current catalog.
      See Also:
    • listTables

      String[] listTables(String catalogName, String databaseName)
      Gets the names of all tables available in the given namespace (the given database of the given catalog). It returns both temporary and permanent tables and views.
      Returns:
      A list of the names of all registered tables in the given database of the given catalog.
      See Also:
    • listViews

      String[] listViews()
      Gets the names of all views available in the current namespace (the current database of the current catalog). It returns both temporary and permanent views.
      Returns:
      A list of the names of all registered views in the current database of the current catalog.
      See Also:
    • listTemporaryTables

      String[] listTemporaryTables()
      Gets the names of all temporary tables and views available in the current namespace (the current database of the current catalog).
      Returns:
      A list of the names of all registered temporary tables and views in the current database of the current catalog.
      See Also:
    • listTemporaryViews

      String[] listTemporaryViews()
      Gets the names of all temporary views available in the current namespace (the current database of the current catalog).
      Returns:
      A list of the names of all registered temporary views in the current database of the current catalog.
      See Also:
    • listUserDefinedFunctions

      String[] listUserDefinedFunctions()
      Gets the names of all user defined functions registered in this environment.
    • listFunctions

      String[] listFunctions()
      Gets the names of all functions in this environment.
    • dropTemporaryTable

      boolean dropTemporaryTable(String path)
      Drops a temporary table registered in the given path.

      If a permanent table with a given path exists, it will be used from now on for any queries that reference this path.

      Parameters:
      path - The given path under which the temporary table will be dropped. See also the TableEnvironment class description for the format of the path.
      Returns:
      true if a table existed in the given path and was removed
    • dropTable

      boolean dropTable(String path)
      Drops a table registered in the given path.

      This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using dropTemporaryTable(java.lang.String).

      Compared to SQL, this method will not throw an error if the table does not exist. Use dropTable(java.lang.String, boolean) to change the default behavior.

      Parameters:
      path - The given path under which the table will be dropped. See also the TableEnvironment class description for the format of the path.
      Returns:
      true if table existed in the given path and was dropped, false if table didn't exist in the given path.
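      The shadowing rule above can be illustrated with a short sketch (the path below is hypothetical):

      
       // Assume both a temporary and a permanent table exist under "my_db.Orders".
       // dropTable only removes permanent objects, so the temporary one must go first.
       tEnv.dropTemporaryTable("my_db.Orders");          // removes the shadowing temporary table
       boolean dropped = tEnv.dropTable("my_db.Orders"); // now removes the permanent table
       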
    • dropTable

      boolean dropTable(String path, boolean ignoreIfNotExists)
      Drops a table registered in the given path.

      This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using dropTemporaryTable(java.lang.String).

      Parameters:
      path - The given path under which the given table will be dropped. See also the TableEnvironment class description for the format of the path.
      ignoreIfNotExists - If false, an exception is thrown if the table to drop does not exist.
      Returns:
      true if table existed in the given path and was dropped, false if table didn't exist in the given path.
    • dropTemporaryView

      boolean dropTemporaryView(String path)
      Drops a temporary view registered in the given path.

      If a permanent table or view with a given path exists, it will be used from now on for any queries that reference this path.

      Parameters:
      path - The given path under which the temporary view will be dropped. See also the TableEnvironment class description for the format of the path.
      Returns:
      true if a view existed in the given path and was removed
    • dropView

      boolean dropView(String path)
      Drops a view registered in the given path.

      This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using dropTemporaryView(java.lang.String).

      Compared to SQL, this method will not throw an error if the view does not exist. Use dropView(java.lang.String, boolean) to change the default behavior.

      Parameters:
      path - The given path under which the view will be dropped. See also the TableEnvironment class description for the format of the path.
      Returns:
      true if view existed in the given path and was dropped, false if view didn't exist in the given path.
    • dropView

      boolean dropView(String path, boolean ignoreIfNotExists)
      Drops a view registered in the given path.

      This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using dropTemporaryView(java.lang.String).

      Parameters:
      path - The given path under which the view will be dropped. See also the TableEnvironment class description for the format of the path.
      ignoreIfNotExists - If false, an exception is thrown if the view to drop does not exist.
      Returns:
      true if view existed in the given path and was dropped, false if view didn't exist in the given path and ignoreIfNotExists was true.
    • explainSql

      default String explainSql(String statement, ExplainDetail... extraDetails)
      Returns the AST of the specified statement and the execution plan to compute the result of the given statement.
      Parameters:
      statement - The statement for which the AST and execution plan will be returned.
      extraDetails - The extra explain details that the explain result should include, e.g. the estimated cost, the changelog mode for streaming, or the execution plan in JSON format.
      Returns:
      AST and the execution plan.
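      For illustration, a sketch that requests extra details (the Orders table is an assumed, already registered source):

      
       import org.apache.flink.table.api.ExplainDetail;

       String plan = tEnv.explainSql(
               "SELECT name, COUNT(*) FROM Orders GROUP BY name",
               ExplainDetail.ESTIMATED_COST,
               ExplainDetail.CHANGELOG_MODE);
       System.out.println(plan); // the AST plus the annotated execution plan
       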
    • explainSql

      String explainSql(String statement, ExplainFormat format, ExplainDetail... extraDetails)
      Returns the AST of the specified statement and the execution plan to compute the result of the given statement.
      Parameters:
      statement - The statement for which the AST and execution plan will be returned.
      format - The output format of explained plan.
      extraDetails - The extra explain details that the explain result should include, e.g. the estimated cost, the changelog mode for streaming, or the execution plan in JSON format.
      Returns:
      AST and the execution plan.
    • getCompletionHints

      @Deprecated String[] getCompletionHints(String statement, int position)
      Deprecated.
      Will be removed in the next release
      Returns completion hints for the given statement at the given cursor position. The completion happens case-insensitively.
      Parameters:
      statement - Partial or slightly incorrect SQL statement
      position - cursor position
      Returns:
      completion hints that fit at the current cursor position
    • sqlQuery

      Table sqlQuery(String query)
      Evaluates a SQL query on registered tables and returns a Table object describing the pipeline for further transformations.

      All tables and other objects referenced by the query must be registered in the TableEnvironment. For example, use createTemporaryView(String, Table) for referencing a Table object or createTemporarySystemFunction(String, Class) for functions.

      Alternatively, a Table object is automatically registered when its Table#toString() method is called, for example when it is embedded into a string. Hence, SQL queries can directly reference a Table object inline (i.e. anonymous) as follows:

      
       Table table = ...;
       String tableName = table.toString();
       // no explicit registration is needed; toString() registered the table under a generated name
       tEnv.sqlQuery("SELECT * FROM " + tableName + " WHERE a > 12");
       

      Note that the returned Table is an API object and only contains a pipeline description. It actually corresponds to a view in SQL terms. Call Executable.execute() to trigger an execution or use executeSql(String) directly.

      Parameters:
      query - The SQL query to evaluate.
      Returns:
      The Table object describing the pipeline for further transformations.
    • executeSql

      TableResult executeSql(String statement)
      Executes the given single statement and returns the execution result.

      The statement can be DDL/DML/DQL/SHOW/DESCRIBE/EXPLAIN/USE. For DML and DQL, this method returns TableResult once the job has been submitted. For DDL and DCL statements, TableResult is returned once the operation has finished.

      If multiple pipelines should insert data into one or more sink tables as part of a single execution, use a StatementSet (see createStatementSet()).

      By default, all DML operations are executed asynchronously. Use TableResult.await() or TableResult.getJobClient() to monitor the execution. Set TableConfigOptions.TABLE_DML_SYNC for always synchronous execution.

      Returns:
      the content for DQL/SHOW/DESCRIBE/EXPLAIN statements, the affected row count for DML (-1 means unknown), or a string message ("OK") for other statements.
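      A sketch of the synchronous/asynchronous distinction; the connector and table names are illustrative:

      
       import org.apache.flink.table.api.TableResult;

       // DDL: returns once the catalog operation has finished.
       tEnv.executeSql(
               "CREATE TABLE Sink (name STRING) WITH ('connector' = 'blackhole')");

       // DML: returns once the job has been submitted;
       // await() blocks until the job finishes (may throw on interruption or failure).
       TableResult result = tEnv.executeSql("INSERT INTO Sink SELECT name FROM Orders");
       result.await();
       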
    • getCurrentCatalog

      String getCurrentCatalog()
      Gets the current default catalog name of the current session.
      Returns:
      The current default catalog name that is used for the path resolution.
      See Also:
    • useCatalog

      void useCatalog(@Nullable String catalogName)
      Sets the current catalog to the given value. It also sets the default database to the catalog's default one. See also useDatabase(String).

      This is used during the resolution of object paths. Both the catalog and database are optional when referencing catalog objects such as tables, views etc. The algorithm looks for requested objects in the following paths, in that order:

      1. [current-catalog].[current-database].[requested-path]
      2. [current-catalog].[requested-path]
      3. [requested-path]

      Example:

      Given the following structure, with the default catalog set to default_catalog and the default database set to default_database:

       root:
         |- default_catalog
             |- default_database
                 |- tab1
             |- db1
                 |- tab1
         |- cat1
             |- db1
                 |- tab1
       

      The following table describes resolved paths:

      Requested path    Resolved path
      tab1              default_catalog.default_database.tab1
      db1.tab1          default_catalog.db1.tab1
      cat1.db1.tab1     cat1.db1.tab1

      You can unset the current catalog by passing a null value. If the current catalog is unset, you need to use fully qualified identifiers.
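      The resolution order above can be sketched as follows, reusing the catalog and database names from the example structure:

      
       tEnv.useCatalog("cat1");
       tEnv.useDatabase("db1");
       tEnv.from("tab1");          // resolves to cat1.db1.tab1

       tEnv.useCatalog(null);      // unset the current catalog
       tEnv.from("cat1.db1.tab1"); // fully qualified identifiers are now required
       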

      Parameters:
      catalogName - The name of the catalog to set as the current default catalog.
      See Also:
    • getCurrentDatabase

      String getCurrentDatabase()
      Gets the current default database name of the running session.
      Returns:
      The name of the current database of the current catalog.
      See Also:
    • useDatabase

      void useDatabase(@Nullable String databaseName)
      Sets the current default database. It has to exist in the current catalog. That path will be used as the default one when looking for unqualified object names.

      This is used during the resolution of object paths. Both the catalog and database are optional when referencing catalog objects such as tables, views etc. The algorithm looks for requested objects in the following paths, in that order:

      1. [current-catalog].[current-database].[requested-path]
      2. [current-catalog].[requested-path]
      3. [requested-path]

      Example:

      Given the following structure, with the default catalog set to default_catalog and the default database set to default_database:

       root:
         |- default_catalog
             |- default_database
                 |- tab1
             |- db1
                 |- tab1
         |- cat1
             |- db1
                 |- tab1
       

      The following table describes resolved paths:

      Requested path    Resolved path
      tab1              default_catalog.default_database.tab1
      db1.tab1          default_catalog.db1.tab1
      cat1.db1.tab1     cat1.db1.tab1

      You can unset the current database by passing a null value. If the current database is unset, you need to qualify identifiers at least with the database name.

      Parameters:
      databaseName - The name of the database to set as the current database.
      See Also:
    • getConfig

      TableConfig getConfig()
      Returns the table config that defines the runtime behavior of the Table API.
    • createStatementSet

      StatementSet createStatementSet()
      Returns a StatementSet that accepts pipelines defined by DML statements or Table objects. The planner can optimize all added statements together and then submit them as one job.
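      As a sketch (sink and source names are hypothetical), multiple INSERT pipelines can be bundled into one job:

      
       import org.apache.flink.table.api.StatementSet;

       StatementSet set = tEnv.createStatementSet();
       set.addInsertSql("INSERT INTO SinkA SELECT name FROM Orders");
       set.addInsertSql("INSERT INTO SinkB SELECT amount FROM Orders");
       set.execute(); // optimized and submitted together as a single job
       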
    • loadPlan

      @Experimental CompiledPlan loadPlan(PlanReference planReference) throws org.apache.flink.table.api.TableException
      Loads a plan from a PlanReference into a CompiledPlan.

      Compiled plans can be persisted and reloaded across Flink versions. They describe static pipelines to ensure backwards compatibility and enable stateful streaming job upgrades. See CompiledPlan and the website documentation for more information.

      This method parses the input reference and validates the plan. The returned instance can be executed via Executable.execute().

      Note: The compiled plan feature is not supported in batch mode.

      Throws:
      org.apache.flink.table.api.TableException - if the plan cannot be loaded from the filesystem, or from classpath resources, or if the plan is invalid.
    • compilePlanSql

      @Experimental CompiledPlan compilePlanSql(String stmt) throws org.apache.flink.table.api.TableException
      Compiles a SQL DML statement into a CompiledPlan.

      Compiled plans can be persisted and reloaded across Flink versions. They describe static pipelines to ensure backwards compatibility and enable stateful streaming job upgrades. See CompiledPlan and the website documentation for more information.

      Note: Only INSERT INTO is supported at the moment.

      Note: The compiled plan feature is not supported in batch mode.

      Throws:
      org.apache.flink.table.api.TableException - if the SQL statement is invalid or if the plan cannot be persisted.
      See Also:
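      A hedged round-trip sketch: the file path, table names, and INSERT statement below are illustrative, and compiled plans only apply to streaming pipelines:

      
       import org.apache.flink.table.api.CompiledPlan;
       import org.apache.flink.table.api.PlanReference;

       CompiledPlan plan = tEnv.compilePlanSql("INSERT INTO Sink SELECT name FROM Orders");
       plan.writeToFile("/tmp/orders-plan.json"); // persist for a later version upgrade

       // Later, possibly on a newer Flink version:
       tEnv.executePlan(PlanReference.fromFile("/tmp/orders-plan.json"));
       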
    • executePlan

      @Experimental default TableResult executePlan(PlanReference planReference) throws org.apache.flink.table.api.TableException
      Shorthand for tEnv.loadPlan(planReference).execute().
      Throws:
      org.apache.flink.table.api.TableException
      See Also: