Class TableEnvironmentImpl
- java.lang.Object
-
- org.apache.flink.table.api.internal.TableEnvironmentImpl
-
- All Implemented Interfaces:
TableEnvironmentInternal, TableEnvironment
@Internal public class TableEnvironmentImpl extends Object implements TableEnvironmentInternal
Implementation of TableEnvironment that works exclusively with Table API interfaces. Only TableSource is supported as an input and TableSink as an output. It also does not bind to any particular StreamExecutionEnvironment.
-
-
Field Summary
Fields
- protected Executor execEnv
- protected FunctionCatalog functionCatalog
- protected Planner planner
- protected ResourceManager resourceManager
- protected TableConfig tableConfig
-
Constructor Summary
Constructors
- protected TableEnvironmentImpl(CatalogManager catalogManager, ModuleManager moduleManager, ResourceManager resourceManager, TableConfig tableConfig, Executor executor, FunctionCatalog functionCatalog, Planner planner, boolean isStreamingMode)
-
Method Summary
Methods
- CompiledPlan compilePlan(List<ModifyOperation> operations)
- CompiledPlan compilePlanSql(String stmt): Compiles a SQL DML statement into a CompiledPlan.
- static TableEnvironmentImpl create(org.apache.flink.configuration.Configuration configuration): Creates a table environment that is the entry point and central context for creating Table and SQL API programs.
- static TableEnvironmentImpl create(EnvironmentSettings settings): Creates a table environment that is the entry point and central context for creating Table and SQL API programs.
- void createCatalog(String catalogName, org.apache.flink.table.catalog.CatalogDescriptor catalogDescriptor): Creates a Catalog using the provided CatalogDescriptor.
- void createFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass): Registers a UserDefinedFunction class as a catalog function in the given path.
- void createFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass, boolean ignoreIfExists): Registers a UserDefinedFunction class as a catalog function in the given path.
- void createFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris): Registers a UserDefinedFunction class as a catalog function in the given path by the specific class name and user defined resource URI.
- void createFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris, boolean ignoreIfExists): Registers a UserDefinedFunction class as a catalog function in the given path by the specific class name and user defined resource URI.
- StatementSet createStatementSet(): Returns a StatementSet that accepts pipelines defined by DML statements or Table objects.
- void createTable(String path, TableDescriptor descriptor): Registers the given TableDescriptor as a catalog table.
- boolean createTable(String path, TableDescriptor descriptor, boolean ignoreIfExists): Registers the given TableDescriptor as a catalog table.
- TableImpl createTable(QueryOperation tableOperation)
- void createTemporaryFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass): Registers a UserDefinedFunction class as a temporary catalog function.
- void createTemporaryFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris): Registers a UserDefinedFunction class as a temporary catalog function in the given path by the specific class name and user defined resource URI.
- void createTemporaryFunction(String path, org.apache.flink.table.functions.UserDefinedFunction functionInstance): Registers a UserDefinedFunction instance as a temporary catalog function.
- void createTemporarySystemFunction(String name, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass): Registers a UserDefinedFunction class as a temporary system function.
- void createTemporarySystemFunction(String name, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris): Registers a UserDefinedFunction class as a temporary system function by the specific class name and user defined resource URI.
- void createTemporarySystemFunction(String name, org.apache.flink.table.functions.UserDefinedFunction functionInstance): Registers a UserDefinedFunction instance as a temporary system function.
- void createTemporaryTable(String path, TableDescriptor descriptor): Registers the given TableDescriptor as a temporary catalog table.
- void createTemporaryTable(String path, TableDescriptor descriptor, boolean ignoreIfExists): Registers the given TableDescriptor as a temporary catalog table.
- void createTemporaryView(String path, Table view): Registers a Table API object as a temporary view similar to SQL temporary views.
- void createView(String path, Table view): Registers a Table API object as a view similar to SQL views.
- boolean createView(String path, Table view, boolean ignoreIfExists): Registers a Table API object as a view similar to SQL views.
- boolean dropFunction(String path): Drops a catalog function registered in the given path.
- boolean dropTable(String path): Drops a table registered in the given path.
- boolean dropTable(String path, boolean ignoreIfNotExists): Drops a table registered in the given path.
- boolean dropTemporaryFunction(String path): Drops a temporary catalog function registered in the given path.
- boolean dropTemporarySystemFunction(String name): Drops a temporary system function registered under the given name.
- boolean dropTemporaryTable(String path): Drops a temporary table registered in the given path.
- boolean dropTemporaryView(String path): Drops a temporary view registered in the given path.
- boolean dropView(String path): Drops a view registered in the given path.
- boolean dropView(String path, boolean ignoreIfNotExists): Drops a view registered in the given path.
- TableResultInternal executeCachedPlanInternal(CachedPlan cachedPlan): Execute the given CachedPlan and return the execution result.
- TableResultInternal executeInternal(List<ModifyOperation> operations): Execute the given modify operations and return the execution result.
- TableResultInternal executeInternal(Operation operation): Execute the given operation and return the execution result.
- TableResultInternal executePlan(InternalPlan plan)
- TableResult executeSql(String statement): Executes the given single statement and returns the execution result.
- String explainInternal(List<Operation> operations, ExplainFormat format, ExplainDetail... extraDetails): Returns the AST of this table and the execution plan to compute the result of this table.
- String explainPlan(InternalPlan compiledPlan, ExplainDetail... extraDetails)
- String explainSql(String statement, ExplainFormat format, ExplainDetail... extraDetails): Returns the AST of the specified statement and the execution plan to compute the result of the given statement.
- Table from(String path): Reads a registered table and returns the resulting Table.
- Table from(TableDescriptor descriptor): Returns a Table backed by the given descriptor.
- Table fromTableSource(org.apache.flink.table.legacy.sources.TableSource<?> source): Creates a table from a table source.
- Table fromValues(Iterable<?> values): Creates a Table from given collection of objects.
- Table fromValues(Object... values): Creates a Table from given values.
- Table fromValues(org.apache.flink.table.expressions.Expression... values): Creates a Table from given values.
- Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, Iterable<?> values): Creates a Table from given collection of objects with a given row type.
- Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, Object... values): Creates a Table from given collection of objects with a given row type.
- Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, org.apache.flink.table.expressions.Expression... values): Creates a Table from given collection of objects with a given row type.
- org.apache.flink.api.dag.Pipeline generatePipelineFromQueryOperation(QueryOperation operation, List<org.apache.flink.api.dag.Transformation<?>> transformations): Generates an execution Pipeline from a QueryOperation.
- Optional<org.apache.flink.table.catalog.Catalog> getCatalog(String catalogName): Gets a registered Catalog by name.
- CatalogManager getCatalogManager(): Returns a CatalogManager that deals with all catalog objects.
- String[] getCompletionHints(String statement, int position): Returns completion hints for the given statement at the given cursor position.
- TableConfig getConfig(): Returns the table config that defines the runtime behavior of the Table API.
- String getCurrentCatalog(): Gets the current default catalog name of the current session.
- String getCurrentDatabase(): Gets the current default database name of the running session.
- OperationTreeBuilder getOperationTreeBuilder(): Returns an OperationTreeBuilder that can create QueryOperations.
- Parser getParser(): Returns a Parser that provides methods for parsing a SQL string.
- Planner getPlanner()
- String[] listCatalogs(): Gets the names of all catalogs registered in this environment.
- String[] listDatabases(): Gets the names of all databases registered in the current catalog.
- ModuleEntry[] listFullModules(): Gets an array of all loaded modules with use status in this environment.
- String[] listFunctions(): Gets the names of all functions in this environment.
- String[] listModules(): Gets an array of names of all used modules in this environment in resolution order.
- String[] listTables(): Gets the names of all tables available in the current namespace (the current database of the current catalog).
- String[] listTables(String catalog, String databaseName): Gets the names of all tables available in the given namespace (the given database of the given catalog).
- String[] listTemporaryTables(): Gets the names of all temporary tables and views available in the current namespace (the current database of the current catalog).
- String[] listTemporaryViews(): Gets the names of all temporary views available in the current namespace (the current database of the current catalog).
- String[] listUserDefinedFunctions(): Gets the names of all user defined functions registered in this environment.
- String[] listViews(): Gets the names of all views available in the current namespace (the current database of the current catalog).
- void loadModule(String moduleName, org.apache.flink.table.module.Module module): Loads a Module under a unique name.
- CompiledPlan loadPlan(PlanReference planReference): Loads a plan from a PlanReference into a CompiledPlan.
- protected QueryOperation qualifyQueryOperation(org.apache.flink.table.catalog.ObjectIdentifier identifier, QueryOperation queryOperation): Subclasses can override this method to transform the given QueryOperation into a new one with the qualified object identifier.
- void registerCatalog(String catalogName, org.apache.flink.table.catalog.Catalog catalog): Registers a Catalog under a unique name.
- void registerFunction(String name, org.apache.flink.table.functions.ScalarFunction function): Registers a ScalarFunction under a unique name.
- void registerTable(String name, Table table): Registers a Table under a unique name in the TableEnvironment's catalog.
- Table scan(String... tablePath): Scans a registered table and returns the resulting Table.
- Table sqlQuery(String query): Evaluates a SQL query on registered tables and returns a Table object describing the pipeline for further transformations.
- protected List<org.apache.flink.api.dag.Transformation<?>> translate(List<ModifyOperation> modifyOperations)
- void unloadModule(String moduleName): Unloads a Module with given name.
- void useCatalog(String catalogName): Sets the current catalog to the given value.
- void useDatabase(String databaseName): Sets the current default database.
- void useModules(String... moduleNames): Enables modules in use in the declared name order.
-
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.apache.flink.table.api.TableEnvironment
executePlan, explainSql
-
Methods inherited from interface org.apache.flink.table.api.internal.TableEnvironmentInternal
explainInternal
-
Field Detail
-
resourceManager
protected final ResourceManager resourceManager
-
tableConfig
protected final TableConfig tableConfig
-
execEnv
protected final Executor execEnv
-
functionCatalog
protected final FunctionCatalog functionCatalog
-
planner
protected final Planner planner
-
-
Constructor Detail
-
TableEnvironmentImpl
protected TableEnvironmentImpl(CatalogManager catalogManager, ModuleManager moduleManager, ResourceManager resourceManager, TableConfig tableConfig, Executor executor, FunctionCatalog functionCatalog, Planner planner, boolean isStreamingMode)
-
-
Method Detail
-
create
public static TableEnvironmentImpl create(org.apache.flink.configuration.Configuration configuration)
Description copied from interface: TableEnvironment. Creates a table environment that is the entry point and central context for creating Table and SQL API programs.
It is unified both on a language level for all JVM-based languages (i.e. there is no distinction between Scala and Java API) and for bounded and unbounded data processing.
A table environment is responsible for:
- Connecting to external systems.
- Registering and retrieving Tables and other meta objects from a catalog.
- Executing SQL statements.
- Offering further configuration options.
Note: This environment is meant for pure table programs. If you would like to convert from or to other Flink APIs, it might be necessary to use one of the available language-specific table environments in the corresponding bridging modules.
- Parameters:
configuration - The specified options are used to instantiate the TableEnvironment.
-
create
public static TableEnvironmentImpl create(EnvironmentSettings settings)
Description copied from interface: TableEnvironment. Creates a table environment that is the entry point and central context for creating Table and SQL API programs.
It is unified both on a language level for all JVM-based languages (i.e. there is no distinction between Scala and Java API) and for bounded and unbounded data processing.
A table environment is responsible for:
- Connecting to external systems.
- Registering and retrieving Tables and other meta objects from a catalog.
- Executing SQL statements.
- Offering further configuration options.
Note: This environment is meant for pure table programs. If you would like to convert from or to other Flink APIs, it might be necessary to use one of the available language-specific table environments in the corresponding bridging modules.
- Parameters:
settings - The environment settings used to instantiate the TableEnvironment.
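A minimal usage sketch (not part of the original Javadoc; it assumes a planner module such as flink-table-planner is on the classpath):
EnvironmentSettings settings = EnvironmentSettings.newInstance()
        .inStreamingMode()
        .build();
TableEnvironmentImpl tEnv = TableEnvironmentImpl.create(settings);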
-
fromValues
public Table fromValues(Object... values)
Description copied from interface: TableEnvironment. Creates a Table from given values.
Examples:
You can use a row(...) expression to create composite rows:
tEnv.fromValues(
    row(1, "ABC"),
    row(2L, "ABCDE")
)
will produce a Table with a schema as follows:
root
|-- f0: BIGINT NOT NULL     // original types INT and BIGINT are generalized to BIGINT
|-- f1: VARCHAR(5) NOT NULL // original types CHAR(3) and CHAR(5) are generalized to VARCHAR(5);
                            // it uses VARCHAR instead of CHAR so that no padding is applied
The method will derive the types automatically from the input expressions. If types at a certain position differ, the method will try to find a common super type for all types. If a common super type does not exist, an exception will be thrown. If you want to specify the requested type explicitly see TableEnvironment.fromValues(AbstractDataType, Object...).
It is also possible to use Row objects instead of row expressions.
ROWs that are a result of e.g. a function call are not flattened:
public class RowFunction extends ScalarFunction {
    @DataTypeHint("ROW<f0 BIGINT, f1 VARCHAR(5)>")
    Row eval();
}

tEnv.fromValues(
    call(new RowFunction()),
    call(new RowFunction())
)
will produce a Table with a schema as follows:
root
|-- f0: ROW<`f0` BIGINT, `f1` VARCHAR(5)>
The row constructor can be dropped to create a table with a single column:
tEnv.fromValues(
    1,
    2L,
    3
)
will produce a Table with a schema as follows:
root
|-- f0: BIGINT NOT NULL
- Specified by: fromValues in interface TableEnvironment
- Parameters:
values - Expressions for constructing rows of the VALUES table.
-
fromValues
public Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, Object... values)
Description copied from interface: TableEnvironment. Creates a Table from given collection of objects with a given row type.
The difference between this method and TableEnvironment.fromValues(Object...) is that the schema can be manually adjusted. It might be helpful for assigning more generic types like e.g. DECIMAL or for naming the columns.
Examples:
tEnv.fromValues(
    DataTypes.ROW(
        DataTypes.FIELD("id", DataTypes.DECIMAL(10, 2)),
        DataTypes.FIELD("name", DataTypes.STRING())
    ),
    row(1, "ABC"),
    row(2L, "ABCDE")
)
will produce a Table with a schema as follows:
root
|-- id: DECIMAL(10, 2)
|-- name: STRING
For more examples see TableEnvironment.fromValues(Object...).
- Specified by: fromValues in interface TableEnvironment
- Parameters:
rowType - Expected row type for the values.
values - Expressions for constructing rows of the VALUES table.
- See Also:
TableEnvironment.fromValues(Object...)
-
fromValues
public Table fromValues(org.apache.flink.table.expressions.Expression... values)
Description copied from interface: TableEnvironment. Creates a Table from given values.
Examples:
You can use a row(...) expression to create composite rows:
tEnv.fromValues(
    row(1, "ABC"),
    row(2L, "ABCDE")
)
will produce a Table with a schema as follows:
root
|-- f0: BIGINT NOT NULL     // original types INT and BIGINT are generalized to BIGINT
|-- f1: VARCHAR(5) NOT NULL // original types CHAR(3) and CHAR(5) are generalized to VARCHAR(5);
                            // it uses VARCHAR instead of CHAR so that no padding is applied
The method will derive the types automatically from the input expressions. If types at a certain position differ, the method will try to find a common super type for all types. If a common super type does not exist, an exception will be thrown. If you want to specify the requested type explicitly see TableEnvironment.fromValues(AbstractDataType, Expression...).
It is also possible to use Row objects instead of row expressions.
ROWs that are a result of e.g. a function call are not flattened:
public class RowFunction extends ScalarFunction {
    @DataTypeHint("ROW<f0 BIGINT, f1 VARCHAR(5)>")
    Row eval();
}

tEnv.fromValues(
    call(new RowFunction()),
    call(new RowFunction())
)
will produce a Table with a schema as follows:
root
|-- f0: ROW<`f0` BIGINT, `f1` VARCHAR(5)>
The row constructor can be dropped to create a table with a single column:
tEnv.fromValues(
    lit(1).plus(2),
    lit(2L),
    lit(3)
)
will produce a Table with a schema as follows:
root
|-- f0: BIGINT NOT NULL
- Specified by: fromValues in interface TableEnvironment
- Parameters:
values - Expressions for constructing rows of the VALUES table.
-
fromValues
public Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, org.apache.flink.table.expressions.Expression... values)
Description copied from interface: TableEnvironment. Creates a Table from given collection of objects with a given row type.
The difference between this method and TableEnvironment.fromValues(Expression...) is that the schema can be manually adjusted. It might be helpful for assigning more generic types like e.g. DECIMAL or for naming the columns.
Examples:
tEnv.fromValues(
    DataTypes.ROW(
        DataTypes.FIELD("id", DataTypes.DECIMAL(10, 2)),
        DataTypes.FIELD("name", DataTypes.STRING())
    ),
    row(1, "ABC"),
    row(2L, "ABCDE")
)
will produce a Table with a schema as follows:
root
|-- id: DECIMAL(10, 2)
|-- name: STRING
For more examples see TableEnvironment.fromValues(Expression...).
- Specified by: fromValues in interface TableEnvironment
- Parameters:
rowType - Expected row type for the values.
values - Expressions for constructing rows of the VALUES table.
- See Also:
TableEnvironment.fromValues(Expression...)
-
fromValues
public Table fromValues(Iterable<?> values)
Description copied from interface: TableEnvironment. Creates a Table from given collection of objects.
See TableEnvironment.fromValues(Object...) for more explanation.
- Specified by: fromValues in interface TableEnvironment
- Parameters:
values - Expressions for constructing rows of the VALUES table.
- See Also:
TableEnvironment.fromValues(Object...)
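For illustration, a short sketch of passing a Java collection (not from the original Javadoc; it assumes org.apache.flink.types.Row is used for the elements):
// import java.util.Arrays; import java.util.List; import org.apache.flink.types.Row;
List<Row> data = Arrays.asList(Row.of(1, "ABC"), Row.of(2L, "ABCDE"));
Table values = tEnv.fromValues(data);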
-
fromValues
public Table fromValues(org.apache.flink.table.types.AbstractDataType<?> rowType, Iterable<?> values)
Description copied from interface: TableEnvironment. Creates a Table from given collection of objects with a given row type.
See TableEnvironment.fromValues(AbstractDataType, Object...) for more explanation.
- Specified by: fromValues in interface TableEnvironment
- Parameters:
rowType - Expected row type for the values.
values - Expressions for constructing rows of the VALUES table.
- See Also:
TableEnvironment.fromValues(AbstractDataType, Object...)
-
getPlanner
@VisibleForTesting public Planner getPlanner()
-
fromTableSource
public Table fromTableSource(org.apache.flink.table.legacy.sources.TableSource<?> source)
Description copied from interface: TableEnvironmentInternal. Creates a table from a table source.
- Specified by: fromTableSource in interface TableEnvironmentInternal
- Parameters:
source - table source used as table
-
registerCatalog
public void registerCatalog(String catalogName, org.apache.flink.table.catalog.Catalog catalog)
Description copied from interface: TableEnvironment. Registers a Catalog under a unique name. All tables registered in the Catalog can be accessed.
- Specified by: registerCatalog in interface TableEnvironment
- Parameters:
catalogName - The name under which the catalog will be registered.
catalog - The catalog to register.
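A short sketch (not from the original Javadoc; it assumes the in-memory catalog implementation GenericInMemoryCatalog, and the catalog name is illustrative):
// import org.apache.flink.table.catalog.GenericInMemoryCatalog;
tEnv.registerCatalog("my_catalog", new GenericInMemoryCatalog("my_catalog"));
tEnv.useCatalog("my_catalog");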
-
createCatalog
public void createCatalog(String catalogName, org.apache.flink.table.catalog.CatalogDescriptor catalogDescriptor)
Description copied from interface: TableEnvironment. Creates a Catalog using the provided CatalogDescriptor. All tables registered in the Catalog can be accessed. The CatalogDescriptor will be persisted into the CatalogStore.
- Specified by: createCatalog in interface TableEnvironment
- Parameters:
catalogName - The name under which the catalog will be created.
catalogDescriptor - The catalog descriptor for creating the catalog.
-
getCatalog
public Optional<org.apache.flink.table.catalog.Catalog> getCatalog(String catalogName)
Description copied from interface: TableEnvironment. Gets a registered Catalog by name.
- Specified by: getCatalog in interface TableEnvironment
- Parameters:
catalogName - The name to look up the Catalog.
- Returns:
- The requested catalog, empty if there is no registered catalog with given name.
-
loadModule
public void loadModule(String moduleName, org.apache.flink.table.module.Module module)
Description copied from interface: TableEnvironment. Loads a Module under a unique name. Modules will be kept in the loaded order. A ValidationException is thrown when there is already a module with the same name.
- Specified by: loadModule in interface TableEnvironment
- Parameters:
moduleName - name of the Module
module - the module instance
-
useModules
public void useModules(String... moduleNames)
Description copied from interface: TableEnvironment. Enables modules in use in the declared name order. Modules that have been loaded but are not listed in the given names become unused.
- Specified by: useModules in interface TableEnvironment
- Parameters:
moduleNames - module names to be used
-
unloadModule
public void unloadModule(String moduleName)
Description copied from interface: TableEnvironment. Unloads a Module with the given name. A ValidationException is thrown when there is no module with the given name.
- Specified by: unloadModule in interface TableEnvironment
- Parameters:
moduleName - name of the Module
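A hedged sketch of the module lifecycle (MyUdfModule is a hypothetical Module implementation, not part of Flink; module names are illustrative):
tEnv.loadModule("myUdfs", new MyUdfModule());   // hypothetical custom org.apache.flink.table.module.Module
tEnv.useModules("core", "myUdfs");              // resolution order: core first, then myUdfs
tEnv.unloadModule("myUdfs");                    // throws ValidationException if no module with this name is loaded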
-
registerFunction
public void registerFunction(String name, org.apache.flink.table.functions.ScalarFunction function)
Description copied from interface: TableEnvironment. Registers a ScalarFunction under a unique name. Replaces already existing user-defined functions under this name.
- Specified by: registerFunction in interface TableEnvironment
-
createTemporarySystemFunction
public void createTemporarySystemFunction(String name, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a temporary system function.
Compared to TableEnvironment.createTemporaryFunction(String, Class), system functions are identified by a global name that is independent of the current catalog and current database. Thus, this method allows to extend the set of built-in system functions like TRIM, ABS, etc.
Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary system function.
- Specified by: createTemporarySystemFunction in interface TableEnvironment
- Parameters:
name - The name under which the function will be registered globally.
functionClass - The function class containing the implementation.
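For example, a minimal sketch of registering and using a scalar function globally (not from the original Javadoc; the function, the registered name, and the Orders table are illustrative):
public static class UpperCaseFunction extends ScalarFunction {
    public String eval(String s) {
        return s == null ? null : s.toUpperCase();
    }
}

tEnv.createTemporarySystemFunction("MY_UPPER", UpperCaseFunction.class);
tEnv.executeSql("SELECT MY_UPPER(name) FROM Orders").print();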
-
createTemporarySystemFunction
public void createTemporarySystemFunction(String name, org.apache.flink.table.functions.UserDefinedFunction functionInstance)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction instance as a temporary system function.
Compared to TableEnvironment.createTemporarySystemFunction(String, Class), this method takes a function instance that might have been parameterized before (e.g. through its constructor). This might be useful for more interactive sessions. Make sure that the instance is Serializable.
Compared to TableEnvironment.createTemporaryFunction(String, UserDefinedFunction), system functions are identified by a global name that is independent of the current catalog and current database. Thus, this method allows to extend the set of built-in system functions like TRIM, ABS, etc.
Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary system function.
- Specified by: createTemporarySystemFunction in interface TableEnvironment
- Parameters:
name - The name under which the function will be registered globally.
functionInstance - The (possibly pre-configured) function instance containing the implementation.
-
createTemporarySystemFunction
public void createTemporarySystemFunction(String name, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a temporary system function by the specific class name and user defined resource URI.
Compared to TableEnvironment.createTemporaryFunction(String, Class), this method allows registering a user defined function by only providing a full path class name and a list of resources that contain the implementation of the function along with its dependencies. Users don't need to initialize the function instance in advance. The resource file can be a local or remote JAR file.
Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary system function.
- Specified by: createTemporarySystemFunction in interface TableEnvironment
- Parameters:
name - The name under which the function will be registered globally.
className - The class name of the UDF to be registered.
resourceUris - The list of UDF resource URIs, local or remote.
-
dropTemporarySystemFunction
public boolean dropTemporarySystemFunction(String name)
Description copied from interface: TableEnvironment. Drops a temporary system function registered under the given name.
If a permanent function with the given name exists, it will be used from now on for any queries that reference this name.
- Specified by: dropTemporarySystemFunction in interface TableEnvironment
- Parameters:
name - The name under which the function has been registered globally.
- Returns:
- true if a function existed under the given name and was removed
-
createFunction
public void createFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a catalog function in the given path.
Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
There must not be another function (temporary or permanent) registered under the same path.
- Specified by: createFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
functionClass - The function class containing the implementation.
-
createFunction
public void createFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass, boolean ignoreIfExists)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a catalog function in the given path.
Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
- Specified by: createFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
functionClass - The function class containing the implementation.
ignoreIfExists - If a function exists under the given path and this flag is set, no operation is executed. An exception is thrown otherwise.
-
createFunction
public void createFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a catalog function in the given path by the specific class name and user defined resource URI.
Compared to TableEnvironment.createFunction(String, Class), this method allows registering a user defined function by only providing a full path class name and a list of resources that contain the implementation of the function along with its dependencies. Users don't need to initialize the function instance in advance. The resource file can be a local or remote JAR file.
Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
There must not be another function (temporary or permanent) registered under the same path.
- Specified by: createFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
className - The class name of the UDF to be registered.
resourceUris - The list of UDF resource URIs, local or remote.
-
createFunction
public void createFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris, boolean ignoreIfExists)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a catalog function in the given path by the specific class name and user defined resource URI.
Compared to TableEnvironment.createFunction(String, Class), this method allows registering a user defined function by only providing a full path class name and a list of resources that contain the implementation of the function along with its dependencies. Users don't need to initialize the function instance in advance. The resource file can be a local or remote JAR file.
Compared to system functions with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
There must not be another function (temporary or permanent) registered under the same path.
- Specified by: createFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
className - The class name of the UDF to be registered.
resourceUris - The list of UDF resource URIs, local or remote.
ignoreIfExists - If a function exists under the given path and this flag is set, no operation is executed. An exception is thrown otherwise.
-
dropFunction
public boolean dropFunction(String path)
Description copied from interface: TableEnvironment. Drops a catalog function registered in the given path.
- Specified by: dropFunction in interface TableEnvironment
- Parameters:
path - The path under which the function has been registered. See also the TableEnvironment class description for the format of the path.
- Returns:
- true if a function existed in the given path and was removed
-
createTemporaryFunction
public void createTemporaryFunction(String path, Class<? extends org.apache.flink.table.functions.UserDefinedFunction> functionClass)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a temporary catalog function.
Compared to TableEnvironment.createTemporarySystemFunction(String, Class) with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary function.
- Specified by: createTemporaryFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
functionClass - The function class containing the implementation.
-
createTemporaryFunction
public void createTemporaryFunction(String path, org.apache.flink.table.functions.UserDefinedFunction functionInstance)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction instance as a temporary catalog function.
Compared to TableEnvironment.createTemporaryFunction(String, Class), this method takes a function instance that might have been parameterized before (e.g. through its constructor). This might be useful for more interactive sessions. Make sure that the instance is Serializable.
Compared to TableEnvironment.createTemporarySystemFunction(String, UserDefinedFunction) with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary function.
- Specified by: createTemporaryFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
functionInstance - The (possibly pre-configured) function instance containing the implementation.
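A hedged sketch of registering a pre-configured instance (not from the original Javadoc; ScaleFunction is a hypothetical ScalarFunction with a constructor argument, and the path is illustrative):
tEnv.createTemporaryFunction("my_catalog.my_db.scale_by", new ScaleFunction(10));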
-
createTemporaryFunction
public void createTemporaryFunction(String path, String className, List<org.apache.flink.table.resource.ResourceUri> resourceUris)
Description copied from interface: TableEnvironment. Registers a UserDefinedFunction class as a temporary catalog function in the given path by the specific class name and user defined resource URI.
Compared to TableEnvironment.createTemporaryFunction(String, Class), this method allows registering a user defined function by only providing a full path class name and a list of resources that contain the implementation of the function along with its dependencies. Users don't need to initialize the function instance in advance. The resource file can be a local or remote JAR file.
Compared to TableEnvironment.createTemporarySystemFunction(String, String, List) with a globally defined name, catalog functions are always (implicitly or explicitly) identified by a catalog and database.
Temporary functions can shadow permanent ones. If a permanent function under a given name exists, it will be inaccessible in the current session. To make the permanent function available again one can drop the corresponding temporary function.
- Specified by: createTemporaryFunction in interface TableEnvironment
- Parameters:
path - The path under which the function will be registered. See also the TableEnvironment class description for the format of the path.
className - The class name of the UDF to be registered.
resourceUris - The list of UDF resource URIs, local or remote.
-
dropTemporaryFunction
public boolean dropTemporaryFunction(String path)
Description copied from interface: TableEnvironment. Drops a temporary catalog function registered in the given path.
If a permanent function with the given path exists, it will be used from now on for any queries that reference this path.
- Specified by: dropTemporaryFunction in interface TableEnvironment
- Parameters:
path - The path under which the function has been registered. See also the TableEnvironment class description for the format of the path.
- Returns:
- true if a function existed in the given path and was removed
-
createTemporaryTable
public void createTemporaryTable(String path, TableDescriptor descriptor)
Description copied from interface: TableEnvironment. Registers the given TableDescriptor as a temporary catalog table.
The descriptor is converted into a CatalogTable and stored in the catalog.
Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.
Examples:
tEnv.createTemporaryTable("MyTable", TableDescriptor.forConnector("datagen")
    .schema(Schema.newBuilder()
        .column("f0", DataTypes.STRING())
        .build())
    .option(DataGenOptions.ROWS_PER_SECOND, 10)
    .option("fields.f0.kind", "random")
    .build());
- Specified by: createTemporaryTable in interface TableEnvironment
- Parameters:
path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
descriptor - Template for creating a CatalogTable instance.
-
createTemporaryTable
public void createTemporaryTable(String path, TableDescriptor descriptor, boolean ignoreIfExists)
Description copied from interface: TableEnvironment. Registers the given TableDescriptor as a temporary catalog table.
The descriptor is converted into a CatalogTable and stored in the catalog.
Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.
Examples:
tEnv.createTemporaryTable("MyTable", TableDescriptor.forConnector("datagen")
    .schema(Schema.newBuilder()
        .column("f0", DataTypes.STRING())
        .build())
    .option(DataGenOptions.ROWS_PER_SECOND, 10)
    .option("fields.f0.kind", "random")
    .build(), true);
- Specified by: createTemporaryTable in interface TableEnvironment
- Parameters:
path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
descriptor - Template for creating a CatalogTable instance.
ignoreIfExists - If a table exists under the given path and this flag is set, no operation is executed. An exception is thrown otherwise.
-
createTable
public void createTable(String path, TableDescriptor descriptor)
Description copied from interface: TableEnvironment. Registers the given TableDescriptor as a catalog table.
The descriptor is converted into a CatalogTable and stored in the catalog.
If the table should not be permanently stored in a catalog, use TableEnvironment.createTemporaryTable(String, TableDescriptor) instead.
Examples:
tEnv.createTable("MyTable", TableDescriptor.forConnector("datagen")
    .schema(Schema.newBuilder()
        .column("f0", DataTypes.STRING())
        .build())
    .option(DataGenOptions.ROWS_PER_SECOND, 10)
    .option("fields.f0.kind", "random")
    .build());
- Specified by: createTable in interface TableEnvironment
- Parameters:
path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
descriptor - Template for creating a CatalogTable instance.
-
createTable
public boolean createTable(String path, TableDescriptor descriptor, boolean ignoreIfExists)
Description copied from interface: TableEnvironment. Registers the given TableDescriptor as a catalog table.
The descriptor is converted into a CatalogTable and stored in the catalog.
If the table should not be permanently stored in a catalog, use TableEnvironment.createTemporaryTable(String, TableDescriptor, boolean) instead.
Examples:
tEnv.createTable("MyTable", TableDescriptor.forConnector("datagen")
    .schema(Schema.newBuilder()
        .column("f0", DataTypes.STRING())
        .build())
    .option(DataGenOptions.ROWS_PER_SECOND, 10)
    .option("fields.f0.kind", "random")
    .build(), true);
- Specified by: createTable in interface TableEnvironment
- Parameters:
path - The path under which the table will be registered. See also the TableEnvironment class description for the format of the path.
descriptor - Template for creating a CatalogTable instance.
ignoreIfExists - If a table exists under the given path and this flag is set, no operation is executed. An exception is thrown otherwise.
- Returns:
- true if table was created in the given path, false if a permanent object already exists in the given path.
-
registerTable
public void registerTable(String name, Table table)
Description copied from interface: TableEnvironment. Registers a Table under a unique name in the TableEnvironment's catalog. Registered tables can be referenced in SQL queries.
Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.
- Specified by: registerTable in interface TableEnvironment
- Parameters:
name - The name under which the table will be registered.
table - The table to register.
-
createTemporaryView
public void createTemporaryView(String path, Table view)
Description copied from interface: TableEnvironment. Registers a Table API object as a temporary view similar to SQL temporary views.
Temporary objects can shadow permanent ones. If a permanent object in a given path exists, it will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.
- Specified by: createTemporaryView in interface TableEnvironment
- Parameters:
path - The path under which the view will be registered. See also the TableEnvironment class description for the format of the path.
view - The view to register.
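For example (a sketch not taken from the original Javadoc; the Orders table and the amount column are illustrative):
// import static org.apache.flink.table.api.Expressions.$;
Table bigOrders = tEnv.from("Orders").filter($("amount").isGreater(100));
tEnv.createTemporaryView("BigOrders", bigOrders);
tEnv.executeSql("SELECT * FROM BigOrders").print();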
-
createView
public void createView(String path, Table view)
Description copied from interface: TableEnvironment. Registers a Table API object as a view similar to SQL views.
Temporary objects can shadow permanent ones. If a temporary object in a given path exists, the permanent one will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.
- Specified by: createView in interface TableEnvironment
- Parameters:
path - The path under which the view will be registered. See also the TableEnvironment class description for the format of the path.
view - The view to register.
-
createView
public boolean createView(String path, Table view, boolean ignoreIfExists)
Description copied from interface: TableEnvironment. Registers a Table API object as a view similar to SQL views.
Temporary objects can shadow permanent ones. If a temporary object in a given path exists, the permanent one will be inaccessible in the current session. To make the permanent object available again one can drop the corresponding temporary object.
- Specified by: createView in interface TableEnvironment
- Parameters:
path - The path under which the view will be registered. See also the TableEnvironment class description for the format of the path.
view - The view to register.
ignoreIfExists - If a view or a table exists and the given flag is set, no operation is executed. An exception is thrown otherwise.
- Returns:
- true if view was created in the given path, false if a permanent object already exists in the given path.
-
scan
public Table scan(String... tablePath)
Description copied from interface: TableEnvironment. Scans a registered table and returns the resulting Table.
A table to scan must be registered in the TableEnvironment. It can be either directly registered or be an external member of a Catalog.
See the documentation of TableEnvironment.useDatabase(String) or TableEnvironment.useCatalog(String) for the rules on the path resolution.
Examples:
Scanning a directly registered table:
Table tab = tableEnv.scan("tableName");
Scanning a table from a registered catalog:
Table tab = tableEnv.scan("catalogName", "dbName", "tableName");
- Specified by: scan in interface TableEnvironment
- Parameters:
tablePath - The path of the table to scan.
- Returns:
- The resulting Table.
- See Also:
TableEnvironment.useCatalog(String), TableEnvironment.useDatabase(String)
-
from
public Table from(String path)
Description copied from interface: TableEnvironment. Reads a registered table and returns the resulting Table.
A table to scan must be registered in the TableEnvironment.
See the documentation of TableEnvironment.useDatabase(String) or TableEnvironment.useCatalog(String) for the rules on the path resolution.
Examples:
Reading a table from the default catalog and database:
Table tab = tableEnv.from("tableName");
Reading a table from a registered catalog:
Table tab = tableEnv.from("catalogName.dbName.tableName");
Reading a table from a registered catalog with escaping. Dots in e.g. a database name must be escaped:
Table tab = tableEnv.from("catalogName.`db.Name`.Table");
Note that the returned Table is an API object and only contains a pipeline description. It actually corresponds to a view in SQL terms. Call Executable.execute() to trigger an execution.
- Specified by: from in interface TableEnvironment
- Parameters:
path - The path of a table API object to scan.
- Returns:
- The Table object describing the pipeline for further transformations.
- See Also:
TableEnvironment.useCatalog(String), TableEnvironment.useDatabase(String)
-
from
public Table from(TableDescriptor descriptor)
Description copied from interface: TableEnvironment. Returns a Table backed by the given descriptor.
The descriptor won't be registered in the catalog, but it will be propagated directly in the operation tree. Note that calling this method multiple times, even with the same descriptor, results in multiple temporary tables. In such cases, it is recommended to register it under a name using TableEnvironment.createTemporaryTable(String, TableDescriptor) and reference it via TableEnvironment.from(String).
Examples:
Table table = tEnv.from(TableDescriptor.forConnector("datagen")
    .schema(Schema.newBuilder()
        .column("f0", DataTypes.STRING())
        .build())
    .build());
Note that the returned Table is an API object and only contains a pipeline description. It actually corresponds to a view in SQL terms. Call Executable.execute() to trigger an execution.
- Specified by: from in interface TableEnvironment
- Returns:
- The Table object describing the pipeline for further transformations.
-
listCatalogs
public String[] listCatalogs()
Description copied from interface: TableEnvironment. Gets the names of all catalogs registered in this environment.
- Specified by: listCatalogs in interface TableEnvironment
- Returns:
- A list of the names of all registered catalogs.
-
listModules
public String[] listModules()
Description copied from interface: TableEnvironment. Gets an array of names of all used modules in this environment in resolution order.
- Specified by: listModules in interface TableEnvironment
- Returns:
- A list of the names of used modules in resolution order.
-
listFullModules
public ModuleEntry[] listFullModules()
Description copied from interface: TableEnvironment. Gets an array of all loaded modules with use status in this environment. Used modules are kept in resolution order.
- Specified by: listFullModules in interface TableEnvironment
- Returns:
- A list of name and use status entries of all loaded modules.
-
listDatabases
public String[] listDatabases()
Description copied from interface: TableEnvironment. Gets the names of all databases registered in the current catalog.
- Specified by: listDatabases in interface TableEnvironment
- Returns:
- A list of the names of all registered databases in the current catalog.
-
listTables
public String[] listTables()
Description copied from interface: TableEnvironment. Gets the names of all tables available in the current namespace (the current database of the current catalog). It returns both temporary and permanent tables and views.
- Specified by: listTables in interface TableEnvironment
- Returns:
- A list of the names of all registered tables in the current database of the current catalog.
- See Also:
TableEnvironment.listTemporaryTables(), TableEnvironment.listTemporaryViews()
-
listTables
public String[] listTables(String catalog, String databaseName)
Description copied from interface: TableEnvironment. Gets the names of all tables available in the given namespace (the given database of the given catalog). It returns both temporary and permanent tables and views.
- Specified by: listTables in interface TableEnvironment
- Returns:
- A list of the names of all registered tables in the given database of the given catalog.
- See Also:
TableEnvironment.listTemporaryTables(), TableEnvironment.listTemporaryViews()
-
listViews
public String[] listViews()
Description copied from interface: TableEnvironment. Gets the names of all views available in the current namespace (the current database of the current catalog). It returns both temporary and permanent views.
- Specified by: listViews in interface TableEnvironment
- Returns:
- A list of the names of all registered views in the current database of the current catalog.
- See Also:
TableEnvironment.listTemporaryViews()
-
listTemporaryTables
public String[] listTemporaryTables()
Description copied from interface: TableEnvironment. Gets the names of all temporary tables and views available in the current namespace (the current database of the current catalog).
- Specified by: listTemporaryTables in interface TableEnvironment
- Returns:
- A list of the names of all registered temporary tables and views in the current database of the current catalog.
- See Also:
TableEnvironment.listTables()
-
listTemporaryViews
public String[] listTemporaryViews()
Description copied from interface: TableEnvironment. Gets the names of all temporary views available in the current namespace (the current database of the current catalog).
- Specified by: listTemporaryViews in interface TableEnvironment
- Returns:
- A list of the names of all registered temporary views in the current database of the current catalog.
- See Also:
TableEnvironment.listTables()
-
dropTemporaryTable
public boolean dropTemporaryTable(String path)
Description copied from interface: TableEnvironment. Drops a temporary table registered in the given path.
If a permanent table with a given path exists, it will be used from now on for any queries that reference this path.
- Specified by: dropTemporaryTable in interface TableEnvironment
- Parameters:
path - The given path under which the temporary table will be dropped. See also the TableEnvironment class description for the format of the path.
- Returns:
- true if a table existed in the given path and was removed
-
dropTable
public boolean dropTable(String path)
Description copied from interface: TableEnvironment. Drops a table registered in the given path.
This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using TableEnvironment.dropTemporaryTable(java.lang.String).
Compared to SQL, this method will not throw an error if the table does not exist. Use TableEnvironment.dropTable(java.lang.String, boolean) to change the default behavior.
- Specified by: dropTable in interface TableEnvironment
- Parameters:
path - The given path under which the table will be dropped. See also the TableEnvironment class description for the format of the path.
- Returns:
- true if table existed in the given path and was dropped, false if table didn't exist in the given path.
-
dropTable
public boolean dropTable(String path, boolean ignoreIfNotExists)
Description copied from interface: TableEnvironment. Drops a table registered in the given path.
This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using TableEnvironment.dropTemporaryTable(java.lang.String).
- Specified by: dropTable in interface TableEnvironment
- Parameters:
path - The given path under which the given table will be dropped. See also the TableEnvironment class description for the format of the path.
ignoreIfNotExists - If false, an exception will be thrown if the table to drop does not exist.
- Returns:
- true if table existed in the given path and was dropped, false if table didn't exist in the given path.
-
dropTemporaryView
public boolean dropTemporaryView(String path)
Description copied from interface: TableEnvironment. Drops a temporary view registered in the given path.
If a permanent table or view with a given path exists, it will be used from now on for any queries that reference this path.
- Specified by: dropTemporaryView in interface TableEnvironment
- Parameters:
path - The given path under which the temporary view will be dropped. See also the TableEnvironment class description for the format of the path.
- Returns:
- true if a view existed in the given path and was removed
-
dropView
public boolean dropView(String path)
Description copied from interface: TableEnvironment. Drops a view registered in the given path.
This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using TableEnvironment.dropTemporaryView(java.lang.String).
Compared to SQL, this method will not throw an error if the view does not exist. Use TableEnvironment.dropView(java.lang.String, boolean) to change the default behavior.
- Specified by: dropView in interface TableEnvironment
- Parameters:
path - The given path under which the view will be dropped. See also the TableEnvironment class description for the format of the path.
- Returns:
- true if view existed in the given path and was dropped, false if view didn't exist in the given path.
-
dropView
public boolean dropView(String path, boolean ignoreIfNotExists)
Description copied from interface: TableEnvironment. Drops a view registered in the given path.
This method can only drop permanent objects. Temporary objects can shadow permanent ones. If a temporary object exists in a given path, make sure to drop the temporary object first using TableEnvironment.dropTemporaryView(java.lang.String).
- Specified by: dropView in interface TableEnvironment
- Parameters:
path - The given path under which the view will be dropped. See also the TableEnvironment class description for the format of the path.
ignoreIfNotExists - If false, an exception will be thrown if the view to drop does not exist.
- Returns:
- true if view existed in the given path and was dropped, false if view didn't exist in the given path and ignoreIfNotExists was true.
-
listUserDefinedFunctions
public String[] listUserDefinedFunctions()
Description copied from interface: TableEnvironment. Gets the names of all user defined functions registered in this environment.
- Specified by: listUserDefinedFunctions in interface TableEnvironment
-
listFunctions
public String[] listFunctions()
Description copied from interface: TableEnvironment. Gets the names of all functions in this environment.
- Specified by: listFunctions in interface TableEnvironment
-
explainSql
public String explainSql(String statement, ExplainFormat format, ExplainDetail... extraDetails)
Description copied from interface: TableEnvironment. Returns the AST of the specified statement and the execution plan to compute the result of the given statement.
- Specified by: explainSql in interface TableEnvironment
- Parameters:
statement - The statement for which the AST and execution plan will be returned.
format - The output format of the explained plan.
extraDetails - The extra explain details which the explain result should include, e.g. estimated cost, changelog mode for streaming, displaying the execution plan in JSON format.
- Returns:
- AST and the execution plan.
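A hedged example of requesting extra details (assuming tEnv and a registered table named orders; the table name is hypothetical):
    String plan = tEnv.explainSql(
            "SELECT id, SUM(amount) FROM orders GROUP BY id",
            ExplainFormat.TEXT,
            ExplainDetail.ESTIMATED_COST,
            ExplainDetail.CHANGELOG_MODE);
    System.out.println(plan);  // prints the AST followed by the optimized execution plan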
-
explainInternal
public String explainInternal(List<Operation> operations, ExplainFormat format, ExplainDetail... extraDetails)
Description copied from interface:TableEnvironmentInternalReturns the AST of this table and the execution plan to compute the result of this table.- Specified by:
explainInternalin interfaceTableEnvironmentInternal- Parameters:
operations- The operations to be explained.format- The output format.extraDetails- The extra explain details which the explain result should include, e.g. estimated cost, changelog mode for streaming- Returns:
- AST and the execution plan.
-
getCompletionHints
public String[] getCompletionHints(String statement, int position)
Description copied from interface:TableEnvironmentReturns completion hints for the given statement at the given cursor position. The completion happens case insensitively.- Specified by:
getCompletionHintsin interfaceTableEnvironment- Parameters:
statement- Partial or slightly incorrect SQL statement. position- Cursor position.- Returns:
- completion hints that fit at the current cursor position
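A minimal sketch (the partial statement is hypothetical; the hints returned depend on the registered catalog objects):
    String partial = "SELECT * FROM Ord";
    // ask for completions at the current end of the statement
    String[] hints = tEnv.getCompletionHints(partial, partial.length());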
-
sqlQuery
public Table sqlQuery(String query)
Description copied from interface: TableEnvironment. Evaluates a SQL query on registered tables and returns a Table object describing the pipeline for further transformations. All tables and other objects referenced by the query must be registered in the
TableEnvironment. For example, use TableEnvironment.createTemporaryView(String, Table) for referencing a Table object or TableEnvironment.createTemporarySystemFunction(String, Class) for functions. Alternatively, a
Table object is automatically registered when its Table#toString() method is called, for example when it is embedded into a string. Hence, SQL queries can directly reference a Table object inline (i.e. anonymous) as follows:
    Table table = ...;
    String tableName = table.toString();
    // the table is not registered to the table environment
    tEnv.sqlQuery("SELECT * FROM " + tableName + " WHERE a > 12");
Note that the returned
Table is an API object and only contains a pipeline description. It actually corresponds to a view in SQL terms. Call Executable.execute() to trigger an execution or use TableEnvironment.executeSql(String) directly.- Specified by:
sqlQueryin interfaceTableEnvironment- Parameters:
query- The SQL query to evaluate.- Returns:
- The
Tableobject describing the pipeline for further transformations.
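A short end-to-end sketch (assuming tEnv; the table name and datagen connector options are illustrative):
    tEnv.executeSql(
            "CREATE TEMPORARY TABLE orders (id BIGINT, amount DOUBLE) "
                    + "WITH ('connector' = 'datagen', 'number-of-rows' = '10')");
    Table bigOrders = tEnv.sqlQuery("SELECT id, amount FROM orders WHERE amount > 100");
    bigOrders.execute().print();  // only now is a Flink job actually executed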
-
executeSql
public TableResult executeSql(String statement)
Description copied from interface: TableEnvironment. Executes the given single statement and returns the execution result. The statement can be DDL/DML/DQL/SHOW/DESCRIBE/EXPLAIN/USE. For DML and DQL, this method returns
TableResult once the job has been submitted. For DDL and DCL statements, TableResult is returned once the operation has finished. If multiple pipelines should insert data into one or more sink tables as part of a single execution, use a
StatementSet (see TableEnvironment.createStatementSet()). By default, all DML operations are executed asynchronously. Use
TableResult.await() or TableResult.getJobClient() to monitor the execution. Set TableConfigOptions.TABLE_DML_SYNC for always synchronous execution.- Specified by:
executeSqlin interfaceTableEnvironment- Returns:
- content for DQL/SHOW/DESCRIBE/EXPLAIN, the affected row count for `DML` (-1 means unknown), or a string message ("OK") for other statements.
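A hedged sketch of the different result behaviors (assuming tEnv, the hypothetical orders table from the sqlQuery example above, and an enclosing method that declares throws Exception for await()):
    // DDL: the result is returned once the catalog operation has finished
    tEnv.executeSql(
            "CREATE TEMPORARY TABLE sink_table (id BIGINT) WITH ('connector' = 'blackhole')");
    // DML: the result is returned once the job is submitted; await() blocks until it finishes
    TableResult result = tEnv.executeSql("INSERT INTO sink_table SELECT id FROM orders");
    result.await();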
-
executeCachedPlanInternal
public TableResultInternal executeCachedPlanInternal(CachedPlan cachedPlan)
Description copied from interface:TableEnvironmentInternalExecute the givenCachedPlanand return the execution result.- Specified by:
executeCachedPlanInternalin interfaceTableEnvironmentInternal- Parameters:
cachedPlan- The CachedPlan to be executed.- Returns:
- the content of the execution result.
-
createStatementSet
public StatementSet createStatementSet()
Description copied from interface:TableEnvironmentReturns aStatementSetthat accepts pipelines defined by DML statements orTableobjects. The planner can optimize all added statements together and then submit them as one job.- Specified by:
createStatementSetin interfaceTableEnvironment
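A minimal sketch of grouping several INSERTs into one job (assuming tEnv; sink_a, sink_b, and orders are hypothetical tables):
    StatementSet set = tEnv.createStatementSet();
    set.addInsertSql("INSERT INTO sink_a SELECT id FROM orders");
    set.addInsertSql("INSERT INTO sink_b SELECT amount FROM orders");
    set.execute();  // the planner optimizes both pipelines together and submits a single job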
-
loadPlan
public CompiledPlan loadPlan(PlanReference planReference)
Description copied from interface: TableEnvironment. Loads a plan from a PlanReference into a CompiledPlan. Compiled plans can be persisted and reloaded across Flink versions. They describe static pipelines to ensure backwards compatibility and enable stateful streaming job upgrades. See
CompiledPlan and the website documentation for more information. This method will parse the input reference and will validate the plan. The returned instance can be executed via
Executable.execute(). Note: The compiled plan feature is not supported in batch mode.
- Specified by:
loadPlanin interfaceTableEnvironment
-
compilePlanSql
public CompiledPlan compilePlanSql(String stmt)
Description copied from interface: TableEnvironment. Compiles a SQL DML statement into a CompiledPlan. Compiled plans can be persisted and reloaded across Flink versions. They describe static pipelines to ensure backwards compatibility and enable stateful streaming job upgrades. See
CompiledPlan and the website documentation for more information. Note: Only
INSERT INTO is supported at the moment. Note: The compiled plan feature is not supported in batch mode.
- Specified by:
compilePlanSqlin interfaceTableEnvironment- See Also:
Executable.execute(),TableEnvironment.loadPlan(PlanReference)
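A hedged round-trip sketch combining compilePlanSql, CompiledPlan persistence, and loadPlan (assuming tEnv in streaming mode; table names and the file path are hypothetical):
    CompiledPlan plan = tEnv.compilePlanSql("INSERT INTO sink_table SELECT id FROM orders");
    plan.writeToFile("/tmp/orders-plan.json");
    // later, possibly after a Flink upgrade, restore and execute the persisted plan
    CompiledPlan restored = tEnv.loadPlan(PlanReference.fromFile("/tmp/orders-plan.json"));
    restored.execute();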
-
executePlan
public TableResultInternal executePlan(InternalPlan plan)
- Specified by:
executePlanin interfaceTableEnvironmentInternal
-
compilePlan
public CompiledPlan compilePlan(List<ModifyOperation> operations)
- Specified by:
compilePlanin interfaceTableEnvironmentInternal
-
executeInternal
public TableResultInternal executeInternal(List<ModifyOperation> operations)
Description copied from interface:TableEnvironmentInternalExecute the given modify operations and return the execution result.- Specified by:
executeInternalin interfaceTableEnvironmentInternal- Parameters:
operations- The operations to be executed.- Returns:
- the affected row counts (-1 means unknown).
-
executeInternal
public TableResultInternal executeInternal(Operation operation)
Description copied from interface:TableEnvironmentInternalExecute the given operation and return the execution result.- Specified by:
executeInternalin interfaceTableEnvironmentInternal- Parameters:
operation- The operation to be executed.- Returns:
- the content of the execution result.
-
generatePipelineFromQueryOperation
@VisibleForTesting public org.apache.flink.api.dag.Pipeline generatePipelineFromQueryOperation(QueryOperation operation, List<org.apache.flink.api.dag.Transformation<?>> transformations)
Generates an execution Pipeline from a QueryOperation.
-
getCurrentCatalog
public String getCurrentCatalog()
Description copied from interface:TableEnvironmentGets the current default catalog name of the current session.- Specified by:
getCurrentCatalogin interfaceTableEnvironment- Returns:
- The current default catalog name that is used for the path resolution.
- See Also:
TableEnvironment.useCatalog(String)
-
useCatalog
public void useCatalog(String catalogName)
Description copied from interface: TableEnvironment. Sets the current catalog to the given value. It also sets the default database to the catalog's default one. See also TableEnvironment.useDatabase(String). This is used during the resolution of object paths. Both the catalog and database are optional when referencing catalog objects such as tables, views etc. The algorithm looks for requested objects in the following paths in that order:
- [current-catalog].[current-database].[requested-path]
- [current-catalog].[requested-path]
- [requested-path]
Example:
Given a structure with the default catalog set to default_catalog and the default database set to default_database:
    root:
      |- default_catalog
          |- default_database
              |- tab1
          |- db1
              |- tab1
      |- cat1
          |- db1
              |- tab1
The following table describes resolved paths:
    Requested path    Resolved path
    tab1              default_catalog.default_database.tab1
    db1.tab1          default_catalog.db1.tab1
    cat1.db1.tab1     cat1.db1.tab1
You can unset the current catalog by passing a null value. If the current catalog is unset, you need to use fully qualified identifiers.
- Specified by:
useCatalogin interfaceTableEnvironment- Parameters:
catalogName- The name of the catalog to set as the current default catalog.- See Also:
TableEnvironment.useDatabase(String)
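A minimal sketch following the resolution example above (assuming tEnv and a registered catalog cat1 containing db1.tab1, mirroring the hypothetical structure shown above):
    tEnv.useCatalog("cat1");
    String current = tEnv.getCurrentCatalog();  // "cat1"
    // "db1.tab1" now resolves to cat1.db1.tab1
    Table t = tEnv.from("db1.tab1");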
-
getCurrentDatabase
public String getCurrentDatabase()
Description copied from interface:TableEnvironmentGets the current default database name of the running session.- Specified by:
getCurrentDatabasein interfaceTableEnvironment- Returns:
- The name of the current database of the current catalog.
- See Also:
TableEnvironment.useDatabase(String)
-
useDatabase
public void useDatabase(String databaseName)
Description copied from interface: TableEnvironment. Sets the current default database. It has to exist in the current catalog. That path will be used as the default one when looking for unqualified object names. This is used during the resolution of object paths. Both the catalog and database are optional when referencing catalog objects such as tables, views etc. The algorithm looks for requested objects in the following paths in that order:
- [current-catalog].[current-database].[requested-path]
- [current-catalog].[requested-path]
- [requested-path]
Example:
Given a structure with the default catalog set to default_catalog and the default database set to default_database:
    root:
      |- default_catalog
          |- default_database
              |- tab1
          |- db1
              |- tab1
      |- cat1
          |- db1
              |- tab1
The following table describes resolved paths:
    Requested path    Resolved path
    tab1              default_catalog.default_database.tab1
    db1.tab1          default_catalog.db1.tab1
    cat1.db1.tab1     cat1.db1.tab1
You can unset the current database by passing a null value. If the current database is unset, you need to qualify identifiers at least with the database name.
- Specified by:
useDatabasein interfaceTableEnvironment- Parameters:
databaseName- The name of the database to set as the current database.- See Also:
TableEnvironment.useCatalog(String)
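A minimal sketch (assuming tEnv and that the current catalog contains a database db1 with a table tab1, as in the structure above):
    tEnv.useDatabase("db1");
    String current = tEnv.getCurrentDatabase();  // "db1"
    // the unqualified name "tab1" now resolves to <current-catalog>.db1.tab1
    Table t = tEnv.from("tab1");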
-
getConfig
public TableConfig getConfig()
Description copied from interface:TableEnvironmentReturns the table config that defines the runtime behavior of the Table API.- Specified by:
getConfigin interfaceTableEnvironment
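A short sketch of adjusting runtime behavior through the returned TableConfig (the option values are illustrative):
    // make INSERT INTO statements execute synchronously, see executeSql above
    tEnv.getConfig().set(TableConfigOptions.TABLE_DML_SYNC, true);
    // string-based option keys are supported as well
    tEnv.getConfig().set("table.exec.resource.default-parallelism", "4");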
-
getParser
public Parser getParser()
Description copied from interface:TableEnvironmentInternalReturn aParserthat provides methods for parsing a SQL string.- Specified by:
getParserin interfaceTableEnvironmentInternal- Returns:
- initialized
Parser.
-
getCatalogManager
public CatalogManager getCatalogManager()
Description copied from interface:TableEnvironmentInternalReturns aCatalogManagerthat deals with all catalog objects.- Specified by:
getCatalogManagerin interfaceTableEnvironmentInternal
-
getOperationTreeBuilder
public OperationTreeBuilder getOperationTreeBuilder()
Description copied from interface:TableEnvironmentInternalReturns aOperationTreeBuilderthat can createQueryOperations.- Specified by:
getOperationTreeBuilderin interfaceTableEnvironmentInternal
-
qualifyQueryOperation
protected QueryOperation qualifyQueryOperation(org.apache.flink.table.catalog.ObjectIdentifier identifier, QueryOperation queryOperation)
Subclasses can override this method to transform the given QueryOperation into a new one with the qualified object identifier. This is needed for some QueryOperations, e.g. JavaDataStreamQueryOperation, which does not know the registered identifier when created via fromDataStream(DataStream), but the identifier is required when converting this QueryOperation to a RelNode.
-
translate
protected List<org.apache.flink.api.dag.Transformation<?>> translate(List<ModifyOperation> modifyOperations)
-
createTable
@VisibleForTesting public TableImpl createTable(QueryOperation tableOperation)
-
explainPlan
public String explainPlan(InternalPlan compiledPlan, ExplainDetail... extraDetails)
- Specified by:
explainPlanin interfaceTableEnvironmentInternal
-
-