Flink SQL DDL errors from missing dependencies: Could not find a suitable table factory

Use the correct planner

Exception in thread "main" org.apache.flink.table.api.TableException: Could not instantiate the executor. Make sure a planner module is on the classpath
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.lookupExecutor(StreamTableEnvironmentImpl.java:176)
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:138)
at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:113)
at com.cbry.flinksql.CreateDDLConsumer.main(CreateDDLConsumer.java:16)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.delegation.ExecutorFactory' in the classpath.

Reason: No factory supports the additional filters.

The following properties are requested:
class-name=org.apache.flink.table.planner.delegation.BlinkExecutorFactory
streaming-mode=true

The following factories have been considered:
org.apache.flink.table.executor.StreamExecutorFactory
at org.apache.flink.table.factories.ComponentFactoryService.find(ComponentFactoryService.java:76)
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.lookupExecutor(StreamTableEnvironmentImpl.java:167)
… 3 more

Solution

The pom originally included:

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-table-planner_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
		</dependency>

But the Flink SQL table creation here uses the Blink planner, so what is actually needed is:

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
		</dependency>

Which planner dependency you need depends on the env settings used in the table-creation code:

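For reference, a minimal sketch of the env setup that goes with the Blink planner dependency (assuming Flink 1.11/1.12-era APIs; the variable names are illustrative and not taken from the original CreateDDLConsumer):

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    // Build a table environment on the Blink planner in streaming mode.
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    EnvironmentSettings settings = EnvironmentSettings.newInstance()
            .useBlinkPlanner()   // served by flink-table-planner-blink
            .inStreamingMode()
            .build();
    StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);

If the env were built with useOldPlanner() instead, the original flink-table-planner dependency would be the matching one.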

Next error encountered: the dependency for the format type used in the DDL must also be included.


The DDL uses 'format.type' = 'json'.

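A hedged example of the kind of DDL involved, continuing from the tableEnv above (table name, topic, and broker address are placeholders; depending on the Flink version, additional connector.* options such as group.id may be required):

    // Kafka source table using the legacy 'format.type' = 'json' options (placeholder names).
    tableEnv.executeSql(
        "CREATE TABLE user_events (" +
        "  user_id STRING," +
        "  event_time TIMESTAMP(3)" +
        ") WITH (" +
        "  'connector.type' = 'kafka'," +
        "  'connector.version' = 'universal'," +
        "  'connector.topic' = 'user_events'," +
        "  'connector.properties.bootstrap.servers' = 'localhost:9092'," +
        "  'format.type' = 'json'" +   // this option is what requires flink-json
        ")");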

Add:

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-json</artifactId>
        <version>${flink.version}</version>
    </dependency>

Another error: no executor found to run the application.


java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:88)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1895)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1796)
at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:69)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1782)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1765)
at com.cbry.flinksql.CreateDDLConsumer.main(CreateDDLConsumer.java:39)

Pull in flink-clients, which provides the local executor factory that the service loader is looking for (since Flink 1.11 it is no longer pulled in transitively, so it has to be declared explicitly):

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>

An odd issue encountered

In the development console, after data had been produced, the consumer side never printed the processed data it should have. But once a Kafka consumer on the server consumed the topic, re-running the consumer program from the console did show consumed data.

Summary

    <!-- FlinkSql -->

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-json</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <!-- Flink Connect Kafka -->

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
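Putting it together, a minimal job of the following shape should run locally with the dependencies above. This is a sketch under the same Flink 1.11/1.12 assumption; the class, table, topic, and broker names are placeholders, not the original CreateDDLConsumer:

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
    import org.apache.flink.types.Row;

    // Sketch of a Kafka JSON consumer job wired up end to end.
    public class KafkaJsonDdlDemo {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            EnvironmentSettings settings = EnvironmentSettings.newInstance()
                    .useBlinkPlanner()   // flink-table-planner-blink
                    .inStreamingMode()
                    .build();
            StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);

            // DDL resolved by flink-connector-kafka + flink-json (placeholder names).
            tableEnv.executeSql(
                "CREATE TABLE user_events (" +
                "  user_id STRING," +
                "  event_time TIMESTAMP(3)" +
                ") WITH (" +
                "  'connector.type' = 'kafka'," +
                "  'connector.version' = 'universal'," +
                "  'connector.topic' = 'user_events'," +
                "  'connector.properties.bootstrap.servers' = 'localhost:9092'," +
                "  'format.type' = 'json'" +
                ")");

            // Query the table and print the rows as a DataStream.
            Table result = tableEnv.sqlQuery("SELECT user_id, event_time FROM user_events");
            tableEnv.toAppendStream(result, Row.class).print();

            // execute() looks up a local ExecutorFactory, which ships in flink-clients.
            env.execute("flink-sql-ddl-demo");
        }
    }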