I'm not entirely sure, but it looks like it's related to these two lines:
 List<Row> rowList =  TableUtils.collectToList(table);
 System.out.println(rowList);
These two lines should have already executed the SQL you defined before them. After that, the final
 tableEnv.execute("test");
has no operators left to execute.
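
A minimal sketch of what I mean (only a guess based on your snippet, assuming Flink 1.10.x with the blink planner): drop the trailing execute call, since collectToList already submits the job.

 tableEnv.sqlUpdate("insert into situation.flink_test values (3,'kcz3')");
 Table table = tableEnv.sqlQuery("select * from situation.flink_test");

 // collectToList(table) builds and runs the job for the statements above,
 // so both the INSERT and the SELECT are executed here
 List<Row> rowList = TableUtils.collectToList(table);
 System.out.println(rowList);

 // no tableEnv.execute("test") afterwards: once collectToList has run,
 // there are no unexecuted operators left, which is what the exception complains about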

On Fri, May 29, 2020 at 3:48 PM 了不起的盖茨比 <[email protected]> wrote:

> The code is below. This is the error message: Exception in thread "main"
> java.lang.IllegalStateException: No operators defined in streaming topology.
> Cannot execute
> The insert actually succeeded; I checked in the hive cli, and the select also
> returned data. Why does it still say that no operators are defined? Do I need
> a batch table?
>
>  EnvironmentSettings settings =
>  EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build();
>  TableEnvironment tableEnv = TableEnvironment.create(settings);
>
>  String name            = "myhive";
>  String defaultDatabase = "situation";
>  String hiveConfDir     = "/load/data/hive/hive-conf"; // a local path
>  String version         = "1.2.1";
>  String CATALOG_NAME = "myhive";
>
>  HiveCatalog hiveCatalog = new HiveCatalog(name, defaultDatabase,
> hiveConfDir, version);
>  hiveCatalog.open();
>  tableEnv.registerCatalog(CATALOG_NAME, hiveCatalog);
>
>  Optional<Catalog> myHive = tableEnv.getCatalog(CATALOG_NAME);
>  ObjectPath myTablePath = new ObjectPath("situation", "flink_test");
>  System.out.println(myHive.get().getTable(myTablePath).getSchema());
>
>
>  // load Hive built-in functions
> tableEnv.loadModule("hiveModule",new HiveModule(version));
>
>  tableEnv.useCatalog(CATALOG_NAME);
>
>  tableEnv.sqlUpdate("insert into situation.flink_test values (3,'kcz3')");
>  Table table = tableEnv.sqlQuery(" select * from situation.flink_test");
>  List<Row> rowList =  TableUtils.collectToList(table);
>  System.out.println(rowList);
>
>
>  tableEnv.execute("test");



-- 

Best,
Benchao Li
