Dear All,
When using Flink 1.11.2 with Hive, which Hive versions are supported?
According to the official documentation, versions 1.0, 1.1, 1.2, 2.0, 2.1, 2.2, 2.3, and 3.1 are supported.
My execution environment:
*Flink : 1.11.2*
*Hadoop : 2.6.0-cdh5.8.3*
*Hive : 1.1.0-cdh5.8.3*
*Job deployment mode : on YARN*
Also, here is my demo for reading from and writing to Hive; I am not sure whether it is written correctly:
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public static void main(String[] args) throws Exception {
    // Blink planner in batch mode
    EnvironmentSettings settings = EnvironmentSettings
            .newInstance()
            .useBlinkPlanner()
            .inBatchMode()
            .build();
    TableEnvironment tableEnv = TableEnvironment.create(settings);

    String name = "myhive";
    String defaultDatabase = "datafeed";
    String hiveConfDir = "/opt/app/bigdata/hive-1.1.0-cdh5.8.3/conf"; // directory containing hive-site.xml
    String version = "1.1.0-cdh5.8.3";

    // Register the Hive catalog and make it the current catalog
    HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir, version);
    tableEnv.registerCatalog("myhive", hive);
    tableEnv.useCatalog("myhive");

    // Insert one row into an existing Hive table (SQL string literals use single quotes)
    String insertSql = "INSERT INTO TABLE flink2hive_test VALUES ('55', '333', 'CHN')";
    tableEnv.sqlUpdate(insertSql); // deprecated in 1.11, see the note below
}
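(Side note on the last two lines: sqlUpdate() is deprecated in Flink 1.11, and executeSql() is the replacement that submits the statement directly. A minimal variant, assuming the same table, would be:

    // Flink 1.11 replacement for sqlUpdate(): submits the INSERT immediately
    tableEnv.executeSql("INSERT INTO TABLE flink2hive_test VALUES ('55', '333', 'CHN')");
)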
When this job is submitted to YARN, it fails with:
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.mapreduce.TaskAttemptContext
Is this because the MapReduce-related jars are missing?
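For reference, here is a minimal check I would run to see whether the class is visible at all (a hypothetical diagnostic, not part of the job). org.apache.hadoop.mapreduce.TaskAttemptContext is shipped in hadoop-mapreduce-client-core, and the Flink docs suggest exporting HADOOP_CLASSPATH so the Hadoop jars are visible to the client and the cluster:

// Hypothetical classpath check: run it with the same classpath the Flink client uses.
// If it prints "missing", try `export HADOOP_CLASSPATH=$(hadoop classpath)` before submitting,
// or add the hadoop-mapreduce-client-core jar to the job's classpath.
public class MapReduceClasspathCheck {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext");
            System.out.println("MapReduce classes are on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("MapReduce classes are missing from the classpath");
        }
    }
}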
-----
Thanks!
Jacob
--
Sent from: http://apache-flink.147419.n8.nabble.com/