zhouhongyu888 opened a new issue #3680:
URL: https://github.com/apache/hudi/issues/3680
Environment description:
Flink version: 1.13.1
Scala version: 2.12
Hadoop version: 3.1.3
Hive version: 3.1.2

The following error appeared when I ran the Flink SQL task below, and I don't know the reason:
CREATE TABLE t8(
uuid VARCHAR(20),
name VARCHAR(10),
age INT,
ts TIMESTAMP(3),
`partition` VARCHAR(20)
)
PARTITIONED BY (`partition`)
WITH (
'connector' = 'hudi',
'path' = 'hdfs://hadoop102:8020/flink-hudi/t8',
'table.type' = 'COPY_ON_WRITE',
'hive_sync.db' = 'hudi',
'hive_sync.table' = 't8',
'hive_sync.enable' = 'true',
'hive_sync.mode' = 'hms',
'hive_sync.metastore.uris' = 'thrift://hadoop102:9083'
);
INSERT INTO t8 VALUES
('id1','Danny',23,TIMESTAMP '1970-01-01 00:00:01','par1'),
('id2','Stephen',33,TIMESTAMP '1970-01-01 00:00:02','par1'),
('id3','Julian',53,TIMESTAMP '1970-01-01 00:00:03','par2'),
('id4','Fabian',31,TIMESTAMP '1970-01-01 00:00:04','par2'),
('id5','Sophia',18,TIMESTAMP '1970-01-01 00:00:05','par3'),
('id6','Emma',20,TIMESTAMP '1970-01-01 00:00:06','par3'),
('id7','Bob',44,TIMESTAMP '1970-01-01 00:00:07','par4'),
('id8','Han',56,TIMESTAMP '1970-01-01 00:00:08','par4');
There is no error in the sql-client output, but I found the following error in the standalone session log:
2021-09-17 16:11:25,239 ERROR org.apache.hudi.sink.StreamWriteOperatorCoordinator [] - Executor executes action [sync hive metadata for instant 20210917161123] error
java.lang.ExceptionInInitializerError: null
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:247) ~[?:?]
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231) ~[?:?]
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388) ~[?:?]
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332) ~[?:?]
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312) ~[?:?]
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288) ~[?:?]
    at org.apache.hudi.hive.ddl.HMSDDLExecutor.<init>(HMSDDLExecutor.java:66) ~[?:?]
    at org.apache.hudi.hive.HoodieHiveClient.<init>(HoodieHiveClient.java:75) ~[?:?]
    at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:80) ~[?:?]
    at org.apache.hudi.sink.utils.HiveSyncContext.hiveSyncTool(HiveSyncContext.java:51) ~[?:?]
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.syncHive(StreamWriteOperatorCoordinator.java:295) ~[?:?]
    at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$execute$0(NonThrownExecutor.java:67) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_144]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_144]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.udf.UDFSubstr
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.getUdfClass(GenericUDFBridge.java:134) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.FunctionInfo.getFunctionClass(FunctionInfo.java:151) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.addFunction(Registry.java:519) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.registerUDF(Registry.java:163) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.registerUDF(Registry.java:154) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.registerUDF(Registry.java:147) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:194) ~[?:?]
    ... 15 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.udf.UDFSubstr
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_144]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_144]
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335) ~[?:1.8.0_144]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_144]
    at java.lang.Class.forName0(Native Method) ~[?:1.8.0_144]
    at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_144]
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.getUdfClassInternal(GenericUDFBridge.java:142) ~[?:?]
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.getUdfClass(GenericUDFBridge.java:132) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.FunctionInfo.getFunctionClass(FunctionInfo.java:151) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.addFunction(Registry.java:519) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.registerUDF(Registry.java:163) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.registerUDF(Registry.java:154) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.Registry.registerUDF(Registry.java:147) ~[?:?]
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:194) ~[?:?]
    ... 15 more
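The root cause is a ClassNotFoundException for org.apache.hadoop.hive.ql.udf.UDFSubstr, a class shipped in hive-exec, thrown while Hive's FunctionRegistry static initializer runs (which is why it surfaces as ExceptionInInitializerError). With 'hive_sync.mode' = 'hms' the sync runs inside the Flink JobManager process, so this usually indicates that hive-exec classes are missing from (or only partially shaded into) the jars on the Flink classpath. A minimal diagnostic sketch, assuming a standalone Flink installation directory (the FLINK_HOME default below is hypothetical, adjust it for your setup):

```shell
#!/bin/sh
# Check whether a hive-exec jar is visible to the Flink standalone cluster.
# FLINK_HOME default is an assumption; point it at your real installation.
FLINK_HOME=${FLINK_HOME:-/opt/module/flink-1.13.1}

# ClassNotFoundException: org.apache.hadoop.hive.ql.udf.UDFSubstr means the
# JobManager's classloader cannot see the hive-exec classes.
if ls "$FLINK_HOME"/lib/hive-exec-*.jar >/dev/null 2>&1; then
  echo "hive-exec jar found in $FLINK_HOME/lib"
else
  echo "hive-exec jar NOT found in $FLINK_HOME/lib"
fi
```

If the jar is missing, placing hive-exec-3.1.2.jar in $FLINK_HOME/lib (or rebuilding the hudi-flink-bundle with the Hive 3 profile so hive-exec is bundled) and restarting the standalone cluster is worth trying.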