Komorebiyho opened a new issue, #6060:
URL: https://github.com/apache/seatunnel/issues/6060

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   In an HDP cluster, importing data from PostgreSQL into Hive with SeaTunnel failed. The Hive table is a non-transactional table. The import failed with the Zeta engine, and it also failed when executed with the Spark engine.
   
   ### SeaTunnel Version
   
   2.3.3
   
   ### SeaTunnel Config
   
   ```conf
   env {
     execution.parallelism = 1
     job.mode = "BATCH"
     checkpoint.interval = 10000
   }
   source {
     jdbc {
       url = "jdbc:postgresql://xxxx:5432/test"
       driver = "org.postgresql.Driver"
       user = "postgres"
       password = "Rdsp@admin2023"
       query="select * from person"
     }
   }
   
   sink {
     Hive {
       table_name = "default.person1"
       metastore_uri = "thrift://xxxx:9083"
     }
   }
   
   
   ./connectors/seatunnel:
   total 204220
   -rw-r--r-- 1 root root 29508120 Nov 14 14:32 connector-cdc-mysql-2.3.3.jar
   -rw-r--r-- 1 root root    52297 Feb 18  2022 connector-console-2.3.3.jar
   -rw-r--r-- 1 root root   162972 Feb 18  2022 connector-fake-2.3.3.jar
   -rw-r--r-- 1 root root 41446248 Nov 14 14:33 connector-file-hadoop-2.3.3.jar
   -rw-r--r-- 1 root root 50730956 Nov 14 14:38 connector-hbase-2.3.3.jar
   -rw-r--r-- 1 root root 41466880 Nov 14 14:35 connector-hive-2.3.3.jar
   -rw-r--r-- 1 root root 29383702 Nov 14 14:36 connector-iceberg-2.3.3.jar
   -rw-r--r-- 1 root root   402984 Nov 14 14:36 connector-jdbc-2.3.3.jar
   -rw-r--r-- 1 root root 15953189 Nov 14 14:37 connector-kafka-2.3.3.jar

   ./lib:
   total 87492
   -rw-r--r-- 1 root root 45423319 Dec 22 16:45 hive-exec-2.3.9.jar
   -rw-r--r-- 1 root root  1005078 Nov 15 16:27 postgresql-42.2.19.jar
   -rw-r--r-- 1 root root    10503 Dec 22 16:06 seatunnel-hadoop3-3.1.4-uber-2.3.3.jar
   -rw-r--r-- 1 root root 42110549 Nov 14 16:11 seatunnel-hadoop3-3.1.4-uber-2.3.3-optional.jar
   -rw-r--r-- 1 root root  1032927 Feb 18  2022 seatunnel-transforms-v2.jar
   ```
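
   A quick way to check whether the `org.apache.hadoop.hive.metastore.api.Table` class reported as missing in the Zeta stack trace below is actually packaged in the deployed jars (just a sketch; jar paths are taken from the listing above, adjust to your install):

   ```shell
   # Sketch: search the deployed jars for the class the Zeta server reports as missing.
   for j in /opt/seatunnel/lib/hive-exec-2.3.9.jar \
            /opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar; do
     echo "== $j"
     unzip -l "$j" | grep 'org/apache/hadoop/hive/metastore/api/Table.class' || echo "   class not found"
   done
   ```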
   
   
   ### Running Command
   
   ```shell
   ./bin/seatunnel.sh --config ./job/seatunnel.batch.pg_hive.conf
   ./bin/start-seatunnel-spark-2-connector-v2.sh --config ./job/seatunnel.batch.pg_hive.conf
   ```
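
   Because the Zeta client only prints the server exception wrapped in a `CompletionException`, the complete server-side stack trace may sit in the node logs rather than in the client output (a sketch; the log directory is an assumption based on a default `/opt/seatunnel` layout):

   ```shell
   # Sketch: inspect the Zeta server logs on the submitting node (log location assumed).
   ls /opt/seatunnel/logs/
   tail -n 200 /opt/seatunnel/logs/*.log
   ```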
   
   
   ### Error Exception
   
   ```log
   Dec 22, 2023 5:02:41 PM com.hazelcast.internal.config.AbstractConfigLocator
   INFO: Loading configuration '/opt/seatunnel/config/seatunnel.yaml' from System property 'seatunnel.config'
   Dec 22, 2023 5:02:41 PM com.hazelcast.internal.config.AbstractConfigLocator
   INFO: Using configuration file at /opt/seatunnel/config/seatunnel.yaml
   Dec 22, 2023 5:02:41 PM org.apache.seatunnel.engine.common.config.SeaTunnelConfig
   INFO: seatunnel.home is /opt/seatunnel
   Dec 22, 2023 5:02:41 PM com.hazelcast.internal.config.AbstractConfigLocator
   INFO: Loading configuration '/opt/seatunnel/config/hazelcast.yaml' from System property 'hazelcast.config'
   Dec 22, 2023 5:02:41 PM com.hazelcast.internal.config.AbstractConfigLocator
   INFO: Using configuration file at /opt/seatunnel/config/hazelcast.yaml
   Dec 22, 2023 5:02:42 PM com.hazelcast.internal.config.AbstractConfigLocator
   INFO: Loading configuration '/opt/seatunnel/config/hazelcast-client.yaml' from System property 'hazelcast.client.config'
   Dec 22, 2023 5:02:42 PM com.hazelcast.internal.config.AbstractConfigLocator
   INFO: Using configuration file at /opt/seatunnel/config/hazelcast-client.yaml
   2023-12-22 17:02:42,460 INFO  
com.hazelcast.client.impl.spi.ClientInvocationService - hz.client_1 [seatunnel] 
[5.1] Running with 2 response threads, dynamic=true
   2023-12-22 17:02:42,556 INFO  com.hazelcast.core.LifecycleService - 
hz.client_1 [seatunnel] [5.1] HazelcastClient 5.1 (20220228 - 21f20e7) is 
STARTING
   2023-12-22 17:02:42,557 INFO  com.hazelcast.core.LifecycleService - 
hz.client_1 [seatunnel] [5.1] HazelcastClient 5.1 (20220228 - 21f20e7) is 
STARTED
   2023-12-22 17:02:42,591 INFO  
com.hazelcast.client.impl.connection.ClientConnectionManager - hz.client_1 
[seatunnel] [5.1] Trying to connect to cluster: seatunnel
   2023-12-22 17:02:42,595 INFO  
com.hazelcast.client.impl.connection.ClientConnectionManager - hz.client_1 
[seatunnel] [5.1] Trying to connect to [172.17.109.177]:5801
   2023-12-22 17:02:42,641 INFO  com.hazelcast.core.LifecycleService - 
hz.client_1 [seatunnel] [5.1] HazelcastClient 5.1 (20220228 - 21f20e7) is 
CLIENT_CONNECTED
   2023-12-22 17:02:42,641 INFO  
com.hazelcast.client.impl.connection.ClientConnectionManager - hz.client_1 
[seatunnel] [5.1] Authenticated with server 
[172.17.109.177]:5801:f6a94333-7898-4bf1-8629-2a4b9933d220, server version: 
5.1, local address: /172.17.109.176:33425
   2023-12-22 17:02:42,643 INFO  com.hazelcast.internal.diagnostics.Diagnostics 
- hz.client_1 [seatunnel] [5.1] Diagnostics disabled. To enable add 
-Dhazelcast.diagnostics.enabled=true to the JVM arguments.
   2023-12-22 17:02:42,657 INFO  
com.hazelcast.client.impl.spi.ClientClusterService - hz.client_1 [seatunnel] 
[5.1] 
   
   Members [2] {
        Member [172.17.109.176]:5801 - d4ed9a58-fd36-4088-a6da-afc10590e19c
        Member [172.17.109.177]:5801 - f6a94333-7898-4bf1-8629-2a4b9933d220
   }
   
   2023-12-22 17:02:42,664 INFO  
com.hazelcast.client.impl.connection.ClientConnectionManager - hz.client_1 
[seatunnel] [5.1] Authenticated with server 
[172.17.109.176]:5801:d4ed9a58-fd36-4088-a6da-afc10590e19c, server version: 
5.1, local address: /172.17.109.176:40177
   2023-12-22 17:02:42,699 INFO  
com.hazelcast.client.impl.statistics.ClientStatisticsService - Client 
statistics is enabled with period 5 seconds.
   2023-12-22 17:02:42,891 INFO  
org.apache.seatunnel.engine.client.job.JobExecutionEnvironment - add common jar 
in plugins :[]
   2023-12-22 17:02:42,907 INFO  
org.apache.seatunnel.core.starter.utils.ConfigBuilder - Loading config file 
from path: ./job/seatunnel.batch.pg_hive.conf
   2023-12-22 17:02:42,972 INFO  
org.apache.seatunnel.core.starter.utils.ConfigShadeUtils - Load config shade 
spi: [base64]
   2023-12-22 17:02:43,024 INFO  
org.apache.seatunnel.core.starter.utils.ConfigBuilder - Parsed config file: {
       "env" : {
           "execution.parallelism" : 1,
           "job.mode" : "BATCH",
           "checkpoint.interval" : 10000
       },
       "source" : [
           {
               "password" : "Rdsp@admin2023",
               "driver" : "org.postgresql.Driver",
               "query" : "select * from person where id=2",
               "plugin_name" : "jdbc",
               "user" : "postgres",
               "url" : "jdbc:postgresql://172.17.109.174:5432/test"
           }
       ],
       "sink" : [
           {
               "metastore_uri" : "thrift://kylin-nn-02.rdsp.com:9083",
               "plugin_name" : "Hive",
               "table_name" : "default.person1"
           }
       ]
   }
   
   2023-12-22 17:02:43,051 INFO  
org.apache.seatunnel.api.configuration.ReadonlyConfig - Config uses fallback 
configuration key 'plugin_name' instead of key 'factory'
   2023-12-22 17:02:43,052 INFO  
org.apache.seatunnel.api.configuration.ReadonlyConfig - Config uses fallback 
configuration key 'plugin_name' instead of key 'factory'
   2023-12-22 17:02:43,057 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Load 
SeaTunnelSink Plugin from /opt/seatunnel/connectors/seatunnel
   2023-12-22 17:02:43,064 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Discovery 
plugin jar: jdbc at: 
file:/opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar
   2023-12-22 17:02:43,065 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Discovery 
plugin jar: Hive at: 
file:/opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar
   2023-12-22 17:02:43,069 INFO  
org.apache.seatunnel.engine.core.parse.MultipleTableJobConfigParser - start 
generating all sources.
   2023-12-22 17:02:43,070 INFO  
org.apache.seatunnel.api.configuration.ReadonlyConfig - Config uses fallback 
configuration key 'plugin_name' instead of key 'factory'
   2023-12-22 17:02:43,099 INFO  
org.apache.seatunnel.api.configuration.ReadonlyConfig - Config uses fallback 
configuration key 'plugin_name' instead of key 'factory'
   2023-12-22 17:02:43,103 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Load 
SeaTunnelSource Plugin from /opt/seatunnel/connectors/seatunnel
   2023-12-22 17:02:43,109 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Discovery 
plugin jar: jdbc at: 
file:/opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar
   2023-12-22 17:02:43,114 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Load plugin: 
PluginIdentifier{engineType='seatunnel', pluginType='source', 
pluginName='jdbc'} from classpath
   2023-12-22 17:02:43,269 INFO  
org.apache.seatunnel.connectors.seatunnel.jdbc.source.JdbcSource - The 
partition_column parameter is not configured, and the source parallelism is set 
to 1
   2023-12-22 17:02:43,283 INFO  
org.apache.seatunnel.engine.core.parse.MultipleTableJobConfigParser - start 
generating all transforms.
   2023-12-22 17:02:43,283 INFO  
org.apache.seatunnel.engine.core.parse.MultipleTableJobConfigParser - start 
generating all sinks.
   2023-12-22 17:02:43,284 INFO  
org.apache.seatunnel.api.configuration.ReadonlyConfig - Config uses fallback 
configuration key 'plugin_name' instead of key 'factory'
   2023-12-22 17:02:43,289 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Load 
SeaTunnelSink Plugin from /opt/seatunnel/connectors/seatunnel
   2023-12-22 17:02:43,290 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Discovery 
plugin jar: Hive at: 
file:/opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar
   2023-12-22 17:02:43,294 INFO  
org.apache.seatunnel.plugin.discovery.AbstractPluginDiscovery - Load plugin: 
PluginIdentifier{engineType='seatunnel', pluginType='sink', pluginName='Hive'} 
from classpath
   2023-12-22 17:02:43,340 INFO  org.apache.hadoop.hive.conf.HiveConf - Found 
configuration file null
   2023-12-22 17:02:43,883 INFO  hive.metastore - Trying to connect to 
metastore with URI thrift://kylin-nn-02.rdsp.com:9083
   2023-12-22 17:02:43,910 INFO  hive.metastore - Opened a connection to 
metastore, current connections: 1
   2023-12-22 17:02:43,940 WARN  org.apache.hadoop.util.NativeCodeLoader - 
Unable to load native-hadoop library for your platform... using builtin-java 
classes where applicable
   2023-12-22 17:02:44,035 INFO  hive.metastore - Connected to metastore.
   2023-12-22 17:02:44,260 INFO  hive.metastore - Closed a connection to 
metastore, current connections: 0
   2023-12-22 17:02:44,358 INFO  
org.apache.seatunnel.engine.client.job.ClientJobProxy - Start submit job, job 
id: 790504513575321604, with plugin jar 
[file:/opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar, 
file:/opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar]
   2023-12-22 17:02:44,426 INFO  com.hazelcast.core.LifecycleService - 
hz.client_1 [seatunnel] [5.1] HazelcastClient 5.1 (20220228 - 21f20e7) is 
SHUTTING_DOWN
   2023-12-22 17:02:44,431 INFO  
com.hazelcast.client.impl.connection.ClientConnectionManager - hz.client_1 
[seatunnel] [5.1] Removed connection to endpoint: 
[172.17.109.177]:5801:f6a94333-7898-4bf1-8629-2a4b9933d220, connection: 
ClientConnection{alive=false, connectionId=1, 
channel=NioChannel{/172.17.109.176:33425->/172.17.109.177:5801}, 
remoteAddress=[172.17.109.177]:5801, lastReadTime=2023-12-22 17:02:42.901, 
lastWriteTime=2023-12-22 17:02:42.899, closedTime=2023-12-22 17:02:44.428, 
connected server version=5.1}
   2023-12-22 17:02:44,433 INFO  
com.hazelcast.client.impl.connection.ClientConnectionManager - hz.client_1 
[seatunnel] [5.1] Removed connection to endpoint: 
[172.17.109.176]:5801:d4ed9a58-fd36-4088-a6da-afc10590e19c, connection: 
ClientConnection{alive=false, connectionId=2, 
channel=NioChannel{/172.17.109.176:40177->/172.17.109.176:5801}, 
remoteAddress=[172.17.109.176]:5801, lastReadTime=2023-12-22 17:02:44.417, 
lastWriteTime=2023-12-22 17:02:44.362, closedTime=2023-12-22 17:02:44.431, 
connected server version=5.1}
   2023-12-22 17:02:44,433 INFO  com.hazelcast.core.LifecycleService - 
hz.client_1 [seatunnel] [5.1] HazelcastClient 5.1 (20220228 - 21f20e7) is 
CLIENT_DISCONNECTED
   2023-12-22 17:02:44,436 INFO  com.hazelcast.core.LifecycleService - 
hz.client_1 [seatunnel] [5.1] HazelcastClient 5.1 (20220228 - 21f20e7) is 
SHUTDOWN
   2023-12-22 17:02:44,436 INFO  
org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand - 
Closed SeaTunnel client......
   2023-12-22 17:02:44,437 ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
   
   
===============================================================================
   
   
   2023-12-22 17:02:44,437 ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
Fatal Error, 
   
   2023-12-22 17:02:44,437 ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
Please submit bug report in https://github.com/apache/seatunnel/issues
   
   2023-12-22 17:02:44,437 ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
Reason:SeaTunnel job executed failed 
   
   2023-12-22 17:02:44,439 ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
Exception 
StackTrace:org.apache.seatunnel.core.starter.exception.CommandExecuteException: 
SeaTunnel job executed failed
        at 
org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:191)
        at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
        at 
org.apache.seatunnel.core.starter.seatunnel.SeaTunnelClient.main(SeaTunnelClient.java:34)
   Caused by: java.util.concurrent.CompletionException: 
java.lang.NoClassDefFoundError: Lorg/apache/hadoop/hive/metastore/api/Table;
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.wrapInCompletionException(AbstractInvocationFuture.java:1347)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.cascadeException(AbstractInvocationFuture.java:1340)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.access$200(AbstractInvocationFuture.java:65)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture$ApplyNode.execute(AbstractInvocationFuture.java:1478)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.unblockOtherNode(AbstractInvocationFuture.java:797)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.unblockAll(AbstractInvocationFuture.java:759)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.complete0(AbstractInvocationFuture.java:1235)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionallyInternal(AbstractInvocationFuture.java:1223)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionally(AbstractInvocationFuture.java:709)
        at 
com.hazelcast.client.impl.spi.impl.ClientInvocation.completeExceptionally(ClientInvocation.java:294)
        at 
com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyExceptionWithOwnedPermission(ClientInvocation.java:321)
        at 
com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyException(ClientInvocation.java:304)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.handleResponse(ClientResponseHandlerSupplier.java:164)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.process(ClientResponseHandlerSupplier.java:141)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.access$300(ClientResponseHandlerSupplier.java:60)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:251)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:243)
        at 
com.hazelcast.client.impl.connection.tcp.TcpClientConnection.handleClientMessage(TcpClientConnection.java:245)
        at 
com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.handleMessage(ClientMessageDecoder.java:135)
        at 
com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.onRead(ClientMessageDecoder.java:89)
        at 
com.hazelcast.internal.networking.nio.NioInboundPipeline.process(NioInboundPipeline.java:136)
        at 
com.hazelcast.internal.networking.nio.NioThread.processSelectionKey(NioThread.java:383)
        at 
com.hazelcast.internal.networking.nio.NioThread.processSelectionKeys(NioThread.java:368)
        at 
com.hazelcast.internal.networking.nio.NioThread.selectLoop(NioThread.java:294)
        at 
com.hazelcast.internal.networking.nio.NioThread.executeRun(NioThread.java:249)
        at 
com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:102)
   Caused by: java.lang.NoClassDefFoundError: 
Lorg/apache/hadoop/hive/metastore/api/Table;
        at java.lang.Class.getDeclaredFields0(Native Method)
        at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
        at java.lang.Class.getDeclaredField(Class.java:2068)
        at 
java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1952)
        at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:79)
        at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:540)
        at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:528)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:528)
        at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:425)
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:724)
        at 
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2145)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1976)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2309)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1793)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2557)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2478)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2336)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1793)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:558)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:516)
        at 
com.hazelcast.internal.serialization.impl.defaultserializers.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:92)
        at 
com.hazelcast.internal.serialization.impl.defaultserializers.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:85)
        at 
com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
        at 
com.hazelcast.internal.serialization.impl.AbstractSerializationService.readObject(AbstractSerializationService.java:349)
        at 
com.hazelcast.internal.serialization.impl.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:600)
        at 
org.apache.seatunnel.engine.core.dag.logical.LogicalVertex.readData(LogicalVertex.java:99)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.readInternal(DataSerializableSerializer.java:160)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:106)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:51)
        at 
com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
        at 
com.hazelcast.internal.serialization.impl.AbstractSerializationService.readObject(AbstractSerializationService.java:349)
        at 
com.hazelcast.internal.serialization.impl.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:600)
        at 
org.apache.seatunnel.engine.core.dag.logical.LogicalDag.readData(LogicalDag.java:154)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.readInternal(DataSerializableSerializer.java:160)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:106)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:51)
        at 
com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
        at 
com.hazelcast.internal.serialization.impl.AbstractSerializationService.toObject(AbstractSerializationService.java:268)
        at 
com.hazelcast.jet.impl.execution.init.CustomClassLoadedObject.deserializeWithCustomClassLoader(CustomClassLoadedObject.java:66)
        at 
org.apache.seatunnel.engine.server.master.JobMaster.init(JobMaster.java:209)
        at 
org.apache.seatunnel.engine.server.CoordinatorService.lambda$submitJob$5(CoordinatorService.java:461)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    
   2023-12-22 17:02:44,444 ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
   
===============================================================================
   
   
   
   Exception in thread "main" 
org.apache.seatunnel.core.starter.exception.CommandExecuteException: SeaTunnel 
job executed failed
        at 
org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:191)
        at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
        at 
org.apache.seatunnel.core.starter.seatunnel.SeaTunnelClient.main(SeaTunnelClient.java:34)
   Caused by: java.util.concurrent.CompletionException: 
java.lang.NoClassDefFoundError: Lorg/apache/hadoop/hive/metastore/api/Table;
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.wrapInCompletionException(AbstractInvocationFuture.java:1347)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.cascadeException(AbstractInvocationFuture.java:1340)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.access$200(AbstractInvocationFuture.java:65)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture$ApplyNode.execute(AbstractInvocationFuture.java:1478)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.unblockOtherNode(AbstractInvocationFuture.java:797)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.unblockAll(AbstractInvocationFuture.java:759)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.complete0(AbstractInvocationFuture.java:1235)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionallyInternal(AbstractInvocationFuture.java:1223)
        at 
com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionally(AbstractInvocationFuture.java:709)
        at 
com.hazelcast.client.impl.spi.impl.ClientInvocation.completeExceptionally(ClientInvocation.java:294)
        at 
com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyExceptionWithOwnedPermission(ClientInvocation.java:321)
        at 
com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyException(ClientInvocation.java:304)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.handleResponse(ClientResponseHandlerSupplier.java:164)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.process(ClientResponseHandlerSupplier.java:141)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.access$300(ClientResponseHandlerSupplier.java:60)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:251)
        at 
com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:243)
        at 
com.hazelcast.client.impl.connection.tcp.TcpClientConnection.handleClientMessage(TcpClientConnection.java:245)
        at 
com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.handleMessage(ClientMessageDecoder.java:135)
        at 
com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.onRead(ClientMessageDecoder.java:89)
        at 
com.hazelcast.internal.networking.nio.NioInboundPipeline.process(NioInboundPipeline.java:136)
        at 
com.hazelcast.internal.networking.nio.NioThread.processSelectionKey(NioThread.java:383)
        at 
com.hazelcast.internal.networking.nio.NioThread.processSelectionKeys(NioThread.java:368)
        at 
com.hazelcast.internal.networking.nio.NioThread.selectLoop(NioThread.java:294)
        at 
com.hazelcast.internal.networking.nio.NioThread.executeRun(NioThread.java:249)
        at 
com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:102)
   Caused by: java.lang.NoClassDefFoundError: 
Lorg/apache/hadoop/hive/metastore/api/Table;
        at java.lang.Class.getDeclaredFields0(Native Method)
        at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
        at java.lang.Class.getDeclaredField(Class.java:2068)
        at 
java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1952)
        at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:79)
        at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:540)
        at java.io.ObjectStreamClass$3.run(ObjectStreamClass.java:528)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:528)
        at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:425)
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:724)
        at 
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2145)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1976)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2309)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1793)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2557)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2478)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2336)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1793)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:558)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:516)
        at 
com.hazelcast.internal.serialization.impl.defaultserializers.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:92)
        at 
com.hazelcast.internal.serialization.impl.defaultserializers.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:85)
        at 
com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
        at 
com.hazelcast.internal.serialization.impl.AbstractSerializationService.readObject(AbstractSerializationService.java:349)
        at 
com.hazelcast.internal.serialization.impl.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:600)
        at 
org.apache.seatunnel.engine.core.dag.logical.LogicalVertex.readData(LogicalVertex.java:99)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.readInternal(DataSerializableSerializer.java:160)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:106)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:51)
        at 
com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
        at 
com.hazelcast.internal.serialization.impl.AbstractSerializationService.readObject(AbstractSerializationService.java:349)
        at 
com.hazelcast.internal.serialization.impl.ByteArrayObjectDataInput.readObject(ByteArrayObjectDataInput.java:600)
        at 
org.apache.seatunnel.engine.core.dag.logical.LogicalDag.readData(LogicalDag.java:154)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.readInternal(DataSerializableSerializer.java:160)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:106)
        at 
com.hazelcast.internal.serialization.impl.DataSerializableSerializer.read(DataSerializableSerializer.java:51)
        at 
com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:44)
        at 
com.hazelcast.internal.serialization.impl.AbstractSerializationService.toObject(AbstractSerializationService.java:268)
        at 
com.hazelcast.jet.impl.execution.init.CustomClassLoadedObject.deserializeWithCustomClassLoader(CustomClassLoadedObject.java:66)
        at 
org.apache.seatunnel.engine.server.master.JobMaster.init(JobMaster.java:209)
        at 
org.apache.seatunnel.engine.server.CoordinatorService.lambda$submitJob$5(CoordinatorService.java:461)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
   
   [root@kylin-nn-02 seatunnel]# ./bin/start-seatunnel-spark-2-connector-v2.sh 
--config ./job/seatunnel.batch.pg_hive.conf 
   Execute SeaTunnel Spark Job: ${SPARK_HOME}/bin/spark-submit --class 
"org.apache.seatunnel.core.starter.spark.SeaTunnelSpark" --name "SeaTunnel" 
--master "local[*]" --deploy-mode "client" --jars 
"/opt/seatunnel/lib/seatunnel-transforms-v2.jar,/opt/seatunnel/lib/seatunnel-hadoop3-3.1.4-uber-2.3.3-optional.jar,/opt/seatunnel/lib/postgresql-42.2.19.jar,/opt/seatunnel/lib/seatunnel-hadoop3-3.1.4-uber-2.3.3.jar,/opt/seatunnel/lib/hive-exec-2.3.9.jar,/opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar,/opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar"
 --conf "job.mode=BATCH" --conf "execution.parallelism=1" --conf 
"checkpoint.interval=10000" 
/opt/seatunnel/starter/seatunnel-spark-2-starter.jar --config 
"./job/seatunnel.batch.pg_hive.conf" --master "local[*]" --deploy-mode "client" 
--name "SeaTunnel"
   Warning: Ignoring non-spark config property: execution.parallelism=1
   Warning: Ignoring non-spark config property: job.mode=BATCH
   Warning: Ignoring non-spark config property: checkpoint.interval=10000
   23/12/22 17:04:27 INFO ConfigBuilder: Loading config file from path: 
./job/seatunnel.batch.pg_hive.conf
   23/12/22 17:04:27 INFO ConfigShadeUtils: Load config shade spi: [base64]
   23/12/22 17:04:27 INFO ConfigBuilder: Parsed config file: {
       "env" : {
           "execution.parallelism" : 1,
           "job.mode" : "BATCH",
           "checkpoint.interval" : 10000
       },
       "source" : [
           {
               "password" : "Rdsp@admin2023",
               "driver" : "org.postgresql.Driver",
               "query" : "select * from person where id=2",
               "plugin_name" : "jdbc",
               "user" : "postgres",
               "url" : "jdbc:postgresql://172.17.109.174:5432/test"
           }
       ],
       "sink" : [
           {
               "metastore_uri" : "thrift://kylin-nn-02.rdsp.com:9083",
               "plugin_name" : "Hive",
               "table_name" : "default.person1"
           }
       ]
   }
   
   23/12/22 17:04:27 INFO SparkContext: Running Spark version 2.3.2.3.1.5.0-152
   23/12/22 17:04:27 INFO SparkContext: Submitted application: SeaTunnel
   23/12/22 17:04:27 INFO SecurityManager: Changing view acls to: root
   23/12/22 17:04:27 INFO SecurityManager: Changing modify acls to: root
   23/12/22 17:04:27 INFO SecurityManager: Changing view acls groups to: 
   23/12/22 17:04:27 INFO SecurityManager: Changing modify acls groups to: 
   23/12/22 17:04:27 INFO SecurityManager: SecurityManager: authentication 
disabled; ui acls enabled; users  with view permissions: Set(root); groups with 
view permissions: Set(); users  with modify permissions: Set(root); groups with 
modify permissions: Set()
   23/12/22 17:04:28 INFO Utils: Successfully started service 'sparkDriver' on 
port 44415.
   23/12/22 17:04:28 INFO SparkEnv: Registering MapOutputTracker
   23/12/22 17:04:28 INFO SparkEnv: Registering BlockManagerMaster
   23/12/22 17:04:28 INFO BlockManagerMasterEndpoint: Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
   23/12/22 17:04:28 INFO BlockManagerMasterEndpoint: 
BlockManagerMasterEndpoint up
   23/12/22 17:04:28 INFO DiskBlockManager: Created local directory at 
/tmp/blockmgr-9a0181f4-cc76-4d3c-82f1-59a84d2216f1
   23/12/22 17:04:28 INFO MemoryStore: MemoryStore started with capacity 366.3 
MB
   23/12/22 17:04:28 INFO SparkEnv: Registering OutputCommitCoordinator
   23/12/22 17:04:28 INFO log: Logging initialized @3164ms
   23/12/22 17:04:28 INFO Server: jetty-9.3.z-SNAPSHOT, build timestamp: 
2018-06-06T01:11:56+08:00, git hash: 84205aa28f11a4f31f2a3b86d1bba2cc8ab69827
   23/12/22 17:04:28 INFO Server: Started @3256ms
   23/12/22 17:04:28 INFO AbstractConnector: Started 
ServerConnector@b1d3e7a{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
   23/12/22 17:04:28 INFO Utils: Successfully started service 'SparkUI' on port 
4040.
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@3d98d138{/jobs,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@5c10285a{/jobs/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@6b667cb3{/jobs/job,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@61e3cf4d{/jobs/job/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@3cec79d3{/stages,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@64b70919{/stages/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@4e31c3ec{/stages/stage,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@328902d5{/stages/stage/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@72e789cb{/stages/pool,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@7c1812b3{/stages/pool/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@43034809{/storage,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@39e67516{/storage/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@77010a30{/storage/rdd,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@4bb003e9{/storage/rdd/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@12aa4996{/environment,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@18eec010{/environment/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@67c119b7{/executors,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@2ca5f1ed{/executors/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@6c03fb16{/executors/threadDump,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@28348c6{/executors/threadDump/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@6de0f580{/static,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@8ff5094{/,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@363f0ba0{/api,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@50acf55d{/jobs/job/kill,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@3cae7b8b{/stages/stage/kill,null,AVAILABLE,@Spark}
   23/12/22 17:04:28 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at 
http://kylin-nn-02.rdsp.com:4040
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/lib/seatunnel-transforms-v2.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/seatunnel-transforms-v2.jar with 
timestamp 1703235868660
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/lib/seatunnel-hadoop3-3.1.4-uber-2.3.3-optional.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/seatunnel-hadoop3-3.1.4-uber-2.3.3-optional.jar
 with timestamp 1703235868661
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/lib/postgresql-42.2.19.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/postgresql-42.2.19.jar with timestamp 
1703235868661
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/lib/seatunnel-hadoop3-3.1.4-uber-2.3.3.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/seatunnel-hadoop3-3.1.4-uber-2.3.3.jar 
with timestamp 1703235868661
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/lib/hive-exec-2.3.9.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/hive-exec-2.3.9.jar with timestamp 
1703235868670
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/connector-jdbc-2.3.3.jar with timestamp 
1703235868670
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:///opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/connector-hive-2.3.3.jar with timestamp 
1703235868670
   23/12/22 17:04:28 INFO SparkContext: Added JAR 
file:/opt/seatunnel/starter/seatunnel-spark-2-starter.jar at 
spark://kylin-nn-02.rdsp.com:44415/jars/seatunnel-spark-2-starter.jar with 
timestamp 1703235868670
   23/12/22 17:04:28 INFO Executor: Starting executor ID driver on host 
localhost
   23/12/22 17:04:28 INFO Utils: Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 46033.
   23/12/22 17:04:28 INFO NettyBlockTransferService: Server created on 
kylin-nn-02.rdsp.com:46033
   23/12/22 17:04:28 INFO BlockManager: Using 
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication 
policy
   23/12/22 17:04:28 INFO BlockManagerMaster: Registering BlockManager 
BlockManagerId(driver, kylin-nn-02.rdsp.com, 46033, None)
   23/12/22 17:04:28 INFO BlockManagerMasterEndpoint: Registering block manager 
kylin-nn-02.rdsp.com:46033 with 366.3 MB RAM, BlockManagerId(driver, 
kylin-nn-02.rdsp.com, 46033, None)
   23/12/22 17:04:28 INFO BlockManagerMaster: Registered BlockManager 
BlockManagerId(driver, kylin-nn-02.rdsp.com, 46033, None)
   23/12/22 17:04:28 INFO BlockManager: Initialized BlockManager: 
BlockManagerId(driver, kylin-nn-02.rdsp.com, 46033, None)
   23/12/22 17:04:29 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@3727f0ee{/metrics/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:29 INFO EventLoggingListener: Logging events to 
hdfs:/spark2-history/local-1703235868699
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Load SeaTunnelSource Plugin 
from /opt/seatunnel/connectors/seatunnel
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Discovery plugin jar: jdbc 
at: file:/opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Load plugin: 
PluginIdentifier{engineType='seatunnel', pluginType='source', 
pluginName='jdbc'} from classpath
   23/12/22 17:04:30 INFO JdbcSource: The partition_column parameter is not 
configured, and the source parallelism is set to 1
   23/12/22 17:04:30 INFO SparkRuntimeEnvironment: register plugins 
:[file:/opt/seatunnel/connectors/seatunnel/connector-jdbc-2.3.3.jar]
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Load SeaTunnelTransform 
Plugin from /opt/seatunnel/lib
   23/12/22 17:04:30 INFO SparkRuntimeEnvironment: register plugins :[]
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Load SeaTunnelSink Plugin 
from /opt/seatunnel/connectors/seatunnel
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Discovery plugin jar: Hive 
at: file:/opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar
   23/12/22 17:04:30 INFO AbstractPluginDiscovery: Load plugin: 
PluginIdentifier{engineType='seatunnel', pluginType='sink', pluginName='Hive'} 
from classpath
   23/12/22 17:04:30 INFO metastore: Trying to connect to metastore with URI 
thrift://kylin-nn-02.rdsp.com:9083
   23/12/22 17:04:30 INFO metastore: Connected to metastore.
   23/12/22 17:04:31 INFO SparkRuntimeEnvironment: register plugins 
:[file:/opt/seatunnel/connectors/seatunnel/connector-hive-2.3.3.jar]
   23/12/22 17:04:31 INFO SharedState: loading hive config file: 
file:/etc/spark2/3.1.5.0-152/0/hive-site.xml
   23/12/22 17:04:31 INFO SharedState: Setting hive.metastore.warehouse.dir 
('null') to the value of spark.sql.warehouse.dir 
('/warehouse/tablespace/managed/hive').
   23/12/22 17:04:31 INFO SharedState: Warehouse path is 
'/warehouse/tablespace/managed/hive'.
   23/12/22 17:04:31 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@4a058df8{/SQL,null,AVAILABLE,@Spark}
   23/12/22 17:04:31 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@4b56b031{/SQL/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:31 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@aa61e4e{/SQL/execution,null,AVAILABLE,@Spark}
   23/12/22 17:04:31 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@733e6df7{/SQL/execution/json,null,AVAILABLE,@Spark}
   23/12/22 17:04:31 INFO ContextHandler: Started 
o.s.j.s.ServletContextHandler@32227215{/static/sql,null,AVAILABLE,@Spark}
   23/12/22 17:04:31 INFO StateStoreCoordinatorRef: Registered 
StateStoreCoordinator endpoint
   23/12/22 17:04:34 INFO CodeGenerator: Code generated in 211.708487 ms
   Exception in thread "main" java.lang.AbstractMethodError: 
org.apache.seatunnel.translation.spark.source.reader.batch.BatchSourceReader.createDataReaderFactories()Ljava/util/List;
        at 
org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExec.readerFactories$lzycompute(DataSourceV2ScanExec.scala:55)
        at 
org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExec.readerFactories(DataSourceV2ScanExec.scala:52)
        at 
org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExec.inputRDD$lzycompute(DataSourceV2ScanExec.scala:76)
        at 
org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExec.inputRDD(DataSourceV2ScanExec.scala:60)
        at 
org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExec.inputRDDs(DataSourceV2ScanExec.scala:79)
        at 
org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
        at 
org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
        at 
org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
        at 
org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
        at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
        at 
org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2.scala:59)
        at 
org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
        at 
org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
        at 
org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
        at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
        at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
        at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
        at 
org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:664)
        at 
org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:664)
        at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
        at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:664)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:256)
        at 
org.apache.seatunnel.core.starter.spark.execution.SinkExecuteProcessor.execute(SinkExecuteProcessor.java:117)
        at 
org.apache.seatunnel.core.starter.spark.execution.SparkExecution.execute(SparkExecution.java:74)
        at 
org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:60)
        at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
        at 
org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:900)
        at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:217)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   23/12/22 17:04:34 INFO SparkContext: Invoking stop() from shutdown hook
   23/12/22 17:04:34 INFO AbstractConnector: Stopped 
Spark@b1d3e7a{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
   23/12/22 17:04:34 INFO SparkUI: Stopped Spark web UI at 
http://kylin-nn-02.rdsp.com:4040
   23/12/22 17:04:34 INFO MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
   23/12/22 17:04:34 INFO MemoryStore: MemoryStore cleared
   23/12/22 17:04:34 INFO BlockManager: BlockManager stopped
   23/12/22 17:04:34 INFO BlockManagerMaster: BlockManagerMaster stopped
   23/12/22 17:04:34 INFO 
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
   23/12/22 17:04:34 INFO SparkContext: Successfully stopped SparkContext
   23/12/22 17:04:34 INFO ShutdownHookManager: Shutdown hook called
   23/12/22 17:04:34 INFO ShutdownHookManager: Deleting directory 
/tmp/spark-e0dbb9f0-cc67-46fc-a668-191afda959df
   23/12/22 17:04:34 INFO ShutdownHookManager: Deleting directory 
/tmp/spark-419884f9-2422-4c71-b50c-249452f1fe3f
   ```
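
   Since the `NoClassDefFoundError` above is thrown while `JobMaster.init` deserializes the job on the server side, it may also be worth confirming that `hive-exec-2.3.9.jar` and `connector-hive-2.3.3.jar` are present on every Zeta cluster member, not only on the submitting node (a sketch; the two member addresses come from the Hazelcast member list above, adjust user and paths as needed):

   ```shell
   # Sketch: check both Hazelcast members listed earlier for the Hive jars.
   for h in 172.17.109.176 172.17.109.177; do
     echo "== $h"
     ssh root@"$h" 'ls -l /opt/seatunnel/lib/hive-exec-*.jar /opt/seatunnel/connectors/seatunnel/connector-hive-*.jar'
   done
   ```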
   
   
   ### Zeta or Flink or Spark Version
   
   Spark 2.3.0
   Hive 3.1.0
   HDP 3.1.5
   
   ### Java or Scala Version
   
   jdk 1.8.0_312
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   

