yangkaikb opened a new issue, #7118:
URL: https://github.com/apache/seatunnel/issues/7118

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   java.util.concurrent.CompletionException: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getTimeDuration
   Hadoop version: 3.1.3
   I have put hadoop-client-3.1.4.jar and seatunnel-hadoop3-3.1.4-uber-2.3.3-optional.jar in $SEATUNNEL_HOME/lib.
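   A `NoSuchMethodError` like this usually means two different copies of `org.apache.hadoop.conf.Configuration` ended up on the classpath (e.g. from `hadoop-client` plus the SeaTunnel uber jar, or an older Hadoop jar pulled in by the cluster). A minimal sketch to check which jars bundle that class; the `lib` path below is an assumption, point it at your actual `$SEATUNNEL_HOME/lib`:

   ```python
   # Sketch: list the jars in a lib directory that bundle
   # org.apache.hadoop.conf.Configuration. More than one hit (or a hit from
   # an old Hadoop version) is the usual cause of this NoSuchMethodError.
   import zipfile
   from pathlib import Path

   TARGET = "org/apache/hadoop/conf/Configuration.class"

   def jars_with_configuration(lib_dir):
       """Return names of jars under lib_dir containing Hadoop's Configuration class."""
       hits = []
       for jar in sorted(Path(lib_dir).glob("*.jar")):
           try:
               with zipfile.ZipFile(jar) as zf:
                   if TARGET in zf.namelist():
                       hits.append(jar.name)
           except zipfile.BadZipFile:
               continue  # skip corrupt or non-zip files

       return hits

   if __name__ == "__main__":
       # assumed install location -- adjust to your $SEATUNNEL_HOME
       for name in jars_with_configuration("/opt/seatunnel/lib"):
           print("Configuration bundled in:", name)
   ```

   If more than one jar shows up, keep only the one matching the uber jar's Hadoop version and remove the others.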
   
   ### SeaTunnel Version
   
   2.3.3
   
   ### SeaTunnel Config
   
   ```conf
   {
       "env" : {
           "execution.parallelism" : 2,
           "job.mode" : "BATCH",
           "checkpoint.interval" : 10000
       },
       "source" : [
           {
               "password" : "xxxxxxx",
               "driver" : "oracle.jdbc.driver.OracleDriver",
               "query" : "select * from GMES.PP_PRODUCTIONORDER where MODIFYDATETIME >= '2024-07-05 13:26:52'",
               "connection_check_timeout_sec" : 100,
               "plugin_name" : "Jdbc",
               "user" : "gmes",
               "url" : "jdbc:oracle:thin:@//10.xxx.xx.xxx:1521/GMES"
           }
       ],
       "transform" : [],
       "sink" : [
           {
               "password" : "xxxxxx",
               "fenodes" : "10.xx.xxx.xxx:8030",
               "sink.enable-2pc" : "true",
               "doris.config" : {
                   "format" : "json",
                   "read_json_by_line" : "true"
               },
               "table.identifier" : "PP_PRODUCTIONORDER",
               "plugin_name" : "Doris",
               "sink.label-prefix" : "test_json",
               "username" : "root"
           }
       ]
   }
   ```
   
   
   ### Running Command
   
   ```shell
   dolphinscheduler seatunnel
   ```
   
   
   ### Error Exception
   
   ```log
   00:25:06.061 [main] ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
   ===============================================================================
   00:25:06.061 [main] ERROR org.apache.seatunnel.core.starter.SeaTunnel - Fatal Error, 
   00:25:06.061 [main] ERROR org.apache.seatunnel.core.starter.SeaTunnel - Please submit bug report in https://github.com/apache/seatunnel/issues
   00:25:06.061 [main] ERROR org.apache.seatunnel.core.starter.SeaTunnel - Reason:SeaTunnel job executed failed 
   00:25:06.065 [main] ERROR org.apache.seatunnel.core.starter.SeaTunnel - Exception StackTrace:org.apache.seatunnel.core.starter.exception.CommandExecuteException: SeaTunnel job executed failed
       at org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:191)
       at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
       at org.apache.seatunnel.core.starter.seatunnel.SeaTunnelClient.main(SeaTunnelClient.java:34)
   Caused by: java.util.concurrent.CompletionException: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getTimeDuration(Ljava/lang/String;JLjava/util/concurrent/TimeUnit;Ljava/util/concurrent/TimeUnit;)J
       at com.hazelcast.spi.impl.AbstractInvocationFuture.wrapInCompletionException(AbstractInvocationFuture.java:1347)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.cascadeException(AbstractInvocationFuture.java:1340)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.access$200(AbstractInvocationFuture.java:65)
       at com.hazelcast.spi.impl.AbstractInvocationFuture$ApplyNode.execute(AbstractInvocationFuture.java:1478)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.unblockOtherNode(AbstractInvocationFuture.java:797)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.unblockAll(AbstractInvocationFuture.java:759)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.complete0(AbstractInvocationFuture.java:1235)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionallyInternal(AbstractInvocationFuture.java:1223)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionally(AbstractInvocationFuture.java:709)
       at com.hazelcast.client.impl.spi.impl.ClientInvocation.completeExceptionally(ClientInvocation.java:294)
       at com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyExceptionWithOwnedPermission(ClientInvocation.java:321)
       at com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyException(ClientInvocation.java:304)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.handleResponse(ClientResponseHandlerSupplier.java:164)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.process(ClientResponseHandlerSupplier.java:141)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.access$300(ClientResponseHandlerSupplier.java:60)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:251)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:243)
       at com.hazelcast.client.impl.connection.tcp.TcpClientConnection.handleClientMessage(TcpClientConnection.java:245)
       at com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.handleMessage(ClientMessageDecoder.java:135)
       at com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.onRead(ClientMessageDecoder.java:89)
       at com.hazelcast.internal.networking.nio.NioInboundPipeline.process(NioInboundPipeline.java:136)
       at com.hazelcast.internal.networking.nio.NioThread.processSelectionKey(NioThread.java:383)
       at com.hazelcast.internal.networking.nio.NioThread.processSelectionKeys(NioThread.java:368)
       at com.hazelcast.internal.networking.nio.NioThread.selectLoop(NioThread.java:294)
       at com.hazelcast.internal.networking.nio.NioThread.executeRun(NioThread.java:249)
       at com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:102)
   Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getTimeDuration(Ljava/lang/String;JLjava/util/concurrent/TimeUnit;Ljava/util/concurrent/TimeUnit;)J
       at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:248)
       at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:306)
       at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:290)
       at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:172)
       at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3242)
       at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:121)
       at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3291)
       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3259)
       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:470)
       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:223)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorage.initStorage(HdfsStorage.java:68)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorage.<init>(HdfsStorage.java:57)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.common.HdfsFileStorageInstance.getOrCreateStorage(HdfsFileStorageInstance.java:53)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorageFactory.create(HdfsStorageFactory.java:75)
       at org.apache.seatunnel.engine.server.checkpoint.CheckpointManager.<init>(CheckpointManager.java:103)
       at org.apache.seatunnel.engine.server.master.JobMaster.initCheckPointManager(JobMaster.java:251)
       at org.apache.seatunnel.engine.server.master.JobMaster.init(JobMaster.java:234)
       at org.apache.seatunnel.engine.server.CoordinatorService.lambda$submitJob$5(CoordinatorService.java:461)
       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)
   00:25:06.065 [main] ERROR org.apache.seatunnel.core.starter.SeaTunnel - 
   ===============================================================================
   Exception in thread "main" org.apache.seatunnel.core.starter.exception.CommandExecuteException: SeaTunnel job executed failed
       at org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand.execute(ClientExecuteCommand.java:191)
       at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
       at org.apache.seatunnel.core.starter.seatunnel.SeaTunnelClient.main(SeaTunnelClient.java:34)
   Caused by: java.util.concurrent.CompletionException: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getTimeDuration(Ljava/lang/String;JLjava/util/concurrent/TimeUnit;Ljava/util/concurrent/TimeUnit;)J
       at com.hazelcast.spi.impl.AbstractInvocationFuture.wrapInCompletionException(AbstractInvocationFuture.java:1347)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.cascadeException(AbstractInvocationFuture.java:1340)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.access$200(AbstractInvocationFuture.java:65)
       at com.hazelcast.spi.impl.AbstractInvocationFuture$ApplyNode.execute(AbstractInvocationFuture.java:1478)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.unblockOtherNode(AbstractInvocationFuture.java:797)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.unblockAll(AbstractInvocationFuture.java:759)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.complete0(AbstractInvocationFuture.java:1235)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionallyInternal(AbstractInvocationFuture.java:1223)
       at com.hazelcast.spi.impl.AbstractInvocationFuture.completeExceptionally(AbstractInvocationFuture.java:709)
       at com.hazelcast.client.impl.spi.impl.ClientInvocation.completeExceptionally(ClientInvocation.java:294)
       at com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyExceptionWithOwnedPermission(ClientInvocation.java:321)
       at com.hazelcast.client.impl.spi.impl.ClientInvocation.notifyException(ClientInvocation.java:304)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.handleResponse(ClientResponseHandlerSupplier.java:164)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.process(ClientResponseHandlerSupplier.java:141)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier.access$300(ClientResponseHandlerSupplier.java:60)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:251)
       at com.hazelcast.client.impl.spi.impl.ClientResponseHandlerSupplier$DynamicResponseHandler.accept(ClientResponseHandlerSupplier.java:243)
       at com.hazelcast.client.impl.connection.tcp.TcpClientConnection.handleClientMessage(TcpClientConnection.java:245)
       at com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.handleMessage(ClientMessageDecoder.java:135)
       at com.hazelcast.client.impl.protocol.util.ClientMessageDecoder.onRead(ClientMessageDecoder.java:89)
       at com.hazelcast.internal.networking.nio.NioInboundPipeline.process(NioInboundPipeline.java:136)
       at com.hazelcast.internal.networking.nio.NioThread.processSelectionKey(NioThread.java:383)
       at com.hazelcast.internal.networking.nio.NioThread.processSelectionKeys(NioThread.java:368)
       at com.hazelcast.internal.networking.nio.NioThread.selectLoop(NioThread.java:294)
       at com.hazelcast.internal.networking.nio.NioThread.executeRun(NioThread.java:249)
       at com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:102)
   Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getTimeDuration(Ljava/lang/String;JLjava/util/concurrent/TimeUnit;Ljava/util/concurrent/TimeUnit;)J
       at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:248)
       at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:306)
       at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:290)
       at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:172)
       at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3242)
       at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:121)
       at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3291)
       at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3259)
       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:470)
       at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:223)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorage.initStorage(HdfsStorage.java:68)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorage.<init>(HdfsStorage.java:57)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.common.HdfsFileStorageInstance.getOrCreateStorage(HdfsFileStorageInstance.java:53)
       at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorageFactory.create(HdfsStorageFactory.java:75)
       at org.apache.seatunnel.engine.server.checkpoint.CheckpointManager.<init>(CheckpointManager.java:103)
       at org.apache.seatunnel.engine.server.master.JobMaster.initCheckPointManager(JobMaster.java:251)
       at org.apache.seatunnel.engine.server.master.JobMaster.init(JobMaster.java:234)
       at org.apache.seatunnel.engine.server.CoordinatorService.lambda$submitJob$5(CoordinatorService.java:461)
       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)
   [INFO] 2024-07-06 00:25:06.848 +0800 - FINALIZE_SESSION
   ```
   
   
   ### Zeta or Flink or Spark Version
   
   _No response_
   
   ### Java or Scala Version
   
   _No response_
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   

