zhang5059T opened a new issue, #6988:
URL: https://github.com/apache/seatunnel/issues/6988

   ### Search before asking
   
   - [X] I had searched in the 
[issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22)
 and found no similar issues.
   
   
   ### What happened
   
   I am running SeaTunnel with the Spark engine in cluster mode, deployed via the Spark Operator (Spark on K8s). An extremely strange error occurred during execution: the executor fails while copying a connector jar and exits.
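   The stack trace below points at `Files.copy` failing while the executor copies the fetched connector jar into its working directory (`/opt/.`). As a hypothetical minimal sketch (plain Java, not SeaTunnel or Spark code), copying into a directory the process cannot write to raises the same `java.nio.file.AccessDeniedException`:

   ```java
   import java.io.IOException;
   import java.nio.file.AccessDeniedException;
   import java.nio.file.Files;
   import java.nio.file.Path;
   import java.nio.file.attribute.PosixFilePermissions;

   // Hypothetical repro sketch (not SeaTunnel/Spark code): copying a file into
   // a directory without write permission throws AccessDeniedException, the
   // same exception the executor hits when copying the jar into /opt.
   public class CopyRepro {
       public static void main(String[] args) throws IOException {
           Path src = Files.createTempFile("connector-console", ".jar");
           // Simulate a read-only working directory (like /opt owned by root).
           // Note: a root user bypasses POSIX permissions and the copy then
           // succeeds, so the outcome is printed rather than asserted.
           Path workDir = Files.createTempDirectory("work-dir",
                   PosixFilePermissions.asFileAttribute(
                           PosixFilePermissions.fromString("r-xr-xr-x")));
           try {
               Files.copy(src, workDir.resolve("connector-console-2.3.5.jar"));
               System.out.println("copy succeeded (process has write access)");
           } catch (AccessDeniedException e) {
               System.out.println("AccessDeniedException: " + e.getFile());
           }
       }
   }
   ```

   Run as a non-root user this reproduces the exception class seen in the log, which suggests a permissions problem on the executor's working directory rather than a SeaTunnel bug in itself.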
   
   ## Error LOG
   The executor exits with `java.nio.file.AccessDeniedException: ./connector-console-2.3.5.jar`; the full executor log is included in the Error Exception section below.
   
   ### SeaTunnel Version
   
   2.3.5
   
   ### SeaTunnel Config
   
   ```conf
   env {
     parallelism = 4
   }

   source {
     Jdbc {
       url = "jdbc:oracle:thin:@xxx:1521:orcl"
       driver = "oracle.jdbc.OracleDriver"
       user = "system"
       password = "123456"
       query = "select * from DATA"
     }
   }

   sink {
     Jdbc {
       url = "jdbc:oracle:thin:@xxx:1521:orcl"
       driver = "oracle.jdbc.OracleDriver"
       user = "seatunnel"
       password = "seatunnel123456"
       generate_sink_sql = true
       database = "ORCL"
       table = "DATA"
     }
   }
   ```
   
   
   ### Running Command
   
   ```yaml
   apiVersion: "sparkoperator.k8s.io/v1beta2"
   kind: SparkApplication
   metadata:
     name: spark-seatunnel
     namespace: spark-operator
   spec:
     type: Scala
     mode: cluster
     image: "spark3.3.0-seatunnal2.3.5:v3"
     mainClass: org.apache.seatunnel.core.starter.spark.SeaTunnelSpark
     arguments: ["--config", "/data/v2.oracle.conf"]
     mainApplicationFile: 
"local:///opt/seatunnel/starter/seatunnel-spark-3-starter.jar"
     imagePullPolicy: Always
     sparkVersion: "3.3.0"
     restartPolicy:
       type: Never
     volumes:
       - name: seatunnel-oracle
         configMap:
           name: seatunnel-oracle
           items:
             - key: v2.oracle.conf
               path: v2.oracle.conf
     deps:
       jars: 
["local:///opt/seatunnel/lib/ojdbc8-23.4.0.24.05.jar","local:///opt/seatunnel/lib/seatunnel-hadoop3-3.1.4-uber.jar","local:///opt/seatunnel/lib/seatunnel-transforms-v2.jar","local:///opt/seatunnel/connectors/connector-console-2.3.5.jar","local:///opt/seatunnel/connectors/connector-fake-2.3.5.jar"]
     driver:
       cores: 1
       coreLimit: "1200m"
       memory: "512m"
       labels:
         version: 3.3.0
       serviceAccount: spark-release-spark-operator
       env:
         - name: SPARK_EXECUTOR_DIRS
           value: '/opt/spark/work-dir'
         - name: SEATUNNEL_HOME
           value: "/opt/seatunnel"
       volumeMounts:
         - name: seatunnel-oracle
           mountPath: /data/v2.oracle.conf
           subPath: v2.oracle.conf
     executor:
       cores: 1
       instances: 1
       memory: "512m"
       env:
         - name: SPARK_OPTS
           value: "--conf spark.files.useFetchCache=false"
         - name: SPARK_EXECUTOR_DIRS
           value: '/opt/spark/work-dir'
         - name: SPARK_HOME
           value: '/opt/spark'
       labels:
         version: 3.3.0
       volumeMounts:
         - name: seatunnel-oracle
           mountPath: /data/v2.oracle.conf
           subPath: v2.oracle.conf
   ```
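   The log shows the executor copying the fetched jar into its current working directory (`/opt/./connector-console-2.3.5.jar`). As a hedged diagnostic, a small check run inside the executor image would confirm whether that directory is writable by the container user; this is a hypothetical sketch, not part of SeaTunnel or the operator:

   ```java
   import java.nio.file.Files;
   import java.nio.file.Path;
   import java.nio.file.Paths;

   // Hypothetical diagnostic (not SeaTunnel/Spark code): Spark executors fetch
   // dependency jars into their current working directory, so that directory
   // must be writable by the container user. Running this inside the executor
   // image shows whether the working directory (/opt in the log) is writable.
   public class WorkDirCheck {
       public static void main(String[] args) {
           Path cwd = Paths.get("").toAbsolutePath();
           System.out.println("working dir: " + cwd);
           System.out.println("writable:    " + Files.isWritable(cwd));
       }
   }
   ```

   The manifest above already sets `SPARK_EXECUTOR_DIRS=/opt/spark/work-dir`; if this check prints `writable: false` for `/opt`, the thing to verify is whether the image's `WORKDIR` (rather than that environment variable) determines where the jar is copied.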
   
   
   ### Error Exception
   
   ```log
   24/06/14 02:45:56 INFO Executor: Starting executor with user classpath 
(userClassPathFirst = false): ''
   24/06/14 02:45:56 INFO Executor: Fetching 
file:/opt/seatunnel/connectors/connector-console-2.3.5.jar with timestamp 
1718333109675
   24/06/14 02:45:56 INFO Utils: Copying 
/opt/seatunnel/connectors/connector-console-2.3.5.jar to 
/opt/spark/work-dir/spark-28920bb8-5a06-4d88-83dd-d6068515f887/3768210091718333109675_cache
   24/06/14 02:45:56 INFO Utils: Copying 
/opt/spark/work-dir/spark-28920bb8-5a06-4d88-83dd-d6068515f887/3768210091718333109675_cache
 to /opt/./connector-console-2.3.5.jar
   24/06/14 02:45:56 ERROR CoarseGrainedExecutorBackend: Executor self-exiting 
due to : Unable to create executor due to ./connector-console-2.3.5.jar
   java.nio.file.AccessDeniedException: ./connector-console-2.3.5.jar
           at java.base/sun.nio.fs.UnixException.translateToIOException(Unknown 
Source)
           at java.base/sun.nio.fs.UnixException.rethrowAsIOException(Unknown 
Source)
           at java.base/sun.nio.fs.UnixException.rethrowAsIOException(Unknown 
Source)
           at java.base/sun.nio.fs.UnixCopyFile.copyFile(Unknown Source)
           at java.base/sun.nio.fs.UnixCopyFile.copy(Unknown Source)
           at java.base/sun.nio.fs.UnixFileSystemProvider.copy(Unknown Source)
           at java.base/java.nio.file.Files.copy(Unknown Source)
           at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:771)
           at org.apache.spark.util.Utils$.copyFile(Utils.scala:742)
           at org.apache.spark.util.Utils$.fetchFile(Utils.scala:550)
           at 
org.apache.spark.executor.Executor.$anonfun$updateDependencies$13(Executor.scala:1010)
           at 
org.apache.spark.executor.Executor.$anonfun$updateDependencies$13$adapted(Executor.scala:1002)
           at 
scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:985)
           at 
scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
           at 
scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
           at 
scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
           at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
           at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
           at 
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:984)
           at 
org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:1002)
           at org.apache.spark.executor.Executor.<init>(Executor.scala:273)
           at 
org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receive$1.applyOrElse(CoarseGrainedExecutorBackend.scala:171)
           at 
org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:115)
           at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
           at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
           at 
org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
           at 
org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
           at 
java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
           at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
           at 
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
           at 
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
           at java.base/java.lang.Thread.run(Unknown Source)
   ```
   
   
   ### Zeta or Flink or Spark Version
   
   Spark 3.3.0
   
   ### Java or Scala Version
   
   openjdk 11
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.