vegastar002 opened a new issue, #1858:
URL: https://github.com/apache/incubator-seatunnel/issues/1858

   ### Versions
   apache-seatunnel-incubating-2.1.1
   spark-2.4.0-bin-hadoop2.6
   
   ### my.conf
   Path: /opt/apache-seatunnel-incubating-2.1.1/config/my.conf
   
   ```hocon
   env {
     spark.app.name = "SeaTunnel"
     spark.executor.instances = 2
     spark.executor.cores = 1
     spark.executor.memory = "1g"
   }
   
   
   source {
        jdbc {
                driver = "oracle.jdbc.driver.OracleDriver"
                url = "jdbc:oracle:thin://192.168.9.26:1521/dwe"
                table = "TEST_DW"
                result_table_name = "TEST_DW_log"
                user = "123"
                password = "123"
        }
   
   }
   
   
   transform {
        sql {
            sql = "SELECT BH , DM , DWLB , JC ,QC  FROM DSECS.TEST_DW "
        }
   }
   
   
   sink {
        clickhouse {
                host = "192.168.9.103:8123"
                clickhouse.socket_timeout = 50000
                database = "default"
                table = "dw"
                fields = ["BH", "DM", "DWLB", "JC", "QC" ]
                username = "root"
                password = "1234567"
                bulk_size = 20000
        }
   }
   ```
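   
   An aside on the `url` value: the Oracle thin driver documents its JDBC URL with an `@` separator, so even once the driver class is found, a URL of roughly the following shape may be needed (host, port, and service name copied from the config above; this is a sketch, not a verified fix):
   
   ```hocon
   source {
        jdbc {
                driver = "oracle.jdbc.driver.OracleDriver"
                # documented thin-driver forms:
                #   jdbc:oracle:thin:@//<host>:<port>/<service_name>
                #   jdbc:oracle:thin:@<host>:<port>:<SID>
                url = "jdbc:oracle:thin:@//192.168.9.26:1521/dwe"
        }
   }
   ```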
   
   ### Steps
   Run the command:
   ```shell
   ./bin/start-seatunnel-spark.sh --master local[4] --deploy-mode client --config ./config/my.conf
   ```
   
   The error report:
   ```
   2022-05-12 10:47:03 WARN  NativeCodeLoader:62 - Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
   2022-05-12 10:47:04 INFO  ConfigBuilder:59 - Loading config file: 
./config/my.conf
   2022-05-12 10:47:04 INFO  ConfigBuilder:70 - parsed config file: {
       "env" : {
           "spark.app.name" : "SeaTunnel",
           "spark.executor.instances" : 2,
           "spark.executor.cores" : 1,
           "spark.executor.memory" : "1g"
       },
       "source" : [
           {
               "password" : "dsecs",
               "driver" : "oracle.jdbc.driver.OracleDriver",
               "result_table_name" : "TEST_DW_log",
               "plugin_name" : "jdbc",
               "user" : "dsecs",
               "url" : "jdbc:oracle:thin://192.168.9.236:1521/DSECS",
               "table" : "TEST_DW"
           }
       ],
       "transform" : [
           {
               "plugin_name" : "sql",
               "sql" : "SELECT BH , DM , DWLB , JC ,QC  FROM DSECS.TEST_DW "
           }
       ],
       "sink" : [
           {
               "database" : "default",
               "password" : "1234567",
               "clickhouse.socket_timeout" : 50000,
               "host" : "192.168.9.103:8123",
               "bulk_size" : 20000,
               "fields" : [
                   "BH",
                   "DM",
                   "DWLB",
                   "JC",
                   "QC"
               ],
               "plugin_name" : "clickhouse",
               "table" : "dw",
               "username" : "root"
           }
       ]
   }
   
   2022-05-12 10:47:04 INFO  SparkContext:54 - Running Spark version 2.4.0
   2022-05-12 10:47:04 INFO  SparkContext:54 - Submitted application: SeaTunnel
   2022-05-12 10:47:04 INFO  SecurityManager:54 - Changing view acls to: root
   2022-05-12 10:47:04 INFO  SecurityManager:54 - Changing modify acls to: root
   2022-05-12 10:47:04 INFO  SecurityManager:54 - Changing view acls groups to: 
   2022-05-12 10:47:04 INFO  SecurityManager:54 - Changing modify acls groups 
to: 
   2022-05-12 10:47:04 INFO  SecurityManager:54 - SecurityManager: 
authentication disabled; ui acls disabled; users  with view permissions: 
Set(root); groups with view permissions: Set(); users  with modify permissions: 
Set(root); groups with modify permissions: Set()
   2022-05-12 10:47:05 INFO  Utils:54 - Successfully started service 
'sparkDriver' on port 46513.
   2022-05-12 10:47:05 INFO  SparkEnv:54 - Registering MapOutputTracker
   2022-05-12 10:47:05 INFO  SparkEnv:54 - Registering BlockManagerMaster
   2022-05-12 10:47:05 INFO  BlockManagerMasterEndpoint:54 - Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
   2022-05-12 10:47:05 INFO  BlockManagerMasterEndpoint:54 - 
BlockManagerMasterEndpoint up
   2022-05-12 10:47:05 INFO  DiskBlockManager:54 - Created local directory at 
/tmp/blockmgr-02b28f86-bfbc-468d-8daa-bee1e2dd2528
   2022-05-12 10:47:05 INFO  MemoryStore:54 - MemoryStore started with capacity 
366.3 MB
   2022-05-12 10:47:05 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
   2022-05-12 10:47:05 INFO  log:192 - Logging initialized @4054ms
   2022-05-12 10:47:05 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build 
timestamp: unknown, git hash: unknown
   2022-05-12 10:47:05 INFO  Server:419 - Started @4204ms
   2022-05-12 10:47:05 INFO  AbstractConnector:278 - Started 
ServerConnector@194152cf{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
   2022-05-12 10:47:05 INFO  Utils:54 - Successfully started service 'SparkUI' 
on port 4040.
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@4604b900{/jobs,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@c827db{/jobs/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@377c68c6{/jobs/job,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@238ad8c{/jobs/job/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@430fa4ef{/stages,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1761de10{/stages/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@22df874e{/stages/stage,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@42d236fb{/stages/stage/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1ce93c18{/stages/pool,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@19f21b6b{/stages/pool/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1532c619{/storage,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@46044faa{/storage/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1358b28e{/storage/rdd,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1a78dacd{/storage/rdd/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@19f9d595{/environment,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@7de4a01f{/environment/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@2bfeb1ef{/executors,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@778ca8ef{/executors/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@208e9ef6{/executors/threadDump,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@78b236a0{/executors/threadDump/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@261d8190{/static,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@56f2bbea{/,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@78f9ed3e{/api,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@7f4037ed{/jobs/job/kill,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@24e8de5c{/stages/stage/kill,null,AVAILABLE,@Spark}
   2022-05-12 10:47:05 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started 
at http://keep1:4040
   2022-05-12 10:47:05 INFO  SparkContext:54 - Added JAR 
file:/opt/apache-seatunnel-incubating-2.1.1/lib/seatunnel-core-spark.jar at 
spark://keep1:46513/jars/seatunnel-core-spark.jar with timestamp 1652323625721
   2022-05-12 10:47:05 INFO  Executor:54 - Starting executor ID driver on host 
localhost
   2022-05-12 10:47:06 INFO  Utils:54 - Successfully started service 
'org.apache.spark.network.netty.NettyBlockTransferService' on port 44944.
   2022-05-12 10:47:06 INFO  NettyBlockTransferService:54 - Server created on 
keep1:44944
   2022-05-12 10:47:06 INFO  BlockManager:54 - Using 
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication 
policy
   2022-05-12 10:47:06 INFO  BlockManagerMaster:54 - Registering BlockManager 
BlockManagerId(driver, keep1, 44944, None)
   2022-05-12 10:47:06 INFO  BlockManagerMasterEndpoint:54 - Registering block 
manager keep1:44944 with 366.3 MB RAM, BlockManagerId(driver, keep1, 44944, 
None)
   2022-05-12 10:47:06 INFO  BlockManagerMaster:54 - Registered BlockManager 
BlockManagerId(driver, keep1, 44944, None)
   2022-05-12 10:47:06 INFO  BlockManager:54 - Initialized BlockManager: 
BlockManagerId(driver, keep1, 44944, None)
   2022-05-12 10:47:06 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@16c8b7bd{/metrics/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:07 INFO  ClickHouseDriver:42 - Driver registered
   2022-05-12 10:47:07 INFO  AsciiArtUtils:69 - [SeaTunnel ASCII-art startup banner, mangled by email line-wrapping; elided]
   2022-05-12 10:47:08 INFO  ExecutionFactory:80 - current execution is 
[org.apache.seatunnel.spark.batch.SparkBatchExecution]
   2022-05-12 10:47:08 INFO  SharedState:54 - Setting 
hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir 
('file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse').
   2022-05-12 10:47:08 INFO  SharedState:54 - Warehouse path is 
'file:/opt/apache-seatunnel-incubating-2.1.1/spark-warehouse'.
   2022-05-12 10:47:08 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@622fdb81{/SQL,null,AVAILABLE,@Spark}
   2022-05-12 10:47:08 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@1f3165e7{/SQL/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:08 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@15b82644{/SQL/execution,null,AVAILABLE,@Spark}
   2022-05-12 10:47:08 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@20576557{/SQL/execution/json,null,AVAILABLE,@Spark}
   2022-05-12 10:47:08 INFO  ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@3b1ed14b{/static/sql,null,AVAILABLE,@Spark}
   2022-05-12 10:47:09 INFO  StateStoreCoordinatorRef:54 - Registered 
StateStoreCoordinator endpoint
   2022-05-12 10:47:09 INFO  Version:133 - Elasticsearch Hadoop v6.8.3 
[8a5f44bf7d]
   2022-05-12 10:47:09 ERROR Seatunnel:69 - 
   
   
===============================================================================
   
   
   2022-05-12 10:47:09 ERROR Seatunnel:72 - Fatal Error, 
   
   2022-05-12 10:47:09 ERROR Seatunnel:74 - Please submit bug report in 
https://github.com/apache/incubator-seatunnel/issues
   
   2022-05-12 10:47:09 ERROR Seatunnel:76 - Reason:Execute Spark task error 
   
   2022-05-12 10:47:09 ERROR Seatunnel:77 - Exception 
StackTrace:java.lang.RuntimeException: Execute Spark task error
        at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
        at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at 
org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
        at scala.Option.foreach(Option.scala:257)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:99)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
        at 
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
        at 
org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
        at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:31)
        at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:28)
        at 
org.apache.seatunnel.spark.SparkEnvironment.registerInputTempView(SparkEnvironment.java:126)
        at 
org.apache.seatunnel.spark.batch.SparkBatchExecution.lambda$start$0(SparkBatchExecution.java:45)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at 
org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:45)
        at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
        ... 15 more
    
   2022-05-12 10:47:09 ERROR Seatunnel:78 - 
   
===============================================================================
   
   
   
   Exception in thread "main" java.lang.RuntimeException: Execute Spark task 
error
        at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:61)
        at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
        at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
        at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at 
org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$5.apply(JDBCOptions.scala:99)
        at scala.Option.foreach(Option.scala:257)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:99)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
        at 
org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
        at 
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
        at 
org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
        at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:31)
        at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:28)
        at 
org.apache.seatunnel.spark.SparkEnvironment.registerInputTempView(SparkEnvironment.java:126)
        at 
org.apache.seatunnel.spark.batch.SparkBatchExecution.lambda$start$0(SparkBatchExecution.java:45)
        at java.util.ArrayList.forEach(ArrayList.java:1257)
        at 
org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:45)
        at 
org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
        ... 15 more
   2022-05-12 10:47:09 INFO  SparkContext:54 - Invoking stop() from shutdown 
hook
   2022-05-12 10:47:09 INFO  AbstractConnector:318 - Stopped 
Spark@194152cf{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
   2022-05-12 10:47:09 INFO  SparkUI:54 - Stopped Spark web UI at 
http://keep1:4040
   2022-05-12 10:47:09 INFO  MapOutputTrackerMasterEndpoint:54 - 
MapOutputTrackerMasterEndpoint stopped!
   2022-05-12 10:47:09 INFO  MemoryStore:54 - MemoryStore cleared
   2022-05-12 10:47:09 INFO  BlockManager:54 - BlockManager stopped
   2022-05-12 10:47:09 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
   2022-05-12 10:47:09 INFO  
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - 
OutputCommitCoordinator stopped!
   2022-05-12 10:47:09 INFO  SparkContext:54 - Successfully stopped SparkContext
   2022-05-12 10:47:09 INFO  ShutdownHookManager:54 - Shutdown hook called
   2022-05-12 10:47:09 INFO  ShutdownHookManager:54 - Deleting directory 
/tmp/spark-3a41298b-5c74-45ad-84ea-01de872c8f51
   2022-05-12 10:47:09 INFO  ShutdownHookManager:54 - Deleting directory 
/tmp/spark-f7ee6d32-ec39-4433-979a-542dcf335132
   ```
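   
   The decisive line in the trace above is `Caused by: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver`: Spark's `DriverRegistry.register` loads the driver class by name before any connection is opened. Whether a driver class is visible to a given JVM can be checked in isolation with a snippet like this (the class name is the one from the config; `DriverCheck` itself is an illustrative helper, not part of SeaTunnel):
   
   ```java
   public class DriverCheck {
       /** Returns true if the named JDBC driver class can be loaded from the current classpath. */
       static boolean isDriverAvailable(String className) {
           try {
               Class.forName(className);
               return true;
           } catch (ClassNotFoundException e) {
               return false;
           }
       }
   
       public static void main(String[] args) {
           String driver = "oracle.jdbc.driver.OracleDriver";
           // Mirrors what Spark's DriverRegistry.register does before opening a connection
           System.out.println(driver + (isDriverAvailable(driver) ? " is" : " is NOT") + " on the classpath");
       }
   }
   ```
   
   Compiled and run with the ojdbc jar on `-cp`, this should print "is"; run against the same classpath Spark sees, it reproduces the failure above.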
   
   Then I copied ojdbc6-11.2.0.3.jar and ojdbc14-10.2.0.1.jar to
   /opt/apache-seatunnel-incubating-2.1.1/lib,
   
   but it still reports the same error.
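   
   A likely reason the copy did not help: in client mode Spark resolves the driver class with its own classloader, and jars in SeaTunnel's `lib` directory are not automatically on Spark's classpath. A sketch of two common workarounds, assuming `SPARK_HOME` points at the spark-2.4.0-bin-hadoop2.6 install and the jar path is illustrative:
   
   ```shell
   # Option 1: everything in $SPARK_HOME/jars is on the driver/executor classpath
   cp /path/to/ojdbc6-11.2.0.3.jar "$SPARK_HOME/jars/"
   
   # Option 2: pass the jar to spark-submit explicitly (whether the SeaTunnel
   # launch script forwards these flags should be checked against the 2.1.1 script)
   #   --jars /path/to/ojdbc6-11.2.0.3.jar
   #   --conf spark.driver.extraClassPath=/path/to/ojdbc6-11.2.0.3.jar
   ```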
   
   

