vegastar002 opened a new issue, #1773:
URL: https://github.com/apache/incubator-seatunnel/issues/1773
### version
apache-seatunnel-incubating-2.1.1
### my.conf
path: `/opt/apache-seatunnel-incubating-2.1.1/config/my.conf`
```hocon
env {
  spark.app.name = "SeaTunnel"
  spark.executor.instances = 2
  spark.executor.cores = 1
  spark.executor.memory = "1g"
}

source {
  jdbc {
    driver = "oracle.jdbc.driver.OracleDriver"
    url = "jdbc:oracle:thin://192.168.9.26:1521/dwe"
    table = "TEST_DW"
    result_table_name = "TEST_DW_log"
    user = "123"
    password = "123"
  }
}

transform {
  sql {
    sql = "SELECT BH, DM, DWLB, JC, QC FROM DSECS.TEST_DW"
  }
}

sink {
  clickhouse {
    host = "192.168.9.103:8123"
    clickhouse.socket_timeout = 50000
    database = "default"
    table = "dw"
    fields = ["BH", "DM", "DWLB", "JC", "QC"]
    username = "root"
    password = "1234567"
    bulk_size = 20000
  }
}
```
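A side note on the `source` block above: the Oracle thin driver normally expects an `@` before the host part of the URL (`jdbc:oracle:thin:@//host:port/service_name`), so the URL as written may be rejected once the job gets as far as opening a connection. A possibly corrected form, assuming `dwe` is the Oracle service name:

```hocon
source {
  jdbc {
    driver = "oracle.jdbc.driver.OracleDriver"
    # thin-driver URL form: jdbc:oracle:thin:@//host:port/service_name
    url = "jdbc:oracle:thin:@//192.168.9.26:1521/dwe"
    table = "TEST_DW"
    result_table_name = "TEST_DW_log"
    user = "123"
    password = "123"
  }
}
```

This is unrelated to the plugin-loading failure below, but worth checking before the next run.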
### step
Run the command:
```shell
./bin/start-seatunnel-spark.sh --master local[4] --deploy-mode client --config ./config/my.conf
```
### error log
```
log4j:WARN No appenders could be found for logger (org.apache.seatunnel.config.ConfigBuilder).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/04/28 17:29:42 INFO SparkContext: Running Spark version 3.2.1
22/04/28 17:29:42 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/04/28 17:29:42 INFO ResourceUtils: ==============================================================
22/04/28 17:29:42 INFO ResourceUtils: No custom resources configured for spark.driver.
22/04/28 17:29:42 INFO ResourceUtils: ==============================================================
22/04/28 17:29:42 INFO SparkContext: Submitted application: SeaTunnel
22/04/28 17:29:42 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
22/04/28 17:29:42 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor
22/04/28 17:29:42 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/04/28 17:29:42 INFO SecurityManager: Changing view acls to: root
22/04/28 17:29:42 INFO SecurityManager: Changing modify acls to: root
22/04/28 17:29:42 INFO SecurityManager: Changing view acls groups to:
22/04/28 17:29:42 INFO SecurityManager: Changing modify acls groups to:
22/04/28 17:29:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
22/04/28 17:29:42 INFO Utils: Successfully started service 'sparkDriver' on port 40462.
22/04/28 17:29:42 INFO SparkEnv: Registering MapOutputTracker
22/04/28 17:29:42 INFO SparkEnv: Registering BlockManagerMaster
22/04/28 17:29:42 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/04/28 17:29:42 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/04/28 17:29:43 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/04/28 17:29:43 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-dff1a217-f3f1-48a6-8488-b4773b843f97
22/04/28 17:29:43 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
22/04/28 17:29:43 INFO SparkEnv: Registering OutputCommitCoordinator
22/04/28 17:29:43 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/04/28 17:29:43 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://keep1:4040
22/04/28 17:29:43 INFO SparkContext: Added JAR file:/opt/apache-seatunnel-incubating-2.1.1/lib/seatunnel-core-spark.jar at spark://keep1:40462/jars/seatunnel-core-spark.jar with timestamp 1651138182159
22/04/28 17:29:43 INFO Executor: Starting executor ID driver on host keep1
22/04/28 17:29:43 INFO Executor: Fetching spark://keep1:40462/jars/seatunnel-core-spark.jar with timestamp 1651138182159
22/04/28 17:29:43 INFO TransportClientFactory: Successfully created connection to keep1/192.168.217.136:40462 after 43 ms (0 ms spent in bootstraps)
22/04/28 17:29:43 INFO Utils: Fetching spark://keep1:40462/jars/seatunnel-core-spark.jar to /tmp/spark-94672d91-7180-4df1-ac9f-8df06181f546/userFiles-72fb88c5-b975-477e-ae26-79e0d3bc2c15/fetchFileTemp1469917794103960589.tmp
22/04/28 17:29:44 INFO Executor: Adding file:/tmp/spark-94672d91-7180-4df1-ac9f-8df06181f546/userFiles-72fb88c5-b975-477e-ae26-79e0d3bc2c15/seatunnel-core-spark.jar to class loader
22/04/28 17:29:44 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33762.
22/04/28 17:29:44 INFO NettyBlockTransferService: Server created on keep1:33762
22/04/28 17:29:44 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/04/28 17:29:44 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, keep1, 33762, None)
22/04/28 17:29:44 INFO BlockManagerMasterEndpoint: Registering block manager keep1:33762 with 366.3 MiB RAM, BlockManagerId(driver, keep1, 33762, None)
22/04/28 17:29:44 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, keep1, 33762, None)
22/04/28 17:29:44 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, keep1, 33762, None)
22/04/28 17:29:44 WARN PluginFactory: Error when load plugin: [clickhouse]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.kafka.sink.Kafka could not be instantiated
	at java.util.ServiceLoader.fail(ServiceLoader.java:232)
	at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.seatunnel.config.PluginFactory.createPluginInstanceIgnoreCase(PluginFactory.java:122)
	at org.apache.seatunnel.config.PluginFactory.lambda$createPlugins$0(PluginFactory.java:92)
	at java.util.ArrayList.forEach(ArrayList.java:1257)
	at org.apache.seatunnel.config.PluginFactory.createPlugins(PluginFactory.java:90)
	at org.apache.seatunnel.config.ExecutionContext.<init>(ExecutionContext.java:54)
	at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:44)
	at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
	at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
	at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
	at org.apache.seatunnel.spark.kafka.sink.Kafka.<init>(Kafka.scala:31)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at java.lang.Class.newInstance(Class.java:442)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
	... 23 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 30 more
22/04/28 17:29:44 WARN PluginFactory: Error when load plugin: [clickhouse]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.hive.sink.Hive could not be instantiated
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
	at org.apache.seatunnel.spark.hive.sink.Hive.<init>(Hive.scala:29)
	[remaining stack trace identical to the Kafka provider above]
22/04/28 17:29:44 WARN PluginFactory: Error when load plugin: [clickhouse]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.phoenix.sink.Phoenix could not be instantiated
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
	at org.apache.seatunnel.spark.phoenix.sink.Phoenix.<init>(Phoenix.scala:29)
	[remaining stack trace identical to the Kafka provider above]
22/04/28 17:29:44 WARN PluginFactory: Error when load plugin: [clickhouse]
java.util.ServiceConfigurationError: org.apache.seatunnel.spark.BaseSparkSink: Provider org.apache.seatunnel.spark.redis.sink.Redis could not be instantiated
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
	at org.apache.seatunnel.spark.redis.sink.Redis.<init>(Redis.scala:33)
	[remaining stack trace identical to the Kafka provider above]
22/04/28 17:29:45 INFO ClickHouseDriver: Driver registered
22/04/28 17:29:45 INFO AsciiArtUtils: [SeaTunnel ASCII-art banner omitted]
22/04/28 17:29:45 INFO ExecutionFactory: current execution is [org.apache.seatunnel.spark.batch.SparkBatchExecution]
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3379)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3424)
	at org.apache.hadoop.fs.FsUrlStreamHandlerFactory.<init>(FsUrlStreamHandlerFactory.java:77)
	at org.apache.spark.sql.internal.SharedState$.liftedTree2$1(SharedState.scala:193)
	at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$setFsUrlStreamHandlerFactory(SharedState.scala:192)
	at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:54)
	at org.apache.spark.sql.SparkSession.$anonfun$sharedState$1(SparkSession.scala:139)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:139)
	at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:138)
	at org.apache.spark.sql.SparkSession.$anonfun$sessionState$2(SparkSession.scala:158)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:156)
	at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:153)
	at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:732)
	at org.apache.spark.sql.SparkSession.read(SparkSession.scala:658)
	at org.apache.seatunnel.spark.jdbc.source.Jdbc.jdbcReader(Jdbc.scala:40)
	at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:31)
	at org.apache.seatunnel.spark.jdbc.source.Jdbc.getData(Jdbc.scala:28)
	at org.apache.seatunnel.spark.SparkEnvironment.registerInputTempView(SparkEnvironment.java:126)
	at org.apache.seatunnel.spark.batch.SparkBatchExecution.lambda$start$0(SparkBatchExecution.java:45)
	at java.util.ArrayList.forEach(ArrayList.java:1257)
	at org.apache.seatunnel.spark.batch.SparkBatchExecution.start(SparkBatchExecution.java:45)
	at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:58)
	at org.apache.seatunnel.command.spark.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:36)
	at org.apache.seatunnel.Seatunnel.run(Seatunnel.java:48)
	at org.apache.seatunnel.SeatunnelSpark.main(SeatunnelSpark.java:27)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
22/04/28 17:29:45 INFO SparkContext: Invoking stop() from shutdown hook
22/04/28 17:29:45 INFO SparkUI: Stopped Spark web UI at http://keep1:4040
22/04/28 17:29:45 INFO MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!
22/04/28 17:29:45 INFO MemoryStore: MemoryStore cleared
22/04/28 17:29:45 INFO BlockManager: BlockManager stopped
22/04/28 17:29:45 INFO BlockManagerMaster: BlockManagerMaster stopped
22/04/28 17:29:45 INFO
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!
22/04/28 17:29:45 INFO SparkContext: Successfully stopped SparkContext
22/04/28 17:29:45 INFO ShutdownHookManager: Shutdown hook called
22/04/28 17:29:45 INFO ShutdownHookManager: Deleting directory
/tmp/spark-18585c84-29c0-4683-bc20-e397119439a1
22/04/28 17:29:45 INFO ShutdownHookManager: Deleting directory
/tmp/spark-94672d91-7180-4df1-ac9f-8df06181f546
```
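For context on the repeated `NoClassDefFoundError: org/apache/spark/internal/Logging$class`: classes named `SomeTrait$class` are the static trait-implementation classes that the Scala 2.11 compiler emits; Scala 2.12, which Spark 3.2.1 is built against, no longer generates them. This error pattern therefore usually means connector jars compiled for a Scala 2.11 / Spark 2.x build were loaded into a Scala 2.12 runtime. Since Scala artifacts encode their target Scala line in the jar name, the jar filenames give a quick sanity check. A minimal sketch of that check (the filename below is a hypothetical example, not taken from the log):

```shell
# Extract the Scala cross-version suffix (e.g. 2.11) from an sbt/maven-style
# artifact name. Jars built for Scala 2.11 cannot satisfy Spark 3.x (Scala 2.12) traits.
jar_name="seatunnel-connector-spark-kafka_2.11-2.1.1.jar"   # hypothetical filename
scala_line=$(echo "$jar_name" | sed -n 's/.*_\(2\.1[0-9]*\)[.-].*/\1/p')
echo "built for Scala ${scala_line:-unknown}"
```

If a check like this shows 2.11 while `spark-submit --version` reports Scala 2.12.x, running the job on a Spark 2.4.x distribution (or on a SeaTunnel build that targets Spark 3) would be the direction to investigate.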