leo65535 commented on pull request #885:
URL:
https://github.com/apache/incubator-seatunnel/pull/885#issuecomment-1002481893
Everything works OK. SeaTunnel version: dev
```
[dcadmin@dcadmin seatunnel-dist-2.0.5-SNAPSHOT-2.11.8]$ ./bin/start-seatunnel-spark.sh -c config/application.conf -e client -m local -i app=test -t
[INFO] spark conf: --conf "spark.app.name=seatunnel"
Warning: Ignoring non-spark config property: "spark.app.name=seatunnel"
2021-12-29 15:59:34 WARN NativeCodeLoader:62 - Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
2021-12-29 15:59:34 INFO ConfigBuilder:78 - Loading config file:
config/application.conf
2021-12-29 15:59:34 INFO ConfigBuilder:89 - parsed config file: {
"env" : {
"spark.app.name" : "seatunnel"
},
"source" : [
{
"result_table_name" : "my_dataset",
"plugin_name" : "Fake"
}
],
"transform" : [],
"sink" : [
{
"plugin_name" : "Console"
}
]
}
2021-12-29 15:59:34 INFO SparkContext:54 - Running Spark version 2.4.0
2021-12-29 15:59:34 INFO SparkContext:54 - Submitted application: seatunnel
2021-12-29 15:59:34 INFO SecurityManager:54 - Changing view acls to: dcadmin
2021-12-29 15:59:34 INFO SecurityManager:54 - Changing modify acls to:
dcadmin
2021-12-29 15:59:34 INFO SecurityManager:54 - Changing view acls groups to:
2021-12-29 15:59:34 INFO SecurityManager:54 - Changing modify acls groups
to:
2021-12-29 15:59:34 INFO SecurityManager:54 - SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(dcadmin); groups with view permissions: Set(); users with modify
permissions: Set(dcadmin); groups with modify permissions: Set()
2021-12-29 15:59:34 INFO Utils:54 - Successfully started service
'sparkDriver' on port 42057.
2021-12-29 15:59:34 INFO SparkEnv:54 - Registering MapOutputTracker
2021-12-29 15:59:34 INFO SparkEnv:54 - Registering BlockManagerMaster
2021-12-29 15:59:34 INFO BlockManagerMasterEndpoint:54 - Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2021-12-29 15:59:34 INFO BlockManagerMasterEndpoint:54 -
BlockManagerMasterEndpoint up
2021-12-29 15:59:34 INFO DiskBlockManager:54 - Created local directory at
/tmp/blockmgr-4e1490d8-b4ae-498c-a5cf-b12c73e844ea
2021-12-29 15:59:34 INFO MemoryStore:54 - MemoryStore started with capacity
366.3 MB
2021-12-29 15:59:34 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2021-12-29 15:59:34 INFO log:192 - Logging initialized @4579ms
2021-12-29 15:59:35 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build
timestamp: unknown, git hash: unknown
2021-12-29 15:59:35 INFO Server:419 - Started @4726ms
2021-12-29 15:59:35 INFO AbstractConnector:278 - Started
ServerConnector@72ed9aad{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2021-12-29 15:59:35 INFO Utils:54 - Successfully started service 'SparkUI'
on port 4040.
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@50825a02{/jobs,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@5f84abe8{/jobs/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4650a407{/jobs/job,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6a4d7f76{/jobs/job/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@10ec523c{/stages,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@53dfacba{/stages/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@79767781{/stages/stage,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@245a060f{/stages/stage/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6edaa77a{/stages/pool,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@1e63d216{/stages/pool/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@62ddd21b{/storage,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@16c3ca31{/storage/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2d195ee4{/storage/rdd,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@2d6aca33{/storage/rdd/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@21ab988f{/environment,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@29314cc9{/environment/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4e38d975{/executors,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@35f8a9d3{/executors/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@48ea2003{/executors/threadDump,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6b1e7ad3{/executors/threadDump/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@63e5e5b4{/static,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@6a969fb8{/,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@7a18e8d{/api,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@bb095{/jobs/job/kill,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@777c350f{/stages/stage/kill,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started
at http://dcadmin.work:4040
2021-12-29 15:59:35 INFO SparkContext:54 - Added JAR
file:///work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-flink.jar
at spark://dcadmin.work:42057/jars/seatunnel-core-flink.jar with timestamp
1640764775331
2021-12-29 15:59:35 INFO SparkContext:54 - Added JAR
file:///work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-spark.jar
at spark://dcadmin.work:42057/jars/seatunnel-core-spark.jar with timestamp
1640764775331
2021-12-29 15:59:35 INFO SparkContext:54 - Added JAR
file:///work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-sql-2.0.5-SNAPSHOT-2.11.8.jar
at
spark://dcadmin.work:42057/jars/seatunnel-core-sql-2.0.5-SNAPSHOT-2.11.8.jar
with timestamp 1640764775332
2021-12-29 15:59:35 WARN SparkContext:66 - The jar
file:/work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/lib/seatunnel-core-spark.jar
has been added already. Overwriting of added jars is not supported in the
current version.
2021-12-29 15:59:35 INFO Executor:54 - Starting executor ID driver on host
localhost
2021-12-29 15:59:35 INFO Utils:54 - Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 44739.
2021-12-29 15:59:35 INFO NettyBlockTransferService:54 - Server created on
dcadmin.work:44739
2021-12-29 15:59:35 INFO BlockManager:54 - Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy
2021-12-29 15:59:35 INFO BlockManagerMaster:54 - Registering BlockManager
BlockManagerId(driver, dcadmin.work, 44739, None)
2021-12-29 15:59:35 INFO BlockManagerMasterEndpoint:54 - Registering block
manager dcadmin.work:44739 with 366.3 MB RAM, BlockManagerId(driver,
dcadmin.work, 44739, None)
2021-12-29 15:59:35 INFO BlockManagerMaster:54 - Registered BlockManager
BlockManagerId(driver, dcadmin.work, 44739, None)
2021-12-29 15:59:35 INFO BlockManager:54 - Initialized BlockManager:
BlockManagerId(driver, dcadmin.work, 44739, None)
2021-12-29 15:59:35 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@3f1ddac2{/metrics/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:35 WARN StreamingContext:66 - spark.master should be set
as local[n], n > 1 in local mode if you have receivers to get data, otherwise
Spark jobs will not get resources to process the received data.
2021-12-29 15:59:36 INFO AsciiArtUtils:57 - [seatunnel ASCII-art banner, mangled by line wrapping, elided]
2021-12-29 15:59:37 INFO SharedState:54 - Setting
hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir
('file:/work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/spark-warehouse').
2021-12-29 15:59:37 INFO SharedState:54 - Warehouse path is
'file:/work/projects/opensource/seatunnel/seatunnel-dist/seatunnel-dist-2.0.5-SNAPSHOT-2.11.8/spark-warehouse'.
2021-12-29 15:59:37 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@3fde8f7c{/SQL,null,AVAILABLE,@Spark}
2021-12-29 15:59:37 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@11d86b9d{/SQL/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:37 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@800d065{/SQL/execution,null,AVAILABLE,@Spark}
2021-12-29 15:59:37 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@691124ee{/SQL/execution/json,null,AVAILABLE,@Spark}
2021-12-29 15:59:37 INFO ContextHandler:781 - Started
o.s.j.s.ServletContextHandler@4293e066{/static/sql,null,AVAILABLE,@Spark}
2021-12-29 15:59:38 INFO StateStoreCoordinatorRef:54 - Registered
StateStoreCoordinator endpoint
2021-12-29 15:59:38 INFO CodeGenerator:54 - Code generated in 186.677462 ms
2021-12-29 15:59:39 INFO CodeGenerator:54 - Code generated in 9.829865 ms
2021-12-29 15:59:39 INFO CodeGenerator:54 - Code generated in 9.030868 ms
+------------------+
|raw_message |
+------------------+
|Hello garyelephant|
|Hello rickyhuo |
|Hello kid-xiong |
+------------------+
2021-12-29 15:59:39 INFO SparkContext:54 - Invoking stop() from shutdown
hook
2021-12-29 15:59:39 INFO AbstractConnector:318 - Stopped
Spark@72ed9aad{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2021-12-29 15:59:39 INFO SparkUI:54 - Stopped Spark web UI at
http://dcadmin.work:4040
2021-12-29 15:59:39 INFO MapOutputTrackerMasterEndpoint:54 -
MapOutputTrackerMasterEndpoint stopped!
2021-12-29 15:59:39 INFO MemoryStore:54 - MemoryStore cleared
2021-12-29 15:59:39 INFO BlockManager:54 - BlockManager stopped
2021-12-29 15:59:39 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2021-12-29 15:59:39 INFO
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 -
OutputCommitCoordinator stopped!
2021-12-29 15:59:39 INFO SparkContext:54 - Successfully stopped SparkContext
2021-12-29 15:59:39 INFO ShutdownHookManager:54 - Shutdown hook called
2021-12-29 15:59:39 INFO ShutdownHookManager:54 - Deleting directory
/tmp/spark-f25a7304-fcb9-4bb3-8d39-63b2f8b737d8
2021-12-29 15:59:39 INFO ShutdownHookManager:54 - Deleting directory
/tmp/spark-c5ba55c4-8635-4a5a-9d18-781eb8e78f6b
```
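
For reference, the parsed config printed by `ConfigBuilder` above corresponds to an `application.conf` roughly like the following. This is a minimal sketch in SeaTunnel's HOCON config format, reconstructed from the parsed JSON in the log (a `Fake` source feeding a `Console` sink, no transforms); option names beyond what the log shows are not guaranteed:

```hocon
env {
  # Becomes the Spark application name
  spark.app.name = "seatunnel"
}

source {
  # Generates sample rows; registered as a temp table named my_dataset
  Fake {
    result_table_name = "my_dataset"
  }
}

transform {
  # empty: rows pass through unchanged
}

sink {
  # Prints the dataset to stdout, producing the raw_message table seen above
  Console {}
}
```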