yaooqinn commented on PR #2669:
URL: https://github.com/apache/incubator-kyuubi/pull/2669#issuecomment-1128464534
> > ```
> > kill: (11656): No such process
> > - JpsApplicationOperation with spark local mode *** FAILED ***
> > response._1 was false Failed to terminate: 11656 org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class org.apache.kyuubi.engine.spark.SparkSQLEngine --conf spark.kyuubi.engine.operation.log.dir.root=target/engine_operation_logs --conf spark.kyuubi.session.idle.timeout=PT3M --conf spark.abc=f41c3b50-1025-49c4-b1a4-3f4e01e91345 --conf spark.kyuubi.testing=true --conf spark.kyuubi.frontend.bind.host=localhost --conf spark.master=local --conf spark.kyuubi.metrics.json.location=/home/runner/work/incubator-kyuubi/incubator-kyuubi/kyuubi-server/target/metrics --conf spark.kyuubi.operation.log.dir.root=target/server_operation_logs --proxy-user runner /home/runner/work/incubator-kyuubi/incubator-kyuubi/externals/kyuubi-spark-sql-engine/target/kyuubi-spark-sql-engine_2.12-1.6.0-SNAPSHOT.jar, due to Nonzero exit code: 1 (JpsApplicationOperationSuite.scala:87)
> > ```
>
> It seems that the process should not have terminated itself before being killed.
```
22/05/17 05:00:38 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
2022-05-17 05:00:38.915 INFO util.SignalRegister: Registering signal handler
for TERM
2022-05-17 05:00:38.917 INFO util.SignalRegister: Registering signal handler
for HUP
2022-05-17 05:00:38.917 INFO util.SignalRegister: Registering signal handler
for INT
2022-05-17 05:00:39.285 INFO conf.HiveConf: Found configuration file null
2022-05-17 05:00:39.543 INFO spark.SparkContext: Running Spark version 3.1.3
2022-05-17 05:00:39.621 INFO resource.ResourceUtils:
==============================================================
2022-05-17 05:00:39.621 INFO resource.ResourceUtils: No custom resources
configured for spark.driver.
2022-05-17 05:00:39.622 INFO resource.ResourceUtils:
==============================================================
2022-05-17 05:00:39.622 INFO spark.SparkContext: Submitted application:
org.apache.kyuubi.engine.spark.SparkSQLEngine
2022-05-17 05:00:39.680 INFO resource.ResourceProfile: Default
ResourceProfile created, executor resources: Map(cores -> name: cores, amount:
1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor:
, offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources:
Map(cpus -> name: cpus, amount: 1.0)
2022-05-17 05:00:39.683 INFO resource.ResourceProfile: Limiting resource is
cpu
2022-05-17 05:00:39.690 INFO resource.ResourceProfileManager: Added
ResourceProfile id: 0
2022-05-17 05:00:39.776 INFO spark.SecurityManager: Changing view acls to:
runner
2022-05-17 05:00:39.776 INFO spark.SecurityManager: Changing modify acls to:
runner
2022-05-17 05:00:39.776 INFO spark.SecurityManager: Changing view acls
groups to:
2022-05-17 05:00:39.777 INFO spark.SecurityManager: Changing modify acls
groups to:
2022-05-17 05:00:39.777 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(runner); groups with view permissions: Set(); users with modify
permissions: Set(runner); groups with modify permissions: Set()
2022-05-17 05:00:40.202 INFO util.Utils: Successfully started service
'sparkDriver' on port 43437.
2022-05-17 05:00:40.269 INFO spark.SparkEnv: Registering MapOutputTracker
2022-05-17 05:00:40.375 INFO spark.SparkEnv: Registering BlockManagerMaster
2022-05-17 05:00:40.413 INFO storage.BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2022-05-17 05:00:40.414 INFO storage.BlockManagerMasterEndpoint:
BlockManagerMasterEndpoint up
2022-05-17 05:00:40.419 INFO spark.SparkEnv: Registering
BlockManagerMasterHeartbeat
2022-05-17 05:00:40.438 INFO storage.DiskBlockManager: Created local
directory at /tmp/blockmgr-f7e1c27f-c2ec-4752-b9fd-5a02a842acf1
2022-05-17 05:00:40.472 INFO memory.MemoryStore: MemoryStore started with
capacity 366.3 MiB
2022-05-17 05:00:40.503 INFO spark.SparkEnv: Registering
OutputCommitCoordinator
2022-05-17 05:00:40.615 INFO util.log: Logging initialized @4628ms to
org.sparkproject.jetty.util.log.Slf4jLog
2022-05-17 05:00:40.755 INFO server.Server: jetty-9.4.40.v20210413; built:
2021-04-13T20:42:42.668Z; git: b881a572662e1943a14ae12e7e1207989f218b74; jvm
1.8.0_332-b09
2022-05-17 05:00:40.851 INFO server.Server: Started @4864ms
2022-05-17 05:00:41.059 INFO server.AbstractConnector: Started
ServerConnector@6bfdb014{HTTP/1.1, (http/1.1)}{localhost:42741}
2022-05-17 05:00:41.059 INFO util.Utils: Successfully started service
'SparkUI' on port 42741.
2022-05-17 05:00:41.102 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@367795c7{/jobs,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.106 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@75699e35{/jobs/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.109 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4aeaadc1{/jobs/job,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.112 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@7daa61f3{/jobs/job/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.115 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6e4ea0bd{/stages,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.125 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@78f9ed3e{/stages/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.126 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@b0964b2{/stages/stage,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.128 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@64040287{/stages/stage/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.128 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6f89f665{/stages/pool,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.129 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4925f4f5{/stages/pool/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.130 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@3a43d133{/storage,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.130 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5f2afe62{/storage/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.131 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@28782602{/storage/rdd,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.143 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@68105edc{/storage/rdd/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.147 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@38b972d7{/environment,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.147 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@3935e9a8{/environment/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.148 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5b56b654{/executors,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.149 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@534243e4{/executors/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.149 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@470a9030{/executors/threadDump,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.151 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@27494e46{/executors/threadDump/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.166 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1e411d81{/static,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.168 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@39ab59f8{/,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.172 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@111610e6{/api,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.179 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4beddc56{/jobs/job/kill,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.180 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1b812421{/stages/stage/kill,null,AVAILABLE,@Spark}
2022-05-17 05:00:41.182 INFO ui.SparkUI: Bound SparkUI to localhost, and
started at http://localhost:42741
2022-05-17 05:00:41.204 INFO spark.SparkContext: Added JAR
file:/home/runner/work/incubator-kyuubi/incubator-kyuubi/externals/kyuubi-spark-sql-engine/target/kyuubi-spark-sql-engine_2.12-1.6.0-SNAPSHOT.jar
at
spark://localhost:43437/jars/kyuubi-spark-sql-engine_2.12-1.6.0-SNAPSHOT.jar
with timestamp 1652763639524
2022-05-17 05:00:41.500 INFO executor.Executor: Starting executor ID driver
on host localhost
2022-05-17 05:00:41.528 INFO executor.Executor: Using REPL class URI:
spark://localhost:43437/classes
2022-05-17 05:00:41.555 INFO executor.Executor: Fetching
spark://localhost:43437/jars/kyuubi-spark-sql-engine_2.12-1.6.0-SNAPSHOT.jar
with timestamp 1652763639524
2022-05-17 05:00:41.619 INFO client.TransportClientFactory: Successfully
created connection to localhost/127.0.0.1:43437 after 42 ms (0 ms spent in
bootstraps)
2022-05-17 05:00:41.631 INFO util.Utils: Fetching
spark://localhost:43437/jars/kyuubi-spark-sql-engine_2.12-1.6.0-SNAPSHOT.jar to
/tmp/spark-fa12c304-8837-445c-864c-e2a3522e8f69/userFiles-1df90a08-5586-4f46-9271-562e051c5790/fetchFileTemp2804306972146769908.tmp
2022-05-17 05:00:41.737 INFO executor.Executor: Adding
file:/tmp/spark-fa12c304-8837-445c-864c-e2a3522e8f69/userFiles-1df90a08-5586-4f46-9271-562e051c5790/kyuubi-spark-sql-engine_2.12-1.6.0-SNAPSHOT.jar
to class loader
2022-05-17 05:00:41.761 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 40877.
2022-05-17 05:00:41.761 INFO netty.NettyBlockTransferService: Server created
on localhost:40877
2022-05-17 05:00:41.763 INFO storage.BlockManager: Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy
2022-05-17 05:00:41.780 INFO storage.BlockManagerMaster: Registering
BlockManager BlockManagerId(driver, localhost, 40877, None)
2022-05-17 05:00:41.785 INFO storage.BlockManagerMasterEndpoint: Registering
block manager localhost:40877 with 366.3 MiB RAM, BlockManagerId(driver,
localhost, 40877, None)
2022-05-17 05:00:41.792 INFO storage.BlockManagerMaster: Registered
BlockManager BlockManagerId(driver, localhost, 40877, None)
2022-05-17 05:00:41.793 INFO storage.BlockManager: Initialized BlockManager:
BlockManagerId(driver, localhost, 40877, None)
2022-05-17 05:00:42.341 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@776802b0{/metrics/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:42.993 INFO internal.SharedState: Setting
hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir
('file:/home/runner/work/incubator-kyuubi/incubator-kyuubi/kyuubi-server/target/work/runner/spark-warehouse/').
2022-05-17 05:00:42.994 INFO internal.SharedState: Warehouse path is
'file:/home/runner/work/incubator-kyuubi/incubator-kyuubi/kyuubi-server/target/work/runner/spark-warehouse/'.
2022-05-17 05:00:43.080 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6c796cc1{/SQL,null,AVAILABLE,@Spark}
2022-05-17 05:00:43.081 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1cb7936c{/SQL/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:43.083 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@7123be6c{/SQL/execution,null,AVAILABLE,@Spark}
2022-05-17 05:00:43.086 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@77a2aa4a{/SQL/execution/json,null,AVAILABLE,@Spark}
2022-05-17 05:00:43.172 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@30c0d731{/static/sql,null,AVAILABLE,@Spark}
2022-05-17 05:00:50.995 INFO hive.HiveUtils: Initializing
HiveMetastoreConnection version 2.3.7 using Spark classes.
2022-05-17 05:00:51.266 INFO conf.HiveConf: Found configuration file null
2022-05-17 05:00:52.449 INFO session.SessionState: Created local directory:
/tmp/runner
2022-05-17 05:00:52.488 INFO session.SessionState: Created HDFS directory:
/tmp/hive/runner/4b4bfd19-ddc4-456f-b07a-6f0905e56c93
2022-05-17 05:00:52.513 INFO session.SessionState: Created local directory:
/tmp/runner/4b4bfd19-ddc4-456f-b07a-6f0905e56c93
2022-05-17 05:00:52.540 INFO session.SessionState: Created HDFS directory:
/tmp/hive/runner/4b4bfd19-ddc4-456f-b07a-6f0905e56c93/_tmp_space.db
2022-05-17 05:00:52.576 INFO client.HiveClientImpl: Warehouse location for
Hive client (version 2.3.7) is
file:/home/runner/work/incubator-kyuubi/incubator-kyuubi/kyuubi-server/target/work/runner/spark-warehouse/
2022-05-17 05:00:54.465 WARN conf.HiveConf: HiveConf of name
hive.stats.jdbc.timeout does not exist
2022-05-17 05:00:54.466 WARN conf.HiveConf: HiveConf of name
hive.stats.retries.wait does not exist
2022-05-17 05:00:54.466 INFO metastore.HiveMetaStore: 0: Opening raw store
with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-05-17 05:00:54.538 INFO metastore.ObjectStore: ObjectStore, initialize
called
2022-05-17 05:00:54.947 INFO DataNucleus.Persistence: Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
2022-05-17 05:00:54.952 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
2022-05-17 05:00:58.994 INFO metastore.ObjectStore: Setting MetaStore object
pin classes with
hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2022-05-17 05:01:03.444 INFO metastore.MetaStoreDirectSql: Using direct SQL,
underlying DB is DERBY
2022-05-17 05:01:03.460 INFO metastore.ObjectStore: Initialized ObjectStore
2022-05-17 05:01:03.643 WARN metastore.ObjectStore: Version information not
found in metastore. hive.metastore.schema.verification is not enabled so
recording the schema version 2.3.0
2022-05-17 05:01:03.643 WARN metastore.ObjectStore:
setMetaStoreSchemaVersion called but recording version is disabled: version =
2.3.0, comment = Set by MetaStore [email protected]
2022-05-17 05:01:03.717 WARN metastore.ObjectStore: Failed to get database
default, returning NoSuchObjectException
2022-05-17 05:01:04.185 INFO metastore.HiveMetaStore: Added admin role in
metastore
2022-05-17 05:01:04.187 INFO metastore.HiveMetaStore: Added public role in
metastore
2022-05-17 05:01:04.300 INFO metastore.HiveMetaStore: No user is added in
admin role, since config is empty
2022-05-17 05:01:04.597 INFO metastore.HiveMetaStore: 0: get_all_functions
2022-05-17 05:01:04.598 INFO HiveMetaStore.audit: ugi=runner
ip=unknown-ip-addr cmd=get_all_functions
2022-05-17 05:01:04.732 INFO metastore.HiveMetaStore: 0: get_database:
default
2022-05-17 05:01:04.732 INFO HiveMetaStore.audit: ugi=runner
ip=unknown-ip-addr cmd=get_database: default
2022-05-17 05:01:04.787 INFO metastore.HiveMetaStore: 0: get_databases: *
2022-05-17 05:01:04.787 INFO HiveMetaStore.audit: ugi=runner
ip=unknown-ip-addr cmd=get_databases: *
2022-05-17 05:01:05.401 INFO codegen.CodeGenerator: Code generated in
211.927261 ms
2022-05-17 05:01:05.467 INFO codegen.CodeGenerator: Code generated in
7.722799 ms
2022-05-17 05:01:05.518 INFO util.ThreadUtils:
SparkSQLSessionManager-exec-pool: pool size: 100, wait queue size: 100, thread
keepalive time: 60000 ms
2022-05-17 05:01:05.522 INFO operation.SparkSQLOperationManager:
Service[SparkSQLOperationManager] is initialized.
2022-05-17 05:01:05.524 INFO session.SparkSQLSessionManager:
Service[SparkSQLSessionManager] is initialized.
2022-05-17 05:01:05.524 INFO spark.SparkSQLBackendService:
Service[SparkSQLBackendService] is initialized.
2022-05-17 05:01:05.565 INFO spark.SparkTBinaryFrontendService: Initializing
SparkTBinaryFrontend on localhost:40031 with [9, 999] worker threads
2022-05-17 05:01:05.566 INFO spark.SparkTBinaryFrontendService:
Service[SparkTBinaryFrontend] is initialized.
2022-05-17 05:01:05.566 INFO spark.SparkSQLEngine: Service[SparkSQLEngine]
is initialized.
2022-05-17 05:01:05.580 INFO operation.SparkSQLOperationManager:
Service[SparkSQLOperationManager] is started.
2022-05-17 05:01:05.580 INFO session.SparkSQLSessionManager:
Service[SparkSQLSessionManager] is started.
2022-05-17 05:01:05.580 INFO spark.SparkSQLBackendService:
Service[SparkSQLBackendService] is started.
2022-05-17 05:01:05.580 INFO spark.SparkTBinaryFrontendService:
Service[SparkTBinaryFrontend] is started.
2022-05-17 05:01:05.580 INFO spark.SparkSQLEngine: Service[SparkSQLEngine]
is started.
2022-05-17 05:01:05.594 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@13cc3984{/kyuubi,null,AVAILABLE,@Spark}
2022-05-17 05:01:05.594 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5625ba2{/kyuubi/json,null,AVAILABLE,@Spark}
2022-05-17 05:01:05.597 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4bd1b07d{/kyuubi/session,null,AVAILABLE,@Spark}
2022-05-17 05:01:05.598 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5fafd099{/kyuubi/session/json,null,AVAILABLE,@Spark}
2022-05-17 05:01:05.613 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2512155e{/kyuubi/stop,null,AVAILABLE,@Spark}
2022-05-17 05:01:05.616 INFO spark.SparkSQLEngine:
Spark application name: org.apache.kyuubi.engine.spark.SparkSQLEngine
application ID: local-1652763641274
application web UI: http://localhost:42741
master: local
version: 3.1.3
driver: [cpu: 1, mem: 1g]
executor: [cpu: 2, mem: 1g, maxNum: 2]
Start time: Tue May 17 05:00:39 UTC 2022
User: runner (shared mode: USER)
State: STARTED
```
The engine log looks pretty normal; it ends with the engine in the STARTED state.