LuciferYang commented on PR #47500:
URL: https://github.com/apache/spark/pull/47500#issuecomment-2311702580
> > ```
> > sbin/start-connect-server.sh --conf spark.log.structuredLogging.enabled=false
> > ```
>
> I used the command `dev/make-distribution.sh --tgz -Phive` to package the latest code.
>
> Then, without this PR, I manually ran `sbin/start-connect-server.sh --conf spark.log.structuredLogging.enabled=false`, and it still used structured logging, even though `spark.log.structuredLogging.enabled` was explicitly set to `false`:
>
> ```
> tail -100f logs/spark-yangjie01-org.apache.spark.sql.connect.service.SparkConnectServer-1-MacBook-Pro.local.out
>
> Spark Command: /Users/yangjie01/Tools/zulu17/bin/java -cp hive-jackson/*:/Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.4.0/conf/:/Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.4.0/jars/slf4j-api-2.0.16.jar:/Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.4.0/jars/* -Xmx1g -XX:+IgnoreUnrecognizedVMOptions --add-modules=jdk.incubator.vector --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/jdk.internal.ref=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false -Dio.netty.tryReflectionSetAccessible=true -Dderby.connection.requireAuthentication=false org.apache.spark.deploy.SparkSubmit --conf spark.log.structuredLogging.enabled=false --class org.apache.spark.sql.connect.service.SparkConnectServer --name Spark Connect server spark-internal
> ========================================
> WARNING: Using incubator modules: jdk.incubator.vector
> Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
> {"ts":"2024-08-27T06:42:25.198Z","level":"WARN","msg":"Your hostname, MacBook-Pro.local, resolves to a loopback address: 127.0.0.1; using 172.22.200.254 instead (on interface en0)","context":{"host":"MacBook-Pro.local","host_port":"127.0.0.1","host_port2":"172.22.200.254","network_if":"en0"},"logger":"Utils"}
> {"ts":"2024-08-27T06:42:25.199Z","level":"WARN","msg":"Set SPARK_LOCAL_IP if you need to bind to another address","logger":"Utils"}
> {"ts":"2024-08-27T06:42:25.309Z","level":"INFO","msg":"Starting Spark session.","logger":"SparkConnectServer"}
> {"ts":"2024-08-27T06:42:25.321Z","level":"INFO","msg":"Running Spark version 4.0.0-SNAPSHOT","logger":"SparkContext"}
> {"ts":"2024-08-27T06:42:25.322Z","level":"INFO","msg":"OS info Mac OS X, 14.6.1, aarch64","logger":"SparkContext"}
> {"ts":"2024-08-27T06:42:25.322Z","level":"INFO","msg":"Java version 17.0.12","logger":"SparkContext"}
> {"ts":"2024-08-27T06:42:25.359Z","level":"WARN","msg":"Unable to load native-hadoop library for your platform... using builtin-java classes where applicable","logger":"NativeCodeLoader"}
> {"ts":"2024-08-27T06:42:25.375Z","level":"INFO","msg":"==============================================================","logger":"ResourceUtils"}
> {"ts":"2024-08-27T06:42:25.375Z","level":"INFO","msg":"No custom resources configured for spark.driver.","logger":"ResourceUtils"}
> {"ts":"2024-08-27T06:42:25.375Z","level":"INFO","msg":"==============================================================","logger":"ResourceUtils"}
> {"ts":"2024-08-27T06:42:25.376Z","level":"INFO","msg":"Submitted application: Spark Connect server","logger":"SparkContext"}
> {"ts":"2024-08-27T06:42:25.383Z","level":"INFO","msg":"Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)","logger":"ResourceProfile"}
> {"ts":"2024-08-27T06:42:25.384Z","level":"INFO","msg":"Limiting resource is cpu","logger":"ResourceProfile"}
> {"ts":"2024-08-27T06:42:25.384Z","level":"INFO","msg":"Added ResourceProfile id: 0","logger":"ResourceProfileManager"}
> {"ts":"2024-08-27T06:42:25.400Z","level":"INFO","msg":"Changing view acls to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.401Z","level":"INFO","msg":"Changing modify acls to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.401Z","level":"INFO","msg":"Changing view acls groups to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.401Z","level":"INFO","msg":"Changing modify acls groups to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.402Z","level":"INFO","msg":"SecurityManager: authentication disabled; ui acls disabled; users with view permissions: yangjie01 groups with view permissions: EMPTY; users with modify permissions: yangjie01; groups with modify permissions: EMPTY; RPC SSL disabled","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.483Z","level":"INFO","msg":"Successfully started service 'sparkDriver' on port 51354.","logger":"Utils"}
> {"ts":"2024-08-27T06:42:25.493Z","level":"INFO","msg":"Registering MapOutputTracker","logger":"SparkEnv"}
> {"ts":"2024-08-27T06:42:25.496Z","level":"INFO","msg":"Registering BlockManagerMaster","logger":"SparkEnv"}
> {"ts":"2024-08-27T06:42:25.503Z","level":"INFO","msg":"Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information","logger":"BlockManagerMasterEndpoint"}
> {"ts":"2024-08-27T06:42:25.503Z","level":"INFO","msg":"BlockManagerMasterEndpoint up","logger":"BlockManagerMasterEndpoint"}
> {"ts":"2024-08-27T06:42:25.504Z","level":"INFO","msg":"Registering BlockManagerMasterHeartbeat","logger":"SparkEnv"}
> {"ts":"2024-08-27T06:42:25.512Z","level":"INFO","msg":"Created local directory at /private/var/folders/j2/cfn7w6795538n_416_27rkqm0000gn/T/blockmgr-33631b4f-63ad-49a7-8285-f286ff242854","logger":"DiskBlockManager"}
> {"ts":"2024-08-27T06:42:25.519Z","level":"INFO","msg":"Registering OutputCommitCoordinator","logger":"SparkEnv"}
> {"ts":"2024-08-27T06:42:25.564Z","level":"INFO","msg":"Start Jetty 0.0.0.0:4040 for SparkUI","logger":"JettyUtils"}
> {"ts":"2024-08-27T06:42:25.587Z","level":"INFO","msg":"Successfully started service 'SparkUI' on port 4040.","logger":"Utils"}
> {"ts":"2024-08-27T06:42:25.608Z","level":"INFO","msg":"Changing view acls to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.608Z","level":"INFO","msg":"Changing modify acls to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.608Z","level":"INFO","msg":"Changing view acls groups to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.608Z","level":"INFO","msg":"Changing modify acls groups to: yangjie01","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.608Z","level":"INFO","msg":"SecurityManager: authentication disabled; ui acls disabled; users with view permissions: yangjie01 groups with view permissions: EMPTY; users with modify permissions: yangjie01; groups with modify permissions: EMPTY; RPC SSL disabled","logger":"SecurityManager"}
> {"ts":"2024-08-27T06:42:25.631Z","level":"INFO","msg":"Starting executor ID driver on host 172.22.200.254","logger":"Executor"}
> {"ts":"2024-08-27T06:42:25.631Z","level":"INFO","msg":"OS info Mac OS X, 14.6.1, aarch64","logger":"Executor"}
> {"ts":"2024-08-27T06:42:25.631Z","level":"INFO","msg":"Java version 17.0.12","logger":"Executor"}
> {"ts":"2024-08-27T06:42:25.633Z","level":"INFO","msg":"Starting executor with user classpath (userClassPathFirst = false): ''","logger":"Executor"}
> {"ts":"2024-08-27T06:42:25.634Z","level":"INFO","msg":"Created or updated repl class loader org.apache.spark.util.MutableURLClassLoader@6fff46bf for default.","logger":"Executor"}
> {"ts":"2024-08-27T06:42:25.641Z","level":"INFO","msg":"Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51355.","logger":"Utils"}
> {"ts":"2024-08-27T06:42:25.643Z","level":"INFO","msg":"Server created on 172.22.200.254:51355","logger":"NettyBlockTransferService"}
> {"ts":"2024-08-27T06:42:25.643Z","level":"INFO","msg":"Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy","logger":"BlockManager"}
> {"ts":"2024-08-27T06:42:25.648Z","level":"INFO","msg":"Registering BlockManager BlockManagerId(driver, 172.22.200.254, 51355, None)","logger":"BlockManagerMaster"}
> {"ts":"2024-08-27T06:42:25.650Z","level":"INFO","msg":"Registering block manager 172.22.200.254:51355 with 434.4 MiB RAM, BlockManagerId(driver, 172.22.200.254, 51355, None)","logger":"BlockManagerMasterEndpoint"}
> {"ts":"2024-08-27T06:42:25.651Z","level":"INFO","msg":"Registered BlockManager BlockManagerId(driver, 172.22.200.254, 51355, None)","logger":"BlockManagerMaster"}
> {"ts":"2024-08-27T06:42:25.651Z","level":"INFO","msg":"Initialized BlockManager: BlockManagerId(driver, 172.22.200.254, 51355, None)","logger":"BlockManager"}
> {"ts":"2024-08-27T06:42:25.793Z","level":"INFO","msg":"Successfully started service 'org.apache.spark.sql.connect.service.SparkConnectService$' on port 15002.","logger":"Utils"}
> {"ts":"2024-08-27T06:42:25.798Z","level":"INFO","msg":"Spark Connect server started at: 0:0:0:0:0:0:0:0:15002","logger":"SparkConnectServer"}
> ```
After packaging, I didn't perform any other actions before executing `sbin/start-connect-server.sh --conf spark.log.structuredLogging.enabled=false`, apart from unpacking the tarball.

cc @panbingkun @wayneguow, do you have time to help check this issue? Thanks ~ I'm not sure whether it's related to something specific to my environment.
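
For anyone trying to reproduce this on their side, here is the quick sanity check I used (just a sketch; adjust the user/host part of the log file name for your machine). It simply counts the JSON-formatted records in the Connect server log, since structured logging emits lines starting with `{"ts":`:

```
# Non-zero count => structured logging is still in effect despite the flag;
# with the flag working, the log should instead contain plain-text lines.
grep -c '^{"ts":' logs/spark-*-org.apache.spark.sql.connect.service.SparkConnectServer-*.out
```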