igreenfield commented on pull request #28629:
URL: https://github.com/apache/spark/pull/28629#issuecomment-647640494
@cloud-fan
code:
```scala
import org.apache.spark.internal.config.UI.UI_ENABLED
import org.apache.spark.sql.SparkSession
import org.slf4j.MDC // or org.apache.log4j.MDC; either feeds %X{...} in the layout below

// Put appName into the MDC before the SparkSession/SparkContext is created.
MDC.put("appName", "app for test")
val session = SparkSession.builder()
  .master("local")
  .config(UI_ENABLED.key, value = false)
  .config("some-config", "v2")
  .getOrCreate()
```
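As a side note, the `[%X{appId}]` slot renders as `[]` in the log below simply because nothing ever puts an `appId` key into the MDC in this snippet. A minimal sketch of how it could be filled once the context is up (my addition, assuming the slf4j `MDC` is the one used above), after which any later log line would show both keys:
```scala
import org.slf4j.MDC

// Sketch only, not part of the test above: populate the appId MDC key
// from the running context so %X{appId} is no longer empty.
MDC.put("appId", session.sparkContext.applicationId)
```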
log4j.properties:
```properties
#File Appender with MDC
log4j.appender.FAMDC=org.apache.log4j.FileAppender
log4j.appender.FAMDC.append=false
log4j.appender.FAMDC.file=target/unit-tests-mdc.log
log4j.appender.FAMDC.layout=org.apache.log4j.PatternLayout
log4j.appender.FAMDC.layout.ConversionPattern=%d{HH:mm:ss.SSS} [%X{appId}] [%X{appName}] %t %p %c{1}: %m%n
```
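The fragment above only defines the `FAMDC` appender; it is attached to a logger elsewhere in the file. A minimal sketch of that wiring (my assumption, reusing the appender name from the snippet above):
```properties
# Assumed wiring, not shown in the fragment above: attach FAMDC to the root logger
log4j.rootCategory=INFO, FAMDC
```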
log file:
```log
09:41:12.400 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkContext: Running Spark version 3.0.0-SNAPSHOT
09:41:12.777 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
09:41:12.974 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO ResourceUtils:
09:41:12.974 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO ResourceUtils: No custom resources configured for spark.driver.
09:41:12.975 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO ResourceUtils:
09:41:12.975 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkContext: Submitted application: c4b615df-6461-4568-bc32-c6afa43a777b
09:41:13.046 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
09:41:13.060 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO ResourceProfile: Limiting resource is cpu
09:41:13.061 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO ResourceProfileManager: Added ResourceProfile id: 0
09:41:13.341 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SecurityManager: Changing view acls to: igreenfield
09:41:13.342 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SecurityManager: Changing modify acls to: igreenfield
09:41:13.342 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SecurityManager: Changing view acls groups to:
09:41:13.343 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SecurityManager: Changing modify acls groups to:
09:41:13.343 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(igreenfield); groups with view permissions: Set(); users with modify permissions: Set(igreenfield); groups with modify permissions: Set()
09:41:19.076 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO Utils: Successfully started service 'sparkDriver' on port 55518.
09:41:19.178 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkEnv: Registering MapOutputTracker
09:41:19.357 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkEnv: Registering BlockManagerMaster
09:41:19.367 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
09:41:19.371 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
09:41:19.386 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkEnv: Registering BlockManagerMasterHeartbeat
09:41:19.474 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO DiskBlockManager: Created local directory at C:\Users\igreenfield\AppData\Local\Temp\blockmgr-98ef6083-53df-4f18-9f66-b2aa0215a505
09:41:19.542 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO MemoryStore: MemoryStore started with capacity 4.3 GiB
09:41:19.610 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkEnv: Registering OutputCommitCoordinator
09:41:20.182 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO Executor: Starting executor ID driver on host IL-LP-005.global.axiomsl.com
09:41:20.311 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55528.
09:41:20.311 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO NettyBlockTransferService: Server created on IL-LP-005.global.axiomsl.com:55528
09:41:20.317 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
09:41:20.345 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, IL-LP-005.global.axiomsl.com, 55528, None)
09:41:20.363 [] [app for test] dispatcher-BlockManagerMaster INFO BlockManagerMasterEndpoint: Registering block manager IL-LP-005.global.axiomsl.com:55528 with 4.3 GiB RAM, BlockManagerId(driver, IL-LP-005.global.axiomsl.com, 55528, None)
09:41:20.370 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, IL-LP-005.global.axiomsl.com, 55528, None)
09:41:20.375 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, IL-LP-005.global.axiomsl.com, 55528, None)
09:41:20.887 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO log: Logging initialized @14628ms to org.eclipse.jetty.util.log.Slf4jLog
09:41:21.670 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SharedState: loading hive config file: file:/C:/GITHUB/igreenfield/spark/sql/core/target/scala-2.12/test-classes/hive-site.xml
09:41:21.736 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/GITHUB/igreenfield/spark/sql/core/spark-warehouse').
09:41:21.737 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SharedState: Warehouse path is 'file:/C:/GITHUB/igreenfield/spark/sql/core/spark-warehouse'.
09:41:26.157 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkSessionBuilderSuite:
09:41:26.221 [] [app for test] dispatcher-event-loop-0 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
09:41:26.237 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO MemoryStore: MemoryStore cleared
09:41:26.237 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManager: BlockManager stopped
09:41:26.351 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO BlockManagerMaster: BlockManagerMaster stopped
09:41:26.365 [] [app for test] dispatcher-event-loop-1 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
09:41:26.370 [] [app for test] ScalaTest-run-running-SparkSessionBuilderSuite INFO SparkContext: Successfully stopped SparkContext
09:41:26.385 [] [app for test] ScalaTest-run WARN SparkSessionBuilderSuite:
09:41:26.398 [] [app for test] Thread-1 INFO ShutdownHookManager: Shutdown hook called
09:41:26.398 [] [app for test] Thread-1 INFO ShutdownHookManager: Deleting directory C:\Users\igreenfield\AppData\Local\Temp\spark-5d205a0f-329d-4b55-9434-cb4520fab2ff
```