alefischer13 edited a comment on pull request #26624:
URL: https://github.com/apache/spark/pull/26624#issuecomment-771017526
@igreenfield this does not seem to be working for me. I'm trying to log the
Spark application ID by setting `mdc.applicationId` to the SparkContext's
applicationId and adding `%X{applicationId}` to the PatternLayout, but no
applicationId shows up on either the driver or the executors. For reference,
`%X{taskName}` doesn't work either. Setting the MDC value explicitly
(`MDC.put...`) does make the applicationId appear, but only on the driver.
Is there anything else we have to change?
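
For context, here is a minimal sketch of what I'm doing. The `setLocalProperty` route, the object/app names, and the log4j pattern in the comment are just my own setup and assumptions on my side, not something confirmed by this PR:

```scala
import org.apache.spark.sql.SparkSession
import org.slf4j.MDC

object MdcLoggingExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("mdc-logging-test").getOrCreate()
    val sc = spark.sparkContext

    // What I described above: point mdc.applicationId at the application ID
    // and expect %X{applicationId} in the PatternLayout to pick it up.
    // (Assuming "mdc."-prefixed local properties are the intended mechanism.)
    sc.setLocalProperty("mdc.applicationId", sc.applicationId)

    // Explicit MDC.put: this does make the value show up, but only in driver logs.
    MDC.put("applicationId", sc.applicationId)

    // log4j.properties pattern I'm using (assumed form):
    // log4j.appender.console.layout.ConversionPattern=
    //   %d{yy/MM/dd HH:mm:ss} %p %c{1}: appId=%X{applicationId} task=%X{taskName} %m%n

    // Run something on the executors so their logs are exercised.
    sc.parallelize(1 to 10).map(_ * 2).count()

    spark.stop()
  }
}
```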