gaborgsomogyi opened a new pull request #31622:
URL: https://github.com/apache/spark/pull/31622


   ### What changes were proposed in this pull request?
   Some of the built-in JDBC connection providers change the JVM security
context to do the authentication, which is fine. The problematic part is that
executors can be reused by another query. The following situation leads to
incorrect behaviour:
   * Query1 opens a JDBC connection and changes the JVM security context in Executor1
   * Query2 tries to open a JDBC connection but realizes there is already an
entry for that DB type in Executor1 (see the sketch after this list)
   * Query2 does not change the JVM security context and uses Query1's keytab and
principal
   * Query2 fails with an authentication error
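
   A hedged sketch of the problematic flow described above; the provider object, the DB-type cache and `buildJaasConfig` are hypothetical names used only to illustrate the "entry already exists, so skip the context change" behaviour, not the actual Spark code:

```scala
import java.sql.{Connection, DriverManager}
import javax.security.auth.login.Configuration
import scala.collection.mutable

object ProblematicProviderSketch {
  // Tracks which DB types already installed a security context in this JVM.
  private val alreadyConfigured = mutable.Set[String]()

  // Hypothetical helper building a JAAS configuration from keytab/principal.
  private def buildJaasConfig(keytab: String, principal: String): Configuration = ???

  def getConnection(dbType: String, url: String,
                    keytab: String, principal: String): Connection = {
    // Query2 finds Query1's entry for this DB type, skips the context change,
    // and therefore authenticates with Query1's keytab and principal.
    if (!alreadyConfigured.contains(dbType)) {
      Configuration.setConfiguration(buildJaasConfig(keytab, principal))
      alreadyConfigured += dbType
    }
    DriverManager.getConnection(url)
  }
}
```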
   
   In this PR I've changed the code in such a way that the JVM security context is
always changed, but only temporarily: it is set while the connection is being
built up and then rolled back. Since `getConnection` is synchronised with
`SecurityConfigurationLock`, this results in correct behaviour without any race.
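
   A minimal sketch of this set-and-restore pattern, assuming a stand-in lock object and a hypothetical `buildJaasConfig` helper (the real provider code differs in details):

```scala
import java.sql.{Connection, DriverManager}
import javax.security.auth.login.Configuration

// Stand-in for Spark's SecurityConfigurationLock, used to serialise
// JVM security context changes across threads.
object SecurityConfigurationLock

object SecureProviderSketch {
  // Hypothetical helper layering the query-specific keytab/principal entry
  // on top of the parent JAAS configuration.
  private def buildJaasConfig(parent: Configuration,
                              keytab: String,
                              principal: String): Configuration = ???

  def getConnection(url: String, keytab: String, principal: String): Connection =
    SecurityConfigurationLock.synchronized {
      // Remember the JVM-wide JAAS configuration that is currently active.
      val parent = Configuration.getConfiguration
      // Install the query-specific context only while the connection is built,
      // then roll it back regardless of success or failure.
      Configuration.setConfiguration(buildJaasConfig(parent, keytab, principal))
      try {
        DriverManager.getConnection(url)
      } finally {
        Configuration.setConfiguration(parent)
      }
    }
}
```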
   
   ### Why are the changes needed?
   The current JVM security context handling is incorrect in the scenario described above.
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   ### How was this patch tested?
   Existing unit + integration tests.
   

