Kimahriman commented on code in PR #49880:
URL: https://github.com/apache/spark/pull/49880#discussion_r1960875165
##########
python/pyspark/sql/connect/client/core.py:
##########
@@ -125,6 +125,7 @@ class ChannelBuilder:
PARAM_USER_ID = "user_id"
PARAM_USER_AGENT = "user_agent"
PARAM_SESSION_ID = "session_id"
+ CONNECT_LOCAL_AUTH_TOKEN_PARAM_NAME = "local_token"
Review Comment:
No, more the opposite. I assume the custom `local_token` header was used here
to avoid the TLS requirement, but it's probably ok to allow non-TLS use of the
authorization bearer token header for local connections, which the Python
gRPC client even has support for. Adding UDS support would just improve
that further. `local_channel_credentials` even supports both of those
cases, so you could use that with the authorization header to avoid the TLS
requirement for this use case, while still requiring TLS for remote connections.
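
Roughly what I mean, as a minimal sketch (the target, token value, and port are
placeholders, not anything from this PR): `grpc.local_channel_credentials()`
lets you attach call credentials (the `Authorization: Bearer` header) to a
local TCP or UDS channel without TLS.

```python
import grpc

# Hypothetical token value; in practice it would be whatever local auth
# token the server hands out.
token = "my-local-auth-token"

# local_channel_credentials() accepts both LOCAL_TCP and UDS connection
# types, so the bearer token call credentials can be composed onto the
# channel without requiring TLS.
channel_creds = grpc.composite_channel_credentials(
    grpc.local_channel_credentials(grpc.LocalConnectionType.LOCAL_TCP),
    grpc.access_token_call_credentials(token),
)

# 15002 is just the default Spark Connect port; any local target works,
# including a UDS target like "unix:///tmp/spark-connect.sock" together
# with grpc.LocalConnectionType.UDS.
channel = grpc.secure_channel("localhost:15002", channel_creds)
```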
The "workarounds" are simply that you can still use the authorization header
without TLS by using a custom interceptor that injects it in the metadata after
the fact. Figured that out while trying to build a custom dynamic proxy for
launching cluster deploy mode connect sessions to replace something like Livy.
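
Something along these lines, assuming a plain insecure channel (again just a
sketch, with a placeholder target and token; the interceptor name is made up):

```python
import collections
import grpc


class _ClientCallDetails(
    collections.namedtuple(
        "_ClientCallDetails",
        ("method", "timeout", "metadata", "credentials", "wait_for_ready", "compression"),
    ),
    grpc.ClientCallDetails,
):
    pass


class BearerTokenInterceptor(
    grpc.UnaryUnaryClientInterceptor, grpc.UnaryStreamClientInterceptor
):
    """Injects an Authorization header into each call's metadata, so the
    bearer token can be sent over a non-TLS channel."""

    def __init__(self, token):
        self._token = token

    def _inject(self, client_call_details):
        # Copy the existing metadata and append the authorization header.
        metadata = list(client_call_details.metadata or [])
        metadata.append(("authorization", f"Bearer {self._token}"))
        return _ClientCallDetails(
            client_call_details.method,
            client_call_details.timeout,
            metadata,
            client_call_details.credentials,
            getattr(client_call_details, "wait_for_ready", None),
            getattr(client_call_details, "compression", None),
        )

    def intercept_unary_unary(self, continuation, client_call_details, request):
        return continuation(self._inject(client_call_details), request)

    def intercept_unary_stream(self, continuation, client_call_details, request):
        return continuation(self._inject(client_call_details), request)


channel = grpc.intercept_channel(
    grpc.insecure_channel("localhost:15002"),
    BearerTokenInterceptor("my-local-auth-token"),
)
```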
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]