sryza commented on code in PR #53020:
URL: https://github.com/apache/spark/pull/53020#discussion_r2519895777
##########
python/pyspark/sql/connect/session.py:
##########
@@ -920,6 +895,60 @@ def clearTags(self) -> None:
clearTags.__doc__ = PySparkSession.clearTags.__doc__
+ def addThreadlocalUserContextExtension(self, extension: any_pb2.Any) -> str:
+ """
+ Add a user context extension to the current session in the current thread.
+ It will be sent in the UserContext of every request sent from the current thread, until
+ it is removed with removeUserContextExtension using the returned id.
+
+ Parameters
+ ----------
+ extension: any_pb2.Any
+ Protobuf Any message to add as the extension to UserContext.
+
+ Returns
+ -------
+ str
+ Id that can be used with removeUserContextExtension to remove the extension.
+ """
+ return self.client.add_threadlocal_user_context_extension(extension)
+
+ def addGlobalUserContextExtension(self, extension: any_pb2.Any) -> str:
+ """
+ Add a user context extension to the current session, globally.
+ It will be sent in the UserContext of every request, until it is removed with
+ removeUserContextExtension using the returned id. It will precede any threadlocal extension.
+
+ Parameters
+ ----------
+ extension: any_pb2.Any
+ Protobuf Any message to add as the extension to UserContext.
+
+ Returns
+ -------
+ str
+ Id that can be used with removeUserContextExtension to remove the extension.
+ """
+ return self.client.add_global_user_context_extension(extension)
+
+ def removeUserContextExtension(self, extension_id: str) -> None:
+ """
+ Remove a user context extension previously added by addUserContextExtension.
Review Comment:
```suggestion
Remove a user context extension previously added by addThreadlocalUserContextExtension.
```
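To illustrate the semantics the docstrings above describe (thread-local vs. global extensions, id-based removal, and global extensions preceding thread-local ones), here is a minimal stand-in sketch. The class `ExtensionRegistry` and its method names are hypothetical, introduced only for illustration; this is not the actual Spark Connect client implementation, which manages protobuf `Any` extensions on the wire:

```python
import threading
import uuid


class ExtensionRegistry:
    # Hypothetical stand-in for illustration; NOT the actual Spark Connect
    # client. It only models the add/remove lifecycle from the docstrings.

    def __init__(self):
        # Global extensions are shared across threads and sent with every request.
        self._global = {}  # id -> extension
        # Thread-local extensions are only sent from the thread that added them.
        self._local = threading.local()

    def add_threadlocal(self, extension) -> str:
        ext_id = uuid.uuid4().hex
        if not hasattr(self._local, "exts"):
            self._local.exts = {}
        self._local.exts[ext_id] = extension
        return ext_id  # id usable later with remove()

    def add_global(self, extension) -> str:
        ext_id = uuid.uuid4().hex
        self._global[ext_id] = extension
        return ext_id

    def remove(self, ext_id: str) -> None:
        # A single remove call covers both kinds, keyed by the returned id.
        self._global.pop(ext_id, None)
        if hasattr(self._local, "exts"):
            self._local.exts.pop(ext_id, None)

    def extensions_for_request(self):
        # Per the docstring, global extensions precede thread-local ones.
        local = getattr(self._local, "exts", {})
        return list(self._global.values()) + list(local.values())
```

In this sketch the id returned by either add method is the only handle the caller keeps, which is why a single `removeUserContextExtension(extension_id)` suffices for both the thread-local and the global variant.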
##########
python/pyspark/sql/connect/session.py:
##########
@@ -920,6 +895,60 @@ def clearTags(self) -> None:
clearTags.__doc__ = PySparkSession.clearTags.__doc__
+ def addThreadlocalUserContextExtension(self, extension: any_pb2.Any) -> str:
Review Comment:
This is adding public APIs if I understand correctly. I wonder if we need to
add these public APIs? In general, it's better to limit public APIs to what we
strictly need, as we'll be stuck supporting them forever.
If I understand correctly, the purpose of this change is to enable setting
the `PipelineAnalysisContext` on `UserContext` inside pipelines. Do we need to
expose an API on `SparkSession` to support that? Can we just access the
underlying methods on the Connect client directly?
cc @SCHJonathan
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]