SCHJonathan commented on code in PR #53024:
URL: https://github.com/apache/spark/pull/53024#discussion_r2521611995
##########
python/pyspark/pipelines/spark_connect_graph_element_registry.py:
##########
@@ -110,8 +112,11 @@ def register_output(self, output: Output) -> None:
self._client.execute_command(command)
def register_flow(self, flow: Flow) -> None:
- with block_spark_connect_execution_and_analysis():
- df = flow.func()
+ with add_pipeline_analysis_context(
Review Comment:
Yes! Exactly. For a spark.sql() call outside a query function, there would only be one
analysis-context extension associated with it, but two for a spark.sql() call inside a query function.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]