zhengruifeng opened a new pull request, #53485:
URL: https://github.com/apache/spark/pull/53485

   
   ### What changes were proposed in this pull request?
   Make `RemoteModelRef.release_ref` async
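   A minimal sketch of the idea (hypothetical structure, not necessarily the exact implementation in `pyspark/ml/util.py`): instead of issuing the `_delete_ml_cache` RPC on the calling thread, `release_ref` hands the deletion off to a background daemon thread so the caller returns immediately.
   ```
   import threading


   class RemoteModelRef:
       # Sketch only: mirrors the shape suggested by the traceback below.
       def __init__(self, session, ref_id: str):
           self._session = session
           self.ref_id = ref_id

       def release_ref(self) -> None:
           def _delete(session, ref_id: str) -> None:
               try:
                   # The same call the traceback shows blocking when run inline.
                   session.client._delete_ml_cache([ref_id])
               except Exception:
                   # Best-effort cleanup: a failed or dropped deletion only
                   # delays server-side cache eviction.
                   pass

           # A daemon thread keeps the caller (and interpreter shutdown) from
           # waiting on a stuck RPC; the trade-off is that a deletion may be
           # dropped at exit, which is acceptable for a server-side cache.
           threading.Thread(
               target=_delete, args=(self._session, self.ref_id), daemon=True
           ).start()
   ```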
   
   
   ### Why are the changes needed?
   The test `pyspark.ml.tests.connect.test_parity_classification` was flaky and occasionally got stuck. The traceback below shows the synchronous release call blocked inside gRPC (at `with state.lock:`) until a `KeyboardInterrupt` broke it out:
   ```
   Traceback (most recent call last):
     File "/Users/ruifeng.zheng/spark/python/pyspark/ml/util.py", line 379, in wrapped
       self._remote_model_obj.release_ref()
     File "/Users/ruifeng.zheng/spark/python/pyspark/ml/util.py", line 162, in release_ref
       del_remote_cache(self.ref_id)
     File "/Users/ruifeng.zheng/spark/python/pyspark/ml/util.py", line 358, in del_remote_cache
       session.client._delete_ml_cache([ref_id])
     File "/Users/ruifeng.zheng/spark/python/pyspark/sql/connect/client/core.py", line 2133, in _delete_ml_cache
       (_, properties, _) = self.execute_command(command)
     File "/Users/ruifeng.zheng/spark/python/pyspark/sql/connect/client/core.py", line 1158, in execute_command
       data, _, metrics, observed_metrics, properties = self._execute_and_fetch(
     File "/Users/ruifeng.zheng/spark/python/pyspark/sql/connect/client/core.py", line 1660, in _execute_and_fetch
       for response in self._execute_and_fetch_as_iterator(
     File "/Users/ruifeng.zheng/spark/python/pyspark/sql/connect/client/core.py", line 1635, in _execute_and_fetch_as_iterator
       raise kb
     File "/Users/ruifeng.zheng/spark/python/pyspark/sql/connect/client/core.py", line 1617, in _execute_and_fetch_as_iterator
       generator = ExecutePlanResponseReattachableIterator(
     File "/Users/ruifeng.zheng/spark/python/pyspark/sql/connect/client/reattach.py", line 127, in __init__
       self._stub.ExecutePlan(self._initial_request, metadata=metadata)
     File "/Users/ruifeng.zheng/.dev/miniconda3/envs/spark_dev_313/lib/python3.13/site-packages/grpc/_channel.py", line 1396, in __call__
       call = self._managed_call(
     File "/Users/ruifeng.zheng/.dev/miniconda3/envs/spark_dev_313/lib/python3.13/site-packages/grpc/_channel.py", line 1784, in create
       with state.lock:
     File "/Users/ruifeng.zheng/spark/python/pyspark/core/context.py", line 409, in signal_handler
       raise KeyboardInterrupt()
   ```
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   Manually ran the test locally; the hang did not occur in 10 successive runs.
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No
   

