zero323 commented on a change in pull request #34466:
URL: https://github.com/apache/spark/pull/34466#discussion_r762609181



##########
File path: python/pyspark/context.py
##########
@@ -128,31 +152,40 @@ class SparkContext(object):
     ValueError: ...
     """
 
-    _gateway = None
-    _jvm = None
+    # set assignment ignore temporarily to prevent errors from other files
+    _gateway: ClassVar[JavaGateway] = None  # type: ignore[assignment]
+    _jvm: ClassVar[JVMView] = None  # type: ignore[assignment]
     _next_accum_id = 0
-    _active_spark_context = None
+    _active_spark_context: ClassVar["SparkContext"] = None  # type: ignore[assignment]

Review comment:
       I am not very fond of this, to be honest.
   
   While we're still discussing possible strategies for handling such cases, 
for now we should stick to the convention already set by `sql.session`
   
   
https://github.com/apache/spark/blob/feba5ac32f2598f6ca8a274850934106be0db64d/python/pyspark/sql/session.py#L289-L290
   
   and use actual types (`Optional[_]`).
   
   However, I've checked this locally and there are almost 500 invocations that 
would require type adjustments if we go this way, so I lean towards keeping 
these ignores for now and creating a separate ticket for further type refinements.
   
   cc @HyukjinKwon @ueshin 
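
   For illustration, the `Optional[_]` convention referenced above would look roughly like the sketch below. This is a hypothetical minimal example, not the actual `SparkContext` definition; `JavaGatewayStub` stands in for `py4j.java_gateway.JavaGateway`. It also shows why adopting real `Optional` types forces changes at call sites: every caller must narrow `Optional[...]` before use.
   
   ```python
   # Hypothetical sketch of the Optional[...] convention for class-level
   # attributes that start out as None, instead of
   # `_gateway: ClassVar[JavaGateway] = None  # type: ignore[assignment]`.
   from typing import ClassVar, Optional
   
   
   class JavaGatewayStub:
       """Stand-in for py4j.java_gateway.JavaGateway (illustration only)."""
   
   
   class SparkContextSketch:
       # Declare the honest type: these are None until initialization.
       _gateway: ClassVar[Optional[JavaGatewayStub]] = None
       _active_spark_context: ClassVar[Optional["SparkContextSketch"]] = None
   
       @classmethod
       def _assert_gateway(cls) -> JavaGatewayStub:
           # With Optional types, every access site needs a narrowing check
           # like this one, which is why ~500 invocations would need updates.
           if cls._gateway is None:
               raise RuntimeError("Gateway not initialized")
           return cls._gateway
   ```
   
   The trade-off: the `# type: ignore[assignment]` route keeps call sites unchanged, while the `Optional` route is type-correct but requires narrowing everywhere the attribute is read.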




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


