zero323 commented on a change in pull request #34466:
URL: https://github.com/apache/spark/pull/34466#discussion_r740926529



##########
File path: python/pyspark/context.py
##########
@@ -150,8 +161,10 @@ def __init__(self, master=None, appName=None, sparkHome=None, pyFiles=None,
             self.stop()
             raise
 
-    def _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
-                 conf, jsc, profiler_cls):
+    # TODO

Review comment:
       Most of the time, it is just an exercise in tracing code execution. Once you know the context, it is usually relatively easy to fill in the blanks. For example, here you can see that `_do_init` is called only by `__init__`:
   
   
https://github.com/apache/spark/blob/8cd7c236d8c78236d83562b9a43efd05679640a8/python/pyspark/context.py#L157-L158
   
   So all the types have to conform to the `__init__` signature.
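   
   For illustration only, here is a sketch of what mirroring the `__init__` parameters could look like. The concrete types (`Serializer`, `SparkConf`, the `Optional[...]` choices, and the `JavaObject` placeholder) are assumptions, not the final annotations for this PR; the real ones should be copied from whatever `__init__` declares:
   
   ```python
   from typing import Any, Dict, List, Optional, Type
   
   # Sketch only: each parameter type is assumed to mirror the corresponding
   # __init__ parameter, since __init__ passes its arguments straight through.
   def _do_init(
       self,
       master: Optional[str],
       appName: Optional[str],
       sparkHome: Optional[str],
       pyFiles: Optional[List[str]],
       environment: Optional[Dict[str, Any]],
       batchSize: int,
       serializer: "Serializer",     # pyspark.serializers.Serializer (assumed)
       conf: Optional["SparkConf"],  # pyspark.conf.SparkConf (assumed)
       jsc: Optional[Any],           # py4j JavaObject; Any used as a placeholder
       profiler_cls: Type[Any],      # a Profiler subclass (assumed)
   ) -> None:
       ...
   ```
   
   Keeping the two signatures in sync this way also means that a future change to the `__init__` types should surface as a `mypy` error at the `_do_init` call site.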
   
   In more convoluted cases, you can try either stepping through the code with a debugger or starting with a tool like `MonkeyType`, but keep in mind that different code paths might be possible, so automated annotations are usually useless for anything but guidance.
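   
   As a rough sketch of that workflow (`examples/my_job.py` and its `main` entry point are hypothetical; this assumes `MonkeyType` is installed and traces are collected from a representative run):
   
   ```python
   # CLI equivalent:
   #   $ monkeytype run examples/my_job.py   # records call traces to monkeytype.sqlite3
   #   $ monkeytype stub pyspark.context     # prints inferred stubs for review
   import monkeytype
   
   from examples.my_job import main  # hypothetical script exercising SparkContext
   
   # monkeytype.trace() is MonkeyType's context manager for recording the
   # argument and return types of every call made inside the block.
   with monkeytype.trace():
       main()
   ```
   
   Whatever `monkeytype stub` prints should be treated as raw material: it only reflects the types seen on the traced runs, so it will miss any path the example script did not exercise.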




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


