HyukjinKwon commented on code in PR #49221:
URL: https://github.com/apache/spark/pull/49221#discussion_r1889877585


##########
python/docs/source/development/testing.rst:
##########
@@ -38,6 +38,27 @@ After that, the PySpark test cases can be run via using ``python/run-tests``. Fo
 
 Note that you may set ``OBJC_DISABLE_INITIALIZE_FORK_SAFETY`` environment variable to ``YES`` if you are running tests on Mac OS.
 
+.. note::
+
+    If the Spark driver is unavailable, you can resolve the issue using the following methods:
+
+    **Set SPARK_LOCAL_IP**:
+
+    Configure the environment variable ``SPARK_LOCAL_IP`` to bind to the local address ``127.0.0.1``::
+
+        export SPARK_LOCAL_IP=127.0.0.1

Review Comment:
   I actually don't think this is specific to PySpark but to Spark development in general. Since we don't have a page for Spark itself yet, maybe we can add the details into https://spark.apache.org/docs/latest/api/python/development/debugging.html#common-exceptions-errors for now.
   
   It'd be great if we could also explain the background and details: why it has to be set, and why the issue happens in the first place.
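   
   For context, here is a rough sketch of why the variable matters. It approximates the lookup order used by Spark's `Utils.localHostName` (an explicit `SPARK_LOCAL_IP` wins, otherwise the machine's hostname is resolved); it is not Spark's actual code, and the helper name is made up for illustration:
   
   ```python
   import os
   import socket
   
   def resolve_driver_host():
       # Rough approximation of Spark's lookup order: an explicit
       # SPARK_LOCAL_IP wins; otherwise Spark resolves the local hostname.
       explicit = os.environ.get("SPARK_LOCAL_IP")
       if explicit:
           return explicit
       try:
           # On macOS, hostnames like "my-macbook.local" often have no
           # DNS/hosts entry, so this lookup fails and the driver cannot
           # bind, which surfaces as the driver being unavailable.
           return socket.gethostbyname(socket.gethostname())
       except socket.gaierror:
           raise RuntimeError(
               "Cannot resolve local hostname; set SPARK_LOCAL_IP=127.0.0.1"
           )
   
   print(resolve_driver_host())
   ```
   
   With `SPARK_LOCAL_IP=127.0.0.1` the hostname-resolution step is skipped entirely, which is why the workaround is reliable.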



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

