viirya commented on a change in pull request #29639:
URL: https://github.com/apache/spark/pull/29639#discussion_r484171058



##########
File path: python/docs/source/development/debugging.rst
##########
@@ -0,0 +1,280 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+=================
+Debugging PySpark
+=================
+
+PySpark uses Spark as an engine. PySpark uses `Py4J <https://www.py4j.org/>`_ to leverage Spark to submit and compute the jobs.
+
+On the driver side, PySpark communicates with the driver on the JVM by using `Py4J <https://www.py4j.org/>`_.
+When :class:`pyspark.sql.SparkSession` or :class:`pyspark.SparkContext` is created and initialized, PySpark launches a JVM to communicate.
+
+On the executor side, Python workers execute and handle Python native functions or data. They are not launched if
+a PySpark application does not require interaction between Python workers and JVMs. They are lazily launched only when
+Python native functions or data have to be handled, for example, when you execute pandas UDFs or PySpark RDD APIs.
+
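+As a minimal sketch of this lazy launch (the exact range size and the UDF below are illustrative), a JVM-only action such as counting a range does not launch Python workers, whereas applying a Python UDF does:
+
+.. code-block:: python
+
+    from pyspark.sql import SparkSession
+    from pyspark.sql.functions import udf
+
+    spark = SparkSession.builder.getOrCreate()
+
+    # Runs entirely on the JVM; no Python workers are launched.
+    spark.range(10).count()
+
+    # Applying a Python UDF requires Python workers on the executors,
+    # so they are launched lazily at this point.
+    plus_one = udf(lambda x: x + 1, "long")
+    spark.range(10).select(plus_one("id")).show()
+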
+This page focuses on debugging the Python side of PySpark on both driver and executor sides instead of focusing on debugging
+with the JVM. Profiling and debugging the JVM are described at `Useful Developer Tools <https://spark.apache.org/developer-tools.html>`_.
+
+Note that,
+
+- If you are running locally, you can directly debug the driver side by using your IDE without the remote debug feature.
+- *There are many other ways of debugging PySpark applications*. For example, you can remotely debug by using the open source `Remote Debugger <https://www.pydev.org/manual_adv_remote_debugger.html>`_ instead of using PyCharm Professional documented here.
+
+
+Remote Debugging (PyCharm Professional)
+---------------------------------------
+
+This section describes remote debugging on both driver and executor sides within a single machine for ease of demonstration.
+The way of debugging PySpark on the executor side is different from the way of debugging on the driver side. Therefore, they will be demonstrated separately.
+In order to debug PySpark applications on other machines, please refer to the full instructions that are specific
+to PyCharm, documented `here <https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html>`_.
+
+Firstly, choose **Edit Configuration...** from the *Run* menu. It opens the **Run/Debug Configurations** dialog.
+Click ``+`` on the toolbar, and from the list of available configurations, select **Python Debug Server**.
+Enter the name of this new configuration, for example ``MyRemoteDebugger``, and also specify the port number, for example ``12345``.
+
+.. image:: ../../../../docs/img/pyspark-remote-debug1.png
+    :alt: PyCharm remote debugger setting
+
+| After that, you should install the corresponding version of the ``pydevd-pycharm`` package on all the machines that will connect to your PyCharm debugger. The previous dialog shows the command to install it.
+
+.. code-block:: text
+
+    pip install pydevd-pycharm~=<version of PyCharm on the local machine>
+
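+For example, if the PyCharm version on your local machine were from the 2020.2 line, the command might look as follows (the build number below is purely illustrative; use the exact command shown in your own dialog):
+
+.. code-block:: bash
+
+    # 202.6397.94 is a hypothetical build number for illustration only.
+    pip install pydevd-pycharm~=202.6397.94
+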
+Driver Side
+~~~~~~~~~~~
+
+To debug on the driver side, your application should connect to the debugging server. Copy and paste the code
+with ``pydevd_pycharm.settrace`` to the top of your PySpark script. Suppose the file name is ``app.py``:
+
+.. code-block:: bash
+
+    echo "#======================Copy and paste from the previous 
dialog===========================
+    import pydevd_pycharm
+    pydevd_pycharm.settrace('localhost', port=12345, stdoutToServer=True, 
stderrToServer=True)
+    
#========================================================================================
+    # Your PySpark application codes:
+    from pyspark.sql import SparkSession
+    spark = SparkSession.builder.getOrCreate()
+    spark.range(10).show()" > app.py
+
+Start debugging with your ``MyRemoteDebugger``.
+
+.. image:: ../../../../docs/img/pyspark-remote-debug2.png
+    :alt: PyCharm run remote debugger
+
+| After that, submit your application. This will connect to your PyCharm debugging server and enable you to debug on the driver side remotely.
+
+.. code-block:: bash
+
+    spark-submit app.py
+
+Executor Side
+~~~~~~~~~~~~~
+
+To debug on the executor side, prepare a Python file as below In your current working directory.

Review comment:
       In -> in



