jhtimmins commented on a change in pull request #15631:
URL: https://github.com/apache/airflow/pull/15631#discussion_r625495972
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -387,8 +391,9 @@ Using Breeze
</div>
3. Setup mysql database in
-MySQL Workbench with Host ``127.0.0.1``, port ``23306``, user ``root`` and password
- blank(leave empty), default schema ``airflow``.
+ PyCharm Database tool with Host ``127.0.0.1``, port ``23306``, user ``root`` and password
Review comment:
This phrasing seems a bit off. It seems to presume that PyCharm is the default, which I'm not sure is accurate.
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -572,6 +623,8 @@ Creating a branch
<div align="center" style="padding-bottom:10px">
<img src="images/quick_start/creating_branch_2.png"
alt="Giving name to a branch">
+ <img src="images/quick_start/vscode_creating_branch_2.png"
+ alt="Giving name to a branch">
Review comment:
```suggestion
alt="Giving a name to a branch">
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
Review comment:
```suggestion
- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in MySQL Workbench.
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+  scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.
+
+- Copy any example DAG present in airflow project's ``\airflow\example_dags`` directory to ``\files\dags\``.
+
+- Add main block at the end of your DAG file to make it runnable. It will run a back_fill job:
+
+ .. code-block:: python
+
+ if __name__ == '__main__':
+ from airflow.utils.state import State
+ dag.clear(dag_run_state=State.NONE)
+ dag.run()
+
+- Add ``"AIRFLOW__CORE__EXECUTOR": "DebugExecutor"`` to the ``"env"`` field of Debug configuration.
+
+ - Using Run view click on create a launch.json file
+
+ .. raw:: html
+
+ <div align="center" style="padding-bottom:10px">
+ <img src="images/quick_start/vscode_add_configuration_1.png"
+ alt="Add Debug Configuration to Visual Studio Code">
+ <img src="images/quick_start/vscode_add_configuration_2.png"
+ alt="Add Debug Configuration to Visual Studio Code">
+ <img src="images/quick_start/vscode_add_configuration_3.png"
+ alt="Add Debug Configuration to Visual Studio Code">
+ </div>
+
+  - Change ``"program"`` to point to an example dag and add ``"env"`` and ``"python"`` fields to new Python configuration
Review comment:
```suggestion
- Change ``"program"`` to point to an example dag and add ``"env"`` and ``"python"`` fields to the new Python configuration
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+  scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.
+
+- Copy any example DAG present in airflow project's ``\airflow\example_dags`` directory to ``\files\dags\``.
Review comment:
```suggestion
- Copy any example DAG present in the ``\airflow\example_dags`` directory to ``\files\dags\``.
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
Review comment:
I think this paragraph is a bit confusing. Is there a central thing
you're trying to communicate?
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+  scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.
+
+- Copy any example DAG present in airflow project's ``\airflow\example_dags`` directory to ``\files\dags\``.
+
+- Add main block at the end of your DAG file to make it runnable. It will run a back_fill job:
+
+ .. code-block:: python
+
+ if __name__ == '__main__':
+ from airflow.utils.state import State
+ dag.clear(dag_run_state=State.NONE)
+ dag.run()
Review comment:
```suggestion
from airflow.utils.state import State
...
if __name__ == '__main__':
dag.clear(dag_run_state=State.NONE)
dag.run()
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+  scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.
+
+- Copy any example DAG present in airflow project's ``\airflow\example_dags`` directory to ``\files\dags\``.
+
+- Add main block at the end of your DAG file to make it runnable. It will run a back_fill job:
Review comment:
```suggestion
- Add a `__main__` block at the end of your DAG file to make it runnable. It will run a `back_fill` job:
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -556,13 +605,15 @@ Starting development
Creating a branch
-----------------
-1. Click on branch symbol in the bottom right corner of Pycharm
+1. Click on branch symbol in the status bar
Review comment:
```suggestion
1. Click on the branch symbol in the status bar
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+  scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.
+
+- Copy any example DAG present in airflow project's ``\airflow\example_dags`` directory to ``\files\dags\``.
+
+- Add main block at the end of your DAG file to make it runnable. It will run a back_fill job:
+
+ .. code-block:: python
+
+ if __name__ == '__main__':
+ from airflow.utils.state import State
+ dag.clear(dag_run_state=State.NONE)
+ dag.run()
Review comment:
In general, we should probably put import statements at the module level
unless we're trying to avoid a circular import.
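For illustration, a minimal sketch of how the end of a copied example DAG file could look with the `State` import moved to module level. The DAG id, task and file contents below are placeholders, not part of the PR; any shipped example DAG would work the same way:

```python
# Hypothetical DAG file under files/dags/ -- only meant to show the
# module-level import and the runnable main block from the quoted diff.
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago
from airflow.utils.state import State  # module-level import, per the comment above

dag = DAG(dag_id="debug_me", schedule_interval=None, start_date=days_ago(1))
hello = BashOperator(task_id="hello", bash_command="echo hello", dag=dag)

if __name__ == "__main__":
    # Clear any earlier runs, then backfill; with AIRFLOW__CORE__EXECUTOR=DebugExecutor
    # set in the debug configuration, tasks run in-process and hit breakpoints.
    dag.clear(dag_run_state=State.NONE)
    dag.run()
```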
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -837,15 +890,15 @@ To avoid burden on CI infrastructure and to save time, Pre-commit hooks can be r
.. code-block:: bash
- $ cd ~/PycharmProjects/airflow
+ $ cd ~/Projects/airflow
$ pre-commit install
$ git commit -m "Added xyz"
-9. To disable Pre-commit
+1. To disable Pre-commit
Review comment:
Is there a reason these have been renumbered to `1.`?
##########
File path: TESTING.rst
##########
@@ -55,10 +55,10 @@ Follow the guidelines when writing unit tests:
**NOTE:** We plan to convert all unit tests to standard "asserts"
semi-automatically, but this will be done later
in Airflow 2.0 development phase. That will include setUp/tearDown/context
managers and decorators.
-Running Unit Tests from IDE
----------------------------
+Running Unit Tests from PyCharm IDE
+-----------------------------------
-To run unit tests from the IDE, create the `local virtualenv <LOCAL_VIRTUALENV.rst>`_,
+To run unit tests from PyCharm IDE, create the `local virtualenv <LOCAL_VIRTUALENV.rst>`_,
Review comment:
```suggestion
To run unit tests from the PyCharm IDE, create the `local virtualenv <LOCAL_VIRTUALENV.rst>`_,
```
##########
File path: CONTRIBUTORS_QUICK_START.rst
##########
@@ -544,6 +551,48 @@ Setting up Debug
alt="Add environment variable pycharm">
</div>
+- Now Debug an example dag and view the entries in tables such as ``dag_run, xcom`` etc in mysql workbench.
+
+B. Using Visual Studio Code
+
+- In Visual Studio Code open airflow project, directory ``\files\dags`` of local machine is by default mounted to docker
+  machine when breeze airflow is started. So any DAG file present in this directory will be picked automatically by
+  scheduler running in docker machine and same can be seen on ``http://127.0.0.1:28080``.
+
+- Copy any example DAG present in airflow project's ``\airflow\example_dags`` directory to ``\files\dags\``.
+
+- Add main block at the end of your DAG file to make it runnable. It will run a back_fill job:
+
+ .. code-block:: python
+
+ if __name__ == '__main__':
+ from airflow.utils.state import State
+ dag.clear(dag_run_state=State.NONE)
+ dag.run()
+
+- Add ``"AIRFLOW__CORE__EXECUTOR": "DebugExecutor"`` to the ``"env"`` field of Debug configuration.
+
+ - Using Run view click on create a launch.json file
Review comment:
```suggestion
- Using the `Run` view click on `Create a launch.json` file
```
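For anyone following along, a rough sketch of what the resulting `launch.json` entry might look like once the `"program"`, `"env"` and `"python"` fields mentioned above are filled in. The DAG file name and interpreter path are placeholders, not the exact values from the PR:

```jsonc
{
    "version": "0.2.0",
    "configurations": [
        {
            // Placeholder values -- point "program" at whichever DAG you copied
            // into files/dags and "python" at your own interpreter.
            "name": "Airflow DAG debug (example)",
            "type": "python",
            "request": "launch",
            "program": "${workspaceFolder}/files/dags/example_bash_operator.py",
            "console": "integratedTerminal",
            "env": {
                "AIRFLOW__CORE__EXECUTOR": "DebugExecutor"
            },
            "python": "${workspaceFolder}/.venv/bin/python"
        }
    ]
}
```

The field names are the standard VS Code Python debug settings; only the placeholder values need adjusting.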
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]