quoctienkt opened a new issue, #29765:
URL: https://github.com/apache/airflow/issues/29765

   ### Apache Airflow version
   
   2.5.1
   
   ### What happened
   
I'm following the Airflow tutorial, and I hit an error when I reached this doc:
   [Tutorials - Testing section](https://airflow.apache.org/docs/apache-airflow/stable/tutorial/fundamentals.html#id2)
   
   This is the error stack trace:
   ```
   D:\Documents\Projects\Learning\Airflow>airflow tasks test tutorial sleep 2015-06-01
   c:\users\quoct\appdata\local\programs\python\python38\lib\site-packages\airflow\models\base.py:49 MovedIn20Warning: Deprecated API features detected! These feature(s) are not compatible with SQLAlchemy 2.0. To prevent incompatible upgrades prior to updating applications, ensure requirements files are pinned to "sqlalchemy<2.0". Set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings. Set environment variable SQLALCHEMY_SILENCE_UBER_WARNING=1 to silence this message. (Background on SQLAlchemy 2.0 at: https://sqlalche.me/e/b8d9)
   [2023-02-25 21:16:09,229] {dagbag.py:538} INFO - Filling up the DagBag from 
C:\Users\quoct\airflow\dags
   [2023-02-25 21:16:09,644] {taskmixin.py:205} WARNING - Dependency 
<Task(BashOperator): create_entry_group>, delete_entry_group already registered 
for DAG: example_complex
   [2023-02-25 21:16:09,645] {taskmixin.py:205} WARNING - Dependency 
<Task(BashOperator): delete_entry_group>, create_entry_group already registered 
for DAG: example_complex
   [2023-02-25 21:16:09,646] {taskmixin.py:205} WARNING - Dependency 
<Task(BashOperator): create_entry_gcs>, delete_entry already registered for 
DAG: example_complex
   [2023-02-25 21:16:09,646] {taskmixin.py:205} WARNING - Dependency 
<Task(BashOperator): delete_entry>, create_entry_gcs already registered for 
DAG: example_complex
   [2023-02-25 21:16:09,647] {taskmixin.py:205} WARNING - Dependency 
<Task(BashOperator): create_tag>, delete_tag already registered for DAG: 
example_complex
   [2023-02-25 21:16:09,648] {taskmixin.py:205} WARNING - Dependency 
<Task(BashOperator): delete_tag>, create_tag already registered for DAG: 
example_complex
   [2023-02-25 21:16:09,818] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): prepare_email>, send_email already registered 
for DAG: example_dag_decorator
   [2023-02-25 21:16:09,819] {taskmixin.py:205} WARNING - Dependency 
<Task(EmailOperator): send_email>, prepare_email already registered for DAG: 
example_dag_decorator
   [2023-02-25 21:16:09,893] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): print_the_context>, log_sql_query already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,894] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): log_sql_query>, print_the_context already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,896] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): print_the_context>, log_sql_query already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,897] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): log_sql_query>, print_the_context already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,899] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): print_the_context>, log_sql_query already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,900] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): log_sql_query>, print_the_context already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,901] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): print_the_context>, log_sql_query already 
registered for DAG: example_python_operator
   [2023-02-25 21:16:09,902] {taskmixin.py:205} WARNING - Dependency 
<Task(_PythonDecoratedOperator): log_sql_query>, print_the_context already 
registered for DAG: example_python_operator
   
c:\users\quoct\appdata\local\programs\python\python38\lib\site-packages\airflow\cli\commands\task_command.py:159
 RemovedInAirflow3Warning: Calling `DAG.create_dagrun()` without an explicit 
data interval is deprecated
   [2023-02-25 21:16:10,813] {taskinstance.py:1083} INFO - Dependencies all met 
for <TaskInstance: tutorial.sleep 
__airflow_temporary_run_2023-02-25T14:16:10.736460+00:00__ [None]>
   [2023-02-25 21:16:10,821] {taskinstance.py:1083} INFO - Dependencies all met 
for <TaskInstance: tutorial.sleep 
__airflow_temporary_run_2023-02-25T14:16:10.736460+00:00__ [None]>
   [2023-02-25 21:16:10,822] {taskinstance.py:1279} INFO -
   
--------------------------------------------------------------------------------
   [2023-02-25 21:16:10,822] {taskinstance.py:1280} INFO - Starting attempt 1 
of 4
   [2023-02-25 21:16:10,823] {taskinstance.py:1281} INFO -
   
--------------------------------------------------------------------------------
   [2023-02-25 21:16:10,824] {taskinstance.py:1300} INFO - Executing 
<Task(BashOperator): sleep> on 2015-06-01T00:00:00+00:00
   [2023-02-25 21:16:10,984] {taskinstance.py:1507} INFO - Exporting the 
following env vars:
   [email protected]
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=tutorial
   AIRFLOW_CTX_TASK_ID=sleep
   AIRFLOW_CTX_EXECUTION_DATE=2015-06-01T00:00:00+00:00
   AIRFLOW_CTX_TRY_NUMBER=1
   
AIRFLOW_CTX_DAG_RUN_ID=__airflow_temporary_run_2023-02-25T14:16:10.736460+00:00__
   [2023-02-25 21:16:11,004] {subprocess.py:63} INFO - Tmp dir root location: 
    C:\Users\quoct\AppData\Local\Temp
   [2023-02-25 21:16:11,006] {subprocess.py:75} INFO - Running command: 
['bash', '-c', 'sleep 5']
   [2023-02-25 21:16:11,007] {taskinstance.py:1768} ERROR - Task failed with 
exception
   Traceback (most recent call last):
     File 
"c:\users\quoct\appdata\local\programs\python\python38\lib\site-packages\airflow\operators\bash.py",
 line 187, in execute        
       result = self.subprocess_hook.run_command(
     File 
"c:\users\quoct\appdata\local\programs\python\python38\lib\site-packages\airflow\hooks\subprocess.py",
 line 77, in run_command   
       self.sub_process = Popen(
     File 
"c:\users\quoct\appdata\local\programs\python\python38\lib\subprocess.py", line 
761, in __init__
       raise ValueError("preexec_fn is not supported on Windows "
   ValueError: preexec_fn is not supported on Windows platforms
   ```
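
   The last frame of the traceback is the real cause: as the log shows, Airflow's `SubprocessHook` hands `subprocess.Popen` a `preexec_fn`, and CPython rejects that argument on Windows before the child process is ever created. A minimal sketch of the pattern (the platform guard below is illustrative only; Airflow itself passes the callback unconditionally, which is why the task fails on Windows):

   ```python
   import os
   import subprocess
   import sys

   def run_command_sketch(command):
       # Sketch of what the traceback shows: SubprocessHook starts the child
       # via Popen with a preexec_fn so the command runs in its own session
       # (letting Airflow signal the whole process group later). preexec_fn
       # is POSIX-only; passing it on Windows raises
       #   ValueError: preexec_fn is not supported on Windows platforms
       kwargs = {}
       if sys.platform != "win32":
           kwargs["preexec_fn"] = os.setsid  # hypothetical guard, not in Airflow
       proc = subprocess.Popen(
           command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kwargs
       )
       out, _ = proc.communicate()
       return proc.returncode, out
   ```

   Even with such a guard, the tutorial's `['bash', '-c', 'sleep 5']` would still need a `bash` executable on `PATH`, which a stock Windows install does not have; running Airflow under WSL or Docker is a common way to follow the tutorial on Windows.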
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   Create a Python file with the code below:
   
   ```python
   
   from datetime import datetime, timedelta
   from textwrap import dedent
   
   # The DAG object; we'll need this to instantiate a DAG
   from airflow import DAG
   
   # Operators; we need this to operate!
   from airflow.operators.bash import BashOperator
   
   import subprocess
   subprocess.call('preexec_fn', shell=True)
   
   with DAG(
       "tutorial",
       # These args will get passed on to each operator
        # You can override them on a per-task basis during operator initialization
       default_args={
           "depends_on_past": False,
           "email": ["[email protected]"],
           "email_on_failure": False,
           "email_on_retry": False,
           "retries": 1,
           "retry_delay": timedelta(minutes=5),
           # 'queue': 'bash_queue',
           # 'pool': 'backfill',
           # 'priority_weight': 10,
           # 'end_date': datetime(2016, 1, 1),
           # 'wait_for_downstream': False,
           # 'sla': timedelta(hours=2),
           # 'execution_timeout': timedelta(seconds=300),
           # 'on_failure_callback': some_function,
           # 'on_success_callback': some_other_function,
           # 'on_retry_callback': another_function,
           # 'sla_miss_callback': yet_another_function,
           # 'trigger_rule': 'all_success'
       },
       description="A simple tutorial DAG",
       schedule=timedelta(days=1),
       start_date=datetime(2021, 1, 1),
       catchup=False,
       tags=["example"],
   ) as dag:
   
       # t1, t2 and t3 are examples of tasks created by instantiating operators
       t1 = BashOperator(
           task_id="print_date",
           bash_command="date",
       )
   
       t2 = BashOperator(
           task_id="sleep",
           depends_on_past=False,
           bash_command="sleep 5",
           retries=3,
       )
       t1.doc_md = dedent(
           """\
       #### Task Documentation
       You can document your task using the attributes `doc_md` (markdown),
       `doc` (plain text), `doc_rst`, `doc_json`, `doc_yaml` which gets
       rendered in the UI's Task Instance Details page.
       
        ![img](http://montcs.bloomu.edu/~bobmon/Semesters/2012-01/491/import%20soul.png)
       **Image Credit:** Randall Munroe, [XKCD](https://xkcd.com/license.html)
       """
       )
   
        dag.doc_md = __doc__  # providing that you have a docstring at the beginning of the DAG; OR
       dag.doc_md = """
       This is a documentation placed anywhere
       """  # otherwise, type it like this
       templated_command = dedent(
           """
       {% for i in range(5) %}
           echo "{{ ds }}"
           echo "{{ macros.ds_add(ds, 7)}}"
       {% endfor %}
       """
       )
   
       t3 = BashOperator(
           task_id="templated",
           depends_on_past=False,
           bash_command=templated_command,
       )
   
       t1 >> [t2, t3]
   ```
   
   Then run:

   ```shell
   # initialize the database tables
   airflow db init

   # print the list of active DAGs
   airflow dags list

   # print the list of tasks in the "tutorial" DAG
   airflow tasks list tutorial

   # print the hierarchy of tasks in the "tutorial" DAG
   airflow tasks list tutorial --tree

   # command layout: command subcommand [dag_id] [task_id] [(optional) date]

   # test print_date
   airflow tasks test tutorial print_date 2015-06-01

   # test sleep
   airflow tasks test tutorial sleep 2015-06-01
   ```
   
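   While the underlying Windows limitation stands, the `sleep 5` task itself can be expressed as a plain Python callable, which could be wired to a `PythonOperator` instead of `BashOperator`. The function below is a hypothetical stand-in for testing, not part of the tutorial:

   ```python
   import time

   def sleep_task(seconds: float = 5.0) -> float:
       # Portable equivalent of the tutorial's `sleep 5` bash command:
       # time.sleep behaves the same on Windows and POSIX, so a task built
       # on this callable avoids both bash and preexec_fn entirely.
       start = time.monotonic()
       time.sleep(seconds)
       return time.monotonic() - start
   ```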
   
   ### Operating System
   
   Windows 10 21H2 (build 19044.2604)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

