gkoller opened a new issue #16106:
URL: https://github.com/apache/airflow/issues/16106


   **Apache Airflow version**: 
   
   * 2.1.0 
   * 2.0.2 
   * 2.0.1
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   n/a
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: 
     - Thinkpad T480
   - **OS** (e.g. from /etc/os-release): 
     - FreeBSD 13.0
   - **Kernel** (e.g. `uname -a`):  
     - `FreeBSD ketoembar 13.0-RELEASE FreeBSD 13.0-RELEASE #0 releng/13.0-n244733-ea31abc261f: Fri Apr  9 04:24:09 UTC 2021     [email protected]:/usr/obj/usr/src/amd64.amd64/sys/GENERIC  amd64`
   - **Install tools**: 
     - pip 21.1.2
   - **Others**: 
     - Python 3.8
   
   **What happened**:
   
   I wanted to debug a number of DAGs using PyCharm's debugger, so I configured Airflow's executor to be the `DebugExecutor` to prevent it from spawning processes/threads. However, every DAG now fails with:
   
   ```
   Traceback (most recent call last):
     File "/usr/home/guido/devel/airflow_debug_executor/venv/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/cli/commands/scheduler_command.py", line 64, in scheduler
       job.run()
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 237, in run
       self._execute()
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1286, in _execute
       self._run_scheduler_loop()
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1381, in _run_scheduler_loop
       self.executor.heartbeat()
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/executors/base_executor.py", line 162, in heartbeat
       self.sync()
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/executors/debug_executor.py", line 72, in sync
       task_succeeded = self._run_task(ti)
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/executors/debug_executor.py", line 86, in _run_task
       ti._run_finished_callback()  # pylint: disable=protected-access
     File "/usr/home/guido/devel/airflow_debug_executor/venv/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1363, in _run_finished_callback
       task = self.task
   AttributeError: 'TaskInstance' object has no attribute 'task'
   ```
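   
   For what it's worth, the error itself looks like a plain "attribute never set" problem: the traceback shows `DebugExecutor._run_task` calling `ti._run_finished_callback()`, which reads `self.task`, and nothing on that code path appears to have attached the task to the `TaskInstance` first. A minimal, self-contained sketch of that failure mode (hypothetical names, not Airflow internals):
   
   ```python
   # Sketch of the failure mode, assuming the `task` attribute is only
   # attached by an explicit refresh step that the executor path skipped.
   # All names here are made up for illustration; this is NOT Airflow code.
   class FakeTaskInstance:
       def refresh_from_task(self, task):
           # `self.task` only exists after this has been called.
           self.task = task
   
       def _run_finished_callback(self):
           task = self.task  # AttributeError if refresh_from_task() was never called
           callback = getattr(task, "on_success_callback", None)
           if callback:
               callback(self)
   
   
   ti = FakeTaskInstance()
   try:
       # Mirrors the ti._run_finished_callback() call from the traceback above,
       # reached without any prior refresh_from_task().
       ti._run_finished_callback()
   except AttributeError as exc:
       print(exc)  # 'FakeTaskInstance' object has no attribute 'task'
   ```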
   
   **What you expected to happen**:
   
   That the DAGs run without that error.
   
   **How to reproduce it**:
   
   I have reproduced it using the installation instructions at 
http://airflow.apache.org/docs/apache-airflow/stable/start/local.html
   
   Basic virtualenv setup
   ```console
   mkdir airflow_debug_executor
   cd airflow_debug_executor
   python3.8 -m venv venv
   source venv/bin/activate
   pip install -U pip setuptools wheel
   ```
   
   Airflow installation:
   
   ```console
   export AIRFLOW_HOME=$(pwd)/airflow
   pip install apache-airflow==2.1.0 --constraint https://raw.githubusercontent.com/apache/airflow/constraints-2.1.0/constraints-3.8.txt
   airflow db init
   airflow users create --username admin --firstname Peter --lastname Parker --role Admin --email [email protected]
   ```
   
   * Configure Airflow to use `DebugExecutor` by editing `airflow.cfg` (see the config snippet after this list)
   * Start `airflow webserver --port 8080` 
   * Start `airflow scheduler`
   * In browser trigger `example_bash_operator`
   * Observe the first task `runme_0` fail with `AttributeError: 'TaskInstance' object has no attribute 'task'`
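   
   For the first step above, the `airflow.cfg` change is just the executor setting in the `[core]` section; exporting `AIRFLOW__CORE__EXECUTOR=DebugExecutor` before starting the webserver and scheduler should be an equivalent override:
   
   ```ini
   # airflow.cfg
   [core]
   executor = DebugExecutor
   ```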
   
   **Anything else we need to know**: 
   
   In our environment we use PostgreSQL. The steps in **How to reproduce it** 
use SQLite. 
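   
   For reference, "we use PostgreSQL" just means that `sql_alchemy_conn` (in the `[core]` section on Airflow 2.1.x) points at a PostgreSQL database instead of the default SQLite file, along these lines (host and credentials below are placeholders, not our real values):
   
   ```ini
   # airflow.cfg
   [core]
   sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost:5432/airflow
   ```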

