jsnb-devoted opened a new issue #19172:
URL: https://github.com/apache/airflow/issues/19172


   ### Apache Airflow version
   
   2.1.3
   
   ### Operating System
   
   Debian GNU/Linux
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon==2.2.0
   apache-airflow-providers-cncf-kubernetes==2.0.2
   apache-airflow-providers-datadog==2.0.1
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-google==5.1.0
   apache-airflow-providers-http==2.0.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-jdbc==2.0.1
   apache-airflow-providers-postgres==2.2.0
   apache-airflow-providers-snowflake==2.1.1
   apache-airflow-providers-sqlite==2.0.1
   apache-airflow-providers-ssh==2.1.1
   ```
   
   ### Deployment
   
   Other 3rd-party Helm chart
   
   ### Deployment details
   
   Running the Kubernetes executor.
   
   ### What happened
   
   I ran `airflow tasks run -A <dag id> <task id> "$(date +"%m-%d-%YT%H:%M:%S.%3N")"` with the Kubernetes executor to execute a single task from the command line _without running the whole DAG._ The command launches the pod and runs the task as expected, but once the task exits, Airflow queries the database for a DAG run that doesn't exist.
   
   When I tail the pod logs I see the following traceback **after the task 
succeeds or fails**:
   ```
   [2021-10-22 17:56:46,403] {{local_task_job.py:151}} INFO - Task exited with return code 1
   Traceback (most recent call last):
     File "/usr/local/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/usr/local/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/usr/local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 91, in wrapper
       return f(*args, **kwargs)
     File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 238, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 64, in _run_task_by_selected_method
       _run_task_by_local_task_job(args, ti)
     File "/usr/local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py", line 121, in _run_task_by_local_task_job
       run_job.run()
     File "/usr/local/lib/python3.8/site-packages/airflow/jobs/base_job.py", line 245, in run
       self._execute()
     File "/usr/local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 128, in _execute
       self.handle_task_exit(return_code)
     File "/usr/local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 166, in handle_task_exit
       self._run_mini_scheduler_on_child_tasks()
     File "/usr/local/lib/python3.8/site-packages/airflow/utils/session.py", line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File "/usr/local/lib/python3.8/site-packages/airflow/jobs/local_task_job.py", line 227, in _run_mini_scheduler_on_child_tasks
       dag_run = with_row_locks(
     File "/usr/local/lib/python3.8/site-packages/sqlalchemy/orm/query.py", line 3500, in one
       raise orm_exc.NoResultFound("No row was found for one()")
   sqlalchemy.orm.exc.NoResultFound: No row was found for one()
   2021/10/22 17:56:47.552525 [ERR] (cli) unexpected exit from subprocess (1)
   time="2021-10-22T17:56:47Z" level=info msg="Supervised subcommand exited" exitStatus=1 pid=13
   time="2021-10-22T17:56:47Z" level=info msg=Writing status=terminated statusPath=/run/vault/status-base
   time="2021-10-22T17:56:47Z" level=info msg=Exiting... exitCode=1
   ```
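   
   For context, SQLAlchemy's `Query.one()` raises `NoResultFound` whenever zero rows match, and since `airflow tasks run -A` never creates a `dag_run` row, there is nothing for the mini scheduler's query to find. Here is a minimal standalone sketch of just that failure mode (SQLAlchemy 1.4+, using a toy stand-in model, not Airflow's actual `DagRun`):
   
   ```python
   from sqlalchemy import Column, Integer, String, create_engine
   from sqlalchemy.orm import Session, declarative_base
   
   Base = declarative_base()
   
   class DagRun(Base):
       """Toy stand-in for Airflow's DagRun model, for illustration only."""
       __tablename__ = "dag_run"
       id = Column(Integer, primary_key=True)
       dag_id = Column(String)
   
   engine = create_engine("sqlite://")  # empty in-memory database
   Base.metadata.create_all(engine)
   
   with Session(engine) as session:
       # No dag_run row was ever created for the ad-hoc execution, so
       # .one() raises NoResultFound, matching the traceback above.
       session.query(DagRun).filter_by(dag_id="some_dag").one()
   ```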
   
   ### What you expected to happen
   
   I expected `airflow tasks run` to be able to run the task without needing a 
DAG run.
   
   I took a quick look at `local_task_job`, and it seems the "mini scheduler" is what raises the error. I don't know that feature well, but perhaps [the except block](https://github.com/apache/airflow/blob/614858fb7d443880451e6111b27fdaf942f563a4/airflow/jobs/local_task_job.py#L264) could be expanded to catch more SQLAlchemy errors? A sketch of what I mean follows.
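   
   To make the suggestion concrete, here is a hedged sketch of the kind of handling I have in mind -- `get_dag_run_or_none` is a hypothetical helper, not existing Airflow code, and I haven't tested this against the mini scheduler:
   
   ```python
   from airflow.models import DagRun
   from sqlalchemy.orm.exc import NoResultFound
   
   def get_dag_run_or_none(session, dag_id, execution_date):
       """Hypothetical helper: return the DagRun if one exists, else None,
       so the mini scheduler can be skipped instead of crashing."""
       try:
           return (
               session.query(DagRun)
               .filter_by(dag_id=dag_id, execution_date=execution_date)
               .one()
           )
       except NoResultFound:
           # `airflow tasks run -A` never creates a dag_run row, so a
           # missing row is expected here, not a hard failure.
           return None
   ```
   
   (`Query.one_or_none()` would express the same intent without the try/except, if silently skipping the mini scheduler is acceptable here.)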
   
   ### How to reproduce
   
   I would expect any execution of
   `airflow tasks run -A <dag id> <task id> "$(date +"%m-%d-%YT%H:%M:%S.%3N")"`
   to trigger this error; a minimal DAG is sketched below. Admittedly, I had never used this command before, so I'm not sure whether this was always the expected behavior.
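   
   For a concrete starting point, here is a hypothetical minimal DAG (the `repro_dag`/`repro_task` names are made up for illustration):
   
   ```python
   # repro_dag.py -- hypothetical minimal DAG for reproducing the error.
   from datetime import datetime
   
   from airflow import DAG
   from airflow.operators.bash import BashOperator
   
   with DAG(
       dag_id="repro_dag",
       start_date=datetime(2021, 1, 1),
       schedule_interval=None,  # never scheduled, so no DagRun rows exist
   ) as dag:
       BashOperator(task_id="repro_task", bash_command="echo hello")
   ```
   
   With that file deployed, `airflow tasks run -A repro_dag repro_task "$(date +"%m-%d-%YT%H:%M:%S.%3N")"` should run the task and then fail in the mini scheduler as described above.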
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

