easontm opened a new issue #20143:
URL: https://github.com/apache/airflow/issues/20143


   ### Apache Airflow version
   
   2.2.2 (latest released)
   
   ### What happened
   
   The [example task instance 
mutation](https://airflow.apache.org/docs/apache-airflow/stable/concepts/cluster-policies.html#task-instance-mutation)
 shows a queue mutation:
   ```python
   def task_instance_mutation_hook(task_instance: TaskInstance):
       if task_instance.try_number >= 1:
           task_instance.queue = 'retry_queue'
   ```
   
   An easy way for me to validate that queue mutation works is to use my 
`CeleryKubernetesExecutor` deployment and make the hook send the task to the 
kubernetes queue.
   
   **What happens**: If I trigger a new DAG run, I get print statements from my 
pod mutation hook (so the hook did successfully mutate the task instance and 
send it to the Kubernetes executor), and the task does not appear in the Celery 
worker logs. However, if I then clear the task from the UI and let the 
scheduler queue it again, it goes to the Celery worker instead.
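
   For context, the pod mutation hook that emits the `CUSTOM_HOOK` lines in the 
logs below might look roughly like this (a hypothetical sketch; the actual hook 
body isn't shown in this report, only its log output):
   ```python
   def pod_mutation_hook(pod):
       # This hook only fires when a task is handed to the Kubernetes
       # executor, so seeing its output confirms the queue mutation took
       # effect and the task left the Celery path.
       print(f"CUSTOM_HOOK - Mutating pod for task {pod.metadata.name}")
   ```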
   
   Task log 1:
   ```
   *** Reading local file: 
/usr/local/airflow/logs/foo/bar/2021-12-07T01:00:00+00:00/1.log
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1035} INFO - Dependencies all 
met for <TaskInstance: foo.bar manual__2021-12-07T01:00:00+00:00 [queued]>
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1035} INFO - Dependencies all 
met for <TaskInstance: foo.bar manual__2021-12-07T01:00:00+00:00 [queued]>
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1241} INFO - 
   
--------------------------------------------------------------------------------
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1242} INFO - Starting attempt 1 
of 1
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1243} INFO - 
   
--------------------------------------------------------------------------------
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1262} INFO - Executing 
<Task(PythonOperator): bar> on 2021-12-07 01:00:00+00:00
   [2021-12-08, 14:13:28 UTC] {standard_task_runner.py:52} INFO - Started 
process 41 to run task
   [2021-12-08, 14:13:28 UTC] {standard_task_runner.py:76} INFO - Running: 
['airflow', 'tasks', 'run', 'foo', 'bar', 'manual__2021-12-07T01:00:00+00:00', 
'--job-id', '2572809', '--raw', '--subdir', 'DAGS_FOLDER/foo/foo.py', 
'--cfg-path', '/tmp/tmpbk0cglnj', '--error-file', '/tmp/tmpcua95x7x']
   [2021-12-08, 14:13:28 UTC] {standard_task_runner.py:77} INFO - Job 2572809: 
Subtask bar
   [2021-12-08, 14:13:28 UTC] {logging_mixin.py:109} INFO - Running 
<TaskInstance: foo.bar manual__2021-12-07T01:00:00+00:00 [running]> on host 
foobar.7314175dfa62
   [2021-12-08, 14:13:28 UTC] {logging_mixin.py:109} INFO - CUSTOM_HOOK - 
Mutating pod for task foo.bar.2021-12-07T01_00_00_plus_00_00
   [2021-12-08, 14:13:28 UTC] {taskinstance.py:1429} INFO - Exporting the 
following env vars:
   ```
   
   Task log 2:
   ```
   *** Reading local file: 
/usr/local/airflow/logs/foo/bar/2021-12-07T01:00:00+00:00/2.log
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1035} INFO - Dependencies all 
met for <TaskInstance: foo.bar manual__2021-12-07T01:00:00+00:00 [queued]>
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1035} INFO - Dependencies all 
met for <TaskInstance: foo.bar manual__2021-12-07T01:00:00+00:00 [queued]>
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1241} INFO - 
   
--------------------------------------------------------------------------------
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1242} INFO - Starting attempt 2 
of 2
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1243} INFO - 
   
--------------------------------------------------------------------------------
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1262} INFO - Executing 
<Task(PythonOperator): bar> on 2021-12-07 01:00:00+00:00
   [2021-12-08, 14:16:59 UTC] {standard_task_runner.py:52} INFO - Started 
process 81 to run task
   [2021-12-08, 14:16:59 UTC] {standard_task_runner.py:76} INFO - Running: 
['airflow', 'tasks', 'run', 'foo', 'bar', 'manual__2021-12-07T01:00:00+00:00', 
'--job-id', '2572817', '--raw', '--subdir', 'DAGS_FOLDER/foo/foo.py', 
'--cfg-path', '/tmp/tmphi0jnedl', '--error-file', '/tmp/tmp26kgm3k9']
   [2021-12-08, 14:16:59 UTC] {standard_task_runner.py:77} INFO - Job 2572817: 
Subtask bar
   [2021-12-08, 14:16:59 UTC] {logging_mixin.py:109} INFO - Running 
<TaskInstance: foo.bar manual__2021-12-07T01:00:00+00:00 [running]> on host 
airflow-celery-worker-5654798876-4pnfx
   [2021-12-08, 14:16:59 UTC] {taskinstance.py:1429} INFO - Exporting the 
following env vars:
   ```
   
   And then the task appears in my Celery worker's logs:
   ```
   [2021-12-08 14:16:58,713: INFO/ForkPoolWorker-16] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'foo', 'bar', 
'manual__2021-12-07T01:00:00+00:00', '--local', '--subdir', 
'DAGS_FOLDER/foo/foo.py']
   ```
   
   ### What you expected to happen
   
   The task should go to the kubernetes queue every time.
   
   ### How to reproduce
   
   Using CeleryKubernetesExecutor:
   1. in `airflow_local_settings.py`
   ```python
   def task_instance_mutation_hook(task_instance):
       if task_instance.queue == "default":
           task_instance.queue = "kubernetes"
   ```
   2. Let the scheduler create a DAGrun for anything that's not a 
KubernetesPodOperator
   3. Observe that the task is executed in a pod
   4. Clear the task and let the scheduler run it again
   5. Check your Celery worker logs for the task.
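
   To make step 5 easier to diagnose, the hook from step 1 can be extended to 
log the queue it receives on each invocation (a sketch; the print format is 
illustrative, not the hook I actually deployed):
   ```python
   def task_instance_mutation_hook(task_instance):
       # Log the incoming queue so scheduler logs show whether the hook
       # ran and what queue the (cleared) task instance arrived with.
       print(f"CUSTOM_HOOK - {task_instance.task_id} arrived on queue "
             f"{task_instance.queue!r}")
       if task_instance.queue == "default":
           task_instance.queue = "kubernetes"
   ```

   If the bug is in the hook not being invoked on cleared tasks, the log line 
will be missing on the second run; if the hook runs but the mutation is 
ignored, the line will appear and the task will still reach the Celery worker.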
   
   ### Operating System
   
   Ubuntu 19.10
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

