Hforsman opened a new issue, #64476:
URL: https://github.com/apache/airflow/issues/64476

   ### Apache Airflow Provider(s)
   
   standard
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow==3.1.8
   apache-airflow-core==3.1.8
   apache-airflow-providers-cncf-kubernetes==10.13.0
   apache-airflow-providers-standard==1.12.0
   Python 3.11
   
   ### Apache Airflow version
   
   apache-airflow==3.1.8
   
   ### Operating System
   
   PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   Helm Chart version 1.20.0 being deployed in an air-gapped environment
   
   ```yaml
   postgres:
     env: prod
     name: airflow-postgres
     cpu: 256m
     memory: 512Mi
     isInferenceNamespace: false
     workspace:
       name: data-airflow
   
   airflow-official:
     defaultAirflowRepository: "docker-io.docker-proxy.company.repo/apache/airflow"
     defaultAirflowTag: "3.1.8-python3.11"
   
     # Airflow version (Used to make some decisions based on Airflow Version being deployed)
     airflowVersion: "3.1.8"
   
     fullnameOverride: airflow
     nameOverride: airflow
     multiNamespaceMode: false
     uid: 1000
     gid: 1000
   
     secret:
       - envName: "AIRFLOW_CONN_MYS3CONN"
         secretName: "airflow-s3-connection"
         secretKey: "connection"
   
     registry:
       secretName: "regcred"
   
     postgresql:
       enabled: false
   
     workers:
       resources:
         limits:
           cpu: "1"
           memory: 1024Mi
         requests:
           cpu: 100m
           memory: 1024Mi
       extraVolumes:
         - name: ca-certificates
           configMap:
             name: ca-certificates
         - name: pip-config
           configMap:
             name: airflow-pip-configmap
       extraVolumeMounts:
         - name: ca-certificates
           mountPath: /etc/ssl/certs/
           readOnly: true
         - name: ca-certificates
           mountPath: /home/airflow/.local/lib/python3.11/site-packages/certifi/cacert.pem
           subPath: ca-certificates.crt
           readOnly: true
         - name: pip-config
           mountPath: /etc/xdg/pip/
           readOnly: true
       securityContext:
         runAsUser: 1337
       persistence:
         enabled: false
   
     cleanup:
       enabled: false  # External cleaning process
   
     dags:
       gitSync:
         knownHosts: |-
           # list of hosts for company
         uid: 1337
         enabled: true
         repo: [email protected]:path/to/airflow-example.git  # comes from frontend.
         branch: main  # comes from frontend.
         rev: HEAD
         subPath: "dags"
         sshKeySecret: airflow-ssh-secret
         resources:
           limits:
             cpu: 500m
             memory: 128Mi
           requests:
             cpu: 100m
             memory: 128Mi
         extraVolumeMounts:
           - name: ca-certificates
             mountPath: /etc/ssl/certs/
             readOnly: true
   
     config:
       core:
         remote_logging: "True"
       logging:
         remote_logging: "True"
         remote_log_conn_id: "MYS3CONN"
         encrypt_s3_logs: "False"
         remote_base_log_folder: "s3://project/airflow/logs"  # passed from backend
         logging_level: "INFO"
       api:
         enable_swagger_ui: "False"
         rbac: "False"
         authenticate: "False"
   
     scheduler:
       annotations:
         secret.reloader.stakater.com/reload: airflow-s3-connection
       resources:
         limits:
           cpu: "1"
           memory: 2048Mi
         requests:
           cpu: 512m
           memory: 2048Mi
       env:
         - name: AWS_CA_BUNDLE
           value: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
       extraVolumes:
         - name: ca-certificates
           configMap:
             name: ca-certificates
         - name: pip-config
           configMap:
             name: airflow-pip-configmap
       extraVolumeMounts:
         - name: ca-certificates
           mountPath: /etc/ssl/certs/
           readOnly: true
         - name: ca-certificates
           mountPath: /home/airflow/.local/lib/python3.11/site-packages/certifi/cacert.pem
           subPath: ca-certificates.crt
           readOnly: true
         - name: pip-config
           mountPath: /etc/xdg/pip/
           readOnly: true
       waitForMigrations:
         enabled: false
       logGroomerSidecar:
         enabled: false
     statsd:
       enabled: false
     redis:
       enabled: false
     enableBuiltInSecretEnvVars:
       AIRFLOW__CELERY__RESULT_BACKEND: false
       AIRFLOW__CELERY__BROKER_URL: false
       AIRFLOW__ELASTICSEARCH__HOST: false
     executor: "LocalExecutor,KubernetesExecutor"
   
     triggerer:
       annotations:
         secret.reloader.stakater.com/reload: airflow-s3-connection
       waitForMigrations:
         enabled: false
       resources:
         limits:
           cpu: 1024m
           memory: 1024Mi
         requests:
           cpu: 100m
           memory: 1024Mi
       extraVolumes:
         - name: ca-certificates
           configMap:
             name: ca-certificates
       extraVolumeMounts:
         - name: ca-certificates
           mountPath: /etc/ssl/certs/
           readOnly: true
         - name: ca-certificates
           mountPath: /home/airflow/.local/lib/python3.11/site-packages/certifi/cacert.pem
           subPath: ca-certificates.crt
           readOnly: true
       logGroomerSidecar:
         enabled: false
       persistence:
         enabled: false
   
     apiSecretKeySecretName: airflow-api-server-secret-key
   
     apiServer:
       annotations:
         secret.reloader.stakater.com/reload: airflow-s3-connection
       startupProbe:
         timeoutSeconds: 60
       service:
         ports:
           - name: http-airflow-ui
             port: 8080
       waitForMigrations:
         enabled: false
       resources:
         limits:
           cpu: "2"
           memory: 2536Mi
         requests:
           cpu: "2"
           memory: 2536Mi
       env:
         - name: AWS_CA_BUNDLE
           value: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
       extraVolumes:
         - name: ca-certificates
           configMap:
             name: ca-certificates
       extraVolumeMounts:
         - name: ca-certificates
           mountPath: /etc/ssl/certs/
           readOnly: true
         - name: ca-certificates
           mountPath: /home/airflow/.local/lib/python3.11/site-packages/certifi/cacert.pem
           subPath: ca-certificates.crt
           readOnly: true
   
     dagProcessor:
       annotations:
         secret.reloader.stakater.com/reload: airflow-s3-connection
       waitForMigrations:
         enabled: false
       resources:
         limits:
           cpu: 200m
           memory: 1Gi
         requests:
           cpu: 200m
           memory: 1Gi
       env:
         - name: AWS_CA_BUNDLE
           value: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
       extraVolumes:
         - name: ca-certificates
           configMap:
             name: ca-certificates
       extraVolumeMounts:
         - name: ca-certificates
           mountPath: /etc/ssl/certs/
           readOnly: true
         - name: ca-certificates
           mountPath: /home/airflow/.local/lib/python3.11/site-packages/certifi/cacert.pem
           subPath: ca-certificates.crt
           readOnly: true
       logGroomerSidecar:
         enabled: false
   
     createUserJob:
       jobAnnotations:
         argocd.argoproj.io/hook: PostSync
         argocd.argoproj.io/hook-delete-policy: HookSucceeded  # Helm hooks are apparently not picked up with our current structure of helm/customize/argo-cd, so we delete the job such that argo-cd doesn't give an error on re-sync.
       useHelmHooks: false
       resources:
         limits:
           cpu: 500m
           memory: 1024Mi
         requests:
           cpu: 100m
           memory: 1024Mi
   
     migrateDatabaseJob:
       jobAnnotations:
         argocd.argoproj.io/hook: Sync
         argocd.argoproj.io/hook-delete-policy: HookSucceeded  # Helm hooks are apparently not picked up with our current structure of helm/customize/argo-cd, so we delete the job such that argo-cd doesn't give an error on re-sync.
       useHelmHooks: false
       resources:
         limits:
           cpu: "1"
           memory: 2024Mi
         requests:
           cpu: "1"
           memory: 2024Mi
   
     data:
       metadataConnection:
         host: postgres-airflow-postgres
   
   ```
   
   ```yaml
   apiVersion: v1
   kind: ConfigMap
   metadata:
     name: airflow-{{ .Values.name }}-sync-s3-key-secret-yaml
   data:
     secret.yaml: |-
       kind: Secret
       apiVersion: v1
       metadata:
         name: airflow-s3-connection
       data:
         connection: PLACEHOLDER
       type: Opaque
   ```
   
   ### What happened
   
   We're upgrading from Airflow 2 to Airflow 3 and running into an issue where the scheduler crashes when a Python DAG is triggered. As long as the DAG is scheduled, the scheduler comes up and immediately crashes again; only disabling the DAG results in a stable scheduler.
   
   The DAG that crashes the scheduler:
   ```python
   import datetime
   
   from airflow.sdk import DAG
   from airflow.providers.standard.operators.python import PythonOperator
   
   
   def call_me():
       print("bye airflow")
   
   with DAG(
           dag_id="marks-simple-dag",
           start_date=datetime.datetime(2026, 3, 27),
   ):
       PythonOperator(task_id="this-is-my-id", python_callable=call_me)
   ```
   
   The stack trace from the scheduler:
   ```
   2026-03-30T11:02:57.825045Z [info ] Reset the following 1 orphaned TaskInstances:
   <TaskInstance: marks-simple-dag.this-is-my-id manual__2026-03-30T11:00:54.786381+00:00 [queued]> [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:2453
   2026-03-30T11:02:58.114364Z [info ] 1 tasks up for execution:
   <TaskInstance: marks-simple-dag.this-is-my-id manual__2026-03-30T11:00:54.786381+00:00 [scheduled]> [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:448
   2026-03-30T11:02:58.114635Z [info ] DAG marks-simple-dag has 0/16 running and queued tasks [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:520
   2026-03-30T11:02:58.114772Z [info ] Setting the following tasks to queued state (scheduler job_id=4):
   <TaskInstance: marks-simple-dag.this-is-my-id manual__2026-03-30T11:00:54.786381+00:00 [scheduled]> (id=019d3e69-3e20-78e2-9398-043565823e73, try_number=2) [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:661
   2026-03-30T11:02:58.117227Z [info ] Trying to enqueue tasks: [<TaskInstance: marks-simple-dag.this-is-my-id manual__2026-03-30T11:00:54.786381+00:00 [scheduled]>] for executor: LocalExecutor(parallelism=32) [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:765
   /home/airflow/.local/lib/python3.11/site-packages/jwt/api_jwt.py:153 InsecureKeyLengthWarning: The HMAC key is 32 bytes long, which is below the minimum recommended length of 64 bytes for SHA512. See RFC 7518 Section 3.2.
   2026-03-30T11:02:58.420536Z [info ] Secrets backends loaded for worker [supervisor] backend_classes=['EnvironmentVariablesBackend', 'MetastoreBackend'] count=2 loc=supervisor.py:1975
   2026-03-30T11:02:58.514435Z [info ] Process exited [supervisor] exit_code=<Negsignal.SIGKILL: -9> loc=supervisor.py:710 pid=167 signal_sent=SIGKILL
   2026-03-30T11:02:58.515297Z [error ] uhoh [airflow.executors.local_executor.LocalExecutor] loc=local_executor.py:100
   Traceback (most recent call last):
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 96, in _run_worker
   _execute_work(log, workload)
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 124, in _execute_work
   supervise(
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/execution_time/supervisor.py", line 1984, in supervise
   process = ActivitySubprocess.start(
   ^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/execution_time/supervisor.py", line 955, in start
   proc._on_child_started(ti=what, dag_rel_path=dag_rel_path, bundle_info=bundle_info)
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/execution_time/supervisor.py", line 966, in _on_child_started
   ti_context = self.client.task_instances.start(ti.id, self.pid, start_date)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/api/client.py", line 215, in start
   resp = self.client.patch(f"task-instances/{id}/run", content=body.model_dump_json())
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_client.py", line 1218, in patch
   return self.request(
   ^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 331, in wrapped_f
   return copy(f, *args, **kw)
   ^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 470, in __call__
   do = self.iter(retry_state=retry_state)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 371, in iter
   result = action(retry_state)
   ^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 393, in <lambda>
   self._add_action_func(lambda rs: rs.outcome.result())
   ^^^^^^^^^^^^^^^^^^^
   File "/usr/python/lib/python3.11/concurrent/futures/_base.py", line 449, in result
   return self.__get_result()
   ^^^^^^^^^^^^^^^^^^^
   File "/usr/python/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
   raise self._exception
   File "/home/airflow/.local/lib/python3.11/site-packages/tenacity/__init__.py", line 473, in __call__
   result = fn(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/api/client.py", line 887, in request
   return super().request(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_client.py", line 825, in request
   return self.send(request, auth=auth, follow_redirects=follow_redirects)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_client.py", line 914, in send
   response = self._send_handling_auth(
   ^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_client.py", line 942, in _send_handling_auth
   response = self._send_handling_redirects(
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_client.py", line 999, in _send_handling_redirects
   raise exc
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_client.py", line 982, in _send_handling_redirects
   hook(response)
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/sdk/api/client.py", line 186, in raise_on_4xx_5xx_with_note
   return get_json_error(response) or response.raise_for_status()
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/httpx/_models.py", line 829, in raise_for_status
   raise HTTPStatusError(message, request=request, response=self)
   httpx.HTTPStatusError: Client error '401 Unauthorized' for url 'http://my-namespace-airflow-app-api-server:8080/execution/task-instances/019d3e69-3e20-78e2-9398-043565823e73/run'
   For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401
   Correlation-id=019d3e69-40ba-726c-9b0c-ba88e3ecbe54
   2026-03-30T11:02:59.437611Z [error ] Exception when executing SchedulerJob._run_scheduler_loop [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:1130
   Traceback (most recent call last):
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/jobs/scheduler_job_runner.py", line 1126, in _execute
   self._run_scheduler_loop()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/jobs/scheduler_job_runner.py", line 1424, in _run_scheduler_loop
   executor.heartbeat()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/traces/tracer.py", line 58, in wrapper
   return func(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/base_executor.py", line 257, in heartbeat
   self.sync()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 241, in sync
   self._read_results()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 246, in _read_results
   key, state, exc = self.result_queue.get()
   ^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/python/lib/python3.11/multiprocessing/queues.py", line 367, in get
   return _ForkingPickler.loads(res)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^
   TypeError: HTTPStatusError.__init__() missing 2 required keyword-only arguments: 'request' and 'response'
   2026-03-30T11:02:59.440652Z [info ] Shutting down LocalExecutor; waiting for running tasks to finish. Signal again if you don't want to wait. [airflow.executors.local_executor.LocalExecutor] loc=local_executor.py:252
   2026-03-30T11:03:00.091055Z [info ] Shutting down Kubernetes executor [airflow.providers.cncf.kubernetes.executors.kubernetes_executor.KubernetesExecutor] loc=kubernetes_executor.py:729
   2026-03-30T11:03:19.103661Z [warning ] kube_watcher didn't terminate in time=<KubernetesJobWatcher name='KubernetesJobWatcher-36' pid=161 parent=7 started> [airflow.providers.cncf.kubernetes.executors.kubernetes_executor_utils.AirflowKubernetesScheduler] loc=kubernetes_executor_utils.py:715
   2026-03-30T11:03:19.150666Z [info ] Exited execute loop [airflow.jobs.scheduler_job_runner.SchedulerJobRunner] loc=scheduler_job_runner.py:1142
   Traceback (most recent call last):
   File "/home/airflow/.local/bin/airflow", line 6, in <module>
   sys.exit(main())
   ^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/__main__.py", line 55, in main
   args.func(args)
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/cli_config.py", line 49, in command
   return func(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/cli.py", line 114, in wrapper
   return f(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/providers_configuration_loader.py", line 54, in wrapped_function
   return func(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/scheduler_command.py", line 52, in scheduler
   run_command_with_daemon_option(
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/daemon_utils.py", line 86, in run_command_with_daemon_option
   callback()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/scheduler_command.py", line 55, in <lambda>
   callback=lambda: _run_scheduler_job(args),
   ^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/cli/commands/scheduler_command.py", line 43, in _run_scheduler_job
   run_job(job=job_runner.job, execute_callable=job_runner._execute)
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/utils/session.py", line 100, in wrapper
   return func(*args, session=session, **kwargs)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/jobs/job.py", line 368, in run_job
   return execute_job(job, execute_callable=execute_callable)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/jobs/job.py", line 397, in execute_job
   ret = execute_callable()
   ^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/jobs/scheduler_job_runner.py", line 1126, in _execute
   self._run_scheduler_loop()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/jobs/scheduler_job_runner.py", line 1424, in _run_scheduler_loop
   executor.heartbeat()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/traces/tracer.py", line 58, in wrapper
   return func(*args, **kwargs)
   ^^^^^^^^^^^^^^^^^^^^^
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/base_executor.py", line 257, in heartbeat
   self.sync()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 241, in sync
   self._read_results()
   File "/home/airflow/.local/lib/python3.11/site-packages/airflow/executors/local_executor.py", line 246, in _read_results
   key, state, exc = self.result_queue.get()
   ^^^^^^^^^^^^^^^^^^^^^^^
   File "/usr/python/lib/python3.11/multiprocessing/queues.py", line 367, in get
   return _ForkingPickler.loads(res)
   ^^^^^^^^^^^^^^^^^^^^^^^^^^
   TypeError: HTTPStatusError.__init__() missing 2 required keyword-only arguments: 'request' and 'response'
   INFO: Shutting down
   INFO: Waiting for application shutdown.
   INFO: Application shutdown complete.
   INFO: Finished server process [44]
   ```
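   The final `TypeError` can be reproduced outside Airflow. Below is a minimal sketch (my assumption about the mechanism, not Airflow's actual code): exceptions whose `__init__` has required keyword-only arguments, like `httpx.HTTPStatusError(message, *, request, response)`, don't survive a pickle round-trip, which is what `result_queue.get()` performs. The `KwOnlyError` class is a stand-in I made up for illustration.
   
   ```python
   import pickle
   
   # Stand-in with the same signature shape as httpx.HTTPStatusError.
   class KwOnlyError(Exception):
       def __init__(self, message, *, request, response):
           super().__init__(message)  # self.args == (message,)
           self.request = request
           self.response = response
   
   err = KwOnlyError("401 Unauthorized", request="<req>", response="<resp>")
   data = pickle.dumps(err)  # serializing succeeds
   try:
       # Unpickling re-calls KwOnlyError("401 Unauthorized") with self.args
       # only, so the required keyword-only arguments are missing.
       pickle.loads(data)
   except TypeError as exc:
       print(exc)
   ```
   
   So the 401 from the task-instance API is the real failure, and the crash is the scheduler tripping over the unpicklable exception while reporting it.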
   
   ### What you think should happen instead
   
   I don't think that running a PythonOperator DAG should result in this error.
   
   ### How to reproduce
   
   1. Launch the official Airflow Helm chart with the provided values.yaml and ConfigMap.
   2. When everything is up, trigger the provided DAG.
   3. Watch the scheduler status and logs.
   
   ### Anything else
   
   This happens every time we try to run a PythonOperator DAG.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

