lozuwa opened a new issue, #23584:
URL: https://github.com/apache/airflow/issues/23584

   ### Apache Airflow version
   
   2.2.4
   
   ### What happened
   
   We have upgraded from Airflow 1.10.5 to Airflow 2.2.4. The upgrade path was 1.10.5 -> 1.10.15 -> 2.1.0 -> 2.2.4.
   
   Everything seems to work, except that new DAGs cannot be added to Airflow: they do not appear in either the serialized_dag table or the webserver UI.
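
   To help with triage, a minimal sketch (assuming direct metadata DB access from inside a scheduler pod) of the check we expect to pass; `example_new_dag` is a placeholder DAG id, not one of our real DAGs:
   
   ```python
   # Sketch: does a given DAG id exist in the serialized_dag table?
   # Assumes it runs where the Airflow metadata DB is configured;
   # "example_new_dag" is a placeholder id.
   from airflow.models.serialized_dag import SerializedDagModel
   from airflow.utils.session import create_session

   with create_session() as session:
       row = (
           session.query(SerializedDagModel)
           .filter(SerializedDagModel.dag_id == "example_new_dag")
           .one_or_none()
       )
       print("serialized:", row is not None)
   ```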
   
   
   ### What you think should happen instead
   
   If a new DAG is added to the dags folder path, Airflow should add it to the serialized_dag table, and the webserver should display it in its UI.
   
   ### How to reproduce
   
   Steps:
   1. Deploy Airflow 2.2.4 with gitSync enabled and persistence disabled.
   2. Add a new DAG to the repository in the correct path.
   3. The new DAG can be listed with `airflow dags list`, but it is not displayed in either the database or the UI (a cross-check sketch follows this list).
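
   A rough cross-check sketch, run inside the scheduler pod so that the dags folder and metadata DB match the deployment (DAG ids are illustrative):
   
   ```python
   # Sketch: compare DAG ids parsed from the dags folder with DAG ids
   # present in the serialized_dag table. A non-empty difference matches
   # the symptom described above.
   from airflow.models.dagbag import DagBag
   from airflow.models.serialized_dag import SerializedDagModel
   from airflow.utils.session import create_session

   parsed = set(DagBag(include_examples=False).dag_ids)
   with create_session() as session:
       serialized = {dag_id for (dag_id,) in session.query(SerializedDagModel.dag_id)}

   print("parsed but not serialized:", sorted(parsed - serialized))
   ```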
   
   ### Operating System
   
   Kubernetes v1.20.15
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==3.3.0
   apache-airflow-providers-google==6.8.0
   apache-airflow-providers-slack==4.2.3
   apache-airflow-providers-postgres==4.1.0
   apache-airflow-providers-cncf-kubernetes==3.0.0
   apache-airflow-providers-ssh==2.4.3
   apache-airflow-providers-mysql
   
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   Standard deployment with the following parameters
   
   ```yaml
   config:
     core:
       dags_folder: '{{ include "airflow_dags" . }}'
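       # `store_serialized_dags`/`store_dag_code` retained for Airflow 1.10 backward compatibility; serialization is always on in 2.0+ and these keys are ignored.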
       store_serialized_dags: "True"
       store_dag_code: "True"
       plugins_folder: "/opt/airflow/airflow-plugins/"
       load_examples: "False"
       executor: '{{ .Values.executor }}'
       donot_pickle: "False"
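       # `secure_mode` retained for Airflow 1.10 backward compatibility; removed in 2.0.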
       secure_mode: "True"
       max_active_runs_per_dag: "16"
       dagbag_import_timeout: "120"
       sql_engine_encoding: "utf-8"
       sql_alchemy_pool_enabled: "True"
       sql_alchemy_pool_size: "10"
       sql_alchemy_max_overflow: "10"
       sql_alchemy_pool_recycle: "60"
       sql_alchemy_pool_pre_ping: "True"
     logging:
       logging_level: "INFO"
       colored_console_log: "True"
       # Remote logging
       remote_logging: "True"
       logging_config_class: "log_config.LOGGING_CONFIG"
       remote_log_folder: "xxx"
       remote_base_log_folder: "xxx"
       task_log_reader: "s3.task"
       #remote_log_conn_id: "s3liveintent"
       remote_log_conn_id: "aws_default"
       encrypt_s3_logs: "False"
     metrics:
       statsd_on: '{{ ternary "True" "False" .Values.statsd.enabled }}'
       statsd_port: 9125
       statsd_prefix: airflow
       statsd_host: '{{ printf "%s-statsd" .Release.Name }}'
     webserver:
       enable_proxy_fix: "True"
       expose_config: "True"
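       # `rbac` retained for Airflow 1.10 backward compatibility; the RBAC UI is the only UI in 2.0+, so this key is ignored.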
       rbac: "True"
       base_url: "http://xxx"
       web_server_host: "0.0.0.0"
       web_server_port: "8080"
       web_server_worker_timeout: "300"
     email:
       email_backend: "airflow.utils.email.send_email_smtp"
     smtp:
       smtp_host: "xxx"
       smtp_starttls: "True"
       smtp_ssl: "False"
       smtp_user: "xxx"
       smtp_password: "xxx"
       smtp_port: "587"
       smtp_mail_from: "xxx"
     celery:
       worker_concurrency: 32
     scheduler:
       # statsd params included for Airflow 1.10 backward compatibility; moved to [metrics] in 2.0
       statsd_on: '{{ ternary "True" "False" .Values.statsd.enabled }}'
       statsd_port: 9125
       statsd_prefix: airflow
       statsd_host: '{{ printf "%s-statsd" .Release.Name }}'
       # `run_duration` included for Airflow 1.10 backward compatibility; removed in 2.0.
       run_duration: 41460
       parsing_processes: "8"
       min_file_process_interval: "60"
       dag_dir_list_interval: "120"
       catchup_by_default: "False"
       job_heartbeat_sec: "10"
       scheduler_heartbeat_sec: "10"
     elasticsearch:
       json_format: 'True'
       log_id_template: "{dag_id}_{task_id}_{execution_date}_{try_number}"
     elasticsearch_configs:
       max_retries: 3
       timeout: 30
       retry_timeout: 'True'
     kerberos:
       keytab: '{{ .Values.kerberos.keytabPath }}'
       reinit_frequency: '{{ .Values.kerberos.reinitFrequency }}'
       principal: '{{ .Values.kerberos.principal }}'
       ccache: '{{ .Values.kerberos.ccacheMountPath }}/{{ .Values.kerberos.ccacheFileName }}'
     celery_kubernetes_executor:
       kubernetes_queue: 'kubernetes'
     kubernetes:
       namespace: '{{ .Release.Namespace }}'
       airflow_configmap: '{{ include "airflow_config" . }}'
       airflow_local_settings_configmap: '{{ include "airflow_config" . }}'
       pod_template_file: '{{ include "airflow_pod_template_file" . }}/pod_template_file.yaml'
       worker_container_repository: '{{ .Values.images.airflow.repository | default .Values.defaultAirflowRepository }}'
       worker_container_tag: '{{ .Values.images.airflow.tag | default .Values.defaultAirflowTag }}'
       multi_namespace_mode: '{{ if .Values.multiNamespaceMode }}True{{ else }}False{{ end }}'
   # yamllint enable rule:line-length
   ```
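
   Because several of these keys carry over from our 1.10 setup, a quick sketch (keys chosen are just examples) for printing the values the running Airflow actually resolves; `airflow config get-value core dags_folder` gives the same answer from the CLI:
   
   ```python
   # Sketch: print the effective values Airflow resolves for a few
   # settings relevant to DAG discovery. Run inside the scheduler pod.
   from airflow.configuration import conf

   for section, key in [
       ("core", "dags_folder"),
       ("scheduler", "dag_dir_list_interval"),
       ("scheduler", "min_file_process_interval"),
   ]:
       print(f"[{section}] {key} = {conf.get(section, key)}")
   ```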
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

