cguermi opened a new issue, #23364:
URL: https://github.com/apache/airflow/issues/23364

   ### Official Helm Chart version
   
   1.5.0 (latest released)
   
   ### Apache Airflow version
   
   2.2.4
   
   ### Kubernetes Version
   
   1.22.2
   
   ### Helm Chart configuration
   
   # ===========================================#
   # Apache Airflow related configuration       #
   # ===========================================#
   
   airflow:
     executor: "CeleryExecutor"
     config:
       core: 
         default_timezone:  Europe/Paris
       api:
         auth_backend: airflow.api.auth.backend.basic_auth
       webserver:
         default_ui_timezone: Europe/Paris
          base_url: http://ep-prod-regions-webserver.ep-prod-regions.svc.cluster.local:8080
     registry:
       secretName: harbor-secret-password
     scheduler:
       resources:
        # We usually recommend not to specify default resources and to leave this as a conscious
        # choice for the user. This also increases chances charts run on environments with little
        # resources, such as Minikube. If you do want to specify resources, uncomment the following
        # lines, adjust them as necessary, and remove the curly braces after 'resources:'.
         limits:
           memory: 1G
         requests:
           memory: 1G
       securityContext: 
         runAsUser: 0
         fsGroup: 0
         runAsGroup: 0
       nodeSelector:
         environment: ep-prod-regions
     statsd:
       nodeSelector:
         environment: ep-prod-regions
     webserver:
       resources:
        # We usually recommend not to specify default resources and to leave this as a conscious
        # choice for the user. This also increases chances charts run on environments with little
        # resources, such as Minikube. If you do want to specify resources, uncomment the following
        # lines, adjust them as necessary, and remove the curly braces after 'resources:'.
         limits:
           memory: 1G
         requests:
           memory: 1G
       nodeSelector:
         environment: ep-prod-regions
       extraVolumes:
         - name: configuration
           persistentVolumeClaim:
               claimName: configuration-claim
       extraVolumeMounts:
         - name: configuration
           mountPath: /opt/airflow/crystal
           subPath: CrystalEnergyConf/client/data
       defaultUser:
         enabled: true
         role: Admin
         username: ####
         email: [email protected]
         firstName: ####
         lastName: ####
         password: ####
     webserverSecretKeySecretName: regions-airflow-webserver-secret
     triggerer:
       nodeSelector:
         environment: ep-prod-regions
     flower:
       nodeSelector:
         environment: ep-prod-regions
     redis:
       nodeSelector:
         environment: ep-prod-regions
     workers:
       extraVolumes:
         - name: configuration
           persistentVolumeClaim:
               claimName: configuration-claim
       extraVolumeMounts:
         - name: configuration
           mountPath: /opt/airflow/crystal
           subPath: CrystalEnergyConf/client/data
       nodeSelector:
         environment: ep-prod-regions
       securityContext:
         runAsUser: 0
         fsGroup: 0
         runAsGroup: 0
     data:
       metadataSecretName: airflow-metadata-connection-secret
       resultBackendSecretName: airflow-result-backend-secret
     persistence: 
       fixPermissions: true
     enabled: true
     fernetKeySecretName: fernet-key-secret
   
     postgresql:
       enabled: true
       existingSecret: regions-database-secret
       volumePermissions:
         enabled: true
       persistence:
         enabled: true
         size: 20Gi
       service:
         port: 5432
       primary:
         nodeSelector:
           environment: ep-prod-regions
       resources:
        # We usually recommend not to specify default resources and to leave this as a conscious
        # choice for the user. This also increases chances charts run on environments with little
        # resources, such as Minikube. If you do want to specify resources, uncomment the following
        # lines, adjust them as necessary, and remove the curly braces after 'resources:'.
         limits:
           memory: 1G
         requests:
           memory: 1G
   
   
   ### Docker Image customisations
   
   FROM apache/airflow:2.2.4
   
   USER root
   
   COPY --chown=airflow:root ./dags/ ${AIRFLOW_HOME}/dags/
    # TODO: fix permissions issue on /opt/airflow/logs K8S volume
   USER root
   
   
   ### What happened
   
   Hi, 
   
   I just installed the Helm chart and everything runs well (migration jobs, pod initialization, etc.), but when I connect to the UI and try to launch a simple DAG, I get this message:
   
   ```
   The scheduler does not appear to be running. Last heartbeat was received 22 minutes ago.

   The DAGs list may not update, and new tasks will not be scheduled.
   ```
   
   
   I also got this error message in the scheduler logs:

   **"Executor reports task instance finished (failed) although the task says its queued. (Info: None) Was the task killed externally?"**
   
   Maybe it's not relevant, but I noticed that I got a "module not found" error in the scheduler container when I ran the `airflow config list` command.
   
   ### What you think should happen instead
   
   I should be able to run my simple DAG's tasks without errors:
   
   ```
   import time
   from datetime import datetime
   
   from airflow.decorators import dag, task
   
   sleep_time = 5
   
   
   @task()
   def bidding():
       print('Start of bidding')
       time.sleep(sleep_time)
       print('End of bidding')
   
   
   @task()
   def market_clearing(bids):
       print('Start of market_clearing')
       time.sleep(sleep_time)
       print('End of market_clearing')
   
   
   @task()
   def post_processing(accepted_bids):
       print('Start of post_processing')
       time.sleep(sleep_time)
       print('End of post_processing')
   
   
   @dag(
       schedule_interval=None,
       start_date=datetime(2021, 1, 1),
       catchup=False,
       tags=['simulation'],
       params={
       }
   )
   def market_simulation():
   
   
       bids = bidding()
       accepted_bids = market_clearing(bids)
       post_processing(accepted_bids)
   
   
   market_simulation_dag = market_simulation()
   ```
   
   ### How to reproduce
   
   Install the chart version mentioned above with the configuration shown, then run the DAG above.
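
   For completeness, a sketch of the install steps, assuming the values shown above are saved as `values.yaml` (release and namespace names are illustrative):

   ```
   # Add the official chart repository and install chart version 1.5.0 with the values above
   helm repo add apache-airflow https://airflow.apache.org
   helm repo update
   helm install ep-prod-regions apache-airflow/airflow \
     --version 1.5.0 \
     --namespace ep-prod-regions --create-namespace \
     -f values.yaml
   ```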
   
   ### Anything else
   
   I tried to run the chart with all the executors (LocalExecutor, KubernetesExecutor, CeleryExecutor) with no success.
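
   As an illustration, switching executors between attempts looks roughly like this when using the chart's top-level `executor` value (names are placeholders; repeated with `LocalExecutor` and `CeleryExecutor`):

   ```
   # Redeploy the release with a different executor
   helm upgrade ep-prod-regions apache-airflow/airflow \
     --version 1.5.0 \
     --namespace ep-prod-regions \
     -f values.yaml \
     --set executor=KubernetesExecutor
   ```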
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

