regisferlima opened a new issue, #29953: URL: https://github.com/apache/airflow/issues/29953
### Apache Airflow version

Other Airflow 2 version (please specify below)

### What happened

Airflow version: 2.3.2

Hi, we are having a problem with this metric: _airflow.dagrun.duration.success.avg_. We send the metrics to Datadog, but they told us this metric is arriving with negative values. Answer from Datadog:

> _Unfortunately there's not much that I can see which indicates why this metric value is reporting negative. Since the trace level flare showed that the Agent is receiving a negative value, that indicates to me that the metric may be sent to Datadog as a negative value._
>
> Here are the logs from the Datadog side: [Logs](https://drive.google.com/file/d/17XPk4I1iI5K5BvEr_ch1gxAo_v62NHmg/view?usp=sharing)
>
> _Thank you, Janae Quinones | Solutions Engineer | Datadog_

Do you know how we can check whether this metric is being reported correctly? How can we see the values inside Airflow itself? I believe it is related to StatsD.

### What you think should happen instead

_No response_

### How to reproduce

I believe it can be reproduced by running locally with Docker.
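Since Airflow's StatsD client sends plain-text UDP datagrams, one low-tech way to see the raw values before they ever reach the Datadog agent is to point `statsd_host`/`statsd_port` in `airflow.cfg` at a throwaway listener. A minimal sketch (the `parse_statsd`/`listen` helper names are my own, not an Airflow or Datadog API):

```python
# Throwaway StatsD "sniffer": bind the port Airflow's StatsD client targets
# (8125 by default) and print every raw metric line, flagging negatives.
import socket


def parse_statsd(packet: str):
    """Split a StatsD line 'name:value|type[|@rate]' into (name, value, type)."""
    name, rest = packet.split(":", 1)
    value, metric_type = rest.split("|")[:2]
    return name, float(value), metric_type


def listen(host: str = "0.0.0.0", port: int = 8125) -> None:
    """Print every metric received on the StatsD port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _ = sock.recvfrom(65535)
        for line in data.decode("utf-8", errors="replace").splitlines():
            if not line:
                continue
            name, value, metric_type = parse_statsd(line)
            flag = "  <-- NEGATIVE" if value < 0 else ""
            print(f"{name} = {value} ({metric_type}){flag}")


# listen()  # run on the host that statsd_host points at, then trigger a DAG
```

A DAG run should then produce a line like `airflow.dagrun.duration.success.<dag_id> = ... (ms)`; a negative number there would confirm the value is negative before Datadog ever sees it.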
### Operating System

Debian GNU/Linux 11 (bullseye)

### Versions of Apache Airflow Providers

2.3.2

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

Datadog agent release (helmfile):

```yaml
- name: datadog
  namespace: monitoring
  chart: datadog/datadog
  version: 3.6.8
  values:
    - datadog:
        clusterName: {{ requiredEnv "PROJECTID" }}
        logLevel: {{ .Values.datadog.logLevel }}
        kubeStateMetricsEnabled: false
        kubeStateMetricsCore:
          enabled: true
        site: datadoghq.eu
        apiKeyExistingSecret: datadog-api-key
        appKeyExistingSecret: datadog-app-key
        podLabelsAsTags:
          app: dh_app
          dag_id: airflow_dag_id
          task_id: airflow_task_id
          try_number: airflow_try_number
          kubernetes_pod_operator: airflow_kubernetes_pod_operator
          kubernetes_executor: airflow_kubernetes_executor
        namespaceLabelsAsTags:
          dh_platform: dh_platform
          dh_tribe: dh_tribe
          dh_env: dh_env
        logs:
          enabled: true
          containerCollectAll: true
        tags:
          - project_id:{{ requiredEnv "PROJECTID" }}
          - dh_platform:datahub
          - dh_tribe:global-data
          - dh_squad:{{ requiredEnv "PROJECTID" }}
          - dh_cc_name:{{ requiredEnv "DH_CC_NAME" }}
          - dh_cc_id:{{ requiredEnv "DH_CC_ID" }}
        dogstatsd:
          port: 8125
          useHostPort: true
        kubelet:
          tlsVerify: false
        env:
          - name: DD_DOGSTATSD_MAPPER_PROFILES
            value: >-
              {{ exec "jq" (list "-c" "." "dogstatsd_mapper_profiles.json") }}
      clusterAgent:
        replicas: 2
        createPodDisruptionBudget: true
      agents:
        containers:
          agent:
            livenessProbe:
              failureThreshold: 6
              httpGet:
                path: /live
                port: 5555
                scheme: HTTP
              initialDelaySeconds: 30
              periodSeconds: 5
              successThreshold: 1
              timeoutSeconds: 5
        tolerations:
          - key: "role"
            operator: "Equal"
            value: "static"
            effect: "NoSchedule"
          - key: cloud.google.com/gke-spot
            operator: "Equal"
            value: "true"
            effect: "NoSchedule"
  hooks:
    - events: ["presync"]
      showlogs: true
      command: "sh"
      args: ["-c", "kubectl create ns monitoring || exit 0"]
```

dogstatsd_mapper_profiles.json (mapper profiles):

```json
[
  {
    "prefix": "airflow.",
    "name": "airflow",
    "mappings": [
      { "name": "airflow.job.start", "match": "airflow.*_start", "tags": { "job_name": "$1" } },
      { "name": "airflow.job.end", "match": "airflow.*_end", "tags": { "job_name": "$1" } },
      { "name": "airflow.job.heartbeat.failure", "match": "airflow.*_heartbeat_failure", "tags": { "job_name": "$1" } },
      { "name": "airflow.operator_failures", "match": "airflow.operator_failures_*", "tags": { "operator_name": "$1" } },
      { "name": "airflow.operator_successes", "match": "airflow.operator_successes_*", "tags": { "operator_name": "$1" } },
      { "match_type": "regex", "name": "airflow.dag_processing.last_runtime", "match": "airflow\\.dag_processing\\.last_runtime\\.(.*)", "tags": { "dag_file": "$1" } },
      { "match_type": "regex", "name": "airflow.dag_processing.last_run.seconds_ago", "match": "airflow\\.dag_processing\\.last_run\\.seconds_ago\\.(.*)", "tags": { "dag_file": "$1" } },
      { "match_type": "regex", "name": "airflow.dag.loading_duration", "match": "airflow\\.dag\\.loading-duration\\.(.*)", "tags": { "dag_file": "$1" } },
      { "name": "airflow.dagrun.first_task_scheduling_delay", "match": "airflow.dagrun.*.first_task_scheduling_delay", "tags": { "dag_id": "$1" } },
      { "name": "airflow.pool.open_slots", "match": "airflow.pool.open_slots.*", "tags": { "pool_name": "$1" } },
      { "name": "airflow.pool.queued_slots", "match": "airflow.pool.queued_slots.*", "tags": { "pool_name": "$1" } },
      { "name": "airflow.pool.running_slots", "match": "airflow.pool.running_slots.*", "tags": { "pool_name": "$1" } },
      { "name": "airflow.pool.used_slots", "match": "airflow.pool.used_slots.*", "tags": { "pool_name": "$1" } },
      { "name": "airflow.pool.starving_tasks", "match": "airflow.pool.starving_tasks.*", "tags": { "pool_name": "$1" } },
      { "match_type": "regex", "name": "airflow.dagrun.dependency_check", "match": "airflow\\.dagrun\\.dependency-check\\.(.*)", "tags": { "dag_id": "$1" } },
      { "match_type": "regex", "name": "airflow.dag.task.duration", "match": "airflow\\.dag\\.(.*)\\.([^.]*)\\.duration", "tags": { "dag_id": "$1", "task_id": "$2" } },
      { "match_type": "regex", "name": "airflow.dag_processing.last_duration", "match": "airflow\\.dag_processing\\.last_duration\\.(.*)", "tags": { "dag_file": "$1" } },
      { "match_type": "regex", "name": "airflow.dagrun.duration.success", "match": "airflow\\.dagrun\\.duration\\.success\\.(.*)", "tags": { "dag_id": "$1" } },
      { "match_type": "regex", "name": "airflow.dagrun.duration.failed", "match": "airflow\\.dagrun\\.duration\\.failed\\.(.*)", "tags": { "dag_id": "$1" } },
      { "match_type": "regex", "name": "airflow.dagrun.schedule_delay", "match": "airflow\\.dagrun\\.schedule_delay\\.(.*)", "tags": { "dag_id": "$1" } },
      { "name": "airflow.scheduler.tasks.running", "match": "airflow.scheduler.tasks.running" },
      { "name": "airflow.scheduler.tasks.starving", "match": "airflow.scheduler.tasks.starving" },
      { "name": "airflow.sla_email_notification_failure", "match": "airflow.sla_email_notification_failure" },
      { "match_type": "regex", "name": "airflow.dag.task_removed", "match": "airflow\\.task_removed_from_dag\\.(.*)", "tags": { "dag_id": "$1" } },
      { "match_type": "regex", "name": "airflow.dag.task_restored", "match": "airflow\\.task_restored_to_dag\\.(.*)", "tags": { "dag_id": "$1" } },
      { "name": "airflow.task.instance_created", "match": "airflow.task_instance_created-*", "tags": { "task_class": "$1" } },
      { "name": "airflow.ti.start", "match": "airflow.ti.start.*.*", "tags": { "dag_id": "$1", "task_id": "$2" } },
      { "name": "airflow.ti.finish", "match": "airflow.ti.finish.*.*.*", "tags": { "dag_id": "$1", "state": "$3", "task_id": "$2" } },
      { "name": "airflow.dag.callback", "match": "airflow.dag.callback.*.*.*", "tags": { "dag_id": "$1", "task_id": "$2", "state": "$3" } }
    ]
  }
]
```

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

---

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
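A note for whoever debugs this: as far as I can tell, the `dagrun.duration.success.<dag_id>` timer is derived from the DagRun's start and end timestamps, so a negative value would mean the recorded end predates the start (e.g. clock skew between components, or timestamps overwritten on a re-run). A sketch of that arithmetic, plus an illustrative metadata-DB query; this is my reading of Airflow 2.3.x, not the exact scheduler code:

```python
# How a negative dagrun duration can arise, assuming the timer value is
# end_date - start_date on the DagRun row (assumption, verify per version).
from datetime import datetime, timedelta


def dagrun_duration_seconds(start_date: datetime, end_date: datetime) -> float:
    """Wall-clock duration as it would be reported to StatsD."""
    return (end_date - start_date).total_seconds()


start = datetime(2023, 3, 6, 12, 0, 0)

# Normal run: positive duration.
print(dagrun_duration_seconds(start, start + timedelta(minutes=5)))   # 300.0

# End timestamp before start (skewed or overwritten): negative duration,
# which is exactly what the Datadog agent would then receive.
print(dagrun_duration_seconds(start, start - timedelta(seconds=30)))  # -30.0

# An illustrative check against the Airflow metadata DB (Postgres syntax);
# any row returned here would explain a negative success-duration timer.
CHECK_SQL = """
SELECT dag_id, run_id, start_date, end_date
FROM dag_run
WHERE state = 'success' AND end_date < start_date;
"""
```

If that query returns rows, the problem is in the stored timestamps rather than in StatsD or the Datadog mapper profiles.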
