ferruzzi commented on code in PR #53722:
URL: https://github.com/apache/airflow/pull/53722#discussion_r2449691238


##########
airflow-core/src/airflow/jobs/scheduler_job_runner.py:
##########
@@ -2108,18 +2136,43 @@ def _emit_pool_metrics(self, session: Session = NEW_SESSION) -> None:
         with DebugTrace.start_span(span_name="emit_pool_metrics", component="SchedulerJobRunner") as span:
             pools = Pool.slots_stats(session=session)
             for pool_name, slot_stats in pools.items():
-                Stats.gauge(f"pool.open_slots.{pool_name}", slot_stats["open"])
-                Stats.gauge(f"pool.queued_slots.{pool_name}", slot_stats["queued"])
-                Stats.gauge(f"pool.running_slots.{pool_name}", slot_stats["running"])
-                Stats.gauge(f"pool.deferred_slots.{pool_name}", slot_stats["deferred"])
-                Stats.gauge(f"pool.scheduled_slots.{pool_name}", slot_stats["scheduled"])
-
-                # Same metrics with tagging
-                Stats.gauge("pool.open_slots", slot_stats["open"], tags={"pool_name": pool_name})
-                Stats.gauge("pool.queued_slots", slot_stats["queued"], tags={"pool_name": pool_name})
-                Stats.gauge("pool.running_slots", slot_stats["running"], tags={"pool_name": pool_name})
-                Stats.gauge("pool.deferred_slots", slot_stats["deferred"], tags={"pool_name": pool_name})
-                Stats.gauge("pool.scheduled_slots", slot_stats["scheduled"], tags={"pool_name": pool_name})
+                # If enabled on the config, publish metrics twice,
+                # once with backward compatible name, and then with tags.
+                DualStatsManager.gauge(
+                    f"pool.open_slots.{pool_name}",
+                    "pool.open_slots",
+                    slot_stats["open"],
+                    tags={},
+                    extra_tags={"pool_name": pool_name},
+                )

Review Comment:
   LOL  Sorry, I'm not explaining my idea very well, I guess. Alright, back this up a bit. Forget everything about a possible default; that was just adding confusion.
   
   The registry entries for `ti.finish.<dag_id>.<task_id>.<state>` and `task_restored_to_dag.<dag_id>` might look like this:
   
   ```
   metrics:
     - name: "ti.finish"
       description: "Number of completed task in a given Dag. Similar to 
<job_name>_end but for task"
       type: "counter"
       legacy_name: "ti.finish.{dag_id}.{task_id}.{state}"
   
     - name: "task_restored_to_dag"
       description: "Number of tasks restored for a given Dag (i.e. task 
instance which was previously in REMOVED state in the DB is added to Dag file). 
Metric with dag_id and run_type tagging."
       type: "counter"
       legacy_name: "task_restored_to_dag.{dag_id}"
   
   ```
   
   Then, in the code, before emitting the metric, we populate the `legacy_name` template using the metric's `tags`.
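
   Something along these lines, just to make the idea concrete (a rough sketch only; `MetricEntry` and `resolve_legacy_name` are made-up names for illustration, not anything in the PR):

   ```python
   # Sketch only: resolve the legacy dotted metric name from the registry
   # entry's legacy_name template plus the tags the metric is emitted with.
   from dataclasses import dataclass


   @dataclass
   class MetricEntry:
       name: str
       legacy_name: str


   def resolve_legacy_name(entry: MetricEntry, tags: dict[str, str]) -> str:
       # "ti.finish.{dag_id}.{task_id}.{state}" -> "ti.finish.my_dag.my_task.success"
       return entry.legacy_name.format(**tags)


   entry = MetricEntry(name="ti.finish", legacy_name="ti.finish.{dag_id}.{task_id}.{state}")
   tags = {"dag_id": "my_dag", "task_id": "my_task", "state": "success"}

   legacy = resolve_legacy_name(entry, tags)
   print(legacy)  # ti.finish.my_dag.my_task.success
   # The dual emission itself stays as it is in this PR: hand both `legacy`
   # and `entry.name` (plus the tags) to the dual stats emitter.
   ```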
   
   As an added bonus, as part of the build-docs script we can walk through this registry and generate the table [in the docs](https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/logging-monitoring/metrics.html), so that doc page is always up to date.
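
   Rough idea of what that build-docs step could look like (assuming the registry lives in a YAML file shaped like the example above; the function name and table layout are just placeholders):

   ```python
   # Sketch only: render the metrics registry as an RST list-table for the docs page.
   import yaml

   REGISTRY = """
   metrics:
     - name: "ti.finish"
       description: "Number of completed tasks in a given Dag."
       type: "counter"
       legacy_name: "ti.finish.{dag_id}.{task_id}.{state}"
   """


   def render_metrics_table(registry_yaml: str) -> str:
       metrics = yaml.safe_load(registry_yaml)["metrics"]
       lines = [
           ".. list-table:: Metrics",
           "   :header-rows: 1",
           "",
           "   * - Name",
           "     - Description",
       ]
       for metric in metrics:
           lines.append(f"   * - ``{metric['name']}``")
           lines.append(f"     - {metric['description']}")
       return "\n".join(lines)


   print(render_metrics_table(REGISTRY))
   ```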


