Abdel39 commented on issue #53442: URL: https://github.com/apache/airflow/issues/53442#issuecomment-3364995368
Hello, we run Airflow 3.x in containers at Datadog. In our setup, the Airflow UI already pushes and reads logs directly from Datadog via a custom `RemoteLogIO` (`LoggingMixin` / `ExternalLoggingMixin`). Because of that, the Supervisor's per-task file writing adds no value for us; it's just extra overhead.

What we'd really like is the ability to swap in our own Supervisor implementation (e.g. a `DatadogSupervisor`) that consumes the structured task JSON events, enriches them with tags/trace IDs, and forwards them directly to our pipeline, without touching disk.

Would the project be open to making the Supervisor pluggable via a stable interface or entrypoint, with the current file-based one remaining the default? (This is not time sensitive and can wait until after 3.1.)

Down the line, this could also let us package our approach as a plugin, so other Airflow users could adopt Datadog log integration without custom hacks.
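To make the idea concrete, here is a minimal sketch of the shape such a pluggable Supervisor could take. Everything below is hypothetical: `DatadogSupervisor`, `handle_event`, and the `forward` callable are illustrative names, not part of Airflow's actual Supervisor API, and the enrichment/forwarding logic stands in for a real Datadog intake client.

```python
import json
from typing import Any, Callable, Optional

class DatadogSupervisor:
    """Hypothetical sketch: consume structured task-log JSON events and
    forward them to a log pipeline instead of writing per-task files."""

    def __init__(
        self,
        forward: Callable[[dict[str, Any]], None],  # e.g. a Datadog HTTP intake client
        tags: Optional[dict[str, str]] = None,
    ) -> None:
        self.forward = forward
        self.tags = tags or {}

    def handle_event(self, raw_event: str) -> dict[str, Any]:
        # Parse the structured JSON event emitted by the task process.
        event = json.loads(raw_event)
        # Enrich with static tags / trace IDs before shipping.
        event.setdefault("tags", {}).update(self.tags)
        # Ship directly to the pipeline; no disk I/O involved.
        self.forward(event)
        return event

# Usage: collect forwarded events in a list in place of a real pipeline.
shipped: list[dict[str, Any]] = []
sup = DatadogSupervisor(shipped.append, tags={"service": "airflow", "env": "prod"})
sup.handle_event('{"ts": "2024-01-01T00:00:00Z", "msg": "task started"}')
```

The key design point is that the file-writing step is replaced by an injected `forward` callable, so the default file-based behavior could remain the default while deployments like ours plug in a network sink.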
