GitHub user ecodina added a comment to the discussion: Integrating Airflow with 
SLURM

Hello @AndreaRiboniSF 

We developed a SlurmOperator, a SlurmTrigger, and some daemons. You can find more 
information in my presentation at last year's Airflow Summit 
(https://youtu.be/ol6k7df3Kr0)

Communication between Airflow and the daemons is done via Redis, but it is 
limited to task parameters (code, resources needed, and Slurm job status). 

On the cluster, all our jobs write their logs to the same folder, with a 
predefined name (`slurm-{jobid}.out`). We made this folder available on the 
virtual machine where Airflow runs (how you do this depends on your setup and 
on how your shared filesystem works), so the trigger can access a log just by 
knowing the job ID.
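To make the idea concrete, here is a minimal sketch of the trigger-side read. The folder path, function names, and byte limit are all assumptions for illustration; they are not part of our actual operator/trigger code:

```python
from pathlib import Path

# Assumed mount point of the cluster's shared log folder on the Airflow VM.
SHARED_LOG_DIR = Path("/mnt/slurm-logs")


def slurm_log_path(jobid: int, base: Path = SHARED_LOG_DIR) -> Path:
    # Jobs write to a predefined name: slurm-{jobid}.out
    return base / f"slurm-{jobid}.out"


def read_log_tail(jobid: int, base: Path = SHARED_LOG_DIR,
                  max_bytes: int = 65536) -> str:
    # Return the last chunk of the job's log, or "" if the file
    # does not exist yet (e.g. the job is still pending).
    path = slurm_log_path(jobid, base)
    if not path.exists():
        return ""
    data = path.read_bytes()
    return data[-max_bytes:].decode("utf-8", errors="replace")
```

A deferrable trigger could poll `read_log_tail` together with the job status it receives via Redis, and surface the tail in the Airflow task log.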

Another alternative would be to stream the logs via Redis. For that, you'd 
probably need a third daemon that monitors all the log files and sends the new 
lines. You could also look at whether fluentd or the ELK stack helps here. 
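The third-daemon idea could be sketched roughly like this. The `follow` helper and the Redis key scheme are assumptions for illustration (we never built this daemon); `rpush` is the standard redis-py list push:

```python
import time
from pathlib import Path
from typing import Iterator, Optional


def follow(path: Path, poll: float = 1.0,
           stop_after: Optional[int] = None) -> Iterator[str]:
    # Yield new lines appended to *path*, tail -f style.
    # stop_after bounds the number of polls (handy for testing).
    offset = 0
    polls = 0
    while stop_after is None or polls < stop_after:
        if path.exists():
            with path.open("r") as f:
                f.seek(offset)
                for line in f:
                    yield line.rstrip("\n")
                offset = f.tell()
        polls += 1
        if stop_after is None or polls < stop_after:
            time.sleep(poll)


def stream_to_redis(jobid: int, log_dir: Path, redis_client) -> None:
    # Push each new log line onto a per-job Redis list that the
    # trigger can consume (e.g. with BRPOP). Key scheme is assumed.
    key = f"slurm:logs:{jobid}"
    for line in follow(log_dir / f"slurm-{jobid}.out"):
        redis_client.rpush(key, line)
```

In practice the daemon would run one such loop per active job (or watch the whole folder with inotify), which is exactly the extra moving part that makes fluentd/ELK worth considering instead.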

GitHub link: 
https://github.com/apache/airflow/discussions/24076#discussioncomment-14639167
