It's sad that seff is being deprecated because the Perl API is being dropped. It would be great if it were reimplemented via the C API, the REST API, or by parsing command output in Python.
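
To make the command-parsing option concrete, here is a minimal sketch (not anyone's production code): a few lines of Python around `sacct --parsable2` that reproduce seff's core CPU-efficiency figure, TotalCPU / (Elapsed * AllocCPUS). It assumes slurmdbd accounting is available so sacct can see finished jobs; a real replacement would also need memory figures (ReqMem/MaxRSS), job arrays, cancelled jobs, and so on.

#!/usr/bin/env python3
"""Sketch of a seff-style report via command parsing (no Perl API needed).

Assumes slurmdbd accounting is set up so `sacct` can see finished jobs.
Only the CPU-efficiency figure is shown; memory, arrays, etc. are omitted.
"""
import subprocess
import sys


def slurm_time_to_seconds(value: str) -> float:
    """Convert Slurm time strings ('1-02:03:04', '02:03:04', '05:06.789') to seconds."""
    if not value:
        return 0.0
    days = 0
    if "-" in value:
        day_part, value = value.split("-", 1)
        days = int(day_part)
    parts = [float(p) for p in value.split(":")]
    while len(parts) < 3:          # pad missing hours/minutes on the left
        parts.insert(0, 0.0)
    hours, minutes, seconds = parts
    return days * 86400 + hours * 3600 + minutes * 60 + seconds


def cpu_efficiency(jobid: str) -> float:
    """seff-style CPU efficiency: TotalCPU / (Elapsed * AllocCPUS),
    taken from the top-level allocation record (-X skips the job steps)."""
    line = subprocess.run(
        ["sacct", "-j", jobid, "-X", "--noheader", "--parsable2",
         "--format=TotalCPU,Elapsed,AllocCPUS"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]
    total_cpu, elapsed, alloc_cpus = line.split("|")
    used = slurm_time_to_seconds(total_cpu)
    wall = slurm_time_to_seconds(elapsed)
    cores = int(alloc_cpus)
    return used / (wall * cores) if wall and cores else 0.0


if __name__ == "__main__":
    jid = sys.argv[1]
    print(f"Job {jid}: CPU efficiency {cpu_efficiency(jid):.1%}")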

As I understand it, OSC has implemented a seff-gpu along the same lines as Princeton's jobstats (they require the same collectors), but the source code is not public. It would be great to merge that into the general jobstats platform.

Suffice it to say, seff is a really useful command and it would be a shame to see it die; in fact, it should be extended to include GPUs.
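
For what it's worth, the allocation half of a GPU-aware seff is already in accounting: sacct's AllocTRES field records gres/gpu counts. The utilization half is what needs a collector, which is exactly the dependency jobstats and seff-gpu share. Below is a minimal sketch, assuming TRES tracking is enabled; the gpu_utilization() function is a hypothetical placeholder for whatever collector a site runs (DCGM exporter, nvidia-smi sampling, etc.).

#!/usr/bin/env python3
"""Sketch of the GPU half of a seff-like report.

Allocated GPU counts come straight from Slurm accounting (AllocTRES).
Actual GPU utilization is NOT in standard accounting; gpu_utilization()
below is a hypothetical placeholder for a site collector.
"""
import subprocess
import sys
from typing import Optional


def allocated_gpus(jobid: str) -> int:
    """Parse sacct's AllocTRES field, e.g.
    'billing=8,cpu=8,gres/gpu=2,mem=32G,node=1', and return the GPU count.
    (Typed GRES entries such as 'gres/gpu:a100=2' are ignored here.)"""
    alloc_tres = subprocess.run(
        ["sacct", "-j", jobid, "-X", "--noheader", "--parsable2",
         "--format=AllocTRES"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]
    for item in alloc_tres.split(","):
        key, _, value = item.partition("=")
        if key == "gres/gpu":
            return int(value)
    return 0


def gpu_utilization(jobid: str) -> Optional[float]:
    """Hypothetical hook: average GPU utilization (0-1) over the job.

    Slurm accounting does not record this; it would have to come from an
    external collector (DCGM exporter, nvidia-smi sampling, etc.), which is
    the extra machinery that jobstats and seff-gpu both rely on."""
    return None  # no collector wired up in this sketch


if __name__ == "__main__":
    jid = sys.argv[1]
    gpus = allocated_gpus(jid)
    util = gpu_utilization(jid)
    util_str = "unknown (no collector configured)" if util is None else f"{util:.1%}"
    print(f"Job {jid}: {gpus} GPU(s) allocated, GPU utilization {util_str}")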

-Paul Edmon-

On 9/10/25 12:01 PM, Prentice Bisbal via slurm-users wrote:
On 9/10/25 11:37 AM, Prentice Bisbal wrote:
On 9/2/25 10:01 AM, Loris Bennett via slurm-users wrote:
Hi,

Josu Lazkano Lete via slurm-users
<slurm-users@lists.schedmd.com> writes:

Hello,

We are looking to optimize the GPU jobs of our HPC users. Is it possible to add GPU info to seff?

It would be great to know how many GPU resources users request and compare that with how much they actually use.
Various sites have produced their own seff-like programs.
We currently use

   https://github.com/PrincetonUniversity/jobstats

which reports CPU, memory, and GPU utilization, and also gives users
suggestions about the amount of resources they should request for
similar future jobs.

Cheers,

Loris


Both seff and jobstats were created by the same group of people at Princeton University.

Prentice


I just e-mailed one of the developers/maintainers of seff and jobstats. seff is being deprecated because it uses the Slurm Perl API, which SchedMD is moving away from. For their own in-house use, jobstats has largely replaced seff, but moving to jobstats requires setting up additional machinery that seff doesn't need, so jobstats is not a 1:1 replacement for seff.

Unfortunately, it doesn't look like GPU support will be added to seff.

Prentice

