
CS SLURM Cluster Report - 1 week

Report generated for jobs run on the CS SLURM cluster from 2026-03-01 through 2026-03-07.

Job total during this query range: 39,775

Job total since August 1st, 2024: 5,594,942

This page is updated every Sunday at 5:00pm EST.


SLURM Scheduler System Output

--------------------------------------------------------------------------------
Cluster Utilization 2026-03-01T00:00:00 - 2026-03-07T23:59:59
Usage reported in TRES Hours/Percentage of Total
--------------------------------------------------------------------------------
  Cluster      TRES Name              Allocated                  Down         PLND Down                    Idle            Planned                Reported 
--------- -------------- ---------------------- --------------------- ----------------- ----------------------- ------------------ ----------------------- 
       cs            cpu         281071(41.35%)           3779(0.56%)          0(0.00%)          181872(26.76%)     213006(31.34%)         679728(100.00%) 
       cs            mem     1696226075(23.16%)      110263207(1.51%)          0(0.00%)      5515958718(75.33%)           0(0.00%)     7322448000(100.00%) 
       cs       gres/gpu           9273(29.83%)            428(1.38%)          0(0.00%)           21379(68.79%)           0(0.00%)          31080(100.00%) 
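Each row of the utilization table should satisfy the identity Allocated + Down + PLND Down + Idle + Planned = Reported, with each percentage taken against Reported. A quick sanity check on the cpu row (values copied from the table above):

```python
# TRES-hour columns from the cpu row of the utilization table above.
allocated, down, plnd_down, idle, planned, reported = (
    281071, 3779, 0, 181872, 213006, 679728
)

# The non-Reported columns partition the total reported TRES hours.
assert allocated + down + plnd_down + idle + planned == reported

# Each percentage in the table is the column divided by Reported.
print(f"allocated share: {allocated / reported:.2%}")  # 41.35%, as reported
```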

* Total Cluster Resources Available by Partition
 (Note: TRES is short for Trackable RESources)
PartitionName=cpu
   TRES=cpu=1534,mem=17306000M,node=39
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gpu
   TRES=cpu=2036,mem=22190000M,node=41,gres/gpu=158
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
PartitionName=nolim
   TRES=cpu=220,mem=2464000M,node=6
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gnolim
   TRES=cpu=256,mem=1626000M,node=10,gres/gpu=27
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
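TRESBillingWeights determines how a job's billable usage is computed: by default Slurm sums weight × allocated amount over the job's TRES (with PriorityFlags=MAX_TRES it instead takes the largest single weighted term). Below is a minimal sketch of the default weighted sum using the gpu partition's weights above; the example job size is hypothetical, and the unit the memory weight applies to depends on cluster configuration (per MB unless the weight carries a suffix such as 0.15G):

```python
# Weights from PartitionName=gpu above: CPU=1.0, Mem=0.15, GRES/gpu=2.0.
weights = {"cpu": 1.0, "mem": 0.15, "gres/gpu": 2.0}

def billing(alloc):
    """Default billing: sum of weight * allocated amount per TRES."""
    return sum(weights.get(tres, 0.0) * amount for tres, amount in alloc.items())

# Hypothetical job: 4 CPUs, 20 units of memory, 1 GPU.
print(billing({"cpu": 4, "mem": 20, "gres/gpu": 1}))  # 1.0*4 + 0.15*20 + 2.0*1 = 9.0
```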

SLURM Usage by Partition

Partition   total_jobs  cputime(HH:MM:SS)  completed  cancelled  failed  timeout  out_of_memory
----------  ----------  -----------------  ---------  ---------  ------  -------  -------------
cpu         28387       102954:19:32       25465      2320       125     296      181
gpu         9034        77593:47:34        6163       1414       1230    208      19
nolim       1138        12549:45:56        877        228        32      1        0
gnolim      1216        9447:56:24         1032       172        10      2        0

(The running, preempted, requeued, pending, suspended, boot_fail, deadline,
node_fail, resizing and revoked state columns are zero in every row and are
omitted.)
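For every row of the partition table, the state counts add back up to total_jobs. A check of the cpu row (numbers copied from the table above):

```python
# cpu partition state counts, keyed by state name; the remaining Slurm
# states (running, preempted, requeued, etc.) are zero this week.
cpu_states = {"completed": 25465, "cancelled": 2320, "failed": 125,
              "timeout": 296, "out_of_memory": 181}

assert sum(cpu_states.values()) == 28387  # total_jobs for the cpu partition

rate = cpu_states["completed"] / 28387
print(f"cpu completion rate: {rate:.1%}")  # about 89.7%
```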

SLURM Usage by Advisor Group

  • slurm-cs-undefined, users that have CS accounts but are not CS students
  • slurm-cs-unassigned, users that are CS students but do not have a listed CS advisor
GroupName                     total_jobs  cputime(HH:MM:SS)  cpu    gpu   nolim  gnolim  completed  cancelled  failed  timeout  out_of_memory
----------------------------  ----------  -----------------  -----  ----  -----  ------  ---------  ---------  ------  -------  -------------
slurm-cs-ashish-venkat        36990       78728:26:30        26882  7958  984    1166    31677      3683       1109    332      189
slurm-cs-undefined            361         73986:39:02        106    160   45     50      8          219        120     14       0
slurm-cs-adwait-jog           393         11864:47:52        300    0     93     0       190        133        41      29       0
slurm-cs-yen-ling-kuo         658         9599:58:48         641    17    0      0       643        6          8       1        0
slurm-cs-madhur-behl          133         7811:45:38         1      132   0      0       3          17         0       113      0
slurm-cs-unassigned           33          5835:22:34         7      26    0      0       5          8          19      1        0
slurm-cs-mark-floryan         2           5489:05:30         0      2     0      0       0          2          0       0        0
slurm-cs-mircea-stan          84          3434:09:28         82     2     0      0       44         31         3       0        6
slurm-cs-henry-kautz          134         3227:11:08         80     38    16     0       84         11         30      9        0
slurm-cs-lu-feng              54          1645:17:46         0      54    0      0       9          10         29      3        3
slurm-cs-ferdinando-fioretto  36          339:34:38          0      36    0      0       20         5          9       2        0
slurm-cs-chen-yu-wei          845         302:23:36          278    567   0      0       840        0          5       0        0
slurm-cs-hadi-daneshmand      29          261:55:52          0      29    0      0       4          5          18      0        2
slurm-cs-sebastian-elbaum     5           10:09:20           0      5     0      0       0          0          2       3        0
slurm-cs-kevin-skadron        15          06:57:56           10     5     0      0       8          4          3       0        0
slurm-cs-wei-kai-lin          2           01:16:36           0      2     0      0       2          0          0       0        0
slurm-cs-briana-morrison      1           00:47:12           0      1     0      0       0          0          1       0        0

(The cpu, gpu, nolim and gnolim columns are each group's job count per
partition; they sum to total_jobs. The running, preempted, requeued, pending,
suspended, boot_fail, deadline, node_fail, resizing and revoked state columns
are zero in every row and are omitted.)
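The cputime column is Slurm's cumulative HH:MM:SS format, where the hours field grows without bound, so plain string comparison does not order it correctly. A small helper (illustrative, not part of the report tooling) converts it to seconds for arithmetic or sorting:

```python
def cputime_to_seconds(t: str) -> int:
    """Convert Slurm-style 'H...H:MM:SS' (hours may exceed two digits) to seconds."""
    hours, minutes, seconds = (int(part) for part in t.split(":"))
    return hours * 3600 + minutes * 60 + seconds

# slurm-cs-ashish-venkat's weekly cputime from the table above:
print(cputime_to_seconds("78728:26:30"))  # 283422390 seconds
```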

SLURM Usage by NodeName

Nodename    total_jobs  cputime(HH:MM:SS)  completed  cancelled  failed  timeout  out_of_memory
----------  ----------  -----------------  ---------  ---------  ------  -------  -------------
puma01      212         13055:56:34        125        67         13      5        2
bigcat03    437         6893:41:40         382        48         4       3        0
jaguar03    774         6433:34:34         597        138        14      25       0
serval07    26          6299:01:28         3          4          2       17       0
bigcat01    460         5995:20:16         393        43         3       21       0
lotus       312         5289:33:16         221        32         17      39       3
cheetah02   261         5019:42:14         169        81         7       3        1
bigcat02    833         4874:36:38         737        59         25      10       2
bigcat05    1934        3813:23:02         1784       138        0       8        4
bigcat04    2076        3800:39:48         1899       137        14      5        21
cheetah04   12          3726:01:16         6          3          3       0        0
affogato01  96          3717:32:12         69         23         2       2        0
bigcat06    1963        3640:35:02         1783       140        20      13       7
affogato02  122         3564:24:32         100        14         6       1        1
heartpiece  144         3400:43:38         108        26         10      0        0
slurm2      315         3258:01:10         235        71         8       1        0
cheetah03   456         3178:03:32         347        82         10      17       0
affogato05  176         3168:28:14         126        29         5       16       0
serval09    198         3148:43:26         123        71         4       0        0
cortado01   1369        2856:33:46         1264       88         0       6        11
struct07    198         2833:56:16         157        19         12      10       0
cheetah09   210         2829:15:26         127        64         16      2        1
adriatic02  147         2800:05:36         96         44         4       3        0
panther01   55          2788:52:14         39         14         0       0        2
cortado02   1354        2777:23:08         1247       90         0       8        9
adriatic01  133         2768:15:42         83         42         4       4        0
cheetah08   244         2763:50:44         167        59         11      4        3
cortado03   1295        2754:37:48         1183       91         0       14       7
adriatic03  153         2728:48:04         106        41         4       2        0
lynx08      164         2678:29:12         136        19         2       7        0
cortado05   1067        2533:28:24         955        85         0       16       11
cortado06   1079        2474:00:12         958        98         0       12       11
cortado04   1076        2465:18:30         947        103        0       13       13
adriatic06  400         2436:18:48         303        72         15      9        1
cortado07   961         2401:42:22         851        89         0       15       6
cortado10   806         2329:37:22         709        91         0       0        6
cortado09   904         2297:20:48         798        93         0       4        9
serval08    29          2291:54:54         11         4          5       9        0
cortado08   1012        2267:37:48         893        96         0       8        15
struct08    147         2095:46:10         121        14         0       12       0
slurm5      86          2090:20:20         56         25         5       0        0
slurm1      137         2084:58:06         90         39         8       0        0
jaguar01    166         2034:13:46         116        24         14      11       1
affogato04  293         1966:13:42         258        26         3       6        0
cheetah01   237         1956:34:32         168        47         15      4        3
ai05        256         1924:31:36         212        42         2       0        0
nekomata01  364         1923:23:36         279        67         7       10       1
jaguar06    305         1890:51:40         232        42         8       21       2
serval06    199         1815:15:20         150        45         3       1        0
affogato13  157         1698:41:06         132        23         2       0        0
jinx02      127         1685:52:32         94         30         2       1        0
struct04    824         1685:00:42         767        55         2       0        0
affogato14  154         1673:31:14         139        13         2       0        0
jinx01      140         1670:29:44         105        32         2       1        0
adriatic04  122         1648:13:24         77         41         2       2        0
affogato11  184         1642:03:36         154        27         3       0        0
jaguar02    84          1634:33:14         62         13         7       1        1
lynx09      332         1623:54:10         299        24         2       7        0
struct03    836         1617:15:04         762        63         0       5        6
struct05    804         1585:22:18         741        55         3       3        2
struct01    775         1581:10:08         730        41         0       0        4
struct02    735         1564:13:38         672        55         0       4        4
slurm3      380         1559:17:16         325        54         1       0        0
affogato15  116         1466:42:34         110        2          4       0        0
ai06        407         1385:11:54         342        62         0       3        0
ai07        146         1240:25:16         136        9          1       0        0
struct06    177         1210:25:42         142        25         0       10       0
affogato03  665         1156:41:30         578        63         1       15       8
affogato06  614         1141:39:00         539        56         1       6        12
affogato07  568         1137:31:40         514        43         1       7        3
struct09    312         1127:05:52         285        17         4       6        0
titanx03    157         1108:58:28         142        14         1       0        0
affogato10  536         1058:34:36         493        36         0       5        2
affogato08  567         1044:01:08         519        41         1       6        0
affogato09  562         998:49:28          510        42         0       7        3
lynx10      370         961:33:54          292        69         1       7        1
titanx05    131         912:22:04          119        11         1       0        0
serval03    20          723:34:04          10         1          1       8        0
titanx02    160         619:34:54          143        16         1       0        0
lynx01      221         436:51:10          199        22         0       0        0
jaguar05    21          426:14:08          11         5          3       2        0
ai04        165         411:08:02          162        3          0       0        0
lynx03      248         315:45:32          209        37         1       1        0
ai01        120         287:14:42          117        3          0       0        0
lynx02      222         263:07:06          200        22         0       0        0
ai03        144         243:01:02          138        6          0       0        0
lynx04      174         231:01:38          150        23         1       0        0
ai02        133         224:42:00          125        8          0       0        0
ai08        65          218:35:06          56         9          0       0        0
lynx05      110         207:51:50          83         25         1       1        0
lynx07      103         189:15:22          80         21         1       0        1
slurm4      76          156:25:26          63         13         0       0        0
lynx06      77          147:01:52          64         11         1       1        0
ai09        27          64:27:16           19         8          0       0        0
adriatic05  21          43:00:16           13         5          2       1        0
ai10        7           02:39:28           6          1          0       0        0

(Rows are sorted by descending cputime. The running, preempted, requeued,
pending, suspended, boot_fail, deadline, node_fail, resizing and revoked state
columns are zero in every row and are omitted.)
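Because of the same unbounded-hours cputime format, ranking nodes numerically needs a conversion first. A sketch over the first few node rows above (pairs taken from the table):

```python
# (node, cputime) pairs from the top of the node table above.
rows = [
    ("puma01", "13055:56:34"),
    ("bigcat03", "6893:41:40"),
    ("jaguar03", "6433:34:34"),
    ("serval07", "6299:01:28"),
]

def seconds(t):
    """Turn 'H...H:MM:SS' into a sortable number of seconds."""
    h, m, s = (int(p) for p in t.split(":"))
    return h * 3600 + m * 60 + s

busiest = max(rows, key=lambda r: seconds(r[1]))
print(busiest[0])  # puma01
```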

slurm_report_one_week.txt · Last modified: 2026/03/15 17:00 by 127.0.0.1