
CS SLURM Cluster Report - 1 week

Report generated for jobs run on the CS SLURM cluster from 2026-04-26 through 2026-05-02.

Job total during this query range: 209,636

Job total since August 1st 2024: 6,149,701

This page is updated every Sunday at 5:00pm Eastern Time.


SLURM Scheduler System Output

--------------------------------------------------------------------------------
Cluster Utilization 2026-04-26T00:00:00 - 2026-05-02T23:59:59
Usage reported in TRES Hours/Percentage of Total
--------------------------------------------------------------------------------
  Cluster      TRES Name              Allocated              Down         PLND Down                    Idle            Planned                Reported 
--------- -------------- ---------------------- ----------------- ----------------- ----------------------- ------------------ ----------------------- 
       cs            cpu         172146(25.46%)          0(0.00%)          0(0.00%)          428645(63.41%)      75240(11.13%)         676032(100.00%) 
       cs            mem     1321410858(18.15%)          0(0.00%)          0(0.00%)      5959037142(81.85%)           0(0.00%)     7280448000(100.00%) 
       cs       gres/gpu           7445(24.08%)          0(0.00%)          0(0.00%)           23467(75.92%)           0(0.00%)          30912(100.00%) 
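Each percentage in the utilization table is that column's TRES-hours divided by the week's Reported total. As a quick arithmetic check (values copied from the cpu row above):

```python
# Values copied from the "cs / cpu" row of the utilization table above.
reported  = 676032   # Reported TRES hours (100.00%)
allocated = 172146   # Allocated
idle      = 428645   # Idle
planned   = 75240    # Planned

def pct(hours):
    """Share of the Reported total, rounded to two places as sreport prints it."""
    return round(100 * hours / reported, 2)

print(pct(allocated), pct(idle), pct(planned))  # 25.46 63.41 11.13
```

(Allocated + Idle + Planned comes to 676031 rather than 676032 because each column is independently truncated to whole hours.)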

* Total Cluster Resources Available by Partition
 (Note: TRES is short for Trackable RESources)
PartitionName=cpu
   TRES=cpu=1534,mem=17306000M,node=39
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gpu
   TRES=cpu=2036,mem=22190000M,node=41,gres/gpu=158
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
PartitionName=nolim
   TRES=cpu=220,mem=2464000M,node=6
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gnolim
   TRES=cpu=234,mem=1376000M,node=9,gres/gpu=26
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
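Assuming the four partition blocks above cover the whole cluster, the Reported column in the utilization table is simply total capacity multiplied by the 168 hours in the one-week query range. A sketch in Python with the numbers from this page:

```python
# TRES capacities copied from the four PartitionName blocks above.
HOURS_IN_WEEK = 7 * 24  # the report covers a 7-day window

cpus   = 1534 + 2036 + 220 + 234                  # cpu: cpu + gpu + nolim + gnolim
mem_mb = 17306000 + 22190000 + 2464000 + 1376000  # mem, in MB
gpus   = 158 + 26                                 # gres/gpu: gpu + gnolim

print(cpus * HOURS_IN_WEEK)    # 676032     = cpu "Reported" TRES hours
print(mem_mb * HOURS_IN_WEEK)  # 7280448000 = mem "Reported" TRES hours
print(gpus * HOURS_IN_WEEK)    # 30912      = gres/gpu "Reported" TRES hours
```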

SLURM Usage by Partition

PartitionName | total_jobs | cputime(HH:MM:SS) | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
gpu | 95806 | 90148:56:50 | 94609 | 333 | 0 | 832 | 0 | 0 | 0 | 23 | 9 | 0 | 0 | 0 | 0 | 0 | 0
cpu | 111885 | 73949:40:02 | 108976 | 616 | 0 | 2235 | 0 | 0 | 0 | 43 | 15 | 0 | 0 | 0 | 0 | 0 | 0
gnolim | 1945 | 3360:04:18 | 1755 | 95 | 0 | 95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
nolim | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0

SLURM Usage by Advisor Group

  • slurm-cs-undefined: users who have CS accounts but are not CS students
  • slurm-cs-unassigned: users who are CS students but do not have a listed CS advisor
GroupName | total_jobs | cputime(HH:MM:SS) | cpu | gpu | nolim | gnolim | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
slurm-cs-chen-yu-wei | 171374 | 84564:20:26 | 84944 | 86430 | 0 | 0 | 169302 | 0 | 0 | 2072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-lu-feng | 33813 | 21927:56:24 | 23577 | 8292 | 0 | 1944 | 32551 | 494 | 0 | 768 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-undefined | 549 | 14513:36:40 | 475 | 74 | 0 | 0 | 306 | 205 | 0 | 35 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-ashish-venkat | 2315 | 13970:59:02 | 2315 | 0 | 0 | 0 | 2058 | 255 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-unassigned | 60 | 7158:01:42 | 2 | 58 | 0 | 0 | 16 | 4 | 0 | 31 | 0 | 0 | 0 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-hadi-daneshmand | 55 | 5961:04:56 | 0 | 55 | 0 | 0 | 42 | 4 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-yu-meng | 5 | 5447:26:00 | 0 | 5 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-zezhou-cheng | 156 | 3452:17:56 | 126 | 30 | 0 | 0 | 61 | 27 | 0 | 29 | 0 | 0 | 0 | 39 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-kevin-skadron | 69 | 3340:50:56 | 37 | 32 | 0 | 0 | 54 | 2 | 0 | 12 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-ferdinando-fioretto | 245 | 1569:26:52 | 108 | 137 | 0 | 0 | 181 | 15 | 0 | 36 | 0 | 0 | 0 | 3 | 10 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-shangtong-zhang | 10 | 1370:19:44 | 0 | 10 | 0 | 0 | 2 | 1 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-yangfeng-ji | 264 | 1256:35:44 | 0 | 264 | 0 | 0 | 189 | 0 | 0 | 72 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-yen-ling-kuo | 541 | 1066:35:04 | 230 | 311 | 0 | 0 | 480 | 13 | 0 | 48 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-wei-kai-lin | 43 | 794:17:58 | 1 | 42 | 0 | 0 | 15 | 2 | 0 | 16 | 0 | 0 | 0 | 1 | 9 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-tianhao-wang | 8 | 602:44:48 | 0 | 8 | 0 | 0 | 3 | 3 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-aidong-zhang | 79 | 183:06:22 | 38 | 41 | 0 | 0 | 56 | 8 | 0 | 11 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-henry-kautz | 14 | 134:36:44 | 8 | 6 | 0 | 0 | 8 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-matheus-xavier-ferreira | 24 | 75:42:48 | 24 | 0 | 0 | 0 | 12 | 4 | 0 | 7 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-rich-nguyen | 6 | 66:36:16 | 0 | 6 | 0 | 0 | 2 | 2 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-yue-cheng | 6 | 02:04:48 | 0 | 5 | 0 | 1 | 2 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0

SLURM Usage by NodeName

Nodename | total_jobs | cputime(HH:MM:SS) | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
cheetah04 | 7 | 11051:56:00 | 0 | 2 | 0 | 2 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar01 | 8254 | 7210:26:38 | 8192 | 7 | 0 | 55 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat01 | 5812 | 6371:19:22 | 5777 | 18 | 0 | 17 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat02 | 5781 | 6354:59:30 | 5730 | 32 | 0 | 19 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah02 | 9991 | 5990:53:34 | 9887 | 25 | 0 | 79 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah08 | 5856 | 5683:13:52 | 5800 | 9 | 0 | 46 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah09 | 6022 | 5451:22:06 | 5954 | 13 | 0 | 52 | 0 | 0 | 0 | 2 | 1 | 0 | 0 | 0 | 0 | 0 | 0
jaguar06 | 11075 | 4951:49:28 | 11039 | 1 | 0 | 33 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah03 | 9055 | 4853:55:08 | 8982 | 15 | 0 | 56 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar02 | 4775 | 4710:29:08 | 4692 | 10 | 0 | 72 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
serval07 | 562 | 4675:56:48 | 525 | 18 | 0 | 16 | 0 | 0 | 0 | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0
puma01 | 8302 | 4641:47:58 | 8144 | 72 | 0 | 86 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx08 | 6106 | 3806:28:06 | 5902 | 9 | 0 | 190 | 0 | 0 | 0 | 2 | 3 | 0 | 0 | 0 | 0 | 0 | 0
bigcat03 | 5831 | 3192:05:04 | 5725 | 48 | 0 | 50 | 0 | 0 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai06 | 5163 | 3159:55:24 | 5125 | 0 | 0 | 37 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx10 | 4771 | 3138:08:12 | 4732 | 4 | 0 | 35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval03 | 1090 | 3090:53:52 | 1070 | 1 | 0 | 17 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0
lynx09 | 5178 | 3072:44:30 | 4989 | 14 | 0 | 174 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
bigcat05 | 6289 | 2787:58:40 | 6191 | 28 | 0 | 69 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat04 | 6238 | 2720:57:04 | 6124 | 43 | 0 | 68 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat06 | 6505 | 2501:03:04 | 6413 | 25 | 0 | 67 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct01 | 4152 | 2433:32:16 | 4048 | 14 | 0 | 89 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
struct02 | 4135 | 2414:31:30 | 4014 | 26 | 0 | 93 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0
serval09 | 2677 | 2402:57:54 | 2644 | 16 | 0 | 15 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0
jaguar03 | 52 | 2316:28:20 | 36 | 2 | 0 | 14 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
nekomata01 | 5944 | 2278:15:52 | 5896 | 7 | 0 | 41 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato02 | 2473 | 2234:31:50 | 2411 | 6 | 0 | 56 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct09 | 3553 | 2209:44:24 | 3462 | 15 | 0 | 75 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct08 | 2902 | 2171:14:06 | 2757 | 19 | 0 | 116 | 0 | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct03 | 3686 | 2076:27:16 | 3598 | 18 | 0 | 70 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval08 | 1862 | 2063:08:34 | 1850 | 2 | 0 | 9 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct06 | 3150 | 2058:39:22 | 3068 | 21 | 0 | 55 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lotus | 436 | 1956:19:24 | 406 | 14 | 0 | 15 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
adriatic01 | 2479 | 1838:25:10 | 2425 | 15 | 0 | 39 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato05 | 1940 | 1824:17:26 | 1821 | 21 | 0 | 98 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct07 | 2421 | 1809:27:18 | 2339 | 22 | 0 | 51 | 0 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato01 | 2711 | 1780:59:36 | 2615 | 18 | 0 | 78 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar05 | 2524 | 1744:50:00 | 2474 | 8 | 0 | 41 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
struct05 | 2437 | 1718:53:36 | 2267 | 25 | 0 | 144 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
affogato04 | 2225 | 1674:19:38 | 2115 | 6 | 0 | 103 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
adriatic02 | 2198 | 1598:42:52 | 2153 | 15 | 0 | 30 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic03 | 2259 | 1544:54:48 | 2225 | 20 | 0 | 12 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct04 | 1303 | 1522:07:04 | 1276 | 14 | 0 | 13 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
panther01 | 1250 | 1405:37:38 | 1212 | 14 | 0 | 22 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0
adriatic05 | 2022 | 1358:02:44 | 1991 | 15 | 0 | 16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato06 | 1339 | 1357:31:32 | 1279 | 6 | 0 | 52 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0
adriatic04 | 2004 | 1353:58:30 | 1975 | 15 | 0 | 13 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic06 | 2034 | 1328:41:58 | 1999 | 15 | 0 | 20 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato10 | 1160 | 1305:47:32 | 1092 | 14 | 0 | 52 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0
affogato09 | 2126 | 1298:46:14 | 2046 | 9 | 0 | 70 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
affogato07 | 2164 | 1244:53:58 | 2096 | 6 | 0 | 62 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato08 | 2110 | 1217:34:08 | 2043 | 10 | 0 | 57 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato03 | 1381 | 1068:46:36 | 1329 | 5 | 0 | 46 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
affogato11 | 240 | 834:00:26 | 225 | 6 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai05 | 345 | 679:21:28 | 293 | 22 | 0 | 30 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah01 | 10 | 660:13:08 | 2 | 2 | 0 | 3 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval06 | 971 | 646:55:02 | 935 | 31 | 0 | 3 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai07 | 65 | 551:26:42 | 48 | 8 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado01 | 712 | 544:35:56 | 689 | 11 | 0 | 12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado02 | 803 | 538:42:20 | 781 | 15 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado08 | 987 | 441:33:54 | 946 | 0 | 0 | 40 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai08 | 56 | 440:48:26 | 40 | 8 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado03 | 552 | 440:36:00 | 543 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jinx01 | 457 | 433:04:10 | 423 | 18 | 0 | 16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado10 | 1233 | 410:11:52 | 1221 | 1 | 0 | 11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jinx02 | 375 | 395:28:28 | 359 | 10 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado09 | 1108 | 380:33:00 | 1096 | 0 | 0 | 12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado04 | 439 | 335:46:20 | 439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
titanx03 | 463 | 319:13:26 | 433 | 17 | 0 | 13 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai09 | 33 | 275:25:20 | 25 | 4 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato13 | 229 | 269:50:44 | 218 | 7 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado05 | 370 | 235:51:46 | 370 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato14 | 189 | 230:20:16 | 178 | 7 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado07 | 729 | 225:45:48 | 718 | 0 | 0 | 11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx03 | 62 | 213:12:02 | 61 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato15 | 162 | 208:54:34 | 151 | 7 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx04 | 56 | 198:59:54 | 56 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
titanx05 | 147 | 197:35:42 | 130 | 8 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx02 | 73 | 197:24:54 | 71 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx01 | 104 | 164:33:04 | 97 | 4 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai03 | 119 | 146:26:50 | 106 | 5 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai04 | 118 | 141:50:42 | 108 | 5 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai02 | 114 | 134:43:22 | 101 | 5 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx05 | 50 | 125:09:04 | 46 | 0 | 0 | 3 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado06 | 291 | 118:56:56 | 290 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai01 | 83 | 98:39:42 | 73 | 4 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai10 | 4 | 67:40:36 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx07 | 57 | 62:11:46 | 57 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx06 | 52 | 61:45:26 | 52 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
heartpiece | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm1 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm2 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm3 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm4 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm5 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0

slurm_report_one_week.txt · Last modified: 2026/05/10 17:00 by 127.0.0.1