
CS SLURM Cluster Report - 4 weeks

Report generated for jobs run on the CS SLURM cluster from 2026-01-18 through 2026-02-14.

Job total during this query range: 111,753

Job total since August 1st 2024: 5,429,688

This page is updated every Sunday at 5:00pm EST.


SLURM Scheduler System Output

--------------------------------------------------------------------------------
Cluster Utilization 2026-01-18T00:00:00 - 2026-02-14T23:59:59
Usage reported in TRES Hours/Percentage of Total
--------------------------------------------------------------------------------
  Cluster      TRES Name               Allocated                   Down         PLND Down                    Idle             Planned                 Reported 
--------- -------------- ----------------------- ---------------------- ----------------- ----------------------- ------------------- ------------------------ 
       cs            cpu          560823(20.14%)           47428(1.70%)          0(0.00%)         1689230(60.66%)      487287(17.50%)         2784768(100.00%) 
       cs            mem      3881901994(13.14%)       402346013(1.36%)        142(0.00%)     25263591851(85.50%)            0(0.00%)     29547840000(100.00%) 
       cs       gres/gpu           28495(22.44%)            2934(2.31%)          0(0.00%)           95579(75.25%)            0(0.00%)          127008(100.00%) 
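Each cell in the sreport table above shows TRES-hours followed by that category's share of the Reported total. A quick arithmetic check of the cpu row, as a minimal sketch using the figures above:

```python
# Allocated CPU-hours as a share of Reported CPU-hours (cpu row above).
allocated, reported = 560823, 2784768
share = 100 * allocated / reported
print(f"{share:.2f}%")  # matches the 20.14% shown for Allocated
```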

* Total Cluster Resources Available by Partition
 (Note: TRES is short for Trackable RESources)
PartitionName=cpu
   TRES=cpu=1534,mem=17306000M,node=39
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gpu
   TRES=cpu=2036,mem=22190000M,node=41,gres/gpu=158
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
PartitionName=nolim
   TRES=cpu=220,mem=2464000M,node=6
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gnolim
   TRES=cpu=256,mem=1626000M,node=10,gres/gpu=27
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
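By default SLURM computes a job's billable TRES from TRESBillingWeights like those above as the weighted sum of its allocated TRES (a weighted maximum is used instead when PriorityFlags=MAX_TRES is configured). A minimal sketch using the gpu partition's weights; the example job and its memory unit are illustrative assumptions:

```python
# Default billable-TRES calculation: weighted sum of allocated TRES.
# Weights are the gpu partition's TRESBillingWeights above; the job is
# hypothetical, with Mem expressed in the unit the weight expects.
def billable_tres(weights, alloc):
    return sum(w * alloc.get(tres, 0) for tres, w in weights.items())

gpu_weights = {"CPU": 1.0, "Mem": 0.15, "GRES/gpu": 2.0}
job = {"CPU": 8, "Mem": 32, "GRES/gpu": 1}
print(billable_tres(gpu_weights, job))  # 1.0*8 + 0.15*32 + 2.0*1 ≈ 14.8
```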

SLURM Usage by Partition

PartitionName | total_jobs | cputime(HH:MM:SS) | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
cpu | 89839 | 233940:12:36 | 78474 | 4113 | 0 | 6368 | 0 | 0 | 0 | 456 | 428 | 0 | 0 | 0 | 0 | 0 | 0
gpu | 17383 | 220782:09:50 | 14057 | 1059 | 0 | 2064 | 0 | 0 | 0 | 77 | 126 | 0 | 0 | 0 | 0 | 0 | 0
nolim | 4494 | 33109:08:42 | 2204 | 341 | 0 | 1948 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0
gnolim | 37 | 6416:45:20 | 25 | 0 | 2 | 3 | 0 | 0 | 0 | 5 | 2 | 0 | 0 | 0 | 0 | 0 | 0
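Per-partition state counts like those above can be derived from SLURM's accounting database; one plausible approach (the exact query behind this report is not documented here) is to aggregate parsable sacct output, e.g. from `sacct -a -X -S 2026-01-18 -E 2026-02-14T23:59:59 --format=Partition,State --parsable2`:

```python
# Tally job states per partition from sacct --parsable2 output:
# a header line, then pipe-separated Partition|State rows.
from collections import Counter

def states_by_partition(lines):
    counts = {}
    for line in lines[1:]:  # skip the header row
        partition, state = line.split("|")
        state = state.split()[0]  # "CANCELLED by 123" -> "CANCELLED"
        counts.setdefault(partition, Counter())[state] += 1
    return counts

sample = [
    "Partition|State",
    "cpu|COMPLETED",
    "cpu|FAILED",
    "gpu|CANCELLED by 100",
]
print(states_by_partition(sample))
```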

SLURM Usage by Advisor Group

  • slurm-cs-undefined, users who have CS accounts but are not CS students
  • slurm-cs-unassigned, users who are CS students but do not have a listed CS advisor
Columns: GroupName, total_jobs, cputime(HH:MM:SS), jobs per partition (cpu, gpu, nolim, gnolim), jobs by state (completed, cancelled, running, failed, preempted, requeued, pending, timeout, out_of_memory, suspended, boot_fail, deadline, node_fail, resizing, revoked)
slurm-cs-undefined3387219100:30:5226094882553599515540823000150000000
slurm-cs-henry-kautz1412856424:41:3498721742390860193504332000159101000000
slurm-cs-ashish-venkat4347849492:00:084347800039106204701797000250278000000
slurm-cs-lu-feng3637047193:22:182721491540234427511013160004112000000
slurm-cs-yen-ling-kuo66827814:52:30614540058254019000013000000
slurm-cs-madhur-behl45023472:47:32176274002463401000004426000000
slurm-cs-tianhao-wang25621349:25:520256001644604400011000000
slurm-cs-yue-cheng2413847:28:32024009101200020000000
slurm-cs-yu-meng109969:46:322800240400000000000
slurm-cs-kevin-skadron2057951:27:38961090012234036000121000000
slurm-cs-unassigned123817759:31:50575066310010283212018290003621000000
slurm-cs-mark-floryan12887:41:300100000000010000000
slurm-cs-zezhou-cheng102615:58:0001000300300040000000
slurm-cs-ferdinando-fioretto2142586:55:200214001326301600030000000
slurm-cs-hyojoon-kim37864:28:580370071301100051000000
slurm-cs-wei-kai-lin65357:26:509560024803100002000000
slurm-cs-briana-morrison24323:23:20024001520600010000000
slurm-cs-rich-nguyen19122:36:16019005001400000000000
slurm-cs-chen-chen1058:09:1810000400500010000000
slurm-cs-mircea-stan251:48:002000200000000000000
slurm-cs-haiying-shen101:21:040100000100000000000
slurm-cs-shangtong-zhang101:20:441000100000000000000
slurm-cs-matheus-xavier-ferreira601:07:446000200400000000000
slurm-cs-sebastian-elbaum600:04:060600500000001000000

SLURM Usage by NodeName

Columns: NodeName, total_jobs, cputime(HH:MM:SS), jobs by state (completed, cancelled, running, failed, preempted, requeued, pending, timeout, out_of_memory, suspended, boot_fail, deadline, node_fail, resizing, revoked)
puma01447024260:08:16372333603580002033000000
bigcat011164017503:27:281001754209450006868000000
serval0618116503:54:261391801900041000000
bigcat02976915318:49:22881730105790003933000000
serval0753214400:38:404862002100050000000
jaguar0120213943:05:5610033054000312000000
bigcat03513012819:48:484719159023600088000000
serval0930512620:16:442551902700022000000
cheetah0882612288:32:186825508400005000000
cheetah04257311648:03:1424831806700032000000
cheetah0264211619:43:384908106200018000000
bigcat04491511553:50:44420613605490001113000000
jaguar0348210452:50:2227488077000142000000
heartpiece61110437:44:3426470027600001000000
bigcat05542510015:05:1645981700643000113000000
lynx086189027:10:5042768011100075000000
serval085628573:45:245012403100042000000
cheetah0914068467:35:0097353037700021000000
slurm210938441:21:2456081045200000000000
lotus3198267:40:04111660118000915000000
affogato11557916:41:1843901100010000000
affogato0515457724:44:18139388049000105000000
affogato0416947543:11:18148396094000129000000
bigcat0662217207:11:24560019304040001311000000
affogato131137026:11:4453906900000000000
affogato0120706952:17:0218361100980001214000000
cheetah016376920:56:1251869037000211000000
slurm116956512:18:2488851075600000000000
affogato14566496:39:0443202000000000000
affogato0218246491:01:421550166094000311000000
slurm57506179:29:4836155033400000000000
serval036856021:52:20651210700042000000
jaguar0617036012:37:2015846504100076000000
struct0117215528:09:48151182011400068000000
affogato0614455250:01:561303510640001710000000
lynx10235190:09:523170200010000000
cheetah034834897:52:584223801700024000000
affogato0814884884:53:24135363054000126000000
ai031064653:43:0690130200010000000
panther0128634645:34:1224381390264000913000000
lynx097394533:23:3653960012600086000000
struct0713644499:05:40118372087000139000000
struct0215844489:52:18140264010200079000000
struct0414664473:10:2412888908000045000000
struct0315104435:28:3013209208500049000000
ai021094373:49:0889160400000000000
struct0616384285:50:02143086011100047000000
adriatic011274278:56:14512005300030000000
ai041164236:31:46101110300010000000
adriatic021104189:35:20421505000030000000
struct0515644161:56:50132788014000054000000
adriatic031003874:20:40202305400030000000
struct0812203840:30:3010727905900028000000
nekomata014243715:58:023082807900027000000
struct0915213664:41:1013648306300029000000
affogato15483541:21:2031902600000000000
ai061933449:38:421321804300000000000
jaguar0223273161:22:52208947018200045000000
cortado0114573118:34:1612208101020003222000000
cortado0212652917:46:18110728092000308000000
cortado0310772810:07:46925330750003014000000
cortado048332634:59:5473728048000137000000
affogato0915842449:40:52144573051000123000000
cortado057452325:43:066762803700031000000
affogato0717002208:02:281548690600001013000000
cortado066142086:03:545652302100032000000
affogato0314422040:41:22119076016300067000000
affogato109631761:00:328366905200060000000
adriatic041061734:35:04201906400030000000
cortado076431692:13:005704002500044000000
adriatic05791622:30:48211404100030000000
slurm32991441:29:4612860011100000000000
ai05101426:24:32120500002000000
cortado105721367:41:48532310500013000000
adriatic06551367:04:00211002100021000000
cortado084611350:53:30415320300047000000
cortado094211265:51:423693107000311000000
jinx0171222:38:00020500000000000
jinx0261141:12:48010400010000000
ai073867:59:44100100010000000
titanx054586:10:16000300010000000
titanx024586:10:08000300010000000
titanx033586:09:52000200010000000
ai01248489:24:06225801500000000000
jaguar051431278:04:16116315025200010000000
slurm42395:43:400230000000000000
lynx01600:00:08000600000000000
ai08000:00:00000000000000000
ai09000:00:00000000000000000
ai10000:00:00000000000000000
lynx02000:00:00000000000000000
lynx03000:00:00000000000000000
lynx04000:00:00000000000000000
lynx05000:00:00000000000000000
lynx06000:00:00000000000000000
lynx07000:00:00000000000000000

slurm_report_four_weeks.txt · Last modified: 2026/02/22 17:00 (external edit)