


CS SLURM Cluster Report - 1 week

Report generated for jobs run on the CS SLURM cluster from 2026-03-22 through 2026-03-28.

Job total during this query range: 55,517

Job total since August 1st 2024: 5,708,654

This page is updated every Sunday at 5:00 PM Eastern Time.


SLURM Scheduler System Output

--------------------------------------------------------------------------------
Cluster Utilization 2026-03-22T00:00:00 - 2026-03-28T23:59:59
Usage reported in TRES Hours/Percentage of Total
--------------------------------------------------------------------------------
  Cluster      TRES Name              Allocated                  Down         PLND Down                    Idle             Planned                Reported 
--------- -------------- ---------------------- --------------------- ----------------- ----------------------- ------------------- ----------------------- 
       cs            cpu         357842(52.64%)           2650(0.39%)          0(0.00%)            19678(2.90%)      299558(44.07%)         679728(100.00%) 
       cs            mem     2046306206(27.95%)       30111667(0.41%)          0(0.00%)      5246030127(71.64%)            0(0.00%)     7322448000(100.00%) 
       cs       gres/gpu          13774(44.32%)            120(0.39%)          0(0.00%)           17185(55.29%)            0(0.00%)          31080(100.00%) 
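Each percentage in the sreport block above is that column's TRES-hours divided by the Reported total, and for the cpu row the five state columns account for the Reported total exactly (other rows can be off by an hour due to rounding). A quick arithmetic check of the cpu row, with values copied from the table:

```python
# Sanity-check the cpu row of the sreport utilization block.
# Values are TRES-hours copied from the table above.
allocated, down, plnd_down, idle, planned, reported = 357842, 2650, 0, 19678, 299558, 679728

# The five state columns account for the full reported total...
assert allocated + down + plnd_down + idle + planned == reported

# ...and each percentage is that column divided by Reported.
pct = lambda tres_hours: round(100 * tres_hours / reported, 2)
print(pct(allocated), pct(down), pct(planned))  # 52.64 0.39 44.07
```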

* Total Cluster Resources Available by Partition
 (Note: TRES is short for Trackable RESources)
PartitionName=cpu
   TRES=cpu=1534,mem=17306000M,node=39
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gpu
   TRES=cpu=2036,mem=22190000M,node=41,gres/gpu=158
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
PartitionName=nolim
   TRES=cpu=220,mem=2464000M,node=6
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gnolim
   TRES=cpu=234,mem=1376000M,node=9,gres/gpu=26
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
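TRESBillingWeights controls how a job's billable usage is computed from its allocated TRES; by default Slurm sums each weighted resource (with PriorityFlags=MAX_TRES it instead takes the largest node-local weighted TRES). A minimal sketch of the default summing behavior, using the cpu and gpu partition weights above — the memory unit is assumed to be GB here purely for illustration; the real unit depends on the suffix used in slurm.conf:

```python
# Billing under TRESBillingWeights: sum of (allocated amount * weight)
# over each trackable resource. Weights are the cpu and gpu partition
# values from this report; memory amounts below are in GB (an assumption,
# not something the report states).
def billing(alloc: dict, weights: dict) -> float:
    return sum(amount * weights.get(tres, 0.0) for tres, amount in alloc.items())

cpu_weights = {"cpu": 2.0, "mem": 0.15}
gpu_weights = {"cpu": 1.0, "mem": 0.15, "gres/gpu": 2.0}

# Hypothetical jobs:
print(billing({"cpu": 8, "mem": 64}, cpu_weights))                 # ~25.6
print(billing({"cpu": 4, "mem": 32, "gres/gpu": 2}, gpu_weights))  # ~12.8
```

Note how the same 4-CPU allocation costs half as much on the gpu partition (CPU weight 1.0) as on the cpu partition (weight 2.0), steering pure-CPU work away from GPU nodes.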

SLURM Usage by Partition

PartitionName  total_jobs  cputime(HH:MM:SS)  completed  cancelled  running  failed  preempted  requeued  pending  timeout  out_of_memory  suspended  boot_fail  deadline  node_fail  resizing  revoked
cpu                 45447       119907:37:44      42228       1596        0     833          0         0        0      158            632          0          0         0          0         0        0
gpu                  4665        89031:36:12       3472        376        0     690          0         0        0       63             64          0          0         0          0         0        0
nolim                2807        41166:06:28       2661         91        0       0          0         0        0       25             30          0          0         0          0         0        0
gnolim               2598        40786:03:08       2435        102        0       3          0         0        0       16             42          0          0         0          0         0        0
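Per-partition state counts like those above can be tallied from sacct's accounting records. One way (a sketch of the idea; the actual report generator is not shown on this page) is to pull one record per job with `sacct -a -X -S 2026-03-22 -E 2026-03-28T23:59:59 --format=Partition,State --parsable2 --noheader` and aggregate the lines:

```python
from collections import Counter, defaultdict

# Sample --parsable2 output lines (made up for illustration).
sacct_lines = """\
cpu|COMPLETED
cpu|FAILED
gpu|COMPLETED
gpu|CANCELLED by 3204
cpu|TIMEOUT
gpu|OUT_OF_MEMORY
cpu|COMPLETED"""

per_partition = defaultdict(Counter)
for line in sacct_lines.splitlines():
    partition, state = line.split("|")
    # sacct reports cancellations as "CANCELLED by <uid>"; fold those together.
    state = state.split()[0]
    per_partition[partition][state] += 1

print(dict(per_partition["cpu"]))  # {'COMPLETED': 2, 'FAILED': 1, 'TIMEOUT': 1}
```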

SLURM Usage by Advisor Group

  • slurm-cs-undefined, users that have CS accounts but are not CS students
  • slurm-cs-unassigned, users that are CS students but do not have a listed CS advisor
GroupName                     total_jobs  cputime(HH:MM:SS)    cpu   gpu  nolim  gnolim  completed  cancelled  running  failed  preempted  requeued  pending  timeout  out_of_memory  suspended  boot_fail  deadline  node_fail  resizing  revoked
slurm-cs-undefined                   192       191560:14:08     97    75     10      10         45         52        0       7          0         0        0       88              0          0          0         0          0         0        0
slurm-cs-ashish-venkat             44867        36994:47:32  36405  3080   2797    2585      42043       1813        0     127          0         0        0      136            748          0          0         0          0         0        0
slurm-cs-hadi-daneshmand              46        17987:04:40      0    46      0       0         24          1        0      13          0         0        0        6              2          0          0         0          0         0        0
slurm-cs-lu-feng                    8124        14105:05:10   7474   650      0       0       7123        252        0     746          0         0        0        3              0          0          0         0          0         0        0
slurm-cs-zezhou-cheng                190         5479:45:20      1   189      0       0         79          1        0     107          0         0        0        1              2          0          0         0          0         0        0
slurm-cs-henry-kautz                 435         5219:00:48    384    51      0       0        389          3        0      33          0         0        0        3              7          0          0         0          0         0        0
slurm-cs-unassigned                   47         4753:34:10      2    45      0       0         10          3        0      31          0         0        0        3              0          0          0         0          0         0        0
slurm-cs-mircea-stan                 372         3624:56:58    372     0      0       0         59          0        0     304          0         0        0        9              0          0          0         0          0         0        0
slurm-cs-tianhao-wang                 45         3098:23:38      0    45      0       0         32          6        0       7          0         0        0        0              0          0          0         0          0         0        0
slurm-cs-yu-meng                       1         2887:33:30      0     1      0       0          0          0        0       0          0         0        0        1              0          0          0         0          0         0        0
slurm-cs-yen-ling-kuo                903         2809:14:42    690   213      0       0        845         20        0      34          0         0        0        1              3          0          0         0          0         0        0
slurm-cs-yue-cheng                     4         1037:23:28      0     4      0       0          2          0        0       0          0         0        0        2              0          0          0         0          0         0        0
slurm-cs-kevin-skadron                96          861:01:26      0    96      0       0         56         10        0      19          0         0        0        6              5          0          0         0          0         0        0
slurm-cs-sebastian-elbaum             56          334:38:02      0    53      0       3         12          4        0      39          0         0        0        1              0          0          0         0          0         0        0
slurm-cs-madhur-behl                 100           52:22:28      0   100      0       0         59          0        0      40          0         0        0        1              0          0          0         0          0         0        0
slurm-cs-ferdinando-fioretto           5           31:19:42      0     5      0       0          4          0        0       1          0         0        0        0              0          0          0         0          0         0        0
slurm-cs-chen-yu-wei                   1           26:45:16      1     0      0       0          1          0        0       0          0         0        0        0              0          0          0         0          0         0        0
slurm-cs-chen-chen                     7           17:37:04      0     7      0       0          4          0        0       2          0         0        0        1              0          0          0         0          0         0        0
slurm-cs-wei-kai-lin                  26           10:35:30     21     5      0       0          9          0        0      16          0         0        0        0              1          0          0         0          0         0        0
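Every job belongs to exactly one advisor group, so the total_jobs column of the advisor-group table should account for the 55,517 jobs reported for this query range. Checking that with the per-group job totals, listed in table order:

```python
# total_jobs per advisor group, in table order (slurm-cs-undefined
# through slurm-cs-wei-kai-lin).
group_totals = [192, 44867, 46, 8124, 190, 435, 47, 372, 45, 1,
                903, 4, 96, 56, 100, 5, 1, 7, 26]

# Should match the report-wide job count for the week.
print(sum(group_totals))  # 55517
```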

SLURM Usage by NodeName

Nodename    total_jobs  cputime(HH:MM:SS)  completed  cancelled  running  failed  preempted  requeued  pending  timeout  out_of_memory  suspended  boot_fail  deadline  node_fail  resizing  revoked
serval03       124   14719:54:34    113    1   0    3   0   0   0    2    5   0   0   0   0   0   0
puma01         853   13458:08:42    806   26   0    7   0   0   0    9    5   0   0   0   0   0   0
affogato02     442    9066:56:36    379   10   0   40   0   0   0    1   12   0   0   0   0   0   0
slurm2         617    8397:30:54    577   26   0    0   0   0   0    6    8   0   0   0   0   0   0
slurm3         588    8324:12:28    563   18   0    0   0   0   0    4    3   0   0   0   0   0   0
slurm4         584    8289:35:50    559   16   0    0   0   0   0    5    4   0   0   0   0   0   0
heartpiece     308    7978:34:02    292    6   0    0   0   0   0    4    6   0   0   0   0   0   0
ai07           231    7920:35:42    213   10   0    0   0   0   0    4    4   0   0   0   0   0   0
ai10           259    7914:36:38    238    9   0    0   0   0   0    3    9   0   0   0   0   0   0
slurm5         178    7906:54:36    164    6   0    0   0   0   0    4    4   0   0   0   0   0   0
cortado03     1895    6781:31:56   1744   77   0    3   0   0   0    7   64   0   0   0   0   0   0
cheetah02      126    6105:06:00     66   26   0   28   0   0   0    6    0   0   0   0   0   0   0
cortado09     1159    5837:56:18   1124   24   0    0   0   0   0    8    3   0   0   0   0   0   0
cortado10     1398    5665:42:42   1336   33   0    5   0   0   0    6   18   0   0   0   0   0   0
cortado05     1466    5573:09:46   1369   44   0    7   0   0   0    4   42   0   0   0   0   0   0
cortado02     3255    5458:26:10   3008  148   0   25   0   0   0   16   58   0   0   0   0   0   0
cortado06     1186    5352:57:14   1121   22   0   13   0   0   0    5   25   0   0   0   0   0   0
bigcat06      1094    5318:41:20    950   33   0   85   0   0   0    8   18   0   0   0   0   0   0
cheetah09      219    5305:02:06    152   21   0   41   0   0   0    4    1   0   0   0   0   0   0
cortado01     3777    5253:13:58   3511  173   0   26   0   0   0   16   51   0   0   0   0   0   0
cortado04     3434    5246:33:26   3209  138   0   23   0   0   0    5   59   0   0   0   0   0   0
cortado07      888    4977:25:50    857   11   0   14   0   0   0    3    3   0   0   0   0   0   0
cheetah08      344    4903:17:06    255   24   0   60   0   0   0    2    3   0   0   0   0   0   0
bigcat04      1938    4540:13:48   1788   80   0   41   0   0   0    4   25   0   0   0   0   0   0
bigcat01      2483    4521:15:44   2266   83   0  113   0   0   0    6   15   0   0   0   0   0   0
ai05           540    4342:43:56    505   26   0    0   0   0   0    3    6   0   0   0   0   0   0
bigcat03      2407    4230:50:36   2189   81   0   88   0   0   0    7   42   0   0   0   0   0   0
struct01       483    4201:49:06    448   27   0    6   0   0   0    2    0   0   0   0   0   0   0
titanx05       358    4122:52:36    331   15   0    0   0   0   0    1   11   0   0   0   0   0   0
titanx03       261    4116:43:28    246   10   0    0   0   0   0    2    3   0   0   0   0   0   0
slurm1         533    4113:22:06    506   19   0    0   0   0   0    3    5   0   0   0   0   0   0
ai08           264    4071:42:10    251   10   0    0   0   0   0    1    2   0   0   0   0   0   0
jinx02         209    4058:20:22    198    5   0    0   0   0   0    1    5   0   0   0   0   0   0
jaguar01       409    3961:14:06    332   31   0   34   0   0   0    7    5   0   0   0   0   0   0
jinx01         125    3948:09:52    121    3   0    0   0   0   0    1    0   0   0   0   0   0   0
adriatic01     133    3931:54:58     77   13   0   40   0   0   0    3    0   0   0   0   0   0   0
serval09       176    3923:18:50    143   27   0    1   0   0   0    1    4   0   0   0   0   0   0
jaguar02       214    3908:10:20    147    7   0   54   0   0   0    3    3   0   0   0   0   0   0
ai09            54    3879:43:22     52    1   0    0   0   0   0    1    0   0   0   0   0   0   0
cortado08     1206    3870:41:20   1155   22   0   14   0   0   0    9    6   0   0   0   0   0   0
struct03       546    3723:43:04    509   24   0    3   0   0   0    3    7   0   0   0   0   0   0
serval08       244    3614:18:28    214   20   0    2   0   0   0    4    4   0   0   0   0   0   0
lynx09        1113    3599:41:28   1053   31   0    7   0   0   0    3   19   0   0   0   0   0   0
adriatic06      49    3499:32:24     43    2   0    0   0   0   0    3    1   0   0   0   0   0   0
affogato01     738    3448:04:00    679   28   0    7   0   0   0    2   22   0   0   0   0   0   0
cheetah01       64    3401:09:48     45    8   0    6   0   0   0    5    0   0   0   0   0   0   0
bigcat05       934    3319:46:08    836   23   0   52   0   0   0    6   17   0   0   0   0   0   0
lotus          104    3146:26:04     64    9   0   28   0   0   0    3    0   0   0   0   0   0   0
struct09       580    2764:30:42    538   29   0    7   0   0   0    4    2   0   0   0   0   0   0
lynx08        1104    2647:26:56   1061   20   0   16   0   0   0    2    5   0   0   0   0   0   0
cheetah03      257    2574:23:58    157   51   0   41   0   0   0    2    4   0   0   0   0   0   0
adriatic02      62    2550:30:22     53    2   0    4   0   0   0    2    1   0   0   0   0   0   0
adriatic03      57    2543:36:30     47    2   0    4   0   0   0    2    2   0   0   0   0   0   0
adriatic05      39    2354:07:10     32    3   0    0   0   0   0    2    2   0   0   0   0   0   0
serval07       161    2324:12:58    133    6   0   17   0   0   0    2    3   0   0   0   0   0   0
affogato05     650    2249:22:22    571   39   0   32   0   0   0    4    4   0   0   0   0   0   0
struct04       795    2236:53:30    762   22   0    1   0   0   0    1    9   0   0   0   0   0   0
struct08       770    2227:11:24    716   46   0    0   0   0   0    0    8   0   0   0   0   0   0
affogato04     559    2176:23:44    500   29   0   20   0   0   0    3    7   0   0   0   0   0   0
struct06       495    2137:15:42    459   24   0   10   0   0   0    1    1   0   0   0   0   0   0
cheetah04       34    2117:33:30     23    4   0    6   0   0   0    1    0   0   0   0   0   0   0
struct05       644    2087:18:10    608   23   0    8   0   0   0    1    4   0   0   0   0   0   0
panther01      687    1831:47:18    635   34   0    4   0   0   0    2   12   0   0   0   0   0   0
bigcat02      1757    1716:17:46   1682   23   0   36   0   0   0    6   10   0   0   0   0   0   0
affogato11     210    1631:05:20    110   32   0   63   0   0   0    3    2   0   0   0   0   0   0
jaguar06       363    1594:03:26    270   13   0   69   0   0   0    1   10   0   0   0   0   0   0
struct02       716    1521:56:16    678   23   0    6   0   0   0    1    8   0   0   0   0   0   0
struct07       707    1451:48:34    643   34   0   21   0   0   0    1    8   0   0   0   0   0   0
jaguar05        64    1412:37:26     53    6   0    3   0   0   0    1    1   0   0   0   0   0   0
serval06        95    1353:01:06     68    2   0   21   0   0   0    2    2   0   0   0   0   0   0
adriatic04      51    1279:25:56     42    4   0    1   0   0   0    1    3   0   0   0   0   0   0
nekomata01     205    1272:02:50    184    7   0   10   0   0   0    1    3   0   0   0   0   0   0
affogato14      49    1232:28:10     38   10   0    0   0   0   0    1    0   0   0   0   0   0   0
affogato13      49    1224:58:26     32   12   0    4   0   0   0    1    0   0   0   0   0   0   0
affogato15      50    1223:51:14     44    5   0    0   0   0   0    1    0   0   0   0   0   0   0
ai04            59    1156:33:06     29    2   0   27   0   0   0    1    0   0   0   0   0   0   0
affogato03     574    1098:09:10    537   16   0   12   0   0   0    1    8   0   0   0   0   0   0
ai03            61    1050:28:32     31    1   0   28   0   0   0    0    1   0   0   0   0   0   0
ai01            29     977:41:00     20    5   0    4   0   0   0    0    0   0   0   0   0   0   0
ai02            59     901:19:40     29    1   0   29   0   0   0    0    0   0   0   0   0   0   0
ai06           184     889:04:50    144   10   0   28   0   0   0    0    2   0   0   0   0   0   0
lynx10         103     851:30:46     93    7   0    3   0   0   0    0    0   0   0   0   0   0   0
affogato09     520     635:04:58    476   24   0   12   0   0   0    2    6   0   0   0   0   0   0
affogato10     545     610:47:56    496   31   0   14   0   0   0    1    3   0   0   0   0   0   0
affogato08     598     602:03:20    545   26   0   15   0   0   0    1   11   0   0   0   0   0   0
affogato07     502     588:44:42    454   18   0   22   0   0   0    1    7   0   0   0   0   0   0
affogato06     577     587:31:22    530   20   0   18   0   0   0    1    8   0   0   0   0   0   0
lynx01          32     276:59:32     25    2   0    5   0   0   0    0    0   0   0   0   0   0   0
lynx02          32     267:32:44     26    0   0    6   0   0   0    0    0   0   0   0   0   0   0
lynx04          32      48:55:56     30    0   0    2   0   0   0    0    0   0   0   0   0   0   0
lynx03          28      34:46:46     24    0   0    3   0   0   0    0    1   0   0   0   0   0   0
lynx05          20      31:46:32     19    0   0    1   0   0   0    0    0   0   0   0   0   0   0
lynx06          18      28:20:00     17    0   0    0   0   0   0    0    1   0   0   0   0   0   0
lynx07          18      17:21:02     18    0   0    0   0   0   0    0    0   0   0   0   0   0   0
jaguar03        37      16:12:32     36    0   0    1   0   0   0    0    0   0   0   0   0   0   0
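The cputime column uses an hours:minutes:seconds format in which the hours field grows without bound (the NodeName table is sorted on it, descending), so comparing or sorting rows numerically requires parsing it. A small helper, assuming the `H...H:MM:SS` shape used throughout this report (no days field):

```python
def cputime_to_seconds(hms: str) -> int:
    # "14719:54:34" -> hours, minutes, seconds; hours may exceed 24.
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s

# The NodeName table is sorted by this value, descending:
assert cputime_to_seconds("14719:54:34") > cputime_to_seconds("13458:08:42")
print(cputime_to_seconds("1:30:00"))  # 5400
```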

slurm_report_one_week.txt · Last modified: 2026/04/05 17:00 (external edit)