
CS SLURM Cluster Report - 1 week

Report generated for jobs run on the CS SLURM cluster from 2026-01-18 through 2026-01-24.

Job total during this query range: 11,750

Job total since August 1, 2024: 5,325,252

This page is updated every Sunday at 5:00pm EST.


SLURM Scheduler System Output

--------------------------------------------------------------------------------
Cluster Utilization 2026-01-18T00:00:00 - 2026-01-24T23:59:59
Usage reported in TRES Hours/Percentage of Total
--------------------------------------------------------------------------------
  Cluster      TRES Name              Allocated                 Down         PLND Down                    Idle            Planned                Reported 
--------- -------------- ---------------------- -------------------- ----------------- ----------------------- ------------------ ----------------------- 
       cs            cpu          72808(10.46%)           516(0.07%)          0(0.00%)          456126(65.52%)     166743(23.95%)         696192(100.00%) 
       cs            mem       697328376(9.44%)       5498854(0.07%)          0(0.00%)      6684132770(90.49%)           0(0.00%)     7386960000(100.00%) 
       cs       gres/gpu            3058(9.63%)             0(0.00%)          0(0.00%)           28694(90.37%)           0(0.00%)          31752(100.00%) 
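
The block above is standard sreport output. The exact invocation behind this page is not recorded here, but a command along these lines (flags are an assumption) produces the same style of cluster utilization report for the same window:

  sreport -t hourper --tres=cpu,mem,gres/gpu \
          cluster utilization start=2026-01-18T00:00:00 end=2026-01-25T00:00:00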

* Total Cluster Resources Available by Partition
 (Note: TRES is short for Trackable RESources)
PartitionName=cpu
   TRES=cpu=1596,mem=17562000M,node=40
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gpu
   TRES=cpu=2066,mem=22254000M,node=42,gres/gpu=162
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
PartitionName=nolim
   TRES=cpu=226,mem=2528000M,node=7
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gnolim
   TRES=cpu=256,mem=1626000M,node=10,gres/gpu=27
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
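
Partition definitions in this form can be read back from the scheduler at any time, for example (partition name chosen only as an illustration):

  scontrol show partition gpu

Under SLURM's default billing behavior (a weighted sum, i.e. no MAX_TRES priority flag), the TRESBillingWeights above mean a job is billed for everything it allocates. As a hypothetical example on the gpu partition, a job allocating 8 CPUs and 2 GPUs contributes 8 × 1.0 + 2 × 2.0 = 12.0 billing units, plus 0.15 per unit of allocated memory (whether that unit is MB or GB depends on the suffix the memory weight was configured with, which is not shown here).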

SLURM Usage by Partition

PartitionName | total_jobs | cputime(HH:MM:SS) | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
cpu | 5196 | 26483:41:14 | 3103 | 567 | 0 | 1525 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
gpu | 4859 | 18114:57:34 | 4705 | 74 | 0 | 70 | 0 | 0 | 0 | 5 | 5 | 0 | 0 | 0 | 0 | 0 | 0
nolim | 1675 | 6429:38:55 | 505 | 54 | 0 | 1116 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
gnolim | 20 | 1693:35:04 | 1 | 0 | 0 | 19 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
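
The report does not say which commands were used to build the job-count tables; as a sketch, a sacct pipeline along these lines would tally jobs by partition and state for the same window (states such as "CANCELLED by <uid>" are folded into "CANCELLED" before counting):

  sacct -a -X -n -P -S 2026-01-18T00:00:00 -E 2026-01-24T23:59:59 -o Partition,State |
    awk -F'|' '{ split($2, s, " "); print $1, s[1] }' | sort | uniq -c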

SLURM Usage by Advisor Group

  • slurm-cs-undefined: users who have CS accounts but are not CS students
  • slurm-cs-unassigned: users who are CS students but do not have a listed CS advisor
GroupName | total_jobs | cputime(HH:MM:SS) | cpu | gpu | nolim | gnolim | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
slurm-cs-henry-kautz | 5922 | 27379:13:15 | 4247 | 0 | 1675 | 0 | 2909 | 393 | 0 | 2620 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-shangtong-zhang | 68 | 7919:26:48 | 20 | 28 | 0 | 20 | 7 | 3 | 0 | 58 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-tianhao-wang | 230 | 7747:16:30 | 154 | 76 | 0 | 0 | 196 | 12 | 0 | 22 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-yenling-kuo | 14 | 3459:44:50 | 0 | 14 | 0 | 0 | 0 | 5 | 0 | 4 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-madhur-behl | 4 | 1698:50:40 | 4 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-unassigned | 4584 | 1651:28:22 | 3 | 4581 | 0 | 0 | 4575 | 3 | 0 | 5 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-undefined | 147 | 1604:52:16 | 4 | 143 | 0 | 0 | 78 | 54 | 0 | 13 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-ashish-venkat | 759 | 817:53:46 | 759 | 0 | 0 | 0 | 536 | 223 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-kevin-skadron | 19 | 441:23:16 | 5 | 14 | 0 | 0 | 9 | 2 | 0 | 5 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-haiying-shen | 1 | 01:21:04 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm-cs-yu-meng | 2 | 00:22:00 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
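
The advisor group names above look like SLURM accounts. Assuming they are, the same window can be rolled up per account directly with sreport; a sketch, using one account name taken from the table:

  sreport -t hours cluster AccountUtilizationByUser \
          start=2026-01-18 end=2026-01-25 accounts=slurm-cs-henry-kautz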

SLURM Usage by NodeName

Nodename | total_jobs | cputime(HH:MM:SS) | completed | cancelled | running | failed | preempted | requeued | pending | timeout | out_of_memory | suspended | boot_fail | deadline | node_fail | resizing | revoked
bigcat01 | 1510 | 3638:21:24 | 736 | 135 | 0 | 638 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat02 | 1041 | 3442:17:58 | 631 | 82 | 0 | 328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato02 | 44 | 2121:33:50 | 32 | 12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah04 | 2383 | 2112:49:44 | 2382 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm2 | 324 | 1889:58:02 | 140 | 0 | 0 | 184 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval06 | 9 | 1749:55:40 | 4 | 2 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
puma01 | 362 | 1748:12:36 | 149 | 38 | 0 | 175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval07 | 9 | 1591:09:16 | 6 | 2 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
heartpiece | 217 | 1582:04:16 | 56 | 24 | 0 | 137 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm1 | 747 | 1559:03:05 | 218 | 6 | 0 | 523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar01 | 21 | 1529:42:24 | 13 | 1 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
hydro | 188 | 1519:17:40 | 138 | 18 | 0 | 32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar06 | 1249 | 1406:38:20 | 1195 | 46 | 0 | 7 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm5 | 364 | 1397:32:26 | 88 | 23 | 0 | 253 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval09 | 210 | 1372:55:50 | 200 | 3 | 0 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat03 | 274 | 1283:08:14 | 179 | 58 | 0 | 37 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato04 | 47 | 1092:09:46 | 38 | 7 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx08 | 53 | 1084:10:12 | 12 | 6 | 0 | 35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
serval03 | 607 | 1061:27:30 | 603 | 1 | 0 | 2 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar03 | 2 | 1058:37:52 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah08 | 15 | 1019:09:40 | 9 | 1 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah09 | 4 | 1008:45:04 | 1 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat04 | 145 | 972:57:36 | 123 | 4 | 0 | 18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato01 | 72 | 930:28:00 | 53 | 16 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lotus | 10 | 915:48:20 | 4 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0
serval08 | 20 | 899:07:58 | 10 | 4 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
panther01 | 234 | 825:52:30 | 104 | 37 | 0 | 93 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct04 | 52 | 706:40:14 | 47 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct06 | 51 | 692:45:48 | 45 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx09 | 61 | 660:48:26 | 23 | 5 | 0 | 33 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah02 | 23 | 620:12:24 | 11 | 2 | 0 | 9 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato05 | 68 | 575:10:30 | 61 | 5 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar02 | 22 | 551:20:52 | 15 | 4 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct03 | 43 | 528:15:26 | 38 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx10 | 1 | 523:14:08 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai05 | 4 | 509:58:16 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jinx01 | 3 | 462:42:40 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct02 | 43 | 442:04:06 | 38 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct07 | 43 | 439:27:24 | 38 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct08 | 48 | 432:22:18 | 44 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct01 | 39 | 431:58:24 | 34 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jinx02 | 3 | 406:31:04 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato06 | 50 | 363:49:44 | 41 | 7 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato10 | 65 | 351:18:02 | 58 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai06 | 5 | 341:06:40 | 1 | 3 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato08 | 59 | 340:24:54 | 50 | 7 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato09 | 54 | 327:43:52 | 47 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato07 | 66 | 322:51:46 | 59 | 7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah01 | 256 | 307:07:04 | 250 | 3 | 0 | 2 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat05 | 136 | 301:43:08 | 96 | 40 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai07 | 2 | 289:58:00 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct05 | 119 | 279:44:34 | 46 | 13 | 0 | 60 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
struct09 | 43 | 257:12:04 | 39 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato03 | 105 | 252:00:58 | 34 | 8 | 0 | 63 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
bigcat06 | 79 | 118:49:50 | 70 | 8 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
nekomata01 | 3 | 28:47:12 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx11 | 2 | 16:16:16 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
titanx05 | 3 | 08:08:32 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
titanx02 | 3 | 08:08:24 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
titanx03 | 2 | 08:08:08 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato11 | 1 | 02:06:24 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
epona | 23 | 01:01:06 | 3 | 1 | 0 | 19 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic01 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic02 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic03 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic04 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic05 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
adriatic06 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato13 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato14 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
affogato15 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai01 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai02 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai03 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai04 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai08 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai09 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
ai10 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cheetah03 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado01 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado02 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado03 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado04 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado05 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado06 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado07 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado08 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado09 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
cortado10 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
jaguar05 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx01 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx02 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx03 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx04 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx05 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx06 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
lynx07 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm3 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
slurm4 | 0 | 00:00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
