


CS SLURM Cluster Report - 1 week

Report generated for jobs run on the CS SLURM cluster from 2025-12-28 through 2026-01-03.

Job total during this query range: 25,408

Job total since August 1, 2024: 5,274,369

This page is updated every Sunday at 5:00pm EST.


SLURM Scheduler System Output

--------------------------------------------------------------------------------
Cluster Utilization 2025-12-28T00:00:00 - 2026-01-03T23:59:59
Usage reported in TRES Hours/Percentage of Total
--------------------------------------------------------------------------------
  Cluster      TRES Name              Allocated                  Down         PLND Down                    Idle            Planned                Reported 
--------- -------------- ---------------------- --------------------- ----------------- ----------------------- ------------------ ----------------------- 
       cs            cpu           48123(6.91%)           6399(0.92%)          0(0.00%)          527188(75.72%)     114482(16.44%)         696192(100.00%) 
       cs            mem       448784032(6.08%)       29058276(0.39%)          0(0.00%)      6909117693(93.53%)           0(0.00%)     7386960000(100.00%) 
       cs       gres/gpu           3460(10.90%)            815(2.57%)          0(0.00%)           27477(86.54%)           0(0.00%)          31752(100.00%) 
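The five usage columns above are additive: for each TRES, Allocated + Down + PLND Down + Idle + Planned equals Reported, and each percentage is that column divided by Reported. A minimal sketch checking the cpu and gres/gpu rows (the mem row sums only to within 1 TRES-hour of its Reported value because of rounding):

```python
# Reconcile the sreport utilization rows: the usage columns should sum
# to the Reported column, and each percentage is column / Reported * 100.
rows = {
    # tres: (allocated, down, plnd_down, idle, planned, reported)
    "cpu":      (48123, 6399, 0, 527188, 114482, 696192),
    "gres/gpu": (3460,  815,  0, 27477,  0,      31752),
}

for tres, (alloc, down, plnd, idle, planned, reported) in rows.items():
    # Columns account for every reported TRES-hour.
    assert alloc + down + plnd + idle + planned == reported
    pct_alloc = round(alloc / reported * 100, 2)
    print(f"{tres}: allocated {pct_alloc}% of {reported} TRES-hours")
```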

* Total Cluster Resources Available by Partition
 (Note: TRES is short for Trackable RESources.)
PartitionName=cpu
   TRES=cpu=1596,mem=17562000M,node=40
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gpu
   TRES=cpu=2066,mem=22254000M,node=42,gres/gpu=162
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
PartitionName=nolim
   TRES=cpu=226,mem=2528000M,node=7
   TRESBillingWeights=CPU=2.0,Mem=0.15
PartitionName=gnolim
   TRES=cpu=256,mem=1626000M,node=10,gres/gpu=27
   TRESBillingWeights=CPU=1.0,Mem=0.15,GRES/gpu=2.0
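TRESBillingWeights determines how each job's billable usage is computed: by default Slurm sums weight times allocated amount over the listed TRES (with PriorityFlags=MAX_TRES it instead bills only the largest weighted component). How the memory weight scales depends on configuration; this sketch, with made-up job sizes, treats Mem=0.15 as applying per MB to match the mem=...M units in the partition definitions above.

```python
# Sketch of job billing under the gpu partition's TRESBillingWeights
# (CPU=1.0, Mem=0.15, GRES/gpu=2.0). Assumptions: the default summing
# policy (not PriorityFlags=MAX_TRES), and a per-MB memory weight to
# match the partition's mem=...M units.
GPU_PARTITION_WEIGHTS = {"cpu": 1.0, "mem_mb": 0.15, "gpu": 2.0}

def billing_tres(cpus: int, mem_mb: float, gpus: int,
                 weights: dict = GPU_PARTITION_WEIGHTS) -> float:
    """Weighted sum of a job's allocated TRES."""
    return (weights["cpu"] * cpus
            + weights["mem_mb"] * mem_mb
            + weights["gpu"] * gpus)

# Hypothetical job: 8 CPUs, 64000 MB of memory, 1 GPU.
print(round(billing_tres(8, 64000, 1), 2))  # 1.0*8 + 0.15*64000 + 2.0*1 = 9610.0
```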

SLURM Usage by Partition

PartitionName | total_jobs | cputime(HH:MM:SS) | completed | cancelled | failed | timeout
--------------+------------+-------------------+-----------+-----------+--------+--------
gpu           |        954 |       22843:26:04 |       743 |       109 |     80 |      22
cpu           |      18326 |       16427:58:38 |     18054 |       204 |     64 |       4
nolim         |       6094 |        3303:44:06 |      3583 |       268 |   2232 |      11
gnolim        |         34 |         331:47:12 |        14 |         1 |     19 |       0

(The remaining job-state columns (running, preempted, requeued, pending, out_of_memory, suspended, boot_fail, deadline, node_fail, resizing, revoked) were 0 for every partition and are omitted.)
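The per-partition figures reconcile with the report header: a quick sketch that checks the partition job counts sum to the week's 25,408 jobs, and parses the sacct-style cputime strings, whose hours field can exceed two digits.

```python
# Cross-check the partition table against the report header:
# per-partition job totals should sum to the week's 25,408 jobs.
def cputime_to_seconds(hms: str) -> int:
    """Convert an elapsed-style 'H...H:MM:SS' string to total seconds."""
    hours, minutes, seconds = (int(part) for part in hms.split(":"))
    return hours * 3600 + minutes * 60 + seconds

partition_jobs = {"gpu": 954, "cpu": 18326, "nolim": 6094, "gnolim": 34}
assert sum(partition_jobs.values()) == 25408  # matches the query-range total

print(cputime_to_seconds("331:47:12"))  # gnolim cputime -> 1194432 seconds
```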

SLURM Usage by Advisor Group

  • slurm-cs-undefined: users that have CS accounts but are not CS students
  • slurm-cs-unassigned: users that are CS students but do not have a listed CS advisor
GroupName                | total_jobs | cputime(HH:MM:SS) |   cpu | gpu | nolim | gnolim | completed | cancelled | failed | timeout
-------------------------+------------+-------------------+-------+-----+-------+--------+-----------+-----------+--------+--------
slurm-cs-yenling-kuo     |        485 |       10389:06:20 |   485 |   0 |     0 |      0 |       460 |        25 |      0 |       0
slurm-cs-madhur-behl     |          6 |        7754:35:12 |     0 |   6 |     0 |      0 |         0 |         3 |      0 |       3
slurm-cs-lu-feng         |      17664 |        5356:30:28 | 17579 |  85 |     0 |      0 |     17575 |        32 |     51 |       6
slurm-cs-tianhao-wang    |         30 |        4819:31:44 |     0 |  30 |     0 |      0 |         7 |        14 |      8 |       1
slurm-cs-henry-kautz     |       6321 |        3773:53:54 |   227 |   0 |  6094 |      0 |      3619 |       446 |   2241 |      15
slurm-cs-yangfeng-ji     |         35 |        3522:28:48 |     0 |  35 |     0 |      0 |        14 |         4 |     16 |       1
slurm-cs-zezhou-cheng    |        183 |        2881:10:08 |     0 | 183 |     0 |      0 |       116 |        30 |     29 |       8
slurm-cs-shangtong-zhang |         84 |        1497:44:40 |    25 |  34 |     0 |     25 |        55 |         4 |     25 |       0
slurm-cs-unassigned      |        494 |        1303:51:08 |     0 | 494 |     0 |      0 |       489 |         2 |      3 |       0
slurm-cs-undefined       |         56 |         804:40:08 |     1 |  55 |     0 |      0 |        43 |         5 |      8 |       0
slurm-cs-kevin-skadron   |         19 |         662:42:02 |     9 |   1 |     0 |      9 |         5 |         2 |     12 |       0
slurm-cs-charles-reiss   |         31 |         140:41:28 |     0 |  31 |     0 |      0 |        11 |        15 |      2 |       3

(The cpu, gpu, nolim, and gnolim columns count each group's jobs per partition. The remaining job-state columns (running, preempted, requeued, pending, out_of_memory, suspended, boot_fail, deadline, node_fail, resizing, revoked) were 0 for every group and are omitted.)

SLURM Usage by NodeName

Nodename   | total_jobs | cputime(HH:MM:SS) | completed | cancelled | failed | timeout
-----------+------------+-------------------+-----------+-----------+--------+--------
serval09   |        111 |        5492:12:54 |        91 |         7 |     11 |       2
serval06   |         86 |        3937:23:14 |        75 |         6 |      4 |       1
serval08   |          9 |        3507:29:36 |         7 |         1 |      0 |       1
cheetah01  |         18 |        2569:30:04 |        11 |         1 |      1 |       5
hydro      |       1502 |        2210:44:46 |      1485 |         8 |      9 |       0
bigcat01   |       1831 |        2000:10:28 |      1758 |        60 |     13 |       0
puma01     |       1988 |        1757:02:26 |      1977 |        11 |      0 |       0
bigcat02   |        106 |        1251:34:38 |        90 |        16 |      0 |       0
jaguar01   |        216 |        1126:09:02 |       208 |         7 |      1 |       0
heartpiece |       1401 |        1086:53:44 |       903 |        71 |    420 |       7
serval07   |        138 |        1071:37:48 |       131 |         2 |      5 |       0
jaguar06   |         60 |        1044:21:16 |        56 |         4 |      0 |       0
cheetah04  |          1 |         892:58:40 |         0 |         1 |      0 |       0
slurm5     |       1480 |         806:43:00 |       928 |        57 |    495 |       0
slurm1     |       2294 |         761:29:08 |      1259 |        94 |    940 |       1
cheetah08  |         37 |         623:26:42 |        25 |         7 |      2 |       3
struct04   |        699 |         614:45:48 |       691 |         5 |      2 |       1
struct08   |        686 |         613:09:22 |       677 |         6 |      3 |       0
lotus      |          5 |         609:39:50 |         1 |         4 |      0 |       0
struct07   |        677 |         605:23:14 |       669 |         5 |      3 |       0
affogato04 |        546 |         585:56:52 |       543 |         3 |      0 |       0
cheetah09  |         30 |         581:52:32 |        21 |         4 |      4 |       1
affogato05 |        548 |         577:16:40 |       545 |         3 |      0 |       0
struct02   |        678 |         572:34:12 |       669 |         5 |      4 |       0
struct05   |        695 |         568:00:56 |       686 |         5 |      4 |       0
struct06   |        679 |         567:01:32 |       672 |         5 |      2 |       0
struct03   |        697 |         559:55:00 |       690 |         5 |      2 |       0
struct09   |        684 |         558:21:06 |       671 |         5 |      8 |       0
struct01   |        688 |         556:58:42 |       682 |         5 |      1 |       0
affogato01 |        792 |         556:18:12 |       788 |         3 |      1 |       0
affogato02 |        514 |         553:04:30 |       511 |         3 |      0 |       0
jaguar02   |         46 |         391:55:40 |        37 |         7 |      2 |       0
slurm2     |        386 |         385:02:28 |       118 |        27 |    241 |       0
lynx08     |         15 |         303:43:00 |        12 |         0 |      3 |       0
ai06       |         46 |         275:50:52 |        14 |        24 |      4 |       4
epona      |        533 |         263:35:46 |       375 |        19 |    136 |       3
serval03   |          8 |         262:25:48 |         4 |         0 |      4 |       0
lynx09     |         11 |         209:31:24 |         9 |         0 |      2 |       0
panther01  |        485 |         207:32:16 |       450 |        31 |      1 |       3
cheetah02  |         21 |         160:29:32 |        17 |         4 |      0 |       0
jaguar03   |         26 |         138:10:44 |        24 |         2 |      0 |       0
affogato07 |        442 |         136:26:28 |       439 |         3 |      0 |       0
affogato10 |        431 |         136:24:06 |       427 |         3 |      1 |       0
affogato09 |        454 |         136:09:52 |       451 |         3 |      0 |       0
affogato08 |        455 |         136:04:36 |       451 |         3 |      1 |       0
affogato06 |        442 |         135:18:22 |       439 |         3 |      0 |       0
jinx01     |         10 |         124:43:12 |         3 |         0 |      7 |       0
ai02       |          8 |         119:05:58 |         4 |         1 |      3 |       0
jinx02     |          7 |         116:43:28 |         3 |         1 |      3 |       0
affogato03 |        333 |         104:34:22 |       328 |         5 |      0 |       0
ai05       |         10 |          90:20:32 |         8 |         0 |      2 |       0
bigcat04   |        308 |          53:34:28 |       308 |         0 |      0 |       0
bigcat03   |        318 |          53:33:56 |       318 |         0 |      0 |       0
bigcat05   |        312 |          53:31:08 |       312 |         0 |      0 |       0
bigcat06   |        306 |          53:16:16 |       306 |         0 |      0 |       0
nekomata01 |         53 |          31:37:44 |        27 |        16 |      5 |       5
affogato15 |          3 |          01:10:24 |         0 |         0 |      3 |       0
affogato11 |          3 |          01:10:00 |         0 |         0 |      3 |       0
affogato13 |          3 |          01:10:00 |         0 |         0 |      3 |       0
affogato14 |          3 |          01:10:00 |         0 |         0 |      3 |       0
cheetah03  |          3 |          00:50:32 |         0 |         1 |      2 |       0
ai04       |          2 |          00:27:28 |         0 |         0 |      2 |       0
ai03       |          1 |          00:23:20 |         0 |         0 |      1 |       0
lynx10     |          1 |          00:23:12 |         0 |         0 |      1 |       0
lynx11     |          1 |          00:23:12 |         0 |         0 |      1 |       0

(The remaining job-state columns (running, preempted, requeued, pending, out_of_memory, suspended, boot_fail, deadline, node_fail, resizing, revoked) were 0 for every node and are omitted.)

The following nodes ran no jobs during this period (0 jobs, 00:00:00 cputime): adriatic01, adriatic02, adriatic03, adriatic04, adriatic05, adriatic06, ai01, ai07, ai08, ai09, ai10, cortado01, cortado02, cortado03, cortado04, cortado05, cortado06, cortado07, cortado08, cortado09, cortado10, jaguar05, lynx01, lynx02, lynx03, lynx04, lynx05, lynx06, lynx07, slurm3, slurm4, titanx02, titanx03, titanx05.

slurm_report_one_week.txt · Last modified: 2026/01/11 17:00 by 127.0.0.1