Deadline repository became full. Need to shrink disk usage

The repository is currently taking up most of the disk. It became 100% full and I had to reboot the machine to regain the database connection.

/opt/Thinkbox/DeadlineRepository10/jobs 21G
/opt/Thinkbox/DeadlineRepository10/jobsArchived 5.1G

What's the best way to clean this up? Is it possible to do it from the command line?

Delete the jobs with the largest folders from the Monitor, and the house cleaning operation will clean them up for you after about two hours. The job IDs are used as the folder names, so just run this guy to find the guilty ones:

du -s /opt/Thinkbox/DeadlineRepository10/jobs/* | sort -n
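
If you'd rather do the deletion from the command line too, you can feed those IDs to deadlinecommand instead of going through the Monitor, which keeps the database in sync rather than just rm-ing folders. A rough sketch, assuming deadlinecommand is on your PATH and I'm remembering the DeleteJob syntax right:

# Delete a single job by ID (the ID is the folder name from the du listing)
deadlinecommand DeleteJob <jobID>

# Or sweep every job folder over roughly 1 GB (du -s reports 1K blocks)
du -s /opt/Thinkbox/DeadlineRepository10/jobs/* | awk '$1 > 1000000 {print $2}' | while read -r dir; do
    deadlinecommand DeleteJob "$(basename "$dir")"
done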

If you want to offload the jobs folder to a different server, the alternate auxiliary file location setting is likely a good plan. We don’t have a similar option for archived jobs.


du -s /opt/Thinkbox/DeadlineRepository10/jobs/* | sort -n

Lots of empty ones and some around 240 MB, nothing higher, e.g.:

0 /opt/Thinkbox/DeadlineRepository10/jobs/5a61ffc7d1094d50d4c64253
0 /opt/Thinkbox/DeadlineRepository10/jobs/5a62053e000d122c885e9292
.
.
244MB /opt/Thinkbox/DeadlineRepository10/jobs/5c49b3ee05bb54155b2340c4
244MB /opt/Thinkbox/DeadlineRepository10/jobs/5c49b67005bb5416edafa396

Thinking it’s OK to delete the contents of /opt/Thinkbox/DeadlineRepository10/jobsArchived?
4.5G 2018-12
8.0K 2018-03
10M 2018-06
41M 2018-07
68M 2018-09
79M 2018-08
80K 2018-02
134M 2018-10
278M 2018-11
424K 2018-01

It’s just historical jobs. If you don’t think you’ll ever need to bring back an old job, it’s perfectly fine to delete straight out of that folder.
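
If you want to script that clean-up, here's a minimal sketch, assuming GNU coreutils and that you only want to keep the newest few month folders; double-check the listing before letting rm loose:

ARCHIVE=/opt/Thinkbox/DeadlineRepository10/jobsArchived
KEEP=3   # how many of the most recent month folders to keep

# Month folders sort chronologically by name (2018-01, 2018-02, ...),
# so list them oldest first and remove everything except the last $KEEP
ls -d "$ARCHIVE"/20* | sort | head -n -"$KEEP" | while read -r dir; do
    echo "Removing $dir"
    rm -rf "$dir"
done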