/files/tmp has thousands of old/unnecessary files -- Can these be deleted?

I've got dedicated servers, so this isn't at the request of a picky host, but I like to keep my servers clean. Just today I migrated an [extremely] small, more or less one-page c5 website.

I noticed that it was using much more space than I'd have expected, and contained over 50,000 files.

Between deleting ~4 old upgrade folders in /updates and deleting /files/tmp, it was down to below 8,000 files.

Is this really necessary? I understand why the updates are saved, and deleting those got the count down to ~30,000, but deleting /files/tmp alone took it from 30k to 8k.

It may not take up a lot of space, but when you account for dozens and dozens of websites backing up every night, it adds up to a -lot- of unnecessary I/O.

Is there any way to streamline this, or automatically get rid of old useless files?
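One approach (not built into concrete5 itself; the site path below is a placeholder you'd adjust for your install) is a small shell script that expires stale files with `find`:

```shell
#!/bin/sh
# Hedged sketch: remove stale files from a concrete5 files/tmp directory.
# cleanup_tmp is a hypothetical helper; the 7-day cutoff and the
# /path/to/site root are assumptions, not anything c5 ships with.
cleanup_tmp() {
    tmp_dir="$1"
    # Bail out quietly if the path is wrong -- avoids deleting from
    # an unexpected location on a typo.
    [ -d "$tmp_dir" ] || return 0
    # Delete files not modified in the last 7 days...
    find "$tmp_dir" -type f -mtime +7 -delete
    # ...then prune any subdirectories left empty (but keep tmp itself).
    find "$tmp_dir" -mindepth 1 -type d -empty -delete
}

# Typical invocation (adjust the site root to your install):
cleanup_tmp "/path/to/site/files/tmp"
```

Run it manually first on one site to confirm nothing the site actively needs is being removed, before pointing it at all your installs.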

SheldonB replied on at Permalink Reply
Yes, you can delete what's in the tmp folder.

https://www.concrete5.org/community/forums/chat/help-millions-of-ses...

This thread might help you in the future.
bw1 replied on at Permalink Reply
Is there any reason to keep the contents of /files/tmp if everything is working properly?

If not, then why isn't there a cleanup job as part of the cron / automated jobs?

Thanks for the link -- my particular situation isn't so much session files, but rather the result of updates and other cURL activity.

Either way, I think the results are the same: there should be something to help keep this clean. No, it isn't an "active" problem, but when thousands (or, in my experience today, tens of thousands) of needless files are sitting around for no reason, I think it would be appropriate to expire them.

Just another example from a site I just worked with -- clearing out /updates and /files/tmp went from 54,000+ files down to 13,400, and also freed up 400MB+ of excess data.
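In the absence of a built-in automated job, a crontab entry can approximate the nightly cleanup described above. This is a sketch: the site path, the 03:30 schedule, and the 7-day cutoff are all assumptions to adjust for your own servers.

```shell
# Hypothetical crontab line: every night at 03:30, delete files in
# files/tmp that haven't been modified in 7 days.
# /path/to/site is a placeholder for your concrete5 install root.
30 3 * * * find /path/to/site/files/tmp -type f -mtime +7 -delete
```

With dozens of sites, one entry per site (or a loop over site roots in a single script) keeps the nightly backups from churning through tens of thousands of dead temp files.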
cvtgorg replied on at Permalink Reply
Agree, there needs to be a way to keep the number of files down. My FTP mirroring program uses "dir -RAL" to get a recursive list of all files on the site, then only uploads the ones that are new or missing. My new Bluehost web server unfortunately truncates the output of the recursive directory listing to 8,000 files. I switched to a program that mirrors using directory-by-directory comparisons, but it runs 20x slower.