Need cron job to delete files in folder that are older than X days
I need to know how to create a cron job script to delete files within a specific folder that are older than, say, seven days. I can figure out how to schedule the cron job on my server, but I need the script itself. I'm assuming the script would probably need to be written in PHP. My web server is Apache running on FreeBSD (UNIX).
What about the unlink() (http://uk3.php.net/unlink) function?
unlink() would be the core of the script, but it requires the file name, so I'm assuming I need some sort of "for" loop that cycles through the files by date.
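That loop-by-age idea can also be sketched directly in shell, which avoids PHP entirely. The sketch below assumes a function `prune_old_files` (a name invented here for illustration) that compares each file's modification time against a cutoff; the directory and day count are example parameters. Note that GNU `stat` and BSD/FreeBSD `stat` take different flags, so both forms are tried:

```shell
#!/bin/sh
# prune_old_files DIR DAYS
# Deletes regular files in DIR (non-recursive) whose modification
# time is more than DAYS days in the past.
prune_old_files() {
    dir=$1
    days=$2
    now=$(date +%s)
    cutoff=$((now - days * 86400))
    for f in "$dir"/*; do
        [ -f "$f" ] || continue
        # GNU stat uses -c %Y; BSD (FreeBSD) stat uses -f %m
        mtime=$(stat -c %Y "$f" 2>/dev/null || stat -f %m "$f")
        if [ "$mtime" -lt "$cutoff" ]; then
            rm -f -- "$f"
        fi
    done
}
```

This is only a sketch of the approach, not a drop-in answer; the `find`-based one-liners later in this thread do the same job more concisely.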
Chris
08-19-2007, 06:04 AM
Is this for perhaps log rotating or backups ken?
Here is the sh script I use for backups and rotation. It creates SQL dumps of the databases listed in a file (db-list.txt) and then rotates them. It puts all the files in a daily backup directory, where they're stored for 4 days, and also copies the most recent one to an uploads folder for transferring to a backup server via FTP.
#!/bin/bash
# Set a value that we can use for a datestamp
DATE=`date +%Y-%m-%d`
# Our Base backup directory
BASEBACKUP="/home/backups/daily"
for DATABASE in `cat /home/backups/db-list.txt`
do
# This is where we throw our backups.
FILEDIR="$BASEBACKUP/$DATABASE"
# Test to see if our backup directory exists.
# If not, create it.
if [ ! -d "$FILEDIR" ]
then
mkdir -p "$FILEDIR"
fi
echo -n "Exporting database: $DATABASE"
mysqldump --user=root --opt "$DATABASE" | gzip -c -9 > "$FILEDIR/$DATABASE-$DATE.sql.gz"
echo " ......[ Done exporting to local backup, now exporting for remote backup ]"
cp "$FILEDIR/$DATABASE-$DATE.sql.gz" "/home/backups/uploads/$DATABASE.sql.gz"
echo " .......[Done]"
done
# AutoPrune our backups. This will find all files
# that are "MaxFileAge" days old and delete them.
MaxFileAge=4
find "$BASEBACKUP" -name '*.gz' -type f -mtime +$MaxFileAge -exec rm -f {} \;
It is for the OSS RSS parser I use called Magpie (http://magpierss.sourceforge.net) to pull my career listings. It has a nasty habit of leaving old files behind, which I didn't discover until this past week, when I started having problems with my listing pages not working correctly and found over 4,000 cached files in my Magpie cache. The file names are randomly generated 32-character GUIDs, so there is no systematic way to delete the files by name; I need to look at their age instead. I simply want to delete any file in the Magpie cache that is over seven days old, once a day.
Chris
08-19-2007, 06:25 AM
Well, pick apart the last four lines of my script; the code is all there.
rpanella
08-19-2007, 03:00 PM
find /path/to/cache/files -maxdepth 1 -type f -mtime +7 -exec rm -f {} \;
Placing this line in your cron should work, but make sure to set the right path so you don't delete any files you didn't mean to. This line also only deletes files from the cache directory itself, not any subdirectories (because of -maxdepth 1).
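For scheduling, a crontab entry along these lines would run the cleanup once a day. The path and the run time (3:00 AM here) are placeholders to adjust for your own setup:

```shell
# min hour day-of-month month day-of-week  command
# Run every day at 3:00 AM; /path/to/cache/files is a placeholder.
0 3 * * * find /path/to/cache/files -maxdepth 1 -type f -mtime +7 -exec rm -f {} \;
```

Add it with `crontab -e` under the user that owns the cache files.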