
Thread: Need cron job to delete files in folder that are older than X days

  1. #1
    Site Contributor KLB
    Join Date
    Feb 2006
    Location
    Saco Maine
    Posts
    1,181

    Need cron job to delete files in folder that are older than X days

    I need to know how to create a cron job script to delete files within a specific folder that are older than, say, seven days. I can figure out how to schedule the cron job to run on my server, but I need a script. I'm assuming the script would probably need to be written in PHP. My web server is Apache running on FreeBSD (UNIX).
    Ken Barbalace - EnvironmentalChemistry.com (Environmental Careers, Blog)
    InternetSAR.org: Volunteers Assisting Search and Rescue via the Internet
    My Firefox Theme Classic Compact: Based on Firefox's classic theme but uses much less window space

  2. #2
    Registered Mike
    Join Date
    May 2003
    Location
    UK
    Posts
    2,755
    What about the unlink() function?
    Don't you just love free internet games?

  3. #3
    Site Contributor KLB
    Join Date
    Feb 2006
    Location
    Saco Maine
    Posts
    1,181
    unlink() would be the core of the script, but it requires the file name, so I'm assuming I need some sort of "for" loop that cycles through the files by date.
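    For what it's worth, that "loop over the files and check their dates" idea can be sketched in shell as well as PHP. A minimal, self-contained sketch (it assumes GNU coreutils for stat -c, touch -d, and date +%s; the scratch directory and file names are purely illustrative):

    Code:
    ```shell
    #!/bin/sh
    # Sketch of the "loop over files and compare dates" idea.
    # Assumes GNU coreutils (stat -c, touch -d, date +%s).
    DIR=$(mktemp -d)                           # illustrative scratch folder
    touch -d '10 days ago' "$DIR/stale.cache"  # pretend this is an old cache file
    touch "$DIR/fresh.cache"                   # and this one is recent

    # Anything modified before this moment (7 days ago) is "too old".
    cutoff=$(( $(date +%s) - 7 * 86400 ))

    for f in "$DIR"/*; do
        # stat -c %Y prints the file's modification time as a Unix timestamp.
        if [ "$(stat -c %Y "$f")" -lt "$cutoff" ]; then
            rm -f "$f"
        fi
    done

    ls "$DIR"    # only fresh.cache remains
    ```

    The find-based one-liners shown later in the thread do the same job in a single command; this is just the loop spelled out.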
    Ken Barbalace - EnvironmentalChemistry.com (Environmental Careers, Blog)
    InternetSAR.org: Volunteers Assisting Search and Rescue via the Internet
    My Firefox Theme Classic Compact: Based on Firefox's classic theme but uses much less window space

  4. #4
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    Is this perhaps for log rotation or backups, Ken?

    Here is the sh script I use for backups and rotation. It creates SQL dumps of the databases listed in a file (db-list.txt) and then rotates them. It puts all the files in a daily backup directory, where they're stored for four days, and also copies the most recent one to an uploads folder for transferring to a backup server via FTP.

    Code:
    #!/bin/bash
    # Set a value that we can use for a datestamp
    DATE=`date +%Y-%m-%d`
    # Our Base backup directory
    BASEBACKUP="/home/backups/daily"
    
    for DATABASE in `cat /home/backups/db-list.txt`
    do
            # This is where we throw our backups.
            FILEDIR="$BASEBACKUP/$DATABASE"
    
            # Test to see if our backup directory exists.
            # If not, create it.
            if [ ! -d $FILEDIR ]
            then
                    mkdir -p $FILEDIR
            fi
    
            echo -n "Exporting database:  $DATABASE"
            mysqldump --user=root --opt $DATABASE | gzip -c -9 > $FILEDIR/$DATABASE-$DATE.sql.gz
            echo "      ......[ Done Exporting to local backup, now exporting for remote backup] "
            cp $FILEDIR/$DATABASE-$DATE.sql.gz /home/backups/uploads/$DATABASE.sql.gz
            echo "      .......[Done]"
    done
    
    # AutoPrune our backups.  This will find all files
    # that are "MaxFileAge" days old and delete them.
    MaxFileAge=4
    find $BASEBACKUP -name '*.gz' -type f -mtime +$MaxFileAge -exec rm -f {} \;
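    The -mtime +N test is what does the age check in that last line: it matches files whose modification time is more than N whole days in the past. A quick self-contained dry run (using -print instead of -exec rm so nothing is deleted; GNU touch's -d is assumed, and the file names are illustrative) shows which files it selects:

    Code:
    ```shell
    #!/bin/sh
    # Dry run demonstrating find's -mtime +N age test.
    # Assumes GNU touch (-d 'relative date'); names are illustrative.
    DEMO=$(mktemp -d)
    touch "$DEMO/new.gz"                    # modified just now
    touch -d '10 days ago' "$DEMO/old.gz"   # modified 10 days ago

    # -mtime +4 matches files last modified more than 4*24 hours ago,
    # so only old.gz is printed (swap -print for -exec rm -f {} \; to delete).
    find "$DEMO" -name '*.gz' -type f -mtime +4 -print

    rm -rf "$DEMO"
    ```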
    Chris Beasley - My Guide to Building a Successful Website
    Content Sites: ABCDFGHIJKLMNOP|Forums: ABCD EF|Ecommerce: Swords Knives

  5. #5
    Site Contributor KLB
    Join Date
    Feb 2006
    Location
    Saco Maine
    Posts
    1,181
    It is for the open-source RSS parser I use, Magpie (http://magpierss.sourceforge.net), to pull my career listings. It has a nasty habit of leaving old files behind, which I didn't discover until this past week, when my listing pages stopped working correctly and I found over 4,000 cached files in my Magpie cache. The file names are randomly generated 32-character GUIDs, so there is no systematic way to delete the files by file name; I need to look at their age. I simply want to delete, once a day, any file in the Magpie cache that is over seven days old.
    Ken Barbalace - EnvironmentalChemistry.com (Environmental Careers, Blog)
    InternetSAR.org: Volunteers Assisting Search and Rescue via the Internet
    My Firefox Theme Classic Compact: Based on Firefox's classic theme but uses much less window space

  6. #6
    Administrator Chris
    Join Date
    Feb 2003
    Location
    East Lansing, MI USA
    Posts
    7,055
    Well, pick apart the last four lines of my script; the code is all there.
    Chris Beasley - My Guide to Building a Successful Website
    Content Sites: ABCDFGHIJKLMNOP|Forums: ABCD EF|Ecommerce: Swords Knives

  7. #7
    Registered
    Join Date
    Aug 2006
    Location
    Sacramento, CA
    Posts
    208
    Code:
    find /path/to/cache/files -maxdepth 1 -type f -mtime +7 -exec rm -f {} \;
    Placing this line in your cron should work, but make sure to set the right path so you don't delete any files you didn't mean to. Thanks to -maxdepth 1, this line only deletes files from the cache directory itself, not from any subdirectories.
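    If it helps, that line can go straight into the crontab. A sketch assuming a daily 3:00 AM run (the time is arbitrary, and /path/to/cache/files is the same placeholder as above):

    Code:
    ```
    # m h dom mon dow  command
    0 3 * * * find /path/to/cache/files -maxdepth 1 -type f -mtime +7 -exec rm -f {} \;
    ```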
    Last edited by rpanella; 03-17-2011 at 10:59 AM.
