Delete files older than X days on remote server with SCP/SFTP

Solution 1:

Sure, I could write a script in Perl or the like, but that would be overkill.

You don't need a script to achieve the intended effect - a one-liner will do if you have shell access to send a command:

ssh user@host 'find /path/to/old_backups -type f -mtime +7 -exec rm {} \;'

-mtime +7 matches files whose contents were last modified more than 7 days (more than 7 × 24 hours) ago. Note that it tests modification time, not creation time.
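Before deleting anything, it's worth previewing what the expression matches. A minimal local sketch (temporary directory and file names are made up; `touch -d` assumes GNU coreutils):

```shell
dir=$(mktemp -d)
touch -d '10 days ago' "$dir/old.tar"   # older than 7 days -> matched
touch "$dir/new.tar"                    # fresh -> not matched
find "$dir" -type f -mtime +7 -print    # prints only old.tar
rm -rf "$dir"
```

Once `-print` shows only the files you expect, swap it for `-delete` or `-exec rm {} \;`.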

Solution 2:

This question is very old, but I still want to add my bash-only solution, since I was searching for one when I came here. The grep tar in the listing command is just for my own purpose of listing only tar files; adapt it as needed.

RESULT=$(echo "ls -t path/to/old_backups/" | sftp -i ~/.ssh/your_ssh_key [email protected] | grep tar)

i=0
max=7
while read -r line; do
    (( i++ ))
    if (( i > max )); then
        echo "DELETE $i...$line"
        echo "rm $line" | sftp -i ~/.ssh/your_ssh_key [email protected]
    fi
done <<< "$RESULT"

This deletes all tar files in the given directory except the newest 7. It doesn't consider the date itself, but if you make only one backup per day it is good enough.
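The keep-the-newest-N idea can be tested locally without an sftp server. A sketch under assumed conditions (dummy backup names, GNU `touch -d`; filenames must not contain spaces, since the loop reads one name per line):

```shell
dir=$(mktemp -d); max=7
# Create 10 dummy backups with distinct ages, backup_1 being the newest
for i in $(seq 1 10); do
    touch -d "$i days ago" "$dir/backup_$i.tar"
done
# List newest first, skip the first $max entries, delete the rest
ls -t "$dir" | tail -n +$((max + 1)) | while read -r f; do
    rm "$dir/$f"
done
ls "$dir" | wc -l    # 7 files remain
```

The same `tail -n +$((max + 1))` trick can replace the counter loop in the answer above.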


Solution 3:

If you insist on SCP/SFTP, you can list the files, parse the listing with a simple script, and delete the old backup files.

The batch-mode "-b" switch should help you out: it reads sftp commands from a file. http://linux.die.net/man/1/sftp
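A sketch of that batch-file approach: build a file of rm commands from a listing, then feed it to sftp -b. Here listing.txt stands in for output captured earlier over sftp, and the file names, path, and key are all hypothetical; the actual sftp call is commented out since it needs a real host.

```shell
# Stand-in for a listing captured earlier with "ls -t" over sftp
printf '%s\n' backup_old1.tar backup_old2.tar > listing.txt

# Turn each listed name into an sftp "rm" command
batch=$(mktemp)
while read -r f; do
    echo "rm /path/to/old_backups/$f"
done < listing.txt > "$batch"

cat "$batch"
# rm /path/to/old_backups/backup_old1.tar
# rm /path/to/old_backups/backup_old2.tar

# Then run it in one sftp session (requires a real host and key):
# sftp -b "$batch" -i ~/.ssh/your_ssh_key user@host
```

Running all the deletions through one sftp -b session avoids reconnecting once per file, as the loop in Solution 2 does.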