Remote backup via SSH

Here are two examples of what I do: one for files and one for MySQL. Both are pull solutions: your local machine logs into the remote machine and retrieves the data, but it is the local machine that tells the remote machine to prepare the archive.

Setup & Background

I use crontab and passwordless authentication with ssh to archive and gzip on a remote machine and then direct the output of gzip to the local machine over ssh. Make sure crontab and passwordless authentication are set up. I also have some cleanup one-liners.
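If you haven't done the passwordless-authentication part yet, the one-time setup looks roughly like this (a sketch; login@host is a placeholder, and the key path is just for illustration):

```shell
# Generate a key pair with no passphrase, so cron can use it unattended.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/backup_key

# Install the public key on the remote machine's authorized_keys.
ssh-copy-id -i ~/.ssh/backup_key.pub login@host

# This should now log in and exit without prompting for a password.
ssh -i ~/.ssh/backup_key login@host true
```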

The benefit of this approach is that it uses bandwidth efficiently: the data is compressed before it ever crosses the network. The drawback is that it's more CPU-intensive on the remote machine, though on modern hardware I doubt this matters unless you're dealing with absurdly large files.

Backing up files and directories

This is probably the part you care about. Tell ssh to execute a tar piped to gzip on the remote machine. Have gzip write the compressed file to standard output (the -c flag) and direct the output to a file on your local machine.

00 00 * * * /usr/bin/ssh login@host "sudo tar -cf - -C /path/to/directory/to/backup/ file_to_back_up | gzip -9c" > /file/on/local/machine/BackUp_$(date +\%Y-\%m-\%d-\%Hh\%Mm\%Ss_\%A).tar.gz

Important: file_to_back_up is the file you are actually backing up; it can be a file or a directory. It can also be a space-separated list of files: file1.txt file2.php, etc.

The -9 flag tells gzip to use maximum compression.

The -cf - flags and parameter create a new archive and write the data to standard output. The -C flag tells tar to change to a different directory than the current one before archiving, which keeps a lot of extraneous relative paths out of the archive. If you want to back up something in your home directory, I guess you can omit -C /path/to/directory/to/backup/ because ssh will by default log you into your home directory.
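To see what -C does to the stored paths, compare the two forms below (a sketch; /var/www/mysite/index.php is a hypothetical file):

```shell
# Without -C, the relative prefix ends up inside the archive:
cd / && tar -cf - var/www/mysite/index.php | tar -tf -
# lists: var/www/mysite/index.php

# With -C, tar changes directory first, so only the leaf name is stored:
tar -cf - -C /var/www/mysite index.php | tar -tf -
# lists: index.php
```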

date +\%Y-\%m-\%d-\%Hh\%Mm\%Ss_\%A will generate a timestamp like 2015-03-19-08h58m09s_Thursday. The percent signs are escaped with backslashes because % has a special meaning in a crontab line (an unescaped % starts a new line).

The crontab columns 00 00 * * * mean the job runs at midnight every day.
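For reference, the five crontab columns are minute, hour, day of month, month, and day of week:

```shell
# minute  hour   day-of-month  month  day-of-week
#   00     00         *          *         *       -> 00:00 (midnight) every day
#   30    0,13        *          *         *       -> 00:30 and 13:30 every day (used below for MySQL)
```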

Backing up a MySQL database

Similar to the above. The caveat is that you need to make sure your MySQL access can be passwordless too; the safe way to do that is with .cnf files. Skip this section if you don't use MySQL, but the concept could carry over to other tools.

30 0,13 * * * /usr/bin/ssh [email protected] "mysqldump --defaults-file=.my.database.cnf -uroot databasename | gzip -9c" > /path/to/databasebackup_$(date +\%Y-\%m-\%d-\%Hh\%Mm\%Ss_\%A).sql.gz
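A cheap sanity check afterwards (a sketch; the filename is hypothetical) is to let gzip verify the archive's integrity, since a dropped ssh connection leaves behind a truncated, corrupt .gz file:

```shell
# Exit status 0 means the compressed stream is intact end to end.
gzip -t /path/to/databasebackup_2015-03-19-08h58m09s_Thursday.sql.gz && echo "backup looks intact"
```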

The .cnf file contains login credentials so you don't have to pass them on the command line, where they would show up in ps for other users to see. It should contain:

[client]
user=mysqluser
password=yourpassword
host=localhost

Depending on your setup, you may want a bunch of these for different projects/databases. If you only need one, ~/.my.cnf is read by default.

Security: you probably want to do chmod 600 *.cnf so only the owner can read and write these files.

Cleanup

I tend to automatically delete backups older than five days with the find command, unless they fall on a Friday. I keep Friday's backups longer. That's why I include the day of the week in my file names.
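A sketch of that cleanup one-liner (the path and five-day window are assumptions; -mtime +5 matches files modified more than five days ago):

```shell
# Delete old backups, but spare any whose name contains "Friday" --
# the day of week embedded by the date format above makes this possible.
find /file/on/local/machine -name 'BackUp_*.tar.gz' -mtime +5 ! -name '*Friday*' -delete
```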


Some more tar-over-ssh examples:

tar cvjf - * | ssh user@host "(cd /desired/path; tar xjf -)"   # copy the current directory to a remote path
tar cvzf - dir/ | ssh user@host "cat > /backup/dir.tar.gz"     # store a gzipped archive on the remote machine
tar cvzf - dir/ | ssh user@host "dd of=/backup/dir.tar.gz"     # same idea, using dd instead of cat
ssh user@host "cat /backup/dir.tar.gz" | tar xvzf -            # pull a remote archive and extract it locally

To back up a remote computer and save the tar on the local computer:

ssh user@host "(cd /desired/path; tar cvzf - *)" > /path/to/backup.tar.gz

Other usage examples: https://blog.bravi.org/?p=259

Tags: ssh, tar