4 methods to back up remote files across a network

On Linux there are several methods to back up remote files across a network.

Method 1: SCP (Secure Copy)

SCP is basically RCP run over SSH, which is what gives it password authentication and encryption. This method is good for copying files or directories recursively, so long as you do not need an include or exclude list. As far as I can tell, SCP has no support for exclude lists. Still, the syntax is very easy to remember, making scp an easy way to securely copy files across a network.

The syntax to copy a single file is: scp user@host:sourcefile user@host:target
If you are copying to or from the local machine, you can omit user@host on the local side.

scp bob@bobland.com:/www/index.html /home/bob/website_backup

Pretty simple, right? And the syntax is the same regardless of whether you are copying local to remote, remote to remote, remote to local, etc.
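
For instance, pushing a single file from the local machine back up to the server (using the same hypothetical host and paths as above) looks like this:

scp /home/bob/website_backup/index.html bob@bobland.com:/www/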

To recursively copy an entire directory, just add the -r (recursive) option:

scp -r bob@bobland.com:/www /home/bob/website_backup

Now Bob has his whole website backed up -- sweet.

This works great and is easy to remember, but this method will copy every file every time it is run, which is slow if Bob has already made a backup in the past.
Also, since you can't exclude directories, this method does not work very well for making system backups, where you wouldn't want to back up certain directories like /tmp or cache files.


Method 2: RSYNC (Remote Sync)

Rsync is the fastest method by far if you have copied the files previously, because it will only copy files that are new or have changed.
Rsync also supports exclude lists, using the same format as tar, so it is much better for making system backups. The syntax for rsync is also pretty easy to remember.

rsync -e ssh -av bob@bobland.com:www /home/bob/website_backup

This would run super fast since Bob already has all the files, and after Bob makes changes only those changes will get copied.
Notice that Bob used a few options.
-e, --rsh=COMMAND           specify the remote shell to use
-a, --archive               archive mode; same as -rlptgoD (no -H, -A)
-v, --verbose               increase verbosity
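
One extra that isn't in Bob's command but that I find handy: rsync also has a -n (--dry-run) option, so you can preview what would be transferred without actually copying anything:

rsync -e ssh -avn bob@bobland.com:www /home/bob/website_backup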

If Bob doesn't want to include his testing directory, he could do:

rsync -e ssh -av bob@bobland.com:www --exclude="www/testing" /home/bob/website_backup
To exclude more directories, just add another exclude option: --exclude="var" --exclude="tmp"
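
Putting that together (the extra directory names here are just hypothetical examples), the full command would look something like this:

rsync -e ssh -av --exclude="www/testing" --exclude="var" --exclude="tmp" bob@bobland.com:www /home/bob/website_backup

A pattern without a slash (like "tmp") matches a directory of that name anywhere in the tree, while a pattern containing a slash (like "www/testing") is matched against the full path.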

This method works very well for making backups and is fast, but it isn't a good archive method. If Bob changes his files and runs rsync again, his backup would be overwritten and he would not have any of the files in their previous state.

Rsync has the ability to make incremental backups using --backup and either --suffix or --backup-dir. I like using the --backup-dir method, which puts all of the changes in a separate folder. If Bob were to run the following:

rsync -e ssh -av bob@bobland.com:www --exclude="www/testing" /home/bob/website_backup --backup --backup-dir="../website_backup_01"

Then the directory /home/bob would contain 2 folders, website_backup and website_backup_01, and only what has changed would be saved in the latter directory.
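
If Bob wanted a fresh backup directory every time he runs this instead of reusing website_backup_01, one variation (my own tweak, not part of the example above) is to date-stamp the backup dir:

rsync -e ssh -av --backup --backup-dir="../website_backup_$(date +%Y-%m-%d)" --exclude="www/testing" bob@bobland.com:www /home/bob/website_backup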

Method 3: Tar

Tar is one of my favorites, mostly because I have restored from tar backups so many times that "in tar I trust". Tar handles recursion and symbolic links better than any other method. Tar can also compress on the fly, and unlike duplicating files and directories, it saves the whole file structure to a single archive file. Tar has a pretty simple syntax for doing simple stuff, but gets complex when used via ssh to tar files across a network.
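
As a quick taste of the simple end (the paths here are just placeholder examples), creating and then restoring a local compressed archive looks like this:

tar czvpf /home/bob/website_backup.tgz /www --exclude="www/testing"
tar xzvpf /home/bob/website_backup.tgz -C /tmp/restore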


Almost done; I just need to add some simple and complex tar examples, and sshfs examples.

But here is a quick peek at how to tar across a network. This is how I back up my iPhone when I want to make a complete tar archive.

ssh root@192.168.1.105 "tar czvpf - / --exclude=tmpl" | ssh regx@192.168.1.102 "cat > /media/shared/backup/iphone/iphone_bak_$today.tgz"

Basically you just set the output on one side to - (stdout) and pick it up on the other end via ssh.
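
For completeness: $today in the command above is just a shell variable you set beforehand, and if you only want the archive to land on the machine you are typing on, you can drop the second ssh and redirect straight to a local file. A minimal sketch, assuming ~/backups already exists (the addresses and exclude are the same hypothetical ones as above):

today=$(date +%Y-%m-%d)
ssh root@192.168.1.105 "tar czvpf - / --exclude=tmpl" > ~/backups/iphone_bak_$today.tgz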