Using a Linux NAS for daily backups of my ISPConfig server with rsync


I want to keep notes on how I managed to make this simple backup work.

Basically I have this 1TB WD MyBook World Edition NAS running Optware, and I want to use it to keep a fresh backup for each day of the last week, plus one for each past month.

The idea is to use rsync over SSH, so the first thing to do is to log in to the NAS and generate an RSA keypair that will be used to access the server and make the backup automatic. Remember to NOT use a passphrase, otherwise the automation won't work.

~ # ssh-keygen -t rsa -b 2048
Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa): 
Created directory '/root/.ssh'.
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /root/.ssh/id_rsa.
Your public key has been saved in /root/.ssh/
The key fingerprint is:
44:86:dd:2e:f0:14:54:17:c5:ef:6f:d8:86:d6:b1:4c root@NAS

What I need to do now is add the public part of the key to the “authorized_keys” file on the server, in the .ssh folder inside the home directory of the user that will be used for this purpose.

So copy the output of

cat  .ssh/

to the clipboard and ssh into the server you want to back up.
Once there, cd into the home directory of the user that will be used on the other side of rsync.
In my case I used the user backup, with its home directory in /var/backups.

If you want to do the same, first check whether the user is already present on the system.

In order to create a user called backup with the home directory in /var/backups, as root type:

adduser --home /var/backups backup

As root I created the directory /var/backups/.ssh, then chowned it so it belongs to the user backup. Then I became backup with “su - backup”. Once backup, I created a file called authorized_keys in the .ssh dir with the command

nano .ssh/authorized_keys

And then I pasted the content of the public key previously read with cat.
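The server-side steps above can be sketched as a single snippet. One detail worth adding: sshd is strict about permissions on .ssh, and will silently ignore the key if they are too loose. A scratch directory stands in for /var/backups here so the sketch is safe to run anywhere; on the real server you would set HOME_DIR=/var/backups and run the chown.

```shell
#!/bin/sh
# Server-side key setup, as a sketch. On the real server HOME_DIR is
# /var/backups and the .ssh dir gets chowned to the backup user; a
# scratch dir is used here only for illustration.
HOME_DIR=$(mktemp -d)                    # real server: HOME_DIR=/var/backups
mkdir -p "$HOME_DIR/.ssh"
touch "$HOME_DIR/.ssh/authorized_keys"   # paste the public key into this file
chmod 700 "$HOME_DIR/.ssh"               # sshd ignores keys if perms are loose
chmod 600 "$HOME_DIR/.ssh/authorized_keys"
# real server only: chown -R backup:backup "$HOME_DIR/.ssh"
```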

At this point save the file, go back to the NAS and check whether it works:

ssh -l backup -p 2202
Linux 2.6.32-5-xen-amd64 #1 SMP Fri May 10 11:48:05 UTC 2013 x86_64

The programs included with the Debian GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.

And of course it works. So the authentication part is done.

Now I have to go back to the “ONLINE” server and schedule the dump of the databases and the copy of the files into a directory, from which the backup user will synchronize a local folder on the NAS with the remote folder on the ONLINE server.
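The NAS-side pull could look like the sketch below. The backup user, port 2202 and the /var/backups/server directory come from this article; the hostname, the /shares/backup paths on the NAS, and the weekday/monthly slot naming are my own assumptions, one way to get the “last seven days plus one per month” retention.

```shell
#!/bin/sh
# NAS-side rsync pull, as a sketch: hostname and NAS paths are placeholders.

# One rotating slot per weekday gives the last seven days; on the first
# of the month the copy goes to a monthly slot instead.
slot_for() {   # $1 = weekday name, $2 = day of month
  if [ "$2" = "01" ]; then
    echo "monthly-$(date +%Y-%m)"
  else
    echo "daily-$1"
  fi
}

DEST="/shares/backup/$(slot_for "$(date +%A)" "$(date +%d)")"

# only attempt the transfer when the NAS share is actually mounted
if [ -d /shares/backup ]; then
  rsync -az --delete -e "ssh -p 2202" \
    backup@server.example.com:/var/backups/server/ "$DEST/"
fi
```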

Well, the MySQL part was solved in minutes thanks to my script for MySQL backup per db; the file part instead was done using this very small script: compress each sub folder in a separate tar archive. The second script actually doesn’t help that much, because the folder structure on my webserver is not human friendly, indeed.

The human-friendly folders are just symlinks into a less obvious structure: /var/www/ -> /var/www/client1/web1/

So what i want is to have a tar archive made out of the real folder but with the name of the symlink.

Here I mixed a couple of things, Perl and bash scripting; let me illustrate:

First of all I made a small bash script that identifies the symlinks and prints them to a txt file. Create a folder to contain the scripts:

sudo mkdir /scripts

then create the bash script with:

nano /scripts/

In the new file, paste the following content:

#!/bin/sh
# list the symlinks found in /var/www, one per line
cd /var/www
find . -maxdepth 1 -type l -print > /tmp/bck.list
# strip the leading "./" that find prepends to each name
sed -i 's/\.\///g' /tmp/bck.list


The real job is done by the “find” command, which outputs each symlink present in /var/www.
But since it runs inside /var/www, each symlink name comes out preceded by “./”, which is not useful for our purposes. That is why the output of find is run through sed, to strip the “./”.
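A quick way to see what these two commands produce, using a throwaway directory and a hypothetical site name instead of /var/www:

```shell
#!/bin/sh
# Demo of the find + sed step in a scratch dir (names are made up).
tmp=$(mktemp -d)
mkdir "$tmp/clients"                  # a real directory: not listed
ln -s clients "$tmp/mysite.example"   # a symlink: listed
cd "$tmp"
find . -maxdepth 1 -type l -print > /tmp/bck.list   # writes "./mysite.example"
sed -i 's/\.\///g' /tmp/bck.list                    # strips the leading "./"
cat /tmp/bck.list                                   # -> mysite.example
```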

The next step is a script capable of reading the list file (/tmp/bck.list) and compressing each folder named there, so that tar produces an archive with the right name. Here is the script:

#!/usr/bin/perl -w
use strict;
use warnings;

# configurable vars
my $file_list  = '/tmp/bck.list';
my $start_dir  = '/var/www';
my $output_dir = '/var/backups/server/web';

my @backup_dirs;

# read the list of symlink names produced by the bash script
sub read_file_list {
  print "Reading list of files: $file_list\n";
  open(LIST, $file_list)
    or die "ERROR: Could not open $file_list: $!\n";
  while (my $line = <LIST>) {
    chomp $line;
    # skip comment lines
    unless ($line =~ /^#/) {
      push(@backup_dirs, $line);
    }
  }
  close LIST;
}

# tar each real folder, naming the archive after the symlink
sub tar_backup_dirs {
  foreach my $dir (@backup_dirs) {
    print "Backup dir $dir\n";
    system("tar -czf " . $output_dir . "/" . $dir . ".tar.gz " . $start_dir . "/" . $dir . "/web");
  }
}

read_file_list();
tar_backup_dirs();
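To make the whole thing daily, the two scripts can be chained from cron on the server, running before the NAS pulls the files. The script names below are hypothetical, since the article leaves them unnamed:

```
# /etc/crontab sketch: run the symlink list, then the tar script, at 02:00
0 2 * * *  root  /scripts/find-links.sh && /scripts/tar-backup.pl
```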

Author: Giuseppe Urso

Giuseppe lives in Haarlem now with his shiny dog, Filippa. In 1982 he received his first home computer, a Commodore 64, followed by a Datasette and a 1541 Floppy Disk Drive. In 1999 he installed his first Linux distro (LRH6). In 2006 he switched to Debian as his favourite OS. Giuseppe Urso actively sustains the Free Software Foundation and its founder Richard Matthew Stallman; he speaks to people trying to convince them to join the fight now, and about how important it is to use only Free Software. He has a job as Infra Specialist at Hippo Enterprise Java CMS, an open-source enterprise-class Content Management System, one of the coolest companies ever, in Amsterdam. He's always ready to install Debian on other people's computers for free.
