Backup Solutions

Dave Mason dmason-bqArmZWzea/GcjXNFnLQ/w at public.gmane.org
Thu Aug 30 13:44:22 UTC 2007


Depending on the bandwidth available, you might do as I have done.  I
have a backup machine, let's call it backup.foo.org, and client
machines c1.org and c2.org.  Then on backup I create accounts c1 and
c2 with the following /etc/passwd entries:

c1:x:990:990:C1 Backup:/home/c1:/home/c1/Start
c2:x:991:991:C2 Backup:/home/c2:/home/c2/Start
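
(Note each account gets its own UID/GID, so the clients' backups stay
separate.)  Creating the accounts can be scripted; a minimal sketch,
assuming the usual shadow-utils groupadd/useradd, with the names,
UIDs, and paths from above:

     groupadd -g 990 c1
     useradd -u 990 -g 990 -c 'C1 Backup' -d /home/c1 -m \
             -s /home/c1/Start c1
     groupadd -g 991 c2
     useradd -u 991 -g 991 -c 'C2 Backup' -d /home/c2 -m \
             -s /home/c2/Start c2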

Then, in each home directory, I put the following executable file,
Start, which serves as the account's login shell:
#! /bin/sh
# sshd runs this login shell as:  Start -c '<command>'
# so $1 is "-c" and $2 is whatever command the client requested.
case "$1.$2" in
  -c.save)
	umask 077
	# read the tar stream from stdin into a timestamped file
	N=`date +%Y-%m-%d-%H-%M`
	cat > "save-$N.tar.gz"
	ls -l "save-$N.tar.gz"
	exit 0;;
  -c.list)
	ls -l save-*.tar.gz
	exit 0;;
  *) echo Error >&2
     exit 1
     ;;
esac
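
Because Start is the login shell, sshd hands it the client's command
via -c, and anything other than "save" or "list" falls through to the
Error branch; the accounts can't be used for general shell access, and
restores have to be pulled some other way (or Start extended).  With
the hosts above, listing the stored backups from a client looks like:

     ssh c1@backup.foo.org list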

I set up ssh keys for each account, so each client can log in to its
matching account on the backup machine without a password (a key-setup
sketch follows the cron example below).  Then I put a cron job on each
client that looks something like:

     0 1 * * *	tar zcf - dir1 dir2 | ssh c1@backup.foo.org save

I say "something like" because I don't have access to those machines
anymore, so I'm going by memory.
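
The key setup itself is ordinary.  A minimal sketch, run on each
client (the key type and file names are assumptions, not part of the
original setup):

     # generate a passwordless key so cron can run unattended
     ssh-keygen -t rsa -N '' -f ~/.ssh/backup_key

Then copy backup_key.pub over to the backup machine and, as root
(Start rejects arbitrary commands, so tools like ssh-copy-id won't
work against these accounts):

     install -d -m 700 -o c1 -g c1 /home/c1/.ssh
     cat backup_key.pub >> /home/c1/.ssh/authorized_keys
     chown c1:c1 /home/c1/.ssh/authorized_keys
     chmod 600 /home/c1/.ssh/authorized_keys

If the key isn't the client's default, point the cron job's ssh at it
with -i ~/.ssh/backup_key.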

Worked like a charm.  On the backup machine I would go in once a month
or so and delete all but every 10th backup.  If this is too much
storage or bandwidth, doing a full backup on Sundays and incrementals
on the other days has worked very well for me in other contexts.
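
Both of those are easy to script.  A sketch of the pruning step, safe
here because the date-stamped names contain no spaces:

     # delete all but every 10th archive
     ls save-*.tar.gz | awk 'NR % 10 != 0' | xargs -r rm

And a sketch of the full-plus-incremental rotation on the client,
assuming GNU tar's --listed-incremental support (the snapshot-file
path is hypothetical):

     # Sunday 01:00: remove the snapshot file to force a full backup
     0 1 * * 0	rm -f /var/lib/backup.snar; tar zcf - -g /var/lib/backup.snar dir1 dir2 | ssh c1@backup.foo.org save
     # Monday-Saturday 01:00: incremental relative to the snapshot
     0 1 * * 1-6	tar zcf - -g /var/lib/backup.snar dir1 dir2 | ssh c1@backup.foo.org save

Note that deleting intermediate incrementals breaks restore chains, so
the every-10th pruning really only suits the full-backup scheme.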

../Dave