[GTALUG] On the subject of backups.

John Sellens jsellens at syonex.com
Mon May 4 11:27:06 EDT 2020


I expect no one will want this advice, but it seems to me that the
implementation needs to change, i.e. that one big (possibly shallow)
XFS filesystem is unworkable.

The best answer of course depends on the value of the data.

One obvious approach is to use a filesystem/NAS with off-site replication,
typically a commercial product.

At relatively modest cost, I like the TrueNAS systems from ixsystems.com.
They are ZFS-based, HA versions are available, and replication is supported.
The HA versions are two servers in one chassis, with dual-ported SAS disks.

For do-it-yourselfers, I like using ZFS and ZFS replication of snapshots.
For example, I do much (much) smaller offsites from my home to work
using ZFS and zfs-replicate.
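The basic snapshot-and-send cycle behind that kind of replication looks
roughly like the sketch below. The dataset, pool, and host names are
hypothetical, and the script only builds the command strings (so it runs
without a ZFS pool present); in real use you would execute them, typically
from cron, and switch to incremental sends (zfs send -i) after the first
full replication.

```shell
#!/bin/sh
# Sketch of one ZFS replication cycle: snapshot locally, stream the
# snapshot to a remote pool over ssh.  All names below are assumptions.
SRC="pool/data"              # hypothetical local dataset
DEST="backuppool/data"       # hypothetical dataset on the backup host
REMOTE="backup.example.com"  # hypothetical off-site host
SNAP="$SRC@$(date +%Y-%m-%d)"

# Commands are built as strings here so the sketch is testable without
# a ZFS pool; run them with eval (or directly) on a real system.
snapshot_cmd="zfs snapshot $SNAP"
send_cmd="zfs send $SNAP | ssh $REMOTE zfs receive -F $DEST"

echo "$snapshot_cmd"
echo "$send_cmd"
```

Tools like zfs-replicate, syncoid, and znapzend wrap this same cycle and
add snapshot rotation and incremental-send bookkeeping for you.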

You can also use FreeNAS (the non-commercial TrueNAS), but without the
HA hardware and code.

Hope that helps - cheers

John


On Mon, 2020/05/04 09:55:51AM -0400, Alvin Starr via talk <talk at gtalug.org> wrote:
| The actual data-size for 100M files is on the order of 15TB so there is a
| lot of data to backup but the data only increases on the order of tens to
| hundreds of MB a day.

