[GTALUG] On the subject of backups.

Alvin Starr alvin at netvel.net
Mon May 4 09:55:51 EDT 2020


I am hoping someone has seen this kind of problem before and knows of a 
solution.
I have a client whose file systems are filled with huge numbers of small
files, on the order of hundreds of millions of files.
Running something like a find over one of these filesystems takes the
better part of a week, so any kind of directory-walking backup tool will
take even longer to run.
The actual data size for 100M files is on the order of 15TB, so there is
a lot of data to back up, but the data only grows on the order of tens
to hundreds of MB a day.
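
To make the scaling problem concrete, here is a minimal Python sketch
(the /data root and the one-day cutoff are just placeholder assumptions):
even a file-level "incremental" scan has to stat every entry, so its
runtime tracks the total ~100M files rather than the tens to hundreds of
MB that actually change in a day.

#!/usr/bin/env python3
# Minimal sketch of a file-level incremental scan. Even though it only
# *reports* recently changed files, it still has to visit and stat every
# entry, which is why the walk takes days at this scale.
import os
import sys
import time

def newer_files(root, cutoff):
    """Yield paths under root whose mtime is newer than cutoff (epoch secs)."""
    stack = [root]
    while stack:
        path = stack.pop()
        try:
            with os.scandir(path) as entries:
                for entry in entries:
                    if entry.is_dir(follow_symlinks=False):
                        stack.append(entry.path)
                    elif entry.is_file(follow_symlinks=False):
                        # One stat() per file is unavoidable here, and that
                        # is the cost that dominates with 100M files.
                        if entry.stat(follow_symlinks=False).st_mtime > cutoff:
                            yield entry.path
        except PermissionError:
            continue

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "/data"  # hypothetical mount
    cutoff = time.time() - 24 * 3600                      # "changed in last day"
    for p in newer_files(root, cutoff):
        print(p)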


Even tools like xfsdump take a long time.
For example, I tried xfsdump on a 50M-file set and it took over two days
to complete.

The only thing that seems workable so far is Veeam.
It can run an incremental volume snapshot in a few hours a night, but I
dislike adding proprietary kernel modules to the systems.
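
The appeal of the volume-level approach is that the work scales with the
size of the block device rather than the number of files. Below is a
rough, purely illustrative Python sketch of the idea: it re-reads and
hashes the whole device to find changed blocks, which is exactly the
work a changed-block-tracking kernel module (the kind Veeam ships)
avoids by recording writes as they happen. The 4 MiB block size and the
device path are assumptions for the sketch, not anything Veeam uses, and
in practice you would point it at an LVM snapshot so the device is
quiescent while being read.

#!/usr/bin/env python3
# Rough sketch of block-level incremental detection. Illustrative only:
# a real changed-block-tracking driver records writes in the kernel
# instead of re-reading and hashing the whole device like this does.
import hashlib
import json
import os
import sys

BLOCK = 4 * 1024 * 1024  # 4 MiB blocks; an assumption for the sketch

def block_hashes(device):
    """Return {block_index: sha256 hex digest} for every block on the device."""
    hashes = {}
    with open(device, "rb") as f:
        idx = 0
        while True:
            chunk = f.read(BLOCK)
            if not chunk:
                break
            hashes[idx] = hashlib.sha256(chunk).hexdigest()
            idx += 1
    return hashes

def changed_blocks(device, manifest_path):
    """Compare current block hashes with a saved manifest; return changed indexes."""
    current = block_hashes(device)
    previous = {}
    if os.path.exists(manifest_path):
        with open(manifest_path) as f:
            previous = {int(k): v for k, v in json.load(f).items()}
    changed = [i for i, h in current.items() if previous.get(i) != h]
    with open(manifest_path, "w") as f:
        json.dump(current, f)
    return changed

if __name__ == "__main__":
    dev = sys.argv[1]  # e.g. an LVM snapshot device of the volume
    print(changed_blocks(dev, dev.replace("/", "_") + ".manifest"))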


-- 
Alvin Starr                   ||   land:  (647)478-6285
Netvel Inc.                   ||   Cell:  (416)806-0133
alvin at netvel.net              ||


