I just read several more horror stories that include, among other things, failed backups. I’ve said it before (at volume, extreme volume), and evidently I have to say it again: simply creating a backup file is not enough to protect your information. To reinforce the point, I’m going to introduce a new concept. Maybe it’ll help. I’m calling it “The Three T’s of Backups.”
Take ‘Em
First, and most important, you have to Take backups. That’s your first “T”.
No, disk redundancy through RAID, a SAN, or some other setup is not adequate to protect your information. You must take backups, they must run on a regular schedule, and you absolutely should automate the process.
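To make this concrete, here’s a minimal sketch using the Backup-SqlDatabase cmdlet from Microsoft’s SqlServer module. The instance, database, and path names are placeholders; swap in your own:

    # Minimal sketch: full backup of one database via the SqlServer module.
    # Instance, database, and path names below are placeholders.
    Import-Module SqlServer

    $stamp = Get-Date -Format 'yyyyMMdd_HHmmss'
    Backup-SqlDatabase -ServerInstance 'MyServer\MyInstance' `
                       -Database 'MyDatabase' `
                       -BackupFile "D:\Backups\MyDatabase_$stamp.bak" `
                       -CompressionOption On

Drop that into a SQL Agent job or a scheduled task and the “regularly” part takes care of itself.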
Test ‘Em
Now that you are taking your backups, you need to Test your backups. That’s your second “T”.
Simply having a backup doesn’t mean much if you can’t restore it. How do you know you can restore a backup? You test it. How do you test it? You run the restore process. Not only does that validate the backup, it validates your ability to perform the restore. I strongly recommend automating this as well.
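Here’s a rough sketch of an automated restore test with the same SqlServer module; the test instance, paths, and logical file names are all assumptions you’d adjust:

    # Rough sketch: restore the newest backup under a throwaway name on a
    # test instance. Instance, paths, and logical file names are placeholders.
    Import-Module SqlServer

    $latest = Get-ChildItem 'D:\Backups\MyDatabase_*.bak' |
              Sort-Object LastWriteTime -Descending |
              Select-Object -First 1

    # Relocate data and log files so the test copy can't collide with the original.
    $data = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('MyDatabase', 'D:\RestoreTest\MyDatabase_Test.mdf')
    $log  = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('MyDatabase_log', 'D:\RestoreTest\MyDatabase_Test.ldf')

    Restore-SqlDatabase -ServerInstance 'MyTestServer' `
                        -Database 'MyDatabase_RestoreTest' `
                        -BackupFile $latest.FullName `
                        -RelocateFile @($data, $log) `
                        -ReplaceDatabase

If you’d rather not roll your own, the dbatools project’s Test-DbaLastBackup command is built around exactly this pattern: restore the latest backup to a test instance and check the results.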
Transfer ‘Em
With your backups taken and tested, you now need to copy them to a second location: Transfer them offsite. This is your third “T”.
Things happen. Fire, flood, wind, extra-solar radiation. Any or all of these, and more, can affect your servers. So, let’s take our tested backups and transfer them to a second location (and maybe a third if we’re feeling really paranoid). One onsite and one in the cloud. One in a local data center and one in a distant data center. Oh, and let’s automate transferring the tested backups we’ve taken.
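The transfer itself doesn’t have to be fancy. Here’s a bare-bones sketch that copies anything new to a secondary share; the paths are placeholders, and for a cloud target you’d substitute azcopy, the Az.Storage cmdlets, or whatever your vendor provides:

    # Bare-bones sketch: copy any backup not already at the secondary
    # location. Source and destination paths are placeholders.
    $source      = 'D:\Backups'
    $destination = '\\OffsiteServer\SqlBackups'

    Get-ChildItem "$source\*.bak" |
        Where-Object { -not (Test-Path (Join-Path $destination $_.Name)) } |
        Copy-Item -Destination $destination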
The Three T’s
Backups:
Take ‘Em
Test ‘Em
Transfer ‘Em
Now get out there and make this happen. Take your backups. Test your backups. Transfer your backups. Automate each of these steps and validate that everything is actually working. It’s your information. It’s your business. If you want to protect it, you’ll do the Three T’s.
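If you don’t have SQL Agent handy for the automation, even the built-in Windows task scheduler will do. A rough sketch, assuming you’ve bundled the three steps into a single script (the script path, task name, and time are placeholders):

    # Rough sketch: run a combined take/test/transfer script nightly via the
    # built-in ScheduledTasks module. Path, name, and time are placeholders.
    $action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
                   -Argument '-NoProfile -File C:\Scripts\ThreeTs.ps1'
    $trigger = New-ScheduledTaskTrigger -Daily -At 2am
    Register-ScheduledTask -TaskName 'ThreeTsBackups' -Action $action -Trigger $trigger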
May I add “Benchmark” them? I have been benchmarking our backups across environments, backup options, sizes, disk types, etc.
More details to be blogged soon.
Not a bad idea, although it messes with my nice, neat “Three T’s”. Three T’s and a B isn’t the same.
Take ’em, Test ’em, Transfer ’em, and Time ’em?
What tools are commonly used to transfer and manage backups? I’ve been using PowerShell to manage my backups, but it can be inconvenient at times.
I try to keep:
Daily backups for 3 months
Weekly backups for 2 months
Bi-weekly backups for 2 months
Monthly backups for 5 months
And yearly backups for anything older than that
Are there any good tools to manage the backups and transfer them to an archive server?
Any full sample scripts using PowerShell to do backups?
Not immediately at hand. I have some simple ones shown in this article: https://www.red-gate.com/simple-talk/sql/sql-tools/automated-database-provisioning-development-testing/
It’s very straightforward to use PowerShell for backup and restore.
There are some tools that help. For example, Redgate SQL Backup can automate copying the backups to secondary locations. Beyond that, most people build the copy to a secondary server into their backup scripts, and a lot of shops use an offsite storage service that pulls from their backup location. If you search around you can find plenty more on this. It doesn’t matter that much exactly how the backups get to the secondary location, as long as they do.
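For the retention and archive side of the question above, plain PowerShell gets you reasonably far. Here’s a rough sketch of age-based tiering; the paths and retention windows are placeholders, and it assumes yearly backups are kept in a separate folder:

    # Rough sketch: move backups older than 90 days to the archive server,
    # then delete archived copies past a final one-year window. Paths and
    # windows are placeholders; yearly keepers are assumed to live elsewhere.
    $live    = 'D:\Backups'
    $archive = '\\ArchiveServer\SqlBackups'

    Get-ChildItem "$live\*.bak" |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-90) } |
        Move-Item -Destination $archive

    Get-ChildItem "$archive\*.bak" |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddYears(-1) } |
        Remove-Item

Ola Hallengren’s maintenance solution and the dbatools module both include retention and cleanup options as well, if you’d rather not maintain a script like this yourself.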