There are plenty of robust backup solutions out there, with varying costs and levels of sophistication. They're great, and I won't claim this post is better than any of them. But in my case I simply wanted a cheap nightly backup of a SQL Server database, stored in Amazon S3. With a little scripting I was able to do it for nothing more than the small cost of the S3 transfers and storage.
1. Created an Amazon S3 account and a bucket for my backups. (I needed to look up my access keys in the Accounts section while I was there.)
2. On the server, created a folder for my backup files and scripts (e.g. c:\backups)
3. Downloaded “S3copy Amazon S3 Command Line Copy” from http://download.cnet.com/S3copy-Amazon-S3-Command-Line-Copy/3000-2242_4-10915715.html?tag=mncol. This is a simple, free EXE file (s3copy.exe) that allows you to upload a file to S3 from the command line. I put it in my c:\backups folder.
4. Created a SQL script to back up the database, named backup.sql. I wanted a fresh, full backup each time, so my script was simply:
BACKUP DATABASE [MyDatabase] TO DISK = N'C:\Backups\MyDatabase.bak' WITH FORMAT, INIT, NAME = N'MyDatabase-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
5. Created a batch file (BackupDB.bat) that calls SQLCMD to run that script and then uses s3copy.exe to upload the resulting backup file:
REM -- Perform database backup
sqlcmd -S .\SQLEXPRESS -U <user> -P <password> -i "c:\Backups\Backup.sql"
REM -- Copy to S3
c:\backups\s3copy.exe c:\backups\ <bucketname> MyDatabase.bak <accesskey> <secretkey>
6. I then used Task Scheduler to create a nightly job that runs BackupDB.bat.
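If you'd rather script that last step as well, the same nightly job can be created from the command line with Windows' built-in schtasks utility. This is just a sketch; the task name and the 2:00 AM start time are placeholders I made up, so adjust them to suit:

REM -- Create a nightly task that runs the backup batch file at 2:00 AM
REM -- (task name and start time are arbitrary placeholders)
schtasks /create /tn "NightlyDBBackup" /tr "c:\backups\BackupDB.bat" /sc daily /st 02:00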
Yes, readers, this has all sorts of flaws: there's no error checking, it doesn't keep multiple generations of backups, it stores the access keys in plain text, and so on. I will be fixing all that. But it gives you the basic framework for how a little creative scripting can get you a low-cost, easy solution.
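As a small head start on those fixes, here's a rough, untested sketch of how BackupDB.bat could skip the upload when the backup step fails. It relies on sqlcmd's -b switch, which makes sqlcmd return a non-zero exit code when an error occurs, and it writes failures to a hypothetical c:\backups\backup.log file:

REM -- Perform database backup; -b makes sqlcmd exit with a non-zero code on error
sqlcmd -S .\SQLEXPRESS -U <user> -P <password> -b -i "c:\Backups\Backup.sql"
IF ERRORLEVEL 1 (
    ECHO %DATE% %TIME% Backup failed, skipping upload >> c:\backups\backup.log
    EXIT /B 1
)
REM -- Copy to S3 only if the backup succeeded
c:\backups\s3copy.exe c:\backups\ <bucketname> MyDatabase.bak <accesskey> <secretkey>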