
mysql backup shell script

This is what I tend to use for a simple MySQL database backup script… I wanted to post this so I can look it up when I need it. There are probably better ways to do this (tell me about them!) but this works for me.

#!/bin/bash

# Timestamp for the filename, e.g. 20100711090854
DT=`date +"%Y%m%d%H%M%S"`

# Dump the database to a timestamped file
mysqldump -u [USERNAME] -p[PASSWORD] [DATABASENAME] > /home/backups/[DATABASENAME]-$DT.dump

# Compress the dump to save disk space
gzip /home/backups/[DATABASENAME]-$DT.dump

Substitute your MySQL user for [USERNAME]. (There should be a space between the ‘-u’ and the [USERNAME])

Substitute your MySQL user’s password for [PASSWORD]. (There should not be a space between the -p and the [PASSWORD])

Substitute your MySQL user’s database for [DATABASENAME].

Each time you run it, it grabs the current date with the year, month, day, hours, minutes, and seconds, and uses it in the filename. So %Y%m%d%H%M%S would produce something like 20100711090854. If you are running one backup per day, you could shorten it to %Y%m%d.
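You can see what either format produces by running the date command on its own:

$ date +"%Y%m%d%H%M%S"
20100711090854
$ date +"%Y%m%d"
20100711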

This would put the files in the /home/backups directory. Set this to wherever you want the files to go.

The gzip command compresses the dumped database file to save disk space. If you don’t want to compress it, just leave that line out.
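One nicer way to do it (a variation, not what the script above does): pipe mysqldump straight into gzip, so the uncompressed dump never touches the disk:

mysqldump -u [USERNAME] -p[PASSWORD] [DATABASENAME] | gzip > /home/backups/[DATABASENAME]-$DT.dump.gz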

(BTW, you don’t type the [ brackets ]. They are just there to highlight the words you need to fill in.)
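If you want it to run automatically, a cron entry will do it. This assumes you saved the script as /home/backups/mysql-backup.sh (just a name I made up) and made it executable:

# Run the backup every day at 2:00 AM
0 2 * * * /home/backups/mysql-backup.sh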


Amazon S3 Update

Over at Z2 Marketing + Design we’ve implemented Amazon S3 as an off-site backup solution. As a design studio we generate a lot of large files; between photo and video shoots, print projects, and a zillion other things, we’ve got our own storage issues to deal with. Will the building burn down? Dunno, it might, but if it does, it will be nice to know our data is safe over at Amazon.

Since I’m still moving a lot of older data there, the uploading is constant. I’d estimate I’m moving about a GB of data per day right now. In fact, let’s look at the numbers…

Amazon says we’ve transferred 16.035 GB this month, and we’re storing about that much as well. But wait, what is the cost of this? Well, as of right now, it looks like Amazon will be billing us about $5. (Probably a bit more, as I have another 11 days to keep moving data there.) So for the cost of a good burrito, we’ve transferred and stored over 15 GB of data. Freakin’ awesome!

I’m still not 100% happy with the tools, or with how I’m doing the transfers (much of it is manual right now), but better tools will come along, and with a new server in-house we’ll look at automating all the backups, so it “just works” and, if something doesn’t work, it’ll let us know. Thanks Amazon!
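For the automating part, here’s a rough sketch of what I have in mind, assuming the s3cmd tool is installed and configured, and using z2-backups as a stand-in bucket name:

#!/bin/bash

# Push the local backups directory up to S3.
# s3cmd sync only uploads files that are new or changed.
s3cmd sync /home/backups/ s3://z2-backups/backups/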


