A simple script to back up MySQL databases and files to S3 with an easy configuration.
This backup script dumps the local MySQL database(s) to file, zips the dumps together with your html directories, uploads the zip file to S3, and cleans up any old backups on S3.
Ideal use case: you have a single small webserver hosting one or more websites with one or more MySQL databases, and you want to back up the files and MySQL databases to AWS S3.
- I have two websites on my server, located at:
- /var/www/my-website
- /var/www/my-other-website
- Each of these websites has a MySQL database
- I want to create a zip file containing:
- 2 .sql dump files (one for each MySQL database)
- /var/www/my-website
- /var/www/my-other-website
- I want to then upload this zip file to AWS S3 daily at 3am
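Conceptually, one backup run is equivalent to the manual steps below (a sketch only: the user, password, database names, and bucket name are placeholders, and the actual script does this in PHP via the AWS SDK rather than the aws CLI):

```shell
# Hypothetical manual equivalent of one backup run (placeholders throughout).
ts=$(date +%F_%H-%M-%S)

# Dump each database to a .sql file.
mysqldump -u backup_user -p'secret' my_website_db > /tmp/my_website_db.sql
mysqldump -u backup_user -p'secret' my_other_db   > /tmp/my_other_db.sql

# Zip the dumps together with the website directories.
zip -r "/tmp/$ts.zip" \
    /tmp/my_website_db.sql /tmp/my_other_db.sql \
    /var/www/my-website /var/www/my-other-website

# Upload (shown with the aws CLI purely for illustration).
aws s3 cp "/tmp/$ts.zip" s3://my-bucket/
```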
-
Upload this file to your server, e.g. to /var/www/backup/backup.php
sudo mkdir /var/www/backup
sudo chmod 777 /var/www/backup
-
There are some dependencies:
- zip
- Check if it is installed by: zip -v
- Install it with: sudo apt install zip
- mysqldump
- Check if it is installed by: mysqldump -V
- Install it with: sudo apt install mysql-client
- composer
- Check if it is installed by: composer --version
- Install it with: sudo apt install composer
- NOTE: If you don't want to install composer on the server:
- Upload 'backup.php-vendor.zip' to the server and:
- cd /var/www/backup
- unzip backup.php-vendor.zip
- rm backup.php-vendor.zip
- php
- Check if it is installed by: php -v
- Install it with: sudo apt install php-cli
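All of the above checks can be run at once with a small helper (a sketch, assuming a POSIX shell):

```shell
#!/bin/sh
# Print ok/missing for each dependency of backup.php.
check_dep() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

for dep in zip mysqldump composer php; do
  check_dep "$dep"
done
```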
-
Create a new IAM user account on AWS and grant it permission to put, get, etc. files in the bucket:
- login to AWS console
- Generate a new user in IAM with Programmatic access and get their key and secret (used in the *CONFIG* block below)
- Create a new S3 bucket
- Go to "Permissions" -> "Bucket Policy" for the newly created S3 bucket
- Use the "Policy generator" to generate a new policy and add it.
- NOTE: For the "Resource" you will need to add:
- ARN for the bucket
- ARN for the bucket contents (append "/*")
- e.g.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789123:user/my-s3-iam-user" },
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789123:user/my-s3-iam-user" },
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
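If you prefer the CLI over the console, the same policy can be attached with the AWS CLI (assuming it is installed and configured with credentials allowed to manage the bucket; "my-bucket" is a placeholder):

```shell
# Save the policy JSON above to policy.json, then attach it to the bucket.
aws s3api put-bucket-policy --bucket my-bucket --policy file://policy.json

# Verify it was applied.
aws s3api get-bucket-policy --bucket my-bucket
```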
-
Create a .config.php file (copy from .config.example.php and update the values)
-
Manually run the backup just to test if there are any errors:
php /var/www/backup/backup.php
-
Schedule it to run daily. Add this to Crontab:
- $ crontab -e
- Add the below line to the file. NOTE: https://crontab.guru/#0_3___*
0 3 * * * php /var/www/backup/backup.php
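Since cron output is otherwise silently discarded, it can help to redirect it to a log file (the log path below is just an example and may need different permissions):

```shell
# crontab entry: run daily at 03:00 and keep a log of any output/errors.
0 3 * * * php /var/www/backup/backup.php >> /var/log/backup.log 2>&1
```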
To restore a backup:
- Run:
php /var/www/backup/restore.php
- Follow the wizard
Whenever a backup is created, old backups are deleted.
Which old backups are kept on S3?
- Any backups older than one year are kept only if they were taken on the 1st of January
- Any backups older than one month are kept only if they were taken on the 1st of the month
- Any backups older than one week are kept only if they were taken on a Monday or the 1st of the month
- If there are multiple backups from a single day older than one week, the duplicates are deleted
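The retention rules above can be sketched as a small helper (an illustration only, not part of the script; assumes GNU date for the `-d` option):

```shell
#!/bin/sh
# should_keep BACKUP_DATE TODAY  (both YYYY-MM-DD) -> prints "keep" or "delete".
# Mirrors the retention rules above.
should_keep() {
  b=$(date -d "$1" +%s); t=$(date -d "$2" +%s)
  age_days=$(( (t - b) / 86400 ))
  day=$(date -d "$1" +%d)
  month=$(date -d "$1" +%m)
  weekday=$(date -d "$1" +%u)   # 1 = Monday
  if [ "$age_days" -gt 365 ]; then
    [ "$month" = "01" ] && [ "$day" = "01" ] && echo keep || echo delete
  elif [ "$age_days" -gt 31 ]; then
    [ "$day" = "01" ] && echo keep || echo delete
  elif [ "$age_days" -gt 7 ]; then
    { [ "$weekday" = "1" ] || [ "$day" = "01" ]; } && echo keep || echo delete
  else
    echo keep
  fi
}

should_keep 2024-01-01 2026-01-15   # taken on 1 Jan, older than a year -> keep
should_keep 2024-03-15 2026-01-15   # older than a year, not 1 Jan -> delete
```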
- restore.php should call the firstTimeSetup of the backup script
- After zipping, we should check that all the files we wanted to include (i.e. did not skip) actually exist in the zip file
- Add a webhook or some sort of email callback that gets fired when this script throws an exception. SNS?
- Add a mysql options array to .config for https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html#mysqldump-option-summary
- validateConfig() should also validate the MySQL config.
- restore.php currently does [i-iv] whereas it would be better if it did [a-d]
  - [i-iv] (current):
    - Ask for MySQL settings
    - Import MySQL
    - Ask file input settings
    - Extract file
  - [a-d] (better):
    - Ask for MySQL settings
    - Ask file input settings
    - Import MySQL
    - Extract file
- The temp directory should be the system's /tmp directory
- Add a custom string you can set in the .env file that is appended to the file name, to help identify zip files, e.g. 2020-07-19_15-00-01-{custom string}.zip
- Remove all the crazy automatic creation of the composer.json file and auto-running of composer update. composer.json should be a normal file, and composer update should be left up to the admin to run.