
Backup Laravel easily on a remote server

This approach to backing up Laravel can also be applied to other MySQL-based systems; however, my main focus is on Laravel.


You just launched your new website and are ready to prototype it and gather feedback, or maybe you are already in production. From here on, all stored data is precious and valuable, and losing it could in the worst case ruin the whole project. It’s therefore important to have some kind of backup from the beginning.

However, as with many things, especially in programming, success is all about scaling at the right pace. You shouldn’t spend all your energy on the backup plan from the beginning, for at least a few reasons:

  • There are many tasks to accomplish in order to reach a stable state
  • The project is still fairly simple and small, so incremental backup and restore plans are not that important yet
  • The user base is small and might be a testing group, so a bit of downtime while restoring a backup is not critical

You might find more reasons; the key point is that this is not going to be the permanent solution for the lifetime of the project.

Build it simple

Linux script

The best way to keep it simple and yet stable is to leave it out of your application’s codebase and write it in a system-supported scripting language such as Bash (Unix or Linux).

Cron job

To automate the process you should use cron.

DigitalOcean Spaces

Lastly, you need somewhere else to store the backup, as it isn’t optimal to keep it on the same server as the Laravel project in case the server is hijacked. My recommendation is DigitalOcean Spaces. It’s an S3-compatible object storage service where you can easily store and retrieve files through its API; I use it myself for my services. The price starts at $5 per month, which should be sufficient for your needs.

The implementation

At this point your project is still simple and you should have all files stored in Git, so the method will only focus on a full database backup.

Files from forms should be uploaded directly to a service such as DigitalOcean Spaces to keep your web server free of files outside Git.

MySQLdump

The first step is to gather the data from the database. The most efficient way to do this is to use MySQL’s own tool, mysqldump. You will need to provide a username, a password, and of course the database name and output file.

dbuser=root
dbpassword=secret
dbname=database
dbhost=127.0.0.1
file=db-backup.sql

mysqldump --host=${dbhost} --user=${dbuser} --password=${dbpassword} ${dbname} > ~/${file}
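As the database grows, the raw dump can get large. A minimal sketch of compressing the dump with gzip before upload — the placeholder file below stands in for real mysqldump output, which you could just as well pipe straight into gzip:

```shell
# Placeholder dump file standing in for mysqldump output.
printf 'CREATE TABLE t (id INT);\n' > db-backup.sql

# Compress in place: produces db-backup.sql.gz and removes the original.
gzip -f db-backup.sql

# Verify the archive is intact before trusting it as a backup.
gunzip -t db-backup.sql.gz && echo "archive OK"
```

If you go this route, upload the .gz file instead and adjust the content type in the upload step accordingly.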

DigitalOcean Space

The next step is to send your db-backup.sql file to your Space. In order to do this you’ll need an API key and secret, which can be generated from your account: https://cloud.digitalocean.com/account/api/tokens
You must create a Space access key, not an account token.

You also need to create your Space for storing the files: https://cloud.digitalocean.com/spaces
You can leave all the settings to the defaults if you like and only choose a unique name.

Create DigitalOcean Space

Now add the variables to your backup script. You can find your Spaces endpoint in its settings.

key=xxxxxxxxxxxxxxxxxxxx
secret=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
endpoint=xxxxx.digitaloceanspaces.com
bucket=my-unique-name

The complete script

Putting it all together, you now have a working backup automation. The filename is also prefixed with a timestamp to keep the names unique and sortable.

#!/bin/bash

# YOUR VARIABLES
key=xxxxxxxxxxxxxxxxxxxx
secret=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
endpoint=xxxxx.digitaloceanspaces.com
bucket=my-unique-name
dbuser=root
dbpassword=secret
dbname=database

# MIGHT NOT NEED EDITING
file=db-backup.sql
dbhost=127.0.0.1

# STATIC VARIABLES
file=$(date "+%Y.%m.%d-%H.%M.%S").${file}
resource="/${bucket}/${file}"
contentType="application/sql"
dateValue=$(date -R)
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"
signature=$(/bin/echo -en "${stringToSign}" | openssl sha1 -hmac "${secret}" -binary | base64)

# DUMP SQL
mysqldump --host=${dbhost} --user=${dbuser} --password=${dbpassword} ${dbname} > ~/${file}

# UPLOAD
curl -X PUT -T ~/${file} \
  -H "Host: ${bucket}.${endpoint}" \
  -H "Date: ${dateValue}" \
  -H "Content-Type: ${contentType}" \
  -H "Authorization: AWS ${key}:${signature}" \
  https://${bucket}.${endpoint}/${file}

# REMOVE LOCAL FILE AFTER UPLOAD
rm ~/${file}

Save it to your home folder as ~/backup.sh and make sure it’s executable:

chmod +x ~/backup.sh
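If the upload fails with a 403, the signature step is the usual suspect. A small sketch for testing it in isolation, using throwaway values (the key, date, and resource path here are made up, so the result is reproducible):

```shell
# Throwaway inputs, mirroring the variables in the backup script.
secret="secret"
contentType="application/sql"
dateValue="Tue, 27 Mar 2007 19:36:42 +0000"
resource="/my-unique-name/db-backup.sql"

# Same string-to-sign construction as in the backup script.
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"
signature=$(/bin/echo -en "${stringToSign}" | openssl sha1 -hmac "${secret}" -binary | base64)

# A base64-encoded HMAC-SHA1 digest is always 28 characters.
echo "${#signature}"
```

If the length is not 28, the digest was not computed over the exact bytes you expect — check that echo interpreted the \n escapes and that no extra whitespace crept into the variables.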

Cron

Create a cron job to automate the backup. The interval is up to you, depending on the importance of the data and how frequently new data arrives; for this example I have set it to run every fourth hour.

To open crontab, enter in your console:

crontab -e

And add this line at the bottom:

0 */4 * * * sh ~/backup.sh >/dev/null 2>&1
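While you’re still verifying the setup, it can help to keep the script’s output instead of discarding it. A variant that appends each run to a log file (the log path is my own assumption):

```shell
# Same schedule, but failures become visible in ~/backup.log.
0 */4 * * * sh ~/backup.sh >> ~/backup.log 2>&1
```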

Congratulations, your Laravel backup is now set up!
