
Backup+

Backing Up Databases

Please be aware that Backup+ clients do not natively support backing up running database systems, such as Microsoft SQL Server 2005, MySQL and PostgreSQL.

Before attempting a database backup with the Backup+ service, you should either manually run a database dump (such as pg_dump for PostgreSQL or mysqldump for MySQL) to write your current datasets to a static file or files, or script such a dump into your backup job shell script or into the Pre- and Post-Actions in Handy Backup.
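
As an illustration, the shell script below dumps MySQL and PostgreSQL databases to static files ahead of the Backup+ run; it could be called from your backup job shell script or from a Pre-Action in Handy Backup. The dump directory, users and options are examples only and should be adjusted to suit your own setup.

    #!/bin/sh
    # Illustrative pre-backup dump script: paths, users and options are examples only.
    DUMP_DIR=/var/backups/db-dumps
    mkdir -p "$DUMP_DIR"

    # MySQL: dump all databases to a single static file
    # (assumes credentials are configured, e.g. in ~/.my.cnf).
    mysqldump --all-databases --single-transaction > "$DUMP_DIR/mysql-all.sql"

    # PostgreSQL: dump every database in the cluster.
    pg_dumpall -U postgres > "$DUMP_DIR/postgres-all.sql"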

Backup+ – Ordering Details

To order or enquire about our Backup+ service, please call our sales team on 0845 11 99 991 or email [email protected].

Backup+ – Overview

Designing your backup

Backups can often feel like a thankless task, but when a disaster strikes a good backup set is worth its weight in gold.

When setting up your backup solution, it is critical that you have sufficient storage for your requirements. To ensure this is the case, careful consideration should be given to what data should be backed up, how frequently it is to be backed up and how long it should be retained. It is also important to consider the rate of growth of your data set.

What is to be backed up

When deciding what is to be backed up, it is important to consider what you will require to restore your system to a working state in the event of a total failure. In addition to your own data, you should consider what configuration files and custom application backups you will require to restore service.
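
For example, on a typical Linux web server the backup set might include system configuration as well as site data. The command below is purely illustrative and the paths will differ on your system.

    # Illustrative only: archive the configuration and content needed to rebuild a web server.
    # /etc holds system and service configuration, /var/spool/cron holds scheduled jobs.
    tar -czf /var/backups/config-$(date +%F).tar.gz /etc /var/spool/cron /var/www/html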

How often to back up

You should decide whether to implement a full backup or an incremental backup strategy. Incremental backups are useful when most of your data is static; for example, a static HTML website that changes very little from day to day. A full backup is still taken periodically, e.g. once a week. During the daily incremental backups, the backup client compares your current data with that of the previous backup, and only backs up files which have changed.

Alternatively, you can simply back up the full set on each run. This can be quicker and less CPU intensive than incremental backups because the client does not need to compare your current data with previous backups. The downside is that full backups generally require greater backup capacity and bandwidth than an incremental solution.
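
As a sketch of how a weekly full and daily incremental cycle can work in practice, GNU tar can track changes with a snapshot file; the paths below are hypothetical.

    # Hypothetical paths; GNU tar's snapshot file records what has already been backed up.
    SNAPSHOT=/var/backups/site.snar

    # Weekly full backup: remove the snapshot file so every file is archived afresh.
    rm -f "$SNAPSHOT"
    tar --listed-incremental="$SNAPSHOT" -czf /var/backups/site-full.tar.gz /var/www/html

    # Daily incremental backup: only files changed since the last run are archived.
    tar --listed-incremental="$SNAPSHOT" -czf /var/backups/site-incr-$(date +%F).tar.gz /var/www/html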


Space required

As an absolute minimum, you should ensure that you have sufficient backup capacity to carry out a new full backup without first having to delete your old backups. This protects you from being caught without a backup should the new backup fail.

For example, if you have 100MB of data on a weekly full backup, with daily incrementals backing up approximately 10MB of updated files each day, then you will require at least 260MB of storage to back up your data safely – 160MB for the weekly full backup plus six daily incrementals, and a further 100MB to store the new weekly full backup before the existing backups are removed.
N.B. This does not allow for long-term backup retention or growth of your data set, both of which require additional backup capacity beyond the minimum illustrated above.
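
The same sizing can be expressed as a simple calculation; the figures below match the example above and should be replaced with your own.

    # Minimum capacity = current full backup + six daily incrementals + room for the next full backup.
    FULL_MB=100      # size of one full backup
    INCR_MB=10       # approximate size of one daily incremental
    DAYS=6           # incrementals taken between weekly full backups

    echo "Minimum capacity: $(( FULL_MB + DAYS * INCR_MB + FULL_MB ))MB"   # prints 260MB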