Backups are an important part of maintaining any web platform, so it is important to use the correct backup approach for each scenario. Website content should be treated no differently: it consists of many files whose loss, corruption or unavailability can cause a disaster. I manage a number of different websites and use a backup solution for each one to ensure that the sites are recoverable in the event of a disaster.
Although there are many third-party options available for website backups, I found that most tended to be bloated, cumbersome or full of features I felt were unnecessary, so I decided to write my own. I wrote a simple Bash script that meets the basic need of backing up a website's entire web content. The script can be run as a cron job to back up the content automatically at set intervals, and it automatically removes any backups older than a set retention period to avoid consuming unnecessary resources. It also gives you control over the backup destination, so you can send backups to any directory, such as one on a remote server, to provide further redundancy.
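The idea can be sketched as follows. This is not the script itself, just a minimal illustration of the same approach: create a timestamped tar archive of the web root in a configurable backup directory, then prune archives older than a retention window. The paths, site name and retention value are all hypothetical; so the sketch runs anywhere, it builds a throwaway web root in a temporary directory, where in practice WEB_ROOT would point at something like /var/www/example.com and BACKUP_DIR at a local directory or a mounted remote share.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Demo setup (hypothetical): a throwaway web root so the sketch runs anywhere.
WORK="$(mktemp -d)"
WEB_ROOT="$WORK/example.com"        # in practice: the site's document root
BACKUP_DIR="$WORK/backups"          # any local dir or mounted remote share
RETENTION_DAYS=30                   # delete archives older than this

mkdir -p "$WEB_ROOT" "$BACKUP_DIR"
echo '<h1>Hello</h1>' > "$WEB_ROOT/index.html"

# 1. Create a timestamped gzip archive of the entire web root.
SITE_NAME="$(basename "$WEB_ROOT")"
ARCHIVE="$BACKUP_DIR/$SITE_NAME-$(date +%Y%m%d-%H%M%S).tar.gz"
tar -czf "$ARCHIVE" -C "$(dirname "$WEB_ROOT")" "$SITE_NAME"

# 2. Prune archives older than the retention window.
find "$BACKUP_DIR" -name "$SITE_NAME-*.tar.gz" -mtime +"$RETENTION_DAYS" -delete

ls "$BACKUP_DIR"
```

Scheduling it is then a single crontab entry, for example `0 2 * * * /usr/local/bin/backup-site.sh` to run the backup every night at 2am.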