Now that we have some features and the project is coming along - we should think about the worst...
What if we lose the data?
Like with the code in GitHub - we should also create database backups!
Seriously, backups are one of those things we don't think about until it's too late. So, let's get ahead of the curve and add backups to our project.
Adding Backups
For our backups, we will use the good old Spatie Laravel Backup package. It's tested, reliable, and easy to use.
First, let's install the package:
composer require spatie/laravel-backup
Then publish the configuration file:
php artisan vendor:publish --provider="Spatie\Backup\BackupServiceProvider"
And that's it for the installation. Now, we need to configure the package.
Configuring Backups
We need to edit the config/backup.php file to configure our backups. In this case, we want to add s3 as an additional disk to store our backups:
config/backup.php
// ...

/*
 * The disk names on which the backups will be stored.
 */
'disks' => [
    'local',
    's3',
],

// ...
Note: Don't forget to configure your S3 credentials in the .env file.
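For reference, the S3 credentials in the .env file typically look like this. The variable names below match Laravel's default s3 disk in config/filesystems.php, and the values are placeholders:

AWS_ACCESS_KEY_ID=your-key-id
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_DEFAULT_REGION=eu-west-1
AWS_BUCKET=your-backup-bucket

If you haven't used the s3 disk in this project before, you may also need to install the Flysystem S3 driver:

composer require league/flysystem-aws-s3-v3 "^3.0"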
Other settings are personal preference, but we recommend setting up the following:
- Email notifications - to get notified whether a backup succeeded or failed
- Slack notifications - to get notified in a Slack channel
- Review the cleanup strategy - to keep only the backups you actually need
These will allow you to keep track of your backups and ensure they work as expected.
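For example, the email and Slack settings live in the notifications section of config/backup.php. Here is an illustrative excerpt - the exact keys can differ slightly between package versions, the email address is a placeholder, and BACKUP_SLACK_WEBHOOK_URL is simply an env variable name we chose for this example:

config/backup.php

// ...
'notifications' => [
    // ...
    'mail' => [
        'to' => 'admin@example.com',
    ],
    'slack' => [
        'webhook_url' => env('BACKUP_SLACK_WEBHOOK_URL', ''),
    ],
],
// ...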
Running Backups
Now, it's time to make our backups run automatically. Since Laravel 11, we can add our Schedule commands to the routes/console.php file:
routes/console.php
use Illuminate\Foundation\Inspiring;
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Schedule;

Artisan::command('inspire', function () {
    $this->comment(Inspiring::quote());
})->purpose('Display an inspiring quote')->hourly();

Schedule::command('backup:clean')->daily()->at('01:00');
Schedule::command('backup:run')->daily()->at('01:30');
This will clean up the old backups at 01:00 and create a new backup at 01:30 every day.
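To double-check that both commands are registered, we can list the schedule:

php artisan schedule:list

This prints every scheduled command together with its next run time.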
How long old backups are kept is configured in the config/backup.php file:
// ...

'cleanup' => [
    /*
     * The strategy that will be used to cleanup old backups. The default strategy
     * will keep all backups for a certain amount of days. After that period only
     * a daily backup will be kept. After that period only weekly backups will
     * be kept and so on.
     *
     * No matter how you configure it the default strategy will never
     * delete the newest backup.
     */
    'strategy' => \Spatie\Backup\Tasks\Cleanup\Strategies\DefaultStrategy::class,

    'default_strategy' => [
        /*
         * The number of days for which backups must be kept.
         */
        'keep_all_backups_for_days' => 7,

        /*
         * After the "keep_all_backups_for_days" period is over, the most recent backup
         * of that day will be kept. Older backups within the same day will be removed.
         * If you create backups only once a day, no backups will be removed yet.
         */
        'keep_daily_backups_for_days' => 16,

        /*
         * After the "keep_daily_backups_for_days" period is over, the most recent backup
         * of that week will be kept. Older backups within the same week will be removed.
         * If you create backups only once a week, no backups will be removed yet.
         */
        'keep_weekly_backups_for_weeks' => 8,

        /*
         * After the "keep_weekly_backups_for_weeks" period is over, the most recent backup
         * of that month will be kept. Older backups within the same month will be removed.
         */
        'keep_monthly_backups_for_months' => 4,

        /*
         * After the "keep_monthly_backups_for_months" period is over, the most recent backup
         * of that year will be kept. Older backups within the same year will be removed.
         */
        'keep_yearly_backups_for_years' => 2,

        /*
         * After cleaning up the backups remove the oldest backup until
         * this amount of megabytes has been reached.
         * Set null for unlimited size.
         */
        'delete_oldest_backups_when_using_more_megabytes_than' => 5000,
    ],
],

// ...
This controls how many backups are kept and for how long. For example, 7 days of daily backups allows you to roll back a week if something goes wrong.
Note: These settings depend on your project requirements. Pick a time that is least likely to interfere with your users and configure the frequency of the backups according to your traffic/data usage.
Configuring our Server
Okay, now that we've pushed our code to the server, we also need to configure the backups there. So, let's open Forge and change a few things:
1 - Add the Credentials to the .env file
This can be done via the Forge UI or by SSHing into the server and editing the .env file directly.
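For example, a minimal terminal flow could look like this - the server IP and the site directory are placeholders:

ssh forge@YOUR_SERVER_IP
cd DOMAIN
nano .env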
2 - Add the Schedule
To do this, we must open our server in Forge and go to the Scheduler navigation item. There, we can add the commands to run the backups:
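For example, the two entries could look roughly like this, mirroring what we scheduled in routes/console.php - the full artisan path is a placeholder (see the note below about your site's directory):

php /home/forge/default/artisan backup:clean
php /home/forge/default/artisan backup:run

with the frequency set to daily at 01:00 and 01:30, respectively.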
Once created, we should see it on the list:
Note: Keep in mind that your path might be different. Instead of home/forge/default, it will be home/forge/DOMAIN.
That's it! Our backups should now automatically run on the server.
Testing Backups
Now, here comes the most critical part - testing our backups. We can do this by running the backup command manually:
php artisan backup:run
This will run the backups immediately, and we can see if everything is working as expected:
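The package also ships with an overview command that shows the backups on every configured disk, which is handy for a quick sanity check:

php artisan backup:list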
Now, we can find our backup file:
Extract that Zip file and check if the database dump is there:
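A minimal sketch of that check, assuming the archive was downloaded as backup.zip - the dump usually sits in a db-dumps folder inside the archive, named after your database:

unzip backup.zip -d backup-contents
ls backup-contents/db-dumps/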
And now the crucial part - restore the database from backup!
If you don't test your backups, you might as well not have them. So make sure to test them regularly and ensure that they are working as expected. We recommend doing this at least once a month, but the more often, the better.
The backup restoration process should be simple (see the sketch after this list):
- Create a new database
- Import the database dump from the backup
- Update .env with the new database credentials
- Confirm that the data is there and is identical
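Here is a minimal sketch of those steps for MySQL - the database name, user, and dump file name are placeholders:

mysql -u forge -p -e "CREATE DATABASE restored_db"
mysql -u forge -p restored_db < db-dumps/mysql-your_database_name.sql

Then point the application at the restored copy in .env:

DB_DATABASE=restored_db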
That's it! This simple process lets you sleep better at night, knowing your data is safe.
Our system is now safe from data loss, so let's move on to the live server! In the next lesson, we will deploy a production instance of our application and prepare a checklist for the deployment process. This will ensure that we remember everything and that the deployment goes smoothly.
Hi, if I use Spatie Media Library with Amazon Web Services S3, would the images be backed up as well? Also, wouldn't it be better to use a different provider for the backups, so that the backup provider is separate from any other provider?
By default, S3 images will not be backed up. You need to configure that separately. And I'm not sure if this is supported in this package, since it would mean moving huge files between two services. For that, there should be better processes than using PHP to run them :)
Thanks Modestas,
I understand that backing up the media has to be a separate process. Could you point us in some direction based on your experience? Another S3 bucket with an "internal S3 script" that copies between buckets? Another third-party backup service? Is an incremental process (like Apple's Time Machine) possible, and does it exist?
I'm not that experienced in this, so sorry, can't 100% recommend anything.