Store Public and Private Files/Images in Laravel: Ultimate Guide

When working with files and images in Laravel, you can store them for public or private use. In this tutorial, we will explore how to do both, covering both a local server setup and Amazon S3.


1. Local Server: Storing Public Files

1.1. /public Directory

The public/ directory is the default location where Laravel serves static assets like images. By default, any file in the public directory can be accessed by users using a URL that points to the file.

cat browser

If we put the cat.jpg image under the public/images/ folder, it can be accessed via the /images/cat.jpg URL.
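
In a Blade view, for instance, you would typically reference such a file with the asset() helper, which generates a URL pointing into the public/ directory (the file name here is just our example):

<img src="{{ asset('images/cat.jpg') }}" alt="Cat">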

Note: your web server's document root should be set to the public/ folder of your Laravel project. All files in this directory will be publicly accessible.

Note, however, that if your application writes files into the public/ directory at runtime and that directory is tracked in your version control repository, the repository will become dirty. So public/ is suitable only for static files that you do not intend to manage at the application level.

1.2. /storage/app/public Directory

Now let's put the cat.jpg file in the storage/app/public directory and try to access it.

no cat

The storage/app/public/ directory in Laravel is a directory where you can store files that are publicly accessible to users. This directory provides a convenient location to store user-uploaded files, such as images, videos, and documents.

When files are stored in the storage/app/public/ directory, they are not directly accessible by users via URL. Instead, Laravel provides a symbolic link to the files in the public/ directory, making them accessible to users through a URL.

To create a symbolic link to the storage/app/public directory, you can use the php artisan storage:link command. This will create a symbolic link from the public/storage path to the storage/app/public/ directory.
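
In the project root, run:

php artisan storage:link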

Once the symbolic link has been created, files stored in the storage/app/public/ directory can be accessed via a URL that starts with storage/.

storage cat
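
Programmatically, files are written to this location through Laravel's public disk. A minimal sketch ($contents here is a placeholder for your file data):

use Illuminate\Support\Facades\Storage;

// Store the file on the "public" disk, rooted at storage/app/public/
Storage::disk('public')->put('images/cat.jpg', $contents);

// Resolve its public URL (requires the storage:link symlink),
// e.g. APP_URL . '/storage/images/cat.jpg'
$url = Storage::disk('public')->url('images/cat.jpg');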

If you are using Laravel Valet, the /storage/ URL of your application is always public in your local environment even if you did not run the php artisan storage:link command.


2. Local Server: Storing/Downloading Private Files

There are cases where files should not be directly accessible by users via URL. These files typically include sensitive data such as financial data, personal data, or invoices.

Let's put the invoice.pdf file into the storage/app/private folder. This directory is not accessible via a URL, and files stored there are only accessible to the application.

no storage

no private
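
For reference, such a file can be written to the private location with the Storage facade. A minimal sketch, assuming the default local disk is rooted at storage/app/private, as in recent Laravel versions ($contents is again a placeholder):

use Illuminate\Support\Facades\Storage;

// Store the file on the default "local" disk, which is not web-accessible
Storage::put('invoice.pdf', $contents);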

To access private files, we can use the Storage facade, which provides a download() method like this:

routes/web.php

use Illuminate\Support\Facades\Storage;

Route::middleware('auth')->get('/download', function () {
    // There can be more logic to check if the user is eligible for a download
    $condition = true;

    if ($condition) {
        return Storage::download('invoice.pdf');
    }

    return response()->noContent();
});

In our case, the user must be logged in (enforced by the auth middleware) and, if needed, satisfy additional conditions that validate their eligibility to download the invoice.
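
As a side note, download() also accepts an optional display file name and extra response headers; the custom name below is just an illustration:

use Illuminate\Support\Facades\Storage;

// Send the file under a custom name and with an explicit content type
return Storage::download('invoice.pdf', 'invoice-2023-03.pdf', [
    'Content-Type' => 'application/pdf',
]);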


3. Remote Server: Setting Up Amazon S3

If you want to store the files separately on Amazon S3 servers, instead of your own local server, there's a bit of work to set it all up.

Before using the S3 driver, you will need to install the Flysystem S3 package via the Composer package manager:

composer require league/flysystem-aws-s3-v3 "^3.0"

The S3 driver configuration information is located in your config/filesystems.php configuration file. This file contains an example configuration array for an S3 driver and by default inherits values from the .env file. Usually, you do not need to change anything there.
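For reference, the s3 disk entry in config/filesystems.php looks roughly like this (details may vary slightly between Laravel versions):

's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    'endpoint' => env('AWS_ENDPOINT'),
    'use_path_style_endpoint' => env('AWS_USE_PATH_STYLE_ENDPOINT', false),
    'throw' => false,
],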

3.1. Create IAM User

  1. To get credentials for our Laravel app, we first need to create an IAM (Identity and Access Management) user. Navigate to the Security credentials page; the menu can be found in the top right corner by pressing your account name.

Security Credentials

  2. Then press the Add users button.

Add User

  3. Enter the user name and press Next.

Specify User Details

  4. We do not need to set any permissions now; access permissions will be configured in the bucket settings later. Keep the defaults and press Next.

Set Permissions

  5. Click Create user to finish the process.

Review and Create

  6. Now, in the users list, press the name of the user you just created.

username

  7. In the user's summary window, copy the ARN (Amazon Resource Name) for this user. It has the form arn:aws:iam::*****:user/**** and will be required when setting permissions for the bucket.

Copy User ARN

  8. Now locate the Security credentials tab below and press it.

Security Credentials Tab

  9. In the Access keys section, press Create access key. This will create credentials for our Laravel app to access the bucket.

Create Access Key

  10. Select the Application running outside AWS option and press Next.

Access Key Case

  11. Optionally, you can set a description tag for this key. Then press the Create access key button.

Set Description Tag

  12. Now save both the Access key and the Secret access key and press Done. Optionally, you can download the credentials by pressing the Download .csv file button.

The secret access key is displayed only once in this view, so if you fail to save it, you will need to create a new key.

Retrieve Access Keys

3.2. Create S3 Bucket

  1. Navigate to the S3 management console and create a new bucket by clicking Create bucket.

Create bucket

  2. Enter the bucket name for your application.

Bucket names must be globally unique across all AWS accounts, not only within your own account, and must not contain spaces or uppercase letters.

When choosing a region, note the region name, for example eu-central-1; you will need to define it in your Laravel app as well.

Create Bucket View

The rest can be left at the default settings; scroll to the bottom of the page and press Create bucket.

  3. After creating the bucket, you will be redirected to the bucket list. Press your bucket name to inspect the bucket and set additional configuration.

Bucket Created

  4. Select the Permissions tab and scroll to the Bucket policy section.

Bucket Permissions

  5. In the Bucket policy section, press the Edit button.

Bucket Policy

  6. And enter the following config:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::****:user/****"
            },
            "Action": [
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:PutObject",
                "s3:ReplicateObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR-BUCKET_NAME>",
                "arn:aws:s3:::<YOUR-BUCKET_NAME>/*"
            ]
        }
    ]
}

This configuration allows the IAM user authenticated with the access keys to list, add, retrieve, and delete objects in the bucket.

Update the Principal.AWS value with your IAM user's ARN. It should look like this: arn:aws:iam::*****:user/****.

Update the Resource section with your bucket name. We apply these permissions to two resources: the first is the bucket itself, and the second is the objects inside it.

3.3. Set Up Laravel .env Variables for S3

Update your .env file in your Laravel project by providing AWS credentials, region, and bucket name.

.env

AWS_ACCESS_KEY_ID=********
AWS_SECRET_ACCESS_KEY=********
AWS_DEFAULT_REGION=<REGION-NAME>
AWS_BUCKET=<BUCKET-NAME>
AWS_USE_PATH_STYLE_ENDPOINT=false

Now we can perform all needed operations on our new bucket using the S3 API.
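
A quick way to verify the connection is a small smoke test, for example in php artisan tinker (the file name is arbitrary):

use Illuminate\Support\Facades\Storage;

Storage::disk('s3')->put('connection-test.txt', 'hello'); // true on success
Storage::disk('s3')->get('connection-test.txt');          // "hello"
Storage::disk('s3')->delete('connection-test.txt');       // clean up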


4. Working with S3 Private Files

To upload locally stored files to S3, your code will typically look as follows:

use Illuminate\Support\Facades\Storage;

$key = 'invoices/invoice.pdf';
$contents = file_get_contents(storage_path('app/private/invoice.pdf'));

Storage::disk('s3')->put($key, $contents);

Or, if it is a controller receiving a form upload:

upload.blade.php

<form action="{{ route('invoice.store') }}" method="POST" enctype="multipart/form-data">
    @csrf
    <input type="file" name="invoice">
    <button type="submit">
        Submit
    </button>
</form>

The file can be stored on S3 straight from the request, omitting the Storage facade, as in this example:

app/Http/Controllers/InvoiceController.php

public function store(Request $request)
{
    if ($request->hasFile('invoice')) {
        $file = $request->file('invoice');

        // storeAs() accepts the directory, the file name, and the disk
        $file->storeAs('invoices', $file->getClientOriginalName(), 's3');
    }

    // ...
}
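
In a real application, you will usually want to validate the upload before storing it. A minimal sketch; the mimes and max rules here are assumptions for a PDF invoice:

// Reject anything that is not a PDF or is larger than 10 MB (max is in kilobytes)
$request->validate([
    'invoice' => ['required', 'file', 'mimes:pdf', 'max:10240'],
]);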

In Amazon S3, directories have no physical existence. Instead, each file is referred to as an "object" and is identified by its unique "object key", which consists of the file path and name.
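
For example, writing a key that contains slashes implicitly creates the "folders" as part of the key; nothing needs to be created beforehand ($contents is a placeholder):

use Illuminate\Support\Facades\Storage;

// The "invoices/2023/" prefix is part of the object key; no directory is created
Storage::disk('s3')->put('invoices/2023/invoice.pdf', $contents);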

The following example illustrates how to allow users to download files stored in the bucket's invoices/ folder by visiting the /invoices/invoice.pdf URL. Laravel then acts as a proxy between your S3 bucket and the user.

routes/web.php

Route::middleware('auth')
    ->get('/invoices/{name}', [InvoiceController::class, 'show'])
    ->name('invoice.show');

app/Http/Controllers/InvoiceController.php

public function show(string $name)
{
    // ... logic that determines whether the user may download this file

    if (! $canDownload) {
        abort(403);
    }

    $disk = Storage::disk('s3');
    $key = 'invoices/' . $name;

    if (! $disk->exists($key)) {
        abort(404);
    }

    return $disk->download($key);
}

This way all your files on Amazon S3 are private and are only accessible by logged-in users through your application.
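
Alternatively, if you prefer not to proxy file contents through your application, the S3 driver can generate pre-signed temporary URLs that expire. A minimal sketch:

use Illuminate\Support\Facades\Storage;

// Pre-signed URL valid for 5 minutes; the user downloads directly from S3
$url = Storage::disk('s3')->temporaryUrl('invoices/invoice.pdf', now()->addMinutes(5));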

5. Allowing Access to S3 Public Files

  1. To use S3 as storage for public files like images, navigate to your bucket settings, go to the Permissions tab, and in the Block public access section press the Edit button.

Bucket Permissions Public Access

  2. Untick all checkboxes and press Save changes. This will allow public objects in the bucket, but the effect is not immediate: you still need to set a policy for the public files.

Allow public access

AWS now displays the following notice: "Objects can be public. This bucket isn't public, but anyone with the appropriate permissions can grant public access to its objects."

  3. Press the Edit button in the Bucket policy section.

Bucket policy public

And add the second statement:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::****:user/****"
            },
            "Action": [
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:PutObject",
                "s3:ReplicateObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR-BUCKET_NAME>",
                "arn:aws:s3:::<YOUR-BUCKET_NAME>/*"
            ]
        },
        {
            "Sid": "Statement2",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<YOUR-BUCKET_NAME>/images/*"
        }
    ]
}

This time there is only one permission, s3:GetObject, in the Action. Notice the Principal value is *, which means everyone. The Resource restricts this public read access to the objects under the images "folder".

We can upload the same cat.jpg picture we used earlier in this tutorial with this snippet:

use Illuminate\Support\Facades\Storage;

$key = 'images/cat.jpg';
$contents = file_get_contents(public_path('images/cat.jpg'));

Storage::disk('s3')->put($key, $contents);

The image URL can be retrieved using the url() method:

$url = Storage::disk('s3')->url('images/cat.jpg');

The URL will look as follows:

"https://<BUCKET-NAME>.s3.<REGION-NAME>.amazonaws.com/images/cat.jpg"

And when we visit that URL, the image appears. Now you can use it to display your assets or user content.

Cat on Amazon

To verify that only the image files are public, and not the invoices, we can list all files in the bucket together with their URLs.

$urls = collect(Storage::disk('s3')->allFiles())->map(function ($key) {
    return [
        'key' => $key,
        'url' => Storage::disk('s3')->url($key),
    ];
});

Which will give the following collection:

[
    0 => [
        "key" => "images/cat.jpg",
        "url" => "https://<BUCKET-NAME>.s3.<REGION-NAME>.amazonaws.com/images/cat.jpg",
    ],
    1 => [
        "key" => "invoices/invoice.pdf",
        "url" => "https://<BUCKET-NAME>.s3.<REGION-NAME>.amazonaws.com/invoices/invoice.pdf",
    ],
]

All objects in the bucket have a URL, but that doesn't mean they are publicly accessible. If we try to access the invoice, the following screen shows up:

Invoice access denied


I'd love to have a part of this tutorial that teaches us how to store photos on a static server. E.g. I could have an Ubuntu server with just SSH and nginx and serve the files like this: https://static.example.com/images/001/cat.jpg


Personally I've never used it this way, I've always saved files either on the same server or on Amazon S3. Why bother creating (and maintaining!) a separate server just for photos?


I'd say the reason is the same as why bother to use S3. In my case, to have super fast, sessionless, and cheap storage for millions of photos. Maintaining such a server is so easy - much easier than S3. And cheaper.


Using something like https://github.com/minio/minio might work. Haven't tried. It is also mentioned in the Laravel docs


This MinIO looks VERY interesting. Thank you Nerijus!


My standard note for anybody coming here: Web devs should strongly consider leveraging image servers such as cloudinary or cloudflare (several others) when storing & serving image files!


Why?

  • all the variant/conversion work handled, so you can efficiently pull appropriately sized & formatted images into the site without you having to juggle files.
  • all the mime-type conversions to browser-friendly data types (e.g. WebP / AVIF, etc.) on a per-client basis handled without you having to juggle files.
  • srcset / responsive images handled without you having to juggle files.
  • images served from "cloud" (like S3 above).
  • secure direct uploads, so using something like the FilePond client, your server never acts as an intermediary during upload (as in the example above).
  • many other cool features (vary by service): serving private images (a copied URL will not work outside your site even though it's not served from your site), streaming capability, analytics, a global CDN network, etc.

There is a learning curve and expense. Even though it is pretty basic and documentation is sparse, I've settled on cloudflare w/ monthly fee of $5 for first 100K images stored (no fee for variants/conversions of your uploaded image) + $1 per 100K images served. Cheap for the money!


> all the variant/conversion work handled, ....... srcset / responsive images handled without you having to juggle files.

This is easily handled by e.g. spatie/laravel-medialibrary.

> images served from "cloud" (like S3 above).

So you pay through the nose... Doesn't look like an advantage to me.

> secure direct uploads, so using something like FilePond client your server never acts as intermediary during upload (as in example above).

I have it easily done by myself. No need for cloud abracadabra.

> many other cool features (vary by service): Ability to serve private images (copied URL will not work outside your site even though not served from your site), streaming capability, analytics, global CDN network, etc.

Private images are no problem whatsoever. CDN - yes, in this case I agree.

> There is a learning curve and expense. ...Cheap for the money!

The cloud stuff needs extra learning, so that's not true. And the Laravel/Spatie docs are quite OK, so not true either. My server costs me nothing more, as I already have it, so not true as well. Also, those 5 bucks quickly become 50 when you want fast and reliable storage, not the most basic tier, which is usually quite useless.

So I'm not buying it. I don't mean that the cloud is worthless - it surely is useful for many cases. But it is not a good solution for everything and everyone. And I definitely do not agree with your statement that every web dev should strongly consider leveraging image servers such as cloudinary or cloudflare. It all depends on the case, and usually it is not the case for people on small projects with 100-ish photos stored.


Well, it's not to have a big debate, and it sounds like you have this all in hand ... just a note that the option should be considered. And yes, the size of the project does matter.

Just FYI, one example of a "small" site I have: about 2K user-uploaded images that are served about 250K times per month in many variants. The UI/UX person can do whatever they want without having me spin up new variants, etc. About $8/month. It just works & I hardly think about it! (I view cloudflare as about as reliable as AWS, which means very much so ... probably more so than my own site!) I have used media library / S3, etc. in the past and they really are very good options, but no comparison at all for me and I will never go back.

But that's just info for folks - every dev & every project is different! You do have some cool stuff going on that would add to this guide: how do you actually do the direct-to-server (e.g. S3) uploads securely? And how do you serve tokenized (URLs expire) images from remote storage without hitting your server?


Sure thing. Knowledge does no harm - just the opposite. And it's good to consider other options. Maybe S3/CloudFlare etc. is great for this project, but your own S3 clone is better for another, and plain disk storage is better for yet another one. This should not be a religion.


For people doing this tutorial now: the layout of Amazon AWS has changed, but you will find most things. This step: 10. Select Application running outside AWS option and press Next

You should select "Other" and then it all works as you would expect. I have to say I first tried to get DigitalOcean Spaces to work, but I can't figure that out yet.

