Google Storage and WordPress. How to configure and store your uploads?

With a standard configuration, media files are stored in the same place (same server) as the rest of the WordPress files. That's fine in most cases, but with GCE it's much cheaper to store them externally. You could of course use Amazon S3 – at the moment it's more popular than Google Storage – but since we want to stick to GCP, we'll use Google Storage.

Why use Google Storage?

  • First of all, it's much cheaper than storing files on a GCE instance. See the pricing.
  • Imagine you run many servers, each serving the same instance of your blog/app. You could sync all the uploads between the servers, but that wouldn't make much sense. Instead, you can simply store all the files in one place and refer to them from anywhere (any server). There's no need to sync anything.


Unfortunately WordPress doesn't provide a native solution. After some googling I found two approaches that looked sufficient:

  1. Use a plugin
  2. Configure and use the Cloud Storage FUSE

The plugin

I found a couple of plugins that worked, but none of them met my needs. WP-Stateless – Google Cloud Media Storage worked fine and was easy to set up. However, I wasn't able to use a custom URL for my files, which means it always uses your original bucket URL to display/link files. I wanted to keep all the links within my domain.

I hope someday someone will update that plugin (or create a new one). If that happens I'll definitely switch to it, as it's much simpler than FUSE.

Cloud Storage FUSE

As per documentation:

Cloud Storage FUSE is an open source FUSE adapter that allows you to mount Google Cloud Storage buckets as file systems on Linux or OS X systems. It also provides a way for applications to upload and download Google Cloud Storage objects using standard file system semantics.

In other words, we create a mount point on our local file system that is backed by a shared Google Storage bucket. It automatically synchronises all the files when you upload/remove/edit them, and there's no need for any third-party service.
Our media file URLs will still point at our own domain (e.g. https://yourdomain.com/wp-content/uploads/<file>) but the files themselves will live in GCS.

The downside compared to the plugin is that there’s some server-side work involved. But don’t worry, it’s not as hard as it sounds.


So let's get started. There are four steps to make this work:

  1. Create and configure a new bucket (Google Storage)
  2. Allow read/write Storage access for the VM
  3. Install and configure Cloud Storage Fuse
  4. Mount the uploads directory

The bucket

First, go to the Storage page in the console.
Google Storage menu

You should see a bucket there – we already created it while migrating our WordPress database to Cloud SQL. Just remove the .sql file if you haven't done that already. Let's now allow all visitors to read the bucket's content. To do that, click the 3 vertical dots next to your bucket, select “Edit bucket permissions” -> “Add item” and enter:

  • Entity: User
  • Name: allUsers
  • Access: Reader

Bucket settings dropdown Bucket permissions form
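If you prefer the command line, the same permission can be granted with gsutil. This is a sketch assuming your bucket is named “gcp-blog” (the example name used later in this post):

```shell
# Grant every visitor (allUsers) read access to objects in the bucket.
# Replace "gcp-blog" with your actual bucket name.
gsutil iam ch allUsers:objectViewer gs://gcp-blog
```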

VM Storage read/write access

By default your VM has limited Storage access and can only read. It's a bit tricky, and it took me a while to figure out why my files weren't uploading to the bucket. To fix it, go to your VM details, stop the instance and then click “Edit”. Scroll down to “Cloud API access scopes”, select “Read Write” for Storage and save. Then start your VM again.

GCE instance Storage preferences

This will enable your VM to both read from and write to the bucket.
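The same scope change can also be done with gcloud. This is a sketch assuming an instance named “wordpress-1” in zone “us-central1-a” (adjust both for your setup); as in the console, the instance has to be stopped first:

```shell
# Scopes can only be changed while the instance is stopped
gcloud compute instances stop wordpress-1 --zone us-central1-a

# Grant the read/write Storage scope
gcloud compute instances set-service-account wordpress-1 \
  --zone us-central1-a \
  --scopes storage-rw

# Start the instance again
gcloud compute instances start wordpress-1 --zone us-central1-a
```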

Cloud Storage Fuse Preparation

Now it's time to install Cloud Storage FUSE. The installation is pretty straightforward and described here. Let's do it now. First you need to SSH into your VM. See here how to do it.

Now you need to run a few commands:

export GCSFUSE_REPO=gcsfuse-`lsb_release -c -s`
echo "deb https://packages.cloud.google.com/apt $GCSFUSE_REPO main" | sudo tee /etc/apt/sources.list.d/gcsfuse.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

sudo apt-get update
sudo apt-get install gcsfuse

At this point gcsfuse is installed and ready to be used.

Mount the uploads directory

Now it's time to mount the uploads directory so it synchronises with the GCS bucket.

First you need to enable the allow_other option for FUSE. Run sudo nano /etc/fuse.conf, uncomment the line containing user_allow_other and save. The file should look like this:

# /etc/fuse.conf - Configuration file for Filesystem in Userspace (FUSE)
# Set the maximum number of FUSE mounts allowed to non-root users.
# The default is 1000.
#mount_max = 1000
# Allow non-root users to specify the allow_other or allow_root mount options.
user_allow_other
Now run:

# Go to the local WordPress directory which contains uploads directory
cd /var/www/html/wp-content/
sudo chmod a+w uploads
gcsfuse --dir-mode "777" -o allow_other gcp-blog uploads

The last line mounts the directory: “gcp-blog” is the name of your bucket and “uploads” is the local directory that will be used to sync files and “fake” the local path. The remaining options are needed to mount it with the right permissions.

You should see this in the console:

Using mount point: /var/www/html/wp-content/uploads
Opening GCS connection...
Opening bucket...
Mounting file system...
File system has been successfully mounted.

Try to upload a new media file in your dashboard. It should work, and all the files should be sent to your bucket.
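To double-check that the mount works end to end, you can also write a test file through the mounted directory and confirm it shows up as an object in the bucket (the file name here is just an example; “gcp-blog” is the example bucket name from above):

```shell
# Create a file through the mounted uploads directory...
echo "hello" | sudo tee /var/www/html/wp-content/uploads/gcsfuse-test.txt

# ...and confirm it now exists as an object in the bucket
gsutil ls gs://gcp-blog/gcsfuse-test.txt

# Clean up the test file (this also removes the object)
sudo rm /var/www/html/wp-content/uploads/gcsfuse-test.txt
```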

There's one more thing to be done. If you restarted your instance, the mount would be gone, because gcsfuse doesn't mount automatically.

Run sudo nano /etc/fstab and add this at the end of the file:

gcp-blog /var/www/html/wp-content/uploads gcsfuse rw,allow_other,dir_mode=777

“gcp-blog”, as before, is your bucket name. This entry does the same as the manual “gcsfuse” command above, but at system startup.
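You can verify the fstab entry without rebooting: unmount the directory, then let mount re-read /etc/fstab. Any error from the second command points at a mistake in the entry:

```shell
# Unmount the gcsfuse mount first
sudo umount /var/www/html/wp-content/uploads

# Mount everything listed in /etc/fstab, including our new entry
sudo mount -a

# Confirm the gcsfuse mount is back
mount | grep gcsfuse
```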

That’s it! Now all your files will be uploaded straight to the bucket and read from there.

6 thoughts on “Google Storage and WordPress. How to configure and store your uploads?”

  1. Could you explain how to set the SSH keys to access the GCE instance from a program like Panic's Coda integrated SSH client, or any Linux SSH client that supports authentication and encryption with key pairs? Because I'm still using the Google Console, I still don't get whether my username in the VM is my name, name@localhost, name@instance_name or the fully qualified email with which I sign into Google Cloud Platform. By the way, I am using Debian.

    Thanks a lot.

  2. I followed the exact instructions, and when I open the website it shows a “file not found” 404 error for all images. Are the instructions OK for https-enabled image URLs?

    1. To be honest, I didn't need to do anything for https – I have only done it for http. So unfortunately there's not much I can help with regarding https, but if you find the solution please let me know and I'll update the post.

  3. This worked right off the bat.
    I ran the script as root in the / folder of my application.

    Adding the allUsers permission, though, is not needed and maybe even not recommended.
    Having allUsers set to Reader (now called Storage Legacy Object Reader) makes all files available via the URL, ignoring any .htaccess directives that may be present. In fact, it allows downloading the .htaccess itself as well. So if someone knows the URL of a file (a profile picture, an uploaded Gravity Forms attachment), this could open up some doors. Skipping that step means a visitor can only access the file by going through your website URL, which involves your webserver, security settings etc., and returns a 403 if they try the sneaky way around.

  4. It seems I was able to do it only if I allow public write access. Am I missing something?
