You may already be familiar with Amazon S3, one of the most popular cost-effective storage services available today. You will want it when you are looking for:

  • Low-cost storage: this was my case when I implemented a Drupal-based web app for a local government authority. The app is used by branches in every province of the country, and they regularly upload large amounts of data (documents, photos, scans, etc.). Using the app server's storage was too expensive, so I converted the Drupal file system to Amazon S3, leaving only the core and modules on the app server.
  • Fast loading: many bloggers store their photos, videos, audio, and other files on S3 to serve their readers better. Since readers come from all over the world, serving multimedia content from S3 lets them access it much faster.
  • And many more benefits.

In this tutorial, we will show you how to convert the Drupal 7 & 8 file system to Amazon S3 and sync all existing files to S3 Storage.

1. Preparation

You will need to run several client programs such as Drush and AWSCLI, so if your site is on shared hosting, you may not be able to install and run them there. In that case, download the site to your local machine, do the configuration there, and upload the result back to your shared hosting afterwards.

The tools that I use in this tutorial are:

  • Amazon S3, of course.
  • Drush: a command-line shell and Unix scripting interface for Drupal; very convenient and useful.
  • AWSCLI: the Amazon Web Services command-line client, which I use to sync data to S3.
  • S3FS: the Drupal module for connecting to Amazon S3.

2. Install Drush:

Please follow this guide to install Drush: http://docs.drush.org/en/master/install/
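
For reference, one common route is to install Drush globally with Composer (assuming Composer itself is already installed):

composer global require drush/drush
drush --version

The second command simply confirms the installation worked.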

3. Install S3FS:

For Drupal 7, open a terminal, go to the web project folder, and execute the following commands:

drush dl s3fs
drush en s3fs
drush make --no-core sites/all/modules/s3fs/s3fs.make

The first command downloads the S3FS module. The second one enables it. The third automatically downloads the required AWSSDK library to /sites/all/libraries/awssdk2. This last command is very useful because there are many versions of AWSSDK, and only the version chosen by the make file (in my case, 2.7.5) works with this module.

If you don't have Drush, you can install the S3FS module manually, then download the AWS SDK for PHP library from GitHub and place it under /sites/all/libraries/awssdk2. Remember to choose the right version, which is specified in the /sites/all/modules/s3fs/s3fs.make file. A rough sketch of the manual download is shown below.
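
From the project root, the manual download could look like this. The version (2.7.5) and the release asset name (aws.zip) are assumptions on my part; verify both against s3fs.make and the aws/aws-sdk-php releases page on GitHub before running it:

mkdir -p sites/all/libraries/awssdk2
cd sites/all/libraries/awssdk2
wget https://github.com/aws/aws-sdk-php/releases/download/2.7.5/aws.zip
unzip aws.zip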

For Drupal 8, please use Composer to install this module; Composer will automatically install the required libraries. In the terminal, from your project root folder, run:

composer require drupal/s3fs
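
Composer only downloads the module, so enable it afterwards, for example with Drush:

drush en s3fs -y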

4. Get a key pair to access Amazon S3:

Please go to Amazon IAM and create a user with an access key. Follow this guide for more details: http://docs.aws.amazon.com/gettingstarted/latest/wah/getting-started-pre...

Then create a bucket on Amazon S3.
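
If you prefer the command line and already have AWSCLI configured (we install it in step 8 below), creating the bucket is a one-liner; the bucket name and region here are placeholders:

aws s3 mb s3://your-bucket-name --region us-west-2

Otherwise, simply create the bucket from the S3 web console.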

5. Configure S3FS access to Amazon S3

Browse to Configuration - Media - S3 File System and enter your key pair and bucket name.

Hit Save. If there is no error, the S3 connection works. Otherwise, check your keys and bucket name against the error messages.

If you want to make sure the credentials and bucket name cannot be removed by mistake, write them to the /sites/default/settings.php file:

$conf['awssdk2_access_key'] = 'YOUR ACCESS KEY';
$conf['awssdk2_secret_key'] = 'YOUR SECRET KEY';
$conf['s3fs_bucket'] = 'YOUR BUCKET NAME';
$conf['s3fs_region'] = 'YOUR REGION';

Update for Drupal 8: adding keys via the admin interface will soon be deprecated. Instead, add the configuration to the settings.php file like this:

# S3FS setting
$settings['s3fs.access_key'] = 'yourkey';
$settings['s3fs.secret_key'] = 'yourkey';
$config['s3fs.settings']['bucket'] = 'yourbucketname';
$settings['s3fs.use_s3_for_public'] = TRUE;

# if you want to use S3 for private files, uncomment the line below:
#$settings['s3fs.use_s3_for_private'] = TRUE;
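
After editing settings.php, clear the caches so Drupal picks up the new values. With Drush:

# Drupal 7
drush cc all
# Drupal 8
drush cr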

6. Configure S3FS to take over file system:

Now you may want the S3FS module to take over the public file system. Open the Advanced Configuration Options section and tick Use S3 for public:// files. Save, and the module will take care of the rest.
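
If you manage settings in code, the same switch can be flipped with Drush on Drupal 7. Note that the variable name s3fs_use_s3_for_public is my assumption based on the module's settings form; verify it with "drush vget s3fs" before relying on it:

drush vset s3fs_use_s3_for_public 1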

Return to Amazon S3 and browse your bucket; you should see a new folder named s3fs-public there (or s3fs-private if you chose private files).

7. Clear the cache and try the first file upload

Now switch to the Actions tab and hit Refresh file metadata cache.
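
If you prefer the terminal, the s3fs module also ships a Drush command for this (run "drush help" to confirm your version has it):

drush s3fs-refresh-cache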

Then upload your first file via a node creation form, for example an image for an article. Save it and check the image path; it should look like http://your-bucket-name.s3-us-west-2.amazonaws.com/s3fs-public/... Your file was successfully uploaded to Amazon S3.

If you don't see the uploaded file, your bucket is probably not public. Browse to your bucket, select the s3fs-public folder, and choose Actions - Make Public.
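
If the folder is large, Make Public in the console can take a while. An alternative, once AWSCLI is configured (step 8 below), is to copy the folder over itself while applying a public-read ACL; the bucket name is a placeholder:

aws s3 cp s3://your-bucket-name/s3fs-public s3://your-bucket-name/s3fs-public --recursive --acl public-read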

8. Sync existing files to S3

Also on the Actions tab, there is a Copy local public files to S3 button. Hit it and the upload will run as a batch process. If you don't have many existing files, just let it run; it should finish within a few hours.
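
If your s3fs version provides it (again, "drush help" will tell you), the same batch copy can also be started from the terminal:

drush s3fs-copy-local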

If you have a rather large set of existing files (I had 10 GB of public files), the method above is just too slow. With the help of AWSCLI, it becomes a piece of cake.

To install AWSCLI, follow the guide here: http://aws.amazon.com/cli/

On your shell terminal, enter:

sudo apt-get install python-pip
pip install awscli

Then configure AWSCLI:

aws configure

AWS Access Key ID [None]: YOUR ACCESS KEY
AWS Secret Access Key [None]: YOUR SECRET KEY
Default region name [None]: us-west-2
Default output format [None]: json

After that, execute this command to sync the sites/default/files folder to your S3 bucket:

aws s3 sync sites/default/files s3://mybucket/s3fs-public --exclude "*.tmp"

Let the sync process complete, then check the S3 bucket; you should see your files there.
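
To double-check from the terminal, you can list and summarize the synced objects with standard AWSCLI options:

aws s3 ls s3://mybucket/s3fs-public --recursive --summarize | tail

The summary lines at the end show the total object count and combined size.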

Wrap up

Before using the S3FS module, I had a failed experiment with the Amazon S3 module. The process was more complicated and I hit a dead end.

The S3FS module, in my experience, is much easier to configure and work with. I hope you get the same results as I did. Good luck.
