Uploading Files to Google Cloud Storage in AdonisJS

Photo by Amith Nair on Unsplash

From profile photos to PDFs, almost every web application needs a way to upload files. A good implementation usually requires significant work on both the backend and the frontend. We will leave the frontend for another article and discuss how to upload files efficiently to Google Cloud Storage in an AdonisJs project.

It is much easier to deal with file uploads in AdonisJs than in many other frameworks. However, I should say that extending the file upload and file storage functionality is not as easy as it should be. Things can get out of hand, especially if you want to upload files directly to cloud storage while they are still streaming.

Stria, and everything that makes Stria what it is, runs on Google Cloud, and Google Cloud Storage is used for file storage as well. For now, AdonisJs does not offer a native storage driver for Google Cloud Storage. In this article, we will explain in detail how to upload files to Google Cloud Storage (aka Buckets) in the AdonisJs framework.

Normally, in order to upload a file to external storage, the file first needs to be saved to a temporary directory, and then moved from the temp directory to the destination (AWS, Spaces, GCS). In that case, more than one I/O read/write occurs. Although handling file storage in AdonisJs is not easy, no external library beyond the provider's storage SDK is needed to stream the file to cloud storage.

Let's dive in!

Disabling Auto Processing

First, you need to disable the bodyparser's auto processing and handle the file (stream) manually. This is where AdonisJs is weakest when it comes to file uploads. Fortunately, AdonisJs allows us to disable auto processing for specified routes only. Thus, we can process the file stream manually and direct the stream to external file storage, which for us is Google Cloud Storage.

Auto processing can be disabled from the bodyparser config file (config/bodyparser.js). Add the routes you want to process manually as elements of the "processManually" array.

processManually: ['/upload', '/save'],

Once auto processing is disabled, you have to handle requests made to the '/upload' and '/save' routes manually. In fact, you cannot even reach the files or the fields in the request body through request.post().

Processing the Payload Manually

Because auto processing is disabled, you need to handle requests through the request.multipart() interface.

Route.post('/upload', async ({ request }) => {
  // Set the callback to process the 'profile_pic' file manually
  request.multipart.file('profile_pic', {}, async (file) => {
    console.log(file.clientName, file.stream);
  });

  // Set the callback to process fields manually
  request.multipart.field((name, value) => {
    console.log(name, value);
  });

  // Start the processing
  await request.multipart.process();
});

If you want to validate the file against rules, pass them as an object in the second parameter:

request.multipart.file('profile_pic', {
  types: ['jpeg', 'jpg', 'png', 'gif'],
  size: '3mb'
}, async (file) => {
  file.size = file.stream.byteCount;
  // Validate the file
  await file._validateFile();
  console.log(file.clientName, file.stream);
});
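Note that _validateFile() is a private method and may change between releases. If you prefer not to depend on it, the same { types, size } rules are easy to check by hand. Here is a simplified, hypothetical helper; the function name and shape are ours, not part of AdonisJs:

```javascript
// Hypothetical helper mirroring the { types, size } rules above.
// Returns null when the file passes, or an error message otherwise.
function validateUpload(clientName, byteCount, rules) {
  const extension = clientName.split('.').pop().toLowerCase();
  if (!rules.types.includes(extension)) {
    return `Invalid file extension: ${extension}`;
  }
  if (byteCount > rules.maxBytes) {
    return `File too large: ${byteCount} bytes`;
  }
  return null;
}

const rules = { types: ['jpeg', 'jpg', 'png', 'gif'], maxBytes: 3 * 1024 * 1024 };
console.log(validateUpload('avatar.png', 1024, rules));   // null (valid)
console.log(validateUpload('resume.pdf', 1024, rules));   // rejected: wrong type
```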

Uploading the File to Google Cloud Storage

Now we are ready to upload the file to Google Cloud Storage. But first, we need to do some setup in the Google Cloud Console.

  1. Select or create a GCP project

    If you already have a Cloud Platform project, you can choose it. If not, you can create one at https://console.cloud.google.com/project
    Google Cloud Platform Manage Resources
  2. Make sure that billing is enabled for your project
    1. Go to https://console.cloud.google.com/home/dashboard
    2. Select Billing from the left menu
    3. If you don’t have a linked billing account, click Link a billing account.
    Google Cloud Platform offers new members $300 in credits for 12 months.
    Click here for more information about billing and the $300 free credits.
    Google Cloud Platform also offers $3K+ in credits for startups. If you are interested, read more here.

  3. Create a bucket on Google Cloud Storage
    You can create a bucket on https://console.cloud.google.com/storage

  4. Create a service account key
    1. On the GCP Console, go to the Create service account key page. https://console.cloud.google.com/apis/credentials/serviceaccountkey
    2. From the Service account drop-down list, select New service account.
    3. Enter a name for the "key" into the Service account name field.
    4. From the Role drop-down list, select Project > Owner or specify the permission you need.
    5. Click Create. The key will be downloaded to your computer as a JSON file. Keep it secret!
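The downloaded key file is a plain JSON document. Its field names follow the standard GCP service account format (values redacted here):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "name@your-project-id.iam.gserviceaccount.com",
  "client_id": "..."
}
```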

Setting up the GCS SDK for Node.js

Now let's install the Google Cloud Storage SDK for Node.js via npm or yarn.

Via npm:

npm install --save @google-cloud/storage

Via yarn:

yarn add @google-cloud/storage

Uploading the File to the GCS Bucket

First, import the GCS SDK and then create an instance of it with the key you created in the previous step.

const { Storage } = require('@google-cloud/storage');
const storage = new Storage({
  projectId: Env.get('GCS_PROJECT_ID'),
  keyFilename: Env.get('GCS_KEY_FILE_NAME')
});

You can get the values from the environment file, as we did here. You can read more about environment files in AdonisJs in the post we published before.

Create an instance for the bucket and the destination file you want to upload to:

const gcsFile = storage.bucket('bucket-name').file('file-name'); 

Here is the most critical point:
We now pipe the stream to GCS with the help of the readable.pipe() method from Node.js's stream module.
Thus, instead of saving the file to a temp folder first and then uploading it to GCS, the file is uploaded to GCS while it is being streamed to the server, and the possible delay is avoided.

file.stream.pipe(gcsFile.createWriteStream({
  metadata: {
    contentType: file.stream.headers['content-type']
  }
}));

Finally, you can run the server and upload a file to the server from the page you've created.

File Uploaded to Bucket Via GCS Node.js SDK

That's all!  🎊🎉🥳

Bringing It All Together

One last thing to do: the .file(), .field() and .process() methods are chainable, so, bringing it all together:

Route.post('/upload', async ({ request }) => {
  const { Storage } = require('@google-cloud/storage');
  const storage = new Storage({
    projectId: 'your-project-id',
    keyFilename: 'path/to/the/key/from/step/4.5'
  });

  await request.multipart
    .file('profile_pic', {}, async (file) => {
      const gcsFile = storage.bucket('bucket-name').file('file-name');
      file.stream.pipe(gcsFile.createWriteStream({
        metadata: {
          contentType: file.stream.headers['content-type']
        }
      }));
    })
    .field((name, value) => {
      console.log(name, value);
    })
    .process();
});

In Conclusion

In AdonisJs, with the help of Node.js stream methods and the GCS SDK, it is possible to direct (pipe) the upload stream straight to a bucket on Google Cloud Storage.

Spread the word if you want others to read it! ✌🏻

This article was updated on January 13, 2019