
Uploading Files to S3 Using Presigned URLs

December 30, 2020
by Cloud Peritus

Introduction

If your end users need to upload a large volume of files, or extremely large files, then Salesforce may not be the ideal file storage solution for your business due to its file size limitations and cost. But if your end users primarily interact with your Salesforce org, the ideal user experience is still to let them upload files through Salesforce, even if the files are not stored there. I recommend first doing some analysis to see whether a viable solution is available on AppExchange, but if none meets all your requirements, or the options are too costly, then implementing your own custom solution may be the path forward. This article covers just one of many possible custom solutions, using Lightning Web Components, presigned URLs, and AWS.

What is a presigned URL?

A presigned URL allows users to download from or upload to S3 even if they don't personally have the permissions to access that S3 bucket, as long as the creator of the presigned URL does. A presigned URL only allows the client to upload or download a specific object and expires after a set amount of time.
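
Concretely, a presigned URL is just the object's S3 URL with the authentication details appended as query parameters, roughly of this shape (the bucket, key, and values here are placeholders):

https://your-bucket.s3.us-east-1.amazonaws.com/myfile.pdf?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=...&X-Amz-Date=...&X-Amz-Expires=300&X-Amz-SignedHeaders=host&X-Amz-Signature=...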

An example use case: you have a client application that your customers can upload files from, but they do not (and should not) have direct access to your S3 bucket. A service application generates a presigned URL, which the client application then uses to upload to (or download from) your S3 bucket. It is probably unlikely that you would both generate the presigned URL and upload with it in Salesforce, but for the sake of this article I'll showcase how to do both here. If your requirements are simpler, you can easily remove the presigned URL part and upload directly to S3.

Setting up AWS

The first step is to create an AWS account. You can use the AWS Free Tier for development and testing purposes, but make sure you stay within its limits to avoid incurring charges.

  1. Create an S3 bucket. This bucket should not allow public access.
  2. Create an IAM user. These are the credentials Salesforce will use to generate the presigned URL. Make sure the user's policy allows uploading files into the S3 bucket you created in the step above.
  3. To add additional security, you can update the CORS configuration on your S3 bucket to ensure that uploads can only be initiated from your Salesforce org (a sample configuration is shown below).
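
For example, a CORS configuration along these lines restricts browser-based requests to those originating from your org's domain. This is only a sketch; the domain shown is a placeholder that you would replace with your own My Domain URL.

[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["PUT", "GET"],
        "AllowedOrigins": ["https://yourdomain.lightning.force.com"],
        "ExposeHeaders": []
    }
]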

Setting up Salesforce

For this article, we'll go with a simple LWC: a file upload component that you can put on any record page.

1. Create a custom LWC called s3FileUploadComponent. Sample HTML below.

<template>
    <lightning-card variant="Narrow">
        <div class="slds-p-around_medium">
            <lightning-input type="file" onchange={handleFile}></lightning-input>
            {fileName}
        </div>
        <div class="slds-p-around_medium">
            <lightning-button class="slds-m-top--medium" label="Upload File" onclick={uploadToS3} variant="brand">
            </lightning-button>
        </div>
    </lightning-card>
</template>

2. Download the AWS SDK for JavaScript and upload it into your Salesforce org as a Static Resource.

3. Import the static resource in your component's JavaScript file, then load the SDK and initialize the S3 client in renderedCallback().
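
A minimal sketch of that setup is shown below. It assumes the SDK was uploaded as a static resource named awsSdk, and the credentials, region, and bucket name are placeholders that you would load from your own secure configuration (for example via Apex from protected custom metadata) rather than hard-code.

import { LightningElement } from 'lwc';
import { loadScript } from 'lightning/platformResourceLoader';
// "awsSdk" is a placeholder static resource name
import AWS_SDK from '@salesforce/resourceUrl/awsSdk';

export default class S3FileUploadComponent extends LightningElement {
    fileName;
    fileToUpload;
    s3;
    sdkInitialized = false;

    renderedCallback() {
        if (this.sdkInitialized) {
            return;
        }
        this.sdkInitialized = true;
        loadScript(this, AWS_SDK)
            .then(() => {
                // The SDK exposes a global AWS object once loaded.
                // Binding the bucket here means later calls such as getSignedUrl
                // do not need to pass Bucket explicitly.
                this.s3 = new AWS.S3({
                    accessKeyId: '<ACCESS_KEY_ID>',         // placeholder
                    secretAccessKey: '<SECRET_ACCESS_KEY>', // placeholder
                    region: 'us-east-1',                    // placeholder
                    params: { Bucket: '<BUCKET_NAME>' }     // placeholder
                });
            })
            .catch((error) => {
                console.error('Error loading the AWS SDK', error);
            });
    }
}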

4. When a file is dragged and dropped or selected, we store the file name, which will be used for the presigned URL and displayed in the UI.

handleFile(event) {
    if (event.target.files.length > 0) {
        // Keep a reference to the selected file and its name for later use
        this.fileToUpload = event.target.files[0];
        this.fileName = event.target.files[0].name;
    }
}

5. When the end user clicks on the Upload button, we will generate the presigned URL.

if (this.fileToUpload) {
    // Generate a presigned URL for the upload.
    // The bucket does not need to be passed here because it was bound when the S3 client was created.
    const presignedUrl = this.s3.getSignedUrl('putObject', {
        Key: this.fileToUpload.name,
        ContentType: this.fileToUpload.type,
        Expires: 300, // optional: expiry in seconds (the SDK default is 15 minutes)
        Metadata: {}  // optional
    });
}

6. Once the presigned URL is generated, we'll issue a PUT request to that URL to upload the file to S3.

// Upload the file using the presigned URL
fetch(presignedUrl, {
    method: 'PUT',
    // The Content-Type must match the ContentType used to sign the URL,
    // otherwise S3 will reject the request.
    headers: { 'Content-Type': this.fileToUpload.type },
    body: this.fileToUpload
}).then((response) => {
    if (response.status === 200) {
        // Success logic here
    }
});

7. When we get a response from S3 that the file is uploaded successfully, throw a toast message to the end user.

// Requires: import { ShowToastEvent } from 'lightning/platformShowToastEvent';
this.dispatchEvent(
    new ShowToastEvent({
        title: 'Success',
        message: 'File uploaded to S3',
        variant: 'success'
    })
);

This is a very simplified version of the code, just to give you an idea of what is possible. At a minimum you should add exception handling and a progress bar, especially if you are dealing with large files; end users should have some indication of whether their files are actually uploading.

Tip: In order to get the progress of the upload, you would need to use XMLHttpRequest instead of fetch().
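
A minimal sketch of that approach, using the same presignedUrl and this.fileToUpload as the earlier snippets (this.uploadProgress is a hypothetical property you could bind to a lightning-progress-bar):

const xhr = new XMLHttpRequest();
xhr.open('PUT', presignedUrl);
xhr.setRequestHeader('Content-Type', this.fileToUpload.type);

// Report upload progress so the UI can drive a progress indicator
xhr.upload.onprogress = (event) => {
    if (event.lengthComputable) {
        this.uploadProgress = Math.round((event.loaded / event.total) * 100);
    }
};

xhr.onload = () => {
    if (xhr.status === 200) {
        // Success logic here
    }
};
xhr.onerror = () => {
    // Error handling here
};

xhr.send(this.fileToUpload);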

Additional Considerations:

  • You may want to take additional steps to increase security when setting AWS credentials in the browser. See here: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/setting-credentials-browser.html
  • You could also consider generating the presigned URLs outside of Salesforce, for example with a Lambda function or a web service hosted on Heroku, which would then send the presigned URL back to Salesforce.
  • There's a limit of 5 GB for a single PUT upload to S3. If you need to upload files larger than that, or just want to improve performance overall, consider a multipart upload instead, which is essentially splitting your file into smaller parts and combining them once all the parts have been uploaded.
  • If you want to use multipart upload with presigned URLs, you will have to create a presigned URL for each part (see the sketch after this list).
  • There are a lot of enhancements you can consider to make this solution even better, such as logging each file upload by creating a record in a custom object where you can track the upload status. You could go a step further and use that custom record to help generate a presigned URL for the end user to download a specific file. If you're using multipart upload, you can customize the solution further to allow resuming failed uploads or deleting uploaded files from S3.
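
A rough sketch of that multipart flow, using the same s3 client as above. Here splitFileIntoParts is a hypothetical helper that slices the file into Blobs (each part except the last must be at least 5 MB); this is illustrative only, not production-ready code.

async uploadMultipart(file, key) {
    // Hypothetical helper that slices the file into an array of Blobs
    const partBlobs = this.splitFileIntoParts(file);

    // 1. Start the multipart upload
    const { UploadId } = await this.s3
        .createMultipartUpload({ Key: key, ContentType: file.type })
        .promise();

    // 2. Upload each part with its own presigned URL
    const parts = [];
    for (let i = 0; i < partBlobs.length; i++) {
        const partNumber = i + 1;
        const partUrl = this.s3.getSignedUrl('uploadPart', {
            Key: key,
            UploadId,
            PartNumber: partNumber
        });
        const response = await fetch(partUrl, { method: 'PUT', body: partBlobs[i] });
        // S3 returns an ETag for each part, which is needed to complete the upload.
        // Your bucket's CORS configuration must expose the ETag header for the browser to read it.
        parts.push({ PartNumber: partNumber, ETag: response.headers.get('ETag') });
    }

    // 3. Tell S3 to assemble the uploaded parts into a single object
    await this.s3
        .completeMultipartUpload({
            Key: key,
            UploadId,
            MultipartUpload: { Parts: parts }
        })
        .promise();
}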

Please reach out to Cloud Peritus if you’re interested in learning more.

Keywords: Salesforce, Salesforce Development, LWC, Lightning Web Component, AWS, S3

Authors

Kevin Chan

System & Application Architect at Cloud Peritus with 22x certifications and expertise in designing and implementing solutions on the Salesforce platform.