The AWS SDK for Salesforce makes it easy for developers to access Amazon Web Services from Apex code and build robust applications using services like Amazon S3 and Amazon EC2.

Docs - github.com/mattandneil/aws-sdk
Install - /packaging/installPackage.apexp?p0=04t6g000008SbOb

Sign up, then go to your AWS Console > Security Credentials > Access Keys to generate an access key ID and secret access key.
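How the credentials are wired up depends on the SDK version; a minimal sketch, assuming a connector class that takes the access key ID and secret key (the class and constructor shown here are hypothetical, check the repository README for the current API):

// hypothetical setup - names may differ in your SDK version
AwsSdk.Connector connector = new AwsSdk.Connector('ACCESS_KEY_ID', 'SECRET_ACCESS_KEY');
AwsSdk.S3 s3 = connector.s3('us-east-1');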

Amazon Simple Storage Service (S3) SDK

The Apex client manipulates both buckets and contents. You can create and destroy objects, given the bucket name and the object key.

Create a bucket:

AWS.S3.CreateBucketRequest request = new AWS.S3.CreateBucketRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/testbucket1';
AWS.S3.CreateBucketResponse response = new AWS.S3.CreateBucket().call(request);

Add an object to a bucket:

AWS.S3.PutObjectRequest request = new AWS.S3.PutObjectRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/testbucket1/key.txt';
request.body = Blob.valueOf('test_body');
AWS.S3.PutObjectResponse response = new AWS.S3.PutObject().call(request);

View an object:

AWS.S3.GetObjectRequest request = new AWS.S3.GetObjectRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/testbucket1/key.txt';
AWS.S3.GetObjectResponse response = new AWS.S3.GetObject().call(request);

List bucket contents:

AWS.S3.ListObjectsRequest request = new AWS.S3.ListObjectsRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/testbucket1';
AWS.S3.ListObjectsResponse response = new AWS.S3.ListObjects().call(request);
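The response object exposes the parsed XML listing. A sketch of iterating the returned keys, assuming the response carries a list of content entries with a key property (the property and type names here are assumptions, verify them against the generated response classes):

// iterate bucket contents (property names assumed)
for (AWS.S3.Content item : response.contents) {
    System.debug('Found key: ' + item.key);
}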

Delete an object:

AWS.S3.DeleteObjectRequest request = new AWS.S3.DeleteObjectRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/testbucket1/key.txt';
AWS.S3.DeleteObjectResponse response = new AWS.S3.DeleteObject().call(request);

Amazon Elastic Compute Cloud (EC2) SDK

EC2 provides scalable computing capacity in the cloud. The Apex EC2 client calls services to launch instances, terminate instances, etc. The API responds synchronously, but bear in mind that the instance state transitions take time.

Describe running instances:

AWS.EC2.DescribeInstancesRequest request = new AWS.EC2.DescribeInstancesRequest();
request.url = 'https://ec2.amazonaws.com/';
AWS.EC2.DescribeInstancesResponse response = new AWS.EC2.DescribeInstances().call(request);

Launch a new instance:

AWS.EC2.RunInstancesRequest request = new AWS.EC2.RunInstancesRequest();
request.url = 'https://ec2.amazonaws.com/';
request.imageId = 'ami-08111162';
AWS.EC2.RunInstancesResponse response = new AWS.EC2.RunInstances().call(request);

Terminate an existing instance:

AWS.EC2.TerminateInstancesRequest request = new AWS.EC2.TerminateInstancesRequest();
request.url = 'https://ec2.amazonaws.com/';
request.instanceId = new List<String>{'i-01234567890abcdef'};
request.dryRun = true;
AWS.EC2.TerminateInstancesResponse response = new AWS.EC2.TerminateInstances().call(request);

Hi, I am trying to integrate Salesforce with AWS and have been facing a signature mismatch issue. Can you please help me locate where I am going wrong? The string to sign generated in Apex doesn't match the one generated by AWS, and I am not sure where I have gone wrong. Any pointers would help me understand the issue. Thanks in advance for the help!

Fixed a bug in the signature code where non-ASCII encoded characters caused the hash to fail. Latest version can be deployed here, hope this helps: https://githubsfdeploy.herokuapp.com/?owner=mattandneil&repo=aws-sdk&ref=master

Thanks very much for sharing your S3 sdk. It has helped me integrate successfully very fast. Really appreciate your work.

How to delete a file from S3?

Call DeleteObject() like this:

// destroy object
AWS.S3.DeleteObjectRequest request = new AWS.S3.DeleteObjectRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/bucket/file.ext';
AWS.S3.DeleteObjectResponse response = new AWS.S3.DeleteObject().call(request);

would you be open to adding additional examples in the documentation for the provided methods?
For example - I am struggling to implement ListContentsRequest method provided. An example
of how this works in Apex would be fantastic.
Thanks again for your fantastic work.

Here is an example using Apex code to list the contents of an S3 bucket. Best regards!

// get items in bucket
AWS.S3.ListObjectsRequest request = new AWS.S3.ListObjectsRequest();
request.url = 'https://s3.us-east-1.amazonaws.com/bucket';
AWS.S3.ListObjectsResponse response = new AWS.S3.ListObjects().call(request);

Hello,

I'm trying to access AWS Rekognition services using the EC2 SDK. I'm able to log in and describe my configured instance, but I can't find any reference on how to call other Amazon services from this connector. Is it even possible?

Thanks

Hi, I am having trouble writing a test class for this SDK. I keep getting "Test methods cannot make Web service callouts". But I am specifying the HttpCalloutMock class so I'm not sure what I'm doing wrong. Any ideas?

Salesforce namespace isolation affects HTTP callout mocks.
More info here: https://salesforce.stackexchange.com/a/18217
The fix is in the latest version, or you can guard the callout with Test.isRunningTest().
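For reference, here is a minimal test class sketch using the standard Test.setMock pattern. Note that if the SDK is installed as a managed package in a different namespace, the mock may not intercept its callouts, which is the namespace isolation issue described above. The request classes follow the examples from this post; the mock body is a placeholder:

@isTest
private class S3ClientTest {
    // fake HTTP responder - returns a stub XML body
    private class S3Mock implements HttpCalloutMock {
        public HttpResponse respond(HttpRequest req) {
            HttpResponse res = new HttpResponse();
            res.setStatusCode(200);
            res.setBody('<ListBucketResult/>');
            return res;
        }
    }

    @isTest
    static void testListObjects() {
        Test.setMock(HttpCalloutMock.class, new S3Mock());
        Test.startTest();
        AWS.S3.ListObjectsRequest request = new AWS.S3.ListObjectsRequest();
        request.url = 'https://s3.us-east-1.amazonaws.com/testbucket1';
        AWS.S3.ListObjectsResponse response = new AWS.S3.ListObjects().call(request);
        Test.stopTest();
    }
}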

Hello, thank you for providing the SDK, much appreciated! When attempting to delete files which have special characters or spaces in the key I'm getting a Signature Does Not Match error. I've tried URI Encoding the key as well but that doesn't seem to work. Any ideas or direction? Thank you!!

Thank you. To fix escaping, combine the bucket and key in a single URL:

AWS.S3.DeleteObjectRequest request = new AWS.S3.DeleteObjectRequest();
request.url = 'https://s3.region.amazonaws.com/bucket/path/to/key.ext';
AWS.S3.DeleteObjectResponse response = new AWS.S3.DeleteObject().call(request);

Hi - thank you for sharing the SDK! I was running into an error that makes me think I am doing something simple incorrectly.

String region = 's3-us-west-2';
String bucketname = 'foo';

AwsSdk.S3.Bucket bucket = connector.s3(region).bucket(bucketname);
Map<String, String> headers = new Map<String, String>{'Content-Type' => 'text/plain'};
bucket.createContent('My-Test-PNG-upload.png', headers, png);

I get an error back from the SDK that includes

s3

I know that the 'foo' bucket exists, and the error doesn't refer to it in any event. If you see this message and can point out how the bucket is viewed as 's3' instead of 'foo', I'd appreciate it!

https://aws.amazon.com/blogs/aws/amazon-s3-path-deprecation-plan-the-rest-of-the-story/

These all refer to the same object:
- bucket.s3.amazonaws.com/key.ext
- s3.amazonaws.com/bucket/key.ext
- s3.us-east-1.amazonaws.com/bucket/key.ext
- bucket.s3.us-east-1.amazonaws.com/key.ext

Merge the bucket and key into a single URL string:

AWS.S3.PutObjectRequest request = new AWS.S3.PutObjectRequest();
request.url = 'https://s3.us-west-2.amazonaws.com/foo/My-Test-PNG-upload.png';
request.contentType = 'text/plain';
request.body = Blob.valueOf('png');
AWS.S3.PutObjectResponse response = new AWS.S3.PutObject().call(request);

This SDK seems to be hitting the 6 MB governor heap limit and cannot upload files larger than approximately 4 MB. Would you be adding future support to enable larger file uploads to AWS S3 using this SDK?

Call the SDK from @Future or Queueable to double the heap size. Some more ideas:

Upload binary files to S3 using Lambda
- https://medium.com/swlh/upload-binary-files-to-s3-using-aws-api-gateway-with-aws-lambda-2b4ba8c70b8e

Apex Limits - Enhanced Futures (Pilot)
- https://releasenotes.docs.salesforce.com/en-us/summer14/release-notes/rn_apex_future_methods_enhanced.htm

Apex Limits - Enhanced Futures (Developer Blog)
- https://developer.salesforce.com/blogs/engineering/2014/06/bigger-apex-limits-enhanced-futures.html
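The future/queueable approach above can be sketched as a Queueable job that performs the upload with the larger asynchronous heap. The SDK call inside execute follows the PutObject example from this post; Database.AllowsCallouts is required for HTTP callouts from a queueable:

public class S3UploadJob implements Queueable, Database.AllowsCallouts {
    private String url;
    private Blob body;

    public S3UploadJob(String url, Blob body) {
        this.url = url;
        this.body = body;
    }

    public void execute(QueueableContext context) {
        // runs with the 12 MB asynchronous heap limit
        AWS.S3.PutObjectRequest request = new AWS.S3.PutObjectRequest();
        request.url = url;
        request.body = body;
        AWS.S3.PutObjectResponse response = new AWS.S3.PutObject().call(request);
    }
}

// enqueue from a synchronous context:
// System.enqueueJob(new S3UploadJob('https://s3.us-east-1.amazonaws.com/bucket/big.bin', data));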

Thank you very much for the code. With this code, is there any file size limitation when uploading a file to S3?

File size is limited by the Apex heap size: 6 MB in synchronous contexts and 12 MB in asynchronous contexts (batch, future, queueable).

Hi, I'm getting the below error when trying to use the region eu-west-1.
I have updated the region in the named credential endpoint and in the request URL.

IllegalLocationConstraintException: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to.

Thanks a lot, for sharing the SDK.

Thank you very much for this SDK!!! I have a problem trying to send a CSV: the content type comes through as null. Maybe it is a problem with how I generate the CSV. Can any file type be sent?
Thank you :D

The code you shared looks good, and the report URL tests fine manually too. On the Amazon side, try setting contentType = 'text/plain' on your request.

Thanks a lot.
Does it support DynamoDB?