Saturday, December 21, 2024

Migrating to AWS JavaScript SDK v3: Lessons Learned

There’s work coming your way! Node.js 16 reached end-of-life on September 11th, 2023. Also, the AWS Lambda runtime environment for Node.js 18 upgraded to v3 of the AWS SDK for JavaScript. So to upgrade Lambda functions from Node.js 16 to 18, you have to migrate to v3 of the AWS JavaScript SDK as well. Unfortunately, v3 is not backward compatible with v2. In the following, I will share what I stumbled upon while upgrading many Lambda functions to v3.

When upgrading the AWS JavaScript SDK from v2 to v3, you should keep the official upgrading notes close at hand.

Import and Client

The first step is to import the SDK and initialize a client.

Old (v2)

v2 provided CommonJS modules only. This is how to import the SDK, using the SQS client as an example.

const AWS = require('aws-sdk');
const sqs = new AWS.SQS({apiVersion: '2012-11-05'});

New (v3)

With v3, there are two options to import the SDK. Here is how to import the SQS client using ES modules.

Native JavaScript modules, or ES modules, are the modern approach to splitting JavaScript programs into separate modules.

import { SQSClient } from '@aws-sdk/client-sqs';
const sqs = new SQSClient({apiVersion: '2012-11-05'});

By default, Lambda functions use CommonJS modules. To use ES modules, use the file suffix .mjs instead of .js, or set type to module in the package.json.
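For example, this is what a minimal Lambda handler written as an ES module could look like. The file name index.mjs, the event handling, and the queue URL are placeholders for illustration:

// index.mjs: the .mjs suffix makes Node.js treat this file as an ES module
import { SQSClient, SendMessageCommand } from '@aws-sdk/client-sqs';
const sqs = new SQSClient({apiVersion: '2012-11-05'});

export const handler = async (event) => {
  // placeholder queue URL for illustration
  await sqs.send(new SendMessageCommand({
    QueueUrl: 'https://sqs.eu-west-1.amazonaws.com/111111111111/demo',
    MessageBody: JSON.stringify(event)
  }));
};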

In case you want to stick with CommonJS modules to avoid rewriting larger parts of your code, this is how to import the SQS client instead.

const { SQSClient } = require('@aws-sdk/client-sqs');
const sqs = new SQSClient({apiVersion: '2012-11-05'});

Commands instead of methods

AWS decided to use a command-style approach for v3 of the AWS JS SDK. So instead of calling methods, you send commands. Unfortunately, this requires rewriting a lot of code.

Old (v2)

Instead of calling listContainerInstances(...) …

const AWS = require('aws-sdk');
const ecs = new AWS.ECS({apiVersion: '2014-11-13'});

ecs.listContainerInstances({
  cluster: 'demo',
  status: 'ACTIVE'
});

New (v3)

… send a ListContainerInstancesCommand. Luckily, the parameters stay the same.

const { ECSClient, ListContainerInstancesCommand } = require('@aws-sdk/client-ecs');
const ecs = new ECSClient({apiVersion: '2014-11-13'});

ecs.send(new ListContainerInstancesCommand({
  cluster: 'demo',
  status: 'ACTIVE'
}));

Promise

How to wait for results from AWS? I prefer using promises with the help of the async/await syntax.

Old (v2)

v2 uses callbacks by default. Therefore, it was necessary to append promise() to every method call.

const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

async function handler() {
  await s3.getObject({
    Bucket: 'demo',
    Key: 'hello.txt'
  }).promise();
}

New (v3)

v3 uses promises by default.

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

async function handler() {
  await s3.send(new GetObjectCommand({
    Bucket: 'demo',
    Key: 'hello.txt'
  }));
}

Callback

Do you prefer callbacks? Or do you want to avoid rewriting code?

Old (v2)

As mentioned above, v2 defaults to callbacks.

const AWS = require('aws-sdk');
const iam = new AWS.IAM({apiVersion: '2010-05-08'});

iam.deleteAccountPasswordPolicy({}, (err, data) => {
  if (err) {
    console.log(err);
  }
});

New (v3)

But using callbacks is quite simple with v3 as well. The send(...) method accepts a callback function as the 2nd parameter.

const { IAMClient, DeleteAccountPasswordPolicyCommand } = require('@aws-sdk/client-iam');
const iam = new IAMClient({apiVersion: '2010-05-08'});

iam.send(new DeleteAccountPasswordPolicyCommand({}), (err, data) => {
  if (err) {
    console.log(err);
  }
});

Error handling

When things go wrong, handling errors is critical.

Old (v2)

With v2, the code property of the error contains the error code.

const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

try {
  await s3.getObject({
    Bucket: 'demo',
    Key: 'hello.txt'
  }).promise();
} catch (err) {
  if (err.code === 'NoSuchKey') {
    // handle the missing object
  }
}

New (v3)

With v3, use the name property of the error instead.

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

try {
  await s3.send(new GetObjectCommand({
    Bucket: 'demo',
    Key: 'hello.txt'
  }));
} catch (err) {
  if (err.name === 'NoSuchKey') {
    // handle the missing object
  }
}
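In addition, v3 exports modeled exception classes for many service errors. For example, @aws-sdk/client-s3 exports NoSuchKey, so an instanceof check works as an alternative to comparing the name string:

const { S3Client, GetObjectCommand, NoSuchKey } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

try {
  await s3.send(new GetObjectCommand({
    Bucket: 'demo',
    Key: 'hello.txt'
  }));
} catch (err) {
  if (err instanceof NoSuchKey) {
    // handle the missing object
  } else {
    throw err;
  }
}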

S3 multi-part upload

Splitting large files into multiple parts when uploading them to S3 is essential to improve performance.

Old (v2)

The S3 client shipped with the high-level method upload(...), which handles multi-part uploads.

const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

await s3.upload({
  Bucket: 'demo',
  Key: 'heavy.tar',
  Body: body
}).promise();

New (v3)

With v3, AWS moved that functionality from the S3 client into the separate module @aws-sdk/lib-storage.

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const s3 = new S3Client({apiVersion: '2006-03-01'});

await new Upload({
  client: s3,
  params: {
    Bucket: 'demo',
    Key: 'heavy.tar',
    Body: body
  }
}).done();

The AWS JavaScript SDK v3 still does not support parallel byte-range fetches. Check out widdix/s3-getobject-accelerator to accelerate fetching objects from S3.
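To illustrate the idea, here is a minimal sketch of parallel byte-range fetches built on plain v3 calls. The function name fetchInParallel, the fixed part size, and buffering all parts in memory are simplifications for illustration; the library linked above streams the parts instead:

const { S3Client, HeadObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

async function fetchInParallel(bucket, key, partSize = 8 * 1024 * 1024) {
  // determine the total object size first
  const { ContentLength } = await s3.send(new HeadObjectCommand({Bucket: bucket, Key: key}));
  const ranges = [];
  for (let start = 0; start < ContentLength; start += partSize) {
    const end = Math.min(start + partSize - 1, ContentLength - 1);
    ranges.push(`bytes=${start}-${end}`);
  }
  // fetch all byte ranges in parallel and buffer them
  const parts = await Promise.all(ranges.map(async (range) => {
    const res = await s3.send(new GetObjectCommand({Bucket: bucket, Key: key, Range: range}));
    return Buffer.from(await res.Body.transformToByteArray());
  }));
  return Buffer.concat(parts);
}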

Streaming S3 results

When dealing with large files on S3, keeping them in memory is not an option. Make use of streams instead.

The following examples show how to download, transform, and upload an object.

Old (v2)

The createReadStream(...) method allows piping an object stored on S3 into a stream.

const zlib = require('zlib');
const stream = require('stream');
const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

// pipeline(...) returns the last stream, which serves as the upload body
const body = stream.pipeline(
  s3.getObject({
    Bucket: 'demo',
    Key: 'hello.txt'
  }).createReadStream(),
  zlib.createGzip(),
  () => {}
);

await s3.upload({
  Bucket: 'demo',
  Key: 'hello.txt.gz',
  Body: body
}).promise();

New (v3)

With v3, the Body property of the GetObjectCommand response is a stream out of the box, and the PutObjectCommand as well as the Upload helper (see above) accept streams as input.

const zlib = require('node:zlib');
const { pipeline } = require('node:stream');
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const s3 = new S3Client({apiVersion: '2006-03-01'});

const getObjectResponse = await s3.send(new GetObjectCommand({
  Bucket: 'demo',
  Key: 'hello.txt'
}));

const bodyPipeline = pipeline(
  getObjectResponse.Body,
  zlib.createGzip(),
  () => {}
);

await new Upload({
  client: s3,
  params: {
    Bucket: 'demo',
    Key: 'hello.txt.gz',
    Body: bodyPipeline
  }
}).done();
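By the way, a related stumbling block: in v2, getObject(...) returned the Body as a Buffer, whereas in v3 the Body is a stream. For small objects that fit into memory, the v3 Body offers the transformToString(...) helper:

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

const response = await s3.send(new GetObjectCommand({
  Bucket: 'demo',
  Key: 'hello.txt'
}));
// reads the whole stream into memory; only do this for small objects
const text = await response.Body.transformToString('utf-8');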

Summary

Due to breaking changes between v2 and v3 of the AWS JavaScript SDK, migrating incurs a lot of work. But there is no way around it: AWS plans to deprecate v2 soon, and the Node.js 18 environment for Lambda ships with v3 only.


