Reducing Lambda size when using googleapis module

So this is going to be a short one. I was about to deploy a Lambda function that made use of the googleapis package, but the deployment package was way too large.

The problem

As you probably know, your Lambda functions should be below 5 MB to get acceptable cold start times. My function was above 10 MB. That was not gonna fly; I needed to solve this somehow.

I was already bundling my code using esbuild (with a fairly standard setup, sketched below) but was still getting function sizes way above that 5 MB threshold.
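For reference, the build step looked roughly like this. Treat it as a sketch: the entry point and output paths are just examples, not my actual paths, but the options are standard esbuild ones.

// build.js: a minimal esbuild setup for a Lambda bundle (illustrative)
import { build } from "esbuild"

build({
  entryPoints: ["src/handler.ts"], // example entry point
  bundle: true,                    // inline all dependencies into one file
  platform: "node",                // Lambda runs on Node
  outfile: "dist/handler.js",
}).catch(() => process.exit(1))

I was doing something like this to initialize my APIs: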

import { google } from "googleapis" // pulls in the package's entire API index

...
// create a People API client through the all-in-one google object
const peopleApi = google.people({
  version: "v1",
  auth: accessToken
})
...

As you can see, I was using the google object to create a new peopleApi instance, which turned out to be a bad idea.

The google object that is imported from googleapis is an instance of a class that exposes every API Google offers (apart from the Cloud Platform ones), and that is a lot of APIs.

Even though I was only using one API, every single API from the package was included in the bundle. The index that exports google references all of them, so the bundler has no way to drop the unused ones.
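To make that concrete, here is a simplified sketch of what the package's index effectively does. This is illustrative, not the actual googleapis source:

// Illustrative shape of the googleapis index (not the real source)
import { people_v1 } from "./apis/people"
import { drive_v3 } from "./apis/drive"
// ...and an import like this for every other Google API

export class GoogleApis {
  // every API is referenced here, so a bundler has to keep them all
  people = (options: people_v1.Options) => new people_v1.People(options)
  drive = (options: drive_v3.Options) => new drive_v3.Drive(options)
  // ...
}

export const google = new GoogleApis()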

The resulting function size was ~13 MB.

The solution

To solve this, I imported the API module directly instead of going through the entire google instance. The new code looked like this:

// deep import: only the People API module ends up in the bundle
import { people_v1 } from "googleapis/build/src/apis/people";

...
const peopleApi = new people_v1.People({
  auth: accessToken
})
...

As you can see, the new code reaches into the googleapis package's build output to import the desired API directly, so the bundler never even sees the rest.

The resulting function size was now down to ~3 MB. That is a significant 10 MB reduction.
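For completeness, here is roughly how the directly imported client gets used. The OAuth2Client setup is an assumption about where accessToken comes from (google-auth-library ships as a dependency of googleapis), and people.connections.list is a standard People API call:

import { OAuth2Client } from "google-auth-library"
import { people_v1 } from "googleapis/build/src/apis/people"

// assumption: accessToken was obtained earlier via an OAuth2 flow
const auth = new OAuth2Client()
auth.setCredentials({ access_token: accessToken })

const peopleApi = new people_v1.People({ auth })

// e.g. list the authenticated user's contacts (run inside an async handler)
const res = await peopleApi.people.connections.list({
  resourceName: "people/me",
  personFields: "names,emailAddresses",
})
console.log(res.data.connections)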

The problem will soon be solved

The full googleapis package is about 72 MB, which is a lot if you're only going to use one API from it. But don't worry: the developers are working to solve this. They plan to split the APIs into separate npm packages that can be installed individually.
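Once that lands, the import could presumably look something like this. The scoped package name is my guess at the planned naming, so don't rely on it yet:

// hypothetical future import, once per-API packages ship
import { people } from "@googleapis/people"

const peopleApi = people({ version: "v1", auth: accessToken })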

You can see the progress of that effort in this GitHub issue.

Conclusion

I hope some of you learned a thing or two from this article. If so, I would appreciate a follow on Twitter.

Thanks for reading.