I know this may miss the point of using Cloud Functions in the first place, but in my specific case, I'm using Cloud Functions because it's the only way I can bridge Next.js with Firebase Hosting. I don't need to make it cost efficient, etc.

With that said, the cold boot times for Cloud Functions are simply unbearable and not production-ready, averaging around 10 to 15 seconds for my boilerplate.

I've watched this video by Google (https://www.youtube.com/watch?v=IOXrwFqR6kY) that talks about how to reduce cold boot time. In a nutshell: 1) trim dependencies, 2) use trial and error to find dependency versions that are cached on Google's network, 3) lazy-load dependencies.

But 1) there are only so many dependencies I can trim. 2) How would I know which versions are better cached? 3) There are only so many dependencies I can lazy-load.

Another way is to avoid the cold boot altogether. What's a good way or hack to essentially keep my (one and only) Cloud Function warm?


Solution 1:

With all "serverless" compute providers, there is always going to be some form of cold start cost that you can't eliminate. Even if you are able to keep a single instance alive by pinging it, the system may spin up any number of other instances to handle current load. Those new instances will have a cold start cost. Then, when load decreases, the unnecessary instances will be shut down.

There are ways to minimize your cold start costs, as you have discovered, but the costs can't be eliminated.

As of Sept 2021, you can now specify a minimum number of instances to keep active. This can help reduce (but not eliminate) cold starts. Read the Google Cloud blog and the documentation. For Firebase, read its documentation. Note that setting min instances incurs extra billing - keeping computing resources active is not a free service.

If you absolutely demand hot servers to handle requests 24/7, then you need to manage your own servers that run 24/7 (and pay the cost of those servers running 24/7). As you can see, the benefit of serverless is that you don't manage or scale your own servers, and you only pay for what you use, but you have unpredictable cold start costs associated with your project. That's the tradeoff.

Solution 2:

You're not the first to ask ;-)

The answer is to configure a remote service to periodically call your function so that the single (and only) instance remains alive.

It's unclear from your question, but I assume your Function provides an HTTP endpoint. In that case, find a healthcheck or cron service that can be configured to make an HTTP call every x seconds or minutes and point it at your Function.

You may have to juggle the timings to find the Goldilocks period: not so often that you're wasting effort, but not so infrequently that the instance dies. This is what others have done, though.
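For illustration, here is a minimal sketch of such a pinger as a plain crontab entry on any always-on machine. The URL is a hypothetical placeholder; substitute your own region, project, and function name.

# Curl the function's HTTP endpoint every 5 minutes so at least one instance stays warm.
# The URL below is a placeholder for your deployed endpoint.
*/5 * * * * curl -s -o /dev/null "https://us-central1-YOUR_PROJECT.cloudfunctions.net/yourFunction"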

Solution 3:

You can now specify a minimum number of instances (--min-instances) to keep instances running at all times.

Cloud Functions Doc: https://cloud.google.com/functions/docs/configuring/min-instances

Cloud Functions example from the docs:

gcloud beta functions deploy myFunction --min-instances 5

It's also available in Firebase Functions by specifying minInstances:

Firebase Functions Docs: https://firebase.google.com/docs/functions/manage-functions#min-max-instances

Frank announcing it on Twitter: https://twitter.com/puf/status/1433431768963633152

Firebase Function example from the docs:

exports.getAutocompleteResponse = functions
    .runWith({
      // Keep 5 instances warm for this latency-critical function
      minInstances: 5,
    })
    .https.onCall((data, context) => {
      // Autocomplete a user's search term
    });

Solution 4:

You can trigger it via a cron job, as explained here: https://cloud.google.com/scheduler/docs/creating
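For example, a minimal sketch of such a job using the Cloud Scheduler gcloud CLI. The job name, project, region, and function name are hypothetical placeholders, and Solution 5 below shows a more complete setup:

# Placeholder job: hit the function every 5 minutes to keep an instance warm.
gcloud scheduler jobs create http keepMyFunctionWarm --schedule="every 5 minutes" --uri="https://us-central1-YOUR_PROJECT.cloudfunctions.net/yourFunction" --http-method=GET --description="ping to keep the function warm"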

Solution 5:

Using Cloud Scheduler is a wise solution, but the actual implementation is not so straightforward. Please check my article for details. Examples of functions:

const functions = require("firebase-functions");

exports.myHttpFunction = functions.https.onRequest((request, response) => {
  // Check for the warmup parameter.
  // Use request.query.warmup if the warmup request is a GET.
  // Use request.body.warmup if the warmup request is a POST.
  if (request.query.warmup || request.body.warmup) {
    return response.status(200).type('application/json').send({status: "success", message: "OK"});
  }
  // ... regular request handling goes here ...
});

exports.myOnCallFunction = functions.https.onCall((data, context) => {
  // Check for the warmup parameter.
  if (data.warmup) {
    return {"success": true};
  }
  // ... regular callable logic goes here ...
});

Examples of gcloud CLI commands:

gcloud --project="my-awesome-project" scheduler jobs create http  warmupMyOnCallFuntion --time-zone "America/Los_Angeles" --schedule="*/5 5-23 * * *" --uri="https://us-central1-my-awesome-project.cloudfunctions.net/myOnCallFuntion" --description="my warmup job" --headers="Content-Type=application/json" --http-method="POST" --message-body="{\"data\":{\"warmup\":\"true\"}}"

gcloud --project="my-awesome-project" scheduler jobs create http  warmupMyHttpFuntion --time-zone "America/Los_Angeles" --schedule="*/5 5-23 * * *" --uri="https://us-central1-my-awesome-project.cloudfunctions.net/myHttpFuntion?warmup=true" --description="my warmup job" --headers="Content-Type=application/json" --http-method="GET"