The Power of Events: Building a Serverless Webhook on Google Cloud


Webhooks are the backbone of modern, event-driven architectures, turning your applications from passive servers into reactive listeners. They allow third-party services like GitHub, Stripe, or Slack to push real-time updates directly to your code.

The best part? You don’t need a dedicated server to handle this traffic. This is where Google Cloud’s serverless suite shines, offering scalable, cost-effective, and fully managed platforms for your webhook endpoints.

In this guide, we’ll explore the two primary serverless options on Google Cloud Platform for building your next webhook, Cloud Functions and Cloud Run, and show you a robust, event-driven architecture for handling high-volume, mission-critical events with Pub/Sub.


1. Choosing Your Serverless Webhook Host

Google Cloud offers two fully managed serverless compute services, each with a slightly different sweet spot for hosting your public webhook endpoint.

Option A: Cloud Functions (FaaS)

Cloud Functions is Google’s Function-as-a-Service (FaaS) offering. It’s ideal for creating simple, single-purpose functions in response to an HTTP request (your webhook call).

  • Simplicity: Focus only on your function code; no Dockerfile or container knowledge needed. Best for quick integrations, small payloads, and simple response logic.
  • Scaling: Scales automatically to meet demand. Best for bursty traffic patterns and rapid prototyping.
  • Trigger: Directly uses an HTTP trigger to expose a public URL. Best for webhooks from external providers like GitHub or Hubspot.
  • Timeouts: Max function timeout is 9 minutes. Best for fast-executing logic.

Option B: Cloud Run (Containers)

Cloud Run is a managed compute platform that lets you run stateless containers via HTTP requests or events. It’s perfect when you need more control, memory, or longer execution times.

  • Flexibility: Supports any language and all your container-based dependencies. Best for complex logic, custom frameworks, and lift-and-shift of existing microservices.
  • Timeouts: Max request timeout is 60 minutes (up to 15 minutes for gen 1). Best for webhooks that trigger longer-running data processing tasks.
  • Concurrency: Handles up to 1,000 concurrent requests per instance. Best for high-volume webhooks where throughput is critical.

Developer Tip: For a simple webhook, start with Cloud Functions. For anything requiring a custom container, a larger application framework (like Spring Boot or FastAPI), or very high-volume or long-running processing, choose Cloud Run.


2. The Robust Serverless Webhook Architecture

For any mission-critical webhook, processing the event directly inside the HTTP handler (Cloud Function or Cloud Run service) is a risky approach.

If your function is busy, hits a timeout, or throws an error while processing, the sending service will often retry, producing duplicate events; if it gives up retrying, the event is lost altogether.

The best practice is an asynchronous, event-driven architecture using Pub/Sub.

Architecture Overview

  1. Ingestion Layer (Webhook Endpoint): A Cloud Function (or Cloud Run service) receives the incoming HTTP request. Its only job is to quickly validate the request (e.g., check a secret/signature) and immediately publish the raw payload to a Pub/Sub topic. It then returns a successful 2xx status code right away (the example in this guide uses 202 Accepted).
  2. Decoupling Layer: Cloud Pub/Sub provides a highly durable, scalable, and fully-managed messaging queue. By acknowledging the message to the sender right away, you relieve the external service of having to retry. Pub/Sub handles message persistence and asynchronous delivery.
  3. Processing Layer (Subscriber): A second Cloud Function (or Cloud Run service) is configured to automatically subscribe to the Pub/Sub topic. This function contains your core business logic (e.g., updating a database, sending an email, or triggering a CI/CD pipeline).

Why This Architecture Wins

  • Speed: The webhook endpoint returns an immediate 2xx response, satisfying the sending service and preventing frustrating timeouts.
  • Resilience: Pub/Sub’s “at-least-once” delivery and built-in features like Dead Letter Topics ensure no events are lost, even if your processor fails.
  • Scalability: The ingestion and processing layers scale independently. High-volume spikes only scale the initial webhook, while your downstream processing can scale at its own pace.
  • Decoupling: Your webhook logic is completely separated from your business logic, making both easier to test and maintain.

3. Implementing the Pub/Sub Webhook (Node.js Example)

Here is a simplified code example for the Ingestion Layer using a Cloud Function.

Step 1: Create the Webhook Ingestion Function

This Node.js Cloud Function acts as your public webhook endpoint.

// index.js for the Ingestion Cloud Function
const { PubSub } = require('@google-cloud/pubsub');
const pubSubClient = new PubSub();
const topicName = 'my-webhook-events-topic';

/**
 * Handles the incoming webhook request, publishes to Pub/Sub, and returns immediately.
 * @param {object} req Cloud Function request context.
 * @param {object} res Cloud Function response context.
 */
exports.ingestWebhook = async (req, res) => {
  // 1. Basic Validation (Replace with real security checks like signature verification!)
  if (req.method !== 'POST' || !req.body) {
    return res.status(400).send('Expected a POST request with a body.');
  }

  // 2. Prepare the Payload
  // Publish the raw JSON body as a string to the Pub/Sub topic
  const dataBuffer = Buffer.from(JSON.stringify(req.body));

  try {
    // 3. Publish to Pub/Sub (The key step!)
    const messageId = await pubSubClient
      .topic(topicName)
      .publishMessage({ data: dataBuffer });

    console.log(`Message ${messageId} published to topic ${topicName}`);

    // 4. Send an immediate successful response back to the webhook sender
    res.status(202).send({
      status: 'Accepted',
      message: 'Event successfully queued for processing',
      id: messageId
    });

  } catch (error) {
    console.error(`Received error while publishing: ${error.message}`);
    // Publishing failed, so the event was never queued. Return a 5xx
    // error so the sender retries and the event is not lost.
    res.status(500).send('Internal queueing error.');
  }
};
Step 2: Create the Event Processing Function

This second function is triggered automatically by a Pub/Sub event and contains your business logic.

// index.js for the Processing Cloud Function
/**
 * Processes a message published to a Pub/Sub topic.
 * @param {object} message The Pub/Sub message.
 */
exports.processEvent = (message) => {
  try {
    // Decode the base64 data from the Pub/Sub message
    const rawData = Buffer.from(message.data, 'base64').toString();
    const eventPayload = JSON.parse(rawData);

    // --- YOUR CORE BUSINESS LOGIC GOES HERE ---
    console.log('--- STARTING BUSINESS LOGIC ---');
    console.log(`Received event type: ${eventPayload.type}`);

    // Example: Update a record in Firestore, send an email, etc.
    if (eventPayload.type === 'new_user') {
      console.log(`Creating user record for: ${eventPayload.user.email}`);
      // await firestore.collection('users').add(eventPayload.user);
    }

    console.log('--- BUSINESS LOGIC COMPLETE ---');
    // If the function completes successfully, Pub/Sub considers the message acknowledged and deletes it.

  } catch (err) {
    console.error(`Error processing message: ${err.message}`);
    // Throwing an error will tell Pub/Sub to NOT acknowledge the message,
    // which will cause it to be redelivered later.
    throw new Error('Failed to process message.');
  }
};
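Because the Pub/Sub trigger delivers the payload base64-encoded in message.data, you can sanity-check the decode step locally by simulating the round trip, with no GCP resources needed (the payload below is made up for illustration):

```javascript
// Simulate the Pub/Sub round trip locally: publish-side encoding
// followed by the subscriber-side decode used in processEvent.
function encodePubSubMessage(payload) {
  return { data: Buffer.from(JSON.stringify(payload)).toString('base64') };
}

function decodePubSubMessage(message) {
  return JSON.parse(Buffer.from(message.data, 'base64').toString('utf8'));
}

const payload = { type: 'new_user', user: { email: 'alice@example.com' } };
const decoded = decodePubSubMessage(encodePubSubMessage(payload));
console.log(decoded.type); // → new_user
```

This mirrors exactly what the ingestion function publishes (a JSON string as a Buffer) and what processEvent decodes, so a unit test over these two helpers covers the wire format without deploying anything.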

Deployment Steps (via gcloud CLI)

  1. Enable APIs: Ensure Cloud Functions API, Cloud Run API (if using it), and Cloud Pub/Sub API are enabled in your project.
  2. Create the Pub/Sub Topic:

     gcloud pubsub topics create my-webhook-events-topic

  3. Deploy the Ingestion Function: Deploy this as an HTTP trigger; the resulting URL is your public webhook URL.

     gcloud functions deploy ingestWebhook \
       --runtime nodejs20 \
       --trigger-http \
       --allow-unauthenticated \
       --region [YOUR_REGION] \
       --entry-point ingestWebhook

  4. Deploy the Processing Function: Deploy this as a Pub/Sub trigger.

     gcloud functions deploy processEvent \
       --runtime nodejs20 \
       --trigger-topic my-webhook-events-topic \
       --region [YOUR_REGION] \
       --entry-point processEvent

You’ve now built a robust, scalable, and resilient serverless webhook using Google Cloud’s powerful event-driven ecosystem!

You can see an example of integrating webhooks with Cloud Run in this video: Integrating Webhooks with Cloud Run.


Get The Blockchain Sector Newsletter, binge the YouTube channel and connect with me on Twitter

The Blockchain Sector newsletter goes out a few times a month when there is breaking news or interesting developments to discuss. All the content I produce is free, if you’d like to help please share this content on social media.

Thank you.

James Bachini

Disclaimer: Not a financial advisor, not financial advice. The content I create is to document my journey and for educational and entertainment purposes only. It is not under any circumstances investment advice. I am not an investment or trading professional and am learning myself while still making plenty of mistakes along the way. Any code published is experimental and not production ready to be used for financial transactions. Do your own research and do not play with funds you do not want to lose.

