This guide shows you how to implement the Layercode Webhook SSE API with Cloudflare Workers using Hono. You’ll learn how to set up a webhook endpoint that receives transcribed messages from the Layercode voice agent and streams the agent’s responses back to the frontend, to be turned into speech and spoken back to the user. You can test your backend using the Layercode dashboard playground or by following the Build a Web Voice Agent guide. Example code: layercodedev/example-backend-hono

Prerequisites

  • Node.js 18+
  • Hono (Cloudflare Workers compatible)
  • A Layercode account and agent (sign up here)
  • An API key for your LLM provider (we use Google Gemini in this example)

Setup

npm create hono@latest my-app
cd my-app
npm install hono @layercode/node-server-sdk ai @ai-sdk/google
Edit your .env file (read automatically by wrangler) and add the following environment variables:
  • GOOGLE_GENERATIVE_AI_API_KEY - Your Google Gemini API key
  • LAYERCODE_API_KEY - Your Layercode API key, found in the Layercode dashboard
  • LAYERCODE_WEBHOOK_SECRET - Your Layercode agent’s webhook secret, found in the Layercode dashboard (go to your agent, click Connect your backend and copy the webhook secret shown)
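Your .env file should then look something like this (the values below are placeholders, not real keys):

```shell
# .env -- read automatically by `wrangler dev`; never commit real keys
GOOGLE_GENERATIVE_AI_API_KEY=your-gemini-api-key
LAYERCODE_API_KEY=your-layercode-api-key
LAYERCODE_WEBHOOK_SECRET=your-webhook-secret
```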

Create Your Hono Handler

Here’s a simplified example of the core functionality needed to implement the Layercode webhook endpoint:
import type { Context } from "hono";
import { verifySignature, streamResponse } from '@layercode/node-server-sdk';
import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { streamText, type ModelMessage } from 'ai';
import { env } from 'cloudflare:workers';

const google = createGoogleGenerativeAI({ apiKey: env.GOOGLE_GENERATIVE_AI_API_KEY });
const sessionMessages = {} as Record<string, ModelMessage[]>;

const SYSTEM_PROMPT = "You are a helpful assistant.";
const WELCOME_MESSAGE = "Welcome to Layercode. How can I help you today?";

export const onRequestPost = async (c: Context) => {
  const secret = env.LAYERCODE_WEBHOOK_SECRET;
  if (!secret) {
    return c.json({ error: 'LAYERCODE_WEBHOOK_SECRET is not set' }, 500);
  }

  const rawBody = await c.req.text();
  const signature = c.req.header('layercode-signature') || '';
  const isValid = verifySignature({ payload: rawBody, signature, secret });
  if (!isValid) {
    return c.json({ error: 'Invalid signature' }, 401);
  }

  const json = JSON.parse(rawBody);
  const { text, type, session_id } = json;

  const messages = sessionMessages[session_id] || [];

  if (type === "welcome") {
    return streamResponse(json, async ({ stream }) => {
      stream.tts(WELCOME_MESSAGE);
      messages.push({
        role: "assistant",
        content: [{ type: "text", text: WELCOME_MESSAGE }],
      });
      sessionMessages[session_id] = messages;
      stream.end();
    });
  }

  messages.push({ role: "user", content: [{ type: "text", text }] });

  return streamResponse(json, async ({ stream }) => {
    try {
      const result = streamText({
        model: google('gemini-2.5-flash-lite'),
        messages: [
          { role: 'system', content: SYSTEM_PROMPT },
          ...messages,
        ],
      });

      let responseText = '';
      for await (const delta of result.textStream) {
        stream.tts(delta);
        responseText += delta;
      }

      messages.push({
        role: 'assistant',
        content: [{ type: 'text', text: responseText }],
      });
      sessionMessages[session_id] = messages;
    } catch (err) {
      console.error('Error:', err);
    } finally {
      stream.end();
    }
  });
};

How It Works

  • /agent endpoint: Receives POST requests from Layercode with the user’s transcribed message, session, and turn info. The webhook request is verified as coming from Layercode.
  • Session management: Keeps track of conversation history per session (in-memory for demo; use a store for production).
  • LLM call: Calls Gemini 2.5 Flash Lite with the system prompt, message history, and user’s new transcribed message.
  • SSE streaming: As soon as the LLM starts generating a response, the backend streams the output back as SSE messages to Layercode, which converts it to speech and delivers it to the frontend for playback in realtime.
  • /authorize endpoint: Your Layercode API key should never be exposed to the frontend. Instead, your backend acts as a secure proxy: it receives the frontend’s request, calls the Layercode authorization API using your secret API key, and returns the client_session_key (and optionally a conversation_id) to the frontend. This key is required for the frontend to establish a secure WebSocket connection to Layercode.
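A minimal sketch of such an /authorize proxy is shown below. The authorization URL and request fields are assumptions based on Layercode's session authorization API; check the Connect your backend instructions in the dashboard for the exact endpoint your agent uses. The request-building helper is kept pure so it can be unit-tested without a network call:

```typescript
// Assumed Layercode authorization endpoint -- verify against the Layercode docs.
const AUTHORIZE_URL = 'https://api.layercode.com/v1/agents/web/authorize_session';

// Pure helper that shapes the upstream request to Layercode.
export function buildAuthorizeRequest(apiKey: string, body: unknown) {
  return {
    url: AUTHORIZE_URL,
    init: {
      method: 'POST',
      headers: { Authorization: `Bearer ${apiKey}`, 'Content-Type': 'application/json' },
      body: JSON.stringify(body),
    },
  };
}

// Hono-style handler (c is a Hono Context; typed loosely to keep the sketch standalone).
export const onAuthorizePost = async (c: any) => {
  const apiKey = c.env.LAYERCODE_API_KEY;
  if (!apiKey) return c.json({ error: 'LAYERCODE_API_KEY is not set' }, 500);

  const { url, init } = buildAuthorizeRequest(apiKey, await c.req.json());
  const response = await fetch(url, init);
  if (!response.ok) return c.json({ error: 'Failed to authorize session' }, 500);

  // Forward the client_session_key (and conversation_id, if present) to the frontend.
  return c.json(await response.json());
};
```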

Running Your Backend

Start your Hono app (Cloudflare Workers):
npx wrangler dev

Configure the Layercode Webhook endpoint

In the Layercode dashboard, go to your agent settings. Under Your Backend, click edit to set the URL of your webhook endpoint. If you’re running this example locally, set up a tunnel to your localhost (we recommend cloudflared, which is free for development) so the Layercode webhook can reach your backend. Follow our tunnelling guide.
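For example, with cloudflared installed, a quick tunnel to the local wrangler dev server (which defaults to port 8787) looks like this; the https URL it prints, plus your webhook path, is what you paste into the dashboard:

```shell
# Expose http://localhost:8787 via a temporary Cloudflare quick tunnel.
# cloudflared prints a https://<random>.trycloudflare.com URL to use as the
# base of your webhook endpoint URL.
cloudflared tunnel --url http://localhost:8787
```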