1. Subscribe to the right webhook events
Enable the session.end and session.update events in your agent's webhook configuration. Layercode sends session.end immediately after the call finishes, carrying usage metrics and the full transcript, then follows up later with session.update when the recording file is ready (if session recording is enabled for your org).
Your webhook handler should capture both payloads.
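A minimal sketch of such a handler in Next.js (App Router style). Only the two event names come from this guide; the payload fields beyond `type` and the route location are assumptions you should check against your actual webhook payloads:

```typescript
// Hypothetical handler at app/api/layercode-webhook/route.ts.
// Field names other than `type` are illustrative assumptions.
type WebhookEvent = {
  type: "session.end" | "session.update";
  session_id: string;
  [key: string]: unknown;
};

// Decide what to do with each event; returns a label so the logic is testable.
function routeEvent(event: WebhookEvent): "transcript" | "recording" | "ignored" {
  switch (event.type) {
    case "session.end":
      return "transcript"; // persist transcript rows + usage metrics here
    case "session.update":
      return "recording"; // recording file is (or soon will be) ready
    default:
      return "ignored";
  }
}

async function POST(req: Request): Promise<Response> {
  const event = (await req.json()) as WebhookEvent;
  routeEvent(event); // dispatch to your own persistence / job queue
  // Acknowledge quickly so the webhook sender does not retry.
  return Response.json({ received: true });
}
```

Keeping the dispatch logic in a plain function like `routeEvent` makes it easy to unit-test without spinning up the Next.js runtime.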
2. Fetch full session details on demand
While the session.end payload already includes the transcript, you can always fetch the authoritative record later through the REST API. Once recording processing has completed, the payload includes a recording_url that points to the downloadable WAV file.
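A minimal sketch of that lookup. The base URL, the `GET /sessions/{id}` path, and the `LAYERCODE_API_KEY` environment variable are assumptions; substitute the endpoint and auth scheme from your API reference:

```typescript
// Hypothetical REST lookup for the full session record.
const API_BASE = "https://api.layercode.com/v1"; // assumed base URL

function sessionUrl(sessionId: string): string {
  return `${API_BASE}/sessions/${encodeURIComponent(sessionId)}`;
}

async function fetchSessionDetails(sessionId: string): Promise<unknown> {
  const res = await fetch(sessionUrl(sessionId), {
    // Assumed Bearer auth; check your actual API key header.
    headers: { Authorization: `Bearer ${process.env.LAYERCODE_API_KEY}` },
  });
  if (!res.ok) throw new Error(`Session lookup failed: ${res.status}`);
  // Includes the transcript and, once processing completes, recording_url.
  return res.json();
}
```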
3. Download the call recording when it finishes
When you receive a session.update webhook indicating recording_status: "completed", stream the audio file directly from the recording endpoint. If processing is still happening, the endpoint reports recording_status: "in_progress" instead, so be prepared to wait for a later webhook.
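A sketch of the download step, streaming the WAV to disk rather than buffering it in memory. The `/sessions/{id}/recording` path and Bearer auth are assumptions; the recording_status values mirror this guide:

```typescript
import { createWriteStream } from "node:fs";
import { Readable } from "node:stream";
import { pipeline } from "node:stream/promises";

type SessionUpdate = {
  session_id: string;
  recording_status: "completed" | "in_progress";
};

// Only download once the webhook says the recording finished processing.
function shouldDownload(event: SessionUpdate): boolean {
  return event.recording_status === "completed";
}

async function downloadRecording(event: SessionUpdate, outPath: string): Promise<void> {
  if (!shouldDownload(event)) return; // still in_progress; wait for the next webhook
  const res = await fetch(
    // Assumed endpoint path; confirm against your API reference.
    `https://api.layercode.com/v1/sessions/${event.session_id}/recording`,
    { headers: { Authorization: `Bearer ${process.env.LAYERCODE_API_KEY}` } },
  );
  if (!res.ok || !res.body) throw new Error(`Recording fetch failed: ${res.status}`);
  // Pipe the web stream straight to a file on disk.
  await pipeline(Readable.fromWeb(res.body as any), createWriteStream(outPath));
}
```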
4. Kick off your analytics pipeline
With transcripts saved and recordings queued, you can start whatever analysis you need: summaries, compliance checks, quality scoring, or AI-powered tagging. A common pattern is:

- Store the transcript rows in your database when session.end arrives.
- Trigger asynchronous jobs from session.update that download the recording and push it to transcription review, summarization, or storage.
- Merge results (e.g., LLM summaries, compliance flags, sentiment) back into your customer dashboard once processing completes.
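The steps above can be sketched as a tiny dispatcher. Here `db` and `queue` are hypothetical in-memory stand-ins for your own database and background-job system:

```typescript
type TranscriptRow = { role: string; text: string };

// Stand-ins for real infrastructure (database + job queue).
const db = { transcripts: new Map<string, TranscriptRow[]>() };
const queue: string[] = []; // pending session ids for the worker

// 1. Persist transcript rows immediately when session.end arrives.
function onSessionEnd(sessionId: string, transcript: TranscriptRow[]): void {
  db.transcripts.set(sessionId, transcript);
}

// 2. Defer heavy work (download, summarize, score) to an async worker.
function onSessionUpdate(sessionId: string): void {
  queue.push(sessionId);
}

// 3. A worker would pop session ids from `queue`, download the recording,
//    run analysis, and merge results back onto the stored transcript.
```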
Example: summarize transcripts with the Vercel AI SDK
Once the transcript is stored, you can enrich it with an LLM call that produces business-ready insights: send the transcript text to an OpenAI model via the Vercel AI SDK and extract a structured summary, caller name, and sentiment flag. Run the analysis step (here called analyzeSessionTranscript) after you persist the transcript rows so downstream dashboards and QA tools can display the summary alongside the original conversation.