## Install
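The install command below is a sketch: the docs above only give the `docs.trulayer.ai` domain, so the package name `@trulayer/node` is an assumption; substitute the actual name from your project setup. The Vercel AI SDK peer packages (`ai`, `zod`) are the real ones the examples on this page use.

```shell
# Package name is an assumption; check your TruLayer setup for the real one.
npm install @trulayer/node

# Peer dependencies used by the examples on this page.
npm install ai @ai-sdk/openai zod
```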
## Instrument
`instrumentVercelAI` takes the `ai` module's functions and returns instrumented replacements. Import it once at the top of any module that uses the Vercel AI SDK.
### Minimal example — `streamText` in a Next.js Route Handler
`app/api/chat/route.ts`
### `generateObject` with a Zod schema

The span records the parsed `object` as output and any validation error as `span.status = error`.
## What gets captured
- `llm` spans around every wrapped call with:
  - `input` — the last message (or prompt string)
  - `output` — completed text, or the structured object for `generateObject`
  - `model` — the model id (for example `gpt-4o-mini`)
  - `prompt_tokens` / `completion_tokens`
  - `latency_ms`
- Stream results hold the span open until the final token arrives; `usage` is recorded from the resolved `result.usage` promise.
## Known gotchas
- Next.js Edge runtime is unsupported. Add `export const runtime = "nodejs"` to any Route Handler or Server Action that calls an instrumented function. The SDK relies on `node:async_hooks` for trace context propagation, which is not available on Edge.
- Only wrapped functions are traced. If you import `generateText` directly from `ai` elsewhere in your code, those calls bypass TruLayer. Pick one convention per module.
- Tool calls are recorded on the parent `llm` span. For a dedicated `tool` span per tool execution, wrap the tool body with `tl.span("tool:name", { spanType: "tool" })`.