Documentation — AI Observability Guides
Understand what your AI agents do, why they do it, and what it costs — debug decisions, track per-user costs, and optimize agent behavior.
Spanora is an AI observability platform that gives you full monitoring and tracing for every LLM call, tool invocation, and token spent across your AI agents. Built on OpenTelemetry — no vendor lock-in.
Quick Start
Automatic Setup
The fastest way to get started. Install the Spanora skill and let your AI coding agent handle the rest:
```shell
npx skills add spanora/skills
```

Then prompt your AI coding agent:

> Integrate Spanora into this project

It detects your AI SDK and package manager, installs dependencies, and wires up Spanora automatically.
Manual Setup
Create an API key from the dashboard, then pick your integration:
Install the SDK alongside your AI framework:

```shell
pnpm add @spanora-ai/sdk ai @ai-sdk/openai
```

Then initialize Spanora before making any AI calls, and enable telemetry on each call you want traced:

```typescript
import { init } from "@spanora-ai/sdk";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

init({ apiKey: process.env.SPANORA_API_KEY });

const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "What is the capital of France?",
  experimental_telemetry: { isEnabled: true },
});
```

Verify
Trigger an AI execution in your application. You should see it appear in the traces view within seconds.
Where to Go Next
- Integrations — Detailed setup guides for each framework
- Any OTEL Provider — Use Spanora with any OTEL-compatible service, no SDK required
- SDK Reference — Full API reference for all SDK functions
- OTEL Attributes — Semantic attribute conventions recognized by the backend
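Because Spanora is built on OpenTelemetry, you can also send traces without the Spanora SDK by pointing any OTLP exporter at its collector. The sketch below uses the standard OpenTelemetry Node.js packages; the endpoint URL and the auth header format are assumptions for illustration — check the dashboard's OTEL settings for the real values:

```typescript
// Hypothetical example: export traces to Spanora from any OTEL-instrumented
// Node.js app, no Spanora SDK required. Endpoint URL and header are
// assumptions — substitute the values shown in your dashboard.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: "https://api.spanora.example/v1/traces", // assumed endpoint
    headers: { Authorization: `Bearer ${process.env.SPANORA_API_KEY}` },
  }),
});

// Start the SDK so spans from any OTEL instrumentation are exported.
sdk.start();
```

Any language with an OTLP exporter (Python, Go, Java, and so on) can use the same approach, since the backend consumes standard OTLP traces.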