Spanora

Documentation — AI Observability Guides

Understand what your AI agents do, why they make the decisions they do, and what each run costs: debug decisions, attribute cost per user, and optimize agent behavior.

Spanora is an AI observability platform that gives you full monitoring and tracing for every LLM call, tool invocation, and token spent across your AI agents. It is built on OpenTelemetry, so there is no vendor lock-in.

Quick Start

Automatic Setup

The fastest way to get started. Install the Spanora skill and let your AI coding agent handle the rest:

npx skills add spanora/skills

Then prompt your AI coding agent:

> Integrate Spanora into this project

The skill detects your AI SDK and package manager, installs the required dependencies, and wires up Spanora automatically.

Manual Setup

Create an API key from the dashboard, then pick your integration:

pnpm add @spanora-ai/sdk ai @ai-sdk/openai
agent.ts
import { init } from "@spanora-ai/sdk";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Initialize Spanora once, before any model calls are made.
init({ apiKey: process.env.SPANORA_API_KEY });

const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "What is the capital of France?",
  // Opt in to the AI SDK's telemetry so this call is traced.
  experimental_telemetry: { isEnabled: true },
});

console.log(result.text);

Verify

Trigger an AI execution in your application. The trace should appear in the traces view within seconds.
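For example, if you followed Manual Setup, a single run of agent.ts is enough to emit one trace. The command below assumes the tsx runner; use whatever command normally starts your app (it needs SPANORA_API_KEY and your OpenAI credentials in the environment):

```shell
# Run the example once to emit a trace (requires API keys and network access).
npx tsx agent.ts
```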
