Spanora

Integrations — LangChain, Vercel AI SDK, OpenAI & More

Connect Spanora with your existing AI tools and frameworks for full observability. Guides for LangChain, Vercel AI SDK, OpenAI, Anthropic, and any OpenTelemetry provider.

The platform accepts standard OTLP HTTP data in both JSON and Protobuf formats, so it works with any OpenTelemetry-compatible framework or library. Choose the integration that matches your setup:

Any OTEL Provider

Spanora accepts standard OTLP HTTP traces from any OpenTelemetry-compatible source. Point your exporter at https://spanora.ai/api/v1/traces, authenticate with a Bearer token, and traces start flowing; no SDK is required.

Get started with any OTEL provider →
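As a concrete sketch, here is what that exporter configuration could look like with the standard OpenTelemetry Node.js SDK. The service name and the `SPANORA_API_KEY` environment variable are assumptions for illustration; any OTLP HTTP exporter in any language works the same way.

```typescript
// Minimal Node.js tracer setup that exports OTLP HTTP traces to Spanora.
// Assumes @opentelemetry/sdk-node and @opentelemetry/exporter-trace-otlp-http
// are installed, and that SPANORA_API_KEY holds your token (assumed name).
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

const sdk = new NodeSDK({
  serviceName: "my-service", // illustrative service name
  traceExporter: new OTLPTraceExporter({
    url: "https://spanora.ai/api/v1/traces",
    headers: { Authorization: `Bearer ${process.env.SPANORA_API_KEY}` },
  }),
});

// From here on, any spans created through the OTEL API are exported to Spanora.
sdk.start();
```

Because this is plain OTLP over HTTP, the same endpoint and header work with collectors and with the OTEL SDKs for Python, Go, Java, and others.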

LangChain (Python)

Auto-instrument LangChain and LangGraph Python applications without modifying your existing LangChain code. Uses opentelemetry-instrumentation-langchain to capture LLM calls, chains, and tool executions. No Spanora SDK required.

Get started with LangChain →

Vercel AI SDK

The easiest way to get started in TypeScript. Enable experimental_telemetry and all LLM calls, tool invocations, and streaming responses are captured automatically.

Get started with Vercel AI SDK →
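A minimal sketch of what enabling telemetry looks like in practice. The model and `functionId` label are illustrative; the spans are picked up by whatever OpenTelemetry tracer provider your app has registered.

```typescript
// Sketch: enabling telemetry on a single generateText call (Vercel AI SDK).
// Assumes the `ai` and `@ai-sdk/openai` packages and an OTEL tracer provider
// registered elsewhere in your app.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-4o-mini"), // illustrative model choice
  prompt: "Summarize OpenTelemetry in one sentence.",
  experimental_telemetry: {
    isEnabled: true,
    functionId: "summarize", // optional label that shows up on the span
  },
});
console.log(text);
```

Tool calls and streamed responses made through the SDK are traced the same way once `experimental_telemetry` is enabled.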

OpenAI SDK

Auto-extraction wrappers for OpenAI ChatCompletion responses. Model, tokens, and output are captured without manual callbacks.

Get started with OpenAI →
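The shape of the integration is roughly the following; the `wrapOpenAI` export name is an assumption for illustration, so check the linked guide for the exact API.

```typescript
// Illustrative only: the export names of @spanora-ai/sdk/openai may differ.
// The idea: wrap the client once, and every chat.completions.create call
// is traced automatically.
import OpenAI from "openai";
import { wrapOpenAI } from "@spanora-ai/sdk/openai"; // hypothetical export name

const client = wrapOpenAI(new OpenAI());

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});
// Model, token usage, and the response text are extracted from the
// response object; no manual callbacks or span attributes are needed.
```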

Anthropic SDK

Auto-extraction wrappers for Anthropic message responses. Model, tokens, and output are captured without manual callbacks.

Get started with Anthropic →
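The Anthropic integration follows the same wrap-once pattern; `wrapAnthropic` is an assumed name for illustration, so check the linked guide for the exact API.

```typescript
// Illustrative only: the export names of @spanora-ai/sdk/anthropic may differ.
// Wrapping the client once traces every messages.create call.
import Anthropic from "@anthropic-ai/sdk";
import { wrapAnthropic } from "@spanora-ai/sdk/anthropic"; // hypothetical export name

const client = wrapAnthropic(new Anthropic());

const message = await client.messages.create({
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 256,
  messages: [{ role: "user", content: "Hello!" }],
});
// Model, input/output tokens, and the message content are extracted
// from the response automatically.
```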

Compatibility

| Framework | Status | Notes |
| --- | --- | --- |
| Any OTEL Provider | Supported | Direct OTLP HTTP (JSON + Protobuf); any language, any framework |
| LangChain | Supported | Auto-instrumentation via OpenLLMetry, no Spanora SDK needed |
| Vercel AI SDK | Supported | Native OTEL attributes recognized |
| OpenAI SDK | Supported | Auto-extraction via @spanora-ai/sdk/openai |
| Anthropic SDK | Supported | Auto-extraction via @spanora-ai/sdk/anthropic |
