Connect your LLM apps to Lunary with the OpenTelemetry standard.
Lunary natively accepts OpenTelemetry data through its `/v1/otel` endpoint. This means you can export traces, metrics, and events from LLM stacks or frameworks, no matter the language or platform, directly to Lunary's observability dashboard.
Why OpenTelemetry?
- Unified tracing across polyglot apps (Python, JS, Java, Go, etc.)
- Bring-your-own instrumentation: works with OpenLIT, Arize, OpenLLMetry, MLflow, and more.
- Rich, future-proof GenAI semantic conventions.
To get started, point your OTLP exporter at Lunary's endpoint:

`https://api.lunary.ai/v1/otel`

Then install the OpenTelemetry SDK for your language, for example `opentelemetry-sdk` for Python or `@opentelemetry/api` for TypeScript.
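
Here is a minimal sketch of wiring a Python app to Lunary with the standard OTLP/HTTP exporter. The `Authorization: Bearer <key>` header and the `/v1/traces` suffix on the endpoint are assumptions (OTLP/HTTP exporters usually expect the per-signal path); check your Lunary project settings for the exact credential to send.

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans to Lunary over OTLP/HTTP.
# NOTE: the Authorization header format and the /v1/traces suffix are assumptions;
# use whatever credential and path your Lunary project expects.
exporter = OTLPSpanExporter(
    endpoint="https://api.lunary.ai/v1/otel/v1/traces",
    headers={"Authorization": "Bearer <YOUR_LUNARY_KEY>"},
)

provider = TracerProvider(resource=Resource.create({"service.name": "my-llm-app"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Any spans created from here on are batched and sent to Lunary.
tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("chat-completion"):
    pass  # your LLM call here
```

Once the exporter is in place, any OpenTelemetry instrumentation for the model SDKs listed below will flow into Lunary.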
| Model SDK | Python | TypeScript |
|---|---|---|
| Azure OpenAI | ✅ | ✅ |
| Aleph Alpha | ✅ | ❌ |
| Anthropic | ✅ | ✅ |
| Amazon Bedrock | ✅ | ✅ |
| Amazon SageMaker | ✅ | ❌ |
| Cohere | ✅ | ✅ |
| IBM watsonx | ✅ | ⏳ |
| Google Gemini | ✅ | ✅ |
| Google VertexAI | ✅ | ✅ |
| Groq | ✅ | ⏳ |
| Mistral AI | ✅ | ⏳ |
| Ollama | ✅ | ⏳ |
| OpenAI | ✅ | ✅ |
| Replicate | ✅ | ⏳ |
| together.ai | ✅ | ⏳ |
| HuggingFace Transformers | ✅ | ⏳ |
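
For example, building on the exporter setup above, an instrumentation library can capture OpenAI calls automatically. The sketch below uses OpenLLMetry's `opentelemetry-instrumentation-openai` package (package and class names taken from that project; OpenLIT and the other libraries mentioned earlier have their own setup), with a placeholder model and prompt.

```python
# pip install openai opentelemetry-instrumentation-openai
from openai import OpenAI
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Patch the OpenAI client so every request emits GenAI spans
# through the tracer provider configured earlier.
OpenAIInstrumentor().instrument()

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from OpenTelemetry!"}],
)
print(response.choices[0].message.content)
```

Each request then shows up as a trace in Lunary's dashboard, carrying the GenAI semantic convention attributes (model, prompt, tokens) emitted by the instrumentation.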