Monitor Anthropic’s official Python SDK by wrapping the client once with lunary.monitor(client).
1

Install both packages

pip install lunary anthropic

Python — learn how to set up the Python SDK.
2

Monitor Anthropic

Wrap Anthropic or AsyncAnthropic once, then keep using the Anthropic client as usual.
import os

from anthropic import Anthropic
import lunary

client = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

lunary.monitor(client)
3

Supported surface

The monitored client supports the current Anthropic Messages API surface:
  • client.messages.create(...)
  • Raw streaming via client.messages.create(..., stream=True)
  • client.messages.parse(...)
  • client.messages.stream(...)
  • client.beta.messages.create(...)
  • client.beta.messages.parse(...)
  • client.beta.messages.stream(...)
  • client.beta.messages.tool_runner(...)
  • The same monitored surface on AsyncAnthropic
Tool-runner loops appear in Lunary as one LLM run per underlying Anthropic request. Raw streams and helper streams keep the normal Anthropic SDK lifecycle, and Lunary preserves Anthropic content blocks in the run details, including thinking, redacted_thinking, tool_use, tool_result, server_tool_use, and web_search_tool_result, along with token usage and cached input tokens.
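As a rough sketch of what those content blocks look like, the helper below (our own, not part of either SDK) tallies a response's blocks by their `type` field, using dict-shaped blocks for illustration; real SDK blocks expose a `.type` attribute instead:

```python
from collections import Counter


def count_block_types(content: list) -> Counter:
    """Tally content blocks by their "type" field."""
    return Counter(block["type"] for block in content)


# Shaped like a response that used extended thinking plus a tool call.
content = [
    {"type": "thinking", "thinking": "..."},
    {"type": "tool_use", "id": "toolu_1", "name": "get_weather", "input": {"city": "Paris"}},
    {"type": "text", "text": "It is sunny in Paris."},
]
print(count_block_types(content))  # Counter({'thinking': 1, 'tool_use': 1, 'text': 1})
```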
4

Typical usage

Structured outputs with Lunary context:
import pydantic


class Answer(pydantic.BaseModel):
    answer: int
    confidence: str


parsed = client.messages.parse(
    model="claude-sonnet-4-5-20250929",
    max_tokens=256,
    messages=[
        {
            "role": "user",
            "content": "Return JSON with `answer` and `confidence` for 2 + 2.",
        }
    ],
    output_format=Answer,
    tags=["support", "structured-output"],
    user_id="user_123",
    user_props={"plan": "pro"},
    metadata={"user_id": "user_123"},
)
Raw streaming and helper streams:
raw_stream = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=256,
    stream=True,
    messages=[{"role": "user", "content": "Write one sentence about Lunary tracing."}],
)

for event in raw_stream:
    pass  # consume the stream; the run is recorded once iteration completes

with client.messages.stream(
    model="claude-sonnet-4-5-20250929",
    max_tokens=256,
    messages=[{"role": "user", "content": "Write one more sentence about Lunary tracing."}],
) as stream:
    for event in stream:
        pass  # consume events; get_final_message() below returns the full message

    final_message = stream.get_final_message()
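For raw streams, text arrives as `text_delta` payloads inside `content_block_delta` events. The accumulator below is our own illustration over dict-shaped events (the SDK yields typed event objects) and mirrors what `stream.get_final_message()` assembles for helper streams:

```python
def accumulate_text(events) -> str:
    """Join text_delta fragments from content_block_delta events into one string."""
    parts = []
    for event in events:
        if event.get("type") == "content_block_delta":
            delta = event.get("delta", {})
            if delta.get("type") == "text_delta":
                parts.append(delta.get("text", ""))
    return "".join(parts)


# Dict-shaped stand-ins for a short raw stream.
events = [
    {"type": "message_start"},
    {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Lunary "}},
    {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "traces streams."}},
    {"type": "message_stop"},
]
print(accumulate_text(events))  # Lunary traces streams.
```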
Beta tool-runner loops:
from anthropic import beta_tool


@beta_tool
def get_weather(city: str) -> str:
    """Returns a canned weather response for the requested city."""
    return f"The weather in {city} is sunny and 20C."


runner = client.beta.messages.tool_runner(
    model="claude-sonnet-4-5-20250929",
    max_tokens=256,
    max_iterations=3,
    messages=[
        {
            "role": "user",
            "content": "Call the weather tool and summarize the result.",
        }
    ],
    tools=[get_weather],
)

for message in runner:
    print(message)
Anthropic validates the provider-side metadata object. Use Anthropic-supported fields such as user_id there, and use Lunary’s tags, user_id, and user_props for observability context. For Anthropic beta features that require betas=[...], pass the beta headers exactly as Anthropic documents.
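One way to keep the two namespaces straight is to assemble them in a single helper. This is a hypothetical helper of ours, not part of either SDK: Lunary's observability kwargs sit next to, but separate from, Anthropic's validated `metadata` object (the Lunary kwargs are presumably consumed by the monitor rather than forwarded to Anthropic, since Anthropic rejects unknown fields).

```python
def tracking_kwargs(user_id: str, tags: list, plan: str) -> dict:
    """Build per-request kwargs: Lunary context plus Anthropic provider metadata."""
    return {
        # Lunary observability context (handled by the monitor):
        "tags": tags,
        "user_id": user_id,
        "user_props": {"plan": plan},
        # Provider-side metadata, validated by Anthropic (user_id is a supported field):
        "metadata": {"user_id": user_id},
    }


kwargs = tracking_kwargs("user_123", ["support"], "pro")
# client.messages.create(model=..., max_tokens=..., messages=..., **kwargs)
print(sorted(kwargs))  # ['metadata', 'tags', 'user_id', 'user_props']
```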