Record and replay chat conversations in your chatbot app to understand where your chatbot falls short and how to improve it.

Chats integrate seamlessly with traces by reconciling messages with LLM calls and agents.

You can record chats on the backend, or directly on the frontend if that's easier for you.

Set up the SDK
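Before opening threads, initialize the SDK once at startup. A minimal sketch, assuming the JavaScript SDK and an app ID from your Lunary dashboard (the `appId` value below is a placeholder):

```javascript
import lunary from "lunary"

// Initialize once at startup with the app ID from your Lunary dashboard.
lunary.init({
  appId: "your-app-id" // placeholder, replace with your real app ID
})
```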

Open a thread

Start by opening a thread.

const thread = lunary.openThread()

You can resume an existing thread by passing its ID.

// Save `thread.id` somewhere
const thread = lunary.openThread({
  id: 'your-thread-id' // Replace with your actual thread ID
})
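If you track chats on the frontend, you need to persist the thread ID somewhere so the conversation can be resumed later. A hypothetical sketch (the helper, storage key, and ID generator are illustrative, not part of the SDK):

```javascript
// Hypothetical helper: reuse a saved thread id, or create and save a new one.
// `storage` is anything with getItem/setItem (e.g. localStorage in a browser).
function getOrCreateThreadId(storage, createId) {
  let id = storage.getItem("lunary-thread-id")
  if (!id) {
    id = createId()
    storage.setItem("lunary-thread-id", id)
  }
  return id
}

// Usage sketch (browser):
// const thread = lunary.openThread({
//   id: getOrCreateThreadId(localStorage, () => crypto.randomUUID())
// })
```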

You can also add tags to a thread by passing an object with a `tags` param:

const thread = lunary.openThread({
  tags: ['support']
})

Track messages

Now you can track messages. The supported roles are `assistant`, `user`, `system`, and `tool`.

thread.trackMessage({
  role: 'user',
  content: 'Hello, please help me'
})

thread.trackMessage({
  role: 'assistant',
  content: 'Hello, how can I help you?'
})

Track custom events

You can track custom events that happen within your chatbot. This can include things like:

  • opening a document
  • clicking a button
  • submitting a form
  • user activity or inactivity
  • other events that you want to track

thread.trackEvent("event-name")

// you can also track additional metadata
thread.trackEvent("open-document", {
  documentName: "my-document.pdf",
})
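For events like user inactivity, your app decides when to fire them. A minimal sketch with a hypothetical helper (the threshold, helper name, and event name are arbitrary choices, not SDK features):

```javascript
// Hypothetical helper: has the user been idle long enough to report it?
function shouldTrackInactivity(lastActivityMs, nowMs, thresholdMs = 60_000) {
  return nowMs - lastActivityMs >= thresholdMs
}

// Usage sketch:
// if (shouldTrackInactivity(lastSeen, Date.now())) {
//   thread.trackEvent("user-inactive", { idleMs: Date.now() - lastSeen })
// }
```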

Capture user feedback

Finally, you can track user feedback on bot replies:

The ID is the same as the one returned by trackMessage.

const msgId = thread.trackMessage({
  role: "assistant",
  content: "Hope you like my answers :)"
})

lunary.trackFeedback(msgId, { thumb: "up" })

To remove feedback, pass null as the feedback data.

lunary.trackFeedback(msgId, { thumb: null })
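When wiring this to thumbs-up/down buttons in your UI, it can help to normalize the click into the feedback object, using `null` to clear a previous vote. A hypothetical helper (`feedbackPayload` is illustrative, not part of the SDK):

```javascript
// Hypothetical helper: map a UI action to the feedback object
// passed to lunary.trackFeedback. "clear" removes prior feedback.
function feedbackPayload(action) {
  if (action === "clear") return { thumb: null }
  if (action === "up" || action === "down") return { thumb: action }
  throw new Error(`unknown feedback action: ${action}`)
}

// Usage sketch:
// lunary.trackFeedback(msgId, feedbackPayload("up"))
```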

Reconcile with LLM calls & agents

To take full advantage of Lunary’s tracing capabilities, you can reconcile your LLM and agent runs with the messages.

We will automatically reconcile messages with runs.

const msgId = thread.trackMessage({ role: "user", content: "Hello!" });

const res = await openai.chat.completions
  .create({
    model: "gpt-4o",
    temperature: 1,
    messages: [{ role: "user", content: "Hello!" }],
  })
  .setParent(msgId);

thread.trackMessage({
  role: "assistant",
  content: res.choices[0].message.content,
});

If you’re using LangChain or agents behind your chatbot, you can inject the current message id into context as a parent:

const msgId = thread.trackMessage({ role: "user", content: "Hello!" });

// In your backend, inject the message id into the context

const agent = lunary.wrapAgent(function ChatbotAgent(query) {
  // your custom code...
});

await agent("Hello!").setParent(msgId);

Note that it’s safe to pass the message ID from your frontend to your backend, for example if you’re tracking chats directly on the frontend.
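A sketch of what that hand-off could look like, with a made-up request shape (`parentRunId` is just an illustrative field name, not an API requirement):

```javascript
// Frontend: include the tracked message id in the request to your backend.
function buildChatRequest(msgId, text) {
  return { parentRunId: msgId, message: text }
}

// Backend: validate and extract the id before using it with .setParent(...).
function parseChatRequest(body) {
  if (typeof body.parentRunId !== "string" || typeof body.message !== "string") {
    throw new Error("invalid chat request")
  }
  return { parentRunId: body.parentRunId, message: body.message }
}

// Backend usage sketch:
// const { parentRunId, message } = parseChatRequest(req.body)
// await agent(message).setParent(parentRunId)
```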