Ollama lets you quickly self-host large language models.
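Before wiring up the SDK, you need a running Ollama server with a model available locally. A minimal setup might look like this (assuming Ollama is installed and you want the `llama3.2` model used in the examples below; the desktop app may already run the server for you):

```shell
# Download the model used in the examples below
ollama pull llama3.2

# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```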
Because Ollama exposes an OpenAI-compatible API, our SDKs' OpenAI integrations work with it out of the box.
1
Set up the SDK
Python
Learn how to set up the Python SDK.
JavaScript
Learn how to set up the JavaScript SDK.
2
Monitor Ollama
With our SDKs, tracking Ollama calls takes only a few lines of code.
Python
JavaScript
```python
from openai import OpenAI
import lunary

client = OpenAI(
    base_url='http://localhost:11434/v1/',  # replace with your Ollama base URL
    api_key='ollama',  # required but ignored
)
lunary.monitor(client)

chat_completion = client.chat.completions.create(
    messages=[{'role': 'user', 'content': 'Say this is a test'}],
    model='llama3.2',
)
```
```javascript
import OpenAI from 'openai'
import { monitorOpenAI } from "lunary/openai"

const openai = monitorOpenAI(
  new OpenAI({
    baseURL: 'http://localhost:11434/v1/', // replace with your Ollama base URL
    apiKey: 'ollama', // required but ignored
  })
)

const chatCompletion = await openai.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'llama3.2',
})
```