Anastasia Spangler • 8 months ago
Agents w/ Ollama
I can't get Agents to work with Ollama. Chat completions run fine, but any Agents run just returns an empty string. It's only the Agents path that's getting me.
From the cookbook: https://cookbook.openai.com/articles/gpt-oss/run-locally-ollama
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Local Ollama API
    api_key="ollama",                      # Dummy key
)

response = client.chat.completions.create(
    model="gpt-oss:20b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what MXFP4 quantization is."},
    ],
)

print(response.choices[0].message.content)

1 comment
Edwin Arbus • 8 months ago
Can you share more info? What are you trying to do and what actually happened? Do you have a particular error to show?