Ollama (LLM)
Use Ollama as the LLM (dialogue + reasoning) provider.
Setup
Depending on your deployment, you may need an API key.
- Env var: OLLAMA_API_KEY
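As a rough sketch of the usual precedence, an explicit api_key argument overrides the OLLAMA_API_KEY environment variable, and both are optional for local deployments without auth. The resolve_api_key helper below is hypothetical, for illustration only, and is not part of the plugin's API:

```python
import os

def resolve_api_key(explicit=None):
    # Hypothetical helper: an explicit argument wins, then the
    # OLLAMA_API_KEY env var; None means no auth (local deployment).
    if explicit is not None:
        return explicit
    return os.environ.get("OLLAMA_API_KEY")

os.environ["OLLAMA_API_KEY"] = "secret-from-env"
print(resolve_api_key())            # secret-from-env
print(resolve_api_key("explicit"))  # explicit
```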
Example
from siphon.plugins import ollama

llm = ollama.LLM(
    model="llama3.1",
    base_url="http://localhost:11434/v1",
)
Common options
- model (default: "llama3.1")
- base_url (default: "http://localhost:11434/v1")
- api_key (default: None)
- temperature (default: 0.3)
- parallel_tool_calls (default: True)
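To make the defaults above concrete, here is a minimal sketch that mirrors them in a standalone dataclass. OllamaLLMOptions is a hypothetical stand-in for illustration, not the plugin's actual options class; the real constructor takes these values as keyword arguments, as in the example above:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical mirror of the plugin's documented defaults.
@dataclass
class OllamaLLMOptions:
    model: str = "llama3.1"
    base_url: str = "http://localhost:11434/v1"
    api_key: Optional[str] = None
    temperature: float = 0.3
    parallel_tool_calls: bool = True

# Override only what you need; everything else keeps its default.
opts = OllamaLLMOptions(temperature=0.0)  # e.g. near-deterministic decoding
print(opts.model)        # llama3.1
print(opts.temperature)  # 0.0
```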