# Mistral (LLM)
Use Mistral as the LLM (dialogue + reasoning) provider.
## Setup
- Set `MISTRAL_API_KEY` in your environment:

```shell
# .env
MISTRAL_API_KEY=...
```
## Example
```python
from siphon.plugins import mistralai

llm = mistralai.LLM(
    model="mistral-large-latest",
)
```
## Common options
- `model` (default: `mistral-medium-latest`)
- `api_key` (default: `None`)
- `temperature` (default: `0.3`)
- `max_completion_tokens` (default: `150`)
- `timeout` (default: `15`)
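As a sketch, the common options above can be passed as keyword arguments to the constructor. This assumes `mistralai.LLM` accepts each option by the name listed; the values shown are simply the documented defaults.

```python
from siphon.plugins import mistralai

# Sketch: every documented option passed explicitly with its default
# value; omit an argument to keep its default. api_key=None is assumed
# to mean the key is read from the MISTRAL_API_KEY environment variable
# set in the Setup section.
llm = mistralai.LLM(
    model="mistral-medium-latest",
    api_key=None,
    temperature=0.3,
    max_completion_tokens=150,
    timeout=15,
)
```

In practice you would typically override only `model` and `temperature`, leaving the key to come from the environment.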