# Gemini (LLM)
Use Gemini as the LLM (dialogue + reasoning) provider.
## Setup
- Set `GEMINI_API_KEY` in your environment:
```bash
# .env
GEMINI_API_KEY=...
```
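The key can also be passed to the constructor via the `api_key` option listed under Common options below. A minimal sketch, assuming the `python-dotenv` package is installed for loading the `.env` file:

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

from siphon.plugins import gemini

# Load GEMINI_API_KEY from the local .env file into the process environment.
load_dotenv()

# Passing api_key explicitly is optional; with the default (None), the plugin
# presumably reads GEMINI_API_KEY from the environment.
llm = gemini.LLM(api_key=os.environ["GEMINI_API_KEY"])
```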
## Example
```python
from siphon.plugins import gemini

llm = gemini.LLM(
    model="gemini-2.0-flash",
)
```
## Common options
- `model` (default: `gemini-2.5-flash-lite`)
- `api_key` (default: `None`)
- `max_output_tokens` (default: `150`)
- `temperature` (default: `0.3`)
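Putting these together, a fully configured instance might look like the sketch below; every value shown is simply the documented default, so in practice you would only pass the options you want to change.

```python
from siphon.plugins import gemini

# All options shown with their documented defaults; omit any to keep its default.
llm = gemini.LLM(
    model="gemini-2.5-flash-lite",  # default model
    api_key=None,                   # default: None (key taken from GEMINI_API_KEY)
    max_output_tokens=150,          # default: 150
    temperature=0.3,                # default: 0.3
)
```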