# LLM Providers
LLM plugins power the agent’s dialogue and reasoning.
## Set your API key in .env
Create a `.env` file in the same directory where you run the worker and add your provider key:
```shell
# .env
OPENAI_API_KEY=...
```
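As a quick sanity check, you can confirm the key is visible to your process once `load_dotenv()` has run (or once the variable is exported in your shell). This is a plain-stdlib sketch; the plugin reads the variable on its own, so this step is purely diagnostic:

```python
import os

# After load_dotenv() (or a shell export), the key should appear in
# the process environment under the name the provider plugin expects.
api_key = os.environ.get("OPENAI_API_KEY")
print("OPENAI_API_KEY set:", api_key is not None)
```

If this prints `False`, check that the `.env` file sits in the directory you launch the worker from.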
## How to use an LLM plugin

### Worker defaults
```python
from dotenv import load_dotenv

from siphon.agent import Agent
from siphon.plugins import openai

load_dotenv()

agent = Agent(
    agent_name="Receptionist",
    llm=openai.LLM(),
    system_instructions="You are a helpful receptionist.",
)

if __name__ == "__main__":
    agent.dev()
```
### Per-call overrides

You can also pass an LLM object into `Dispatch(...)` or `Call(...)` to override the worker default for a single call.
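A minimal sketch of a per-call override, assuming `Dispatch(...)` accepts an `llm` keyword alongside the agent name (the exact parameter names and the `Dispatch` import path are assumptions beyond what this page shows; the model name is illustrative):

```python
from siphon.plugins import openai

# Hypothetical usage: the keyword names below are assumed, not
# confirmed by this page -- check the Dispatch API reference.
dispatch = Dispatch(
    agent_name="Receptionist",
    llm=openai.LLM(),  # overrides the worker-level default for this call only
)
```

The worker-level `llm` still applies to every call that does not pass its own.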