Perplexity (LLM)

Use Perplexity as the LLM (dialogue and reasoning) provider.

Setup

  • Set PERPLEXITY_API_KEY in your environment, for example via a .env file:
# .env
PERPLEXITY_API_KEY=...
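
If your process does not load .env files automatically, a minimal sketch using python-dotenv (an assumption; any environment loader works) to make the key visible before constructing the plugin:

import os

from dotenv import load_dotenv  # requires the python-dotenv package

load_dotenv()  # copies PERPLEXITY_API_KEY from .env into the process environment
assert os.getenv("PERPLEXITY_API_KEY"), "PERPLEXITY_API_KEY is not set"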

Example

from siphon.plugins import perplexity

# Create a Perplexity-backed LLM; the API key is read from PERPLEXITY_API_KEY
llm = perplexity.LLM(
    model="sonar",
)

Common options

  • model (default: llama-3.1-sonar-small-128k-chat)
  • base_url (default: https://api.perplexity.ai)
  • api_key (default: None)
  • temperature (default: 0.3)
  • parallel_tool_calls (default: True)
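
A sketch passing each option explicitly, using the documented defaults as placeholder values (the keyword names are assumed to match the option names listed above):

from siphon.plugins import perplexity

llm = perplexity.LLM(
    model="llama-3.1-sonar-small-128k-chat",  # documented default model
    base_url="https://api.perplexity.ai",     # documented default endpoint
    api_key=None,                             # None is assumed to fall back to PERPLEXITY_API_KEY
    temperature=0.3,                          # documented default sampling temperature
    parallel_tool_calls=True,                 # documented default
)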