Configuration is how you define your agent’s behavior—which LLM it uses and how it responds. Get this right and your agent feels natural. Get it wrong and users will notice.

What is Configuration?

Every agent needs two things configured:
  1. LLM Settings — Which model to use, temperature, streaming behavior
  2. Prompts — System instructions that define personality and constraints
The SDK provides sensible defaults, so you can start simple and tune later.
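For example, a minimal sketch of those two pieces of configuration. The `LLMConfig` class and its field names are illustrative assumptions, not the SDK's actual API; substitute your SDK's own configuration object.

```python
# Minimal sketch, assuming a hypothetical LLMConfig container -- adjust the
# class and field names to match your SDK's actual configuration API.
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Illustrative container for the two things every agent needs configured."""
    model: str = "gpt-4o"          # which LLM to use
    temperature: float = 0.7       # 0.0 = deterministic, 1.0 = creative
    stream: bool = True            # stream tokens for real-time responses
    system_prompt: str = (
        "You are a helpful support agent. Keep answers short and factual."
    )


# Start with the defaults, then tune only the fields you need.
default_config = LLMConfig()
tuned_config = LLMConfig(temperature=0.2,
                         system_prompt="You are a formal legal assistant.")
print(default_config.model, tuned_config.temperature)
```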

Key Settings

| Setting | What It Controls |
| --- | --- |
| Model | GPT-4o, Claude, Llama, or your own |
| Temperature | 0.0 = deterministic, 1.0 = creative |
| Streaming | Essential for real-time responses (always use stream=True) |
| System Prompt | The personality, rules, and context |
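To make the table concrete, here is a hedged illustration of how these settings typically map onto a single OpenAI-style chat request. The payload keys follow the common chat-completions convention; the commented client call is an assumption about your provider's client, not a guarantee of this SDK's interface.

```python
# Illustrative request payload showing where each key setting lands.
request = {
    "model": "gpt-4o",       # Model: GPT-4o, Claude, Llama, or your own
    "temperature": 0.2,      # Temperature: lower = more deterministic
    "stream": True,          # Streaming: keep on for real-time responses
    "messages": [
        # System Prompt: personality, rules, and context come first
        {"role": "system", "content": "You are a concise billing assistant."},
        {"role": "user", "content": "Why was I charged twice this month?"},
    ],
}

# With an OpenAI-style client (assumption), the payload would be sent like:
# response = client.chat.completions.create(**request)
# for chunk in response:  # stream=True yields incremental chunks
#     print(chunk.choices[0].delta.content or "", end="")
print(request["model"], request["temperature"])
```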

What’s Next