
What is a System Prompt?
A system prompt is a set of instructions given to an LLM at the beginning of a conversation that defines its behavior, role, constraints, and output format. It acts as the model's "operating manual," shaping every response while, in many applications, remaining invisible to the end user.
Why It Matters
The system prompt is the primary mechanism for customizing LLM behavior in production applications. It determines whether the model acts as a helpful assistant, a JSON extraction tool, a customer support agent, or a creative writing partner. Effective system prompts are the difference between a generic chatbot and a reliable, branded AI product.
How It Works
How it's provided:
- Via the API's system role (OpenAI, Anthropic, Google), placed before the conversation history in the message array
- Some APIs also support a developer role for additional instructions
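The message-array placement can be sketched as follows. This is a minimal illustration using the OpenAI-style chat message format; the commented-out API call and model name are examples, not live code.

```python
# A system prompt is passed as the first message, with role "system",
# ahead of the user/assistant conversation history.

system_prompt = "You are a concise assistant. Answer in at most two sentences."

messages = [
    {"role": "system", "content": system_prompt},  # placed before the history
    {"role": "user", "content": "What is a system prompt?"},
]

# With the OpenAI Python SDK, the request would then look roughly like:
# client.chat.completions.create(model="gpt-4o", messages=messages)
```

The key point is ordering: the system message comes first, so it conditions every subsequent turn in the conversation.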
Common system prompt components:
- Role/persona: "You are a financial advisor specializing in personal investment."
- Behavior rules: "Always be concise. Never give medical advice. Respond in Dutch."
- Output format: "Return responses as JSON with fields: answer, confidence, sources."
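In practice, these components are often assembled programmatically. The helper below is hypothetical (the function name and structure are illustrative, not from any library) and simply shows how persona, rules, and output format can be combined into one system prompt string.

```python
# Hypothetical helper that assembles a system prompt from the three
# common components listed above: persona, behavior rules, output format.

def build_system_prompt(persona: str, rules: list[str], output_format: str) -> str:
    parts = [persona, "Rules:"]
    parts += [f"- {rule}" for rule in rules]
    parts.append(f"Output format: {output_format}")
    return "\n".join(parts)

prompt = build_system_prompt(
    persona="You are a financial advisor specializing in personal investment.",
    rules=["Always be concise.", "Never give medical advice."],
    output_format="JSON with fields: answer, confidence, sources.",
)
print(prompt)
```

Keeping the components separate like this makes it easy to vary one part (say, the output format) per deployment without rewriting the whole prompt.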