Build a complete system prompt in seconds. Pick role, task, and tone — then customise. Works with ChatGPT, Claude, Gemini, Mistral, and any LLM that accepts a system field.
A system prompt is an instruction given to an AI model before the conversation starts. It defines the model's persona, constraints, tone, and task. Everything in the system prompt shapes how the model responds throughout the session.
In ChatGPT: use the "Custom instructions" feature (Settings → Personalization → Custom instructions), or pass it as the system message in the API. In Claude: use the system prompt field in the API, or the "System Prompt" box in Claude.ai Projects. In most APIs it goes either in a dedicated system parameter or in a message with the role "system".
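To make the two API shapes concrete, here is a minimal sketch of where the system prompt sits in each request body, built as plain dictionaries with no network calls. The field names match the OpenAI and Anthropic APIs; the prompt text and model names are illustrative placeholders.

```python
# Sketch: where a system prompt lives in two common API request shapes.
# No requests are sent — these are just the JSON bodies as Python dicts.

system_prompt = "You are a concise technical editor."

# OpenAI-style: the system prompt is the first entry in the messages
# list, carried by a message whose role is "system".
openai_body = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Summarise this paragraph."},
    ],
}

# Anthropic-style: the system prompt is a top-level "system" field,
# kept separate from the user/assistant messages list.
anthropic_body = {
    "model": "claude-example",  # placeholder model name
    "max_tokens": 1024,
    "system": system_prompt,
    "messages": [
        {"role": "user", "content": "Summarise this paragraph."},
    ],
}
```

Either body can then be sent with the provider's official SDK or a plain HTTPS POST; only the placement of the system prompt differs.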
Yes — the output box is fully editable. The dropdowns give you a solid starting point, but the final prompt is yours to customise. Click anywhere in the output and modify it before copying.
Tokens are the units AI models use for pricing and context limits. In English text, roughly 1 token ≈ 0.75 words. System prompts count toward your context window, but most models offer between 4K and 200K tokens of context, so a 100-token system prompt is negligible.
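The 1 token ≈ 0.75 words rule gives a quick back-of-envelope estimate. The helper below is a hypothetical illustration of that arithmetic, not a real tokenizer — for exact counts, use the model's own tokenizer (e.g. OpenAI's tiktoken library).

```python
# Rough token estimate from the rule of thumb above:
# 1 token ≈ 0.75 words, so tokens ≈ words / 0.75.

def estimate_tokens(text: str) -> int:
    """Back-of-envelope estimate from word count; not a real tokenizer."""
    words = len(text.split())
    return round(words / 0.75)

prompt = "You are a helpful assistant. Answer concisely and cite sources."
# 10 words -> an estimate of about 13 tokens, a tiny fraction of even
# a 4K-token context window.
print(estimate_tokens(prompt))
```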
All major LLMs support system prompts via their APIs: OpenAI (GPT-4, GPT-4o), Anthropic (Claude), Google (Gemini), Mistral, Cohere, and open-weight models such as Llama and Mixtral. System prompts are the standard way to configure AI behaviour in production applications.