System Prompt Builder

Runs in browser

Build robust system prompts from structured sections and role templates. Preview the assembled output with character and approximate token counts.

The builder organizes a prompt into five sections (Role & Persona, Primary Goal, Instructions, Constraints, Output Format) and assembles them into a live preview:
Role & Persona:
You are an expert software engineer with strong TypeScript, React, and system design experience.

Primary Goal:
Help developers write clean, efficient, and maintainable code.

Instructions:
- Explain reasoning briefly.
- Prefer safe defaults.
- Include concrete code when useful.

Constraints:
- Never invent APIs.
- Do not leak secrets.
- Ask clarifying questions when requirements are ambiguous.

Output Format:
Use concise markdown with sections: Solution, Code, Notes.

488 characters · ~122 tokens
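The token figure above is an estimate, not a tokenizer count. A common rule of thumb for English text, and an assumption here about how the preview computes it, is roughly four characters per token:

```typescript
// Rough token estimate: ~4 characters per token for English text.
// This is a planning heuristic, not a tokenizer; real counts vary by model.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}
```

For the 488-character prompt above, this heuristic gives 122 tokens, matching the preview.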

How to use

  1. Select template

    Start from Custom, Coding Assistant, Customer Support, or another preset.

  2. Fill sections

    Edit role, goal, instructions, constraints, output format, and examples.

  3. Copy preview

    Use the live assembled prompt directly in your LLM app or SDK.
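The three steps above reduce to joining labeled sections into one string. A minimal sketch of that assembly (the section labels follow the tool's UI; the `PromptSections` type and `buildPrompt` function are illustrative, not the tool's actual code):

```typescript
interface PromptSections {
  role: string;
  goal: string;
  instructions: string[];
  constraints: string[];
  outputFormat: string;
}

// Assemble labeled sections into one plain-text system prompt,
// skipping any section that was left empty.
function buildPrompt(s: PromptSections): string {
  const parts: string[] = [];
  if (s.role) parts.push(`Role & Persona:\n${s.role}`);
  if (s.goal) parts.push(`Primary Goal:\n${s.goal}`);
  if (s.instructions.length)
    parts.push(`Instructions:\n${s.instructions.map(i => `- ${i}`).join("\n")}`);
  if (s.constraints.length)
    parts.push(`Constraints:\n${s.constraints.map(c => `- ${c}`).join("\n")}`);
  if (s.outputFormat) parts.push(`Output Format:\n${s.outputFormat}`);
  return parts.join("\n\n");
}
```

Skipping empty sections keeps the assembled prompt free of dangling headers when a template only fills some fields.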

Examples

  • Coding assistant

    Template-first workflow.

    Output: Role + Goal + Rules + Output format...

Frequently asked questions

Is the token count exact?
No. The preview uses a quick estimate for planning; final counts vary by model and tokenizer.
Can I use this with any model?
Yes. Prompts are plain text and can be adapted to different providers.
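Because the output is plain text, it drops into any chat-style API as the system message. A provider-agnostic sketch (the model name and endpoint are placeholders, not a specific provider's API):

```typescript
// The assembled prompt is just a string, so it slots into any
// chat-style request body as the system message.
const systemPrompt =
  "You are an expert software engineer with strong TypeScript, React, and system design experience.";

const body = {
  model: "your-model-name", // placeholder: substitute your provider's model id
  messages: [
    { role: "system", content: systemPrompt },
    { role: "user", content: "Refactor this function for readability." },
  ],
};

// Then POST `body` as JSON to your provider's chat endpoint, e.g.:
// await fetch("https://api.example.com/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json", Authorization: `Bearer ${key}` },
//   body: JSON.stringify(body),
// });
```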