Free 🧰 Practical LLM Lessons
LLM Lab 5: Instructions and Guardrails
See how system messages, user requests, and safety policies stack to shape the final reply.
Published January 8, 2026
Who is really in charge?
LLMs weigh multiple instruction layers: system, developer, and user. Lower-priority messages can be ignored wherever they conflict with higher-priority rules.
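These layers arrive as a list of role-tagged messages, following the widely used system/developer/user chat convention (exact role names vary by provider). A minimal sketch:

```python
# Each instruction layer is a message tagged with a role. The role, not
# the position in the list, determines its priority.
messages = [
    {"role": "system", "content": "Be a concise assistant."},      # highest priority
    {"role": "developer", "content": "Answer in plain English."},  # app-level rules
    {"role": "user", "content": "Write a 1,000-word essay."},      # lowest priority
]

# Lower number = higher priority.
PRIORITY = {"role_order": None, "system": 0, "developer": 1, "user": 2}
ordered = sorted(messages, key=lambda m: PRIORITY[m["role"]])
print([m["role"] for m in ordered])  # → ['system', 'developer', 'user']
```

Sorting by role rather than arrival order is the key idea: a user message that arrives last still sits at the bottom of the stack.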
Can You Guess?
A chat contains — System: 'Be a concise assistant.' User: 'Write a 1,000-word essay.' Which instruction wins?
Instruction priority stack
Work through the four layers in order, from highest to lowest priority.
Step 1: System
Highest priority: sets behavior and safety guardrails.
Step 2: Developer
Optional middle layer: templates or app-specific rules.
Step 3: User
Lowest priority: the prompt you type.
Step 4: Policies
Safety checks can refuse or redact content even if prompted otherwise.
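The four steps above can be sketched as a toy resolver. Everything here is illustrative: real models weigh instructions statistically during generation, not with explicit rules like these.

```python
# Toy resolver for the four-layer stack: system > developer > user,
# with a policy filter that can override every layer.
ROLE_PRIORITY = {"system": 0, "developer": 1, "user": 2}

def resolve(constraints, blocked_topics=()):
    """Pick each setting from the highest-priority layer that states it,
    then apply a policy check that trumps all layers (Step 4)."""
    resolved = {}
    for role in sorted(constraints, key=ROLE_PRIORITY.get):
        for key, value in constraints[role].items():
            resolved.setdefault(key, value)  # first (highest-priority) writer wins
    if resolved.get("topic") in blocked_topics:
        return {"refused": True}  # policies can refuse even a permitted stack
    return resolved

stack = {
    "system": {"length": "concise"},
    "user": {"length": "1000 words", "topic": "history"},
}
print(resolve(stack))  # → {'length': 'concise', 'topic': 'history'}
print(resolve(stack, blocked_topics=("history",)))  # → {'refused': True}
```

Note how the user's "1000 words" is silently dropped because the system layer already set a length, while the policy check sits outside the stack entirely.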
Try this now
Swap the system and user instructions from the quiz above — put 'Write a 1,000-word essay' in the system message and 'Be a concise assistant' in the user turn — and watch which constraint the reply obeys.
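The flip experiment can be simulated with a tiny illustrative function (not a real model) that always lets the system turn speak first:

```python
# Sketch of the "flip" experiment: whichever length constraint sits in
# the system slot wins, because the system turn is checked first.
def winning_length(system_msg, user_msg):
    for msg in (system_msg, user_msg):  # system first = higher priority
        if "concise" in msg or "word" in msg:
            return msg
    return None

original = winning_length("Be a concise assistant.", "Write a 1,000-word essay.")
flipped = winning_length("Write a 1,000-word essay.", "Be a concise assistant.")
print(original)  # → 'Be a concise assistant.'
print(flipped)   # → 'Write a 1,000-word essay.'
```

Same two instructions, opposite outcome — only their layer changed.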
Key Takeaways
- ✓ Instruction order and type set priorities.
- ✓ Conflicts create unpredictable responses.
- ✓ Clear, non-conflicting instructions yield stable outputs.