System Prompts Overview
System prompts are the foundational instructions that guide GPT-5.1's behavior when generating gift recommendations. They define tone, format, language rules, and quality controls.
Purpose
System prompts ensure:
- Consistent tone across all responses
- Proper formatting of product recommendations
- Language accuracy (Estonian and English)
- Quality controls to prevent hallucinations
- Conversion-focused phrasing (benefits, calls to action) that makes recommendations more compelling
Architecture
Language Selection Rules
Location: app/api/chat/system-prompt.ts:246-267
Rule 1: Estonian Query → Estonian Prompt
if (language === 'et') {
  return buildEstonianSystemPrompt();
}
Rule 2: English Query → English Prompt
if (language === 'en') {
  return buildEnglishSystemPrompt();
}
Rule 3: Mixed/Undefined → Estonian (Default)
if (!language || language === 'mixed') {
  return buildEstonianSystemPrompt(); // Default to primary market
}
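Taken together, the three rules collapse into a single selection helper. A minimal sketch, assuming the builder functions above and the getBaseSystemPrompt name used later on this page (the actual implementation in system-prompt.ts may order the checks differently):

function getBaseSystemPrompt(language?: 'et' | 'en' | 'mixed'): string {
  // English is used only when it is explicitly detected.
  if (language === 'en') {
    return buildEnglishSystemPrompt();
  }
  // 'et', 'mixed', and undefined all fall back to the primary-market prompt.
  return buildEstonianSystemPrompt();
}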
Prompt Components
Each system prompt contains the following sections (an illustrative skeleton follows the list):
- Identity & Role
  - "I am RahvaRaamat AI shopping assistant"
  - Clear purpose statement
- Response Style
  - Tone configuration (enthusiastic/friendly/professional)
  - Sentence count per product (3-4)
  - Conversion language flags
- Formatting Rules
  - Numbered lists (1., 2., 3.)
  - Bold product titles
  - Structured descriptions
- Critical Prohibitions
  - No hallucinated content
  - No website URLs
  - No invented details
- Context Enrichment (optional)
  - When to ask questions
  - How many to ask
  - What to ask about
- Context Reminders (multi-turn)
  - Recipient context
  - Occasion context
  - Coherence across turns
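For orientation, the skeleton below shows how these sections are concatenated into one prompt string. It is a simplified sketch, not the verbatim prompt text; the identity line comes from the list above, and the style values are read from chatConfig (see the next section), which is assumed to be imported.

// Illustrative only; the real wording lives in system-prompt.ts.
const promptSkeleton = [
  // 1. Identity & Role
  'I am RahvaRaamat AI shopping assistant, here to recommend gifts.',
  // 2. Response Style (driven by chatConfig.productDescriptions)
  `Tone: ${chatConfig.productDescriptions.tone}. ` +
    `Use about ${chatConfig.productDescriptions.sentencesPerProduct} sentences per product.`,
  // 3. Formatting Rules
  'Present products as a numbered list (1., 2., 3.) with bold product titles.',
  // 4. Critical Prohibitions
  'Never invent product details, website URLs, or content not present in the product data.',
].join('\n\n');
// Sections 5-6 (enrichment questions, context reminders) are appended
// conditionally by generateDynamicSystemPrompt (see Dynamic Generation below).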
Configuration-Driven
Location: app/chat/config.ts
export const chatConfig = {
  productDescriptions: {
    tone: 'enthusiastic' | 'friendly' | 'professional',
    sentencesPerProduct: 3,
    useConversionLanguage: boolean,
    emphasizeBenefits: boolean,
    personalizeToContext: boolean
  },
  contextEnrichment: {
    enabled: boolean,
    maxQuestions: number,
    requireProducts: boolean,
    minConfidence: number
  }
};
Changes to the config automatically update the prompts; no code changes are needed.
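As a concrete example of that coupling, a helper like enrichmentInstructions() (referenced in the Dynamic Generation snippet below) can read its limits straight from chatConfig. This is a hedged sketch of how such a helper could look, not the actual wording:

function enrichmentInstructions(): string {
  const { maxQuestions, requireProducts } = chatConfig.contextEnrichment;
  return [
    '\n\nIf key context (such as the recipient or occasion) is unclear, ask clarifying questions.',
    `Ask at most ${maxQuestions} question(s) per reply.`,
    requireProducts
      ? 'Always include product recommendations alongside your questions.'
      : 'You may ask questions before recommending any products.',
  ].join(' ');
}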
Dynamic Generation
function generateDynamicSystemPrompt(
  language?: 'et' | 'en' | 'mixed',
  context?: { recipient?: string; occasion?: string }
): string {
  // 1. Select base prompt by language
  let basePrompt = getBaseSystemPrompt(language);

  // 2. Add context reminders if available (recipient and/or occasion)
  if (context?.recipient) {
    basePrompt += contextReminder(context.recipient);
  }
  if (context?.occasion) {
    basePrompt += contextReminder(context.occasion);
  }

  // 3. Add enrichment section if enabled
  if (chatConfig.contextEnrichment.enabled) {
    basePrompt += enrichmentInstructions();
  }

  return basePrompt;
}
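Example usage for a follow-up turn in an Estonian conversation (the recipient and occasion values are illustrative):

const systemPrompt = generateDynamicSystemPrompt('et', {
  recipient: 'sister',
  occasion: 'birthday',
});
// Result: Estonian base prompt + recipient and occasion reminders,
// plus enrichment instructions when chatConfig.contextEnrichment.enabled is true.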
Validation Pipeline
After GPT-5.1 generates a response, it passes through post-generation validation checks (see Response Validation below) that detect hallucinated content before the answer is returned to the user.
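A minimal sketch of what such a check can look like, assuming a hypothetical validateResponse helper that receives the generated text plus the products actually supplied to the model (the real checks are covered in Response Validation):

interface Product {
  title: string;
}

// Hypothetical post-generation check: returns a list of problems, empty = pass.
function validateResponse(response: string, products: Product[]): string[] {
  const issues: string[] = [];

  // The prompt forbids website URLs, so any link is flagged.
  if (/https?:\/\//i.test(response)) {
    issues.push('Response contains a URL');
  }

  // Bold titles must match one of the recommended products (no invented items).
  const boldTitles = [...response.matchAll(/\*\*(.+?)\*\*/g)].map((m) => m[1]);
  for (const title of boldTitles) {
    if (!products.some((p) => p.title.toLowerCase() === title.toLowerCase())) {
      issues.push(`Unknown product mentioned: ${title}`);
    }
  }

  return issues;
}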
Key Features
Multi-Language Support
- Native Estonian and English prompts
- Cultural context awareness
- Proper grammar and idioms
Hallucination Prevention
- Strict rules against content invention
- Product-only descriptions
- Validation checks post-generation
Conversion Optimization
- Enthusiastic tone options
- Benefit-focused language
- Call-to-action phrasing
Context Coherence
- Multi-turn conversation memory
- Recipient/occasion reminders
- Consistent personalization
File Structure
app/api/chat/
├── system-prompt.ts # Main prompt logic
├── config.ts # Configuration
└── utils/
└── context-enrichment-questions.ts # Question generation
Documentation Pages
- Estonian Prompt - Estonian language rules and formatting
- English Prompt - English language rules and formatting
- Dynamic Generation - Language selection and context adaptation
- Context Enrichment - Question generation for missing context
- Response Validation - Hallucination detection and prevention
- Configuration - Tone, conversion language, and feature flags
- Followup Router Prompt - Semantic reasoning for followup classification
Performance Impact
- Prompt build time: ~5 ms (negligible)
- Prompt tokens: ~400 (moderate context size)
- Caching: enabled (the system prompt is cached)
- Cache hit rate: ~95% (very high reuse)
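The high hit rate is plausible because the prompt varies only with the language and a small context object. A minimal sketch of such a cache, assuming simple in-memory memoization (the figure may equally refer to provider-side prompt caching; either way, the point is that an identical prompt prefix is reused):

const promptCache = new Map<string, string>();

function cachedSystemPrompt(
  language?: 'et' | 'en' | 'mixed',
  context?: { recipient?: string; occasion?: string }
): string {
  // Key on everything the prompt depends on.
  const key = JSON.stringify([language ?? 'et', context ?? {}]);
  let prompt = promptCache.get(key);
  if (prompt === undefined) {
    prompt = generateDynamicSystemPrompt(language, context);
    promptCache.set(key, prompt);
  }
  return prompt;
}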
Related Documentation
- Phase 5: Response Generation - How prompts are used
- Pipeline Models - GPT-5.1 configuration