Prompt Engineering Guide

Mastering Medical Report Summarization
on Mixtral 8x22B

Stop guessing. See how professional prompt engineering transforms Mixtral 8x22B's output for specific technical tasks.

The "Vibe" Prompt

"Summarize this medical report for me. Make it easy to understand. Keep it short. Here's the report: [Medical Report Text]"
Low specificity, inconsistent output

Optimized Version

You are a highly experienced medical transcriber and summarizer. Your task is to generate a concise, accurate, and easy-to-understand summary of the provided medical report. The summary should be suitable for a general audience and highlight key patient information, diagnoses, treatments, and future recommendations.

Follow this chain of thought:
1. **Identify Patient Demographics:** Extract patient's age and gender if present.
2. **Chief Complaint/Reason for Visit:** Determine the primary reason the patient sought medical attention.
3. **Key Medical History:** Note any significant past medical conditions, surgeries, or relevant family history.
4. **Major Findings/Diagnoses:** List the most important diagnostic findings and confirmed diagnoses.
5. **Treatments & Medications:** Detail current or proposed treatments, including medications, procedures, or therapies.
6. **Prognosis & Recommendations:** Summarize the anticipated outcome and any future instructions, follow-up appointments, or lifestyle changes.
7. **Synthesize:** Combine the extracted information into a coherent, flowing summary, prioritizing clarity and brevity.

Begin by extracting the requested information from the report section by section, then provide the final summary. If a section is not applicable or information is missing, state 'N/A'.

**Medical Report:**
[Medical Report Text]

**Extracted Information:**
1. Patient Demographics:
2. Chief Complaint/Reason for Visit:
3. Key Medical History:
4. Major Findings/Diagnoses:
5. Treatments & Medications:
6. Prognosis & Recommendations:

**Summary:**
Structured, task-focused, reduced hallucinations
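In practice, a prompt like this is usually stored as a reusable template and filled with the report text at call time rather than pasted by hand. A minimal sketch (abridged, and using the hypothetical helper name `build_summary_prompt`, which is not from the guide):

```python
# Store the optimized prompt as a template with a {report} placeholder.
# The template text below is abridged for illustration.
PROMPT_TEMPLATE = (
    "You are a highly experienced medical transcriber and summarizer. "
    "Generate a concise, accurate, and easy-to-understand summary of the "
    "provided medical report for a general audience.\n\n"
    "**Medical Report:**\n{report}\n\n"
    "**Extracted Information:**\n"
    "1. Patient Demographics:\n"
    "2. Chief Complaint/Reason for Visit:\n"
    "3. Key Medical History:\n"
    "4. Major Findings/Diagnoses:\n"
    "5. Treatments & Medications:\n"
    "6. Prognosis & Recommendations:\n\n"
    "**Summary:**"
)

def build_summary_prompt(report: str) -> str:
    """Fill the template's {report} placeholder with the report text."""
    return PROMPT_TEMPLATE.format(report=report.strip())
```

Ending the template at `**Summary:**` nudges the model to complete the scratchpad first and emit the summary last, matching the prompt's intended order.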

Engineering Rationale

The optimized prompt leverages Chain-of-Thought (CoT) prompting, breaking down the complex task into manageable, sequential steps. This forces Mixtral 8x22B to process the information systematically, reducing the likelihood of omissions or inaccuracies. By explicitly defining the roles ('medical transcriber and summarizer') and target audience ('general audience'), it sets clear expectations for tone and complexity. The detailed instructions for each extraction point ensure comprehensive coverage of critical information. The 'Synthesize' step encourages coherent narrative generation, and the structured output format for 'Extracted Information' acts as an intermediate scratchpad, making the model's reasoning transparent and enabling self-correction before generating the final summary. This structure guides the model more effectively than a vague, short instruction.
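Because the prompt fixes the output layout, the scratchpad can also be separated from the final summary mechanically. A hedged sketch, assuming the model emits the `**Summary:**` marker the prompt asks for (`split_scratchpad` is an illustrative name, not part of any library):

```python
def split_scratchpad(model_output: str) -> tuple:
    """Return (extracted_information, summary) by splitting on the
    '**Summary:**' marker the prompt instructs the model to produce."""
    marker = "**Summary:**"
    head, sep, tail = model_output.partition(marker)
    if not sep:
        # Marker missing: fall back to treating the whole reply as summary.
        return "", model_output.strip()
    return head.strip(), tail.strip()
```

Keeping the intermediate extraction around (for logging or spot-checks) is what makes the scratchpad useful for auditing the model's reasoning, while end users see only the summary.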

The optimized prompt explicitly asks the model to act as 'a highly experienced medical transcriber and summarizer', establishing a clear persona.
The optimized prompt includes a detailed 'chain of thought' with 7 distinct steps for processing the information.
The optimized prompt specifies the target audience as 'a general audience', guiding the language and complexity of the summary.
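The three components above (persona, chain-of-thought steps, target audience) can be made explicit parameters so the same structure transfers to other summarization tasks. A sketch under that assumption (`PromptSpec` is a hypothetical type, not from any framework):

```python
from dataclasses import dataclass

@dataclass
class PromptSpec:
    persona: str    # e.g. "a highly experienced medical transcriber and summarizer"
    audience: str   # e.g. "a general audience"
    steps: list     # chain-of-thought steps, in order

    def render(self, report: str) -> str:
        """Compose persona, audience, and numbered CoT steps into a prompt."""
        numbered = "\n".join(f"{i}. {s}" for i, s in enumerate(self.steps, 1))
        return (
            f"You are {self.persona}. Write for {self.audience}.\n"
            f"Follow this chain of thought:\n{numbered}\n\n"
            f"**Medical Report:**\n{report}\n\n**Summary:**"
        )
```

Swapping the persona and steps then yields, say, a legal-document summarizer without rewriting the scaffold.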
