Prompt Engineering Guide

Mastering Code Refactoring on Gemini 2.0 Flash

Stop guessing. See how professional prompt engineering transforms Gemini 2.0 Flash's output for specific technical tasks.

The "Vibe" Prompt

"Refactor this code to be better."
Low specificity, inconsistent output

Optimized Version

You are an expert software engineer specializing in code refactoring. Your task is to refactor the provided Python code snippet. Follow these steps:

1. Identify areas for improvement, including but not limited to: readability, maintainability, performance, and adherence to Pythonic conventions.
2. For each identified area, explain *why* it needs refactoring and propose a specific change.
3. Present the refactored code.
4. Provide a summary of the key improvements made.

Here is the code:
```python
# [Insert Code Here]
```
Structured, task-focused, reduced hallucinations
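In practice, the optimized prompt above is a template you fill programmatically before each request. Below is a minimal sketch, assuming a plain-string template; the function name `build_refactor_prompt` is illustrative, not part of any API.

```python
# Build the optimized refactoring prompt from a template.
# The fence string is assembled at runtime only so this example
# can itself live inside a fenced code block.
FENCE = "`" * 3

OPTIMIZED_TEMPLATE = (
    "You are an expert software engineer specializing in code refactoring. "
    "Your task is to refactor the provided Python code snippet. "
    "Follow these steps: "
    "1. Identify areas for improvement, including but not limited to: "
    "readability, maintainability, performance, and adherence to Pythonic "
    "conventions. "
    "2. For each identified area, explain *why* it needs refactoring and "
    "propose a specific change. "
    "3. Present the refactored code. "
    "4. Provide a summary of the key improvements made. "
    "Here is the code:\n" + FENCE + "python\n{code}\n" + FENCE
)


def build_refactor_prompt(code: str) -> str:
    """Insert the user's code snippet into the optimized template."""
    return OPTIMIZED_TEMPLATE.format(code=code.strip())


prompt = build_refactor_prompt("def add(a,b):\n    return a+b")
print(prompt)
```

The assembled string can then be sent to Gemini 2.0 Flash through whatever client you use; keeping the template in one place ensures every request carries the same persona, steps, and output contract.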

Engineering Rationale

The optimized prompt leverages Chain-of-Thought (CoT) prompting by breaking down the complex 'refactoring' task into sequential, manageable steps. It explicitly defines the persona ('expert software engineer'), sets clear objectives (readability, maintainability, performance, Pythonic conventions), and demands explanations for changes ('why it needs refactoring'). This structured approach guides the model to perform a more thorough and well-reasoned refactoring, reducing the chances of superficial changes. By requiring a summary, it also forces synthesis of the improvements. The naive prompt is ambiguous and provides no guidance, expecting the model to infer the best approach, which can lead to inconsistent or less comprehensive results.
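To make the objectives concrete, here is a hypothetical before/after pair of the kind the optimized prompt is designed to elicit: an index-based loop replaced with a comprehension, improving readability and adherence to Pythonic conventions without changing behavior.

```python
def squares_before(nums):
    # Index-based iteration is un-Pythonic and harder to read.
    result = []
    for i in range(len(nums)):
        result.append(nums[i] * nums[i])
    return result


def squares_after(nums):
    # Direct iteration plus a comprehension: same behavior, clearer intent.
    return [n * n for n in nums]


print(squares_before([1, 2, 3]))  # → [1, 4, 9]
print(squares_after([1, 2, 3]))   # → [1, 4, 9]
```

The optimized prompt's step 2 would have the model name this pattern explicitly ("index-based loop replaced with a list comprehension for readability") rather than silently swapping code, which is what makes its output auditable.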

- The optimized prompt explicitly defines the persona.
- The optimized prompt requests identification of specific improvement areas (readability, maintainability, performance, Pythonic conventions).
- The optimized prompt requires an explanation for *why* each refactoring is needed.

Ready to stop burning tokens?

Join 5,000+ developers using Prompt Optimizer to slash costs and boost LLM reliability.

Optimize My Prompts