Peer Review & Iteration
Give and receive feedback on AI strategies like a professional.
The best AI strategies are pressure-tested by smart people asking hard questions. This lesson teaches you to give and receive constructive feedback on AI plans.
The Peer Review Framework
What to Evaluate
When reviewing someone else's AI strategy (or asking others to review yours), assess these dimensions:
#### 1. Problem Clarity (Is the problem well-defined?)
- Is the problem specific and measurable?
- Is it clear why this problem matters?
- Are the current costs/impacts quantified?
- Could a skeptic agree this is a real problem?
#### 2. Solution Fit (Does AI actually solve this problem?)
- Is AI the right tool? (Could a spreadsheet or simple software solve this?)
- Are the chosen tools appropriate for the scale and complexity?
- Is the technical approach sound?
- Are there simpler alternatives that weren't considered?
#### 3. Feasibility (Can this actually be built?)
- Is the timeline realistic?
- Does the team have the skills (or a plan to acquire them)?
- Are the cost estimates realistic? (Not too optimistic, not padded)
- Are dependencies identified and manageable?
#### 4. Risk Awareness (What could go wrong?)
- Are the major risks identified?
- Are mitigation plans specific and actionable?
- Are there clear kill criteria?
- Is there a rollback plan if things go wrong?
#### 5. ROI Honesty (Do the numbers add up?)
- Are the savings calculations realistic?
- Are costs comprehensive (including hidden costs)?
- Has sensitivity analysis been done?
- Would you invest your own money based on this analysis?
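A quick way to pressure-test the last two questions is a small sensitivity sweep: vary the assumptions a reviewer is most likely to challenge (hourly rate, adoption) and see how fast the ROI flips negative. The numbers below are hypothetical placeholders, not figures from any real proposal:

```python
def annual_savings(hours_saved_per_week, hourly_rate, adoption):
    """Gross annual savings for one workflow at a given adoption level."""
    return hours_saved_per_week * hourly_rate * adoption * 52

# Assumed one-time build cost for illustration only.
build_cost = 20_000

# Sweep the two assumptions reviewers challenge most often:
# the fully loaded hourly rate and the realistic adoption fraction.
for rate in (35, 55, 75):              # $/hour, fully loaded
    for adoption in (0.5, 0.8, 1.0):   # fraction of the team actually using it
        savings = annual_savings(10, rate, adoption)
        roi = (savings - build_cost) / build_cost
        print(f"rate=${rate}/h, adoption={adoption:.0%}: first-year ROI {roi:+.0%}")
```

If the ROI only stays positive in the most optimistic corner of the grid (top rate, 100% adoption), that is exactly the kind of fragility a good peer reviewer will flag.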
How to Give Good Feedback
The SBI Method (Situation, Behavior, Impact)
Instead of: "Your ROI analysis is bad"
Say: "In the ROI section [situation], you calculated time savings at $75/hour [behavior], but the role you're replacing is typically $35-40/hour fully loaded [impact: the ROI may be overstated by 2x]."
The Three-Column Method
| What's Strong | What Could Be Better | Questions to Address |
|---|---|---|
| Problem definition is specific and compelling | Timeline feels aggressive — Phase 1 in 2 weeks may not allow for proper testing | What happens to the workflow if the AI model is updated and responses change? |
| Tool evaluation scorecard is thorough | Cost estimate doesn't include maintenance/monitoring time | Have you validated the 80% accuracy assumption with a real test? |
| Risk section is honest and well-thought-out | System prompt could be more specific about edge cases | What's the fallback if your automation platform has an outage? |
Rules for Constructive Feedback
1. Be specific — "The ROI section needs work" is useless. "The time savings estimate assumes 100% adoption in week 1, which is unrealistic" is actionable.
2. Lead with strengths — People are more receptive to criticism when they feel their work is valued.
3. Suggest, don't dictate — "You might consider..." is better than "You should..."
4. Ask questions — "Have you thought about X?" invites discussion. "You forgot X" shuts it down.
5. Focus on the plan, not the person — "This section could be stronger" vs. "You did this wrong."
How to Receive Feedback
The 3-Step Response
1. Listen completely — Don't interrupt. Don't defend. Just absorb.
2. Ask clarifying questions — "Can you tell me more about why you think the timeline is aggressive?"
3. Categorize the feedback:
   - Accept: The feedback is clearly right. Make the change.
   - Consider: The feedback has merit but you partially disagree. Think on it for 24 hours.
   - Acknowledge: The feedback reflects a difference of opinion. Thank them and explain your reasoning.
What NOT to Do
- Don't explain every decision immediately — that's defending, not listening
- Don't dismiss feedback because the reviewer "doesn't understand"
- Don't take it personally — they're reviewing the plan, not judging you
- Don't change everything based on one person's opinion — look for patterns across reviewers
Self-Review Checklist
Before asking for peer review, evaluate your own work:
- [ ] Could someone unfamiliar with this business understand my plan?
- [ ] Are all numbers backed by data or clearly marked as estimates?
- [ ] Have I considered at least 3 risks and mitigations?
- [ ] Is my timeline realistic if everything takes 50% longer than expected?
- [ ] Am I recommending AI because it's the best solution, or because this is an AI course?
- [ ] Would I bet my own money on this ROI estimate?
- [ ] Have I included what I'd do if this doesn't work?
- [ ] Is my presentation clear enough for a non-technical stakeholder?
The Final Iteration
After receiving feedback, make one final pass through your capstone:
1. Address all "Accept" feedback
2. Reconsider all "Consider" feedback
3. Strengthen your weakest section
4. Proofread for consistency (do your numbers add up across sections?)
5. Add a "Limitations and Assumptions" section for full transparency
Exercises
1. Use the self-review checklist on your capstone project. For each item, honestly assess whether your project meets the criteria. Identify your 3 weakest areas and improve them. Share the before and after for at least one section.

   Hint: The hardest items on the checklist are: "Am I recommending AI because it's the best solution?" and "Would I bet my own money on this ROI?" Be brutally honest.

2. Ask an AI to play the role of a skeptical CFO reviewing your capstone project. Give it your executive summary and ROI analysis, then ask it to poke holes. Address each critique with a revision or a reasoned defense.

   Hint: Prompt: "You are a skeptical CFO who has seen many failed technology initiatives. Review this AI proposal and challenge every assumption, number, and timeline. Be tough but fair." Then iterate based on the challenges.

3. Reflect on your entire BlueWave learning journey. What was the most valuable skill you developed? What surprised you about AI? What would you do differently if you started over? How will you apply what you've learned in the next 30 days?

   Hint: Be specific. "I'll use AI more" is vague. "I'll build the email classification workflow I designed in Wave 7 and pilot it with my support team by the end of the month" is actionable.

4. When receiving critical feedback on your AI strategy, what is the best first response?