Prompt Patterns I: Laying the Foundation

Prompt engineering isn’t just technical—it’s a blend of creativity and strategy, much like guiding a team through an agile sprint. In this first module, I explored foundational patterns that show how a few well-crafted words can lead AI to deliver brilliant results. Let’s dive into these concepts through relatable stories and examples.

1. Few-Shot Examples: Guiding with Just Enough

Imagine you’re training a new team member. Instead of overwhelming them with every single process, you share just a few crystal-clear examples of how tasks are done. This is the essence of few-shot prompting: giving AI just enough context to get it right.

  • Example: Let’s say I’m writing user stories. I prompt the AI with:
    “Here are examples of well-written user stories:

    1. As a teacher, I want to grade assignments digitally so that I save time.
    2. As a student, I want to see my grades in real-time so that I stay informed.

    Write a user story for an admin dashboard feature.”
  • Result: The AI responds:
    “As an administrator, I want to view system usage metrics in a dashboard so that I can monitor performance.”

Aha! With just a few examples, it understands the format and delivers a great result.
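To make the mechanics concrete, here is a minimal sketch (my own helper, not code from the module) of how a few-shot prompt like the one above can be assembled programmatically before being sent to a model:

```python
# Build a few-shot prompt: numbered examples followed by the actual task.
def build_few_shot_prompt(examples, task):
    """Join example user stories and a final instruction into one prompt."""
    lines = ["Here are examples of well-written user stories:", ""]
    for i, example in enumerate(examples, start=1):
        lines.append(f"{i}. {example}")
    lines.append("")
    lines.append(task)
    return "\n".join(lines)

examples = [
    "As a teacher, I want to grade assignments digitally so that I save time.",
    "As a student, I want to see my grades in real-time so that I stay informed.",
]
prompt = build_few_shot_prompt(
    examples, "Write a user story for an admin dashboard feature."
)
print(prompt)
```

Keeping the examples and the task in one string like this makes it easy to swap in new examples without rewriting the prompt by hand.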

2. Few-Shot Examples for Actions: Showing the Way

Have you ever coached a team through a retrospective and realized they weren’t sure how to analyze action items? Instead of explaining abstractly, you show a few examples, breaking it down step by step. Few-shot prompting for actions works the same way.

  • Story: During a sprint, I wanted the AI to prioritize tasks. I prompted it:
    “Here’s how tasks are prioritized:

    • Task: Fix critical bug (Urgency: 5, Importance: 5). Priority: High.
    • Task: Update documentation (Urgency: 2, Importance: 3). Priority: Low.

    Now prioritize this task: Create user onboarding flow (Urgency: 4, Importance: 4).”
  • Result: The AI concludes:
    “Priority: Medium-High. Onboarding flow is important but not as urgent as fixing bugs.”

A clear and actionable result—perfect for agile prioritization.
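The scoring logic the examples imply can be sketched in a few lines. This is my own illustration of the pattern, with threshold values I chose to match the article's three sample tasks, not anything the module prescribes:

```python
# Map 1-5 urgency/importance scores to the priority labels used above.
def priority_label(urgency, importance):
    """Combine urgency and importance into a coarse priority label."""
    score = urgency + importance
    if score >= 9:
        return "High"
    if score >= 7:
        return "Medium-High"
    if score >= 6:
        return "Medium"
    return "Low"

print(priority_label(5, 5))  # critical bug
print(priority_label(4, 4))  # onboarding flow
print(priority_label(2, 3))  # documentation update
```

In practice the model infers a rule like this from the examples alone; writing it out explicitly is a good way to check that your few-shot examples are internally consistent.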

3. Few-Shot Examples with Intermediate Steps: Building the Bridge

When faced with a complex feature request, breaking it into smaller pieces is key. The AI can mimic this agile principle when you guide it with intermediate steps.

  • Example:
    “Feature: Build a user profile system.
    Step 1: Design the database schema for storing user profiles.
    Step 2: Create API endpoints for CRUD operations.
    Step 3: Develop the front-end interface for users to edit their profiles.”
  • Aha! Moment: The AI doesn’t just produce the final output—it builds the bridge by generating actionable steps, just like we do when refining epics into stories and tasks.
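The step decomposition above can be laid out as a single prompt block. Here is a minimal sketch (a hypothetical helper of my own, not the author's code) of that layout:

```python
# Lay out a feature and its intermediate steps as one prompt block.
def steps_prompt(feature, steps):
    """Format a feature plus numbered intermediate steps."""
    lines = [f"Feature: {feature}"]
    lines += [f"Step {i}: {step}" for i, step in enumerate(steps, start=1)]
    return "\n".join(lines)

example = steps_prompt(
    "Build a user profile system.",
    [
        "Design the database schema for storing user profiles.",
        "Create API endpoints for CRUD operations.",
        "Develop the front-end interface for users to edit their profiles.",
    ],
)
print(example)
```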

4. Writing Effective Few-Shot Examples: Clarity is King

Great examples don’t just describe—they inspire. When crafting few-shot prompts, variety and precision matter. I learned this the hard way when I gave the AI vague examples during my first attempt. It churned out equally vague results. Lesson learned!

  • Revised Example:
    “Write an engaging email subject line. Examples:

    1. ‘Unlock Your Team’s Potential with These Agile Tools!’
    2. ‘Your Next Sprint Made Simple: Learn More.’

    Write a subject line for a webinar about prompt engineering.”
  • Result:
    “Transform Your Prompts: Master AI with Ease!”
    Precision inspires precision.

5. Chain-of-Thought Prompting: Thinking Out Loud

Chain-of-thought prompting mirrors how we analyze complex problems. Imagine walking a team through a root-cause analysis during a retrospective.

  • Example Prompt:
    “Why didn’t the new feature perform well? Let’s break it down:

    1. Were users aware of the feature? No.
    2. Was the onboarding process clear? Not entirely.
    3. Were bugs reported? Yes, three major ones.”
  • Result: The AI generates actionable insights:
    “Focus on improving communication and resolving critical bugs before the next launch.”

It’s as if the AI is “thinking” with you.
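A chain-of-thought prompt like the one above is just the question plus an explicit reasoning scaffold. Here is a minimal sketch (my own helper, with a closing instruction I added as an assumption) of how it could be built:

```python
# Build a chain-of-thought prompt: question, numbered observations,
# then an explicit request to reason from them.
def chain_of_thought_prompt(question, observations):
    """Combine a question and its step-by-step observations into one prompt."""
    lines = [f"{question} Let's break it down:", ""]
    lines += [f"{i}. {obs}" for i, obs in enumerate(observations, start=1)]
    lines.append("")
    lines.append("Based on the points above, what should we focus on next?")
    return "\n".join(lines)

cot_prompt = chain_of_thought_prompt(
    "Why didn't the new feature perform well?",
    [
        "Were users aware of the feature? No.",
        "Was the onboarding process clear? Not entirely.",
        "Were bugs reported? Yes, three major ones.",
    ],
)
print(cot_prompt)
```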

6. ReAct Prompting: Reason and Act

ReAct combines reasoning and action. Picture yourself in a planning meeting where the team has to make quick decisions about next steps.

  • Example Prompt:
    “You are a product owner deciding on the next feature. Reason through the options:

    • Feature A: High user demand, low development cost.
    • Feature B: Low user demand, high development cost.

    Act by selecting the best option.”
  • Result:
    “Feature A is the optimal choice. High demand and low cost maximize value.”
    The AI not only reasons but takes action, making it an agile collaborator.
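The reason-then-act loop can be sketched in plain code. This is a toy illustration of my own: a real ReAct setup would hand the reasoning trace to an LLM, whereas here a simple demand-minus-cost score stands in for the model:

```python
# Toy ReAct pattern: produce a reasoning trace, then act on its conclusion.
def react_decide(options):
    """options: dict mapping feature name -> (demand, cost), both on 1-5."""
    # Reason: build an explicit trace of how each option is evaluated.
    trace = []
    for name, (demand, cost) in options.items():
        value = demand - cost  # higher demand, lower cost -> more value
        trace.append(f"{name}: demand={demand}, cost={cost}, value={value}")
    # Act: select the option with the highest value.
    best = max(options, key=lambda n: options[n][0] - options[n][1])
    return best, trace

choice, trace = react_decide({"Feature A": (5, 2), "Feature B": (2, 5)})
print(choice)
for line in trace:
    print(line)
```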

7. Using Large Language Models to Grade Each Other: Peer Reviews for AI

Just as retrospectives create valuable feedback loops for agile teams, using AI to critique its own output builds a cycle of continuous improvement.

  • Example Prompt:
    “Grade this user story: ‘As a customer, I want to track my orders so that I stay updated.’”
  • Result:
    “Strengths: Concise and user-focused. Weaknesses: Lacks technical detail. Suggested improvement: Add a reference to tracking methods.”

Aha! Moment: The AI learns to critique and improve, aligning perfectly with agile’s emphasis on iteration and feedback.
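Wrapping the grading instruction in a reusable template makes this feedback loop repeatable. Here is a minimal sketch (my own wording for the rubric, offered as an assumption rather than the module's exact prompt):

```python
# Build a grading prompt that asks a model to critique another model's output.
def grading_prompt(user_story):
    """Wrap a user story in a fixed critique rubric."""
    return (
        f"Grade this user story: '{user_story}'\n"
        "List its strengths, its weaknesses, and one suggested improvement."
    )

review_prompt = grading_prompt(
    "As a customer, I want to track my orders so that I stay updated."
)
print(review_prompt)
```

The same template can then be applied to every story a model generates, turning one-off critiques into a standing review step.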