Improving Classroom AI Prompts

Explore three key tips for improving your generative AI prompts.

Grades K–12 | 9-minute resource

To improve the quality of outputs in generative AI chatbots like OpenAI’s ChatGPT, Google Gemini, Microsoft Copilot, or Anthropic’s Claude, follow these three simple tips.

Have the Proper Mindset

When interacting with an AI chatbot, it’s helpful to engage with it as if it were an intelligent, trusted colleague.

Chatbots are designed to be good communicators. They're trained on language, so we can interact with them conversationally. They've also been trained on vast amounts of information, so we can ask questions and expect a useful answer nearly every time.

Generative AI is also really good at making connections, so we can give these chatbots information or lists of ideas and ask them to make sense of it all. Perhaps we want to see trends across a series of data sets or find common strategies in a collection of educational articles. These AI chatbots can efficiently connect the dots.

We do need to remember that, just like our colleagues, chatbots can also be wrong, and they can pitch us ideas that we just don’t like. And that’s okay—it’s just like any brainstorming experience, and not every idea will be a keeper.

While the AI chatbots are great collaborative partners, rich sources of information, and wells of inspiration and ideas, in the end, it’s up to you to process what you’re given and make the final judgments. That oftentimes means taking what you like while making the necessary revisions and additions to make the outputs usable.

Write Effective Prompts

This is another place where it's helpful to think of our interactions with AI as interactions with a colleague. If we're not clear in what we ask of another person, we often won't get a great response in return.

For example, if you ask a friend to bake you a cake, you’ll probably get a cake, but you’re leaving a lot to chance. When do you need it? What’s the occasion? How big should it be? What flavor would you like? What’s the budget? Does it have to be refrigerated? Without these details, there’s a good chance that the cake won’t meet your expectations or needs.

The same goes for prompting. If you simply ask a chatbot to create a lesson about ecosystems, it won’t know what the grade level is, the specific standards you’re targeting, the length of time you have to teach it, or the technology you have available. As with the cake request, there are too many unanswered questions.

A much better prompt might be: “Design a 3-day, inquiry-based science lesson on ecosystems for fifth graders that integrates free digital resource tools and ends with a student-created multimedia project. Include learning objectives, vocabulary, and a formative check each day. Generate this as a downloadable Word document.” This prompt, much richer in detail, will almost assuredly leave you more satisfied with the lesson than the first prompt would.

There are lots of prompting models available to help guide you. Here are five components that appear in nearly all of them.

1. Assign a role.

Tell the chatbot what role it should assume. Most commonly, this means saying something like, “Act as an expert K–12 instructional coach,” or “You are an experienced ninth-grade life science teacher.”

Assigning a role doesn’t magically make the AI into that person, but it does help the system narrow its focus to the most relevant parts of its knowledge. Instead of drawing on the entirety of its knowledge base, it will favor content that matches your specified role and needs, which generally yields higher-quality results.

2. Describe a clear task.

This one seldom gets forgotten, but you do need to tell the chatbot what you want it to do. Do you want it to design a lesson, create a rubric, or generate an exit ticket? Whatever task you request, it’s helpful to use action words, like design, explain, generate, analyze, and produce.

3. Identify the audience.

This is an important component, especially if you’re using AI to help design lessons. While you should never enter specific student names or other identifiable student information into the chatbot, you should describe, in general terms, who the lesson is for. Include details like grade, class name, overall student achievement level, and perhaps general student interest areas, along with any other details that are key to the lesson’s success.

If the output is not for students, describe that intended audience instead.

4. Specify the final product.

Tell the chatbot what you want the output to look like. What kind of product is it? What’s the format? Is it a bulleted list of ideas? Is it a rubric in table format? Do you want a process described in a step-by-step list? Maybe you want a script, an email, or a story.

The clearer you are about what you want produced, the better the chance that the output will meet your specifications.

5. Add relevant limiters and context.

This is where you use your judgment and ask yourself, “What else should my virtual collaborator know about the task that I’m asking it to complete?”

This is the part of the equation that can really advance you to the next level and give you much better results.

Ask yourself what additional information might improve the final product that you want to receive. This might include asking the chatbot to produce content at a certain reading level or perhaps in a certain style. It might mean identifying the purpose. Is the intent to entertain, or should the tone be more serious? Do you want readers to be able to skim it, or should it be something that requires more in-depth study? Are there length limits or specific examples that you want included?

Essentially, what are the relevant details that you’d want your collaborative colleague to know if they were helping you complete this task at a high level?

Follow Up and Refine

Even with your best efforts, it’s hard to get perfect outputs from your first prompt attempt. AI will often produce outputs that you hadn’t expected, and you may discover that you forgot to mention some key details.

When reviewing the results, it’s really important to follow up and refine. This means asking follow-up questions or giving additional directions. In a sense, you are continuing the conversation after hearing the first response, just like you would with a human colleague.

If the output was too wordy or complicated, you can ask for the response to be simplified. The classic example is: “Explain it to me like I’m a fifth grader.”

You might ask for the format to be refined or expanded. If you like the fourth idea on a brainstormed list best, you could say, “Number four is the best idea from this list. Generate more ideas like that.”

The AI will sometimes ask you if you want certain refinements. That’s helpful and might steer you in directions that you hadn’t considered. The chatbot might ask you if you want something formatted in a different way or want something to be expanded or clarified. Again, it’s like a colleague asking helpful questions.

In general, keep the back-and-forth conversation going until you’re satisfied. It’s very likely that you will spend more time refining than prompting, and that’s okay.

Bonus Tip

While these three tips will help get you better results, prompting still takes practice. The best way to get better is to do it often, review your results, and learn from the experience.

To improve more quickly, consider selecting a specific teaching task that you want help with. Maybe it’s generating journal topics for a writing class, developing engaging ways to build community in your classroom, or finding new exit ticket ideas. Pick something meaningful to you, and then spend a week or two using generative AI as a thought partner to brainstorm relevant ideas and solutions.

It’s important to do this on a regular, ongoing basis. The repeated attempts will allow you to see trends, apply your learning, and iterate.

AVID Connections

This resource connects with the following components of the AVID College and Career Readiness Framework:

  • Rigorous Academic Preparedness
  • Opportunity Knowledge
  • Student Agency
  • Insist on Rigor
  • Break Down Barriers

Extend Your Learning