
Artificial Intelligence (AI) in the Academic Health Sciences: Prompt Writing

What is Prompt Writing?

Prompt writing (also known as "prompt engineering") is the process of creating instructions or queries to elicit a desired response from a generative AI model. A prompt is a bit like a roadmap for an AI model; the better you design your map (i.e., the prompt), the more likely the model is to arrive at your desired destination (i.e., the output).

This page gives you an overview of the components of an effective prompt, and some additional techniques you can use to elicit desired responses from AI models.

Components of an Effective Prompt

An effective prompt typically includes some (or all) of the following components (a short example combining them appears after this list):


  • Instruction:
    • A command for what you want the model to do.
    • Example: "List the works of Victor Hugo."
  • Supporting Content:
    • Additional contextual information that can be used to further inform the model's output. When used, this is typically included alongside the instruction.
    • Example: "Write an email wishing your coworkers a happy new year. Incorporate themes of wellness, happiness, and fulfillment." In this case, the themes would be the supporting content.
  • Primary Content:
    • The text you would like the model to process. Primary content can be in a variety of formats, such as sentences, lists, and tables.
    • Example: "Summarize the following paragraph: [insert paragraph]" -In this case, the inserted paragraph would be the primary content.
  • Example(s):
    • Example(s) of how you would like the model to behave.
      • Zero-shot learning refers to when no example is given in the prompt.
      • One-shot learning refers to when one example is given in the prompt.
      • Few-shot learning refers to when more than one example is given in the prompt.
    • Example: "Title: Oliver Twist, Author: Charles Dickens. Title: War and Peace, Author: Leo Tolstoy. Title: Moby Dick, Author: " -In this case, Oliver Twist and War and Peace would be the examples.
  • Cue:
    • A "hint" or "jump start" for how you would like the model to structure its output.
    • Example: "Summarize the following paragraph [insert paragraph]. Key points: (1)" -In this case "Key points: (1)" would be the cue, as it tells the model to structure the response as Key Points: (1) [insert first point] (2) [insert second point] etc.

Note: This information was based on Microsoft's Azure OpenAI Service Documentation. Visit their page for additional tips on prompt writing.
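
For readers who interact with a model through an API rather than a chat interface, the sketch below shows one way these components might be combined into a single prompt. It is a minimal illustration in Python that assumes the OpenAI Python SDK (openai 1.x) and an API key in the environment; the model name, the bracketed placeholders, and the one-shot example are hypothetical.

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # Instruction: what we want the model to do
    instruction = "Summarize the following abstract for a general audience."

    # Supporting content: extra context that shapes the output
    supporting = "Keep the summary under 100 words and avoid technical jargon."

    # Primary content: the text the model should process (placeholder)
    primary = "[insert abstract here]"

    # Example (one-shot): a sample of the desired behavior (placeholder)
    example = "Abstract: [sample abstract]\nSummary: [sample plain-language summary]\n"

    # Cue: a "jump start" for how the response should begin
    cue = "Summary:"

    prompt = f"{instruction}\n{supporting}\n\n{example}\nAbstract: {primary}\n{cue}"

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)

If you are working in a chat interface rather than with an API, you can paste the assembled prompt text directly as your message.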

Prompt Writing Techniques

There are additional techniques you can use to produce an effective prompt. A few of these techniques are listed below, followed by two short sketches that illustrate them.


  • Include your instruction at the beginning and end of the prompt:
    • Models typically prioritize text at the beginning and end of the prompt, so repeating the instruction in both places can help generate a more effective response.
  • Be clear:
    • Be concise and explicit about what you want from the model. This includes stipulating the length of the response and refraining from ambiguous language.
    • Example: Instead of asking, "Why should I read Don Quixote?" tell the model, "List 5 key themes in Don Quixote, and how each theme is relevant to modern society."
  • Include a system message:
    • This gives additional context to the model, and tells it how to frame its response.
    • Example: "You are a bookstore owner that gives customers advice on what books to read."
  • Use separators for different components of your prompt:
    • This makes your prompt easier for the model to parse. You can label each part of the prompt, use punctuation to separate the parts, or combine the two.
  • Break down the prompt into manageable chunks:
    • Models can sometimes have trouble with long, complex prompts. To help with this, you can break down your prompt into "chunks."
    • Example: "Extract factual claims from this paragraph. [insert paragraph]. Next, fact check the queries using a search engine."
  • Use chain of thought prompting:
    • With this technique you instruct the model to outline each step of the process it used to create its response.
    • Example: "Take a step-by-step approach in your response."
  • Specify your output structure:
    • This is telling the model the specific format in which you would like the response to be presented.
    • Example: "List the books you would recommend as a bulleted list using the format: author, title, year."
  • Provide grounding context:
    • With this technique you are giving the model data from which to draw its response. This increases the likelihood that the model will answer accurately, reducing the risk of fabrication.
    • Example: "Extract what Jean Valjean stole from the following paragraph: [insert paragraph]."
  • Give the model an "out":
    • With this technique you are telling the model to output something if it cannot complete the assigned task. This helps to reduce fabrication.
    • Example: "Respond with 'Unknown' if you can't find the answer."
  • Revise your prompt:
    • If the model isn't giving you the desired response, try rephrasing your prompt. Creating a prompt that works is an iterative process!
    • Example: If you enter the prompt "State whether X journal has characteristics of a predatory journal" and the model responds that it doesn't have real-time access to the internet, you can revise it to "Based on the data you have, state whether X journal has characteristics of a predatory journal."

Note: These techniques were based on those from Microsoft's Azure OpenAI Service Documentation. Visit their page for additional tips on prompt writing.
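
The first sketch below combines several of these techniques in one API call: a system message, labeled separators between the parts of the prompt, a specified output structure, a chain-of-thought cue, grounding context, and an "out" the model can use when it cannot answer. As with the earlier sketch, it is a minimal Python illustration that assumes the OpenAI Python SDK; the model name and bracketed text are placeholders.

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # System message: frames how the model should respond
    system_message = (
        "You are a health sciences librarian who answers questions "
        "using only the reference text provided."
    )

    # Separators (labels and ---) mark off each component of the prompt
    user_prompt = (
        "INSTRUCTION:\n"
        "List the key findings of the study described below as a bulleted "
        "list using the format: finding, supporting evidence.\n"
        "---\n"
        "REFERENCE TEXT:\n"
        "[insert study text here]\n"
        "---\n"
        "Take a step-by-step approach, and respond with 'Unknown' if the "
        "answer is not in the reference text."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_prompt},
        ],
    )
    print(response.choices[0].message.content)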
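
Breaking a prompt into chunks can also mean splitting the work across multiple calls and feeding the output of one step into the next. The sketch below is one way to do that, again assuming the OpenAI Python SDK; it does not actually query a search engine, it simply shows the two-step flow with a second, smaller task.

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    MODEL = "gpt-4o-mini"  # placeholder model name

    def ask(prompt: str) -> str:
        """Send a single prompt and return the model's text response."""
        response = client.chat.completions.create(
            model=MODEL,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    paragraph = "[insert paragraph here]"

    # Step 1: extract the factual claims from the paragraph
    claims = ask(f"Extract the factual claims from this paragraph:\n{paragraph}")

    # Step 2: hand the extracted claims to a second, smaller task
    check = ask(
        "For each claim below, state what evidence would be needed to verify it:\n"
        + claims
    )
    print(check)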

Additional Resources for Creating Prompts