Critical Thinking with AI: Two Approaches 

Reading Time: 6 minutes

Critical thinking, a crucial skill for our students’ career success, involves analyzing information, evaluating its reliability, and making judgments based on it. However, mastering critical thinking is not easy work. It requires a deep understanding of the qualities of what we are evaluating and the criteria appropriate to the evaluation of those qualities (Paul, 1993). Without being well-informed, we risk falling into the trap of bias and faulty thinking, which can affect our beliefs and actions (Nelson, 2024).

As an expert in both content and critical thinking in your field or discipline, you may be able to identify the shortcomings of AI outputs with ease, but your students (as novices) may struggle to notice patterns, gaps, assumptions, and errors in thinking (Nelson, 2024). When we teach critical thinking in the AI age, students may fall back on using AI as an “answer” tool, shortcutting the more time-, effort-, and experience-intensive process of reaching conclusions through a combination of reasoning and evidence.

How can we use AI to foster rather than replace critical thinking? The answer may lie in providing worked examples demonstrating the steps taken in the critical thinking process, using AI not as an answer tool but as a process tool. Making the critical thinking process visible using AI can also show students its capabilities and weaknesses as a tool that predicts but cannot “think” critically (MIT Horizon, 2024).

This post provides two analogies and examples to promote critical thinking with AI in your classroom:  

  1. The Navigator’s Map: This approach involves analyzing an AI output by observing its details, then comparing it with a reliable outside source. In this analogy, AI outputs are the map, the AI user is the journeyer, and the verified source is the actual terrain. 
  2. The Sculptor’s Stone: This approach involves establishing quality criteria, then iteratively improving an AI output by refining it to meet certain criteria, specifications, requirements, or standards. In this analogy, the AI user is the sculptor, the AI output is the stone, and AI prompts are the tools to “chisel” the stone. 

The Navigator’s Map: Observe, Question, Compare (OQC) 

This in-class approach involves analyzing an AI output by observing its details and comparing it with an authoritative source beyond what AI provides. For instance, students could use AI to generate a summary of a historical or current event and then compare this with information from a trusted textbook or website. This helps students identify meaningful information and ask questions, given that AI cannot reliably “check” its own work.

Here are the three steps to show in a worked example in class:

  1. Observe – Identify the features of the AI output, even if it feels self-evident. For example, how many points are there? What are the points? How long are the points? 
  2. Question – Brainstorm questions to ask and possible answers based on initial impressions; no question is a bad one! For example: Is this true and accurate? Is it specific and relevant? Is it fair? Is it usable? What might be missing? 
  3. Compare/Check – Find an authoritative source to review or compare with the AI output. This could be a trusted family member; laws, legislation, or policy; market or research data; standards and best practices; a webpage or professional association; or a textbook or journal article. You may also create your own resource. (An optional scripted sketch of these three steps follows this list.)
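
For instructors who like to make the process visible with a short scripted demo, here is a minimal, optional sketch in Python. It assumes the AI output has been copied from a chatbot as plain text with one point per line; the sample lines are placeholders rather than real care guidance, and the script is not tied to any particular AI tool. Only the Observe tally is automated, while the Question and Compare steps stay with the humans.

```python
# Optional, illustrative sketch of the "Observe" step of OQC.
# Assumption: the AI output was copied from a chatbot as plain text,
# one point per line (the sample text below is a placeholder only).

ai_output = """\
- Monitor blood glucose levels regularly
- Follow a balanced, low-sugar meal plan
- Schedule regular follow-up appointments
"""

# Observe: tally the surface features students are asked to notice,
# such as how many points there are and how long each point is.
points = [line.lstrip("-• ").strip() for line in ai_output.splitlines() if line.strip()]

print(f"Number of points: {len(points)}")
for number, point in enumerate(points, start=1):
    print(f"Point {number} ({len(point.split())} words): {point}")

# Question and Compare/Check remain human steps: students brainstorm
# questions about accuracy, relevance, and fairness, then check each
# point against an authoritative source such as published guidelines.
```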

Example in Practice

It’s important to remind students that AI is a tool, and its outputs need to be interpreted and verified by human users. Note that you may adapt this process to suit your own course subject and learning activities. Here is an example using the topic of creating a healthcare plan.

Step: Observe
Instructions: Identify the features of the AI output.
AI Script/Prompt: Let’s use AI to “Generate a brief patient care plan for a 65-year-old patient with diabetes.” Let’s make a list of what we notice about what is in the plan.
Notes: Encourage students to note the number of points, their content, and their length in the AI-generated care plan.

Step: Question
Instructions: Brainstorm questions based on initial impressions.
AI Script/Prompt: Let’s ask ourselves: “What questions arise from the AI-generated care plan? What do we like? What do we not like?” Write them down.
Notes: Remind students that no question is a bad one. Questions could be about accuracy, relevance, fairness, usability, or any missing information.

Step: Check
Instructions: Find a source to review or compare with the output.
AI Script/Prompt: Now, let’s compare the AI-generated care plan with established care guidelines for diabetes. Note the differences and similarities. What do the guidelines tell us that is not incorporated into the AI-generated plan?
Notes: Students can use established care guidelines as the authoritative source. Discuss the importance of cross-verifying AI outputs with reliable sources.

PSW/Healthcare example of the Observe, Question, and Compare Approach

The Sculptor’s Stone: Review, Evaluate, Re-prompt (RER) 

This second approach focuses on defining what quality looks like for a particular task, then evaluating an AI output against those criteria. It also invites students to use the established criteria to improve what the chatbot originally created. For example, in a career development class, students could use AI to generate a cover letter, then refine the prompt based on specific criteria such as values alignment, research about the organization, and personal details to produce a more nuanced letter.

Here are the three steps to show in a worked example in class:

  1. Review Criteria – What makes an output high quality, desirable, standard, or “good”? Criteria could include specificity, context, organization, clarity, usability, format, authenticity, or emotion.
  2. Evaluate – What are the gaps between the criteria and the output? What could AI add, delete, or modify? What advice does AI itself give for improving the output? What might AI not be able to improve, given its inherent limitations? 
  3. Refine the Prompt – Give further instructions to AI using the identified criteria. Regenerate the output with a more detailed or specific prompt, or decide to stop prompting and make human changes. 

AI can only improve its own work based on the criteria it is asked to apply. This approach fosters not only critical thinking skills and knowledge of course content but also skills for using generative AI.
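
For instructors comfortable with a little scripting, the loop below is a minimal, optional sketch of the Review, Evaluate, Re-prompt cycle in Python. It is not tied to any particular AI product: the ask_ai, unmet_criteria, and rer_loop names are hypothetical placeholders, the chatbot reply is pasted in by hand, and the evaluation judgment deliberately stays with the human user, echoing the point that AI cannot apply the criteria for itself.

```python
# Optional, illustrative sketch of the Review, Evaluate, Re-prompt (RER) loop.
# Assumptions: ask_ai() is a hypothetical stand-in for whatever chatbot or API
# you use in class, and the evaluation judgment stays with the human reviewer.

def ask_ai(prompt: str) -> str:
    """Placeholder: display the prompt, then paste the chatbot's reply by hand."""
    print("\n--- Prompt to give the AI tool ---\n" + prompt)
    return input("Paste the AI output here:\n")

def unmet_criteria(criteria: list[str]) -> list[str]:
    """Evaluate: the human reviewer decides which criteria the output misses."""
    return [c for c in criteria
            if input(f"Does the output meet '{c}'? (y/n) ").strip().lower() != "y"]

def rer_loop(task: str, criteria: list[str], max_rounds: int = 3) -> str:
    """Review the criteria, evaluate the output, and re-prompt until satisfied."""
    output = ask_ai(task)                  # first draft from the AI tool
    for _ in range(max_rounds):
        gaps = unmet_criteria(criteria)    # Evaluate: find the gaps
        if not gaps:                       # stop when the criteria are met
            break
        # Re-prompt: fold the unmet criteria back into a more specific prompt
        output = ask_ai(
            f"{task}\nRevise the previous draft so that it also addresses: "
            + "; ".join(gaps)
        )
    return output

# Example classroom use (the marketing scenario from the table below):
# rer_loop("Draft a marketing campaign idea for a new fitness product.",
#          ["specificity", "clarity", "usability", "authenticity"])
```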

Example in Practice

It’s important to remind students that AI is only a predictive tool, and its outputs need to be interpreted and refined by human users based on established criteria. At some point in the regeneration process, AI users may not be able to obtain a better output; as a class, you can decide when to stop the re-prompting process.

Step: Review
Instructions: Identify the quality criteria for a marketing campaign.
AI Script/Prompt: Let’s list what makes a good marketing campaign, for example, for a new fitness product. What do we need to know to determine whether it’s a good campaign?
Notes: Discuss with students what makes a good marketing campaign. Criteria could include specificity, details related to context or organization, clarity, usability, format, authenticity, and human emotional responses. AI may also be able to generate a starter list of criteria.

Step: Evaluate
Instructions: Identify the gaps between the criteria and the output.
AI Script/Prompt: Let’s compare the AI-generated campaign with the criteria we established in step 1. What are the gaps? What could AI add, delete, or modify? What advice does AI give for improving it?
Notes: Encourage students to think about what AI could add, delete, or modify. What should be selected, and what should be discarded? What advice does AI give for how to improve? What might AI not be able to improve?

Step: Re-prompt
Instructions: Give further instructions to AI using the identified criteria.
AI Script/Prompt: Based on the criteria we established in step 1, let’s refine the marketing campaign idea to target millennials in Ontario who are interested in fitness and wellness to build a healthy lifestyle and save money over a year. Ask AI to regenerate the original draft, adding the criteria to improve the output. Now, let’s ask: “What’s changed? What’s still missing? Should we try again? When should we stop?”
Notes: Discuss with students the importance of specific prompts in generating quality AI outputs. Include a discussion of what has improved and what has not. Discuss the limits of AI as a “thinking” tool and the benefits of human thinking in improving outputs.

Marketing example of the Review, Evaluate, and Re-prompt Approach

Final Thoughts

Here are some ideas to consider if you decide to bring one of these activities to your in-class lessons (Walter, 2024): 

  • Choose topics of interest and relevance to students (so they will be curious and motivated to share what they know).
  • Model the steps of the critical thinking process with AI (but be ready to pivot, as the output may differ from what you generated at another sitting). 
  • Encourage students to share what works as they experiment with prompts, observing how nuances in language can change the outputs.
  • Remind students of their unique human capabilities (and the limitations of AI) for solving complex, real-life problems.

While AI presents new challenges in teaching critical thinking, it also offers unique opportunities to foster these skills. By using AI not just as an answer tool but as a process tool, we can help students shift from being passive recipients of information to actively engaging with it, questioning it, and making judgments about it. This process can be especially meaningful and relevant to students if you explain why critical thinking matters in your field as you show how it works.

References 

MIT Horizon. (2024). Critical thinking in the age of AI. 

Nelson, N. (2024). Inference ladder to develop students’ critical thinking. LinkedIn. 

Paul, R. (1993). Critical thinking: What every student needs to survive in a rapidly changing world. Dillon Beach, CA: Foundation for Critical Thinking. 

Walter, Y. (2024). Embracing the future of Artificial Intelligence in the classroom: The relevance of AI literacy, prompt engineering, and critical thinking in modern education. International Journal of Educational Technology in Higher Education, 21(15), 1–29. 

Elan Paulson

Elan Paulson, PhD, has been an educator in Ontario's higher education system since 2004. Before joining Conestoga as a Teaching and Learning Consultant, Elan was on the executive team at eCampusOntario. She previously served as Program Director and as an instructor in professional education programs at Western University's Faculty of Education. With a Master's in Educational Technology, Elan specializes in technology-enabled and collaborative learning to support diverse learners. She has also conducted research on faculty participation in communities of practice for professional learning and self-care.
