How To Design Courses That Develop Critical Thinking Skills

By Stefan · April 15, 2025

Getting students to think deeply is harder than it sounds. A lot of classes end up training recall: memorize the definition, repeat it on a quiz, move on. But critical thinking is different. It’s messy. It’s about judging evidence, spotting weak reasoning, and explaining how you got to your conclusion.

In my experience, students don’t “catch” critical thinking just by being told to. They need structured chances to wrestle with real decisions—then feedback on how they reason. That’s what this post focuses on: how to design courses where critical thinking is built into the objectives, the activities, and the assessments.

Below, I’ll walk you through what to do (and I’ll include a couple of plug-and-play templates you can steal).

Key Takeaways

  • Center the course on real problems. Scenario example: “A local clinic wants to cut costs by reducing appointment lengths.” Learning objective wording: “Students will evaluate trade-offs between cost, patient outcomes, and equity using specific evidence.” Rubric criterion: Evidence quality (0–4)—does the student cite relevant data or credible sources? Example prompt: “Which option would you recommend, and what evidence changes your mind?”
  • Write objectives that name the thinking. Example objective: “Students will distinguish between correlation and causation in a provided dataset and justify their classification.” Rubric criterion: Reasoning clarity—does the explanation connect claim → evidence → reasoning? Example prompt: “Label each statement as correlational or causal and explain why.”
  • Use interactive methods that force justification. Debate scenario: “Should schools adopt AI proctoring?” Rubric criterion: Counterargument handling (0–4)—does the student fairly address the opposing view and respond with evidence? Example prompt: “Respond to one peer’s claim by identifying a missing assumption.”
  • Build “what-if” and scenario analysis into every module. What-if scenario: “Gas prices triple overnight—predict effects on commuting, housing demand, and political debate.” Rubric criterion: Multiple perspectives—does the student consider at least two stakeholder groups? Example prompt: “What changes first, and what second-order effects follow?”
  • Assess critical thinking with tasks that require explanation. Assessment example: open-ended case write-up + reflection. Rubric criterion: Use of evidence and metacognition. Example prompt: “Describe your initial reasoning, then explain what evidence forced you to adjust it.”
  • For online courses, design for discussion quality—not just participation. Online activity: structured debate thread with “claim/evidence/reasoning + question back.” Rubric criterion: Peer engagement—does the student ask a follow-up that challenges reasoning? Example prompt: “What would you need to see to believe your conclusion?”

Ready to Create Your Course?

Try our AI-powered course creator to speed up drafting your course structure.

Start Your Course Today

Design Courses to Build Critical Thinking (Not Just Memorization)

If you want students to develop real-world thinking, you can’t rely on “content delivery” alone. You need to design the course so students repeatedly practice the same core moves: interpreting information, evaluating evidence, and explaining their reasoning in a way that others can challenge.

One practical way to do this is to start with problems that look like the world. Instead of “here are the theories of accounting,” try: “A small café’s profit is falling—how would you diagnose what’s driving it, using the numbers you’re given?” Students don’t just learn terms. They learn how to use information under uncertainty.

Here’s another habit that makes a huge difference: build in reflection that connects to decisions. Not “write what you learned.” I mean prompts like: “What assumption did you start with? What evidence did you use to challenge it?” Reflection is where students slow down and notice their thinking.

And don’t skip peer work. Critical thinking improves when students have to defend and respond. In a group critique, for example, students aren’t just submitting an answer—they’re reacting to someone else’s reasoning. That’s where the thinking gets sharper.

If you want to keep the whole course from turning into random activities, I like to anchor it to a clear plan. If you need help staying organized, review how to create a course outline so your modules ladder up to the critical thinking outcomes instead of drifting.

Set Objectives That Actually Describe Critical Thinking

“Students will improve critical thinking” is nice… but it’s not measurable. If you can’t tell what “good” looks like, students can’t aim at it, and you can’t assess it.

What I recommend is writing objectives that name the action and the thinking skill. For example:

Strong objective: “Students will evaluate two conflicting news sources on a current issue and justify which source provides more reliable evidence by identifying credibility indicators and reasoning flaws.”

Weak objective: “Students will understand media bias.”

When you break objectives into weekly steps, you also make course design easier. One week might be: “identify biases in opinion articles.” Another week: “distinguish credible data from unreliable data.” Notice how each one points to something you can observe in student work.

Then—this part matters—put the objectives in student-facing language. In my classes, students engage more when they can see the “why” behind the tasks. If you’re building a syllabus from scratch, it helps to use a structure you trust. This guide on making a course syllabus can save you time.

Mini-template: 60–90 minute lesson plan (Socratic questioning + evidence justification)

Lesson goal: Students learn to test a claim using evidence and reasoning.

Target level: Upper secondary / early college (works with adjustments).

Lesson topic (example): “Are algorithmic recommendations making people better informed?”

  • 0–10 min | Hook (short prompt): Show a 1-paragraph claim: “Recommendations reduce misinformation.” Ask students: “What would you need to know to believe this?” (No answers yet—just collect questions.)
  • 10–30 min | Socratic questioning cycle: Provide a short excerpt with evidence (or a chart). Students annotate it with a simple structure: Claim, Evidence, Assumptions, What’s missing. Then run a whole-class question sequence:
    • “What is the claim exactly?”
    • “How do we know it’s evidence, not just opinion?”
    • “What assumption is doing the heavy lifting?”
    • “What alternative explanation could fit the same evidence?”
  • 30–55 min | Small-group evidence test: In groups, students choose one assumption and challenge it with a “counter-evidence question.” Each group must produce:
    • 1 revised claim (if needed)
    • 1 piece of evidence they would look for
    • 1 question they’d ask to verify
  • 55–75 min | Whole-class share (structured): Each group shares using a 30-second script: “Our original assumption was… If that’s wrong, then… To check it, we would… ”
  • 75–90 min | Exit ticket (graded with a rubric): “Write one paragraph: defend or reject the original claim using at least 2 evidence-related reasons (not just ‘I agree’).”

Why this works: the questions force students to connect claim → evidence → reasoning, not just state opinions.

Use Teaching Methods That Force Reasoning

Lectures can still have a place. But if the class is mostly one-way, students don’t practice the thinking. Critical thinking needs active processing.

Here are methods that consistently work because they require justification:

  • Debates (with rules): Debates are powerful when students must cite evidence and respond to counterarguments. If you just say “argue your side,” you’ll get hot takes. Instead, require “claim → evidence → reasoning → response.”
  • Socratic questioning (with a target): “Why?” is great, but it can also turn into vague questioning. I like to pair Socratic questions with a specific reasoning target: causation vs. correlation, credibility, missing assumptions, or alternative explanations.
  • Group problem-solving (with roles): Put structure on group work. Give roles like “evidence checker,” “assumption spotter,” or “devil’s advocate.” It reduces the common problem where one student does all the thinking.

One resource I’ve used to tighten my approach to teaching is this guide to proven teaching strategies. The ideas aren’t magic; the value is in helping you pick strategies that match the outcomes you actually want.

Mixing methods matters too. But “mixing” shouldn’t mean random. I rotate between explanation, practice, and feedback. Students need time to learn the skill, then practice it, then improve it based on what you point out.

Ready to Create Your Course?

If you want help drafting the structure, try our course creator.

Start Your Course Today

Add Activities That Promote Critical Thinking (With Real Prompts)

Here’s the thing: students won’t “become” critical thinkers just because you assigned a reading. They become critical thinkers when they do tasks that require analysis and justification.

Scenario-based activities are a great start because they mirror the real world: incomplete information, trade-offs, and decisions with consequences. For example, in a business leadership module, students can get a scenario like:

Scenario: “Your company’s quarterly earnings are down 12%. Leadership proposes cutting training budgets and reducing onboarding time by 40%.”

Task: “Recommend a plan and justify it using evidence from the provided materials. Address at least one risk (quality, retention, customer impact) and explain how you weighed it.”

Then add “what-if” questions to force deeper reasoning. A good what-if is specific and forces second-order thinking:

What-if prompt (environmental studies example): “What if gasoline prices tripled overnight? Predict changes in commuting patterns, local politics, and household budgets. What evidence would you use to support each prediction?”

Role-play works well too, especially when each role has different incentives. In a healthcare policy simulation, you can assign students roles like a patient advocate, hospital administrator, and clinician. The twist? Each role must produce a short “reasoning memo” explaining how they decide—then they must respond to one opposing role’s memo.

And yes, these activities also build communication and teamwork. But I wouldn’t design them for that reason alone. The main win is that they force students to practice reasoning under pressure.

Assess Critical Thinking With Rubrics (So Students Know What “Good” Means)

If your assessment is just recall, you’ll only get recall. That’s why critical thinking assessments need to require explanation: interpreting, evaluating, and justifying.

Open-ended questions are a common and effective option. Instead of asking for a definition, ask for a decision with evidence. Example:

  • Instead of: “Define mutation.”
  • Try: “Given this mutation’s effect on protein function, predict how it might change an organism’s traits. What evidence supports your prediction?”

Project-based assessments are also great because they let students work through complexity over time. For instance, an urban planning project could ask students to propose solutions to traffic congestion, but require them to justify choices with data and assumptions. The goal isn’t just creativity—it’s reasoning you can follow.

Don’t overlook structured reflection. After a project, ask students to describe their thinking process. A prompt that works well:

Reflection prompt: “Which strategy did you start with, and why? What evidence (or feedback) changed your mind, and what new assumption did you adopt?”

To make grading consistent, you need criteria with scoring descriptors. Here’s a rubric you can use right away.

Mini-template: Critical Thinking rubric (0–4 scoring descriptors)

Use for: exit tickets, essays, discussion posts, case write-ups, or project proposals.

  • Criterion 1: Claim quality (0–4)
    • 4 = Clear, specific claim aligned to the prompt; no major ambiguity.
    • 3 = Mostly clear claim; minor vagueness.
    • 2 = Claim present but broad or partially off-target.
    • 1 = Claim unclear or inferred only.
    • 0 = No claim or irrelevant claim.
  • Criterion 2: Evidence use (0–4)
    • 4 = Uses relevant evidence (data, sources, examples) and explains how it supports the claim.
    • 3 = Evidence is mostly relevant; explanation is present but could be tighter.
    • 2 = Evidence is limited or not clearly connected.
    • 1 = Evidence is minimal or mostly anecdotal.
    • 0 = No evidence or incorrect evidence.
  • Criterion 3: Reasoning & logic (0–4)
    • 4 = Logic is coherent; addresses trade-offs and avoids obvious reasoning errors.
    • 3 = Logic mostly coherent with minor gaps.
    • 2 = Reasoning is partly present but jumps or oversimplifies.
    • 1 = Reasoning unclear or heavily flawed.
    • 0 = No reasoning beyond opinion.
  • Criterion 4: Counterarguments & assumptions (0–4)
    • 4 = Identifies at least one assumption and addresses a plausible counterargument.
    • 3 = Identifies an assumption or counterpoint but responds to it only partially.
    • 2 = Names an assumption or counterpoint without developing a response.
    • 1 = Minimal acknowledgment of alternatives.
    • 0 = Ignores alternatives entirely.
  • Criterion 5: Metacognition (reflection) (0–4) (only for reflection tasks)
    • 4 = Explains initial reasoning and what changed (and why).
    • 3 = Explains change with some reasoning.
    • 2 = Mentions change but lacks explanation.
    • 1 = Very limited reflection.
    • 0 = No reflection.

Example student responses (what “4” might look like):

  • Evidence use (4): “The article cites a dataset from 2021; because the sample includes both urban and rural schools, it’s more representative than the blog post that uses only one district.”
  • Counterarguments (4): “Even if test scores rise, there’s a risk of narrowing curriculum. If we can’t show long-term retention, the policy might not be worth it.”

Want to keep things manageable? Grade fewer pieces deeply. For example, use the rubric fully on the major assignment, then use 2–3 criteria for low-stakes checks.
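
If you track scores in a spreadsheet or a small script, the rubric above translates naturally into a data structure, and “grade fewer pieces deeply” becomes a one-line choice of criteria. Here’s a minimal Python sketch of that idea; the criterion keys, function name, and example scores are my own illustrations, not part of any particular gradebook tool:

```python
# Minimal sketch: the 0-4 rubric as a data structure, with selective grading.
# Criterion keys and example scores are illustrative placeholders.

RUBRIC = {
    "claim_quality": "Clear, specific claim aligned to the prompt",
    "evidence_use": "Relevant evidence, explained in support of the claim",
    "reasoning_logic": "Coherent logic that addresses trade-offs",
    "counterarguments": "Identifies an assumption, answers a counterpoint",
    "metacognition": "Explains initial reasoning and what changed",
}

def score(scores: dict[str, int], criteria: list[str] | None = None) -> float:
    """Average the 0-4 scores over the selected criteria (all by default)."""
    selected = criteria if criteria is not None else list(RUBRIC)
    for name in selected:
        if name not in scores:
            raise ValueError(f"Missing score for criterion: {name}")
        if not 0 <= scores[name] <= 4:
            raise ValueError(f"Score for {name} must be between 0 and 4")
    return sum(scores[name] for name in selected) / len(selected)

# Low-stakes exit ticket: grade just two criteria.
print(score({"evidence_use": 3, "reasoning_logic": 2},
            ["evidence_use", "reasoning_logic"]))  # 2.5

# Major assignment: grade all five, including metacognition.
print(score({"claim_quality": 4, "evidence_use": 3, "reasoning_logic": 3,
             "counterarguments": 2, "metacognition": 4}))  # 3.2
```

The same idea works in a plain spreadsheet: one column per criterion, and you average only the columns you actually graded for that task.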

Design Online Courses That Foster Critical Thinking

Online teaching can actually make critical thinking easier—if you design it that way. The tools are there: discussion prompts, multimedia, quizzes, and structured feedback. The danger is that online courses can turn into “watch and move on.” Don’t do that.

Start with discussion threads that require justification. A strong discussion prompt includes a position, evidence, and a constraint. Example:

Prompt: “Should governments regulate AI development? Post a claim, cite one piece of evidence from the provided materials, and explain your reasoning. Then reply to one peer by asking a question that tests their assumption.”

Notice the difference: students can’t just say “yes” or “no.” They have to show thinking.

Short multimedia can help too. I like to use videos around 4–6 minutes that present conflicting viewpoints. After the video, give a prompt that forces evaluation, like:

Prompt: “Which argument is stronger and why? Identify one missing piece of evidence in the weaker argument.”

Simulations are another option. Even simple branching scenarios (“You’re advising a city council; you only have partial data”) can push students to make judgments with incomplete information. That’s basically real life.
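
If your platform doesn’t have a scenario builder, a branching scenario is just a small graph of decision points, and even a short script can prototype one before you commit to a tool. A minimal sketch in Python, where the node names, scenario text, and choices are invented for illustration:

```python
# Minimal sketch of a branching scenario as a graph of decision nodes.
# All node names, text, and choices here are invented placeholders.

SCENARIO = {
    "start": {
        "text": ("You're advising a city council on bus routes, but you have "
                 "ridership data for only 3 of 12 routes. What do you do?"),
        "choices": {"a": ("commit", "Recommend cuts from the partial data."),
                    "b": ("delay", "Ask for two weeks to gather more data.")},
    },
    "commit": {
        "text": ("A council member asks: what if the unmeasured routes are "
                 "the busiest? Defend or revise your recommendation."),
        "choices": {},  # leaf node: students write a justification here
    },
    "delay": {
        "text": ("The budget vote is next week; waiting lets the default "
                 "plan pass. What evidence could you get in time, and how "
                 "would it change your advice?"),
        "choices": {},
    },
}

def run(node_key: str = "start") -> None:
    """Walk the scenario, asking for a choice at each branching node."""
    node = SCENARIO[node_key]
    print(node["text"])
    if not node["choices"]:
        return
    for key, (_, label) in node["choices"].items():
        print(f"  [{key}] {label}")
    picked = input("> ").strip().lower()
    while picked not in node["choices"]:
        picked = input("Pick one of the listed options > ").strip().lower()
    run(node["choices"][picked][0])

if __name__ == "__main__":
    run()
```

The design point is the leaf nodes: each branch should end in a judgment the student has to justify, not a “right answer” screen.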

If you’re choosing a platform, compare tools for interactive elements like discussion moderation, multimedia embedding, and quiz features. The best platform won’t teach the students for you—but it can support the kind of practice you’re designing.

If you want a research-backed framework for what to aim for, critical thinking is often broken down into skills like analysis, evaluation, inference, and explanation. To ground your course design in established definitions, look at resources from the Foundation for Critical Thinking and the OECD’s framework work on learning and skills. (You don’t need to cite everything in your syllabus, but it helps to know what you’re targeting.)

FAQs


What activities help students build critical thinking?

  • Debate (with evidence rule): “Should governments regulate AI development?” Students must include one credible source from the provided pack and respond with “assumption tested” (ask a question that challenges the logic, not just the opinion).
  • Case study: Give a short dataset + narrative. Ask: “What’s driving the outcome? Identify two plausible causes and explain which is more supported.”
  • Role-play: Stakeholders argue a policy change, but each role must write a 150–200 word reasoning memo before discussion.
  • Problem-solving: “You have incomplete data; what decision can you make now, and what would change your mind later?”

In all of these, the key is the same: students have to justify. If there’s no justification step, critical thinking practice is basically missing.


How do I assess critical thinking without doubling my grading time?

Use a rubric with a few criteria and grade consistently. If you’re short on time, do this:

  • Low-stakes check: Grade 2 criteria (Evidence use + Reasoning & logic) on exit tickets.
  • High-stakes assignment: Grade all criteria (including Counterarguments/Assumptions and Metacognition if it’s a reflection task).
  • Make examples part of the rubric: Include a “what a 4 looks like” sample response so students understand expectations.

One concrete open-ended question that works for grading quickly: “Which option would you recommend based on the case materials? Explain your reasoning and identify one assumption you made.”


Which teaching methods promote critical thinking?

  • Socratic questioning (targeted to evidence, assumptions, and alternative explanations)
  • Inquiry-based learning (students investigate a question and justify conclusions)
  • Collaborative learning with roles (evidence checker, skeptic, summarizer)
  • Problem-based learning (real constraints, trade-offs, and decision points)

What I’ve noticed is that the method matters less than the “thinking requirement.” If students can complete the task without explaining their reasoning, you won’t see much critical thinking growth.


What makes an online course effective for building critical thinking?

  • Clear objectives: objectives should name the thinking (evaluate, justify, infer, compare evidence).
  • Discussion structure: require claim/evidence/reasoning and a “question back” to peers.
  • Multimedia with evaluation prompts: don’t just post videos—attach a question that forces judgment.
  • Timely feedback: even short feedback (“your evidence supports X, but it doesn’t address Y”) improves reasoning.
  • Assessment that demands explanation: essays, reasoning memos, and reflection tasks outperform memory-only quizzes.

Example reflection prompt you can use online: “What did you assume at first? What evidence or peer feedback changed your reasoning, and what’s your updated conclusion?”
