Lately, I've been thinking about the gap between how we assess learning and how we actually think in the real world. And here's something to consider: Post-its > Blue books. Because if we want students to become critical thinkers in an AI world, we need to make their thinking visible, not just test their recall.

I've seen how design sprints help make that possible. At some point, I started to connect the dots: the strategies we use in design sprints - rapid idea generation, sorting, moving, deciding - aren't just for innovation teams. They're exactly what students need to practice real thinking, together. They offer structure for creativity and teach the skills we say we value:
✅ Collaboration
✅ Decision-making
✅ Strategic thinking
✅ Communication

When you apply design sprint practices to lessons, students don't just sit and "get it." They sort. They vote. They group. They move. They decide. Above all, they are learning real skills they can take with them when they graduate.

One of my student teachers joined a workshop I was leading on this at her district. She took the framework back to her classroom and saw an immediate shift. Same content, same kids. But this time, they were engaged, focused, and fully owning their thinking.

We don't need to throw everything out. We need to adapt with intention. And sometimes, that starts with a stack of Post-its and the freedom to move.
Classroom Activities for Critical Thinking
-
Today, I taught a 45-minute session on AI to high school students. (These students were recently admitted to Berkeley College.) This is what we did.

1️⃣ I handed everyone a sheet of paper and a pen.
2️⃣ I gave them a scenario and asked them to take notes. Here's the scenario:

***
You're taking a class. You're about to complete a group assignment with 3-4 other people. You get assigned Carl. Now, here's some information about Carl.

Carl is a genius. But he's a specific kind of genius. He spends almost all of his time on the internet, looking for information. Now, he doesn't read much. But he does remember a lot of it, because he has a photographic memory.

When you ask him questions, there's a 50% chance he'll regurgitate something he saw online. Sometimes it's Wikipedia. Sometimes it's another source. Sometimes he's not even sure. It just comes to him.

Now, about 25% of the time, he says something absolutely brilliant. He combines different sets of information, or he says something that really resonates with you. The other 25% of the time? He makes it up. He puts in random pieces of information, or he says something that's inaccurate. Or he says something that shows his bias.

One good thing about Carl: whatever you ask him to do, he'll do.

Oh, and remember: Carl is kind of a weird dude. Every time you ask him a question, he has a strange tic. In order to think, Carl needs to flick a light switch and dump out a bottle of water. The ritual helps him concentrate.
***

3️⃣ I asked students to flip the sheet of paper over. I had them spend 3 minutes making a plan: what are they going to do with Carl?
4️⃣ I had students pair up and share their ideas with each other. They worked together to create a plan for dealing with Carl.

At the end, we had quite a few ideas. Some students wanted to assign a fact-checker. Some students wanted to give Carl a careful set of instructions. Some students wanted to give Carl a "handler" to watch over him. Some students wanted to ignore Carl and not let him touch anything. Some students wanted to have Carl complete the entire group project.

We talked about the approaches and their pros and cons. This naturally led us to things like:
➢ Voice
➢ Collaboration
➢ How to set clear instructions
➢ Guarding against false information

Then we looked at ChatGPT. We talked about how this technology relates to our plan for Carl. We talked about what sort of mindset helps us out when we use AI.
-
In grad school, we had a professor pose a problem to us. He created a scenario where students in a 6th grade class suddenly performed poorly on tests. He put us in groups and asked us to think about how to solve this issue. It felt like we pitched him every possible solution. From reducing test-taking anxiety to developing better instructional and teaching methods, we mentioned just about everything we could think of. All the while, he kept shaking his head no in disappointment. He finally couldn't take it anymore and gave us a hint: what would cause this to "suddenly" happen? One of my peers, a teacher, asked if it involved budget cuts. His demeanor began to change, and we knew we were finally on the right track. After a few more educated guesses, we came to the conclusion that the school had cut the breakfast program. The students weren't performing well because they were hungry. It was an obvious answer that we should have asked about first. If basic needs aren't met, everything else is pointless. I often think about this as an instructional designer. I can create the best learning experiences possible on paper, but in the real world, I can't account for everything. Learning environments deeply matter. This story serves as a reminder that we need to think about every factor of the learning experience, from the obvious basic needs all the way to the finer intricacies of learning.
-
Learning flourishes when students are exposed to a rich tapestry of strategies that activate different parts of the brain and heart. Beyond memorization and review, innovative approaches like peer teaching, role-playing, project-based learning, and multisensory exploration allow learners to engage deeply and authentically. For example, when students teach a concept to classmates, they strengthen their communication, metacognition, and confidence. Role-playing historical events or scientific processes builds empathy, critical thinking, and problem-solving. Project-based learning, such as designing a community garden or creating a presentation, fosters collaboration, creativity, and real-world application. Multisensory strategies like using manipulatives, visuals, movement, and sound especially benefit neurodiverse learners, enhancing retention, focus, and emotional connection to content. These methods don't just improve academic outcomes; they cultivate lifelong skills like adaptability, initiative, and resilience. When teachers intentionally layer strategies that match students' strengths and needs, they create classrooms that are inclusive, dynamic, and deeply empowering. #LearningInEveryWay
-
Teaching doesn't have to stick to textbooks. Sometimes, it's the unconventional ideas that leave the deepest impact.

Imagine walking into a classroom to see an inflatable pool filled with water—and plastic bottles, wrappers, straws, and other trash floating on the surface. It's not just a setup; it's a vivid reminder of what our oceans face every single day. This creative approach to teaching ocean pollution allows students to see and feel the problem, making it real and urgent.

Here's why it works:
👉 Hands-On Learning: Students can interact with the setup—attempting to "clean up" the pool and realizing how challenging it is to remove every tiny piece of waste.
👉 Awareness Through Action: Seeing the pollution firsthand helps students connect emotionally, sparking curiosity and empathy for marine life.
👉 Critical Thinking: Discussions around the pool lead to questions like "How does this happen?" and "What can we do to prevent it?" It encourages students to think of sustainable solutions.
👉 Empowering Change: By the end of the lesson, the goal isn't just awareness—it's action. Students leave inspired to reduce single-use plastics and advocate for cleaner oceans.

Teaching creatively isn't just about making lessons fun—it's about making them unforgettable. And when we combine creativity with a purpose, the impact goes beyond the classroom.

💡 How can you add creativity to your teaching or work today? Let's start inspiring change together.

P.S. Small efforts can lead to big waves. What creative methods have you seen or used to teach important lessons?
-
We go through tons of math and physics problems in STEM education. And yet, we fail to leverage the most important learning: modeling.

A typical engineering problem might look like this: "Consider an elephant of mass X at the top of an inclined plane of height Z and angle Y. The elephant slips and rolls down. Determine how long it will take to reach the bottom." Then comes the killer note: "Assume the elephant is a perfect sphere with uniform density and no friction."

At this point, the problem is no longer engineering; it's just math. The student plugs in a formula, solves for time, and moves on. But real engineering isn't about solving equations. It's about making decisions and solving problems.

A better approach? Remove the note. Now, the student must:
✅ Define assumptions. Is friction negligible? Is a perfect sphere reasonable? Probably not, but making that assumption gives a lower bound for the time.
✅ Question the real-world implications. How does shape affect motion? How does friction change the problem?
✅ Recognize uncertainty. Maybe the elephant gets stuck. Maybe it never reaches the bottom.

Now, the student is forced to reason about bounding the problem rather than just computing a single number. This is engineering: not just applying formulas, but thinking critically, defining problems, and managing uncertainty.

Too often, we strip engineering problems of the complexity that makes them worth solving. But real-world engineers don't get neat assumptions; they get messy, ambiguous, imperfect systems. We should teach students to think like engineers, not calculators.

How can we improve problem modeling in engineering education?
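To make the assumption trade-off concrete, here is a small Python sketch (the 10 m / 30° numbers are my own illustration, not from the post) comparing the frictionless-sliding lower bound with a rolling-without-slipping model. The 5/7 factor comes from a uniform solid sphere's moment of inertia, I = (2/5)mr².

```python
import math

def time_to_bottom(height, angle_deg, model="slide", g=9.81):
    """Time to descend an incline of given height (m) and angle (deg).

    model="slide": frictionless sliding, a = g*sin(theta) -- the lower bound.
    model="roll":  rolling without slipping (uniform solid sphere),
                   a = (5/7)*g*sin(theta), since rotation absorbs energy.
    """
    theta = math.radians(angle_deg)
    length = height / math.sin(theta)   # distance traveled along the slope
    a = g * math.sin(theta)
    if model == "roll":
        a *= 5 / 7                      # I = (2/5) m r^2 for a solid sphere
    return math.sqrt(2 * length / a)    # from length = (1/2) a t^2

t_slide = time_to_bottom(10, 30)            # ≈ 2.86 s for a 10 m, 30° incline
t_roll = time_to_bottom(10, 30, "roll")     # ≈ 3.38 s: rolling is slower
```

Changing one assumption moves the answer by nearly 20%, which is exactly the kind of sensitivity the unannotated problem forces students to confront.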
-
Stop having students do assignments (have AI do the work instead).

This is one of my favorite right-now-in-the-AI-era activities. Here's how this one works: instead of writing the essay or solving the case, students prepare everything an AI would need to succeed. The challenge isn't in doing the work as much as it is in designing the system that makes the work possible. This is not only better learning; it is also a critical work skill, where codifying workflows and expertise unlocks system efficiencies and makes room for subject-matter outputs.

Activity Breakdown

1️⃣ Understand the Task
- Instructor drops in a sample prompt (e.g., "Write a persuasive essay on the ethics of AI in education").
- Students analyze what's really being asked.

2️⃣ Build the Training Package
- Essential facts, definitions, and concepts.
- Key sources (and one or two to *ignore* — with reasons why).
- Examples of good responses, common mistakes to avoid.
- A metaphor, diagram, or hook that captures the core idea.
- A step-by-step process the AI should follow.
- A rubric or checklist for evaluating the result.

3️⃣ Optional Challenges
- Test the package by prompting AI and critiquing the output.
- Swap with a partner to see if their package gets a strong result.
- Revise for clarity and efficiency.

4️⃣ Reflection
- Which was harder: doing the task or training the bot?
- What gaps in your own understanding did this reveal?

This activity shifts the focus from doing the task to designing the system that produces expert work. That's just good metacognition right there:
- Breaking down complex assignments into clear, repeatable steps
- Codifying what quality looks like (and what to avoid)
- Building the skills to guide and evaluate AI outputs

That's not only better learning in the classroom — it's also preparing students for a workplace where the most valuable people aren't the fastest doers, but the clearest system designers. ✨

How would you modify this activity to make it even better?
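As a sketch of what "codifying the workflow" might look like once a student's training package is written down, here is a hypothetical Python structure for it. All field names, the sample content, and the `build_prompt` helper are my own illustration, not part of the activity as described:

```python
from dataclasses import dataclass

@dataclass
class TrainingPackage:
    """One student's 'everything the AI needs' bundle (illustrative fields)."""
    task: str
    key_facts: list[str]
    sources_to_use: list[str]
    sources_to_ignore: list[str]   # with reasons why, per the activity
    good_example: str
    process_steps: list[str]
    rubric: list[str]

def build_prompt(pkg: TrainingPackage) -> str:
    """Assemble the package into a single prompt the AI can follow."""
    parts = [
        f"Task: {pkg.task}",
        "Key facts: " + "; ".join(pkg.key_facts),
        "Use these sources: " + ", ".join(pkg.sources_to_use),
        "Ignore these sources: " + ", ".join(pkg.sources_to_ignore),
        f"A strong response looks like: {pkg.good_example}",
        "Follow these steps: " + " -> ".join(pkg.process_steps),
        "Self-check against this rubric: " + "; ".join(pkg.rubric),
    ]
    return "\n".join(parts)

# Hypothetical example package for the sample prompt above.
pkg = TrainingPackage(
    task="Write a persuasive essay on the ethics of AI in education",
    key_facts=["definition of academic integrity", "what AI can and cannot verify"],
    sources_to_use=["assigned course readings"],
    sources_to_ignore=["unsourced blog posts (no editorial review)"],
    good_example="States a claim, concedes a counterargument, cites evidence.",
    process_steps=["outline", "draft", "self-check against the rubric"],
    rubric=["clear thesis", "addresses a counterargument", "cites sources"],
)
prompt = build_prompt(pkg)
```

Making students fill in a structure like this surfaces the same gaps the reflection step asks about: an empty `rubric` or vague `process_steps` is immediately visible.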
-
Most A/B tests look simple on the surface: two versions, one outcome, run a t-test, done. But what if your entire analysis is built on a faulty assumption? To help my students spot these hidden traps, I created a synthetic dataset (https://lnkd.in/epyqmxTC) that mirrors a common real-world scenario. In this example, we simulate 75 students spread across 10 classes. Each class, not each student, is randomly assigned to either Design A (control) or Design B (treatment) of an educational platform. Students then use their assigned version, and we measure how long they spend on the page as a proxy for engagement. At first glance, Design B appears to outperform Design A. A few of the B-assigned classes show noticeably higher average time on page. This is exactly where things can go wrong. Without proper statistical training, someone might look at this and immediately run Welch's t-test to compare students in Design A versus Design B. The logic sounds straightforward: there are two conditions, one continuous variable, and Welch's test even adjusts for unequal variances, so it seems like a safe choice. But it is not the right tool in this case. The issue is that treatment was assigned at the group level. Classes, not individual students, were randomized. That means the data points are not truly independent. Students within a class tend to behave similarly because of shared dynamics such as the same teacher, classroom environment, or peer effects. Welch's t-test, like any traditional t-test, assumes each observation is unrelated to the others. When that assumption is violated, the p-values it produces can give a false sense of certainty. In this dataset, Welch's t-test produced a very small p-value (p = .00007), suggesting a strong and statistically significant effect of Design B. But when we analyzed the same data using a linear mixed-effects model that properly accounted for the fact that students were nested within classes, the result changed.
The effect of Design B was no longer statistically significant (p = .109). What seemed like a convincing treatment effect was actually driven by just two of the ten classes. The other classes showed no clear difference. This example has direct consequences for how teams make decisions: acting on the naive result can mean wasted development time, unnecessary marketing costs, and operational effort spent on a change that delivers no real value. While the scenario may seem straightforward, it highlights a deeper issue: without a strong grasp of experimental design and statistical modeling, it's easy to apply the wrong test, misread the outcome, and move forward with misplaced confidence. Even one misstep can turn into a failed product launch or a missed opportunity. Statistical reasoning isn't just a technical skill; it's a critical part of producing research that supports sound, evidence-based decisions. Please learn stats and methods before doing UX research!
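The failure mode described here can be reproduced with a stdlib-only sketch (the numbers, seed, and class structure below are hypothetical; the linked dataset is not used). With a shared class effect and no true treatment effect, the naive analysis treats 75 students as independent observations, while the honest unit of analysis is the 10 class means:

```python
import math
import random
import statistics

def welch_t(x, y):
    """Welch's t statistic for two samples with unequal variances."""
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (statistics.mean(x) - statistics.mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

random.seed(42)
data = {}  # class id -> (arm, time-on-page values); 5 classes per arm, 75 students total
for c in range(10):
    arm = "A" if c < 5 else "B"
    class_effect = random.gauss(0, 8)  # shared classroom effect; NO true treatment effect
    scores = [60 + class_effect + random.gauss(0, 3) for _ in range(7 if c < 5 else 8)]
    data[c] = (arm, scores)

# Naive analysis: pool all students, ignoring the clustering
a = [s for arm, scores in data.values() if arm == "A" for s in scores]
b = [s for arm, scores in data.values() if arm == "B" for s in scores]
print("student-level Welch t:", round(welch_t(a, b), 2))

# Cluster-aware analysis: one mean per class, i.e. 10 roughly independent observations
ma = [statistics.mean(scores) for arm, scores in data.values() if arm == "A"]
mb = [statistics.mean(scores) for arm, scores in data.values() if arm == "B"]
print("class-level Welch t:  ", round(welch_t(ma, mb), 2))
```

Because students within a class share `class_effect`, the pooled test's effective sample size is far smaller than 75, and its t statistic (and p-value) is typically badly overconfident. Aggregating to class means is the simplest honest fix; the fuller approach the post describes, a linear mixed-effects model, needs a library such as statsmodels (`MixedLM`).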
-
Over time, my approach to teaching graduate classes has shifted towards creating an environment where students act more like a group of consultants tackling real-world, data-driven problems. Instead of simply following theoretical frameworks, students now dive into real-life datasets, analyze trends, and craft creative solutions. This hands-on method encourages them to think critically and out of the box—steering away from the temptation of copy-pasting from AI tools like ChatGPT. The focus isn’t just on solving problems; it’s about viewing challenges from different perspectives. By engaging with diverse datasets, students learn to approach problems with fresh eyes, ensuring a deeper retention of knowledge. It also makes the learning process more interactive and fun! This week, we focused on conducting data-driven SWOT analyses. Students worked in teams, using multiple datasets to identify strengths, weaknesses, opportunities, and threats. Along the way, they developed their soft skills, learned the value of collaboration, and strengthened their ability to work effectively in groups. This approach not only prepares students for real-world consulting roles but also equips them with the skills to think critically, collaborate, and adapt to a rapidly evolving business landscape. #DataDrivenLearning #ConsultingSkills #RealWorldProblems #GraduateEducation #CriticalThinking #OutOfTheBox #SWOTAnalysis #SoftSkillsDevelopment #CollaborativeLearning #FunInTheClassroom #BusinessEducation #InnovationInTeaching #HigherEd
-
When Students Over-Rely on ChatGPT, Critical Thinking Suffers—Here's How to Turn the Tide

Educators worldwide are seeing an unsettling trend: students increasingly defaulting to ChatGPT for essays, problem-solving, and research. The immediate result? Polished homework with minimal effort. The long-term impact? A real risk to the very skills we strive to instill—critical thinking and problem-solving.

But it doesn't have to be this way. ChatGPT can be a powerful ally if we (1) acknowledge its limitations, (2) teach students how to use it responsibly, and (3) design activities that still require their brainpower. Here are concrete strategies—both in and out of the classroom—to flip ChatGPT from crutch to catalyst:

1️⃣ In-Class Engagement
- Think-Pair-Share-ChatGPT: Pose a question, let students first discuss in pairs, then compare their ideas with ChatGPT's answer. Have them critique the bot's reasoning, exposing gaps and sharpening their own analyses.
- Fact-Checking Face-Off: Challenge small groups to verify the references ChatGPT provides. They'll quickly see its "credibility" can be smoke and mirrors, reinforcing the need for proper research and source validation.

2️⃣ Homework Hacks
- Two-Version Assignments: Encourage students to submit one draft written themselves and one draft generated (or revised) by ChatGPT—then highlight and explain every change. They learn that blindly copying AI output often produces superficial work.
- ChatGPT as Peer Reviewer: Ask the bot for feedback or counterarguments—and have students defend which suggestions they accept or reject. This fosters deeper reflection and ownership.

3️⃣ Project-Based Learning (PBL) Approaches
- Authentic Audiences: Require real-world deliverables (e.g., presentations to the local council, kids' books for a younger class). ChatGPT can supply initial facts, but students must tailor and translate knowledge for a specific audience—no bot can do that seamlessly.
- Process Show-and-Tell: Have students document how they arrived at each conclusion, including any AI prompts. If ChatGPT did most of the heavy lifting, it'll be obvious in their final presentation—and in their understanding (or lack thereof).

The bottom line? ChatGPT isn't the end of critical thinking—unless we let it be. By designing assignments that value process over one-click answers, we can harness AI to enhance rather than erode our students' intellectual growth.

Check out "ChatGPT vs. Critical Thinking: Friend, Foe, or Frenemy in the Classroom?" by Ruopeng An for a deep dive into research, anecdotes, and classroom-tested ideas. Let's equip the next generation to use AI as a thought partner—not a substitute for thinking. Share this post with fellow educators who might be wrestling with the same issues, and let's ignite a new wave of critical thinkers!

#ChatGPT #CriticalThinking #Teaching