I made a big mistake designing a training program this week and I want to help you avoid doing the same thing. Here is the situation: I was working with a long-time client, building new content for their manager training program. We are in the early stages of development, and I spent a good deal of time this week writing out all of the copy and engagements for the courses. All that work put us back on schedule...or so I thought. The problem was that a lot of the content was wrong. Why was it wrong? This is where the mistake comes in. We had agreed on an initial outline for the courses. But that outline only covered the basic ideas; critically, it didn't spell out the outcomes or behaviors the client wanted. I had worked with them for a while, so I thought I knew what they wanted. I didn't follow our standard, proven process of confirming alignment with all of the stakeholders at each stage of development. Instead, I tried to skip ahead. Skipping your process while designing training will never save you time. Step 1: Identify and agree on the impact you are trying to have. Step 2: Identify and agree on the behaviors you are trying to change. Step 3: Identify and agree on the knowledge you need in the course. Step 4: Now you can build out the content and make it engaging. Luckily, we were able to quickly identify the gaps and adjust the content to have the right focus. But it is always better not to make those mistakes in the first place. What reminders do you use to stick to your development process? #InstructionalDesign #EmployeeTraining #FailWhale #Mistakes
Training Needs Assessment Methods
-
8 reasons why skipping needs analysis will kill your training transfer and keep the learning from sticking: 1. Misaligned Objectives When you skip needs analysis, you miss the chance to align training objectives with actual business goals. This mismatch leads to wasted efforts and resources. 2. Irrelevant Content Without needs analysis, training content may not address the specific needs of the learners. Irrelevant content fails to engage and motivate participants. 3. Poor Learner Engagement Needs analysis helps identify what motivates learners. Skipping it can result in low engagement and participation. 4. Ineffective Training Methods Different learners have different needs. Needs analysis helps tailor the training methods to suit the audience, making the training more effective. 5. Lack of Context Understanding the context in which learners will apply new skills is crucial. Without this insight, training may not be practical or applicable. 6. No Measurement of Success Needs analysis sets the baseline for measuring training effectiveness. Without it, you can't accurately assess whether the training met its goals. 7. Resistance to Change Learners are more likely to resist training that doesn't seem relevant to their roles. Needs analysis helps in designing training that addresses real challenges, reducing resistance. 8. Wasted Resources Investing in training that doesn't meet the needs of the learners is a waste of time and money. Needs analysis ensures that resources are used efficiently to achieve desired outcomes. Do you agree with these points? Is there anything else you've experienced when skipping the needs analysis stage? #instructionaldesign #edtech
-
Recently I listened to Luke Hobson, EdD's podcast on #andragogy, or the science behind #adultlearningtheory. It was incredibly insightful to hear him explain the six major principles in a simple-to-understand way! I was also very encouraged to notice I've already been using a lot of these in some form (though there's always room for growth and improvement!). Here's a quick recap of my big takeaways: ✴️ "The Why Being the Ask" - Clarifying for the target audience how the learning will help them in their jobs. One great tip I got was to add a statement with each learning objective about its relevance (emphasis on relevance is a HUGE part of my instructional processes!). ✴️ Acknowledging what the audience already knows / past experiences - At the start of each course, I always survey my learners about their learning preferences, interests, and goals, as I make every effort to connect and cater the learning experiences to the interests and needs of my learners. While this may seem like a big task, by involving different forms of presenting information and ways to gather responses, I am usually able to meet the expectations and preferences of my learners. This also involves the concept of #universaldesignforlearning or #udl, which promotes the idea of having multiple, flexible means of accomplishing #learninggoals (I've got a great suggestion for a book to help flesh this out; if you're interested, see the comments!). ✴️ Relevance - As mentioned above, going beyond just the theory to make clear the relevance of the learning is paramount! ✴️ Self-concept - Again, implementing UDL practices helps in providing flexibility for how learners want to learn! ✴️ Problem-Centric - Incorporating real-world problems for learners to solve (my favorite way is using #scenariobasedlearning, though case studies are also another great means); incorporating quality interactions and breaks also helps with this. 
I liked Luke's suggestion of the 70/30 divide - 70% "action," 30% exploring instructional content. ✴️ Last but not least, intrinsic motivation - Inspiring learners to believe in themselves and their abilities and encouraging a #growthmindset. This is always my goal as an instructor. Some great suggestions Luke had for supporting this are regular check-ins and giving learners feedback and praise for responses (again, I strive to do this as often as possible). This podcast also inspired me with some additional ideas that I look forward to implementing in the near future! What are your thoughts? What are your best tips for supporting adult learning? I'd love to hear your thoughts in the comments! #instructionaldesign #learninganddevelopment #learningexperiencedesign #bettereveryday
-
When it comes to Professional Development Units (PDUs), do you know what you are looking for? I asked some of the certified professionals I've met this question, and many of them responded: "60 every 3 years!" Their answer is partially correct: if all you care about is keeping your certification active, 60 PDUs will accomplish that. Post-1984, the PMP committee realized that simply amassing 60 PDUs in that 3-year period was inadequate, and the concept of the Talent Triangle was implemented to ensure that the 60 PDUs included a minimum in each of the three Talent Triangle categories (currently Ways of Working, Business Acumen, and Power Skills), in hopes of maintaining a balance in the person's project management knowledge base. Earning the PDUs necessary to maintain your certification is neither difficult nor expensive. In fact, all the PDUs you need in each 3-year period can be found easily and free of charge. But I think there is a whole lot more to PDUs than meets the eye. PDUs are meant to keep you current with the constantly evolving Body of Knowledge. For that reason, the PMI standard, the PMBOK, is revised approximately every 4 years. These revisions to the PMBOK reflect the advancement of the inner workings of the profession. When you earn your certification, your knowledge base should be consistent with the state of the profession at that time relative to tools, techniques, etc. However, as time passes and new concepts, tools, and techniques come along, they can render your knowledge base inadequate or inferior. This is the real purpose of PDUs -- to give everyone a chance to practice continuous learning and stay synchronized with the current state of the profession's knowledge base. A simple example: professionals who were certified several years ago, when AI was not being talked about much (if at all), would need to take learning programs that augment their personal knowledge base to stay current. 
So, at the beginning of each year, you must conduct a personal "knowledge assessment" to determine where you have gaps relative to the current state of the profession. You must conduct this assessment with a candid examination of your weaknesses, and plan the learning programs that will fill those gaps. From a career perspective, earning 60 PDUs is not nearly as critical as earning the right PDUs, the ones that keep your knowledge base current. Don't look for easy PDUs; look for the right PDUs. The question might be: "If you earned your PMP in 1990 and you have not practiced targeted learning since, are you actually still a PMP?" It's your career, manage it! #leepmp Dave Garrett Sierra Hampton-Simmons Gina Alesse
-
Training without measurement is like running blind—you might be moving, but are you heading in the right direction? Our Learning and Development (L&D) and training programs must be backed by data to drive business impact. Tracking key performance indicators ensures that training is not just happening but actually making a difference. What questions can we ask to ensure that we are getting the measurements we need to demonstrate a course's value? ✅ Alignment Always ✅ How is this course aligned with the business? How SHOULD it impact the business outcomes? (i.e., more sales, reduced risk, speed, or efficiency) Do we have access to performance metrics that show this information? ✅ Getting to Good ✅ What is the goal we are trying to achieve? Are we creating more empathetic managers? Creating better communicators? Reducing the time to competency of our front line? ✅ Needed Knowledge ✅ Do we know what they know right now? Should we conduct pre- and post-assessments of knowledge, skills, or abilities? ✅ Data Discovery ✅ Where is the performance data stored? Who has access to it? Can automated reports be sent to the team monthly to determine the impact of the training? We all know the standard metrics - participation, completion, satisfaction - but let's go beyond the basics. Measuring learning isn’t about checking a box—it’s about ensuring training works. What questions do you ask - to get the data you need - to prove your work has an awesome impact? Let’s discuss! 👇 #LearningMetrics #TrainingEffectiveness #TalentDevelopment #ContinuousLearning #WorkplaceAnalytics #LeadershipDevelopment #BusinessGrowth #LeadershipTraining #LearningAndDevelopment #TalentManagement #Training #OrganizationalDevelopment
-
𝗧𝗵𝗲 𝗜𝗺𝗽𝗼𝗿𝘁𝗮𝗻𝗰𝗲 𝗼𝗳 𝗙𝗲𝗲𝗱𝗯𝗮𝗰𝗸 𝗶𝗻 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗮𝗻𝗱 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 🗣️ Ever feel like your Learning and Development (L&D) programs are missing the mark? You're not alone. One of the biggest pitfalls in L&D is the lack of mechanisms for collecting and acting on employee feedback. Without this crucial component, your initiatives may fail to address the real needs and preferences of your team, leaving them disengaged and underprepared. 📌 And here's the kicker—if you ignore this, your L&D efforts risk becoming irrelevant, wasting valuable resources, and ultimately failing to develop the skills your workforce truly needs. But don't worry—there’s a straightforward fix: integrate feedback loops into your L&D programs. Here’s a clear plan to get started: 📝 Surveys and Questionnaires: Regularly distribute surveys and questionnaires to gather insights on what’s working and what isn’t. Keep them short and focused to maximize response rates and actionable feedback. 📝 Focus Groups: Organize small focus groups to dive deeper into specific issues. This setting allows for more detailed discussions and nuanced understanding of employee needs and preferences. 📝 Real-Time Polling: Use real-time polling tools during training sessions to gauge immediate reactions and make on-the-fly adjustments. This keeps the learning experience dynamic and responsive. 📝 One-on-One Interviews: Conduct one-on-one interviews with a diverse cross-section of employees to get a more personal and detailed perspective. This can uncover insights that broader surveys might miss. 📝 Anonymous Feedback Channels: Ensure there are anonymous ways for employees to provide feedback. This encourages honesty and helps identify issues that employees might be hesitant to discuss openly. 📝 Feedback Integration: Don’t just collect feedback—act on it. Regularly review the feedback and make necessary adjustments to your L&D programs. Communicate these changes to employees to show that their input is valued and acted upon. 
📝 Continuous Monitoring: Use analytics tools to continuously monitor engagement and performance metrics. This provides ongoing data to help refine and improve your L&D initiatives. Integrating these feedback mechanisms will not only enhance the effectiveness of your L&D programs but also boost employee engagement and satisfaction. When employees see that their feedback leads to tangible changes, they are more likely to be invested in the learning process. Have any innovative ways to incorporate feedback into L&D? Drop your tips in the comments! ⬇️ #LearningAndDevelopment #EmployeeEngagement #ContinuousImprovement #FeedbackLoop #ProfessionalDevelopment #TrainingInnovation
-
I don't believe learners are dumb but... I do believe people's wants can prevent them from addressing their needs. Too many times when we're analyzing a situation to define a solution, we get hung up on... "What do the people say they want?" They tell us they're "visual learners," or they just want microlearning, or they just need more training. And we believe them. So we create flashy videos, or thousands of microlearning modules, or an entire year-long curriculum... only to find out none of it works. Or even worse, they don't even engage with it. Because what we've failed to do is look at what they really need through the lens of our expertise as learning and performance professionals. We've failed to see that by "visual learner" they meant they wanted a tutorial but also an opportunity to tinker, to practice, to get feedback, and to try again before they're unleashed into reality. We've failed to see that by "microlearning," they actually just meant they needed the content to be more focused, more relevant, and more timely. We've failed to see that by "more training," they actually needed managers who acknowledged when they were doing the job well and supported confidence-building. How many times have you ignored needs in favor of wants because you failed to really gather data and analyze needs? I can help you get past that. It starts with going From Data to Design. #InstructionalDesign #eLearning #LearningAndDevelopment #TransitioningTeacher #Consulting #TalentDevelopment #LXD #LearningDesign
-
Skill Assessment: The Game-Changing 4-Day Blueprint Most teams are playing Career Roulette. Not You. No guessing. No assumptions. Just clarity and action. (Note: If you have not DEFINED the skills to be assessed, start there - check yesterday’s post for guidance.) Here is the 4-step playbook to map your team's capabilities - fast! 𝗦𝘁𝗲𝗽 1: 𝗔𝘀𝘀𝗲𝘀𝘀𝗺𝗲𝗻𝘁 𝗠𝗲𝘁𝗵𝗼𝗱 (Day 1) Don’t overcomplicate it. Speed + Simplicity = Results. Tap into these 3 feedback channels: • Self-Assessment: What do they believe they are great at? • 360 / Peer Review: What do peers see that they don’t? • Leadership Evaluation: What do you see from the top? Tip: Use a simple 1-5 rating system. No overthinking. Example scorecard for each role: - Technical Proficiency - Customer Service Care - Problem-Solving Speed - Collaborative Potential 𝗦𝘁𝗲𝗽 2: 𝗖𝗹𝗮𝗿𝗶𝘁𝘆 𝗠𝗮𝗽𝗽𝗶𝗻𝗴 - 𝗣𝗹𝗮𝗻 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 (Day 2) Before you collect feedback, lock in these critical details: - Objective: Why are we doing this? - Metrics: What skills are we actually measuring? - Timeline: When will it start and finish? - Analysis: How will we interpret the results? - Next Steps: What will we do with the data? This step prevents confusion and creates alignment. Skipping it can leave you with data overload and no direction. 𝗦𝘁𝗲𝗽 3: 𝗖𝗼𝗻𝗳𝗶𝗱𝗲𝗻𝘁𝗶𝗮𝗹 𝗖𝗼𝗹𝗹𝗲𝗰𝘁𝗶𝗼𝗻 - 𝗖𝗼𝗻𝗱𝘂𝗰𝘁 𝘁𝗵𝗲 𝗔𝘀𝘀𝗲𝘀𝘀𝗺𝗲𝗻𝘁 (Day 3) Data only works if people are honest. Here’s how you get it: - Anonymize it: People are more honest this way. - Ensure Psychological Safety: No fear of being punished for honesty. - Train Assessors: Consistent evaluation beats biased judgment. With this approach, you will get the truth instead of sugar-coated feedback. 𝗦𝘁𝗲𝗽 4: 𝗦𝗸𝗶𝗹𝗹 𝗦𝘁𝗿𝗲𝗻𝗴𝘁𝗵 & 𝗚𝗮𝗽 𝗦𝘁𝗿𝗮𝘁𝗲𝗴𝘆 (Day 4) The data is in. Now, take action. Here’s how you do it fast: - Identify Top 3 Skill Strengths & Gaps - Align Skills to Business Goals: Results start here. - Develop an Improvement Plan (more on this tomorrow) This is where good teams become great. 
You are not just collecting data. You are building a team of peak performers. No team? This blueprint works for personal development too. Which skill is most critical for your team to assess right now? P.S. I just ran this process with a team and found our top development need is Marketing. What is yours?
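For teams that track the scorecard in a spreadsheet or script, the aggregation step above is simple to automate. Here is a minimal sketch, assuming hypothetical ratings: it averages each skill's 1-5 scores across the three feedback channels (self, peer/360, leadership) and ranks skills from strength to gap. The skill names come from the example scorecard; everything else (function names, data layout) is illustrative, not a prescribed tool.

```python
from statistics import mean

# Hypothetical 1-5 ratings for one team member, from the three
# feedback channels described above: self, peer (360), leadership.
ratings = {
    "Technical Proficiency":   {"self": 4, "peer": 3, "leadership": 3},
    "Customer Service Care":   {"self": 5, "peer": 4, "leadership": 5},
    "Problem-Solving Speed":   {"self": 3, "peer": 2, "leadership": 2},
    "Collaborative Potential": {"self": 4, "peer": 4, "leadership": 3},
}

def score_skills(ratings):
    """Average each skill across channels, then rank highest to lowest."""
    averaged = {skill: round(mean(channels.values()), 2)
                for skill, channels in ratings.items()}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)

ranked = score_skills(ratings)
strengths = ranked[:2]   # top of the ranking = strengths to leverage
gaps = ranked[-2:]       # bottom of the ranking = development gaps
print("Strengths:", strengths)
print("Gaps:", gaps)
```

A large spread between the self rating and the peer/leadership ratings for the same skill is itself useful data (a self-awareness gap), so it can be worth flagging that delta alongside the average.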
-
Kirkpatrick mistake: — Taking a bottom-up approach for each separate program. — Your programs should not define new desired behaviors and results. Your desired results and behaviors should define new programs. When you start at the bottom, the focus is how people will react to and learn from the training, usually followed by a forced attempt to connect a new set of behaviors and outcomes to the training (when none end up getting measured or connected to those separate programs in a meaningful way). A top-down approach looks like this: - What are the defined goals at the business, process, and performer level? (You’ll need to convince your leadership to define and drive alignment on these if they aren’t in place today.) - What do the performers need to DO, and how can you ensure all the environmental performance dependencies are in place? - For the processes/performers with knowledge/skill gaps, create your formal instruction programs to fill them. Only then can your learning/training programs be consistently and reliably tied to levels 3 and 4. #salesenablement #salestraining
-
Here is something many of us in CPD think about often: Does ongoing learning really make a difference in patient care? According to this research letter published in JAMA Internal Medicine, the answer is yes. In this study of over 4,000 hospitalists participating in the ABIM Longitudinal Knowledge Assessment (LKA), those scoring in the top quartile had significantly lower 7-day mortality and readmission rates for their patients compared to those in the bottom quartile. In other words, the physicians who engaged regularly with a longitudinal, feedback-informed learning model rather than a once-a-decade high-stakes test had patients with better outcomes. This reinforces what many of us have been saying for years: - Learning should be continuous, not episodic - An ongoing assessment approach should identify individual gaps and inform future learning, replacing models that just test retention - And yes, as Cervero and Gaines emphasized nearly a decade ago, multiple interventions are far more effective than one-and-done activities The LKA isn’t just a replacement for the traditional MOC exam. It’s a shift in philosophy toward relevance, reflection, and real-time application. It also helps align the goals of CPD and Board Certification, even if the systems remain administratively separate. And it addresses a question that many in health professions education are currently asking: What is the value of high-stakes exams, and are they needed? Maybe it's time we stop treating lifelong learning as a checkbox and start treating it as a strategy for better care. Imagine that. Also see: Cervero RM, Gaines JK. Acad Med. 2015;90(12):1778–1783. https://lnkd.in/gKCpjr45