The DOJ consistently says that compliance programs should be effective, data-driven, and focused on whether employees are actually learning. Yet... the standard training "data" is literally just completion data! Imagine if I asked a revenue leader how their sales team was doing and the leader said, "100% of our sales reps came to work today." I'd be furious! How can I assess effectiveness if all I have is an attendance list?

Compliance leaders I chat with want to move to a data-driven approach, but change management is hard, especially with clunky tech. Plus, it's tricky to know where to start – you often can't go from 0 to 60 in a quarter. In case this serves as inspiration, here are a few things Ethena customers are doing to make their compliance programs data-driven and learning-focused:

1. Employee-driven learning: One customer asks, at the beginning of their code of conduct training, "Which topic do you want to learn more about?" and then offers a list. Employees get different training based on their selection... and no, "No training pls!" is not an option. The compliance team gets to see which issues are top of mind and can then focus on those topics throughout the year.

2. Targeted training: Another customer asks, "How confident are you raising bribery concerns in your team?" and then analyzes the data by department and country. They've identified the top 10 teams to focus their ABAC training and communications on, because prioritization is key.

You don't need to move from the traditional, completion-focused model to a data-driven program all at once. But take incremental steps to layer on data that surfaces risks and lets you prioritize your efforts. And your vendor should be your thought partner, not the obstacle, in this journey! I've seen Ethena's team work magic navigating concerns like PII and LMS limitations – it can be done!
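The targeted-training idea above boils down to a simple aggregation: average each team's self-reported confidence and surface the lowest-scoring teams. A minimal sketch, assuming invented survey rows and a hypothetical 1-5 confidence scale (the departments, countries, and scores are illustrative, not real data):

```python
from collections import defaultdict

# Hypothetical responses to "How confident are you raising bribery
# concerns in your team?" as (department, country, confidence 1-5).
responses = [
    ("Sales", "BR", 2), ("Sales", "BR", 3), ("Sales", "US", 4),
    ("Engineering", "DE", 5), ("Engineering", "DE", 4),
    ("Finance", "IN", 2), ("Finance", "IN", 1),
]

def rank_teams_by_confidence(rows, n=10):
    """Return the n (department, country) teams with the lowest
    average confidence — candidates for targeted ABAC training."""
    scores = defaultdict(list)
    for dept, country, score in rows:
        scores[(dept, country)].append(score)
    averages = {team: sum(s) / len(s) for team, s in scores.items()}
    return sorted(averages, key=averages.get)[:n]

print(rank_teams_by_confidence(responses, n=3))
```

With real data you would feed this from your survey export and a larger `n`, but the prioritization logic stays this small.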
Conducting Post-Training Evaluations
-
📈 Unlocking the True Impact of L&D: Beyond Engagement Metrics 🚀

I am honored to once again be asked by the LinkedIn Talent Blog to weigh in on this important question. To truly measure the impact of learning and development (L&D), we need to go beyond traditional engagement metrics and look at tangible business outcomes.

🌟 Internal Mobility: Track how many employees advance to new roles or get promoted after participating in L&D programs. This shows that our initiatives are effectively preparing talent for future leadership.

📚 Upskilling in Action: Evaluate performance reviews, project outcomes, and the speed at which employees integrate their new knowledge into their work. Practical application is a strong indicator of training’s effectiveness.

🔄 Retention Rates: Compare retention between employees who engage in L&D and those who don’t. A higher retention rate among L&D participants suggests our programs are enhancing job satisfaction and loyalty.

💼 Business Performance: Link L&D to specific business performance indicators like sales growth, customer satisfaction, and innovation rates. Demonstrating a connection between employee development and these outcomes shows the direct value L&D brings to the organization.

By focusing on these metrics, we can provide a comprehensive view of how L&D drives business success beyond just engagement. 🌟

🔗 Link to the blog along with insights from other incredible L&D thought leaders (list of thought leaders below): https://lnkd.in/efne_USa

What other innovative ways have you found effective in measuring the impact of L&D in your organization? Share your thoughts below! 👇

Laura Hilgers Naphtali Bryant, M.A. Lori Niles-Hofmann Terri Horton, EdD, MBA, MA, SHRM-CP, PHR Christopher Lind
-
Most teams don’t get better because they don’t take time to debrief.

Last year, I had the honor of doing a bunch of leadership development work alongside my dear friend and amigo, Michael French. He’s a multi-time founder with successful exits, a fantastic family, and a heart of gold. One of the most powerful tools we taught together (really he, Michael O'Brien, and Admiral Mike McCabe taught, and I amplified in my sessions) was the concept of a Topgun-style debrief — and then we practiced it ourselves after every single session as a group.

It’s a simple but transformative ritual. After every experience, we’d ask each other: What went well? What could have gone better? And what actions will we take to be even better next time? That’s it. Just three questions. But when asked in a space of trust, it opens the door to continuous improvement, honest reflection, and shared learning.

The coolest part? Michael started doing it at home with his son — and now his son comes home from school excited to debrief the day with his dad. That’s when you know the tool is working.

The origins of this approach go back to the Navy Fighter Weapons School — better known as Topgun. In the 1960s, Navy pilots were underperforming in air combat. So they changed the way they trained. But more importantly, they changed the way they debriefed. They created a culture of constructive, positive, inclusive performance reviews — grounded in trust, openness, and the pursuit of excellence. This led to a 400% improvement in pilot effectiveness.

The philosophy was clear: the debrief is not about blame or fault-finding. It’s not about who “won” the debrief. It’s about learning. It’s about getting better — together. The tone is collaborative, supportive, and often informal. The goal is to build a culture of reflection where people feel safe enough to speak, to listen, and to grow.

Most organizations only do debriefs when something goes wrong. But if we wait for failure to reflect, we miss all the micro-moments that help us move from good to great. Excellence isn’t a destination. It’s a mindset. It’s the discipline of always being open to improvement — even when things are going well. Especially when things are going well.

So here’s my nudge to you: give this a try. Whether it’s with your team, your family, your partner, or just yourself at the end of the day — ask those three simple questions. What went well? What could have gone better? And what actions can we take to be even better next time? Let me know if you do. I’d love to hear how it goes.
-
I once had a team of insecure overachiever analysts. They were introverts, brilliant at their work, and incredibly nice people. Too nice, as it turned out. They were so nice that they wouldn't tell each other what was really going on. Instead, they'd come to me: "So-and-so is doing this thing that's really annoying. Can you do something about it?"

I got sick of everyone putting me in the middle instead of taking ownership of their issues with each other. So I did something about it. I brought in trainers from the Center for Creative Leadership to teach everyone the Situation-Behavior-Impact (SBI) model (link in comments). The process was simple but powerful:

1. Describe the situation so everyone's on the same page.
2. Share the specific behavior you observed (no judgments about intent).
3. Explain the impact on you or the other people in the room.

We started with positive feedback to create safety. We practiced saying things like, “When you walked into that meeting with a big smile, the impact was that it put everyone at ease." Everyone started spotlighting the good that was happening, and that encouraged more thoughtful interactions. Then, we practiced constructive feedback—harder, but even more important.

The impact was almost immediate. Soon, I heard people asking each other, "Hey, can I give you an SBI?" The framework made it safe. More importantly, we came to give and receive feedback for the gift that it is. That ability to give and receive honest, thoughtful feedback is the foundation of every healthy team culture. But it's a skill we rarely train for.

I’m curious: What frameworks have you used in your organizations to create a culture of feedback?
-
𝗠𝗲𝗮𝘀𝘂𝗿𝗶𝗻𝗴 𝘁𝗵𝗲 𝗥𝗢𝗜 𝗼𝗳 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴 𝗮𝗻𝗱 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 𝗣𝗿𝗼𝗴𝗿𝗮𝗺𝘀 📊

Many organizations struggle to quantify the impact of their Learning and Development (L&D) initiatives. Without clear metrics, it becomes difficult to justify investments in L&D programs, leading to potential underfunding or deprioritization. Without a clear understanding of the ROI, L&D programs may face budget cuts or be viewed as non-essential. This could result in a less skilled workforce, lower employee engagement, and decreased organizational competitiveness.

To address these issues, implement robust measurement tools and Key Performance Indicators (KPIs) to demonstrate the tangible benefits of L&D. Here's a step-by-step plan to get you started:

1️⃣ Define Clear Objectives: Start by establishing what success looks like for your L&D programs. Are you aiming to improve employee performance, increase retention, or drive innovation? Clear objectives provide a baseline for measurement.

2️⃣ Select Relevant KPIs: Choose KPIs that align with your objectives. These could include employee productivity metrics, retention rates, completion rates for training programs, and employee satisfaction scores. Having the right KPIs ensures you’re measuring what matters.

3️⃣ Utilize Pre- and Post-Training Assessments: Conduct assessments before and after training sessions to gauge the improvement in skills and knowledge. This comparison can highlight the immediate impact of your training programs.

4️⃣ Leverage Data Analytics: Use data analytics tools to track and analyze the performance of your L&D initiatives. Platforms like Learning Management Systems (LMS) can provide insights into learner engagement, progress, and outcomes.

5️⃣ Gather Feedback: Collect feedback from participants to understand their experiences and the perceived value of the training. Surveys and interviews can provide qualitative data that complements quantitative metrics.

6️⃣ Monitor Long-Term Impact: Assess the long-term benefits of L&D by tracking career progression, employee performance reviews, and business outcomes attributed to training programs. This helps in understanding the sustained impact of your initiatives.

7️⃣ Report and Communicate Findings: Regularly report your findings to stakeholders. Use visual aids like charts and graphs to make the data easily understandable. Clear communication of the ROI helps in securing ongoing support and funding for L&D.

Implementing these strategies will not only help you measure the ROI of your L&D programs but also demonstrate their value to the organization. Have you successfully quantified the impact of your L&D initiatives? Share your experiences and insights in the comments below! ⬇️

#innovation #humanresources #onboarding #trainings #projectmanagement #videomarketing
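Steps 3 and 4 above reduce to simple arithmetic once the data is collected: the standard training-ROI formula is net benefit over cost, and a pre/post assessment yields an average per-learner gain. A minimal sketch with invented placeholder numbers (the dollar figures and scores are illustrative, not benchmarks):

```python
def training_roi(program_cost, attributed_benefit):
    """Classic training ROI as a percentage: (benefit - cost) / cost * 100.
    'attributed_benefit' is the dollar value of outcomes you can credibly
    tie to the program (error reduction, retention savings, etc.)."""
    return (attributed_benefit - program_cost) / program_cost * 100

def mean_gain(pre_scores, post_scores):
    """Average per-learner improvement between matched pre- and
    post-training assessment scores."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical program: $50k cost, $80k in attributed benefits.
print(training_roi(50_000, 80_000))          # 60.0 (% ROI)
# Hypothetical matched assessment scores for three learners.
print(mean_gain([50, 60, 70], [70, 75, 80])) # 15.0 points average gain
```

The hard part in practice is step 6, attribution — deciding which benefits to credit to the program — not the calculation itself.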
-
Most leadership programs end with feedback forms. If your CEO asked for the 💰 money slide, would you have anything to show?

Here’s the reality: attendance isn’t impact. Smiles and surveys don’t prove ROI. Here’s where ROI starts:

☑️ Start with business strategy, not just learning objectives.
↳ Programs should be designed to accelerate organizational priorities, not just learning hours.
☑️ Embed development into the work itself so growth shows up in real time.
↳ Impact should be measured in project delivery, cost savings, quality of execution, and leaders’ ability to grow and guide their teams.
☑️ Prepare leaders for responsibilities beyond their current role.
↳ Growth is proven when leaders step up successfully into bigger challenges, not when they sit in classrooms.
☑️ Measure outcomes with real metrics, not fluff.
↳ Track improvements in retention, promotion readiness, decision speed, or customer satisfaction.
↳ If you can’t measure it, you can’t prove ROI.
☑️ Reinforce learning through coaching and accountability until new habits stick.
↳ Sustained behavior change is the only way leadership investments translate into long-term ROI.

This is when the impact becomes clear. You see sharper judgment, stronger execution, ready successors, and market-ready teams. That’s the money slide boards and executives are looking for.

As the article pointed out, too many organizations still approach leadership development with yesterday’s playbook. In business, the “money slide” is the single slide in a presentation that proves value — the ROI that executives are really looking for. Too often, instead of proving value, organizations fall back on the old playbook: 📚 more courses, 🕒 more hours, 📊 more frameworks. But impact doesn’t come from volume. It comes from alignment, design, and outcomes.

Here’s my take: the future of leadership development won’t be judged by how much training content is delivered. It will be judged by how much capability is created and how quickly that capability moves the business forward. That’s the shift executives are hungry to see.

♻️ Repost if you’re investing in people, not just tech. Follow Janet Perez for Real Talk on AI + Future of Work
-
The last claim your company faced probably came from someone who "completed" their compliance training.

Compliance programs built solely around communicating company policies fail to reduce real-world risk. Checking boxes doesn't change behaviors, and it doesn't protect companies from claims. Effective compliance training goes beyond information sharing. It develops essential workplace skills, reinforces measurable behaviors, and links directly to outcomes that executives care about.

Clients partner with us to build respectful workplaces because strong behavioral norms translate directly into measurable business results:
• Teams that demonstrate respectful behaviors outperform others by 10–15%.
• Organizations with healthy cultures have fewer employee-relations claims.
• Effective training reduces investigation expenses and compliance risks.

Executives expect clear proof that training programs impact critical business metrics:
Instead of reporting, "95% completed harassment training," report, "Harassment-related claims dropped 20%, reducing investigation costs."
Instead of highlighting, "High ratings for DEI training," highlight, "Teams completing our inclusion training saw 18% lower turnover."

Compliance should always be the natural outcome of skill-building and behavior change—never the main goal of your training programs. Completion rates alone don't protect your company. Behavior change does.
-
Did you know that 92% of learning leaders struggle to demonstrate the business impact of their training programs?

After a decade of working with learning analytics solutions at Continu, I've discovered a concerning pattern: most organizations are investing millions in L&D while measuring almost nothing that matters to executive leadership.

The problem isn't a lack of data. Most modern LMSs capture thousands of data points from every learning interaction. The real challenge is transforming that data into meaningful business insights. Completion rates and satisfaction scores might look good in quarterly reports, but they fail to answer the fundamental question: "How did this learning program impact our business outcomes?"

Effective measurement requires establishing a clear line of sight between learning activities and business metrics that matter. Start by defining your desired business outcomes before designing your learning program. Is it reducing customer churn? Increasing sales conversion? Decreasing safety incidents? Then build measurement frameworks that track progress against these specific objectives.

The most successful organizations we work with have combined traditional learning metrics with business impact metrics. They measure reduced time-to-proficiency in dollar amounts. They quantify the relationship between training completions and error reduction. They correlate leadership development with retention improvements.

Modern learning platforms with robust analytics capabilities make this possible at scale. With advanced BI integrations and AI-powered analysis, you can now automatically detect correlations between learning activities and performance outcomes that would have taken months to uncover manually.

What business metric would most powerfully demonstrate your learning program's value to your executive team? And what's stopping you from measuring it today?

#LearningAnalytics #BusinessImpact #TrainingROI #DataDrivenLearning
-
5,800 course completions in 30 days 🥳 Amazing! But... what does that even mean? Did anyone actually learn anything?

As an instructional designer, part of your role SHOULD be measuring impact. Did the learning solution you built matter? Did it help someone do their job better, quicker, with more efficiency, empathy, and enthusiasm? In this L&D world, there's endless talk about measuring success. Some say it's impossible... It's not. Enter the Impact Quadrant. With measurable data + time, you CAN track the success of your initiatives. But you've got to have a process in place to do it. Here are some ideas:

1. Quick Wins (Short-Term + Quantitative) → “Immediate Data Wins”
How to track:
➡️ Course completion rates
➡️ Pre/post-test scores
➡️ Training attendance records
➡️ Immediate survey ratings (e.g., “Was this training helpful?”)
📣 Why it matters: Provides fast, measurable proof that the initiative is working.

2. Big Wins (Long-Term + Quantitative) → “Sustained Success”
How to track:
➡️ Retention rates of trained employees via follow-up knowledge checks
➡️ Compliance scores over time
➡️ Reduction in errors/incidents
➡️ Job performance metrics (e.g., productivity increase, customer satisfaction)
📣 Why it matters: Demonstrates lasting impact with hard data.

3. Early Signals (Short-Term + Qualitative) → “Small Signs of Change”
How to track:
➡️ Learner feedback (open-ended survey responses)
➡️ Documented manager observations
➡️ Engagement levels in discussions or forums
➡️ Behavioral changes noticed soon after training
📣 Why it matters: Captures immediate, anecdotal evidence of success.

4. Cultural Shift (Long-Term + Qualitative) → “Lasting Change”
How to track:
➡️ Long-term learner sentiment surveys
➡️ Leadership feedback on workplace culture shifts
➡️ Self-reported confidence and behavior changes
➡️ Adoption of a continuous learning mindset (e.g., employees seeking more training)
📣 Why it matters: Proves deep, lasting change that numbers alone can’t capture.

If you’re only tracking one type of impact, you’re leaving insights—and results—on the table. The best instructional design hits all four quadrants: quick wins, sustained success, early signals, and lasting change. Which ones are you measuring?

#PerformanceImprovement #InstructionalDesign #Data #Science #DataScience #LearningandDevelopment
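The Impact Quadrant above is just a two-axis lookup: time horizon (short vs. long) crossed with data type (quantitative vs. qualitative). A small sketch that classifies example metrics; the quadrant labels follow the four names from the post, and the metric-to-axis mapping is an illustrative assumption:

```python
def impact_quadrant(horizon, kind):
    """Map a metric to its Impact Quadrant.
    horizon: 'short' or 'long'; kind: 'quant' or 'qual'."""
    quadrants = {
        ("short", "quant"): "Quick Wins (Immediate Data Wins)",
        ("long", "quant"): "Big Wins (Sustained Success)",
        ("short", "qual"): "Early Signals (Small Signs of Change)",
        ("long", "qual"): "Cultural Shift (Lasting Change)",
    }
    return quadrants[(horizon, kind)]

# Hypothetical tagging of metrics from a program's measurement plan.
metrics = {
    "pre/post-test scores": ("short", "quant"),
    "reduction in errors/incidents": ("long", "quant"),
    "documented manager observations": ("short", "qual"),
    "long-term sentiment surveys": ("long", "qual"),
}
for name, (horizon, kind) in metrics.items():
    print(f"{name} -> {impact_quadrant(horizon, kind)}")
```

Tagging every metric in your measurement plan this way makes it obvious at a glance which quadrants you are leaving empty.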