Increasing the Response Rate of the Post-Audit Exit Survey

Issuing and reporting on a post-audit exit survey is a great way to receive feedback and improve IA's performance. However, even with significant effort invested in building and issuing the survey, audit customers may sometimes not complete it. When this occurs, the CAE should consider the following factors to diagnose and resolve the issue:

1. Survey Length
The survey might be too long.
Recommendation: Keep the survey to no more than 6–8 questions. Use closed-ended questions for most items and reserve free-text responses for the final question. Consider making the free-text portion optional.

2. Timing of the Request
The request for survey completion might be poorly timed. Some teams wait until after the final report is issued—sometimes weeks or months after fieldwork—to ask for feedback.
Recommendation: Introduce and reinforce the survey request early and often:
- The VP should mention the post-audit exit survey to C-suite or VP-level executives during pre-planning.
- The Director should raise the topic with VP- or Director-level audit customers at the end of the initial planning meeting.
- The Manager should emphasize the survey's importance during the fieldwork kickoff.
- The audit senior or supervisor should send the survey along with the draft audit report in preparation for the audit exit meeting.

3. Limited or No Follow-Up
The audit leadership team may have made the survey request only once.
Recommendation: Follow up multiple times. Audit customers may be juggling several projects, so a reminder can significantly boost response rates (a simple tracking sketch follows this list).

4. Relevance of Survey Content
The survey questions might focus solely on Internal Audit, which may not resonate with the audit customer. If the survey only asks about the audit team's performance (e.g., team knowledge or punctuality in deliverables), it can overlook the audit team's impact on the customer's operations.
Recommendation: Include questions that evaluate both the internal team's performance and the relevance of the audit team's output. This balanced approach makes the survey more engaging and pertinent to the audit customer.

5. The Audit Customer Is Annoyed or Dissatisfied with IA
If the audit ran too long or the team's performance was subpar, the customer may simply want to move on from all audit-related matters.
Recommendation: Give the executive or team a couple of weeks of space. The CAE (not anyone else) should then follow up directly to obtain their feedback, and should be prepared to commit to following up individually with audit customers once improvements are implemented, to show that the time they spent providing feedback was well spent and that the team acted on it.
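The mechanics behind factors 1–3 can be tracked with even a lightweight script. Below is a minimal sketch, assuming the audit team keeps a simple list of issued surveys; the record fields, the seven-day reminder cadence, and the customer names are illustrative assumptions, not part of any specific IA tool.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Hypothetical tracking record for one audit customer's exit survey.
@dataclass
class SurveyRequest:
    customer: str
    sent_on: date
    responded_on: Optional[date] = None
    reminders_sent: int = 0

REMINDER_AFTER_DAYS = 7   # assumed cadence; tune to your audit cycle
MAX_REMINDERS = 2         # factor 3: follow up more than once, but not endlessly

def response_rate(requests: list[SurveyRequest]) -> float:
    """Share of issued surveys that received a response."""
    if not requests:
        return 0.0
    answered = sum(1 for r in requests if r.responded_on is not None)
    return answered / len(requests)

def needs_reminder(req: SurveyRequest, today: date) -> bool:
    """True when a polite follow-up is due for this audit customer."""
    if req.responded_on is not None or req.reminders_sent >= MAX_REMINDERS:
        return False
    last_touch = req.sent_on + req.reminders_sent * timedelta(days=REMINDER_AFTER_DAYS)
    return today - last_touch >= timedelta(days=REMINDER_AFTER_DAYS)

# Example: two surveys issued with the draft report, one still outstanding.
requests = [
    SurveyRequest("Treasury VP", date(2024, 5, 1), responded_on=date(2024, 5, 3)),
    SurveyRequest("IT Director", date(2024, 5, 1)),
]
print(f"Response rate: {response_rate(requests):.0%}")
print([r.customer for r in requests if needs_reminder(r, date(2024, 5, 15))])
```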
How to Increase Survey Response Rates
Summary
Survey response rates refer to the percentage of invited participants who complete a survey. Increasing these rates involves designing surveys that are concise, relevant, and engaging while respecting participants' time and effort.
- Ask purposeful questions: Only include questions with a clear purpose, ensuring they contribute directly to your objectives and avoid unnecessary filler.
- Keep it short: Limit your survey to a manageable number of questions, combining closed-ended inquiries with optional open-ended ones to reduce participant fatigue.
- Follow up thoughtfully: Send polite reminders at appropriate intervals, and communicate how their feedback has been used to encourage future participation.
We’ve collected hundreds of millions of in-product survey responses, and one small change can 3-5x response rates: start with what’s called a “closed” question with pre-set options for the user to choose from. A simple set of pre-set choices is easier for users to engage with, reducing friction and boosting participation. Even if open-ended text responses are more valuable, placing them after a closed question drives higher responses overall.

For example:
❌ Wrong: What feature should we build next? (open) → How important is it? (closed)
✅ Right: Which feature should we build? (closed) → Why? (open)

Next time you run an in-product survey, start with a closed question, then follow up with an open question. You’ll collect more responses, guaranteed.
15% survey response rates? Using behavioral science, it is possible to boost your response rates to 75% with one simple action.

How many of you have spent hours crafting what you believe is the perfect post-program survey, sent it out with high hopes, and then... crickets? If you're hitting 15-30% survey response rates, you're actually in the normal (yet sadly insufficient) range. The source of our dismal response rates isn't survey fatigue. It's survey design.

During a recent conversation with Julie Dirksen on her Behavioral Breakdowns series, we uncovered how the COM-B framework (Capability, Opportunity, Motivation = Behavior) can transform survey response rates:
- Design for purpose: If you can't explain exactly how you'll use the answer to a question, don't ask it.
- Make surveys part of the learning: When we moved key questions to the START of each workshop session and displayed responses in real time as a teaching tool, response rates jumped from 15% to 75%.
- Create a positive data culture: Communicate how data will be used, be transparent about its importance, and always give people something valuable in return.

The most powerful insight? When survey data becomes visibly valuable to participants—not just to you—response rates soar.

Here are a few other ideas Julie and I discussed that can help boost your survey response rates (and the link to our recorded conversation): https://lnkd.in/eimKe62U. What's your biggest survey challenge? Share in the comments! #learninganddevelopment #surveydesign #datacollection
You’re A/B/C/D/E/F testing your surveys… right? Or just guessing?

If you’re still sending the same version of a survey to everyone and then complaining about low response rates, we need to talk. With OPINATOR, you can test as many survey versions as you want, all through a single link. No need to send multiple URLs or embed different widgets. Just define how often each version should appear: Version A – 40%, Version B – 25%, Version C – 10%… and so on. For instance, for the survey featured in the picture below, we produced 11 different randomized versions (OPIs). The customer? Completely unaware they’re seeing Version J instead of A.

And in each version, you can change:
🎨 Visual design
❓ Question order
📝 Response options
🔀 Randomization logic

And, of course, you could personalize each version further depending on the touchpoint, channel, journey, and even the individual customer! The result? You learn what actually works: design tweaks, wording, order, even the tone of voice. And you optimize based on facts, not gut feelings.

But multiple A/B testing isn’t just about optimization. It’s also about strategy. Let’s say you’ve got a 25-question survey. You want insights on every question, but not at the cost of a 5% completion rate. With OPINATOR, you can split the survey into multiple chunks: for example, 5 different OPIs, each with 5 questions. Each customer sees just one short version, but collectively, you get insights on all 25 questions, with higher response rates and full analytics.

Shorter surveys. Higher completion. No loss of insight.

So if you’re still complaining about low response rates and not applying these strategies, then maybe the problem isn’t your customers… #CX #VoC #CustomerExperience
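OPINATOR handles version weighting and question chunking inside its platform, and nothing below is its API. As a rough illustration of the underlying mechanics only, here is a minimal Python sketch of weighted version assignment and of splitting a 25-question survey into 5-question chunks; the version labels and weights are the hypothetical ones from the post.

```python
import random

# Hypothetical version weights, mirroring the "Version A - 40%, B - 25%, C - 10%" idea.
VERSION_WEIGHTS = {"A": 0.40, "B": 0.25, "C": 0.10, "D": 0.25}

def assign_version(weights: dict[str, float]) -> str:
    """Pick one survey version per respondent, proportional to its weight."""
    versions = list(weights)
    return random.choices(versions, weights=[weights[v] for v in versions], k=1)[0]

def split_into_chunks(questions: list[str], chunk_size: int = 5) -> list[list[str]]:
    """Split a long questionnaire into short versions; each respondent sees only one."""
    return [questions[i:i + chunk_size] for i in range(0, len(questions), chunk_size)]

# A 25-question survey becomes 5 short versions of 5 questions each.
questions = [f"Q{n}" for n in range(1, 26)]
chunks = split_into_chunks(questions)
respondent_chunk = random.choice(chunks)   # one short version per respondent
print(assign_version(VERSION_WEIGHTS), respondent_chunk)
```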
Survey Fatigue is Real: How to Design Research That Customers Want to Answer

Ask any CX professional about their biggest challenge. Invariably, it will be low response rates, skewed feedback, and poor insights. But here's the truth: people aren't tired of giving feedback—they're tired of responding to bad surveys. So, how do you design research that respects your customers' time and earns their trust?

- Be Intentional - Ask only what you'll use. Customers can sense when questions are just filling space.
- Keep It Short & Smart - Lengthy, repetitive surveys are a one-way ticket to disengagement. Prioritize the essentials.
- Personalize the Experience - Use skip logic, make it feel relevant, and show you know who they are, like addressing them by name and skipping their age and gender questions.
- Time It Right - A poorly timed survey can feel intrusive. Consider the context—when are they most likely to be in the mindset to respond?
- Close the Loop - Always share what you've done with their feedback. Nothing motivates participation like seeing real impact.

The goal isn't just more data. It's better data. And better data starts with respect for your customers' time, attention, and voice. Because if your research doesn't work for your customer, it won't work for your business either.

Have you redesigned your surveys lately? What strategies worked for you? #CX #CustomerExperience #MarketResearch #CustomerInsights #Anand_iTalks
The difference between a 2% response rate and a 20% response rate isn't what you ask—it's when you ask it. Let's talk about behavioral triggers.

Winning orgs understand that capturing feedback at moments of highest user engagement or emotional investment changes everything. The proof? Trigger-based marketing emails generate 10x more revenue. The secret is deploying surveys only after meaningful touchpoints, when feedback is most valuable. Think product milestones, service completions, or problem resolutions—moments when customers are already emotionally invested in the outcome.

But here's where it gets strategic: different generations need different approaches. Millennials respond to authentic, value-driven surveys that connect to bigger purposes, while Gen Z prefers instant, mobile-optimized micro-surveys that respect their time and avoid communication overload.

Lessons learned? Stop sending surveys into the void and start sending them when people actually care enough to respond. The timing of your feedback request is just as important as the feedback itself.

Ready to increase your survey response rates? Start timing your surveys around emotional peaks, not calendar dates. Want more tips for getting the data that your business needs? Try our SurveyVista Knowledge Base to get started: https://lnkd.in/g_-whX4e #Salesforce #CustomerInsights #DataMatters
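The trigger idea above can be sketched independently of any particular tool. The snippet below is a minimal illustration, assuming a simple in-house event feed; the event names and the queuing step are hypothetical and are not SurveyVista or Salesforce APIs.

```python
# Events treated as "moments of high engagement" in this sketch; adjust to your product.
TRIGGER_EVENTS = {"product_milestone_reached", "service_completed", "ticket_resolved"}

def should_send_survey(event_type: str, recently_surveyed: bool) -> bool:
    """Request feedback only after meaningful touchpoints, and never twice in quick succession."""
    return event_type in TRIGGER_EVENTS and not recently_surveyed

def handle_event(event: dict) -> None:
    if should_send_survey(event["type"], event.get("recently_surveyed", False)):
        # Placeholder for whatever survey tool actually sends the request.
        print(f"Queue survey for {event['customer']} right after {event['type']}")

handle_event({"type": "ticket_resolved", "customer": "Acme Corp"})
handle_event({"type": "login", "customer": "Acme Corp"})  # routine event: no survey
```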