Stop tracking email open rates.

Open rates are the least reliable metric in your funnel. Honestly, they are wildly inaccurate. That's because an email "open" is only counted when a tracking pixel (a tiny invisible image) loads in the email.

Let's break down the 7 ways your open rate is probably wrong:

1. Apple Mail Privacy Protection (MPP): Apple pre-loads images and content in emails, registering an "open" even if the recipient hasn't actually opened the email.
2. Image blocking: Many email clients block images by default, which prevents your tracking pixel from loading. So even if someone reads your email top to bottom, it may never count as an open.
3. Email client settings: Some email clients automatically display images in the preview pane, which can register an "open" even if the recipient hasn't fully opened the email.
4. Tracking pixel failures: Tracking pixels can fail to load due to various technical issues, including server-side problems or client-side limitations.
5. Spam filters: Advanced spam filters and sorting algorithms may check the email through bots, skewing open rates.
6. Forwarded emails: A forwarded email may register multiple opens from different devices or IPs, even if only one person is reading it.
7. Spam-detection bots: Programs that automatically open and scan emails for spam detection further inflate open rates.

And yet, marketers obsess over it. Whole campaigns are "optimized" for a number that doesn't reflect real intent, real engagement, or real outcomes.

We went one step beyond open rates at Mailmodo. We bring web-like, actionable interactions within emails. Instead of measuring who opened, we focus on what they did. Inside the email.

That means:
A one-click poll → submitted
A meeting → booked right inside the email
A feedback form → feedback provided
An event registration form → user registered for the event

If you want to build email journeys that measure action instead of illusions, let's talk. I can share what we've seen work and why open rate should never be your success metric.
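Under the hood, that "open" is just an HTTP request for a 1x1 image. Here's a minimal, hypothetical sketch of a tracking endpoint in Python with Flask (the route name and logging are made up for illustration; real ESPs do something equivalent at scale). Notice the server has no way to tell whether the fetcher is a human, Apple's MPP proxy, or a spam filter:

```python
# Minimal sketch of an open-tracking endpoint (hypothetical).
# Requires: pip install flask
from flask import Flask, Response, request

app = Flask(__name__)

# The classic 43-byte transparent 1x1 GIF used as a tracking pixel.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

@app.route("/o/<message_id>.gif")
def track_open(message_id):
    # Every fetch of this URL is logged as an "open", whether it comes
    # from a human, Apple's MPP pre-fetch, or a security scanner.
    app.logger.info("open: %s ua=%s ip=%s", message_id,
                    request.headers.get("User-Agent"),
                    request.remote_addr)
    return Response(PIXEL, mimetype="image/gif")
```

The email itself would embed this image somewhere invisible. If the request never fires (image blocking) or fires without a human (MPP, scanners, forwards), the metric breaks in exactly the ways listed above.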
Why email data is never 100% accurate
Summary
Email data is never 100% accurate because tracking methods, privacy tools, and security filters can distort metrics like open rates and engagement figures. In simple terms, email statistics often measure activity from bots and filters alongside real people, making it hard to know who is actually interacting with your messages.
- Question your numbers: Always check if high open or click rates are coming from genuine readers rather than automated filters or bot activity.
- Prioritize true engagement: Focus on actions like replies, sign-ups, or clicks that show real interest instead of relying on inflated or misleading open rates.
- Adjust your approach: Regularly review your strategy and data sources to ensure your email campaigns are reaching and influencing actual people, not just recording technical activity.
-
Stop tracking email open rates. 🛑

There's been lots of talk about Gmail's new banner blocking auto image downloads recently, but this is not a new problem. Optimising campaigns for open rates is worse than a vanity exercise: it's misleading and unreliable. Here's why…

Open tracking relies on an invisible image (a tracking pixel) embedded in your email. When the email is opened, the image loads, logging the open… But it's not that simple.

That pixel can also collect data like IP addresses, locations, and browser details, raising significant privacy concerns. As a result, major players like Microsoft, Google, and Apple have taken action to protect user privacy.

Let's take Gmail as an example: since the introduction of Google's privacy measures, reported open rates have been significantly inflated. Many of these "opens" are triggered by Google's servers rather than genuine recipient engagement.

Check out the data 📈

Chart 1: Compares open rates for Gmail against other providers like Microsoft and Proofpoint. Notably, Gmail's open rates have consistently been higher, with a significant spike after 2018.

Chart 2: Focuses on "first-minute opens" likely caused by image preloading or security scanning. The data suggests that 15-20% of emails sent to Gmail/GSuite recipients are false opens, a trend since 2018.

The takeaway? Open rates are a relic. They're inflated, unreliable, and can mislead your outbound strategy. Instead, focus on metrics that reflect genuine engagement:
✅ Click-through rates
✅ Responses
✅ Actual engagement

In email prospecting, not all that glitters is gold. Some of it is just glitter.

Full stats breakdown link in the comments

#outbound #B2Bsales #prospecting
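One practical response to the "first-minute opens" pattern: label opens that arrive implausibly fast after delivery, or that come through a known image proxy, as suspect. A rough sketch, assuming your ESP exports per-open timestamps and user-agent strings (the 60-second cutoff and the marker list are illustrative heuristics, not definitive rules; Gmail does route image loads through a proxy that announces itself as GoogleImageProxy):

```python
from datetime import datetime, timedelta

def classify_open(event, window_seconds=60):
    """Label an open 'suspect' when it fired within seconds of delivery
    (likely prefetch or security scanning) or came through an image proxy
    (a human can't be distinguished from Google's servers). Everything
    else is 'plausible'. Both labels are heuristics, not ground truth."""
    proxy_markers = ("GoogleImageProxy",)  # illustrative; extend as needed
    too_fast = (event["opened"] - event["sent"]) < timedelta(seconds=window_seconds)
    proxied = any(m in event["ua"] for m in proxy_markers)
    return "suspect" if (too_fast or proxied) else "plausible"

event = {"sent": datetime(2024, 3, 1, 9, 0, 0),
         "opened": datetime(2024, 3, 1, 9, 0, 4),
         "ua": "Mozilla/5.0 (via ggpht.com GoogleImageProxy)"}
print(classify_open(event))  # suspect
```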
-
Any service claiming they can "verify" catch-all emails with 100% accuracy is full of 💩

By definition, catch-all domains ACCEPT ALL EMAIL ADDRESSES. That's literally what they do.

The only way to TRULY verify that an email exists is to send an actual email, then wait for either:
• A bounce
• A response
• Or… silence (which tells you nothing)

I know there are at least one or two verification services that literally send an email (Scrubby?), but most either resort to pinging servers or to their own users' historical data.

Verification tools connect to the mail server and issue the RCPT TO: command to see if the server rejects invalid addresses. For a normal domain (without catch-all), if the mailbox doesn't exist, you often get an SMTP error like "550 5.1.1 <address@domain.com>: Recipient address rejected". For a catch-all domain, the server might accept the address with a "250 OK" (even if that specific mailbox doesn't exist).

A common trick is to check random addresses at the same domain (like abc123@domain.com, xyz987@domain.com). If every single random address gets the same "OK" response, that strongly indicates the domain is configured as catch-all.

Sure, verification services can detect if a domain is catch-all, run fancy SMTP checks, or look at historical data, but at the end of the day, they're basically just telling you "yeah, this domain accepts everything" 🤷‍♂️

If you know which domains are catch-all you can:
• Flag them as higher risk
• Adjust your sending strategy
• Set proper expectations

Don't get me wrong, I use multiple email verification tools in sequence on every campaign. And depending on the campaign I will send to catch-alls, especially if they're a large portion of the lead list.

You can't definitively verify a catch-all email address without some form of response or engagement. Period. Anyone claiming otherwise is:
a) Lying
b) Doesn't understand how email actually works
c) Playing semantic games with what "verify" means (most common amongst these vendors)

Can anyone explain how these companies are claiming they are doing this? I've only ever gotten nebulous non-answers when I've asked.
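For the curious, here is roughly what that RCPT TO probe and the random-address trick look like in code. A minimal sketch in Python, assuming dnspython is installed for the MX lookup; the HELO identity and sender address are placeholders, and real servers add complications this ignores (greylisting, rate limits, blocked port 25):

```python
import uuid
import smtplib
import dns.resolver  # pip install dnspython

def smtp_probe(address: str) -> str:
    """Ask the recipient's mail server about a mailbox via RCPT TO,
    without sending any mail. Returns 'accepted', 'rejected', or 'unknown'."""
    domain = address.split("@")[1]
    mx = sorted(dns.resolver.resolve(domain, "MX"),
                key=lambda r: r.preference)[0].exchange.to_text().rstrip(".")
    with smtplib.SMTP(mx, 25, timeout=15) as server:
        server.helo("verifier.example.com")   # placeholder HELO identity
        server.mail("probe@example.com")      # placeholder sender
        code, _ = server.rcpt(address)
    if code == 250:
        return "accepted"   # note: catch-alls say this for everything
    if 500 <= code < 600:
        return "rejected"   # e.g. 550 5.1.1 Recipient address rejected
    return "unknown"

def looks_catch_all(domain: str) -> bool:
    """The random-address trick: if a mailbox that can't plausibly exist
    is accepted, the domain is almost certainly catch-all."""
    fake = f"probe-{uuid.uuid4().hex}@{domain}"
    return smtp_probe(fake) == "accepted"
```

Which is exactly the post's point: "accepted" on a catch-all domain is not a verification, it's a shrug.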
-
Your emails are getting opened… but no one's buying? Here's why 👇

A client came to me recently, frustrated. Their email campaign was performing great, or so they thought.
✅ 50%+ open rates
✅ 10%+ click-through rates
❌ 0 conversions

Before diving into their copy and CTA, I asked a simple question: are these engagement numbers real?

Turns out, they weren't. When we dug deeper, we found that a huge chunk of those "opens" and "clicks" came from email security filters, not actual humans. Spam filters, security bots, and Apple's Mail Privacy Protection (MPP) were skewing the data, making their emails look like they were getting traction when, in reality, they were just noise in a spam filter's pre-scan process.

So before you rewrite your CTA for the 10th time or obsess over your copy, take a step back:
👉 Check if the domains opening your emails belong to corporate firewalls
👉 See if the click timestamps look unnatural (multiple clicks in the same second = bots)
👉 Compare reported clicks with actual website visits in Google Analytics

Email deliverability isn't just about hitting the inbox. It's about reaching real people. And if your numbers are lying to you, no amount of copywriting magic will fix the real issue.
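The timestamp check in that list is easy to automate. A small sketch, assuming you can export click events as (recipient, url, timestamp) rows from your ESP (the data shape and the threshold of 3 are hypothetical): security scanners tend to follow every link in a message within the same second, which no human does.

```python
from collections import Counter

# Hypothetical export from your ESP: (recipient, clicked_url, timestamp)
clicks = [
    ("a@corp.com", "https://example.com/offer",  "2024-03-01T09:15:02"),
    ("a@corp.com", "https://example.com/unsub",  "2024-03-01T09:15:02"),
    ("a@corp.com", "https://example.com/footer", "2024-03-01T09:15:02"),
    ("b@corp.com", "https://example.com/offer",  "2024-03-01T10:42:11"),
]

def bot_like_recipients(clicks, threshold=3):
    """Recipients with several clicks in the same second are almost
    certainly security scanners following every link, not humans."""
    per_second = Counter((email, ts) for email, _, ts in clicks)
    return {email for (email, ts), n in per_second.items() if n >= threshold}

print(bot_like_recipients(clicks))  # {'a@corp.com'}
```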
-
"I'm seeing 60-70% opens but less than 1% clicks” THIS 👆👆👆 Probably one of the most common questions I get asked is: "Our opens are great, but click-throughs are super low - what does mean/what do we do?" It’s either the way you collect your data, deliverability or content - most of the time it's a mix 1️⃣ Opens aren’t truth 60-70% opens might look amazing, but they’re rarely accurate: → Outlook’s reading pane - opens do not track → Apple Mail Privacy inflates opens → People open to delete → Or unsubscribe → Or just see who it’s from Opens are an indicator, not proof of genuine engagement (they are not singulary a positive metric remember) If your open rates are consistently that high across ALL emails you send, alarm bells should ring. It’s worth asking: where did this data come from? Bought lists or cold data can spike opens because people don’t know who you are. (Clicks will be near zero because… why would they trust you?) 2️⃣ If it’s B2B 80% of the time, this pattern tells me it's a deliverability issue Corporate spam filters + security bots open and scan your emails That’s why you see “engagement” but no one is actually clicking 3️⃣ If it’s B2C/D2C Then content is the likeliest culprit Consumers go into their inbox to find something, for something, to check something, to see something - not to scroll like TikTok Lot's of consumers will open, just because (but they still saw you so that's good!) Emails designed as “pretty webpages” that don’t load properly - so people don't click Copy that doesn’t give people a reason to move forward - your emails are not great Sometimes it’s both: poor deliverability and poor content 4️⃣ Step back: what’s the goal? This is the part hardly anyone asks! If your newsletter is designed to inform and build trust, not drive immediate clicks… is <1% CTR actually a problem? I’ve seen “low-click” emails generate pipeline 6–12 months later because people remembered the brand when it mattered Email is a long game, a channel that compounds over time So what do you do? ✅ Rule out deliverability issues (I run deliverability audits and masterclasses so get in touch if you need help) ✅ Audit your content, inbox experience, strategy and approach ✅ Most importantly, measure impact beyond opens/clicks STOP obsessing over “good open & click rates” I’d rather you obsess over whether your emails actually move people closer to taking action over time ____ I'm Beth and I train, transform, consult, audit (and occasionally rant) about all things email. I work with B2B & B2C businesses and marketers who want to evolve and transform email, get better at it and get real, measurable results. 💌 Newsletter → RE:markable (view my newsletter under my name) 📊 Free tool → Email Health Check (featured section on profile) 💻 Email & CRM Vault → Very helpful blog (featured section on profile)
-
"Nearly half the data used for ad targeting is wrong." Have you heard of the Truthset study? 1. Truthset reviewed data providers with 790 million unique hashed email addresses and 133 million postal addresses and compared them with verified sources. 2. On average, the match was only correct 50% of the time. 3. Old data was one major driver of inaccuracy, but other factors included people moving or having multiple email addresses. 4. This match is basic, but it underpins targeting details like gender, age, income, and more. So, if it's wrong, it impacts ad personalization. 5. The study found the least validated data delivered an ROI of just $0.67 per dollar spent. The takeaway for marketers: Every additional targeting layer brings up your cost. And if you layer in bad data, your likelihood of success is close to zero.
-
There's a bit of a misconception in the world of "identifying anonymous users on your website" (…"to 900x performance"…) that needs to be discussed more:

1. There's technology that tries to guess ("identify") truly anonymous users to your website, i.e. they've never visited your website before.
2. There's technology that tries to identify returning users to your website (this is where Elevar focuses).

Regarding #1: this technology has existed for years. The tech provider essentially buys access to third-party data, tries to stitch disparate data sources together to guess who the user on your website is, and provides an email for this user that you could then market to via your ESP or sync as audiences.

IMHO this isn't right, wrong, good, or bad. It is what it is. I remember testing a service like this for Elevar 7 years ago to help me with my own personal outbound sales (I can't for the life of me remember the name of the service).

However, what is very misleading in the market is this: these guesses are NEVER 100% accurate. If you take a sample of 100 truly anonymous users on your website, the results may look like this:

9 users correctly identified with the right email
3 users where there's a fuzzy match (the emails _kind of_ match)
23 users where the identify service provided an email, but it was flat-out wrong
65 users who are unknown to the identify service

This means 35 emails could be synced to your email service provider with activity on site, but only 9 are correct and the other 26 aren't.

Here is where I'd like more transparent discussion: this can be measured and quantified against your real customer data, so you:
a) know the true quality of the emails you are getting
b) know what you are paying for
c) understand the quality of emails going into your ESP, to monitor quality ranking factors

When evaluating "identity" services, be sure to break the technologies up into these two separate buckets. They serve two very different purposes and have various legal (or performance) ramifications.
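That "measure it against your real customer data" step is straightforward once you have both datasets. A hedged sketch in Python (the data shapes and values are made up for illustration): join the vendor's visitor-to-email guesses against visitors whose email you already know from orders or logins, and compute precision on the overlap.

```python
def identity_match_quality(vendor: dict, known: dict) -> dict:
    """Compare vendor guesses (visitor_id -> email) against visitors
    whose email you already know from orders/logins, and compute
    precision on the overlap. Data shapes here are hypothetical."""
    overlap = vendor.keys() & known.keys()
    correct = sum(vendor[v] == known[v] for v in overlap)
    return {
        "measurable": len(overlap),
        "correct": correct,
        "precision": correct / len(overlap) if overlap else 0.0,
    }

vendor = {"v1": "a@x.com", "v2": "b@y.com", "v3": "wrong@z.com"}
known  = {"v1": "a@x.com", "v3": "c@z.com"}  # ground truth you own
print(identity_match_quality(vendor, known))
# {'measurable': 2, 'correct': 1, 'precision': 0.5}
```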
-
There are three kinds of lies: lies, d*mned lies, and data quality.

Here's the hard truth that most data providers don't want you to know: most data providers have bad data, and the validation services you rely on to use that bad data are also pretty crappy.

There has been a trend in GTM of embracing data waterfalls to "clean up" the problem of bad data, and it's ultimately a big mistake.

The problem: data waterfalls promote the old way of doing outbound. Companies are still focused on quantity, not quality.

How data waterfalls work: you want to run a signal-based play targeting job changers in the past 3-6 months. You have your lead list and you need updated contact information, so you run your data through a waterfall. You partner with Data Provider A.
↳ When Provider A only identifies 50% of the requested data, Provider B takes over
↳ When Provider B identifies 15% of the remaining data, Provider C takes over
↳ Provider C identifies another 10%, so Provider D comes in to fill in gaps
↳ repeat this step from A-Z

By the time your data is complete, you've relied on 4+ third-party vendors, each with their own quality control and their own limits for what's acceptable data. And since most providers charge per result, they are incentivized to also give you the questionable ones. So you are collecting the questionable results from not just one provider, but many.

Ultimately, you are getting more coverage but paying for it in accuracy (and actual $$$). When your data isn't accurate, your signal-based plays are more likely to fail and your resources go to waste.

To quote Mark Kosoglow: "data waterfalls are one of the biggest crackpot GTM snake oil sales I've ever seen."
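To see why you pay for coverage in accuracy, here's a toy model in Python. Every fill rate and per-provider accuracy below is invented for illustration; the shape (later providers fill less and tend to be less reliable) mirrors the waterfall described above. Each provider adds leads, but drags the blended accuracy of the whole list down:

```python
# Toy waterfall model: all numbers are invented for illustration.
# (name, share of remaining leads the provider fills, provider accuracy)
providers = [
    ("A", 0.50, 0.90),
    ("B", 0.15, 0.80),
    ("C", 0.10, 0.70),
    ("D", 0.10, 0.60),
]

remaining, found, correct = 1.0, 0.0, 0.0
for name, fill, accuracy in providers:
    share = remaining * fill       # leads this provider contributes
    found += share
    correct += share * accuracy    # of those, how many are actually right
    remaining -= share
    print(f"after {name}: coverage {found:.1%}, "
          f"blended accuracy {correct / found:.1%}")

# after A: coverage 50.0%, blended accuracy 90.0%
# ...
# after D: coverage 65.6%, blended accuracy 85.8%
```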
-
Two months ago, I found 1265 email addresses that were 100% valid (deliverable). 🚨 Guess how many aren't valid anymore.

Here's what I did:
1. I created a list of software industry professionals based in North America.
2. I found their email addresses and only kept the 100% valid ones.
3. I re-verified these email addresses every week for two months.

After 4 weeks, 2.8% were no longer valid.
After 8 weeks, 3.7% were no longer valid.

Why is this important? Because if I sent an email campaign to these recipients, my bounce rate would be higher than acceptable. Just two months after creating my list.

Solution: regularly re-verify email addresses in your CRM/mailing list to:
1. Make sure you reach your prospects, and
2. Keep your bounce rate low (important for deliverability).
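Putting the post's own numbers into code makes the stakes concrete (the list size and decay rates come straight from the post; the 2% "acceptable bounce" threshold in the comment is a common rule of thumb, not a universal standard):

```python
list_size = 1265
decay = {4: 0.028, 8: 0.037}  # weeks since verification -> share now invalid

for weeks, rate in decay.items():
    dead = round(list_size * rate)
    print(f"after {weeks} weeks: ~{dead} dead addresses "
          f"-> ~{rate:.1%} bounce rate if you mail the whole list")

# after 4 weeks: ~35 dead addresses -> ~2.8% bounce rate
# after 8 weeks: ~47 dead addresses -> ~3.7% bounce rate
# Many ESPs start flagging senders around a 2% bounce rate, so an
# 8-week-old list can already put deliverability at risk.
```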