What’s Working for You? (How you can test to see if you are right!)

One common way to find out which product offering, or which email outreach style, is doing better is to run an A/B test. The premise of the test is simple: obtain feedback or observe the behavior of customers exposed to either product A or product B, and see whether there is a clear difference in preference.

Consider the example of Marketing LLC, who wanted to see which email style resonated more with their potential clients. After conducting the required background research on their Ideal Client Profile (ICP), they decided to compare their email styles using the A/B testing method. They sent 300 emails in Style A to one group and 300 emails in Style B to another. The groups were randomly selected from their ICP list, and the content of the emails was very similar; only the subject line and the first two sentences differed.

Observation & Proportions:
- 100 of the 300 Style A emails (about 33%) were opened.
- 120 of the 300 Style B emails (40%) were opened.
- The pooled open rate was 220 out of 600, or about 37%.

The raw numbers suggest Style B had the higher open rate. However, it is essential to test this statistically before deciding whether to send future emails to ICPs in Style B or Style A. We can use a two-proportion test of open rates at the 95% confidence level to check whether Style B's advantage is statistically significant.

Actual Test:
* Pooled proportion p* = 220/600 ≈ 0.367
* Std. error Sp = sqrt(p* × (1 − p*) × (1/300 + 1/300)) = sqrt(0.367 × 0.633 × 2/300) ≈ 0.039
* Test z-value = (0.400 − 0.333)/0.039 ≈ 1.69
* One-tailed 95% critical value = 1.645 (the familiar 1.96 is the two-tailed critical value, used when testing for any difference rather than the directional question "is B better?")

Since the test z-value of about 1.69 exceeds 1.645, we can conclude with 95% confidence (one-tailed) that emails sent in Style B were doing better, though the margin is slim; a larger sample would make the conclusion more robust.

Actionable Insights from A/B Testing:
1. Deep Dive: Analyze the elements of Style B that contributed to the higher open rates. This could include the subject line, tone, or specific keywords.
2. Limit Variables: When conducting A/B tests, focus on one or two variables at a time to isolate the impact of each change.
3. Scale Up: Increase the volume of emails sent in Style B to further validate the results and reach a larger audience within your ICP.
4. Content Quality: Ensure that the content of the email is compelling and relevant. An opened email is just the first step; the content must drive engagement and conversions.
5. Continuous Testing: Regularly run A/B tests to keep refining your email strategies. Market dynamics and customer preferences change over time.
6. Segmentation: Segment your ICP further to tailor email styles to different sub-groups for personalization and relevance.
7. Feedback Loop: Collect feedback from recipients to understand their preferences and pain points, and use it to improve future campaigns.

#PostItStatistics #DataScience Follow Dr. Kruti or Analytics TX, LLC on LinkedIn (Click "Book an Appointment" to register for the workshop!)
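For anyone who wants to rerun the arithmetic, here is a minimal Python sketch of a pooled two-proportion z-test, standard library only (statsmodels' `proportions_ztest` with `alternative='larger'` should give the same numbers):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(opens_a, n_a, opens_b, n_b):
    """One-tailed two-proportion z-test: is B's open rate higher than A's?"""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)            # pooled p*
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # two-sample std. error
    z = (p_b - p_a) / se
    p_value = 1 - NormalDist().cdf(z)                     # one-tailed
    return z, p_value

z, p = two_proportion_ztest(100, 300, 120, 300)
print(f"z = {z:.2f}, one-tailed p = {p:.3f}")  # z = 1.69, p = 0.045
```

With these numbers the result clears the one-tailed 1.645 cutoff but not the two-tailed 1.96, which is exactly why it pays to write the hypothesis down before running the test.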
Email Testing Techniques
-
"We tried Cold Email, but didn't see results." Has to be one of the most common challenges I hear. Let me explain. Over the course of 2024, I’ve spoken with many B2B SaaS Founders, Marketing Directors, Sales Directors, and GTM Leaders. They all share one problem in common: They’ve tried Cold Outreach, but they don’t get any results. So naturally, I start asking questions and offer to have a look at what they’re doing. When I review their campaigns, one thing becomes crystal clear: They understand how to build prospect lists, but there's little to no split testing happening. Here’s the reality: If you’re only sending 100-200 emails without testing different angles, you’re gambling on the success of your campaign, and in most cases, that gamble doesn’t pay off. Let’s break this down. There are two types of companies: 1️⃣ The 1% that doesn’t need to split test (they already know their ICP and what works for them). 2️⃣ The 99% that absolutely MUST split test to find what works best. If you’re part of the 99% (and most of us are), here’s how to do it effectively: Step 1: Test Pain Points Start by identifying the key problems your target audience is facing. Let’s say you’re an agency targeting e-commerce brands. You could test angles like: → High customer acquisition costs → Low lifetime value → Low return on ad spend Each email script stays consistent, only the pain point changes. 💡 Example: If you’re targeting a Sales Director, one angle might focus on the challenge of getting unqualified leads filling up their pipeline, while another might highlight how their team spends too much time on lead nurturing rather than closing. Allocate a set number of leads to each angle (e.g., 1,000 leads per angle) and track results. Step 2: Analyze & Scale Winners Once you’ve sent out the emails, review your data. Ask yourself: → Which angle is getting the most positive replies? → Are certain pain points resonating more than others? If one angle shows promise, double down. 
If another flops, drop it. Step 3: Test Offers After narrowing down the best angles, shift your focus to your offer. Split test variations of your offer to see which drives the most engagement and demo bookings. Forget vanity metrics like open rates (for now). Instead, track your positive reply rate (PRR). Many B2B companies: ❌ Send a small volume of Cold Emails (100-200) and expect big results. ❌ Focus too much on minor variables like subject lines before testing major factors like pain points or offers. ❌ Don’t analyze campaign performance enough to refine their approach. 💡 Pro tip in the PDF below👇 💬 Drop a comment below, or DM me for a free campaign audit.
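The allocate-and-compare loop in Steps 1-2 can be sketched in a few lines of Python. The angle names follow the e-commerce example above; the send and reply counts are invented for illustration:

```python
# Hypothetical results after allocating 1,000 leads per pain-point angle.
angles = {
    "high customer acquisition costs": {"sent": 1000, "positive_replies": 9},
    "low lifetime value":              {"sent": 1000, "positive_replies": 3},
    "low return on ad spend":          {"sent": 1000, "positive_replies": 14},
}

def rank_angles(results):
    """Sort angles by positive reply rate (PRR), best first."""
    return sorted(
        ((name, r["positive_replies"] / r["sent"]) for name, r in results.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

for name, prr in rank_angles(angles):
    print(f"{name}: {prr:.2%} PRR")
```

The winner gets doubled down on; the bottom angle gets dropped, exactly as the post describes.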
-
Last week, a client asked me to review their sales team's email templates after seeing their response rates plummet below 1%. "It's just the market," they insisted. "Nobody answers emails anymore." I wasn't convinced. So I pulled up their templates alongside thousands of others I've analyzed over the years, and the patterns were immediately clear. The emails that consistently get responses in today's crowded inboxes aren't the ones with the catchiest subject lines or the most persistent follow-ups. They're the ones that feel like they were written by a human being who's done their homework. Looking through my swipe file of emails with 30%+ response rates, I found myself returning to five core approaches that just work: I call the first one "The Pattern Interrupt" – where instead of saying what everyone else says, you notice something specific and ask a genuine question about it: "I noticed you recently shifted your messaging from security-focused to efficiency-focused. I'm curious what prompted that change?" Then there's what I call "The Contrarian Insight" – where you respectfully challenge conventional wisdom with actual data: "While analyzing conversion patterns across 50 companies in your industry, we discovered something that contradicts the common belief about [specific topic]. I'd be happy to share what we found if it might be useful." My personal favorite is "The Genuine Connection" – where you reference something they've created and add actual value to the conversation: "Your recent post about sales enablement challenges really resonated because I've been wrestling with the same issues. Have you considered [thoughtful question related to their perspective]?" For every client I work with, we build a custom "Message Testing Framework" where we develop variations of these templates specifically for their market, then test them with small batches (about 20-50 prospects each). 
But here's the critical part most people miss – we don't just track open and response rates. We carefully evaluate the quality of responses. A template that gets quick brush-offs isn't successful even if the response rate looks good on paper. What consistently works across industries is specificity that shows you've done homework, offering value before asking for anything, and genuine curiosity rather than formulaic personalization. And brevity matters – almost every high-performing email I've analyzed is five sentences or fewer. Which of these approaches would make YOU respond? I'm genuinely curious. #SalesEmails #ProspectingTemplates #OutboundSales
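One way to make "quality of responses" concrete is a weighted score rather than a raw reply rate. A minimal sketch; the categories, weights, and counts below are illustrative assumptions, not the author's framework:

```python
# Weight substantive replies heavily, brush-offs not at all (assumed weights).
QUALITY_WEIGHTS = {
    "meeting_booked": 1.0,
    "interested": 0.6,
    "neutral": 0.2,
    "brush_off": 0.0,
}

def quality_score(replies, sent):
    """Quality-weighted response rate: rewards substantive replies."""
    weighted = sum(QUALITY_WEIGHTS[kind] for kind in replies)
    return weighted / sent

# Template X: high raw response rate (9/30), but mostly brush-offs.
template_x = ["brush_off"] * 8 + ["interested"]
# Template Y: fewer replies (3/30), but better ones.
template_y = ["meeting_booked", "interested", "interested"]

print(quality_score(template_x, 30))  # ≈ 0.02
print(quality_score(template_y, 30))  # ≈ 0.073
```

Template Y "wins" here despite a response rate one third of Template X's, which is the post's point about templates that look good on paper.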
-
Stakeholders often focus on “how many” when presented with qualitative research. That is the wrong question to ask. Qualitative research is about understanding the H (human) in HCI; the goal is to understand why people behave the way they do. When presenting research results: focus on showing clear patterns, supporting findings with evidence like quotes or observations, and connecting everything back to user behaviors and business goals, not sample sizes. Also, combine qualitative with quantitative to explain the what and the why. For example: - Quantitative shows what's happening: 72% abandon the goal-setting flow at account connection. - Qualitative reveals why: Users worry about security, are confused about account selection, and fear they can't reverse connections. - The powerful combination: "Our drop-off problem stems from specific trust concerns and mental model mismatches. By redesigning to address these specific issues, we can reduce the 72% abandonment rate." Beyond Numbers: How to Properly Evaluate Qualitative UX Research (9min) By Dr Maria Panagiotidi https://lnkd.in/gbqRneY4
-
In 2023, I side-hustled my way to finding the best cold email structure for early-stage SaaS companies. I started with templates created by people like Vin Matano Guillermo Blanco 💜 Will Allred and the 30 Minutes to President's Club, and tested a handful of different angles. One framework stood out for most niches and offers. Note: Trying out different angles, offers, lead magnets, and such is worth your time - those will eventually outperform this. But if you need a solid cold email framework without putting too much effort into it, this is what I've found to work best: 0️⃣ Subject: 1 to 5 words: either 'quick question about {{topic}}' or '{{topic}} question' 1️⃣ Line 1: 1 observational sentence (personalized). The companies I've worked with needed to get their name out to their markets quickly and with quality, and the best way to send volume is by showing your prospects that you sorta know them. This is done with a simple AI-written observation. BTW, we're not actually writing these lines. GPTforSheets (GPT for Work) is, and we're double-checking them to make sure they don't look robotic, which about 5-10% do even with a killer prompt. Or you can still manually write these simple observations about their company. E.g. 'Hey {{first_name}}, noticed that your team does residential mortgages and had even funded over $6.3 billion in loans in 2019.' 2️⃣ Line 2: A Poke the Bear question (Josh Braun). Relate the question to the subject line. 1 sentence (non-personalized). Know your customers deeply and come up with a few neutral questions that get your prospects to think differently about their current solution. Test 3-5 different questions and see what gets the most positive replies. Do this by listening to every word they say in your meeting recording tools (like Gong Otter.ai Fireflies.ai) and getting help from ChatGPT for brevity. E.g. Quick q - how do you know if you're missing out on {{ideal overall outcome}} because {{pain}}? 
3️⃣ Line 3: Show how you help. 1-2 sentences (non-personalized). E.g. Asking because we help {{niche/industry}} {{achieve outcome}} without {{common objection}}. 4️⃣ Line 4: Snappy social proof mentioning a relevant case study. 1 sentence (non-personalized). PS – if you don't have permission to share a relevant case study's company name, still mention the relevant info but not the company's name. E.g. 'Most recently, {{case study}} went from {{before #/%}} to {{after #/%}} within {{timeframe days/months}} of implementing {{your product name}}.' Or: 'Most recently, {{case study}} started using {{your product name}} and was able to achieve {{result #/%}} in {{timeframe days/months}}.' 5️⃣ Line 5: The CTA. Short and soft. 1 sentence (non-personalized). E.g. "Worth a chat?" or "Any interest in learning more?" Try it out, Spencer P
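The merge-field style above can be filled by a tiny renderer. A sketch, assuming simplified underscore-only field names (names like {{niche/industry}} with slashes or spaces would need a looser pattern); the field values are invented:

```python
import re

def render(template, fields):
    """Replace each {{name}} with its value; leave unknown fields visible."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(fields.get(m.group(1), m.group(0))),
        template,
    )

line_3 = "Asking because we help {{niche}} {{achieve_outcome}} without {{common_objection}}."
print(render(line_3, {
    "niche": "residential lenders",
    "achieve_outcome": "fund more loans",
    "common_objection": "hiring more loan officers",
}))
```

Leaving unknown fields visible (rather than silently blank) makes it obvious in QA when a lead record is missing data.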
-
My team and I have generated over $25,000,000 in revenue for ecommerce brands through email & SMS. A big chunk of that revenue would've never existed if we didn't run frequent A/B tests. So in this post, I'll break down how you can A/B test the most effective way. You see... Most brands waste revenue and time by A/B testing all of the wrong things. I see it time and time again for 6, 7, and 8-figure brands. You may not know if you fall into that bucket... So I'm here to sort you out. Here's everything you need to know about A/B testing to generate more sales from email: First, there are 4 rules you need to know about. 1. Test on a large enough sample size. 2. Define what metric you're trying to improve. 3. Test for a long enough time period. 4. Make each variation different enough for meaningful data. If you can't check these off, the test most likely isn't worth running at all. Now, let's take a look at the areas you should focus 80% (if not all) of your testing on... Plus, the main needle-moving tests you should run. 1. Pop-ups - Offers (E.g. 10% off vs. Free Shipping) - Behavior (Timing, pop-up type, etc) - Design 2. Welcome Series - Subject lines - Cadence of emails (Aggressive vs. slower-paced) - Email content (E.g. plain text vs. designed emails) 3. Abandonment flows - Subject lines - Timing - Offer placement (What offer to give & when) - Approach (E.g. customer service email angle vs. normal reminder) 4. Post Purchase flow - Upsells & cross-sells - Timing of upsells & cross-sells - Review collection timing 5. Campaigns - Sending times of the day/week - Offers - Content style These are the exact tests we frequently run for our ecommerce clients. And it's how we consistently increase performance in already high-performing accounts. Save this post for the next time you're strategizing your A/B tests!
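Rule 1 above, "a large enough sample size", can be made concrete with the standard two-proportion sample-size formula. A sketch under conventional assumptions (95% confidence, 80% power); the 2%-to-3% click-rate example is invented:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.80):
    """Subscribers needed per variant to detect a lift from p_base to p_target."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = abs(p_target - p_base)
    return ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Detecting a click-rate lift from 2% to 3%:
print(sample_size_per_variant(0.02, 0.03))
```

Detecting that one-point lift needs close to 4,000 subscribers per variant, which is why small lists rarely support fine-grained tests and why rule 1 sits first.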
-
5 reasons why continuous testing is non-negotiable in cold email: #1 - Spam filters are getting smarter: What slipped through yesterday might get caught tomorrow. You need to be constantly tweaking to stay ahead of the algorithms. #2 - Prospect behavior changes: Maybe everyone's sick of "quick question" subject lines this week. Maybe "Saw you on LinkedIn" is the new hot opener. You won't know unless you're testing. #3 - Market conditions shift: Economic changes, industry trends, global events - they all impact how receptive prospects are to different messages. Your campaigns need to evolve with the times. #4 - Your competition is innovating: If you're standing still, you're moving backwards. Your competitors are testing new approaches. Are you? #5 - New tools and technologies emerge: AI, automation, new data sources - the toolset for cold email is constantly expanding. Are you leveraging them? But most people suck at testing. They change five things at once and have no idea what actually moved the needle. Here's how to do it right: - Test one element at a time: Subject line, opening line, CTA - pick one and test it. Otherwise, you won't know what's actually working. - Use statistically significant sample sizes: Don't make decisions based on 10 emails. You need enough data to draw real conclusions. - Track everything: Open rates, click rates, response rates, meeting booked rates - the more data you have, the better decisions you can make. - Be patient: Good testing takes time. Don't expect overnight miracles. Give each test enough runway to show real results. - Learn from failures: A failed test isn't a waste - it's valuable data. Use it to inform your next iteration. Testing is time-consuming, tedious, and it is EASY to feel like you're not making progress. But in the world of cold email, if you're not testing, you're dying. Your competitors are out there right now, tweaking, refining, and improving their campaigns. 
Are you going to keep sending the same stale emails and wonder why your results are tanking? Or are you going to roll up your sleeves and start testing? Adapt or die.
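The "track everything" advice above can start as simply as a per-variant funnel summary. A minimal sketch with invented counts:

```python
from dataclasses import dataclass

@dataclass
class VariantStats:
    """Funnel counts for one email variant (numbers are illustrative)."""
    sent: int
    opened: int
    clicked: int
    replied: int
    meetings: int

    def funnel(self):
        """Each stage as a rate of emails sent."""
        return {
            "open_rate": self.opened / self.sent,
            "click_rate": self.clicked / self.sent,
            "reply_rate": self.replied / self.sent,
            "meeting_rate": self.meetings / self.sent,
        }

a = VariantStats(sent=500, opened=210, clicked=40, replied=18, meetings=4)
for stage, rate in a.funnel().items():
    print(f"{stage}: {rate:.1%}")
```

Keeping every stage per variant is what lets a failed test still teach you something: you can see exactly where the funnel leaked.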
-
These 8 cold email frameworks helped generate over $5M in revenue. Here’s how they work (and how to use them): The truth? Most cold email templates are outdated and super generic. Buyers have seen them all. And they delete them all. But some people are getting 15-20% reply rates using frameworks they never share. I asked them to break down exactly what works: 1️⃣ Monika Grycz 💌 (AIDA Structure) Start with a single, painful observation. Then back it up with real results from similar companies. End with a next step that feels easy. ↳ We've seen X% reply rates when this is executed properly. 2️⃣ Aaron Reeves (Trigger, Implication, Pain, Proof, Solution) Start with a relevant reason you're reaching out that's tailored to them. Connect what this might mean for their priorities. Highlight the risk of doing nothing. Show how you've helped similar companies avoid that outcome. ↳ This framework works especially well for expansion-stage companies. 3️⃣ Tal Baker-Phillips (Typical Problems Framework) Name the 1–2 issues they’re likely facing and offer a simple way to solve them. ↳ Short, direct, assumes they have the problem without being pushy about it. 4️⃣ Brian LaManna (Champion Play) Target people who've used your solution at previous companies. ↳ Converts at ridiculous rates because there's instant credibility. 5️⃣ Leif Bisping ("Why are you paying?") Highlight the inefficiency of their current tool. Then frame your solution as the obvious alternative. ↳ Creates immediate curiosity. 6️⃣ Ethan Parker (Lead Magnet Framework) Lead with a free resource. Offer something genuinely useful (like a cheat sheet or playbook) before you ask for anything. ↳ Gets them saying yes to something small first. 7️⃣ Thibaut Souyris ("Do the maths") Use hard numbers. Show how your solution saves time, money, or both, and back it with a simple calculation. ↳ Makes the ROI impossible to ignore. 
8️⃣ Patrick Trümpi (Challenge of Similar Companies) Point to a challenge others in their space are dealing with. Share how you helped. ↳ Creates FOMO and positions you as the industry expert. The thing is, most people try to reinvent the wheel with cold email. But the best frameworks already exist. You just need to test them and see what works for your audience. Test 3-4 simultaneously and double down on what drives the highest reply rates. P.S. Swipe to see actual emails that booked meetings → 👋 Follow me for modern outbound strategies.
-
"Our old email agency didn't test anything" Most agencies fall on either end of the testing spectrum. They either: 1. Test nothing at all ⤷ Problem: Missing out on incremental revenue gains 2. Test everything ⤷ Problem: Poor allocation of resources on non-needle moving tests So, here's our email A/B testing order of priority framework: 1. Popup and welcome flow ⤷ Start with your offer (%, $ off, free gifts, etc) - % wins most of the time ⤷ Next, move on to the email content (angles, formatting, etc) ⤷ Finally, test time delays - most will buy in the first 48 hours 2. Abandonment flows Abandoned Checkout > Abandoned Cart > Browse Abandonment > Site Abandonment ⤷ Start with time delays (focus on email 1 first) ⤷ Next, move on to your offers (%, $ off, free gifts, etc) ⤷ Finally, test email content - addressing different FUDs 3. Post-purchase 1st-time buyer PP > Repeat buyer PP ⤷ Start with offer timing - immediate upsell or delayed? ⤷ Next, test your offer type (%, $ off, free gifts, etc) ⤷ Then, move on to email content - what content drives most clicks back to website? 4. Other flows ONLY focus on these once the above have been meticulously tested DON'T expect huge incremental gains e.g. sunset flow, winback flows, etc ⤷ These likely won't move the needle much ⤷ Tests will vary based on the flow Any email bros spouting you should test everything... Ignore them. The resources required to test everything at once aren't justified by the ROI.
-
Most agencies waste thousands testing unproven offers with ads. Here's my three-step validation process that costs almost nothing: 1) Cold Email Test (1,000 leads) - Send your offer to 1,000 targeted leads - Track reply rates and meeting bookings - Example: We tested "We will build you a Paving Estimate Calculator" to hardscape companies and saw immediate interest - Benchmark: 8+ positive replies per 2,400 emails indicates a viable offer 2) Twitter/LinkedIn Post - Share your offer as a simple post (no fancy graphics needed) - Look for engagement signals: comments, DMs, shares - When we posted our "build your cold email system" offer, we got immediate DMs asking for more info - If your post gets no traction, the market isn't interested 3) Community Feedback - Share your offer in relevant communities where your target audience hangs out - Track direct messages and inquiries - One of our students offered a free YouTube video in our community and received dozens of DMs - No interest = back to the drawing board With this system, you can test multiple offers simultaneously with minimal investment. Only AFTER validating your offer should you consider investing money into ads. This approach has helped us identify winning offers like our cold email setup service and our agency coaching program before investing heavily in paid acquisition. The best agency offers are stupidly simple… but finding them requires methodical testing rather than guesswork.
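The viability benchmark in step 1 (8+ positive replies per 2,400 emails, roughly a 0.33% positive reply rate) is easy to turn into a check. A sketch; the example campaign numbers are invented:

```python
# The post's benchmark: 8 positive replies per 2,400 emails sent.
BENCHMARK_RATE = 8 / 2400  # ≈ 0.33%

def offer_is_viable(emails_sent, positive_replies):
    """True if the positive-reply rate meets the 8-per-2,400 benchmark."""
    return positive_replies / emails_sent >= BENCHMARK_RATE

print(offer_is_viable(1000, 5))  # True  (0.50% >= 0.33%)
print(offer_is_viable(2400, 6))  # False (0.25% <  0.33%)
```

Normalizing to a rate rather than a raw count is what lets you compare offers tested at different volumes side by side.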