How to Recover From Google Core Updates

Summary

Recovering from Google core updates requires addressing issues that impact your website’s trustworthiness, content quality, and technical structure. These updates evaluate factors like relevance, usefulness, and authority, so fixing low-quality content and technical SEO errors is crucial.

  • Perform a content audit: Identify and remove low-performing, outdated, or duplicate pages, and ensure that your remaining content aligns with your brand’s purpose and user intent.
  • Focus on entity clarity: Strengthen your online presence by optimizing your knowledge graph, using structured data, and ensuring consistent information across platforms.
  • Prioritize technical SEO: Fix indexing issues, clean up your website architecture, and address any technical errors caused by CMS settings or plugins to improve your site’s overall quality.

  • Jason Dowdell

    Senior Director, Organic Search at ZenBusiness Inc.

    I ignored semantic SEO for 15 years. Then we got crushed by Google's core update in October 2023.

    For most of my career, keywords, links, and engagement were where I focused my efforts. "Google indexes things, not strings"? Whatever. Knowledge graphs? Freebase / KGIDs / entity relationships / Meh.

    Then we lost 50% of our organic traffic overnight. When 5,500 of your 11,000+ pages get traffic and the rest are dying on the vine, Google notices. When your entity (Company / Brand) isn't clearly defined in their knowledge graph, they don't know who you are, and if they don't know who you are, then they certainly can't trust you.

    Here's how we learned our lesson:
    🔴 Pre-Oct 2023: 11,000 pages, 50% getting zero traffic
    🔴 Oct 5, 2023: Core update hits, rankings collapse
    🟡 Nov 2023: Knowledge graph audit reveals the truth
    🟢 Dec 2023: Content purge + entity optimization begins
    🟢 Jan 2024: Finally started to recover
    🟢 Aug 2024: Hallelujah, we're back, baby!

    The recovery playbook was clear. We went on a content diet, cutting half our pages. We rebuilt our knowledge graph presence by:
    - Crafting a semantically optimized company description (deployed everywhere)
    - Creating an entity home with schema markup
    - Establishing our "same as" relationships across platforms
    - Getting into Wikidata and building third-party mentions

    Within months, our knowledge panel returned. By August, our organic traffic had recovered, and our rankings were stronger than before the penalty. Still, it was pretty scary, and the recovery happened after AI had shifted the search landscape forever.

    Here's the lesson: Google isn't just counting links or matching phrases anymore; it's evaluating whether you're a trusted entity in your space. Control your narrative or someone else will. That's not just SEO, it's survival. (And, spoiler: it's even more important in an AI-driven search landscape.)

    Hey, I'm Jason. I write about:
    🔍 Semantic SEO that actually works
    🤖 LLM marketing strategies for growth teams
    🚀 Building visibility in the AI-first world
    DM or just follow along! #seo #llms #organicgrowth
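
    A minimal sketch of the kind of entity markup the post describes: Python that renders a schema.org Organization block as JSON-LD, with an "@id" acting as the entity home and "sameAs" links tying the brand to its profiles on other platforms. Every company name, URL, and Wikidata ID here is a placeholder, not ZenBusiness's actual markup.

      import json

      # Hypothetical company details; replace with your own.
      organization = {
          "@context": "https://schema.org",
          "@type": "Organization",
          # A stable @id on your own domain acts as the "entity home".
          "@id": "https://www.example.com/#organization",
          "name": "Example Co",
          "url": "https://www.example.com/",
          "logo": "https://www.example.com/logo.png",
          # The same semantically optimized description, deployed everywhere.
          "description": "Example Co helps small businesses form and run companies.",
          # "sameAs" declares which external profiles refer to this same entity.
          "sameAs": [
              "https://www.linkedin.com/company/example-co",
              "https://www.wikidata.org/wiki/Q00000000",
              "https://x.com/exampleco",
          ],
      }

      # Emit a JSON-LD <script> block to paste into the entity home page's <head>.
      print('<script type="application/ld+json">')
      print(json.dumps(organization, indent=2))
      print("</script>")

    Embedding this on one canonical "about" page, and keeping the description and sameAs list consistent everywhere else, is the pattern the post credits with restoring the knowledge panel.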

  • 😻 Maeva Cifuentes

    🔥 Organic growth advisor for forward-thinking companies

    In the March core update, one of my clients saw their traffic fall off a cliff. We've been recovering it over the last few months, and with the June core update it got a boost and is finally growing past where it was before March. Here's what happened:

    → Webflow was set up to have a "see more" dynamic section at the bottom of certain pages for internal linking. However, every time you clicked "see more", it would generate a new page that would get crawled by Google.

    → We noticed it back in January, but it wasn't big enough yet to dedicate resources to. By the time March hit, Google had crawled and deindexed over 27,000 pages like this, which were considered thin and duplicate.

    → Webflow also had global canonicalization on and was self-canonicalizing each of the paginated pages, which were actually identical, causing a huge amount of duplicate content on the website.

    → Webflow Optimize also generated a unique, self-canonicalized, crawlable page for every optimization test, which generated about another 20,000 pages that were considered duplicate.

    Google looked at my client's website, saw nearly 50k pages that were deindexed, and declared it a bad-quality website, no matter the other very high-quality work we do.

    We spent the last few months cleaning this up and aiming to get the technical SEO perfect, and we were rewarded by the last core update. Not only are clicks back above pre-drop levels (and climbing), but impressions have nearly doubled, which is a signal that we're appearing in AIO and AI Mode responses.

    A big, hard lesson on how technical SEO is more important than some SEOs might say. Keep your website architecture clean and keep an eye on it; without it, it doesn't really matter whether you have the right internal linking, link building, or content plan. Also: find out what kind of crazy things your CMS might be doing that could hurt you in the long run.
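
    The failure mode here is easy to spot-check with a short script. A sketch, using only the Python standard library and hypothetical example URLs: fetch each suspect page and flag any whose rel=canonical points at itself, since a self-canonicalizing near-duplicate tells Google each copy is the page of record.

      import re
      import urllib.request

      def canonical_of(url: str) -> str | None:
          """Return the href of the page's rel=canonical link, if present."""
          with urllib.request.urlopen(url, timeout=10) as resp:
              html = resp.read().decode("utf-8", errors="replace")
          for tag in re.findall(r"<link\b[^>]*>", html, re.IGNORECASE):
              if re.search(r'rel=["\']canonical["\']', tag, re.IGNORECASE):
                  href = re.search(r'href=["\']([^"\']+)["\']', tag)
                  return href.group(1) if href else None
          return None

      # Hypothetical CMS-generated variants of one underlying page.
      suspects = [f"https://www.example.com/resources?page={n}" for n in range(1, 6)]

      for url in suspects:
          canonical = canonical_of(url)
          if canonical == url:
              # Each variant claims to be the real page: duplicate-content risk.
              print(f"SELF-CANONICAL: {url}")
          else:
              print(f"{url} -> {canonical}")

    Pointing every variant's canonical at the one true URL, or keeping the variants out of the crawl entirely, is the kind of cleanup the post describes.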

  • Chris Tzitzis

    Founder at ChrisTzitzis.com

    Here is one of the main strategies websites are using to recover their lost rankings from Google's HCU and March core update. It's something SEOs have been doing for a while now, and it can be pretty effective at recovering rankings. But a lot of people do it completely wrong.

    The tactic we're talking about is called "content pruning." Content pruning is essentially deleting or removing pages from your website. We'll cover how to decide which pages to remove in a little bit. First, let's look at a couple of examples of sites that have used it to recover recently:
    - Housefresh
    - Travellemming
    - Driveandreview
    - Latest-hairstyles

    Of course, not all sites are pruning their content like these, and most are using content pruning along with various other strategies, which we'll cover as well. Also, Kevin Indig did a nice study and sent out a newsletter with some pretty interesting findings. He found that many sites that expanded quickly saw decreases in rankings, while some sites that reduced the number of pages on their sites saw increases in rankings.

    So, how do we know which content we should remove from our websites? A lot of people just look for content that has:
    - No clicks
    - No impressions

    But while that's a good place to start, it's not perfect. Why?
    - If you've been hit, most of your content will have no clicks or impressions.
    - There could be very good reasons it's not getting clicks or impressions.
    - Some pages serve a function beyond getting search traffic.

    For pages that are designed to get traffic, here's what you should check next:
    - Was it getting no clicks or impressions before the updates?
    - Does it match search intent?
    - Does it have proper on-page optimizations?
    - Does it have enough internal links?
    - Does it cover the topic fully?
    - Does your website/page have enough backlinks?
    - Has the content not been updated in a very long time?
    - Is your site having indexing issues?
    - Does the page get non-SEO traffic?
    - Does the page serve some kind of purpose beyond traffic?

    If you see any issues here, address them before deciding whether to remove the content. Your content might not be ranking for normal SEO reasons.

    Now, for the harder questions:
    - Does the content meet Google's helpful content guidelines?
    - Does the topic of the content align with the purpose of your site?

    So, what should you do with content that fails one of these last two checks, or fails to improve after changes are made?
    - Improve or delete it if it's "unhelpful."
    - Remove it if it doesn't align with the purpose of your site.
    - 301 redirect any removed pages to relevant pages to save link juice.
    - 410 pages that should never have been on the site and have no backlinks.
    - Make sure internal linking doesn't suffer because of removed pages.

    LinkedIn character limit cut off the end! A lot more info in the video below!
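
    A rough Python encoding of the post's decision rules. The field names are hypothetical; in a real audit these signals would come from a Search Console export and a crawler, not hand-filled flags.

      from dataclasses import dataclass

      @dataclass
      class Page:
          url: str
          clicks_12m: int             # e.g. from a Search Console export
          diagnosed: bool             # normal SEO issues (intent, links, indexing) ruled out?
          has_backlinks: bool
          aligns_with_site: bool      # does the topic match the site's purpose?
          is_helpful: bool            # judgment call against Google's helpful content guidance
          serves_other_purpose: bool  # non-SEO function (legal, conversion, support...)

      def pruning_action(page: Page) -> str:
          """Apply the post's checks in order and return a suggested action."""
          if page.serves_other_purpose:
              return "keep"  # some pages matter beyond search traffic
          if page.clicks_12m == 0 and not page.diagnosed:
              # Don't prune before ruling out ordinary SEO problems.
              return "diagnose first (intent, on-page, internal links, indexing)"
          if not page.aligns_with_site:
              # Off-purpose pages get removed; preserve link equity where it exists.
              return "301 to a relevant page" if page.has_backlinks else "410 (gone)"
          if not page.is_helpful:
              return "improve; delete if it stays unhelpful"
          return "keep"

      # A thin, off-purpose page with no backlinks and no remaining SEO excuses:
      tag_page = Page("https://www.example.com/tag/misc", 0, True, False, False, False, False)
      print(pruning_action(tag_page))  # -> 410 (gone)

    The point is the ordering: rule out ordinary SEO problems before concluding a page is unhelpful, and only then choose between improving, redirecting, and returning a 410.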

  • Eli Schwartz

    Author of Product-Led SEO | Strategic SEO & Growth Advisor/Consultant | Angel Investor | Newsletter: Productledseo.com | Please add a note to connection requests.

    Sites hit by Google updates will get hit again unless they pivot. If a helpful content update has hit a site, the algorithm has deemed too much of its content unhelpful. Solving this means improving the ratio of "helpful" to unhelpful content. The best solution I have seen is to delete duplicative or thin content, not just bulk it up to suddenly make unhelpful content helpful.

    Years ago, when the Google Panda update hit my company, we recovered our traffic by deleting 80% of our content. It took months, but eventually we hit 125% of our pre-Panda traffic levels. The same brutal approach is required for any content update today. (I believe Helpful Content is a 2023/2024 version of Panda.) Trying to save content dead weight will only delay the inevitable and keep the site from growing.
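
    To make the ratio arithmetic concrete, a toy Python calculation with made-up page counts (not Eli's actual numbers): deleting the thin pages raises the helpful share of the site without writing anything new.

      # Hypothetical counts, chosen to mirror an 80% deletion.
      total_pages = 10_000
      helpful_pages = 2_000    # pages with real usefulness/traffic
      print(f"Before: {helpful_pages / total_pages:.0%} helpful")  # 20%

      deleted = 8_000          # the thin/duplicative 80%
      remaining = total_pages - deleted
      print(f"After:  {helpful_pages / remaining:.0%} helpful")    # 100%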
