Monday morning. You open Google Analytics and see that organic traffic has dropped 40% compared to the previous week. No changes to the site. No security alerts. Nothing to explain it at first glance. This situation, which repeats in dozens of sites every time Google launches a significant update, has a methodological solution, but first you need to know exactly what has happened to you.
Last reviewed: March 2026. At the time of publication, Google has not confirmed any major updates for 2026.
The confusion between a core update, a spam update and a manual penalty is one of the most costly errors SEO managers make. Each type of algorithmic impact requires a different response. Treating a core update as a manual penalty, or vice versa, can cost you months of wasted work.
This guide gives you the diagnostic protocol and documented recovery methodologies for each type of update, with context for the Spanish market.
The Google update calendar: 2024-2025
To correlate a traffic drop with an update, you need to know exactly when they occurred. Here is the documented record of significant updates over the last two years.
Confirmed updates 2024
| Update | Start | End | Type | Estimated impact |
|---|---|---|---|---|
| March 2024 Core Update | Mar 5, 2024 | Apr 19, 2024 | Core | Very high (45 days, the longest in history) |
| March 2024 Spam Update | Mar 5, 2024 | Mar 20, 2024 | Spam | High: targeting sites with spammy content and cloaking |
| June 2024 Spam Update | Jun 20, 2024 | ~Jun 27, 2024 | Spam | Moderate: spam detection in search results |
| August 2024 Core Update | Aug 15, 2024 | Sep 3, 2024 | Core | High (19 days, focused on small publisher feedback) |
| November 2024 Core Update | Nov 11, 2024 | Dec 5, 2024 | Core | High (24 days, prioritization of useful content for users) |
| December 2024 Core Update | Dec 12, 2024 | Dec 18, 2024 | Core | Moderate: unexpected, only 6 days after the previous one |
Confirmed updates 2025
| Update | Start | End | Type | Estimated impact |
|---|---|---|---|---|
| March 2025 Core Update | Mar 13, 2025 | Mar 27, 2025 | Core | High: global, all languages and regions |
The March 2025 Core Update was notable for the recalibration of the weight of forum content: many sites that had gained visibility with the “hidden gems update” of mid-2023 experienced significant drops. Amazon, Quora and various government sites also recorded visibility changes, confirming that no domain is exempt from quality re-evaluation.
Important note on the Helpful Content Update
The September 2023 Helpful Content Update (September 14-28, 2023) was the last helpful content update announced independently. Starting with the March 2024 Core Update, Google incorporated the helpful content system directly into the core ranking system. Since then, all core updates implicitly evaluate whether content is created for users or for search engines.
Where to follow updates in real time
- Google Search Central Blog (developers.google.com/search/blog): official source, publishes release notes for all confirmed updates.
- Search Engine Land (searchengineland.com/library/platforms/google/google-algorithm-updates): comprehensive coverage with impact analysis.
- Semrush Sensor (semrush.com/sensor): measures daily SERP volatility in more than 20 countries.
- Google Search Status Dashboard (status.search.google.com): real-time status of the search system.
The technical SEO audit tools 2026 guide details which stack to use for tracking site health on an ongoing basis.
Types of penalty: how to identify yours
Before acting, you need to know exactly what type of impact you’ve received. These four types have different causes, symptoms, and solutions.
Core Update: general quality re-evaluation
A core update is not a penalty in the technical sense. Google hasn’t found anything “wrong” with your site; it has recalibrated how it evaluates quality at a systemic level. The result is that sites that previously satisfied quality criteria now fall behind others that satisfy them better.
Characteristic symptoms:
- Generalized position drop across multiple pages simultaneously, not just in one section
- Organic traffic drops sharply on the update date or in the following days
- No messages in Google Search Console
- Competitors who gained positions have more comprehensive content, more author authority or more E-E-A-T signals
What does not happen with a core update: no indexation errors appear, no pages are deindexed, no security alerts.
Spam Update: content and manipulative practices
Spam updates target specific practices that Google considers manipulative of ranking: link spam (artificial link networks), cloaking (showing different content to Google and users), scaled low-quality content, and site reputation abuse.
Characteristic symptoms:
- Sharper and more focused drops than a core update
- Can affect specific site sections or all pages
- Often coincides temporally with a core update (as happened in March 2024)
- In severe cases, can lead to partial deindexation
Manual Action: human penalty from Google’s team
A manual action means a human reviewer from Google’s team has examined your site and determined it violates spam policies. It is the only type of impact that generates an explicit notification in Search Console.
How to locate it in GSC
Search Console → Security and Manual Actions → Manual Actions. If there is an active action, it appears here with a problem description and a link to the correction documentation.
Common types of manual actions:
- Unnatural link patterns (artificial or purchased)
- Thin content with little or no added value
- Cloaking or deceptive redirects
- Hidden text or links
- User-generated spam
Helpful Content Update (HCU): content created for search engines
Although the HCU system is now integrated into core updates, its logic remains relevant. The sites most affected by this system had pages designed to capture search traffic but offering little or no real utility to the user arriving from Google.
Common signals in HCU-affected sites:
- Articles that superficially answer the search intent but don’t solve the real problem
- Mass AI-generated content without editorial review
- Pages optimized for keywords but without recognizable author perspective
- Site structure oriented toward SEO (silos, aggressive internal linking) rather than user experience
Decision tree: diagnose your situation
```
Has organic traffic dropped?
├── Yes
│   ├── Does anything appear in GSC > Manual Actions?
│   │   ├── Yes → MANUAL PENALTY
│   │   │        → Go to the manual recovery section
│   │   └── No
│   │       └── Does the drop coincide with an official update date (±3 days)?
│   │           ├── Yes → Does it affect the whole site or only sections?
│   │           │   ├── Whole site → CORE UPDATE
│   │           │   └── Specific sections → SPAM UPDATE or HCU
│   │           └── No → Are there site changes (migrations, noindex, etc.)?
│   │               ├── Yes → INTERNAL TECHNICAL PROBLEM
│   │               └── No → MINOR UNANNOUNCED UPDATE
│   │                        → Monitor with Semrush Sensor
└── No → No algorithmic impact detected
```
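If you monitor several properties, the tree can be expressed as a small helper. A minimal sketch in Python, assuming you maintain the update windows from the timeline above (only a few shown here) and answer the tree's questions as booleans; the function name and labels are ours:

```python
from datetime import date, timedelta

# A few confirmed windows from the timeline above; extend as new
# updates are announced on the Search Status Dashboard.
UPDATE_WINDOWS = [
    ("March 2025 Core Update", date(2025, 3, 13), date(2025, 3, 27)),
    ("December 2024 Core Update", date(2024, 12, 12), date(2024, 12, 18)),
    ("November 2024 Core Update", date(2024, 11, 11), date(2024, 12, 5)),
]

def diagnose(drop_date, manual_action, sitewide, site_changes,
             tolerance_days=3):
    """Mirror the decision tree: return a coarse diagnosis label."""
    if manual_action:
        return "MANUAL PENALTY"
    pad = timedelta(days=tolerance_days)
    coincides = any(start - pad <= drop_date <= end + pad
                    for _, start, end in UPDATE_WINDOWS)
    if coincides:
        return "CORE UPDATE" if sitewide else "SPAM UPDATE or HCU"
    if site_changes:
        return "INTERNAL TECHNICAL PROBLEM"
    return "MINOR UNANNOUNCED UPDATE"
```

The ±3 day pad matches the correlation window used in the diagnosis steps below.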
Step-by-step diagnosis: tools and methodology
Once the type of impact is identified, detailed diagnosis requires a systematic protocol. Don’t start making changes before completing this process, as acting without specific data is the most common cause of failed recoveries.
Step 1: Identify the exact date of the drop in GSC
Open Google Search Console → Performance → Web search. Adjust the date range to the last 3 months and examine the clicks and impressions graph. Find the exact point where the decline begins: not the lowest valley, but the first day of the drop.
Note the exact date. This is your reference for the rest of the diagnosis.
The most common mistake here is looking only at total site traffic and losing granularity. Filter by page and by query to understand which sections and keywords dropped first.
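To find that first day programmatically rather than by eye, compare each day's clicks against a trailing average. A rough sketch, assuming you have exported daily clicks from GSC as sorted (date, clicks) pairs; the 7-day window and 25% threshold are illustrative choices, not official values:

```python
def first_drop_day(series, window=7, threshold=0.25):
    """Return the first date whose clicks fall more than `threshold`
    below the trailing `window`-day average, or None if no drop."""
    for i in range(window, len(series)):
        day, clicks = series[i]
        baseline = sum(c for _, c in series[i - window:i]) / window
        if baseline > 0 and clicks < baseline * (1 - threshold):
            return day
    return None
```

This flags the first day of the decline, not the lowest valley, which is exactly the reference date the rest of the diagnosis needs.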
Step 2: Correlate with the update calendar
With the drop date identified, compare it with the timeline in the previous section. If the drop began within a ±3 day window of an official update, the correlation is likely.
Also check the Semrush Sensor history (semrush.com/sensor) for that date. Semrush Sensor scores SERP volatility from 0 to 10: scores above 5 indicate high volatility, and above 8 indicate extreme volatility. If the Sensor showed high volatility on your dates and it coincides with an update timeline, the correlation is solid.
MozCast (moz.com/mozcast) presents the same information in meteorological temperature form: “hot” days correspond to high volatility.
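The two signals (timeline match and Sensor volatility) combine into a simple rule of thumb. A sketch, assuming a Sensor-style 0-10 score; the function name and returned labels are ours:

```python
from datetime import date, timedelta

def correlation_strength(drop_date, update_start, update_end,
                         sensor_score, tolerance_days=3):
    """Combine the ±3-day timeline check with the 0-10 volatility
    thresholds described above (>5 high, >8 extreme)."""
    pad = timedelta(days=tolerance_days)
    in_window = update_start - pad <= drop_date <= update_end + pad
    if in_window and sensor_score > 5:
        return "solid"       # timeline match + high volatility
    if in_window:
        return "likely"      # timeline match only
    return "weak"            # no timeline match
```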
Step 3: Analyze which specific pages dropped
Go back to GSC → Performance and filter by page. Export the average position data for the last 16 weeks. Identify:
- Did all pages drop uniformly, or only certain categories?
- Do the pages that dropped most have something in common (content type, length, absence of author, lack of sources)?
- Are there pages that rose at the same time? (If so, analyze what is different about them)
This granularity is the difference between a useful and a generic diagnosis. If only 20% of your pages dropped, you don’t need to review the whole site.
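If you work with the exported data in pandas, the per-page comparison reduces to a groupby. A sketch under assumptions: two page-level GSC exports loaded as DataFrames (e.g. via `pd.read_csv`), one covering the weeks before the drop and one after, with the "Page" and "Position" column names of an English-locale export:

```python
import pandas as pd

def position_deltas(before: pd.DataFrame, after: pd.DataFrame) -> pd.Series:
    """Average position per page, after minus before.
    Positive values mean the page lost positions (moved down)."""
    b = before.groupby("Page")["Position"].mean()
    a = after.groupby("Page")["Position"].mean()
    # dropna() discards pages present in only one period
    return (a - b).dropna().sort_values(ascending=False)
```

The head of the resulting series is your list of most-affected pages; that is where the commonality analysis (content type, authorship, sourcing) should start.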
Step 4: Review Manual Actions section in GSC
GSC → Security and Manual Actions → Manual Actions. If there is an active action, jump directly to the manual recovery section. If there’s nothing here, rule out manual penalty as the cause.
Volatility monitoring tools
For diagnosis and ongoing monitoring, these are the most reliable tools:
- Semrush Sensor: filterable by country (including Spain) and by industry category. Allows viewing historical volatility for any date, facilitating retroactive correlation.
- MozCast: temperature visualization for historical SERP volatility.
- Algoroo (algoroo.com): less known but useful for quick correlations with unofficial updates.
- Google Search Status Dashboard: real-time status of Google Search systems.
For choosing between Semrush and Ahrefs as your primary monitoring tool, see our detailed comparison where we analyze their ranking tracking modules.
Recovery from a Core Update: what actually works
This is the part where most guides fail: they give you generic advice about “improving content” without specifying what that means in practice.
What Google says officially
Google’s official documentation on core updates (developers.google.com/search/docs/appearance/core-updates) is deliberately vague. Google recommends focusing on producing high-quality content and suggests that if a site has genuinely improved, it should see results in the next core update.
What Google doesn’t say: there is no checklist of specific actions that guarantees recovery. This is not a lack of transparency; the evaluation is systemic, not based on individually verifiable criteria.
What we see in practice
Analyzing documented recovery cases after the March 2024 Core Update and the August 2024 Core Update, the consistent patterns are:
1. The competitors who gained when you lost have more E-E-A-T signals
E-E-A-T stands for Experience, Expertise, Authoritativeness and Trustworthiness. It’s not a direct ranking factor but a framework for evaluating quality. Sites that recovered or gained positions during the 2024 updates shared characteristics such as:
- Clear authorship with author pages showing real credentials
- Citations to primary sources and verifiable data
- Content with first-hand perspective (real experience, not a synthesis of what others say)
- Updated About and company pages with verifiable information
2. Superficial content was affected more than incorrect content
The August 2024 Core Update was explicitly designed to give more visibility to small independent publishers creating original content, in response to feedback received after the September 2023 HCU. Sites that lost positions generally had correct but generic content: the type of answer already offered by dozens of pages with the same words rearranged.
3. The 5 questions Google recommends asking yourself
Official documentation suggests evaluating your content with these questions (adapted):
- Does the content provide original information, research or analysis not found elsewhere?
- Has the content been reviewed by someone with real knowledge of the topic?
- Are there easily verifiable factual errors?
- Would you trust this content for an important decision related to the topic?
- Is the content designed to solve the user’s problem or to rank on Google?
4. What does NOT work
- Simply waiting without making changes: core updates don’t reverse on their own, and the next update will re-evaluate the same content you already had
- Adding more content without improving existing content: increasing the volume of low-quality pages worsened the situation for several sites after the March 2024 Core Update
- Isolated technical optimizations: improving Core Web Vitals without addressing content quality issues doesn’t produce core update recovery
- Massively deleting content without strategy: some sites that deleted 30-50% of their pages recovered; others didn’t. The key isn’t quantity but the average quality of the remaining content
Realistic time horizon
Google’s documentation and documented cases agree: recovery from a core update typically doesn’t come until the next core update, normally 3-4 months later. In 2024, with four core updates (March, August, November, December), the cycle was faster than usual.
Some sites took more than a year to fully recover from the September 2023 Helpful Content Update, even making the right changes. This doesn’t mean the changes don’t work; it means Google’s re-evaluation is gradual, not instantaneous.
Recovery from a manual penalty
Unlike core updates, manual penalties have a defined and documented resolution process.
How to prepare an effective reconsideration request
A reconsideration request is the only official mechanism for Google to lift a manual action. For it to be accepted, it’s not enough to say “we’ve fixed it”; you have to demonstrate exactly what you did, how and when.
Structure of an effective request:
- Acknowledgment of the problem: specifically describe what violation Google detected (copy the exact text of the manual action in GSC).
- History of actions taken: chronological list of every change made, with dates. If you removed spammy links, include the list of domains and disavow dates via Disavow Tool.
- Evidence of improvements: screenshots, change logs, specific before/after URLs.
- Preventive measures: what you’ve implemented to prevent it from happening again.
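If spammy links were part of the cleanup, the disavow file Google expects is plain UTF-8 text, one entry per line: a `domain:` prefix disavows an entire domain, a bare URL disavows a single page, and lines starting with `#` are comments. The domains below are placeholders, not real examples:

```text
# Disavowed after manual action review, 2024-05-10
domain:spammy-directory.example
domain:link-network.example
https://blog.example/paid-post-about-us
```

Keeping the dated comment in the file itself gives you the disavow chronology the reconsideration request asks for.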
What causes rejection:
- Describing general changes (“we’ve improved content quality”) without specificity
- Not having corrected all instances of the problem, only some
- Submitting the request too early, before completing corrections
- Using apologetic language without concrete evidence
Google indicates that most reviews take several days or weeks. In practice, reconsideration requests related to artificial links can take up to 6 weeks to process. Well-documented requests are typically resolved in 2–3 weeks.
Once the manual action is resolved, the lifting of the penalty may take a few days to be reflected in rankings, as Google needs to recrawl and index the affected pages.
How to protect your site from future updates
The resilience strategy against Google updates doesn’t consist of “guessing” what is going to change; it consists of building a site that genuinely satisfies user needs regardless of how the algorithm evolves.
User-centered content framework
The most consistent principle extracted from all 2024-2025 updates is: content that solves real problems for real users is more resilient to updates than content optimized to rank.
In practice, this means:
- First-hand perspective: include experiences, real cases, proprietary data that can’t be generated with AI or copied from other sources
- Continuous updates: outdated content loses relevance signals; a periodic review policy (every 6-12 months for evergreen content) reduces exposure
- Depth proportional to intent: not every query requires a 3,000-word article; some queries are resolved with 200 well-written words
E-E-A-T signals: the most actionable ones
Of the four E-E-A-T components, these are the most directly influenceable:
- Experience (the E added in 2022): demonstrate first-hand experience with the topic. For SEO services, this means real client case data, not just theory.
- Technical expertise: mention the author’s relevant credentials. An author page with a verifiable professional track record adds authority signals.
- Trustworthiness: cite primary sources, include update dates, show verifiable contact information.
Traffic diversification as a resilience strategy
Relying exclusively on Google organic traffic is the highest risk factor in any content strategy. Sites that diversify toward newsletters, direct traffic, social media or search on alternative platforms have lower exposure to update fluctuations.
For a complete technical SEO audit of your site that evaluates not just technical status but also the quality factors influencing core updates, check our methodology.
The Spanish market and Google updates
Google updates have universal impacts, but there are particularities of the Hispanic market worth knowing.
HCU impact on Spanish-language content
The September 2023 Helpful Content Update and the core updates of 2024 especially affected Spanish-language content sites that had rapidly scaled their publication volume using generative AI without editorial review. In the Spanish-speaking market, where the volume of well-documented content is lower than in English, sites with a lack of verifiable sources and recognizable authorship were proportionally more affected.
The most impacted niches in Spain included health, personal finance and travel content, precisely the YMYL (Your Money or Your Life) categories where Google applies the strictest quality criteria.
Particularities of the local market
For sites oriented toward the Spanish market and specifically the Catalan and Barcelona market, there are two additional factors:
Domain authority and age: Google has systematically shown preference for .es domains with a long indexation history in local searches. This doesn’t mean new domains can’t rank, but the time horizon for building authority is longer.
Competition in local SEO: local service niches in Barcelona and other major cities have a high concentration of SEO-optimized pages with little content differentiation. In this context, author authority signals and content originality have a more pronounced differential impact.
Why some recoveries take longer on .es domains
Google crawls low- and medium-traffic .es domains less frequently than high-traffic .com or .co.uk domains. This means that even when corrective changes are implemented, Google may take more crawl cycles to re-evaluate them. Increasing the frequency of new content publication can serve as a signal to increase crawl rate, accelerating re-evaluation.
For local sites in Barcelona and Catalonia, optimizing the Google Business Profile and consistency of local citations are factors that contribute to overall domain authority, which indirectly impacts resilience against core updates. You can go deeper into this in our guide on SEO consulting in Barcelona.
What to do if you detect a traffic drop today
If you’ve just detected a drop and don’t know where to start, these are the immediate steps ordered by priority:
First (0-24 hours):
1. Open Google Search Console → Manual Actions. If there’s something there, you have a manual penalty and the path is clear.
2. Note the exact start date of the drop in the GSC graph.
3. Compare that date with the update timeline in this article.
4. Check Semrush Sensor to confirm whether there was high volatility on those dates.

Second (1-7 days):
5. Export the GSC performance report filtered by page. Identify which specific pages dropped.
6. Analyze the 5-10 pages that dropped the most: what do they have in common? What is different about the competitors that overtook them?
7. If the diagnosis points to a core update or HCU, start prioritizing quality improvements on the most affected pages.

Third (weeks 2-8):
8. Implement content improvements with concrete E-E-A-T criteria.
9. If you have very low-quality pages that can’t be improved, evaluate consolidating or redirecting them.
10. Document all changes with dates to be able to measure impact in the next core update cycle.
Recovery is a months-long process, not a days-long one. But the difference between sites that recover and those that don’t almost always lies in starting with the correct diagnosis, not in the speed of changes.
If you have doubts about the specific diagnosis for your site or need help designing a recovery strategy, contact our team. We analyze the situation with Search Console data and propose a concrete action plan.
Frequently Asked Questions
How often do you publish new content?
We publish new articles weekly, focused on the latest technical SEO trends, real case studies and best practices. Subscribe to our newsletter so you don’t miss any updates.
Are the tips applicable to any type of website?
Our advice adapts to different types of sites: ecommerce, blogs, corporate sites and web applications. We always indicate when a technique is specific to a certain type of site or technical requirements.
Can I implement these techniques myself?
You can implement many basic techniques yourself by following our step-by-step guides. For advanced optimizations or complete audits, we recommend consulting technical SEO specialists like our team.
Do you offer personalized consulting services?
Yes, we offer personalized technical SEO consulting, complete audits and comprehensive optimization. Contact us to discuss your project’s specific needs and how we can help.