When a website migration goes wrong, the consequences can be a devastating loss of organic traffic and revenue. But what happens when the damage isn’t immediately visible? What if Google is silently deprioritizing your content, page by page, until your traffic has evaporated?

This is the case study of how a multinational media organization lost 90% of its traffic following a domain migration, and how addressing a seemingly harmless technical issue — soft 404 errors — helped unlock suppressed traffic potential across 13 country-specific domains.

While this case study examines events from 2021–2023, the lessons learned remain timeless and directly applicable to any site facing indexing challenges today.

The catastrophic drop

In January 2022, the Brazilian localization of a cryptocurrency news website completed a domain migration. After the transition, traffic didn’t just drop — it plummeted. Comparing December 2021 to December 2022, both sessions and pageviews had fallen approximately 90% year-over-year.


According to Google Search Console data, the old domain (xx.com.br) was receiving between 15,000 and 25,000 clicks per day before migration. After migrating to the new subdomain structure (br.xx.com) in January, traffic collapsed and never recovered. It stabilized at around 2,000 to 4,000 clicks per day — a sustained loss that persisted for over a year.


The broader period also saw three major Google algorithm updates: the June 2021 core update, spam update, and page experience update. While such updates typically cause temporary volatility that resolves on its own, the Brazilian site showed no signs of recovery.


The migration problem: More than just redirects

Domain migrations typically show an initial traffic drop as Google recrawls and reassesses the site. That’s expected.

Normally, this traffic recovers within weeks or months. In this case, there were no signs of recovery.

The root cause? The old domain continued to be crawled by Google long after the migration.

According to the team’s analysis, redirects and technical migration protocols weren’t fully implemented, causing Google to split its crawl budget between two domains rather than consolidating authority on the new one.

In mid-August 2022, after addressing the migration issues with the SEO and IT teams, there was a subtle uptick — a peak of 12 clicks and 37 impressions on Aug. 29, 2022. While modest, this represented the first signs of recovery and indicated that Google was beginning to properly recognize the new domain.


Using Facebook Prophet forecasting on pre-migration data, the team estimated that without the migration issues, the Brazilian site would have exceeded 2 million monthly clicks by early 2022. Instead, it was generating a fraction of that traffic.
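Prophet fits a full time-series model with trend and seasonality terms; as a rough illustration of the underlying idea, here is a naive least-squares trend extrapolation in plain Python (the click figures are hypothetical, not the team’s data):

```python
# Naive stand-in for time-series forecasting: fit a least-squares line
# to historical daily clicks and project it forward. The team's actual
# analysis used Facebook Prophet; these numbers are illustrative only.

def linear_forecast(history, days_ahead):
    """Fit y = slope*x + intercept to daily values and extrapolate."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return [slope * (n + d) + intercept for d in range(days_ahead)]

# Hypothetical pre-migration trend: clicks growing ~100/day over 90 days
history = [15000 + 100 * day for day in range(90)]
projection = linear_forecast(history, days_ahead=30)
print(round(projection[-1]))  # projected daily clicks 30 days out
```

Comparing a projection like this against actual post-migration clicks is what quantifies the suppressed traffic, rather than just eyeballing the drop.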


Understanding the indexing bottleneck

While fixing the migration was critical, it revealed a deeper problem affecting not just Brazil, but all 13 of the site’s country domains: a massive indexing backlog.

Google’s page processing follows four stages:

  • Crawl: Google discovers and reads pages.
  • Render: Google executes the page’s code, including JavaScript, to see the final content.
  • Index: Pages wait in a queue to be stored in Google’s index.
  • Rank: Pages appear in search results with rankings.

The Brazilian site was taking an average of 2 minutes for Google to crawl new articles (an acceptable amount of time for a news site). However, indexing these articles was taking 24 hours. For time-sensitive cryptocurrency news, this delay was catastrophic. By the time the site’s articles were indexed, the news cycle had already moved on.

The scale of the site migration problem: 513,000 crawled, but not indexed, pages

In January 2023, Google Search Console revealed alarming indexing issues across all domains:

  • Crawled – currently not indexed: 513,369 pages (Brazil alone)
  • Soft 404: 1,193 pages and growing rapidly
  • Alternate page with proper canonical tag: 2,532 pages
  • Discovered – currently not indexed: 524 pages

The “Crawled – currently not indexed” issue was particularly concerning. These were pages that Google had successfully crawled but chose not to index. This typically happens when Google considers a page low-quality, duplicate, or not worth the crawl budget.


Upon investigation, the team discovered that converter pages (e.g., “/usd-to-thor?amount=250” or “/eur-to-signaturechain?amount=1000”) were being automatically generated at scale. These thin content pages were consuming Google’s crawl budget, causing it to deprioritize the entire domain.
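One way to surface this kind of auto-generated sprawl during an audit is to bucket crawled URLs by path template and see which template dominates. The regex and URLs below are hypothetical examples of that approach:

```python
import re
from collections import Counter

# Heuristic: collapse URLs into path templates so auto-generated
# patterns stand out. The converter pattern here is a hypothetical
# example modeled on pages like /usd-to-thor?amount=250.
CONVERTER = re.compile(r"^/[a-z0-9]+-to-[a-z0-9]+(\?|$)")

def url_template(path):
    if CONVERTER.match(path):
        return "/{cur}-to-{cur}"          # one bucket for all converters
    return re.sub(r"\d+", "{n}", path.split("?")[0])

crawled = [
    "/usd-to-thor?amount=250",
    "/eur-to-signaturechain?amount=1000",
    "/usd-to-btc?amount=50",
    "/news/bitcoin-hits-new-high",
]
counts = Counter(url_template(p) for p in crawled)
print(counts.most_common(1))  # the dominant template reveals the generator
```

Run against a full crawl export, a skew like this makes the case for blocking or noindexing an entire template rather than chasing individual URLs.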

The soft 404 time bomb

While fixing the migration and removing low-quality pages was important, the most insidious issue was the proliferation of soft 404 errors.

A soft 404 occurs when a page returns a 200 (success) status code but actually contains no meaningful content — essentially a “page not found” that doesn’t properly signal its emptiness to search engines. Unlike hard 404s, which clearly communicate that the page doesn’t exist, soft 404s confuse search engines and waste crawl budgets.
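As a rough sketch of how such pages can be flagged in an audit, here is a minimal heuristic check. The error phrases and length threshold are illustrative assumptions, not a production detector:

```python
# Minimal soft-404 heuristic: a response that returns HTTP 200 but whose
# body is near-empty or reads like an error page. Phrases and the length
# threshold are illustrative assumptions.

ERROR_PHRASES = ("page not found", "no results", "nothing here")

def looks_like_soft_404(status_code, body_text):
    if status_code != 200:
        return False  # a real 404/410 already signals itself correctly
    text = body_text.strip().lower()
    if len(text) < 100:  # near-empty body
        return True
    return any(phrase in text for phrase in ERROR_PHRASES)

print(looks_like_soft_404(200, "Page not found. Try searching instead."))  # True
print(looks_like_soft_404(404, ""))                                        # False
```

Feeding sampled URLs from the Search Console report through a check like this helps separate genuinely empty pages from pages whose content simply failed to render.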


The data revealed this wasn’t isolated to Brazil. Soft 404 errors were growing exponentially across multiple domains:

  • xx.com (main site): 90,400 affected pages
  • es.xx.com (Spain): 17,700 pages
  • kr.xx.com (Korea): 15,400 pages
  • fr.xx.com (France): 15,100 pages
  • de.xx.com (Germany): 8,010 pages

Specifically for France, Google Search Console data showed a direct correlation: As soft 404 errors began accumulating in October 2022, total crawl requests dropped from 60,000–70,000 per day to just 20,000–30,000 per day. Google was effectively giving up on crawling the site efficiently.

The crawl budget crisis

The concept of crawl budget is critical to understanding why soft 404s matter so much.

Search engines allocate a finite amount of resources to crawl each website. If Google wastes time crawling broken, empty, or duplicate pages, it has less capacity to discover and index your valuable content.

For news sites publishing dozens of articles daily, this creates a vicious cycle: New content doesn’t get indexed quickly, engagement drops, Google further reduces crawl budget, and the problem compounds.

In January 2023, Google was wasting significant resources crawling pages that provided no value. This meant:

  • Slower indexing of new, timely content.
  • Reduced visibility in search results.
  • Lost traffic opportunities.
  • Degraded domain authority in Google’s eyes.

The systematic fix: Addressing root causes of site migration problems

Starting Jan. 31, 2023, the team implemented a comprehensive technical SEO remediation plan focused on three priorities:

Urgent: Soft 404 resolution

The team identified the source of soft 404 errors and implemented proper HTTP status codes. Pages that truly didn’t exist began returning proper 404 or 410 status codes. Pages with content were fixed to render properly.

High priority: Crawl budget optimization

  • Removed or noindexed automatically generated currency converter pages.
  • Implemented stricter URL parameter handling.
  • Used robots.txt to block low-value URL patterns.
  • Set up proper canonicalization for variant pages.
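Robots.txt changes like these are easy to get wrong, so they are worth sanity-checking before deployment. A sketch using Python’s standard library follows; note that `urllib.robotparser` uses simple prefix matching (no Googlebot-style wildcards), so this assumes the converter pages live under a hypothetical `/convert/` path prefix:

```python
import urllib.robotparser

# Sketch: verify that proposed robots.txt rules block the low-value
# converter URLs while leaving articles crawlable. The Disallow paths
# and URLs are hypothetical.
rules = """\
User-agent: *
Disallow: /convert/
Disallow: /search
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://br.xx.com/convert/usd-to-thor?amount=250"))
print(rp.can_fetch("Googlebot", "https://br.xx.com/news/bitcoin-rally"))
```

A quick test like this catches the common failure mode of a Disallow rule that accidentally shadows legitimate article paths.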

Medium priority: Core Web Vitals

While user experience metrics were important, the team recognized that fixing indexing issues would have a more immediate impact than optimizing page speed. Core Web Vitals improvements were addressed, but not at the expense of resolving indexing bottlenecks.

The results: Dramatic recovery across all domains

Weeks after implementing the fixes, the impact was measurable:


Brazil (br.xx.com)

  • Crawled – currently not indexed: Dropped from 513,000 to 220,000 pages (57% reduction).
  • Soft 404 errors: Reduced from 1,193 to 370 pages (69% reduction).
  • Traffic recovery: Visible upward trajectory starting early 2023.

Germany (de.xx.com)

  • Indexed pages: Increased from ~150,000 to 370,748.
  • Total clicks: Rose from ~8,000/day average to a sustained 12,000–15,000/day.
  • Google Discover traffic share: Jumped from 42% to 58%.

Poland (pl.xx.com)

  • Indexed pages: Grew from ~100,000 to 135,556.
  • Total clicks: Increased significantly with multiple traffic spikes above 30,000/day.
  • Google Discover traffic share: Rose from 15% to 86%.

Spain (es.xx.com)

  • Google Discover clicks: Increased from ~450,000 to 912,721 total.
  • Traffic distribution: Discover now represents 65% of total traffic.

All domains combined


By late April 2023, soft 404 errors across all domains had dropped from a peak of approximately 120,000 pages to under 20,000 — an 83% reduction.

Most remarkably, the biggest traffic gains came from Google Discover — Google’s personalized content recommendation feed. As indexing health improved, Google began trusting the domains enough to recommend their content more aggressively to users.

The Core Web Vitals paradox

Interestingly, improvements to Core Web Vitals (page speed, interactivity, and visual stability) showed mixed results:

Desktop improvements:

  • Germany: 25.1% → 97.1% good URLs
  • Poland: 20.5% → 68.9% good URLs
  • Korea: 15% → 84.6% good URLs

Mobile challenges:

  • Brazil: 0% → 0% (no improvement)
  • Argentina: 0% → 0%
  • Thailand: 0% → 0%
  • Korea: 93.4% → 0.5% (severe regression)
  • Turkey: 94% → 0% (severe regression)

The team’s hypothesis: Core Web Vitals performance is heavily influenced by regional factors like CDN proximity, server location, network quality, and device capabilities. Countries with poor mobile infrastructure or greater server distance showed minimal improvement despite technical optimizations.

This reinforced an important lesson: Not all technical SEO issues affect all markets equally. A one-size-fits-all approach would have wasted resources by optimizing for metrics that couldn’t improve without infrastructure investment, while the real wins came from addressing indexing fundamentals.

Key technical SEO lessons

1. Indexing issues trump almost everything else

No amount of content quality, backlinks, or page speed optimization matters if Google isn’t indexing your pages. Before optimizing what’s visible, ensure your content is actually being indexed.

2. Soft 404s are silent killers

Unlike hard 404s that immediately alert you to problems, soft 404s quietly accumulate, degrading your crawl budget until you notice traffic declining. Regular monitoring of Google Search Console's “Pages” report is essential.

3. Domain migrations require exhaustive validation

The Brazilian site’s migration issues persisted for over a year. A proper migration protocol should include:

  • Complete redirect mapping verification.
  • Confirmation of old domain deindexing.
  • Search Console property setup and validation.
  • Multi-week monitoring of both old and new domains.
  • Crawl rate and indexing speed tracking.

4. Crawl budget is real for high-volume sites

For sites publishing 10+ articles daily across multiple domains, crawl budget optimization is not optional. Automatically generated pages, URL parameters, and infinite scroll implementations can quickly consume available crawl resources.

5. Regional differences demand regional solutions

Core Web Vitals data showed that Brazil, Argentina, and Thailand couldn’t achieve the same performance as European markets. Instead of forcing uniform standards, prioritize fixes tailored to each market that can actually succeed.

6. Google Discover is increasingly critical

For news and timely content publishers, Google Discover accounts for a substantial share of traffic in some markets. But Discover only promotes content from sites Google trusts — and technical issues like soft 404s directly erode that trust.

Practical site migration implementation guide

For teams facing similar challenges, here’s a systematic approach:

Weeks 1-2: Audit and prioritize

  • Access Google Search Console for all properties.
  • Export “Page indexing” reports for all domains.
  • Identify the scale of each issue category.
  • Calculate the trend (growing, stable, or declining).
  • Prioritize based on issue volume and growth rate.
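The trend calculation in the steps above can be sketched from an exported report. The column name and figures here are hypothetical; real Search Console exports vary by report and locale:

```python
import csv
import io

# Sketch: estimate whether an issue category is growing from a monthly
# export of affected-page counts. Header and values are hypothetical.
export = io.StringIO("""\
date,soft_404_pages
2022-10-01,1200
2022-11-01,4800
2022-12-01,19000
2023-01-01,90400
""")

rows = list(csv.DictReader(export))
first = int(rows[0]["soft_404_pages"])
last = int(rows[-1]["soft_404_pages"])
growth = last / first
print(f"{growth:.0f}x growth over {len(rows) - 1} months")
```

A growth multiple like this, rather than the raw count, is what should drive prioritization: a small but fast-growing issue can matter more than a large static one.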

Weeks 3-4: Fix soft 404s

  • Sample 20–30 URLs from the soft 404 report.
  • Identify common patterns (empty pages, broken functionality, etc.).
  • Implement proper HTTP status codes (404, 410, or fix the content).
  • Validate fixes in Google Search Console.
  • Monitor for reduction in affected pages.

Weeks 5-8: Address crawled but not indexed

  • Analyze URLs to identify auto-generated content.
  • Implement robots.txt rules or noindex tags for low-value pages.
  • Review and strengthen internal linking to important pages.
  • Ensure proper canonicalization across variants.
  • Request reindexing via Search Console for key pages.
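Canonicalization across variants can be spot-checked programmatically. A minimal sketch using the standard-library HTML parser (the URLs are hypothetical):

```python
from html.parser import HTMLParser

# Sketch: extract the rel="canonical" target from a page's HTML so
# variant pages can be verified in bulk. URLs are hypothetical.
class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

html = '<head><link rel="canonical" href="https://br.xx.com/usd-to-btc"></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)
```

Running a check like this over a sample of parameterized variant URLs confirms they all point at the primary page rather than at themselves.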

Weeks 9-12: Monitor and optimize

  • Track indexing coverage weekly.
  • Monitor crawl rate changes in Search Console.
  • Measure organic traffic recovery.
  • Identify remaining outlier issues.
  • Document learnings for future migrations.

Calculating the traffic loss from migration issues

How significant was this suppressed traffic opportunity?

According to Facebook Prophet forecasting based on pre-migration data, the Brazilian site was trending toward 20,000+ daily clicks. At the time of fix implementation in early 2023, it was receiving approximately 5,000–7,000 daily clicks. This represented roughly 65–75% of potential traffic being suppressed — or conversely, the site was only achieving 25–35% of its forecasted potential.
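The suppression math follows directly from those figures:

```python
# Reproducing the suppression estimate from the figures above.
forecast_daily = 20000                     # Prophet-projected daily clicks
observed_low, observed_high = 5000, 7000   # actual daily clicks, early 2023

suppressed_high = 1 - observed_low / forecast_daily   # worst case
suppressed_low = 1 - observed_high / forecast_daily   # best case
print(f"{suppressed_low:.0%}-{suppressed_high:.0%} of potential traffic suppressed")
```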

More broadly, across all 13 domains, the soft 404 and indexing issues prevented approximately 500,000 pages from being indexed. Given average click-through rates for indexed pages, this represented millions of potential monthly impressions and hundreds of thousands of potential clicks being left on the table.


Technical debt compounds

The most important lesson from this case study is that technical SEO issues don’t stay static — they compound. What starts as a few hundred soft 404s becomes thousands, then tens of thousands.

Google’s response isn’t immediate punishment, but gradual deprioritization. Traffic doesn’t crash overnight; it bleeds slowly.

For the Brazilian site, it took over a year to recognize the full scope of the problem. During that year, competitors filled the gap, topical authority eroded, and recovery became exponentially harder.

The good news? Once identified and systematically addressed, these issues are fixable. Within 12 weeks of implementing the remediation plan, every domain showed measurable improvement. Some saw traffic double or triple.

Technical SEO is often seen as unglamorous maintenance work. But as this case demonstrates, it’s the foundation upon which all other optimization rests. Before worrying about AI-generated content, E-E-A-T signals, or the latest algorithm update, ensure Google can actually find, crawl, and index your content.

Because the best content in the world is worthless if it’s trapped outside search engine indexes.

Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. Search Engine Land is owned by Semrush. Contributor was not asked to make any direct or indirect mentions of Semrush. The opinions they express are their own.