Anyone in the industry knows SEO is always changing, but the past few months have been particularly eventful. After the August update caused noticeable fluctuations for many websites, September has brought an even bigger shake-up.
Experts and members of the SEO community are reporting major changes in Google’s search results pages, and if you’re regularly checking your rankings, you’ve probably noticed some unusual swings in your metrics too.
So, what’s going on? Google has quietly rolled out changes that directly impact the pagination of search results. This means rank tracking tools can no longer access extended result pages the way they used to, creating significant gaps and inconsistencies in the data.
If you use Google Search Console, you might have seen sharp drops in impressions alongside sudden, seemingly inexplicable improvements in average position (a combination that doesn’t make sense under normal circumstances).
That’s a big change, and one you’ll need to adapt to fast.
We’ll walk you through exactly what changed, how it impacts your ability to monitor search engine rankings, and most importantly, the steps you can take to adapt your tracking methods moving forward.
Key Takeaways:
- The &num=100 parameter removal disrupted traditional rank tracking, forcing 10x more requests and higher operational costs.
- Scraper bots had artificially inflated impression metrics, explaining the disconnect between impressions and clicks.
- SEO teams must recalibrate strategies, focus on business-critical keywords, and diversify tracking methods.
- Major SEO platforms are updating tools and workflows, which may result in higher subscription costs.
- This change marks the start of enhanced anti-scraping measures, pushing the SEO industry toward more sophisticated and accurate tracking.
Google Alters SERP Pagination, Breaking Rank Tracking Norms
In September 2025, Google removed support for the &num=100 URL parameter, which let you view the first 100 results for any search query (rather than the standard 10 per page). It was an invaluable shortcut for monitoring search rankings quickly.
Previously, you simply had to append “&num=100” to a Google search URL, and a single request would return 100 results. One query = all the data you needed. It was efficient, fast, and affordable.
The Problem Now?
Adding this parameter to Google search URLs doesn’t show extended results anymore. Rank tracking tools that used to get data in one query now need to make 10 separate requests for the same information. In turn, SEO platforms are reporting higher operational costs, slower data collection, and increased complexity for users trying to monitor search engine rankings.
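To picture why costs jumped roughly tenfold, here’s a quick Python sketch of the request pattern, purely for illustration: it only builds URLs using the standard q, num, and start parameters and doesn’t attempt any actual collection, which remains subject to Google’s terms and anti-bot measures.

```python
# Illustration only: the request pattern a rank tracker needs before vs. after
# the change. This just builds URLs; it does not fetch anything.
from urllib.parse import urlencode

def old_style_url(query: str) -> str:
    """One request that used to return the top 100 results (no longer honoured)."""
    return "https://www.google.com/search?" + urlencode({"q": query, "num": 100})

def new_style_urls(query: str, depth: int = 100, per_page: int = 10) -> list[str]:
    """Ten paginated requests are now needed to cover the same 100 positions."""
    return [
        "https://www.google.com/search?" + urlencode({"q": query, "start": offset})
        for offset in range(0, depth, per_page)
    ]

print(old_style_url("rank tracking tools"))        # 1 request before the change
print(len(new_style_urls("rank tracking tools")))  # 10 requests now
```

Ten times the requests means ten times the infrastructure, which is exactly the cost pressure tool providers are now passing on.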
SEO Community Reports Disruptions in Rank Tracking
As is so often the case with Google updates, there was no warning for the SEO community.
During the initial rollout, the community reported intermittent functionality. The parameter worked roughly half the time, suggesting Google was testing the change before full implementation.
Brodie Clark (@brodieseo) flagged similar changes on X and LinkedIn, reporting “a noticeable decline in desktop impressions, resulting in a sharp increase in average position.”
Source: https://x.com/brodieseo/status/1967220707798270430
Following this initial alert, SEO professionals across different accounts and locations began testing and confirming the change too.
👉 Quick Response from Industry: Several rank tracking services issued incident reports to their clients, acknowledging the disruptions and promising platform updates.
So, What Happened to Your Metrics?
Since this change, desktop impressions have dropped sharply, while average position metrics have shown misleading improvements. (Losing impressions from deep in the results, where positions are worst, makes your average position look better even if actual rankings haven’t moved.)
Tools that check search engine rankings are now showing inconsistent results and visible gaps in SERP screenshots, affecting entire rank tracking ecosystems. Systems that depended on efficient, single-request data collection are now slower, more expensive, and harder to manage.
How to Make Sense of It
These metric changes are reporting anomalies rather than genuine ranking shifts. Google calculates position as an average across all impressions, and natural ranking fluctuations mean steady trends over time matter more than daily variations.
- Compare data to actual traffic, not just positions: If Search Console shows your average position improving but traffic stayed flat or dropped, the metric is likely misleading. Your real performance is in the visits, clicks, and conversions, not just the reported ranking number.
- Establish new baselines week-over-week: Since the change started around September 10, measure performance relative to this date. Look at trends over multiple weeks instead of day-to-day spikes, so you can see the true direction of your SEO efforts (a rough way to run this comparison is sketched after this list).
- Analyse individual queries: Don’t rely on overall averages. Look at how each keyword is performing separately to get a clearer picture of what’s happening. Some queries may still be performing well, while others may appear to drop artificially.
- Focus on consistent traffic patterns: Instead of panicking over sudden spikes or drops in impressions, pay attention to steady traffic trends. Consistency is more important than daily fluctuations when it comes to SEO performance.
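If you want to run this comparison on your own property, here’s a minimal sketch against the Search Console API that pulls per-query metrics for a pre- and post-change window and flags queries whose position “improved” without any gain in clicks. The property URL, date ranges, and row limit are placeholders, and the OAuth setup behind creds is assumed to already exist.

```python
# A minimal sketch using the Search Console API (google-api-python-client).
# SITE_URL, the date windows, and the row limit are placeholders.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # a property you have access to

def query_window(service, start_date: str, end_date: str, row_limit: int = 250):
    """Pull per-query clicks, impressions, and position for one date window."""
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query"],
        "rowLimit": row_limit,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    return {row["keys"][0]: row for row in response.get("rows", [])}

def flag_suspect_improvements(service):
    before = query_window(service, "2025-08-11", "2025-09-09")  # pre-change baseline
    after = query_window(service, "2025-09-10", "2025-10-08")   # post-change window
    for query, row in after.items():
        old = before.get(query)
        if old is None:
            continue
        # A "better" position with no extra clicks is likely a reporting artefact.
        if row["position"] < old["position"] and row["clicks"] <= old["clicks"]:
            print(f"{query!r}: position {old['position']:.1f} -> {row['position']:.1f}, "
                  f"clicks {old['clicks']} -> {row['clicks']}")

# service = build("searchconsole", "v1", credentials=creds)
# flag_suspect_improvements(service)
```

Queries flagged this way are the ones where a “better” average position is most likely just the reporting artefact described above, rather than a genuine win.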
Scraper Bots Blamed for Inflated Impressions
The recent drop in Google Search Console impressions has revealed an unexpected culprit: scraper bots. These automated programs, widely used in SEO, collect data from websites and search results for competitive analysis, market research, or content aggregation.
While scraper bots can provide valuable insights, like competitor rankings, ad strategies, and content performance, aggressive scraping can also cause serious problems:
- Server Strain: Excessive scraping can overload servers and harm site performance
- Content Duplication: Scraped content can be republished elsewhere, hurting search rankings
- Stolen Traffic: Some bots redirect organic traffic or misuse pricing and inventory data
- Skewed Analytics: Artificial traffic from bots can distort metrics, creating misleading performance insights
Why This Explains Inflated Impressions
The timing of the recent drop in impressions suggests that much of the previous traffic recorded in Search Console may have been bot activity.
Desktop impressions, the environment most rank trackers emulate, declined by over 200,000 daily impressions for some sites immediately after the parameter was disabled. These bots had been inflating impression metrics without ever generating clicks, contributing to the so-called “great decoupling” between impressions and real user engagement.
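One rough way to gauge how much of your own impression volume was bot-driven is to compare click-through rate before and after the cutoff. The sketch below assumes a daily performance CSV exported from Search Console; the filename and column names are placeholders.

```python
# A rough sketch, assuming a daily performance CSV exported from Search Console
# with Date, Clicks, and Impressions columns (filename and headers may differ).
import pandas as pd

df = pd.read_csv("gsc_daily_performance.csv", parse_dates=["Date"])
cutoff = pd.Timestamp("2025-09-10")  # approximate date the parameter stopped working

for label, window in [("before", df[df["Date"] < cutoff]), ("after", df[df["Date"] >= cutoff])]:
    clicks = window["Clicks"].sum()
    impressions = window["Impressions"].sum()
    ctr = clicks / impressions if impressions else 0.0
    print(f"{label}: {impressions:,} impressions, {clicks:,} clicks, CTR {ctr:.2%}")

# If clicks hold steady while impressions fall and CTR jumps, the "lost"
# impressions were probably never human to begin with.
```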
Google’s Silence and Strategic Intent
Google hasn’t officially confirmed much about this change. A spokesperson simply noted:
“The use of this URL parameter is not something that we formally support.”
Experts believe the update is part of a strategic move to curb excessive scraping, particularly as AI tools increasingly harvest search data. It’s unclear if the main goal was to block bot traffic from metrics or if it was simply a side effect of limiting scraper access.
SEO Tools Scramble to Adapt to New SERP Structure
Google’s removal of the &num=100 parameter has undoubtedly sent shockwaves through the SEO industry almost overnight. Major platforms are now racing to rebuild their data collection systems, forcing every rank tracking provider (and SEO strategist) to rethink how they gather search engine rankings.
Which Tools Are Affected…and How
Keyword Insights was among the first to publicly address the disruption. Their rankings module experienced immediate impacts. Semrush has begun adapting its API infrastructure to manage the new requirements, while AccuRanker is focusing on SERP feature tracking to offset the limitations.
Any tool that relied on bulk data collection now faces slower processing times and significantly higher operational costs. That, in turn, can affect the accuracy, timeliness, and reliability of rank tracking data, drive up subscription prices, and limit how efficiently large keyword sets can be monitored.
What This Means for Your Rank Tracking
Google’s parameter removal hasn’t just disrupted SEO tools; it’s forced SEO professionals and site owners to adapt just as quickly and strategically.
If you rely on rank tracking, now is the time to check in with your provider. Some tools continue functioning using pagination workarounds, while others are showing noticeable data gaps.
Be aware that subscription pricing may change as platforms adjust to higher data acquisition costs. Tools that adapt quickly and efficiently are likely to emerge stronger, while others may struggle with the technical and financial demands.
Actionable Steps
- Assess your rank-tracking provider to understand their adaptation timeline and reliability.
- Consider diversifying your tracking approach across multiple platforms to reduce dependency on a single tool.
- Monitor your provider’s performance during this transition period to ensure your SEO data remains accurate and actionable.
Prepare for More Anti-Scraping Measures
Google’s recent changes are likely just the beginning of a broader push to limit scraping activity. SEO teams should expect more updates designed to protect search data from automated collection.
The search giant now relies more on JavaScript-rendered results, making traditional HTML scrapers obsolete. Google has also stepped up enforcement against automated access. This includes:
- IP blocks that prevent repeated requests from the same address
- CAPTCHAs that stop bots from accessing pages
- Advanced anti-bot technology that can detect and block automated scraping behaviour
💡Pro tip: Build redundancy into your tracking systems now, before the next wave of changes hits. Diversifying tools and adapting to new technologies ensures your SEO insights remain accurate and actionable, even as Google continues to evolve.
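As a small example of that redundancy, a simple cross-check between two providers can surface keywords where the tools disagree badly enough to deserve a manual look. The file formats below are hypothetical; adjust them to whatever your tools actually export.

```python
# A minimal cross-check between two tracking sources. The file names and the
# keyword/position columns are hypothetical; adapt them to your providers' exports.
import csv

def load_positions(path: str) -> dict[str, float]:
    with open(path, newline="") as f:
        return {row["keyword"]: float(row["position"]) for row in csv.DictReader(f)}

tool_a = load_positions("tool_a_rankings.csv")
tool_b = load_positions("tool_b_rankings.csv")

THRESHOLD = 5  # flag keywords where the two tools disagree by five or more positions
for keyword in sorted(tool_a.keys() & tool_b.keys()):
    if abs(tool_a[keyword] - tool_b[keyword]) >= THRESHOLD:
        print(f"{keyword}: tool A = {tool_a[keyword]:.0f}, tool B = {tool_b[keyword]:.0f}")
```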
TL;DR: Google Overhauls SERP Data, and SEO Tracking Must Adapt for a Smarter Future
Google’s removal of the &num=100 parameter has dramatically changed SEO tracking, forcing tools to make 10x more requests and creating unusual, misleading ranking metrics. The disruption revealed how scraper bots previously inflated impressions and explained many of the discrepancies between clicks and impressions.
SEO teams are responding by recalibrating strategies, focusing on high-impact keywords, using Google Search Console as a primary source, and diversifying tracking methods. Platforms and tools are adapting with workarounds, though operational costs and subscription prices may rise.
The key takeaway: Don’t rely on a single tool or data source. By diversifying tracking and focusing on meaningful trends, SEO professionals can maintain accurate, actionable insights in an evolving search landscape.
At Digital Nomads HQ, we don’t just react, we stay ahead. We are continuously monitoring fluctuations, cross-referencing multiple data sources, and recalibrating strategies in real time. We focus on the metrics that matter, ensuring insights remain accurate and aligned with your business.