Google Just Killed &num=100: What SEOs Need to Know

You know that trick SEOs have been using forever? Adding &num=100 to a Google search URL so you can get 100 results at once, instead of clicking “Next Page” nine times?

Well, Google has quietly pulled the plug on that convenience. Now, when you try it, sometimes it works and sometimes it doesn’t. Basically, it’s a mess.

If you manage SEO for e-commerce clients, you might already see weird drops or jumps in metrics, especially in Google Search Console (GSC) and rank-tracking tools.

Before your panic button gets stuck, let’s discuss what’s happening, why it matters, and what you can do to avoid looking foolish in front of clients (again).

Article Summary

  • Google has quietly disabled (or is in the process of disabling) the &num=100 parameter, so you can no longer reliably get 100 results in a single query.
  • Rank-tracking tools, SERP scrapers, and SEO dashboards that relied on it now need roughly 10× more requests to fetch the same data.
  • Google Search Console reports are showing sharp drops in desktop impressions, and average position is improving artificially, because deeper results (e.g., positions 50-100) are no longer consistently fetched or counted.
  • E-commerce sites that monitor long-tail keywords, product category ranks, or positions beyond the top 20 will be hit hardest. Their tools will cost more or deliver less depth.
  • Best move now: adjust expectations, communicate the change to clients, focus more on top-of-SERP (top 20 or so) data, maybe prune what you track so you don’t waste resources chasing ghost data.

What Exactly Happened with &num=100?

The Parameter and Its Disappearance

Until recently, adding &num=100 in a Google search URL would tell Google to return 100 search results in one go. It was a useful hack (some might call it “scraping-friendly”) used by tools, SEOs, and power users who needed full visibility (up to the 100th position) without navigating through multiple pages.

Around September 10-14, 2025, people started observing that &num=100 stopped working reliably. Sometimes you’d get only partial results, two pages, or nothing beyond page 2. It appears Google made this change globally, not just for specific locales.
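
For reference, this is what the trick looked like in practice versus what a plain search URL returns by default (the query here is just an example):

```
# One request, up to 100 organic results (the old behavior)
https://www.google.com/search?q=running+shoes&num=100

# Without the parameter: the default 10 results per page
https://www.google.com/search?q=running+shoes
```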

Why Google Might Be Doing This

We don’t have an official grand statement from Google yet (surprise, surprise), but the SEO community has some good theories:

  • To reduce scraping/automated bulk queries hitting their SERPs. Too many tools drifted into very heavy scraping usage via this parameter.
  • To push people/tools into paginated fetching (i.e., lots of smaller requests) rather than one big one, which gives Google more control over load, rate limits, and maybe filtering.
  • To improve the integrity of performance data: many of the “impressions” in past GSC reports may have been inflated by bots or tool scrapes that used &num=100. If Google filters those out, you’ll see “drops” in impressions as those ghost views vanish.

How E-Commerce SEO Is Getting Whacked (And Why It’s Extra Messy)

If you’re doing SEO for product category pages, long-tail product variations, or lots of SKUs, this change stings more. Here’s why:

You Track More Keywords, Deeper Positions

E-commerce sites often monitor not only top keywords but also mid- and long-tail ones: product model numbers, variant combinations, and category-plus-brand queries.

Many of those live outside the top 10 or 20. Without reliable results beyond page 1 or 2, those rankings become invisible. Clients will say, “I haven’t moved” or “we lost impressions,” even though the issue is tool visibility, not actual ranking.

Reporting Looks Worse, Even If Performance Isn’t

Because GSC is dropping lots of “impressions” that came from those deeper positions or bot-loaded results, you’ll see a sudden drop in reported impressions.

Also, the average position will look better (because the worst positions aren’t being reported). That sounds good, but it’s misleading. For e-commerce, where many product pages are naturally going to be lower down until optimized, this can give a false sense of “things are improving” when really you’re just losing data.
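
Here’s a tiny, made-up illustration of that averaging artifact (the positions are hypothetical):

```python
# Hypothetical positions for the same five keywords, before and after
# deep results (beyond ~20) stop being recorded. No ranking actually changed.
positions_before = [3, 8, 14, 55, 90]
positions_after = [p for p in positions_before if p <= 20]

avg_before = sum(positions_before) / len(positions_before)  # 34.0
avg_after = sum(positions_after) / len(positions_after)     # ~8.3

print(f"Average position before: {avg_before:.1f}")
print(f"Average position after:  {avg_after:.1f}")  # "better", but nothing improved
```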

Cost and Time Blowup in Tooling + Scraping

Because tools now need to fetch 10 pages (page 1 through page 10) separately to cover what &num=100 used to give in one go, everything costs more: proxy costs, server time, maintenance, API limits. Smaller agencies or internal teams may struggle with budgets. Clients may see tool-price hikes or less frequent updates.
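
To make the 10× blowup concrete, here’s a rough sketch of the request counts involved, just building the URLs a tracker would need (actually fetching Google SERPs at scale is a separate question, and subject to Google’s Terms of Service):

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"
keyword = "wireless earbuds"  # example keyword

# Before: a single request could cover positions 1-100
old_style = [f"{BASE}?{urlencode({'q': keyword, 'num': 100})}"]

# Now: ten paginated requests (10 results each) for the same depth
new_style = [
    f"{BASE}?{urlencode({'q': keyword, 'start': offset})}"
    for offset in range(0, 100, 10)
]

print(len(old_style), "request before vs.", len(new_style), "requests now")
# Multiply that by every keyword, location, and device you track, every day.
```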

Competitive Insights Get Blurry

E-commerce competition frequently involves spying on competitor product positions deep into SERPs (for example, to find new category keyword opportunities). If you can’t reliably pull data beyond the top few pages, you’ll lose early warning signs of rising competitors or emerging gaps.

What SEOs and Ecom Managers Should Do Now

Here are our tips so you don’t fall into the ravine with clients crying for explanations.

Audit Your Data and Reset Baselines

First thing: check what has changed since early September. Look at your historical GSC data and your rank-tracking reports, and mark September 10-14, 2025, as a major “data discontinuity.” If you see desktop impressions halved (or dropped significantly) while average position improves, that’s likely due to this change rather than a real loss of traffic.
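
If you want a quick sanity check on the size of the discontinuity, a short script like this works against a daily GSC performance export. The file name and the “Date”/“Impressions” column names are assumptions; adjust them to match your export:

```python
import pandas as pd

# Assumed: a daily desktop performance export from GSC with "Date" and
# "Impressions" columns (the file name is a placeholder).
df = pd.read_csv("gsc_desktop_daily.csv", parse_dates=["Date"])

before = df[df["Date"] < "2025-09-10"]["Impressions"].mean()
after = df[df["Date"] > "2025-09-14"]["Impressions"].mean()

print(f"Avg daily impressions before Sep 10: {before:,.0f}")
print(f"Avg daily impressions after Sep 14:  {after:,.0f}")
print(f"Change: {(after - before) / before:+.1%}")
```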

Let clients know: “Hey, some of these changes are tool/reporting noise, not a performance collapse.” Build a communication plan so expectations are reset.

Re-focus on What Actually Moves the Needle

If you were tracking 100+ positions for “nice to know” insights, deprioritize them for now. Focus on:

  • Keywords already in the top 20-30 (where impression/CTR potential is higher).
  • Product pages with conversion potential (don’t chase vanity keywords too deep).
  • Category pages, core SKUs, and any keyword that drives actual ROI.

Adjust Tooling and Workflow

Talk to your tool providers: many are already working on pagination workarounds. Make sure you know:

  • How often they’ll fetch updates (frequency may drop).
  • Which positions are now reliable, which are semi-reliable.
  • What new costs might appear. You might need to adjust the budget or expectations for clients.

Also, if you or your team does custom scraping, consider moving to more sustainable/lighter-weight methods (with careful respect to Google’s Terms of Service, IP usage, etc.).
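
One lighter-weight option is to pull position data for your own properties from the Search Console API rather than scraping SERPs. Here’s a minimal sketch, assuming google-api-python-client is installed, a service account has been granted access to the property, and the file path, property URL, and dates are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file; the service account must be added as a user
# on the Search Console property for this to work.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example-store.com/",  # placeholder property
    body={
        "startDate": "2025-09-15",
        "endDate": "2025-09-30",
        "dimensions": ["query", "page"],
        "rowLimit": 1000,
    },
).execute()

for row in response.get("rows", []):
    query, page = row["keys"]
    print(query, page, round(row["position"], 1), row["impressions"])
```

It won’t show you competitor positions, but for your own sites it’s first-party data and doesn’t burn proxy budget.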

Explain It to Clients in Plain English

Your non-SEO-savvy clients won’t care about &num=100 etc. They care about impressions, clicks, conversions, and revenue. So tell them something like:

“We noticed a drop in impressions in Search Console, etc., but we believe part of that is due to Google stopping some background data we used to see (especially lower-position or tool-scraping-driven impressions). Your actual visibility in terms of customer reach has NOT necessarily dropped in the same way.”

Set expectations: the new reporting normal may show fewer impressions and “better” average-position stats, but that’s partly a reporting artifact, not a real change in performance.

Keep an Eye on Google Statements

As of now, Google hasn’t officially explained everything. Monitor official blogs, tool-provider updates (Semrush, AccuRanker, etc.), Search Engine Roundtable, etc. Sometimes there are late-breaking clarifications or relief.

Big Picture: What This Change Might Mean Long Term

A few predictions and hopes:

  • We may see higher costs for deep SERP tracking in all established tools. Expect price rises or reduced depth/frequency in “cheap” tiers.
  • SEO strategy may shift even further toward top 10/page 1 rankings and high-impact keywords, rather than a broad spread of lower-ranking ones.
  • Clients and SEOs will value conversion alignment even more: data isn’t just how high you rank but whether people see you, click, and buy. Visibility = leads, not vanity.
  • Tools might build smarter pagination and caching or rely on other signals (e.g., Search Console + smaller SERP scrapes) rather than brute force.
  • Google is possibly exploring ways to limit scraping because those tools also help power bots/LLMs that reuse their content. This could be part of a broader trend to “make SERP access more controlled.”

So yes, this change is annoying (massively). If you’re in e-commerce, your reporting will probably look uglier for a while.

But don’t let it throw you off course. The fundamentals of good SEO — strong product content, keyword targeting, user experience, and conversion focus — still matter more than scraping every possible SERP position.

This is one of those moments when being smart about what you measure, what you optimize for, and how you report wins will separate the good from the great.

Want to know how your site stacks up in the AI era?

Website SEO Grader

Run your site through the SEO Sherpa Grader — it’s free, fast, and brutally honest. You’ll get an instant score plus tailored recommendations to climb higher in search (and maybe even seed a few LLMs along the way).
