Executive Summary

We analyzed 49 months of US browser market share data to answer one question: Can you detect bots by looking at browser version age? Short answer: yes, for some browsers and version patterns — but with important caveats.

3.6%

Edge 87 bot campaign. In January 2024, a single 3-year-old Edge version suddenly accounted for 3.6% of all US web traffic, then declined to near-zero within weeks. A characteristic bot-campaign signature.

0.8%

Firefox 11 — ~130 versions behind current. Firefox 11 accounts for ~0.8% of all US traffic and is still growing. With a version lag of ~130 major versions, no plausible update path or legitimate enterprise use case explains its presence. The traffic pattern is consistent with an automated fleet.

1.1%

Firefox 118 disappeared overnight. Firefox 118 ran at ~1% of all US traffic for over 2 years, then dropped to exactly 0% in January 2026 — a pattern strongly consistent with automated traffic being shut down or migrating to a different version.

29.6%

Chrome October 2025 anomaly. Nearly 30% of all US traffic came from Chrome versions 12+ versions behind current — 3× the annual average. Likely a mix of enterprise users and bot activity.

96%

Edge users are nearly always current. 96% of Edge traffic runs within 2 versions of the latest stable. Blocking old Edge versions is very low-risk for false positives.

88%

Old Firefox traffic is dominated by 3 versions. In 2025-26, just 3 old Firefox versions explain 88% of all old-Firefox traffic — a strong sign of bots using hard-coded user-agent strings.

39%

Safari lag rules will backfire. Using the same approach as Chrome, ~39% of Safari traffic appears "old" — but most of these are legitimate iOS 18 users on current iPhones. Don't apply lag rules to Safari.

16%

Chrome old-tail has approximately doubled (within Chrome). Old Chrome versions (lag ≥ 12) went from ~8.7% of Chrome's own traffic in 2022 to ~16.0% in 2025-26. As a fraction of all US web traffic, that is ~5.0% rising to ~9.7%.

109

Chrome 109: legitimate Windows 7 users. Chrome 109 is the last version that runs on Windows 7 and 8. It maintains ~0.5–1.0% of all US traffic. Do not hard-block Chrome 109 without other signals.

115–128

Firefox ESR users appear "old" but are not bots. Firefox's long-term support (ESR) channel keeps enterprise users on versions 12–25 behind current. ESR traffic (~0.1–0.4% of all traffic depending on the ESR version) is legitimate. Firefox 11 (0.5–0.8%) is not — it is far too old to be any ESR version.

⚠ Important caveat All findings are based on StatCounter aggregate data, which includes both human and bot traffic. We cannot prove any traffic is a bot from this data alone — we infer it from patterns (sudden spikes, extreme version age, concentration on specific versions) that are inconsistent with normal human browser update behavior. Server-side signals (request rate, missing headers, IP reputation) are required for production deployments.

Understanding "Version Lag"

Modern browsers like Chrome, Edge, and Firefox release a new major version roughly every 4 weeks. That means:

Version lag ≈ months behind A browser that is 12 versions behind current is approximately 12 months (1 year) old. A browser that is 30 versions behind is approximately 2.5 years old. (Safari is a special case — see the Safari section.)

We define "version lag" as: current version − this user's version. For example, if Chrome 141 is the current stable and a visitor uses Chrome 109, their lag is 32 — meaning they're running a version approximately 2.5 years old.
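In code, the lag computation and the months-behind rule of thumb look like this (a minimal sketch; function names are ours, not from the analysis pipeline):

```python
def version_lag(current_major: int, user_major: int) -> int:
    """Versions behind current stable (clamped to 0 for pre-release visitors)."""
    return max(0, current_major - user_major)

def approx_age_months(lag: int, weeks_per_release: float = 4.0) -> float:
    """Convert a version lag to approximate age, assuming a ~4-week cadence."""
    return lag * weeks_per_release * 12 / 52  # weeks -> months

# The example from the text: Chrome 109 when Chrome 141 is current.
lag = version_lag(141, 109)            # 32 versions behind
age_years = approx_age_months(lag) / 12  # roughly 2.5 years
```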

Chrome Version Cadence (2025)

To make this concrete, here's what the "current" Chrome version looked like each month in 2025. "Current Chrome" here means the single version with the highest share that month (the statistical mode) — a slightly different measure from the 99.5th-percentile reference used in the lag charts, but useful for building intuition about the version timeline.

| Month | Current Chrome (mode version) | Share of All US Traffic | Lag 12 Threshold (version ≤) |
|---|---|---|---|
| Jan 2025 | Chrome 131 | 31.4% | Chrome 119 and older |
| Feb 2025 | Chrome 132 | 20.3% | Chrome 120 and older |
| Mar 2025 | Chrome 134 | 21.6% | Chrome 122 and older |
| Apr 2025 | Chrome 135 | 22.3% | Chrome 123 and older |
| May 2025 | Chrome 136 | 22.9% | Chrome 124 and older |
| Jun 2025 | Chrome 137 | 27.4% | Chrome 125 and older |
| Jul 2025 | Chrome 138 | 31.7% | Chrome 126 and older |
| Aug 2025 | Chrome 138 | 25.5% | Chrome 126 and older |
| Sep 2025 | Chrome 140 | 19.6% | Chrome 128 and older |
| Oct 2025 | Chrome 141 | 19.0% | Chrome 129 and older |
| Nov 2025 | Chrome 142 | 23.8% | Chrome 130 and older |
| Dec 2025 | Chrome 143 | 19.1% | Chrome 131 and older |
| Jan 2026 | Chrome 143 | 23.7% | Chrome 131 and older |

The takeaway: "lag 12" means roughly "version released more than a year ago." But the right threshold depends on how aggressively the browser auto-updates — and whether legitimate enterprise users might be pinned to older versions. Read on for browser-by-browser guidance.

Why Not Just Block Anything Old?

The risk is blocking real people. Some legitimate reasons a user might be on an older browser version:

  • Enterprise IT policies that freeze or delay browser updates (both Chrome and Edge support this)
  • Firefox's ESR channel, which intentionally keeps enterprise users 12–25 versions behind current
  • Chrome 109 on Windows 7/8 machines, where no newer version can install
  • Devices that are rarely restarted, so a downloaded update never gets applied

✓ The signal to look for Bots often use a single hard-coded version string that never changes. So the red flag isn't just "old version" — it's a very old specific version that shows up repeatedly and in bulk. Edge 87 appearing at 3.6% of all traffic in one month (while prior Edge 87 traffic was under 0.1%) is the kind of anomaly that is strongly indicative of a bot campaign.

Browser Market Share Overview

Our data covers 49 months across four periods. Chrome dominates US traffic, followed by Safari, Edge, and Firefox. A few notable shifts from 2022 to 2025-26:

| Browser | 2022 Avg Share (% all US traffic) | 2025-26 Avg Share (% all US traffic) | Change (pp) |
|---|---|---|---|
| Chrome | 57.86% | 59.31% | +1.45 pp |
| Edge | 13.49% | 13.07% | -0.42 pp |
| Safari | 16.35% | 11.33% | -5.02 pp |
| Firefox | 6.40% | 6.54% | +0.14 pp |
| Chrome for Android | 1.90% | 4.01% | +2.11 pp |
| Brave | 0.00% | 2.73% | +2.73 pp |
| Opera | 1.26% | 1.39% | +0.13 pp |
| Other | 0.85% | 0.78% | -0.07 pp |
| Mozilla (generic) | 0.39% | 0.37% | -0.02 pp |
| IE | 1.28% | 0.21% | -1.07 pp |
| Safari (WebKit UA) | 0.17% | 0.19% | +0.02 pp |
| 360 Safe Browser | 0.05% | 0.06% | +0.01 pp |
| SeaMonkey | 0.00% | 0.01% | +0.01 pp |

Key notes: IE fell from 1.28% to 0.21% — IE 11 officially ended support in June 2022, so most remaining IE traffic is non-human. Brave grew from near-zero to 2.73%, and Chrome for Android doubled its share (likely reflecting more mobile measurement).

Monthly browser family share, 2022

Figure 1 — Browser family share, January–December 2022. Y-axis: share of all measured US web traffic (%). X-axis: month. Each line is one browser family (desktop and mobile combined for Chrome). Chrome led at ~58%, Safari at ~16%, Edge at ~13%. Within Chrome and Edge, the rapid succession of individual version releases appears as closely spaced peaks at the ~4-week release cadence.

Monthly browser family share, 2025-26

Figure 2 — Browser family share, January 2025–January 2026. Y-axis: share of all measured US web traffic (%). X-axis: month. Chrome holds steady at ~59%. Safari fell to ~11% (down approximately 5 percentage points from 2022). Brave emerged at ~2.7%. Chrome for Android approximately doubled its reported share. Edge and Firefox remained approximately flat.

Bot Signals in the Data

Three patterns in the data are strongly indicative of bot activity rather than human browser use. They share a common signature: specific old versions appearing in bulk, often with sudden activation and cessation behavior inconsistent with normal software update patterns.

Signal 1: The Edge 87 Anomaly (November 2023 – February 2024)

This is the clearest automated-traffic signal in the dataset. By late 2023, the dominant Edge version was ~119 — putting Edge 87 approximately 32 versions behind current (about 2.5 years' worth of releases at Edge's ~4-week cadence). Edge 87 traffic was otherwise under 0.1% — consistent with deeply outdated usage gradually fading away.

Here's what happened to Edge 87's traffic share:

| Month | Edge 87 Share (% all US traffic) | Interpretation |
|---|---|---|
| Sep 2023 | 0.0% | Background noise |
| Oct 2023 | 0.2% | ⚠ Anomalous rise |
| Nov 2023 | 1.3% | 🚨 Bot campaign active |
| Dec 2023 | 2.9% | 🚨 Bot campaign active |
| Jan 2024 | 3.6% | 🚨 Bot campaign active |
| Feb 2024 | 0.6% | ⚠ Anomalous fall |
| Mar 2024 | 0.0% | Background noise |
| Apr 2024 | 0.0% | Background noise |
🚨 Why this is highly consistent with automated traffic Normal browser traffic for a 3-year-old version would be a tiny, slowly declining fraction. Instead, Edge 87 went from under 0.1% to 3.6% in 3 months (a 120× increase), then fell back below 0.1% within two months. No organic user population behaves this way. The concentration metric (HHI) for old Edge versions hit 4,626 out of 10,000 in 2024 — meaning traffic was extremely concentrated on this one version.

Recommendation: Hard-block Edge 87 outright. Also block the legacy "EdgeHTML" versions (Edge 15, 18, 19) — these pre-date the 2020 redesign and have been end-of-life since 2021.
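This rise-and-fall signature can be caught mechanically by comparing each month's share against the version's background level. A sketch (the function name and thresholds are ours, not the report's calibrated values):

```python
import statistics

def flag_spike_months(shares, ratio=10.0, floor=0.5):
    """Flag months whose share is both above an absolute floor (% of all
    traffic) and `ratio` times the background level, estimated as the
    median of the sub-floor months."""
    background = statistics.median([s for s in shares if s < floor] or [0.0])
    background = max(background, 0.05)  # avoid multiplying against a zero baseline
    return [i for i, s in enumerate(shares) if s >= floor and s >= ratio * background]

# Edge 87's monthly share, Sep 2023 - Apr 2024 (from the table above):
edge87 = [0.0, 0.2, 1.3, 2.9, 3.6, 0.6, 0.0, 0.0]
flag_spike_months(edge87)   # indices 2-5: Nov 2023 through Feb 2024
```

Run against the Edge 87 series, this flags November 2023 through February 2024, the campaign's active and wind-down months.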

Signal 2: Firefox 11 — The Growing Bot Fleet

Firefox 11 was released in 2012. Current Firefox in 2026 is version 140+. That's a lag of approximately 129 versions — a browser released 14 years ago. There is no realistic scenario in which a human user has been running Firefox 11 uninterrupted since 2012.

What makes this more alarming is that Firefox 11 traffic is growing:

| Year | Avg Monthly Share (% all US traffic) | Peak Month Share (% all US traffic) | Change from 2022 (avg) |
|---|---|---|---|
| 2022 | 0.1% | 0.1% (Jun 2022) | — |
| 2023 | 0.1% | 0.4% (Dec 2023) | ±0.0 pp |
| 2024 | 0.2% | 0.4% (Feb 2024) | +0.1 pp |
| 2025-26 | 0.5% | 0.8% (Jan 2026) | +0.4 pp |
🚨 Why this traffic is strongly indicative of bot activity Firefox 11's share has grown 8× in four years. No browser with a ~130-version lag gains organic users over time — this pattern is the signature of new automated infrastructure being deployed with a hard-coded Firefox 11 user-agent string. The climb to ~0.8% of all US traffic by January 2026, more than a tenth of Firefox's entire share, points to an active, growing scraper fleet.

Signal 3: Firefox 118 — The Hidden Campaign That Just Ended

This is a more subtle signal we discovered when looking at which specific old Firefox versions dominate traffic. Firefox 118 was released in September 2023 — a perfectly normal browser version at the time. But something unusual happened: it maintained roughly 1% of all US web traffic for over two years, then dropped to exactly 0% in January 2026 — the last month in our dataset.

| Period | Avg Monthly Share (% all US traffic) | Notes |
|---|---|---|
| Oct 2023 (launch) | 2.5% | Firefox 118 just released — peak normal use |
| 2024 (all year) | 0.8% | Should be declining naturally... |
| 2025 (Jan–Dec) | 1.2% | Still at ~1.0%+ — elevated; probable non-human traffic |
| Jan 2026 | 0.0% | Drops to exactly 0% in one month |
🚨 A strongly anomalous pattern A browser version released in 2023 that maintains approximately 1% of all US traffic through all of 2024 and 2025 — without the gradual decline expected from normal user adoption — and then drops to exactly zero in one month, is highly consistent with automated traffic: either a campaign shutting down, migrating to a newer version, or a data-collection methodology change — but not with any organic human browser update pattern. By late 2025, Firefox 118 was approximately 25 versions behind current stable.

Signal 4: Chrome October 2025 Anomaly

In October 2025, nearly 30% of all US web traffic came from Chrome versions 12+ versions behind current, roughly 3× the 2025-26 annual average of ~9.7%. This is the largest single-month spike in the dataset.

Chrome old-tail share over time

Figure 3 — Chrome old-version traffic over time (lag ≥ 12, 99.5th-percentile reference). Y-axis: share of all US web traffic (%) from Chrome versions more than approximately one year behind current stable. X-axis: month within each year. Each colored line is one calendar year. The October 2025 spike (~29.6%) was approximately 3× the 2025-26 annual average (~9.7%).

The main contributors in October 2025:

| Version | Share, Oct 2025 (% all US traffic) | Approx Age | Assessment |
|---|---|---|---|
| Chrome 141 | 19.0% | Current stable | Normal ✓ |
| Chrome 125 | 12.8% | ~17 months old (released May 2024) | Elevated lag; probable non-human traffic. Also spiked in Dec 2024. Possible automation or pinned fleet. |
| Chrome 130 | 11.1% | ~1 year old | Elevated lag; consistent with delayed update cycle. Could be enterprise or extended-stable users. |
| Chrome 117 | 0.7% | ~2 years old | Elevated lag; probable non-human traffic |
| Chrome 109 | 0.5% | ~2.5 years old | Windows 7 and 8 legacy (likely legitimate) |
| Chrome 79 | 0.4% | ~6 years old | Strong bot signal |
| Chrome 83 | 0.3% | ~5.5 years old | Strong bot signal |

Chrome 125 (12.8%) is particularly notable: it was released in May 2024 — 17 months before October 2025, placing it 16 versions behind current. It had also spiked anomalously in December 2024 before fading out, then resurged in October 2025. Chrome's Extended Stable channel only extends support by ~8 weeks (2 versions), not 16. This level of persistence for a ~17-month-old version suggests large pinned or legacy fleets, measurement artifacts, or automation using Chrome 125 user-agent strings — not a plausible Extended Stable explanation. Chrome 130 (~1 year old, lag 11) is a more defensible enterprise candidate.

How Concentrated Is the Old-Version Traffic?

Throughout this section, "old-version traffic" means traffic from browsers with a version lag of 12 or more — approximately one year or more behind the current stable release (99.5th-percentile reference). This is the lag ≥ 12 tail visible in the bucket charts above.

One way to distinguish bot traffic from slow-updating humans: bots tend to use one specific version, while humans spread out across multiple adjacent versions. We measure this with the Herfindahl-Hirschman Index (HHI), a standard concentration metric — a higher score means traffic is more concentrated on fewer specific versions.

Interpreting HHI scores (range 0–10,000) 0–1,000 — Traffic distributed across many versions; consistent with organic update diversity.
1,000–2,500 — Moderately concentrated; warrants monitoring.
2,500–10,000 — Heavily concentrated on 1–3 specific versions; strong indicator of automated traffic using hard-coded user-agent strings.
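HHI is simple to compute: square each version's percentage share of the tail and sum. A minimal sketch, assuming shares are renormalized to sum to ~100 within the lag ≥ 12 tail (function name is ours):

```python
def hhi(shares_pct):
    """Herfindahl-Hirschman Index on the 0-10,000 scale: the sum of squared
    percentage shares. `shares_pct` are each version's share of the
    lag >= 12 tail, renormalized to sum to ~100 within that tail."""
    return sum(s * s for s in shares_pct)

# Evenly spread across 20 old versions -> diffuse (organic-looking):
hhi([5.0] * 20)                    # 500.0
# One version dominating the tail -> concentrated (bot-looking):
hhi([60, 10, 10, 5, 5, 5, 5])      # 3900
```

A single version holding 100% of the tail gives the maximum score of 10,000, which is why the Edge 87 campaign pushed old-Edge HHI into the thousands.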
HHI concentration of old-version traffic

Figure 4 — Version concentration (HHI) of old-version traffic, Chrome / Edge / Firefox, all years. Y-axis: Herfindahl-Hirschman Index (HHI, 0–10,000) applied to the lag ≥ 12 tail of each browser. A higher HHI means old-version traffic is more concentrated on fewer specific versions — a pattern more consistent with automated fleets using hard-coded user-agent strings than with organic slow-updating users. X-axis: month. Each colored line is one calendar year. The red dashed line marks HHI = 2,500, the conventional "concentrated" threshold. Edge's 2024 spike corresponds to the Edge 87 campaign; Firefox's upward trend reflects the growing Firefox 11 and Firefox 118 fleets.

| Browser | Period | Concentration Score (HHI) | Top 1 Version (% of lag ≥ 12 tail) | Top 3 Versions (% of lag ≥ 12 tail) |
|---|---|---|---|---|
| Chrome | 2022 | 795 | 16.0% | 36.1% |
| Chrome | 2025-26 | 944 | 19.2% | 39.7% |
| Edge | 2022 | 2,204 | 34.2% | 63.4% |
| Edge | 2025-26 | 4,302 | 60.1% | 86.6% |
| Firefox | 2022 | 1,034 | 22.3% | 45.0% |
| Firefox | 2025-26 | 3,605 | 53.6% | 88.1% |

What Version Lags Should We Actually Block?

Here's the practical guide, browser by browser. The key insight is that the right lag threshold is different for each browser, and for some browsers (Firefox especially) there are specific version numbers that are better hard-block candidates than a blanket lag rule.

All lag values in this section use the 99.5th-percentile reference version — the version at or below which 99.5% of a browser family's traffic falls in a given month. This approach handles small fractions of pre-release (Beta/Canary) traffic more robustly than using the single highest-share version.

Version lag bucket distribution comparison

Figure 5 — Version lag bucket distribution by browser, 2022 vs 2025-26. Y-axis: share of each browser's own traffic (%) in each lag bucket. Bars within a browser sum to 100%. X-axis: lag bucket. Two bars per cluster: 2022 (blue) and 2025-26 (orange). Note: these are percentages within each browser's traffic, not of all US traffic. Edge remains concentrated in the 0–2 lag bucket. Chrome and Firefox have shifted toward older buckets between 2022 and 2025-26.

Chrome

Release cadence: ~4 weeks/version

Auto-update: Yes, but enterprise can freeze

Version lag guide:

  • Lag 0–5 — Normal. Don't block. (~91% of Chrome)
  • Lag 6–11 — Slow updaters. Log only, no block.
  • Lag 12–17 — Elevated lag. Soft-challenge or rate-limit. (~7% of Chrome)
  • Lag 18–24 — Unlikely organic. Soft-block unless Chrome 109.
  • Lag 25+ — Hard-block (carve out Chrome 109). (~2–3% FP)

Always hard-block: Chrome 79, 83, 87 (persistent bot versions with 4–5 year lag)

Never hard-block on lag alone: Chrome 109 (Windows 7 and 8 legacy users, ~0.5–1.0% of all traffic)
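The Chrome tiers above collapse into a small dispatch function. A sketch — names and labels are ours, and any real deployment should combine this with server-side signals before enforcing anything:

```python
def chrome_action(lag: int, major: int) -> str:
    """Map a Chrome visitor to an action per the guide above. Returns one of
    'allow' | 'log' | 'challenge' | 'soft_block' | 'hard_block'."""
    ALWAYS_BLOCK = {79, 83, 87}   # persistent bot versions in the data
    if major in ALWAYS_BLOCK:
        return "hard_block"
    if major == 109:              # Windows 7/8 legacy: never block on lag alone
        return "log"
    if lag <= 5:
        return "allow"
    if lag <= 11:
        return "log"
    if lag <= 17:
        return "challenge"
    if lag <= 24:
        return "soft_block"
    return "hard_block"
```

Note the ordering: the explicit version checks run before any lag comparison, so Chrome 109 stays exempt even at lag 32.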

Edge

Release cadence: ~4 weeks/version (synced with Chrome)

Auto-update: Very aggressive — 96% of Edge users are current

Version lag guide:

  • Lag 0–2 — Normal. Don't block. (~96% of Edge)
  • Lag 3–8 — Unusual but some enterprise. Log only.
  • Lag 9–20 — Soft-block or challenge. Only ~2% of real Edge users here.
  • Lag 21+ — Hard-block. <1% FP rate.

Always hard-block:

  • Edge 87 — strong automated-traffic signal (see above)
  • Edge 15, 18, and 19 — legacy EdgeHTML browser, EOL 2021
Firefox

Release cadence: ~4 weeks/version

⚠ Important: Firefox has an ESR (Extended Support Release) channel — a supported older version for enterprise IT. The current ESR in this dataset ranges from Firefox 115 to 128 (lag ~12–25 depending on month). Do not blanket-block based on lag alone; you will hit ESR users.

Version lag guide:

  • Lag 0–15 — Don't block (may include Firefox 128 ESR, lag ~12–19 in late 2025)
  • Lag 16–29 — Caution. Firefox 115 ESR (now expired but still present at 0.27%) lives here.
  • Lag 30+ — Soft-block. (<5% FP for actual humans)
  • Lag 60+ — Hard-block. Versions like Firefox 72, 78, 44 (2016–2020). Almost certainly non-human traffic.

Always hard-block these specific versions:

  • Firefox 11 — ~130 versions behind current, traffic still growing (automated fleet)
  • Firefox 118 — ran at ~1% for 2 years, dropped to 0% overnight (Jan 2026). Likely migrating to new version — watch for successor.
  • Firefox 44, 52, 59, 72, 78 — confirmed persistent bot versions
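As with Chrome, the Firefox guidance reduces to a small rule function, with the explicit version lists taking priority over the blanket lag rule since ESR makes high lag legitimate here. A sketch (names and labels are ours):

```python
FIREFOX_HARD_BLOCK = {11, 44, 52, 59, 72, 78, 118}  # persistent bot versions in this dataset
FIREFOX_ESR = {115, 128}                            # ESR majors active in the dataset

def firefox_action(lag: int, major: int) -> str:
    """Map a Firefox visitor to an action per the guide above."""
    if major in FIREFOX_HARD_BLOCK:
        return "hard_block"
    if major in FIREFOX_ESR:      # enterprise ESR: legitimate despite high lag
        return "log"
    if lag >= 60:
        return "hard_block"
    if lag >= 30:
        return "soft_block"
    if lag >= 16:
        return "log"              # the "caution" zone from the guide
    return "allow"
```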

Policy Tiers

Here are three readymade policy options depending on your risk tolerance. "False positive rate" (FP rate) refers to the percentage of browser traffic in a given category that would be blocked or challenged — lower is better for user experience.

⚠ FP rates are computed over all traffic, including non-human traffic StatCounter data cannot separate human from automated visits. The FP rates below are calculated over all traffic in each browser category, including any bot traffic already present in that category. The true false positive rate for human visitors is therefore lower than the figures shown — particularly for Chrome and Firefox, where the old-version tail contains substantial non-human traffic. Edge FP rates are more reliable in this respect, as its old-version tail is small and predominantly non-human.
✓ Conservative Policy — Near-zero false positives Block only the versions that are demonstrably bots. Very low risk of affecting real users.
| Browser | Rule | Examples from Data | Est. False Positive Risk |
|---|---|---|---|
| Chrome | Block major ≤ 70 | Chrome 70 released Oct 2018 — 7+ years old. Traffic at ~0.1% is consistent with non-human traffic. | Near zero |
| Firefox | Block Firefox 11, 44, 52, 59, 72, 78, 118 explicitly | All confirmed persistent bot versions from this dataset | Near zero |
| Edge | Block Edge 87, and Edge 15, 18, 19 | Edge 87 = confirmed campaign; EdgeHTML versions (15, 18, 19) = EOL 2021 | Near zero |
| IE | Block all Internet Explorer | IE 11 EOL June 2022. Any remaining IE traffic is almost certainly non-human. | Near zero |
| Safari | Block Safari ≤ 12 and "Safari 604.1", "Safari 537.36" UA strings | Safari 12 = 2018; fake WebKit UA strings are spoofed headers | Near zero |
⚖ Balanced Policy — Low false positives, catches more bots Apply a lag threshold per browser. Use "soft" actions (rate-limit, CAPTCHA challenge) rather than hard blocks where false-positive rates are above 1%.
| Browser | ≤1% FP: min lag | ≤2% FP: min lag | ≤5% FP: min lag |
|---|---|---|---|
| Chrome (2025-26) | >24 (no threshold) | >24 | Lag ≥ 25–30* |
| Edge (2025-26) | Lag ≥ 21 | Lag ≥ 9 | Lag ≥ 3 |
| Firefox (2025-26) | >24 (bot-inflated) | >24 | >24* |
| Safari | Use conservative hard-block list only — no lag rule | | |

* Chrome and Firefox cannot achieve a sub-5% FP threshold at any lag ≤ 24 because their old-version tails contain substantial non-human traffic (see caveat above). Chrome lag ≥ 25 is a reasonable soft-challenge starting point. For Firefox, prefer specific version hard-blocks over a blanket lag rule.

🔒 Strict Policy — Aggressive blocking, higher false-positive risk Appropriate when old-version traffic is already elevated in other signals (high request rate, missing Accept-Language headers, no cookie support). Combine with behavioral signals for best results.
| Browser | Threshold | Expected FP Rate | Justification |
|---|---|---|---|
| Chrome | Lag ≥ 12 (~1 year old) | ~16.0% of Chrome traffic | HHI rising; Oct 2025 spike shows bot-heavy months. Pair with rate signals to reduce FP. |
| Edge | Lag ≥ 12 | <1.0% of Edge traffic | Edge users are nearly always current. Very safe to block. |
| Firefox | Lag ≥ 12 + explicit version list | ~6.0% of Firefox* (but mostly non-human) | Old Firefox tail is heavily concentrated on known bot versions. Most "FP" here are non-human traffic. |
| Safari | Hard-block list only | ~0.2% of Safari | Safari 12 and older, fake WebKit UAs |
💡 Practical recommendation Start with the Conservative policy (hard-blocks only) and add monitoring. Once you can see the volume of traffic hitting each rule, promote the higher-confidence cases to the Balanced thresholds. Save the Strict policy for endpoints where you've already validated elevated bot activity through other signals. Never apply a blanket lag rule to Safari.

The Safari Problem

Safari is fundamentally different from Chrome, Edge, and Firefox when it comes to version-based anti-bot rules. Here's why:

Safari's major version is tied to the operating system. When Apple releases a new version of iOS or macOS, Safari gets a new major version number. You can't upgrade Safari independently — you upgrade your entire phone or computer. This creates a very different traffic pattern:

Note: Shares below are aggregated across all sub-versions (e.g., Safari 18.0 + 18.1 + 18.2 … = "Safari 18"). The lag column uses the 99.5th-percentile reference, which shifts mid-period (see note below table).

| Safari Version | Tied To | Avg Share (2025-26, all sub-versions) | Under lag-based rule... |
|---|---|---|---|
| Safari 18 | iOS 18 (Sep 2024) | 6.4% (largest by far) | ⚠ Lag 8 — flagged as "old" (Aug 2025 onward) |
| Safari 17 | iOS 17 (Sep 2023) | 2.1% | Lag 9 — flagged |
| Safari 16 | iOS 16 (Sep 2022) | 1.2% | Lag 10 — flagged |
| Safari 26 | iOS 26 and macOS 26 (2025) | 0.7% | Current — fine ✓ |
| Safari 15 | iOS 15 and macOS 12 | 0.6% | Lag 11 — blocked |
| Safari 14 | iOS 14 and macOS 11 | 0.2% | Lag 12 — blocked |
| Safari 13 | iOS 13 and macOS 10.15 | 0.1% | Lag 13 — blocked |
⚠ Reference version instability makes Safari lags unreliable In this dataset, the "current" Safari reference (using the 99.5th-percentile method) is Safari 18 from January through July 2025, then jumps to Safari 26 from August 2025 onward when iOS 26 and macOS 26 launched. This mid-period shift means the same user on Safari 17 would have lag 1 (not flagged) in June 2025 but lag 9 (flagged) in September 2025 — with no change on their end. Lag-bucket results for Safari are therefore not stable across the period and should not be used for blocking decisions.
⚠ The scale of the problem Safari 18 (all sub-versions combined) accounts for 6.43% of all US web traffic in 2025-26 — the dominant Safari version. From August 2025 onward it falls in the "lag 8" bucket under our 99.5th-percentile reference. Blocking at lag ≥ 8 would challenge the majority of current iOS 18 iPhone users. Under our lag-based analysis, 38.8% of all Safari traffic appears "old" — but most of this is legitimate iOS 17 and 18 users on current Apple devices.

There's a secondary complication: some very old Safari "version" labels (like "Safari 604.1" or "Safari 537.36") are actually WebKit/Blink engine build numbers surfacing from ancient or spoofed user-agent strings, not real Safari versions. These are worth blocking separately, as they represent either ancient browsers or deliberate spoofing.

Bottom line for Safari: Stick to hard-blocking specific very old versions (Safari 12 and below, released before 2019) and fake WebKit UA strings. Do not apply a "lag ≥ N" rule to Safari based on major version numbers.
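A cheap first-pass filter for the engine-number problem: real Safari majors are small (≤ 26 as of this dataset), so any three-digit "version" is an engine build number, not a browser release. A sketch (function name and thresholds are ours):

```python
def classify_safari_version(version_str: str) -> str:
    """Rough classification of a StatCounter-style Safari version label."""
    try:
        major = int(version_str.split(".")[0])
    except ValueError:
        return "unparseable"
    if major >= 100:
        return "engine_token"   # e.g. '604.1', '537.36': hard-block-list candidates
    if major <= 12:
        return "ancient"        # 2018 or older: conservative hard-block
    return "plausible"          # apply no lag rule; see the Safari section
```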

What StatCounter Actually Measures — And Why It Matters for Your Access Logs

This is one of the most important things to understand before acting on this data. StatCounter and your web server's access logs are measuring completely different things, and mixing them up can lead to bad decisions.

StatCounter measures JavaScript page views, not HTTP requests

StatCounter works by embedding a small JavaScript tag on participating websites. Every time a page fully loads and that JavaScript executes, StatCounter records one data point — one "page view." This means:

| What happened | StatCounter records | Your access log records |
|---|---|---|
| A human visits one page on a website | 1 page view | ~30–100 HTTP requests (HTML + images + CSS + JS + fonts + API calls) |
| A bot crawls 500 pages on a website, executing JS on each | 500 page views | ~15,000–50,000 HTTP requests |
| A bot crawls 500 pages but doesn't execute JavaScript | 0 page views (invisible to StatCounter) | 500–1,000 HTTP requests (just the HTML) |
| A bot checks one URL repeatedly every 5 minutes for a day | 0–288 page views (depends on whether it re-executes JS) | 288 HTTP requests for the HTML alone, plus sub-resources each time |
⚠ "1% market share" does not mean "1% of your HTTP requests" If Firefox 11 has 0.8% of StatCounter page views, that does not mean it will appear at 0.8% of entries in your access log. The actual percentage depends heavily on how many pages each Firefox 11 session visits — which for a scraper bot could be thousands of times more than a human visitor.

The Pagination / Faceted Search Trap Problem

This is particularly relevant if your website has:

A bot following links through faceted search can visit an effectively unlimited number of unique URLs on your site. Each page it loads (if it executes JavaScript) becomes one StatCounter data point — but generates dozens of HTTP requests in your access log for the page itself, plus all its assets.

🚨 The amplification effect Imagine a bot that visits 10,000 unique faceted-search pages on your site, each page generating 50 HTTP requests, while a real human visitor generates 5 page views and 250 HTTP requests. The bot contributes 10,000 page views to StatCounter (inflating its browser version's apparent "market share") and 500,000 HTTP requests to your access log. When requests-per-page are equal, the bot-to-human ratio is the same in both datasets (2,000× here), but the absolute numbers are not: a single bot accounts for half a million requests on your bandwidth bill, a cost that percentage-based market-share figures completely hide.
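The arithmetic made explicit (all inputs are the hypothetical figures from the example above, not measured values):

```python
# Hypothetical bot and human sessions from the amplification example.
bot_pages, human_pages = 10_000, 5
requests_per_page = 50

bot_requests = bot_pages * requests_per_page       # 500,000
human_requests = human_pages * requests_per_page   # 250

# With equal requests-per-page, the two ratios coincide; the difference
# is the absolute request volume a single bot imposes on the server.
pageview_ratio = bot_pages / human_pages           # 2000.0
request_ratio = bot_requests / human_requests      # 2000.0
```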

What This Means for Old Browser Version Analysis

There are three distortions to be aware of; the first two pull in opposite directions:

1. StatCounter may overstate old-browser version share relative to unique visitors. If bot crawlers using old browser UA strings visit many pages per session (especially in a faceted-search trap), their version's "share" in StatCounter will be inflated relative to the number of actual bot instances. The 0.8% "Firefox 11 share" could represent a handful of bot processes visiting millions of pages, not millions of unique Firefox 11 browser installations.

2. StatCounter may understate the impact on your server load. Each of those many page views triggers dozens of HTTP requests. So if StatCounter shows Firefox 11 at 0.8% of page views, the actual percentage of HTTP requests hitting your server could be much higher — especially if those bots are crawling deep into your pagination or faceted search.

3. Bots that don't execute JavaScript are completely invisible to StatCounter. Many scrapers and crawlers fetch raw HTML without executing JavaScript at all. These would appear in your access logs but not in any StatCounter data. This means StatCounter's numbers are a lower bound on bot activity — the real bot traffic hitting your server is likely higher than what StatCounter can see.

💡 Practical implication for blocking decisions The version-lag analysis in this report is useful for identifying which browser versions are likely bots. But to understand the actual volume and cost of that bot traffic on your infrastructure, you need your own access logs — not StatCounter. Look for old browser UA strings in your logs, then measure: how many requests per session? Are they hitting the same content repeatedly? Are they following pagination into thousands of pages? That's where the real damage assessment lives.
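As a starting point for that log analysis, here is a sketch that tallies requests per (client IP, old Chrome version) pair from a combined-format access log. The regexes, threshold, and function name are illustrative, not a production parser:

```python
import re
from collections import Counter

UA_CHROME = re.compile(r"Chrome/(\d+)\.")
# Combined log format: ip ident user [time] "request" status bytes "referer" "ua"
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \d+ "[^"]*" "([^"]*)"')

def old_chrome_request_counts(log_lines, current_major=141, min_lag=12):
    """Count requests per (ip, chrome_major) for UAs at lag >= min_lag."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        ip, ua = m.groups()
        vm = UA_CHROME.search(ua)
        if vm and current_major - int(vm.group(1)) >= min_lag:
            counts[(ip, vm.group(1))] += 1
    return counts
```

From the resulting counts, the follow-up questions in the text (requests per session, repeated content, pagination depth) become straightforward group-bys.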

A Note on StatCounter's Own Bot Filtering

StatCounter states that it filters known bots and crawlers. However: its filtering is based on known bot signatures (user-agent strings, IP ranges), not on behavioral analysis. Novel or disguised bots using legitimate browser UA strings (like the Firefox 11 and Edge 87 campaigns in this dataset) may pass through StatCounter's filters and be counted as real traffic. This means some of what appears to be "human market share" in this dataset is almost certainly bot traffic that StatCounter could not identify.

Important Limitations

1. This data includes bots. StatCounter measures JavaScript-rendered page views. Many bots do render JavaScript. StatCounter filters some known bots, but the methodology is proprietary and incomplete. Our "false positive rates" include bot traffic in the denominator — real human FP rates are likely lower than reported.
2. We can't actually confirm what's a bot. All bot inferences are probabilistic — based on patterns (extreme age, sudden activation and cessation, concentration on specific versions) inconsistent with human behavior. To confirm bot status, you need server-side data: request rate, TLS fingerprint, header completeness, JavaScript execution evidence, IP reputation.
3. Chrome 109 is a legitimate exception. Chrome 109 is the last Chrome version supporting Windows 7 and Windows 8.1. Its consistent ~0.5–1% share throughout 2025-26 (slowly declining) is consistent with enterprise legacy hardware. Any lag-based Chrome rule should explicitly carve out Chrome 109.
4. Firefox ESR versions are legitimate. Firefox ships an Extended Support Release (ESR) for enterprise users, with a new ESR major roughly once a year. The ESR versions active during this dataset (Firefox 115 and 128) appear at lag 12–25+ behind the current standard release, meaning a strict lag rule would incorrectly flag real enterprise users. Treat Firefox lag rules cautiously; prefer explicit version hard-block lists over blanket lag thresholds for Firefox.
5. Enterprise browser freezes are real. Both Chrome and Edge support enterprise policies that freeze browser versions. The Chrome Extended Stable channel adds up to 8 weeks. Some large organizations have IT policies that update quarterly or on a longer schedule. These users would appear as "old" in lag-based analysis.
6. This is US traffic only. StatCounter data is US-focused. Bot traffic patterns and browser version distributions may differ significantly in other countries.

Methodology (Brief)

Data Source

All data in this report is sourced from Statcounter Global Stats (© Statcounter 1999–2026), licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License. Use of this data is credited to Statcounter as required by their terms.

StatCounter measures page views — not unique visitors or raw HTTP requests — collected from a network of over 1.5 million websites representing approximately 5 billion page views per month. The data reflects JavaScript-rendered page views only; traffic from automated clients that do not execute JavaScript is not captured. Statistics are subject to quality assurance revision for 45 days from initial publication.

Four monthly CSV exports covering January 2022 through January 2026 were downloaded from the StatCounter "US browser version" report and loaded into the analysis pipeline. Files were converted from wide format (one column per browser version) to long format with one row per month per browser version. Special cases, such as Safari sub-version aggregation and engine-number labels like "Safari 604.1", were handled during this normalization step.

The "current" version reference for each browser was computed two ways: (A) the 99.5th-percentile major version by traffic share that month (handles beta/canary noise), and (B) the single highest-traffic version (mode). Results are reported using Method A.

Outlier months were flagged if lag≥12 traffic exceeded the period's median + 3× the median absolute deviation. Concentration was measured using the Herfindahl-Hirschman Index (HHI) — a standard measure of market concentration — applied to version shares within the old-version tail.
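The median + 3×MAD outlier rule is a few lines; a sketch of the rule as described (this is not the pipeline's actual code):

```python
import statistics

def flag_outlier_months(values, k=3.0):
    """Flag indices where value > median + k * MAD, the report's rule
    for outlier months. `values` are one period's monthly lag >= 12 shares."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return [i for i, v in enumerate(values) if v > med + k * mad]

# A stable ~10% series with one spiked month gets that month flagged:
flag_outlier_months([9, 10, 10, 11, 9, 10, 30, 10, 11, 10])   # [6]
```

MAD (median absolute deviation) is preferred over standard deviation here because the spike itself would inflate a standard deviation and mask its own detection.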

All analysis code is in statcounter_browser_version_analysis.py. Raw data tables are in the output/ directory.