Yes. According to Imperva’s 2025 Bad Bot Report, automated activity accounted for 51% of all web traffic in 2024, meaning bots collectively generated more web requests than humans. Within that automated share, Imperva attributes 37% of all web traffic to bad bots and about 14% to good bots such as search engine crawlers. These figures come from Imperva’s measurements across the websites and APIs it protects, not from total internet data volume.
It does not mean AI chatbots are posting more than people on social media. Most bot traffic is machine-to-machine activity such as crawling, scraping, security probing, and API calls.
What does “bots surpassed humans in 2024” actually mean?
In these reports, automated traffic refers to any non-human web request. That includes both good bots, for example search engine crawlers, uptime monitors, and accessibility tools, and bad bots, for example scalpers, credential-stuffing tools, scrapers that violate site rules, click-fraud bots, and components of distributed denial-of-service attacks.
Imperva reports that “automated traffic has surpassed human activity, accounting for 51% of all web traffic,” with “bad bots now making up 37% of all internet traffic.”
The key point is scope. These figures describe share of web requests observed across Imperva-protected applications. They are not a measure of global internet bandwidth. By data volume, human streaming and downloads still dominate. By request count, automated hits are more numerous.
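The requests-versus-bytes distinction can be made concrete with a toy calculation. The log below is entirely synthetic and the payload sizes are invented for illustration; it only shows how the same traffic can look bot-dominated by request count and human-dominated by data volume.

```python
# Illustrative only: a synthetic request log showing how "share of traffic"
# depends on whether you count requests or bytes transferred.
requests_log = [
    # (source, bytes_transferred) -- sizes are made up for illustration
    ("human", 5_000_000),  # e.g. a large media download
    ("human", 2_000_000),
    ("bot",   2_000),      # bots typically fetch small HTML/JSON payloads
    ("bot",   1_500),
    ("bot",   3_000),
]

def share(log, source, key):
    """Fraction of the total (by the given key) attributable to `source`."""
    total = sum(key(entry) for entry in log)
    part = sum(key(entry) for entry in log if entry[0] == source)
    return part / total

bot_share_by_requests = share(requests_log, "bot", lambda e: 1)
bot_share_by_bytes = share(requests_log, "bot", lambda e: e[1])

print(f"bot share by request count: {bot_share_by_requests:.0%}")  # 60%
print(f"bot share by bytes:         {bot_share_by_bytes:.2%}")     # under 0.1%
```

Here bots make three of five requests (60%) but move a tiny fraction of the bytes, which is the shape of the real-world pattern the report describes.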
How do companies measure bot versus human traffic?
Bot management vendors classify traffic using a mix of signals:
- Network and device fingerprints, for example IP reputation, autonomous system numbers (ASNs), data center origins, or headless browser signatures
- Behavioral patterns, for example impossible click speeds, perfectly periodic request intervals, or scripted cursor movements
- Protocol anomalies, for example malformed headers, reused tokens, or automated API key abuse
- Challenge responses, for example proof-of-work, JavaScript challenges, or cryptographic tokens that most bots fail
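A toy scoring function gives a feel for how these signal families combine. The thresholds, weights, and ASN list below are illustrative assumptions, not any vendor’s actual model; real systems use far more signals and machine-learned scoring.

```python
import statistics

# Toy bot-scoring heuristic combining the signal families above.
# Weights, thresholds, and the ASN set are illustrative, not a real model.
DATACENTER_ASNS = {16509, 14618, 15169}  # example cloud ASNs (AWS, Google)

def bot_score(asn: int, user_agent: str, request_times: list[float]) -> int:
    score = 0
    if asn in DATACENTER_ASNS:          # network fingerprint: data center origin
        score += 1
    if "HeadlessChrome" in user_agent:  # device fingerprint: headless browser
        score += 2
    if len(request_times) >= 3:         # behavioral pattern: timing regularity
        gaps = [b - a for a, b in zip(request_times, request_times[1:])]
        if statistics.pstdev(gaps) < 0.01:  # near-perfectly periodic requests
            score += 2
    return score  # e.g. classify as a bot when score >= 3

# Perfectly periodic requests from a cloud IP with a headless browser:
print(bot_score(16509, "Mozilla/5.0 HeadlessChrome/120.0", [0.0, 1.0, 2.0, 3.0]))
```

A human on a residential connection with irregular click timing would score 0 here, while the scripted client above trips all three signals.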
Those classifications are applied to HTTP requests hitting the vendor’s customers. Different networks see different traffic mixes, so estimates vary across providers. Imperva also notes that 2024 was the first time in a decade that automated traffic exceeded human activity in its data, echoing earlier cycles in which bots held the majority before falling back below 50%.
Are these bots the same as AI accounts on social media?
Mostly not. The majority of automated traffic is background infrastructure, for example crawlers, scrapers, monitoring, security scans, and automated API clients. A smaller slice involves engagement manipulation and influence operations, for example spam accounts, fake likes or follows, and coordinated inauthentic behavior. That activity exists, but it is only one part of the automated share and is not what pushes the total past 50% on its own.
“Bots” in traffic reports primarily means software making web requests, not AI personas arguing with you. Social bot activity represents a subset of automated traffic.
Why does rising bot traffic matter?
- Security risk: bad bots target login pages, checkout flows, ticketing, and APIs to commit fraud, test stolen credentials, or scrape competitive data
- Operational load: automated hits can overwhelm infrastructure and degrade performance for real users
- Skewed analytics: metrics like pageviews, bounce rate, and conversion can be distorted if bot filtering is weak
- Advertising waste: invalid or non-human impressions and clicks drain budgets in programmatic advertising
For organizations, a higher automated share raises the baseline level of noise and risk that application and security teams must handle.
What are the limitations of the claim?
- Vendor scope: Imperva’s percentages are drawn from traffic on properties it protects; other networks may see different ratios
- Web requests, not total internet bandwidth: the report covers web and API requests, not all IP traffic such as video streaming or gaming
- Counting requests, not people: a single bot can generate thousands of requests per minute, inflating its share by count
- Not necessarily the first time ever: Imperva frames 2024 as the first time in a decade that bots led again; earlier years also saw bots in the majority
- Definitions vary: providers differ on what they classify as good bots, bad bots, and suspicious traffic
The takeaway is direction, not an exact universal percentage. Multiple large providers, including Imperva and Cloudflare, report significant and growing automated traffic, even if the precise numbers differ.
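The request-counting caveat above is worth quantifying. In this invented scenario, a tiny number of noisy bots dominates the request count even though almost all clients are human:

```python
# Illustrative arithmetic: a handful of noisy bots can dominate request counts.
# All numbers are invented for the example.
human_clients, requests_per_human = 100, 20
bot_clients, requests_per_bot = 2, 5_000

human_requests = human_clients * requests_per_human  # 2,000
bot_requests = bot_clients * requests_per_bot        # 10,000

bot_share_of_clients = bot_clients / (bot_clients + human_clients)
bot_share_of_requests = bot_requests / (bot_requests + human_requests)

print(f"bots are {bot_share_of_clients:.1%} of clients")  # 2.0%
print(f"but {bot_share_of_requests:.1%} of requests")     # 83.3%
```

This is why a 51% automated share of requests says little about how many bots exist or how many humans they outnumber.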
What can websites and users do about bot traffic?
- Harden logins and APIs: use rate limiting, multi-factor authentication, credential-stuffing protections, and token-based verification
- Adopt bot management: deploy challenge-response techniques and behavioral analysis that minimize friction for humans
- Filter analytics: enable bot filtering and exclude known data center and crawler IP ranges to keep metrics trustworthy
- Protect ads and attribution: follow Media Rating Council invalid traffic (IVT) guidelines and work with vendors that provide IVT detection and auditing
- Publish and enforce rules: use robots.txt for good bots, set clear terms for scraping, and monitor for abuse
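As a sketch of the rate-limiting item above, here is a minimal per-client token-bucket limiter for a login endpoint. The capacity and refill rate are arbitrary example values, and this is not a production design (no locking, persistence, or distributed state):

```python
import time

class TokenBucket:
    """Minimal token bucket: each allowed request spends one token."""

    def __init__(self, capacity: float, refill_per_sec: float):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Example policy: 5 login attempts per client key, refilling one every 12 s.
buckets: dict[str, TokenBucket] = {}

def allow_login(client_key: str) -> bool:
    bucket = buckets.setdefault(client_key, TokenBucket(5, 1 / 12))
    return bucket.allow()

# A credential-stuffing burst from one IP is cut off after the first 5 tries
# (assuming the burst completes in well under 12 seconds):
results = [allow_login("203.0.113.7") for _ in range(8)]
print(results)
```

Keying buckets by IP is the simplest choice; real deployments often combine keys (IP, account, device fingerprint) so attackers cannot evade the limit by rotating any single one.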
