If you’ve recently reviewed your website statistics in the Site Manager and noticed unusually high visitor numbers, short visits, or sudden spikes, you are not alone. This is a widespread and normal change across the modern internet, not an issue specific to your website.
The short answer
Most of the increase you are seeing comes from automated bots, not real people. These bots are now a constant background presence on the internet, and traditional log-based statistics tools count them very literally.
What changed?
1. Automated traffic has increased dramatically
Over the past year, automated crawlers have increased worldwide due to:
- AI training and indexing systems
- Search engine enhancements
- Content scrapers and SEO scanners
- Security and availability monitoring tools
These systems routinely visit public websites, often very briefly, and often from rotating IP addresses.
2. Church and nonprofit websites attract more bots than average
Public, content-rich sites such as churches, nonprofits, and educational organizations tend to receive more automated visits because they include:
- Public text content (sermons, articles, announcements)
- PDFs (bulletins, calendars, newsletters)
- Predictable page structures
This makes them easy and attractive targets for automated indexing.
3. Holiday periods amplify bot activity
Traffic spikes frequently appear around holidays such as Christmas and New Year’s. These periods are common scan windows for automated systems and do not indicate an attack or misuse.
Why AWStats numbers look especially high
AWStats is a server log–based statistics tool. This means:
- Every request is counted (pages, images, files)
- Bots are counted as “visitors,” and every distinct IP address or identifier counts as a new one
- A single automated system can therefore appear as hundreds or thousands of “unique visitors”
Importantly:
- Visitors ≠ people
- Hits ≠ page views
- Short visits ≠ engagement
Modern bots do not behave like humans, but AWStats counts them anyway.
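To see why log-based counting inflates visitor numbers, here is a minimal sketch of how a tool like AWStats tallies a server log. The log records, IP addresses, and user-agent strings below are invented for illustration:

```python
# Illustration of log-based "unique visitor" counting.
# One bot rotating across three IP addresses registers as three
# visitors, while one person loading a page plus its image is one.
# All records below are made up for this example.

sample_log = [
    # (client IP, requested path, user agent)
    ("203.0.113.10", "/sermons.html", "ExampleBot/1.0"),
    ("203.0.113.11", "/sermons.html", "ExampleBot/1.0"),
    ("203.0.113.12", "/bulletin.pdf", "ExampleBot/1.0"),
    ("198.51.100.7", "/index.html",   "Mozilla/5.0"),
    ("198.51.100.7", "/logo.png",     "Mozilla/5.0"),
]

# A log-based tool counts each distinct IP as a "visitor"...
unique_visitors = {ip for ip, _, _ in sample_log}

# ...and every request (including images and PDFs) as a "hit".
total_hits = len(sample_log)

print(f"unique visitors: {len(unique_visitors)}")  # 4 "visitors", 1 actual human
print(f"hits: {total_hits}")                       # 5 hits, not 5 page views
```

Four “visitors” and five “hits” here correspond to exactly one human being, which is the gap the sections above describe.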
Why this is not a problem
We monitor all servers continuously for:
- Performance degradation
- Excessive resource usage
- Error spikes
- Security issues
- Abuse patterns
If bot traffic were harmful, you would see:
- Slower page loads
- Increased errors
- Bandwidth or CPU exhaustion
When none of these are present, the traffic is considered benign background activity.
Why we don’t simply block all bots
Blocking bots entirely is neither practical nor desirable because:
- Many bots are legitimate (search engines, accessibility tools, previews)
- Modern bots rotate IPs and mimic browsers
- Over-blocking can block real users, mobile carriers, and social media previews
- Aggressive blocking often creates more problems than it solves
Instead, we focus on:
- Blocking abusive behavior (login attacks, form spam, POST floods)
- Rate limiting expensive actions
- Optimizing servers to handle normal automated traffic efficiently
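As an illustration of the rate-limiting approach, here is roughly what throttling an expensive endpoint might look like in an nginx configuration. The zone name, rate, and path are hypothetical, not our actual settings:

```nginx
# Hypothetical sketch: limit bursts against a login endpoint while
# leaving normal page reads untouched. Names and rates are illustrative.
limit_req_zone $binary_remote_addr zone=login:10m rate=5r/m;

server {
    location = /wp-login.php {
        limit_req zone=login burst=5 nodelay;  # absorb small bursts, reject floods
        # ... normal request handling ...
    }
}
```

The point of this design is that ordinary browsing is never slowed; only repeated, expensive actions from a single client are throttled.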
How to interpret your statistics correctly
When reviewing your reports, focus on:
- Trends in page views, not visitors
- Bandwidth consistency
- Real user engagement tools (forms, emails, calls)
- Reports from JavaScript-based analytics (which bots typically do not trigger)
High “visitor” numbers alone are no longer a reliable indicator of human traffic.
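One rough way to estimate human traffic from raw logs is to discard requests whose user agent identifies itself as a crawler. This is a sketch using simplified, invented records, not a substitute for real analytics:

```python
# Rough human-traffic estimate: drop requests from self-identified bots.
# Real logs need real parsing; these records and UA strings are invented.

KNOWN_BOT_MARKERS = ("bot", "crawler", "spider")  # common self-identification hints

requests = [
    {"path": "/index.html",   "ua": "Mozilla/5.0 (Windows NT 10.0)"},
    {"path": "/sermons.html", "ua": "ExampleBot/2.1 (+https://example.com/bot)"},
    {"path": "/calendar.pdf", "ua": "SEO-Spider/0.9"},
    {"path": "/contact.html", "ua": "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0)"},
]

def looks_like_bot(user_agent: str) -> bool:
    """Flag user agents that openly identify as automated."""
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

human_requests = [r for r in requests if not looks_like_bot(r["ua"])]
print(len(human_requests))  # 2 of the 4 requests look human
```

Note that this undercounts stealthy bots that mimic browsers, which is exactly why JavaScript-based analytics are a better gauge of real engagement.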
Bottom line
Seeing high or rising statistics in AWStats today is normal and expected.
- It does not mean your site is under attack
- It does not indicate a configuration problem
- It does not mean someone is stealing your content
- It does not require action unless performance or security issues appear
If you ever have concerns, our team is happy to review your logs and confirm what you’re seeing.
