I filmed this walkthrough for anyone hosting with Cloudways who wants to see the bot traffic hitting their website. Here’s the link to the full video:
Cloudways makes it easy to access server-level traffic logs, and they’re a great way to tell whether bots are hammering your site, which can slow things down or skew your analytics.
In the video, I show how to:
- Log into your Cloudways dashboard
- Go to Monitoring > Traffic
- Scroll to see IP addresses, user agents, and request paths
- Spot patterns like repeated requests from non-human agents or huge volumes from single IPs
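If you’d rather script the pattern-spotting than scroll, the same idea works on a raw access log. Here’s a minimal Python sketch that parses combined-log-format lines, counts requests per IP, and flags user agents that look automated. The sample lines and the `BOT_HINTS` list are hypothetical stand-ins; swap in lines from your own Cloudways server log.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache/Nginx combined log format --
# replace these with lines from your own access log.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024:06:12:01 +0000] "GET /sitemap.xml HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/May/2024:06:12:02 +0000] "GET /products?page=1 HTTP/1.1" 200 8192 "-" "python-requests/2.31.0"',
    '203.0.113.7 - - [10/May/2024:06:12:02 +0000] "GET /products?page=2 HTTP/1.1" 200 8192 "-" "python-requests/2.31.0"',
    '198.51.100.9 - - [10/May/2024:06:12:03 +0000] "GET / HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]

# Extract the IP, request path, and user agent from each line
LOG_PATTERN = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:\S+) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

# Substrings that usually indicate automated clients (extend to taste)
BOT_HINTS = ("bot", "crawl", "spider", "python-requests", "curl", "wget")

requests_per_ip = Counter()
bot_hits = []

for line in LOG_LINES:
    match = LOG_PATTERN.match(line)
    if not match:
        continue
    ip, path, agent = match.groups()
    requests_per_ip[ip] += 1
    if any(hint in agent.lower() for hint in BOT_HINTS):
        bot_hits.append((ip, path, agent))

# Busiest IPs first, then every request flagged as bot-like
print(requests_per_ip.most_common(3))
for ip, path, agent in bot_hits:
    print(f"bot-like: {ip} {path} {agent[:40]}")
```

The two signals it surfaces, huge volumes from a single IP and non-human user agents, are exactly the patterns to eyeball in the Cloudways Traffic view.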
I also show how to filter these logs for bot-like behavior and how to use IP lookup or WHOIS tools to investigate suspicious activity.
This data helps you distinguish good bots (like Googlebot) from bad bots (like scrapers). Once you pinpoint problem IPs, you can block them via Cloudways’ firewall or .htaccess rules.
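For the .htaccess route, a short fragment like this blocks specific offenders while leaving everyone else alone (the IPs here are hypothetical placeholders; this assumes Apache 2.4 syntax, which Cloudways servers running Apache support):

```apache
# Block a hypothetical abusive IP and range found in the traffic logs
<RequireAll>
    Require all granted
    Require not ip 203.0.113.7
    Require not ip 198.51.100.0/24
</RequireAll>
```

For larger or rotating ranges, the firewall option in the Cloudways panel is usually the cleaner place to manage blocks.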
It’s a hidden but powerful layer of protection and insight — especially for content sites or stores getting unusual traffic spikes.
Do you monitor bot activity regularly on your sites?