OK, since installing XenForo yesterday there have been times when things are running slow, and looking at the online list there have been quite a few bots present. At times I've hit my server's resource limit (I use the tool on my host's site), showing too many connections.

Earlier today I decided to add the following to my robots.txt file, and things have been running much smoother and faster since. It may prove useful to others here to limit all spiders (not just some) if you're on shared hosting. It may also benefit you to stop using things like "TwitterFeed" to auto-send all threads to Facebook, Twitter, etc., which only sends lots of spiders to your site non-stop, slowing things down. Shared hosting isn't really cut out for it!

This is what I use in my robots.txt, the Crawl-delay line being the thing I'm talking about here. If you're on shared hosting it might pay to start limiting them.

Code:
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/

Could a moderator move this to Tips for people using shared hosting? I've posted it in the wrong forum by mistake.
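If your logs or the online list show one particular crawler hammering the site harder than the rest, you can also give it a stricter rule of its own on top of the catch-all. A rough sketch below ("Bingbot" is just an example name, swap in whatever bot is actually hitting you; and bear in mind not every search engine honours Crawl-delay):

Code:
# Slow down one specific aggressive bot (example name only)
User-agent: Bingbot
Crawl-delay: 30

# Everyone else
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/

Crawlers obey the most specific User-agent group that matches them, so the singled-out bot uses its own delay instead of the catch-all one.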