Earlier today I decided to add this to my robots.txt file, and things have been running much smoother and faster since. It may prove useful to others here to limit all spiders (not just some) if you're on shared hosting. It may also benefit you to stop using things like TwitterFeed to auto-send all threads to Facebook, Twitter, etc., which only sends lots of spiders to your site nonstop, slowing things down. Shared hosting isn't really cut out for it!
I use this in my robots.txt, the Crawl-delay directive being the thing I'm talking about here. If you're on shared hosting, it might pay to start limiting them.
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
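If you'd rather slow down only particular crawlers instead of all of them, you can add per-bot blocks above the catch-all (Bingbot below is just an example). One caveat worth knowing: Crawl-delay is a nonstandard directive, and some crawlers, notably Googlebot, ignore it entirely (Google's crawl rate is set in Search Console instead).

```
# Example: slow a specific bot more aggressively
User-agent: Bingbot
Crawl-delay: 20

# Default for everything else
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
```

More specific User-agent blocks take precedence over the `*` block for the bots they name, so the two sections don't conflict.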