I had this happen recently when a popular thread overloaded the servers in my (soon-to-be former) shared hosting environment. I attributed all of my errors to the site being offline and otherwise unavailable for two days while DNS for the new hosting service propagated. Note, however, that their email suggests overloading alone might do this. Support at the new host also looked at the errors in Webmaster Tools and installed a default robots.txt for me. I'm not sure that mattered -- I've never wanted to block bots, so I never bothered to set up a robots.txt file.
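For reference, a "default" robots.txt of the sort a host might install -- I'm assuming here it was the permissive kind that blocks nothing -- is just two lines:

```
User-agent: *
Disallow:
```

An empty `Disallow:` value means no paths are disallowed, so all crawlers are permitted everywhere; it mostly just stops the 404s that bots generate when they request a robots.txt that doesn't exist.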
If 1059 errors are only 3%, you're getting a lot of bot activity!
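The back-of-the-envelope arithmetic behind that remark, assuming the 3% is a share of total crawl requests:

```python
# If 1059 errors represent 3% of crawl requests,
# estimate the total crawl volume.
errors = 1059
error_rate = 0.03  # 3%

total_requests = errors / error_rate
print(round(total_requests))  # roughly 35,000 crawl requests
```

That's on the order of 35,000 bot requests over whatever period the error report covers.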