Your logo looks familiar... you the one that created the sitemap that caused this guy to de-index?
j/k
Is that even possible? Can a bug or glitch in the code cause Google to de-list your site?
No, not likely
It is more likely that the person's server is returning a bunch of 404, 503, or 500 errors, which makes the crawlers give up and stop indexing the site; and if the website starts becoming slow, the crawl rate gets reduced as well.
Blaming the sitemap would really be looking for a scapegoat. I would go to Health -> Crawl Errors in Google Webmaster Tools and check the actual reason why a URL could not be reached.
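If you want to check this yourself rather than wait for Webmaster Tools, you can pull the URLs out of your sitemap and see which ones come back 404/500/503. Here is a minimal Python sketch of that idea; the sitemap snippet and the simulated responses are hypothetical stand-ins (in real use you would fetch your actual sitemap.xml and issue a HEAD request per URL):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap snippet; in practice, fetch your real sitemap.xml.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
</urlset>"""

# Sitemaps live in this XML namespace, so findall() needs it spelled out.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

def flag_errors(url_statuses):
    """Given (url, http_status) pairs, keep the ones crawlers choke on."""
    return [(u, s) for u, s in url_statuses if s in (404, 500, 503)]

urls = sitemap_urls(SITEMAP_XML)

# In real use you would HEAD-request each URL (e.g. with urllib);
# here we simulate the responses to show the filtering step.
simulated = [(urls[0], 200), (urls[1], 503)]
print(flag_errors(simulated))  # only the 503 page is reported
```

If that list is not empty, the crawler is hitting dead pages and the sitemap was never the problem.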
Another thing to know is that Google has a multi-datacenter, multi-tier architecture, so it is entirely possible that one query returns one result count and the next query returns a different one.
Just look at these two (I just took them right now)
..... wow, between refreshes Google "lost" 10,000 pages.
Is it that they were dropped from the index?
Well, of course not. It just means your search was served by a different fleet that doesn't happen to have those 10,000 pages. You are not necessarily querying the same index every time.
I do not recommend checking Google Webmaster Tools daily; it will drive you crazy. First of all, it is slow: your website loads slowly on a Monday and it may not report that until the Friday of the next week, and it might show some pages as not indexed even though you search for them and there they are.
It is a GOOD reference, but not a foolproof system. Don't read too much into it. Optimize for SEO a little because it is good sense to do so (you wouldn't create a business but fail to add it to the yellow pages, would you?), and then move on to working on your site and your content, since that is what will keep your visitors and attract even more.