Hi there, 4-5 pages were indexed. When I logged into wtools, it tells
me that it is unable to fetch robots.txt, so gbot will not be coming....
Those 4-5 pages are no longer indexed.
gbot has not been around for 3 days.
The wtools robots.txt test says the robots.txt is Allowed,
so what I take from this is that the robots.txt is good.
Any ideas on why gbot cannot fetch robots.txt?
I am not using my home page, so I am redirecting home to
However, I am not sure why that would prevent
gbot from fetching the robots.txt.
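For what it's worth, one way to see what a crawler actually gets back for robots.txt is to fetch it directly and look at the status code. This is just a sketch I'd use to sanity-check things (the domain is a placeholder, not my real site); a redirect or non-200 status on /robots.txt itself would explain gbot's "unable to fetch" message:

```python
import urllib.request
import urllib.error

def robots_status(base_url):
    """Return the HTTP status code for <base_url>/robots.txt.

    Note: urllib follows redirects silently, so a 200 here could still
    hide a redirect; a 4xx/5xx is reported via HTTPError.
    """
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # Non-2xx responses (404, 403, 500, ...) land here.
        return e.code

# Placeholder domain -- substitute the real site:
# print(robots_status("http://example.com"))
```

If that prints anything other than 200, or hangs, that's likely the same thing gbot is running into.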