Keep Alive error on Search Console

El Porcharo

Well-known member
My Search Console is complaining about /login/keep-alive URL not being indexed...

It is showing "Submitted URL blocked due to other 4xx issue" which means (according to this page) "The server encountered a 4xx error not covered by any other issue type described here. Try debugging your page using the URL Inspection tool."

I'm aware this is not a page that should be indexed, but I can't find a way to tell Search Console "don't bother, just forget it".

So... is there a way to prevent this kind of system URL from being indexed at all?
 

Tracy Perry

Well-known member
Yes, there is... but the removal tool is only a workaround, since every URL you report is removed temporarily, for just 6 months.
But if they're included in robots.txt as a disallow, they shouldn't be re-indexed. I'd check ALL of the URLs you don't want indexed and include them.

This is my robots.txt and I've really not had an issue with it

Code:
User-agent: PetalBot
User-agent: AspiegelBot
User-agent: AhrefsBot
User-agent: SemrushBot
User-agent: DotBot
User-agent: MauiBot
User-agent: MJ12bot
User-agent: SemRush
Disallow: /

User-agent: Amazonbot
Disallow: /threads/*/reply

User-agent: *
Disallow: /find-new/
Disallow: /account/
Disallow: /attachments/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php
Disallow: /lost-password/
Disallow: /misc/contact/
Disallow: /members/
Disallow: /misc/style
Disallow: /whats-new/
Disallow: /search/
Disallow: /misc/language
Disallow: /forums/rss-feeds.25/
Allow: /
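If you want to sanity-check rules like these before deploying, a minimal sketch using Python's standard-library robots.txt parser (the domain and paths here are just illustrative, mirroring the rules above):

```python
# Sketch: verify robots.txt disallow rules with Python's stdlib parser.
# The rules below are a simplified excerpt; example.com is a placeholder.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /login/
Disallow: /admin.php
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot has no dedicated group here, so it falls under "User-agent: *".
# /login/keep-alive matches "Disallow: /login/", so crawling is refused.
print(parser.can_fetch("Googlebot", "https://example.com/login/keep-alive"))   # False
# An ordinary thread URL only matches "Allow: /", so crawling is permitted.
print(parser.can_fetch("Googlebot", "https://example.com/threads/some-topic/"))  # True
```

Note that blocking a URL in robots.txt stops crawling rather than guaranteeing de-indexing, but for internal endpoints like /login/keep-alive that Google only discovers by crawling, a disallow is usually enough to make the error report go away.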