Robots.txt and sitemap questions

PageSpeed Insights just gave me this message:

" robots.txt is not valid - Lighthouse was unable to download a robots.txt file"

It also reports:

Line 2 | Content: Disallow * | Error: No user agent specified

I think it's working, so why does it say it's not valid? This is what I have:


User-agent: AspiegelBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: MauiBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: ImageSift
Disallow: /

User-agent: AnthropicBot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: ByteDance
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: *
Disallow: /admin.php
Disallow: /account/
Disallow: /goto/
Disallow: /login/
Disallow: /register/
Disallow: /search/
Disallow: /help/
Disallow: /members/

Sitemap: https://www.xxxxxxxxxxxx.com/sitemap.xml
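For what it's worth, the file above parses fine with Python's standard-library robots.txt parser, which suggests Lighthouse simply failed to *download* the file (or downloaded something else, like an error page) rather than the syntax being wrong. A minimal sketch, using an abridged copy of the rules above with a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# Abridged copy of the robots.txt from the post above (example.com is a placeholder).
ROBOTS_TXT = """\
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow: /admin.php
Disallow: /account/

Sitemap: https://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("AhrefsBot", "/"))           # blocked entirely by its own group
print(rp.can_fetch("Googlebot", "/admin.php"))  # blocked by the * group
print(rp.can_fetch("Googlebot", "/threads/1"))  # allowed
print(rp.site_maps())                           # sitemap URL is picked up (Python 3.8+)
```

If that parses cleanly but Lighthouse still complains, check what your server actually returns for /robots.txt (status code, redirects, content type) as seen by the Lighthouse user agent.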
 
Old thread, but a question: xenforo.com robots.txt has Disallow: /community/whats-new/ - what's the reasoning behind that?
 
That's essentially a search landing page that generates dynamic content (based on new posts, etc.). You probably don't want crawlers hitting search links like that one, since they can eat up resources. That's not where your actual content is, so there's no real harm in keeping it disallowed.
 