Robots.txt and sitemap questions

PageSpeed Insights just gave me this message:

" robots.txt is not valid - Lighthouse was unable to download a robots.txt file"

It says:

Line 2, Content: Disallow *, Error: no user agent specified

I think it's working, but why does it say it's not valid? This is what I have:

User-agent: AspiegelBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: MauiBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: ImageSift
Disallow: /

User-agent: AnthropicBot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: ByteDance
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: *
Disallow: /admin.php
Disallow: /account/
Disallow: /goto/
Disallow: /login/
Disallow: /register/
Disallow: /search/
Disallow: /help/
Disallow: /members/

Sitemap: https://www.xxxxxxxxxxxx.com/sitemap.xml
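In case it helps with troubleshooting, here's a quick sketch of how I can check that the file is reachable and parses, using Python's built-in urllib.robotparser (example.com stands in for my real domain, which I've redacted above):

from urllib.robotparser import RobotFileParser

# example.com is a placeholder for my real (redacted) domain
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the file; trouble here would line up with Lighthouse saying it couldn't download it

# Both of these should print False given the rules above:
# /admin.php is disallowed for everyone, and AhrefsBot is blocked entirely
print(rp.can_fetch("*", "https://www.example.com/admin.php"))
print(rp.can_fetch("AhrefsBot", "https://www.example.com/"))

# The Sitemap line should show up too (Python 3.8+)
print(rp.site_maps())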