Brilliant, thanks @dutchbb !
From the SEO firm:
"Several /tags/ URLs are showing up in Google's Search Index causing "index bloat"
All tag pages must be made "not-indexable" by way of meta robots tags.
Reference: https://developers.google.com/search/reference/robots_meta_tag
Place the following code into the <head> of all /tags/ URLs:
<meta name="robots" content="noindex" />
Important:
Only add this code to tag pages. Adding this anywhere else could result in pages not being indexed.
Be sure it is only "noindex"
Make sure the snippet is found in the <head>"
I'll need to dig into the pros/cons of doing this via robots.txt vs. a noindex meta tag in the <head>.
Yep, I have the noindex tag on my tag pages. That's correct advice, indeed only noindex. Just add it in the 'tag_view' template like this:
<xen:container var="$head.robots">
<meta name="robots" content="noindex" /></xen:container>
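If you want to double-check that the tag actually ends up in the <head> of your tag pages (and nowhere else), a quick stdlib-only Python sketch like the one below will do it. The URL is just a placeholder; point it at one of your own /tags/ URLs, and at a normal thread page to confirm the tag is not there.

# Quick check: fetch a page and report any <meta name="robots"> found in its <head>.
# Stdlib only; the URL below is a placeholder, swap in a real /tags/ URL.
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.robots = []  # content values of robots meta tags seen inside <head>

    def handle_starttag(self, tag, attrs):
        if tag == "head":
            self.in_head = True
        elif tag == "meta" and self.in_head:
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots.append(attrs.get("content", ""))

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

url = "https://example.com/forum/tags/seo/"  # placeholder tag page
finder = RobotsMetaFinder()
finder.feed(urlopen(url).read().decode("utf-8", errors="replace"))
print(url, "->", finder.robots or "no robots meta in <head>")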
Google prefers noindex if you don't want pages in the search results (low-content pages) and will also remove them faster. If you block them in robots.txt, Google can still index the URLs when they are linked to from elsewhere, and it takes a little longer for them to be removed. Noindex is the correct signal for removal. Block in robots.txt or use nofollow only if adding noindex is not possible or there is no usable content on the page.
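To make that concrete: a well-behaved crawler checks robots.txt before fetching, so if a tag page is disallowed there, the noindex on the page itself is never even read. A small Python sketch (placeholder URLs, same caveat as above):

# Why robots.txt blocking and noindex don't combine: a crawler that obeys
# robots.txt never fetches the page, so it never sees the meta noindex.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder site
rp.read()

tag_url = "https://example.com/forum/tags/seo/"  # placeholder tag page
if rp.can_fetch("Googlebot", tag_url):
    print("Crawlable: the bot can fetch the page and act on its noindex.")
else:
    print("Disallowed: the noindex on this page will never be read, so the")
    print("bare URL can still be indexed if other sites link to it.")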
The question here is whether it matters; Google is pretty good at sorting this out themselves.