One of our site admins has asked about blocking access by search engines. For business reasons the site is public; they just don't want it indexed.
They've used robots.txt correctly, and Google has obeyed, but some other search engines haven't. On an Apache server we could play with .htaccess and probably block them that way, but I read this article on the X-Robots-Tag header (https://yoast.com/prevent-site-being-indexed/) with interest, and wondered whether it would be possible/sensible to set that header from a parse file, in a similar way to what I've done with mime-types:
<MySource_PRINT id_name="__global__" var="content_type" content_type="application/vnd.google-earth.kml+xml" />
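For reference, the .htaccess version of what that article recommends would be something like this (untested on our setup; assumes mod_headers is enabled):

# sketch of the .htaccess approach from the Yoast article; assumes mod_headers is available
<IfModule mod_headers.c>
    # tell compliant crawlers not to index or follow anything served from this site
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

What I don't know is whether there's a parse-file equivalent for setting an arbitrary response header the way the tag above sets the content type.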
Alternatively, is it possible to 'deny' a user group based on certain parameters (e.g. user agent) when you have already granted public read? Or is that cart before the horse?
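For comparison, if we end up doing this at the web-server level rather than in Matrix, my understanding is a user-agent deny on Apache 2.4 would look roughly like the sketch below ("BadBot" is a placeholder pattern, not a real crawler):

# rough sketch for Apache 2.4 (mod_setenvif + mod_authz_core); "BadBot" is a placeholder
BrowserMatchNoCase "BadBot" deny_bot
<RequireAll>
    # keep the site publicly readable...
    Require all granted
    # ...but refuse requests flagged by the user-agent match above
    Require not env deny_bot
</RequireAll>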
Sorry, being a bit lazy here, asking before trying stuff out myself. I'm under pressure for quick answers and was just hoping someone had already travelled this path.