discourse/app/views/robots_txt/no_index.erb
Sam Saffron bb4e8899c4
FEATURE: let Google index pages so it can remove them
Google has to be able to crawl a page before it will drop it from the
index; if robots.txt blocks the URL outright, Googlebot never sees the
noindex signal and cannot remove the page.

see: https://support.google.com/webmasters/answer/6332384?hl=en

This change ensures that we have special behavior for Googlebot:
robots.txt allows it to crawl, while actual indexing is blocked via the
X-Robots-Tag response header.
2020-05-11 12:15:18 +10:00
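
For context, the robots.txt below only permits crawling; the indexing block itself comes from an X-Robots-Tag header set by the Rails app. Here is a minimal sketch of that companion behavior, not the actual Discourse code: the before_action name is a hypothetical placeholder, and the check against the allow_index_in_robots_txt site setting is an assumption about where the flag lives.

# Sketch only: attach a noindex header to responses when the site is
# configured not to be indexed. Googlebot can still crawl the page
# (robots.txt allows it) and then drops it from the index.
class ApplicationController < ActionController::Base
  before_action :add_noindex_header

  private

  def add_noindex_header
    unless SiteSetting.allow_index_in_robots_txt
      response.headers["X-Robots-Tag"] = "noindex, nofollow"
    end
  end
end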


# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# Googlebot must be allowed to crawl so it can remove items from the index;
# we return the X-Robots-Tag header with noindex, nofollow, which ensures
# indexing is minimized and nothing shows up in Google search results
User-agent: googlebot
Allow: <%= Discourse.base_uri + "/" %>
Disallow: <%= Discourse.base_uri + "/uploads/*" %>

User-agent: *
Disallow: <%= Discourse.base_uri + "/" %>
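
For illustration, on a subfolder install where Discourse.base_uri returns "/forum" (an assumed value; on a root install it is an empty string), the template above renders roughly as:

User-agent: googlebot
Allow: /forum/
Disallow: /forum/uploads/*

User-agent: *
Disallow: /forum/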