discourse/app/views/robots_txt/index.erb
Robin Ward 3d7dbdedc0 FEATURE: An API to help sites build robots.txt files programmatically
This is mainly useful for subfolder sites, which need to expose their
robots.txt contents to a parent site.
2018-04-16 15:43:20 -04:00
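
For context, a minimal sketch of how a parent site might consume this; the host name, file path, and fetch-and-append approach below are illustrative assumptions, not part of this commit:

# Hypothetical: pull the subfolder Discourse site's robots.txt and append it
# to the parent site's root robots.txt, as the template's comment suggests.
require "net/http"
require "uri"

forum_rules = Net::HTTP.get(URI("https://example.com/forum/robots.txt"))

File.open("/var/www/example.com/public/robots.txt", "a") do |f|
  f.puts forum_rules
end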


<%= @robots_info[:header] %>
<% if Discourse.base_uri.present? %>
# This robots.txt file is not used. Please append the content below to the robots.txt file located at the root
<% end %>
#
<% @robots_info[:agents].each do |agent| %>
User-agent: <%= agent[:name] %>
<%- if agent[:delay] -%>
Crawl-delay: <%= agent[:delay] %>
<%- end -%>
<% agent[:disallow].each do |path| %>
Disallow: <%= path %>
<% end %>
<% end %>
<%= server_plugin_outlet "robots_txt_index" %>
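
For reference, a minimal sketch of the @robots_info structure this template expects; the field names are inferred from the template above, and the sample values are placeholders:

# Assumed shape of @robots_info (values are illustrative only):
robots_info = {
  header: "# robots.txt for example.com",
  agents: [
    { name: "*", delay: nil, disallow: ["/admin/", "/auth/"] },
    { name: "Googlebot", delay: 10, disallow: ["/admin/"] }
  ]
}

Rendered through the template, an agent entry like the second one produces output along the lines of:

User-agent: Googlebot
Crawl-delay: 10
Disallow: /admin/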