Mirror of https://github.com/discourse/discourse.git, synced 2024-12-05 10:03:43 +08:00
7bd3986b21
We have a couple of site settings, `slow_down_crawler_user_agents` and `slow_down_crawler_rate`, that are meant to allow site owners to signal to specific crawlers that they're crawling the site too aggressively and that they should slow down.

When a crawler is added to the `slow_down_crawler_user_agents` setting, Discourse currently adds a `Crawl-delay` directive for that crawler in `/robots.txt`. Unfortunately, many crawlers don't support the `Crawl-delay` directive in `/robots.txt`, which leaves site owners with no options if a crawler is crawling the site too aggressively.

This PR replaces the `Crawl-delay` directive with proper rate limiting for crawlers added to the `slow_down_crawler_user_agents` list. On every request made by a non-logged-in user, Discourse will check the User Agent string, and if it contains one of the values in the `slow_down_crawler_user_agents` list, Discourse will only allow 1 request every N seconds for that User Agent (N is the value of the `slow_down_crawler_rate` setting); the rest of the requests made within the same interval will get a 429 response.

The `slow_down_crawler_user_agents` setting becomes quite dangerous with this PR since it could rate limit a lot of, if not all, anonymous traffic if it is not used appropriately. To protect against this scenario, we've added a couple of new validations to the setting when it's changed:

1) each value added to the setting must be 3 characters or longer

2) each value cannot be a substring of tokens found in popular browser User Agents. The current list of prohibited values is: apple, windows, linux, ubuntu, gecko, firefox, chrome, safari, applewebkit, webkit, mozilla, macintosh, khtml, intel, osx, os x, iphone, ipad and mac.
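To make the new behaviour concrete, below is a minimal, hypothetical sketch of this kind of per-crawler throttling written as a Rack-style middleware. It is not the Discourse implementation: only the setting names `slow_down_crawler_user_agents` and `slow_down_crawler_rate` come from the description above, while the class name, the in-memory timestamp store and the `logged_in?` helper are illustrative stand-ins.

# Minimal sketch (not the actual Discourse code): a Rack-style middleware that
# allows a matched crawler one request every `rate_seconds` seconds and answers
# the rest with 429. State lives in process memory for illustration only.
class SlowDownCrawlerSketch
  def initialize(app, user_agents:, rate_seconds:)
    @app = app
    @user_agents = user_agents.map(&:downcase) # slow_down_crawler_user_agents values
    @rate_seconds = rate_seconds               # slow_down_crawler_rate value (N)
    @last_allowed = {}                         # matched UA value => Time of last allowed request
  end

  def call(env)
    ua = (env["HTTP_USER_AGENT"] || "").downcase
    matched = @user_agents.find { |crawler| ua.include?(crawler) }

    # Only anonymous traffic is throttled; `logged_in?` is a placeholder for
    # the real "is there a current user?" check.
    if matched && !logged_in?(env)
      now = Time.now
      last = @last_allowed[matched]
      if last && (now - last) < @rate_seconds
        return [429, { "Retry-After" => @rate_seconds.to_s }, ["Too Many Requests"]]
      end
      @last_allowed[matched] = now
    end

    @app.call(env)
  end

  private

  def logged_in?(env)
    env.key?("discourse.current_user") # hypothetical marker, not a real Discourse key
  end
end

In a real multi-process deployment the timestamps would have to live in a shared store such as Redis so that every worker enforces the same 1-request-per-N-seconds window.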
16 lines, 396 B, Plaintext
<%= @robots_info[:header] %>
<% if Discourse.base_path.present? %>
# This robots.txt file is not used. Please append the content below in the robots.txt file located at the root
<% end %>
#
<% @robots_info[:agents].each do |agent| %>
User-agent: <%= agent[:name] %>
<% agent[:disallow].each do |path| %>
Disallow: <%= path %>
<% end %>


<% end %>

<%= server_plugin_outlet "robots_txt_index" %>