* Phase 0 for user-selectable theme components
- Drops the `key` column from the `themes` table
- Drops the `theme_key` column from the `user_options` table
- Adds a `theme_ids` column (array of ints, default `[]`) to the `user_options` table and migrates data from `theme_key` to the new column (see the sketch after this list)
- Removes the `default_theme_key` site setting and adds `default_theme_id` instead
- Replaces the `theme_key` cookie with a new one called `theme_ids`
- Removes `Theme.settings_for_client`, which is no longer needed
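A hypothetical sketch of what the Phase 0 migration could look like (the class name and the data-migration SQL are assumptions; the column changes follow the list above):

```ruby
class MigrateThemeKeyToThemeIds < ActiveRecord::Migration[5.1]
  def up
    # New array column on user_options, defaulting to [].
    add_column :user_options, :theme_ids, :integer, array: true, default: [], null: false

    # Migrate each user's old theme_key to the matching theme id.
    execute <<~SQL
      UPDATE user_options uo
      SET theme_ids = ARRAY[t.id]
      FROM themes t
      WHERE t.key = uo.theme_key
    SQL

    remove_column :user_options, :theme_key
    remove_column :themes, :key
  end
end
```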
* Add the ability to create hidden posts via PostCreator
* FEATURE: Create hidden posts for received spam emails
Spam checkers usually return one of three results: HAM, SPAM, and PROBABLY_SPAM.
SPAM usually gets rejected outright and needs no further handling.
HAM is a good message and usually gets passed through unmodified.
PROBABLY_SPAM gets an additional header to allow further processing.
This change adds processing capabilities for such headers and marks
posts created from such emails as hidden.
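A minimal sketch of the header handling, assuming the message is parsed with the `mail` gem, the upstream filter sets `X-Spam-Flag: YES` for PROBABLY_SPAM, and PostCreator accepts a `hidden` option (all three are assumptions; the real header and option names may differ):

```ruby
require "mail"

def create_post_from_incoming_email(user, raw_email)
  mail = Mail.new(raw_email)

  # PROBABLY_SPAM: the spam checker passed the message through but
  # flagged it via an additional header.
  probably_spam = mail.header["X-Spam-Flag"]&.value.to_s.strip.casecmp?("YES")

  PostCreator.create(
    user,
    raw: mail.body.decoded,
    hidden: probably_spam # hide the post for review instead of publishing it
  )
end
```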
3 years is a conservative limit that leaves a wide buffer for
year-over-year analysis. The max is set to 5 years because that is
the logging policy listed for hosted Discourse.
- adds a link to the search logs
- displays a message when search query logging is disabled
- adds links to trust levels and user types
- adds a description for each report when browsing a report directly
* Feature: Push notifications for Android
Notification configuration for desktop and mobile is now merged.
Desktop notifications stay as they are for desktop views.
In mobile mode, push notifications are enabled instead.
Push notification subscriptions are stored in their own table rather than
in custom fields (a hypothetical migration is sketched below).
Notification banner prompts appear for both mobile and desktop when enabled.
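A hypothetical migration for the dedicated table (the table and column names are assumptions; the Web Push API hands the browser's endpoint plus p256dh/auth keys to the server, so a serialized `data` column is one plausible shape):

```ruby
class CreatePushSubscriptions < ActiveRecord::Migration[5.1]
  def change
    create_table :push_subscriptions do |t|
      t.integer :user_id, null: false
      t.string :data, null: false # serialized endpoint + p256dh/auth keys from the browser
      t.timestamps
    end
    add_index :push_subscriptions, :user_id
  end
end
```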
In some cases `add_groups` and `remove_groups` are too much work; some sites
may wish to simply synchronize group membership based on a list.
When `sso_overrides_groups` is enabled, all non-automatic group membership is
sourced from SSO. Note that if you omit the groups attribute, memberships
will be cleared out (see the payload sketch below).
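A minimal sketch of a payload carrying the list, assuming the standard Discourse SSO signing scheme (query-string body, Base64 encoding, HMAC-SHA256 signature); the attribute values are examples only:

```ruby
require "base64"
require "openssl"
require "cgi"

secret = "sso-secret-shared-with-discourse"

params = {
  nonce: "cb68251eefb5211e58c00ff1395f0c0b",
  external_id: "42",
  email: "user@example.com",
  groups: "support,sales" # full list to synchronize; omit this key and memberships are cleared
}

payload = Base64.strict_encode64(params.map { |k, v| "#{k}=#{CGI.escape(v.to_s)}" }.join("&"))
sig = OpenSSL::HMAC.hexdigest("sha256", secret, payload)
# Send payload and sig back to Discourse as the "sso" and "sig" parameters.
```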
Also moved Bing to a default crawl delay, so no more than one request every 5 seconds is allowed.
New site settings:
"slow_down_crawler_user_agents" - list of crawlers that will be slowed down
"slow_down_crawler_rate" - how many seconds to wait between requests
Not enforced server side yet
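For example, with `slow_down_crawler_user_agents` containing `bingbot` and `slow_down_crawler_rate` set to 5, the generated robots.txt would plausibly include (the exact output is an assumption, and as noted above it is not enforced server side yet):

```
User-agent: bingbot
Crawl-delay: 5
```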
Bing is crawling our properties 10x faster than any other crawler; until its
default behavior improves, we are blocking it out of the box.
You may re-enable it by setting the blacklist back to empty.
This feature can be enabled by choosing a destination category in the
`shared drafts category` site setting.
* Staff members can create shared drafts, choosing a destination
category for the topic when it is published.
* Shared Drafts can be viewed in their category, or above the
topic list for the destination category where they will end up.
* When the shared draft is ready, it can be published to the
appropriate category by clicking a button on the topic view.
* When published, Drafts change their timestamps to the current
time, and any edits to the original post are removed.
We trust staff and TL2+ users to perform edits within the grace period, so
we allow them significantly more edit room before storing a revision.
`editing_grace_period_max_diff_high_trust` applies to users at TL2 and up:
- tl0 / tl1: we store an extra revision if more than 100 chars change
- tl2 and up: we store an extra revision if more than 400 chars change
We may tweak these numbers as we go (a sketch of the threshold selection follows).
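A minimal sketch of the threshold selection (the method and the name of the base setting are assumptions; `editing_grace_period_max_diff_high_trust` is the setting added by this change):

```ruby
def grace_period_max_diff_for(user)
  if user.staff? || user.trust_level >= 2
    SiteSetting.editing_grace_period_max_diff_high_trust # e.g. 400 chars
  else
    SiteSetting.editing_grace_period_max_diff # e.g. 100 chars
  end
end

# An extra revision is stored when the grace-period diff exceeds the limit:
# store_revision if diff_length > grace_period_max_diff_for(editor)
```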
If a user performs a substantive edit of 20 chars or more during the grace
period, we will store a revision to track the change.
This allows for better auditing of changes that happen during the grace period.
* FEATURE: New site setting for additional allowed filetypes for staff
* Fix problematic variable name
* Address review feedback
* Fix small issues
* Fix indentation
* Fix failing tests
* Remove message bus usage and fix minor issues
* Remove a missed message bus reference
New site settings:
`enable_markdown_linkify`: default on; auto-links https://, http://, and mailto: addresses
`markdown_linkify_tlds`: controls which TLDs get auto-linked for bare domains such as www.site.com; default is com|net|gov
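A behavioral sketch of the two settings, assuming Discourse's `PrettyText.cook` pipeline (the exact cooked HTML is illustrative):

```ruby
SiteSetting.enable_markdown_linkify = true
SiteSetting.markdown_linkify_tlds = "com|net|gov"

PrettyText.cook("visit www.site.com")
# => "<p>visit <a href=\"http://www.site.com\">www.site.com</a></p>"

PrettyText.cook("visit www.site.xyz")
# => "<p>visit www.site.xyz</p>"  ("xyz" is not in the TLD list, so no link)
```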
- Phase one: does the user agent match 'trident|webkit|gecko|chrome|safari|msie|opera'?
If yes, it is possibly a browser.
- Phase two: does it match 'rss|bot|spider|crawler|facebook|archive|wayback|ping|monitor'?
If yes, it is probably a crawler (a sketch of the check follows the link below).
Based off: https://gist.github.com/SamSaffron/6cfad7ea3e6df321ffb7a84f93720a53
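A minimal sketch of the two-phase check described above (the method name and how the phases combine are assumptions):

```ruby
BROWSER_RE = /trident|webkit|gecko|chrome|safari|msie|opera/i
CRAWLER_RE = /rss|bot|spider|crawler|facebook|archive|wayback|ping|monitor/i

def probably_crawler?(user_agent)
  ua = user_agent.to_s
  # Phase one: no known browser-engine token at all -> treat as a crawler.
  return true unless BROWSER_RE.match?(ua)
  # Phase two: bots often embed "Chrome"/"Safari" in their UA string,
  # so a crawler token still marks it as a crawler.
  CRAWLER_RE.match?(ua)
end
```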