While adding specs to ensure that post actions and uploads are removed
for permanently deleted posts, I noticed that post revisions were not
permanently destroyed. A migration is included to fix the old data.
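A rough sketch of the kind of spec involved (not the exact spec from this commit; it assumes the permanent-delete path goes through PostDestroyer with the force_destroy option, and the real spec may need extra setup such as site settings):

```ruby
describe PostDestroyer do
  fab!(:admin) { Fabricate(:admin) }
  fab!(:post) { Fabricate(:post) }

  it "removes associated records when a post is permanently deleted" do
    PostActionCreator.like(Fabricate(:user), post)
    post.revise(admin, raw: "edited raw so that a revision exists")

    PostDestroyer.new(admin, post.reload, force_destroy: true).destroy

    expect(PostAction.where(post_id: post.id)).to be_empty
    expect(PostRevision.where(post_id: post.id)).to be_empty
  end
end
```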
The PostRevisor instance is now passed to the post_edited event. It is
useful to know exactly what happened to the topic in this event (we
already pass a boolean for topic_changed?, but that is not very helpful
by itself).
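For illustration, a plugin could consume the extra argument roughly like this (the argument order and the topic_diff accessor are assumptions; check the trigger call in PostRevisor for the authoritative signature):

```ruby
DiscourseEvent.on(:post_edited) do |post, topic_changed, revisor|
  # With the revisor available, a plugin can ask what actually changed
  # instead of only knowing that "something" about the topic changed.
  if topic_changed
    Rails.logger.info(
      "post_edited: topic #{post.topic_id} changed, diff: #{revisor.topic_diff.inspect}"
    )
  end
end
```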
When overriding the translation for i18n keys used in user notifications
like user_notifications.reply_by_email, errors were returned for
valid interpolation keys. Keys like topic_title_url_encoded are
supported, so no error should be raised.
https://meta.discourse.org/t/-/50305/7
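As an illustration, an admin override that uses the supported key should now save without a validation error (the override text below is made up):

```ruby
# TranslationOverride stores admin customizations of i18n strings.
TranslationOverride.upsert!(
  :en,
  "user_notifications.reply_by_email",
  "Reply to this email or visit %{topic_title_url_encoded} on the site."
)
```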
* DEV: upgrade mini_sql
Even though we are not planning on using this quite yet, mini_sql now supports
prepared statements.
Would like this upgrade merged so we can do some benchmarking.
Note that this will not work with pg_bouncer, but sites that are not using it
may benefit from the feature.
* implement multisite-friendly prepared statements
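A minimal sketch of what the benchmarking would exercise, assuming the prepared statement API follows the mini_sql README (the exact call shape is an assumption):

```ruby
conn = MiniSql::Connection.get(ActiveRecord::Base.connection.raw_connection)
user_id = 1

# Regular query: parsed and planned by PostgreSQL on every call.
conn.query("SELECT id FROM topics WHERE user_id = ? LIMIT 10", user_id)

# Prepared query: prepared once per connection and reused afterwards, which
# can save parse/plan time on hot paths. Not usable behind pg_bouncer in
# transaction pooling mode, hence the caveat above.
conn.prepared.query("SELECT id FROM topics WHERE user_id = ? LIMIT 10", user_id)
```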
Fixes `Rack::Lint::LintError: a header value must be a String, but the value of 'Retry-After' is a Integer`. (see: 14a236b4f0/lib/rack/lint.rb (L676))
I found it when I got flooded by those warnings a while back in a test-related accident 😉 (Ember CLI tests were hitting a local Rails server at a fast rate)
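The gist of the fix, as a sketch (names here are illustrative, not the exact Discourse middleware code): the seconds value has to be converted to a String before it is used as a header value.

```ruby
def rate_limit_response(available_in_seconds)
  headers = {
    "Content-Type" => "text/plain; charset=utf-8",
    # Rack::Lint rejects non-String header values, so convert explicitly.
    "Retry-After" => available_in_seconds.to_s,
  }
  [429, headers, ["Slow down, too many requests from this IP address."]]
end
```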
Applying oneboxes and replacing censored watched words did not happen
in a strict order, which often led to inconsistencies. This commit makes
the behavior consistent: oneboxes are never censored.
Making censorship always apply to oneboxes would require significant
changes to the PrettyText pipeline.
This form does not need to be shown if Discourse Connect is enabled,
because the fields that would be filled in here are generally filled in
by the SSO provider. There is also an issue right now where
enable_local_logins and enable_discourse_connect can both be true at the
same time, which is not right.
In extreme circumstances, when the uploads table is huge, the old version of
this migration could take a very long time.
The rewrite extracts the sha1 directly from the badges table and does an
index-based match on the uploads table.
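The shape of the rewritten backfill, as a sketch (the badge image column names below are assumptions): extract the 40-character sha1 from the stored URL and join against uploads.sha1, which is indexed, instead of matching on the full URL.

```ruby
execute <<~SQL
  UPDATE badges b
  SET image_upload_id = u.id
  FROM uploads u
  WHERE b.image_upload_id IS NULL
    AND b.image IS NOT NULL
    AND u.sha1 = substring(b.image FROM '([a-f0-9]{40})')
SQL
```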
Find & Replace and Autotag watched words were not completely exported,
and importing them did not work either. This commit changes the input
and output format to CSV, which allows for a secondary column.
This change is backwards compatible because a CSV file with only one
column has one value per line.
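The backwards compatibility falls out of how CSV parsing treats a line without a separator, for example:

```ruby
require "csv"

CSV.parse_line("some naughty word")            # => ["some naughty word"]  (old one-column format)
CSV.parse_line("find this,replace with this")  # => ["find this", "replace with this"]
```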
There are a few issues which require us to do this:
- We install the latest version of bundler on every rebuild. Therefore we're running 2.2.15 everywhere, even for 'stable' clusters
- Bundler has changed how gem platforms are managed. That meant that on the stable branch we were building libv8 from source via the 'ruby' package, rather than using the precompiled x86_64-linux binary
- Building libv8 from source is currently failing
Together, these things mean that builds of `stable` are currently failing. Each of the above issues should likely be fixed, but this commit provides the quickest route to get things working again. Note that despite the Gemfile.lock update, no gem versions have changed.
The browser-update script does not work correctly in some very old browsers
because the contents of <noscript> are not accessible from JavaScript.
For these browsers, the server can display the crawler page and add the
browser update notice.
Simply loading the browser-update script in the crawler view is not a
solution, because that would mean all crawlers see it too.
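A sketch of the server-side check (the pattern and method name are illustrative, not the exact implementation):

```ruby
VERY_OLD_BROWSER_PATTERN = /MSIE [1-9]\.|Trident\/[1-6]\./

def render_crawler_view?(user_agent)
  # Crawlers already get the static view; browsers too old to read the
  # contents of <noscript> from JavaScript get it as well, with the browser
  # update notice rendered server side.
  CrawlerDetection.crawler?(user_agent) || user_agent.to_s.match?(VERY_OLD_BROWSER_PATTERN)
end
```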