PERF: avoid table scan while performing a very large update (#24525)

We were seeing lots of deadlocks while deploying this migration. This improves
the situation in two ways:

1. The DDL transaction is avoided, so locks are held for far shorter times.
2. The update runs in id ranges of at most 100_000 posts at a time, and each
   range is further filtered down before the update executes (see the sketch
   before the diff below).

* improve code so it is clearer
Sam 2023-11-23 18:15:40 +11:00 committed by GitHub
parent aaadce0652
commit b3920e05e7
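
The chunking described above follows a fairly common batched-backfill pattern: walk
the posts id space downward in fixed-size windows, collect only the ids in each
window that actually match the filter, and run one small UPDATE per window so every
statement commits (and releases its row locks) quickly, instead of one giant UPDATE
holding locks for the whole run. Below is a minimal, annotated sketch of that
pattern; the update_posts_in_chunks name is illustrative only, while DB.query_single
and DB.exec are the same mini_sql helpers the migration itself uses. The actual
change is the diff that follows.

  # Sketch of the chunked-update pattern applied by the migration below;
  # this is not code from the commit itself.
  def update_posts_in_chunks(chunk_size = 100_000)
    # Walk the posts id space from the newest id downwards.
    max_id = DB.query_single("SELECT MAX(id) FROM posts").first.to_i

    while max_id > 0
      # Collect only the ids in the current window that match the filter,
      # so each UPDATE touches far fewer than chunk_size rows.
      ids = DB.query_single(<<~SQL, start: max_id - chunk_size, finish: max_id)
        SELECT id
        FROM posts
        WHERE cooked LIKE '%blockquote%'
        AND id >= :start AND id <= :finish
      SQL

      # Each UPDATE is its own short statement (no wrapping DDL transaction),
      # so row locks are taken and released quickly.
      DB.exec(<<~SQL, ids: ids) if ids && ids.length > 0
        UPDATE posts
        SET baked_version = NULL
        WHERE id IN (:ids)
      SQL

      max_id -= chunk_size
    end
  end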

@@ -1,12 +1,31 @@
 # frozen_string_literal: true
 
 class TriggerPostRebakeCategoryStyleQuotes < ActiveRecord::Migration[7.0]
+  disable_ddl_transaction!
   def up
-    DB.exec(<<~SQL)
-      UPDATE posts
-      SET baked_version = NULL
-      WHERE cooked LIKE '%blockquote%'
+    max_id = DB.query_single(<<~SQL).first.to_i
+      SELECT MAX(id)
+      FROM posts
     SQL
+
+    chunk_size = 100_000
+
+    while max_id > 0
+      ids = DB.query_single(<<~SQL, start: max_id - chunk_size, finish: max_id)
+        SELECT id
+        FROM posts
+        WHERE cooked LIKE '%blockquote%'
+        AND id >= :start AND id <= :finish
+      SQL
+
+      DB.exec(<<~SQL, ids: ids) if ids && ids.length > 0
+        UPDATE posts
+        SET baked_version = NULL
+        WHERE id IN (:ids)
+      SQL
+
+      max_id -= chunk_size
+    end
   end
 
   def down