This repository has been archived by the owner on Apr 26, 2024. It is now read-only.

Commit

Fix background update table-scanning events (#14374)
When this background update ran its last batch, it would try to update all the
events that had been inserted since the background update started, which could
cause a table scan. Make sure we limit the update correctly.
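The bounded-batch pattern this fix enforces can be sketched as follows. This is a minimal standalone illustration using SQLite, not Synapse's actual code; the `events` schema, the `processed` column, and the `run_batch` helper are hypothetical. The key point matches the fix: when fewer than a full batch of rows remains, the batch is capped at the recorded maximum rather than left open-ended, so the final batch cannot sweep over rows inserted after the update started.

```python
import sqlite3


def run_batch(
    conn: sqlite3.Connection,
    min_exclusive: int,
    max_inclusive: int,
    batch_size: int = 100,
) -> int:
    """Process one batch of a background update over `events`.

    Returns the endpoint (inclusive) that this batch updated up to; the
    caller is done once the returned endpoint reaches max_inclusive.
    """
    # Find the stream_ordering of the batch_size'th remaining row.
    cur = conn.execute(
        "SELECT stream_ordering FROM events"
        " WHERE stream_ordering > ? AND stream_ordering <= ?"
        " ORDER BY stream_ordering LIMIT 1 OFFSET ?",
        (min_exclusive, max_inclusive, batch_size - 1),
    )
    row = cur.fetchone()
    # If fewer than batch_size rows remain, cap the batch at the recorded
    # maximum rather than leaving the range open-ended: an open upper bound
    # would scan over rows inserted after the background update started.
    endpoint = row[0] if row else max_inclusive

    # The UPDATE is always bounded on both sides.
    conn.execute(
        "UPDATE events SET processed = 1"
        " WHERE stream_ordering > ? AND stream_ordering <= ?",
        (min_exclusive, endpoint),
    )
    return endpoint
```

A caller would drive this in a loop, feeding each returned endpoint back in as the next lower bound (`pos = run_batch(conn, pos, max_inclusive)`) until `pos` reaches `max_inclusive`.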
richvdh authored Nov 7, 2022
1 parent 42f9d41 commit 2193513
Showing 2 changed files with 9 additions and 8 deletions.
1 change: 1 addition & 0 deletions changelog.d/14374.bugfix
@@ -0,0 +1 @@
+Fix a background database update, introduced in Synapse 1.64.0, which could cause poor database performance.
16 changes: 8 additions & 8 deletions synapse/storage/databases/main/events_bg_updates.py
@@ -1435,16 +1435,16 @@ def _populate_txn(txn: LoggingTransaction) -> bool:
                 ),
             )
 
-            endpoint = None
             row = txn.fetchone()
             if row:
                 endpoint = row[0]
+            else:
+                # if the query didn't return a row, we must be almost done. We just
+                # need to go up to the recorded max_stream_ordering.
+                endpoint = max_stream_ordering_inclusive
 
-            where_clause = "stream_ordering > ?"
-            args = [min_stream_ordering_exclusive]
-            if endpoint:
-                where_clause += " AND stream_ordering <= ?"
-                args.append(endpoint)
+            where_clause = "stream_ordering > ? AND stream_ordering <= ?"
+            args = [min_stream_ordering_exclusive, endpoint]
 
             # now do the updates.
             txn.execute(
@@ -1458,13 +1458,13 @@ def _populate_txn(txn: LoggingTransaction) -> bool:
             )
 
             logger.info(
-                "populated new `events` columns up to %s/%i: updated %i rows",
+                "populated new `events` columns up to %i/%i: updated %i rows",
                 endpoint,
                 max_stream_ordering_inclusive,
                 txn.rowcount,
             )
 
-            if endpoint is None:
+            if endpoint >= max_stream_ordering_inclusive:
                 # we're done
                 return True
 