Compare commits


10 Commits

Travis Ralston
7e09a4d233
Merge e3430769b29a0767431b8daf488aa0cae3d6a1fb into 6ddbb0361222da276cf0f9b76dd2dff92862327e 2025-07-02 18:04:26 +02:00
V02460
6ddbb03612
Raise poetry-core version cap to 2.1.3 (#18575)
Request to raise the defensive version cap for poetry-core from 1.9.1 to
2.1.3.

My understanding is that the major version bump of poetry-core signals the
transition to standardized pyproject.toml metadata, but does not affect
backwards compatibility.

This is a subset of the changes in #18432

Fixes #18200

### Pull Request Checklist


* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should:
  - Be a short description of your change which makes sense to users. ("Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.")
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
  - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
* [x] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
2025-07-02 15:57:30 +00:00
Erik Johnston
cc8da2c5ed
Log the room ID we're purging state for (#18625)
So we can see what we're deleting.
2025-07-02 15:02:12 +01:00
reivilibre
c17fd947f3
Fix documentation of the Delete Room Admin API's status field. (#18519)
Fixes: #18502

---------

Signed-off-by: Olivier 'reivilibre' <oliverw@matrix.org>
2025-07-01 17:55:38 +01:00
Quentin Gliech
24bcdb3f3c
Merge branch 'master' into develop 2025-07-01 17:37:49 +02:00
Quentin Gliech
e3ed93adf3
Add a note in the changelog about the manylinux wheels 2025-07-01 16:01:28 +02:00
Quentin Gliech
214ac2f005
1.133.0 2025-07-01 15:13:42 +02:00
Quentin Gliech
c471e84697
Bump cibuildwheel to 3.0.0 to fix the building of wheels (#18615)
Fixes https://github.com/element-hq/synapse/issues/18614

This upgrades CIBW to 3.0.0, which now builds using the manylinux_2_28
image, as the previous image is EOL and no longer supported by some of our
dependencies.

This also updates the job to use the `ubuntu-24.04` base image instead
of `ubuntu-22.04`.
2025-07-01 14:54:33 +02:00
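
For operators unsure whether the new manylinux_2_28 wheels will install on a given host, a minimal sketch (not part of this change) that checks the local glibc version against the 2.28 threshold noted in the changelog below:

```python
# Quick local check: can this host use manylinux_2_28 wheels?
# Illustrative only; the 2.28 threshold comes from the manylinux_2_28 spec.
import platform

libc_name, libc_version = platform.libc_ver()  # e.g. ("glibc", "2.31") on most Linux distros


def supports_manylinux_2_28(version: str) -> bool:
    """Return True if the reported glibc version is at least 2.28."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (2, 28)


if libc_name == "glibc" and libc_version:
    verdict = "supported" if supports_manylinux_2_28(libc_version) else "not supported"
    print(f"glibc {libc_version}: manylinux_2_28 wheels {verdict}")
else:
    print("Not a glibc-based system; manylinux wheels may not apply.")
```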
Andrew Morgan
291880012f
Stop sending or processing the origin field in PDUs (#18418)
Co-authored-by: Quentin Gliech <quenting@element.io>
Co-authored-by: Eric Eastwood <erice@element.io>
2025-07-01 12:04:23 +01:00
Krishan
a2bee2f255
Add via param to hierarchy endpoint (#18070)
### Pull Request Checklist

Implementation of
[MSC4235](https://github.com/matrix-org/matrix-spec-proposals/pull/4235)
as per suggestion in [pull request
17750](https://github.com/element-hq/synapse/pull/17750#issuecomment-2411248598).


* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should:
  - Be a short description of your change which makes sense to users. ("Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.")
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
  - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
* [x] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: Quentin Gliech <quenting@element.io>
2025-06-30 12:42:14 +00:00
25 changed files with 149 additions and 43 deletions

View File

@ -111,7 +111,7 @@ jobs:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-22.04, macos-13]
os: [ubuntu-24.04, macos-13]
arch: [x86_64, aarch64]
# is_pr is a flag used to exclude certain jobs from the matrix on PRs.
# It is not read by the rest of the workflow.
@ -139,7 +139,7 @@ jobs:
python-version: "3.x"
- name: Install cibuildwheel
run: python -m pip install cibuildwheel==2.23.0
run: python -m pip install cibuildwheel==3.0.0
- name: Set up QEMU to emulate aarch64
if: matrix.arch == 'aarch64'

View File

@ -1,3 +1,21 @@
# Synapse 1.133.0 (2025-07-01)
Pre-built wheels are now built using the [manylinux_2_28](https://github.com/pypa/manylinux#manylinux_2_28-almalinux-8-based) base, which is expected to be compatible with distros using glibc 2.28 or later, including:
- Debian 10+
- Ubuntu 18.10+
- Fedora 29+
- CentOS/RHEL 8+
Previously, wheels were built using the [manylinux2014](https://github.com/pypa/manylinux#manylinux2014-centos-7-based-glibc-217) base, which was expected to be compatible with distros using glibc 2.17 or later.
### Bugfixes
- Bump `cibuildwheel` to 3.0.0 to fix the `manylinux` wheel builds. ([\#18615](https://github.com/element-hq/synapse/issues/18615))
# Synapse 1.133.0rc1 (2025-06-24)
### Features

View File

@ -0,0 +1 @@
Support for [MSC4235](https://github.com/matrix-org/matrix-spec-proposals/pull/4235): via query param for hierarchy endpoint. Contributed by Krishan (@kfiven).

View File

@ -0,0 +1 @@
Stop adding the "origin" field to newly-created events (PDUs).

changelog.d/18519.doc Normal file
View File

@ -0,0 +1 @@
Fix documentation of the Delete Room Admin API's status field.

changelog.d/18575.misc Normal file
View File

@ -0,0 +1 @@
Raise poetry-core version cap to 2.1.3.

changelog.d/18625.misc Normal file
View File

@ -0,0 +1 @@
Log the room ID we're purging state for.

View File

@ -45,6 +45,10 @@ def make_graph(pdus: List[dict], filename_prefix: str) -> None:
colors = {"red", "green", "blue", "yellow", "purple"}
for pdu in pdus:
# TODO: The "origin" field has since been removed from events generated
# by Synapse. We should consider removing it here as well but since this
# is part of `contrib/`, it is left for the community to revise and ensure things
# still work correctly.
origins.add(pdu.get("origin"))
color_map = {color: color for color in colors if color in origins}

debian/changelog vendored
View File

@ -1,3 +1,9 @@
matrix-synapse-py3 (1.133.0) stable; urgency=medium
* New synapse release 1.133.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 01 Jul 2025 13:13:24 +0000
matrix-synapse-py3 (1.133.0~rc1) stable; urgency=medium
* New Synapse release 1.133.0rc1.

View File

@ -117,7 +117,6 @@ It returns a JSON body like the following:
"hashes": {
"sha256": "xK1//xnmvHJIOvbgXlkI8eEqdvoMmihVDJ9J4SNlsAw"
},
"origin": "matrix.org",
"origin_server_ts": 1592291711430,
"prev_events": [
"$YK4arsKKcc0LRoe700pS8DSjOvUT4NDv0HfInlMFw2M"

View File

@ -806,7 +806,7 @@ A response body like the following is returned:
}, {
"delete_id": "delete_id2",
"room_id": "!roomid:example.com",
"status": "purging",
"status": "active",
"shutdown_room": {
"kicked_users": [
"@foobar:example.com"
@ -843,7 +843,7 @@ A response body like the following is returned:
```json
{
"status": "purging",
"status": "active",
"delete_id": "bHkCNQpHqOaFhPtK",
"room_id": "!roomid:example.com",
"shutdown_room": {
@ -876,8 +876,8 @@ The following fields are returned in the JSON response body:
- `delete_id` - The ID for this purge
- `room_id` - The ID of the room being deleted
- `status` - The status will be one of:
- `shutting_down` - The process is removing users from the room.
- `purging` - The process is purging the room and event data from database.
- `scheduled` - The deletion is waiting to be started
- `active` - The process is purging the room and event data from database.
- `complete` - The process has completed successfully.
- `failed` - The process is aborted, an error has occurred.
- `error` - A string that shows an error message if `status` is `failed`.
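
As an illustration of the corrected status values, a small polling sketch against the delete-status endpoint. The endpoint path, base URL, and token handling are assumptions for illustration, not part of this diff:

```python
# Illustrative polling loop for a room deletion's status (assumed endpoint path).
import json
import time
import urllib.request

BASE_URL = "https://homeserver.example.com"   # assumption
ADMIN_TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"       # assumption
DELETE_ID = "bHkCNQpHqOaFhPtK"                # from the example response above

while True:
    req = urllib.request.Request(
        f"{BASE_URL}/_synapse/admin/v2/rooms/delete_status/{DELETE_ID}",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)

    status = body["status"]  # scheduled | active | complete | failed
    print(f"deletion {DELETE_ID}: {status}")
    if status in ("complete", "failed"):
        break
    time.sleep(5)
```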

View File

@ -101,7 +101,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.133.0rc1"
version = "1.133.0"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"
@ -374,7 +374,7 @@ tomli = ">=1.2.3"
# runtime errors caused by build system changes.
# We are happy to raise these upper bounds upon request,
# provided we check that it's safe to do so (i.e. that CI passes).
requires = ["poetry-core>=1.1.0,<=1.9.1", "setuptools_rust>=1.3,<=1.10.2"]
requires = ["poetry-core>=1.1.0,<=2.1.3", "setuptools_rust>=1.3,<=1.10.2"]
build-backend = "poetry.core.masonry.api"

View File

@ -561,6 +561,9 @@ class ExperimentalConfig(Config):
# MSC4076: Add `disable_badge_count`` to pusher configuration
self.msc4076_enabled: bool = experimental.get("msc4076_enabled", False)
# MSC4235: Add `via` param to hierarchy endpoint
self.msc4235_enabled: bool = experimental.get("msc4235_enabled", False)
# MSC4263: Preventing MXID enumeration via key queries
self.msc4263_limit_key_queries_to_users_who_share_rooms = experimental.get(
"msc4263_limit_key_queries_to_users_who_share_rooms",

View File

@ -208,7 +208,6 @@ class EventBase(metaclass=abc.ABCMeta):
depth: DictProperty[int] = DictProperty("depth")
content: DictProperty[JsonDict] = DictProperty("content")
hashes: DictProperty[Dict[str, str]] = DictProperty("hashes")
origin: DictProperty[str] = DictProperty("origin")
origin_server_ts: DictProperty[int] = DictProperty("origin_server_ts")
room_id: DictProperty[str] = DictProperty("room_id")
sender: DictProperty[str] = DictProperty("sender")

View File

@ -302,8 +302,8 @@ def create_local_event_from_event_dict(
event_dict: JsonDict,
internal_metadata_dict: Optional[JsonDict] = None,
) -> EventBase:
"""Takes a fully formed event dict, ensuring that fields like `origin`
and `origin_server_ts` have correct values for a locally produced event,
"""Takes a fully formed event dict, ensuring that fields like
`origin_server_ts` have correct values for a locally produced event,
then signs and hashes it.
"""
@ -319,7 +319,6 @@ def create_local_event_from_event_dict(
if format_version == EventFormatVersions.ROOM_V1_V2:
event_dict["event_id"] = _create_event_id(clock, hostname)
event_dict["origin"] = hostname
event_dict.setdefault("origin_server_ts", time_now)
event_dict.setdefault("unsigned", {})

View File

@ -67,7 +67,6 @@ class EventValidator:
"auth_events",
"content",
"hashes",
"origin",
"prev_events",
"sender",
"type",
@ -77,13 +76,6 @@ class EventValidator:
if k not in event:
raise SynapseError(400, "Event does not have key %s" % (k,))
# Check that the following keys have string values
event_strings = ["origin"]
for s in event_strings:
if not isinstance(getattr(event, s), str):
raise SynapseError(400, "'%s' not a string type" % (s,))
# Depending on the room version, ensure the data is spec compliant JSON.
if event.room_version.strict_canonicaljson:
validate_canonicaljson(event.get_pdu_json())

View File

@ -323,8 +323,7 @@ def event_from_pdu_json(pdu_json: JsonDict, room_version: RoomVersion) -> EventB
SynapseError: if the pdu is missing required fields or is otherwise
not a valid matrix event
"""
# we could probably enforce a bunch of other fields here (room_id, sender,
# origin, etc etc)
# we could probably enforce a bunch of other fields here (room_id, sender, etc.)
assert_params_in_dict(pdu_json, ("type", "depth"))
# Strip any unauthorized values from "unsigned" if they exist

View File

@ -111,7 +111,15 @@ class RoomSummaryHandler:
# If a user tries to fetch the same page multiple times in quick succession,
# only process the first attempt and return its result to subsequent requests.
self._pagination_response_cache: ResponseCache[
Tuple[str, str, bool, Optional[int], Optional[int], Optional[str]]
Tuple[
str,
str,
bool,
Optional[int],
Optional[int],
Optional[str],
Optional[Tuple[str, ...]],
]
] = ResponseCache(
hs.get_clock(),
"get_room_hierarchy",
@ -126,6 +134,7 @@ class RoomSummaryHandler:
max_depth: Optional[int] = None,
limit: Optional[int] = None,
from_token: Optional[str] = None,
remote_room_hosts: Optional[Tuple[str, ...]] = None,
) -> JsonDict:
"""
Implementation of the room hierarchy C-S API.
@ -143,6 +152,9 @@ class RoomSummaryHandler:
limit: An optional limit on the number of rooms to return per
page. Must be a positive integer.
from_token: An optional pagination token.
remote_room_hosts: An optional list of remote homeserver server names. If defined,
each host will be used to try and fetch the room hierarchy. Must be a tuple so
that it can be hashed by the `RoomSummaryHandler._pagination_response_cache`.
Returns:
The JSON hierarchy dictionary.
@ -162,6 +174,7 @@ class RoomSummaryHandler:
max_depth,
limit,
from_token,
remote_room_hosts,
),
self._get_room_hierarchy,
requester.user.to_string(),
@ -170,6 +183,7 @@ class RoomSummaryHandler:
max_depth,
limit,
from_token,
remote_room_hosts,
)
async def _get_room_hierarchy(
@ -180,6 +194,7 @@ class RoomSummaryHandler:
max_depth: Optional[int] = None,
limit: Optional[int] = None,
from_token: Optional[str] = None,
remote_room_hosts: Optional[Tuple[str, ...]] = None,
) -> JsonDict:
"""See docstring for SpaceSummaryHandler.get_room_hierarchy."""
@ -199,7 +214,7 @@ class RoomSummaryHandler:
if not local_room:
room_hierarchy = await self._summarize_remote_room_hierarchy(
_RoomQueueEntry(requested_room_id, ()),
_RoomQueueEntry(requested_room_id, remote_room_hosts or ()),
False,
)
root_room_entry = room_hierarchy[0]
@ -240,7 +255,7 @@ class RoomSummaryHandler:
processed_rooms = set(pagination_session["processed_rooms"])
else:
# The queue of rooms to process, the next room is last on the stack.
room_queue = [_RoomQueueEntry(requested_room_id, ())]
room_queue = [_RoomQueueEntry(requested_room_id, remote_room_hosts or ())]
# Rooms we have already processed.
processed_rooms = set()
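
The docstring above insists on a tuple because the cache key must be hashable. A tiny standalone illustration of that constraint (unrelated to Synapse's actual `ResponseCache` internals):

```python
# Why the hosts must be a tuple: cache keys have to be hashable.
key_with_tuple = ("!room:example.org", "@user:example.org", ("hs1.example.org", "hs2.example.org"))
cache = {key_with_tuple: {"rooms": []}}  # works: tuples are hashable

key_with_list = ("!room:example.org", "@user:example.org", ["hs1.example.org"])
try:
    cache[key_with_list] = {"rooms": []}
except TypeError as exc:
    print(f"lists cannot be part of a cache key: {exc}")  # unhashable type: 'list'
```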

View File

@ -1538,6 +1538,7 @@ class RoomHierarchyRestServlet(RestServlet):
super().__init__()
self._auth = hs.get_auth()
self._room_summary_handler = hs.get_room_summary_handler()
self.msc4235_enabled = hs.config.experimental.msc4235_enabled
async def on_GET(
self, request: SynapseRequest, room_id: str
@ -1547,6 +1548,15 @@ class RoomHierarchyRestServlet(RestServlet):
max_depth = parse_integer(request, "max_depth")
limit = parse_integer(request, "limit")
# twisted.web.server.Request.args is incorrectly defined as Optional[Any]
remote_room_hosts = None
if self.msc4235_enabled:
args: Dict[bytes, List[bytes]] = request.args # type: ignore
via_param = parse_strings_from_args(
args, "org.matrix.msc4235.via", required=False
)
remote_room_hosts = tuple(via_param or [])
return 200, await self._room_summary_handler.get_room_hierarchy(
requester,
room_id,
@ -1554,6 +1564,7 @@ class RoomHierarchyRestServlet(RestServlet):
max_depth=max_depth,
limit=limit,
from_token=parse_string(request, "from"),
remote_room_hosts=remote_room_hosts,
)
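
With `msc4235_enabled` set, a client can pass one or more `org.matrix.msc4235.via` query parameters on the hierarchy endpoint. A hedged example request; the homeserver URL, access token, and room ID are placeholders:

```python
# Illustrative client call to the room hierarchy endpoint with MSC4235 via params.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://homeserver.example.com"  # placeholder
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"           # placeholder
ROOM_ID = "!space:example.org"               # placeholder

# Repeated tuples produce repeated query parameters, one per candidate server.
query = urllib.parse.urlencode(
    [
        ("org.matrix.msc4235.via", "hs1.example.org"),
        ("org.matrix.msc4235.via", "hs2.example.org"),
    ]
)
url = f"{BASE_URL}/_matrix/client/v1/rooms/{urllib.parse.quote(ROOM_ID)}/hierarchy?{query}"
req = urllib.request.Request(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["rooms"])
```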

View File

@ -34,6 +34,7 @@ from synapse.metrics.background_process_metrics import wrap_as_background_proces
from synapse.storage.database import LoggingTransaction
from synapse.storage.databases import Databases
from synapse.types.storage import _BackgroundUpdates
from synapse.util.stringutils import shortstr
if TYPE_CHECKING:
from synapse.server import HomeServer
@ -167,6 +168,12 @@ class PurgeEventsStorageController:
break
(room_id, groups_to_sequences) = next_to_delete
logger.info(
"[purge] deleting state groups for room %s: %s",
room_id,
shortstr(groups_to_sequences.keys(), maxitems=10),
)
made_progress = await self._delete_state_groups(
room_id, groups_to_sequences
)
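
`shortstr` keeps the new log line bounded when a room has thousands of state groups by rendering only the first few items. A standalone approximation of that behaviour (not Synapse's actual implementation):

```python
# Rough approximation of the truncated rendering used in the log line above.
from typing import Iterable


def short_repr(items: Iterable, maxitems: int = 10) -> str:
    items = list(items)
    shown = ", ".join(str(item) for item in items[:maxitems])
    suffix = ", ..." if len(items) > maxitems else ""
    return f"[{shown}{suffix}]"


print(short_repr(range(3)))        # [0, 1, 2]
print(short_repr(range(100), 5))   # [0, 1, 2, 3, 4, ...]
```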

View File

@ -225,7 +225,7 @@ KNOWN_KEYS = {
"depth",
"event_id",
"hashes",
"origin",
"origin", # old events were created with an origin field.
"origin_server_ts",
"prev_events",
"room_id",

View File

@ -48,7 +48,6 @@ class EventSigningTestCase(unittest.TestCase):
def test_sign_minimal(self) -> None:
event_dict = {
"event_id": "$0:domain",
"origin": "domain",
"origin_server_ts": 1000000,
"signatures": {},
"type": "X",
@ -64,7 +63,7 @@ class EventSigningTestCase(unittest.TestCase):
self.assertTrue(hasattr(event, "hashes"))
self.assertIn("sha256", event.hashes)
self.assertEqual(
event.hashes["sha256"], "6tJjLpXtggfke8UxFhAKg82QVkJzvKOVOOSjUDK4ZSI"
event.hashes["sha256"], "A6Nco6sqoy18PPfPDVdYvoowfc0PVBk9g9OiyT3ncRM"
)
self.assertTrue(hasattr(event, "signatures"))
@ -72,15 +71,14 @@ class EventSigningTestCase(unittest.TestCase):
self.assertIn(KEY_NAME, event.signatures["domain"])
self.assertEqual(
event.signatures[HOSTNAME][KEY_NAME],
"2Wptgo4CwmLo/Y8B8qinxApKaCkBG2fjTWB7AbP5Uy+"
"aIbygsSdLOFzvdDjww8zUVKCmI02eP9xtyJxc/cLiBA",
"PBc48yDVszWB9TRaB/+CZC1B+pDAC10F8zll006j+NN"
"fe4PEMWcVuLaG63LFTK9e4rwJE8iLZMPtCKhDTXhpAQ",
)
def test_sign_message(self) -> None:
event_dict = {
"content": {"body": "Here is the message content"},
"event_id": "$0:domain",
"origin": "domain",
"origin_server_ts": 1000000,
"type": "m.room.message",
"room_id": "!r:domain",
@ -98,7 +96,7 @@ class EventSigningTestCase(unittest.TestCase):
self.assertTrue(hasattr(event, "hashes"))
self.assertIn("sha256", event.hashes)
self.assertEqual(
event.hashes["sha256"], "onLKD1bGljeBWQhWZ1kaP9SorVmRQNdN5aM2JYU2n/g"
event.hashes["sha256"], "rDCeYBepPlI891h/RkI2/Lkf9bt7u0TxFku4tMs7WKk"
)
self.assertTrue(hasattr(event, "signatures"))
@ -106,6 +104,6 @@ class EventSigningTestCase(unittest.TestCase):
self.assertIn(KEY_NAME, event.signatures["domain"])
self.assertEqual(
event.signatures[HOSTNAME][KEY_NAME],
"Wm+VzmOUOz08Ds+0NTWb1d4CZrVsJSikkeRxh6aCcUw"
"u6pNC78FunoD7KNWzqFn241eYHYMGCA5McEiVPdhzBA",
"Ay4aj2b5oJ1k8INYZ9n3KnszCflM0emwcmQQ7vxpbdc"
"Sv9bkJxIZdWX1IJllcZLq89+D3sSabE+vqPtZs9akDw",
)

View File

@ -130,7 +130,7 @@ class PruneEventTestCase(stdlib_unittest.TestCase):
"prev_events": "prev_events",
"prev_state": "prev_state",
"auth_events": "auth_events",
"origin": "domain",
"origin": "domain", # historical top-level field that still exists on old events
"origin_server_ts": 1234,
"membership": "join",
# Also include a key that should be removed.
@ -147,7 +147,7 @@ class PruneEventTestCase(stdlib_unittest.TestCase):
"prev_events": "prev_events",
"prev_state": "prev_state",
"auth_events": "auth_events",
"origin": "domain",
"origin": "domain", # historical top-level field that still exists on old events
"origin_server_ts": 1234,
"membership": "join",
"content": {},
@ -156,13 +156,12 @@ class PruneEventTestCase(stdlib_unittest.TestCase):
},
)
# As of room versions we now redact the membership, prev_states, and origin keys.
# As of room versions we now redact the membership and prev_states keys.
self.run_test(
{
"type": "A",
"prev_state": "prev_state",
"membership": "join",
"origin": "example.com",
},
{"type": "A", "content": {}, "signatures": {}, "unsigned": {}},
room_version=RoomVersions.V11,
@ -246,7 +245,6 @@ class PruneEventTestCase(stdlib_unittest.TestCase):
{
"type": "m.room.create",
"content": {"not_a_real_key": True},
"origin": "some_homeserver",
"nonsense_field": "some_random_garbage",
},
{

View File

@ -535,7 +535,6 @@ class StripUnsignedFromEventsTestCase(unittest.TestCase):
"depth": 1000,
"origin_server_ts": 1,
"type": "m.room.member",
"origin": "test.servx",
"content": {"membership": "join"},
"auth_events": [],
"unsigned": {"malicious garbage": "hackz", "more warez": "more hackz"},
@ -552,7 +551,6 @@ class StripUnsignedFromEventsTestCase(unittest.TestCase):
"depth": 1000,
"origin_server_ts": 1,
"type": "m.room.member",
"origin": "test.servx",
"auth_events": [],
"content": {"membership": "join"},
"unsigned": {
@ -579,7 +577,6 @@ class StripUnsignedFromEventsTestCase(unittest.TestCase):
"depth": 1000,
"origin_server_ts": 1,
"type": "m.room.power_levels",
"origin": "test.servx",
"content": {},
"auth_events": [],
"unsigned": {

View File

@ -1080,6 +1080,62 @@ class SpaceSummaryTestCase(unittest.HomeserverTestCase):
self.assertEqual(federation_requests, 2)
self._assert_hierarchy(result, expected)
def test_fed_remote_room_hosts(self) -> None:
"""
Test if requested room is available over federation using via's.
"""
fed_hostname = self.hs.hostname + "2"
fed_space = "#fed_space:" + fed_hostname
fed_subroom = "#fed_sub_room:" + fed_hostname
remote_room_hosts = tuple(fed_hostname)
requested_room_entry = _RoomEntry(
fed_space,
{
"room_id": fed_space,
"world_readable": True,
"join_rule": "public",
"room_type": RoomTypes.SPACE,
},
[
{
"type": EventTypes.SpaceChild,
"room_id": fed_space,
"state_key": fed_subroom,
"content": {"via": [fed_hostname]},
}
],
)
child_room = {
"room_id": fed_subroom,
"world_readable": True,
"join_rule": "public",
}
async def summarize_remote_room_hierarchy(
_self: Any, room: Any, suggested_only: bool
) -> Tuple[Optional[_RoomEntry], Dict[str, JsonDict], Set[str]]:
return requested_room_entry, {fed_subroom: child_room}, set()
expected = [
(fed_space, [fed_subroom]),
(fed_subroom, ()),
]
with mock.patch(
"synapse.handlers.room_summary.RoomSummaryHandler._summarize_remote_room_hierarchy",
new=summarize_remote_room_hierarchy,
):
result = self.get_success(
self.handler.get_room_hierarchy(
create_requester(self.user),
fed_space,
remote_room_hosts=remote_room_hosts,
)
)
self._assert_hierarchy(result, expected)
class RoomSummaryTestCase(unittest.HomeserverTestCase):
servlets = [