Compare commits


26 Commits

Author SHA1 Message Date
Eric Eastwood
67ec14afc9
Merge e0f8992ee34562d41b6b78991132bd4fe678a30f into e6b8eedd0290e1a66dd1567b752cbff57c67b52c 2025-07-02 14:04:53 -05:00
Andrew Morgan
e6b8eedd02
Update Rust in CI to v1.87.0 as well as dtolnay/rust-toolchain GitHub Action pinned commit hash (#18596) 2025-07-02 18:48:28 +00:00
dependabot[bot]
87fc518e0c
Bump base64 from 0.21.7 to 0.22.1 (#18629)
Bumps [base64](https://github.com/marshallpierce/rust-base64) from
0.21.7 to 0.22.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/marshallpierce/rust-base64/blob/master/RELEASE-NOTES.md">base64's
changelog</a>.</em></p>
<blockquote>
<h1>0.22.1</h1>
<ul>
<li>Correct the symbols used for the predefined
<code>alphabet::BIN_HEX</code>.</li>
</ul>
<h1>0.22.0</h1>
<ul>
<li><code>DecodeSliceError::OutputSliceTooSmall</code> is now
conservative rather than precise. That is, the error will only occur if
the decoded output <em>cannot</em> fit, meaning that
<code>Engine::decode_slice</code> can now be used with exactly-sized
output slices. As part of this, <code>Engine::internal_decode</code> now
returns <code>DecodeSliceError</code> instead of
<code>DecodeError</code>, but that is not expected to affect any
external callers.</li>
<li><code>DecodeError::InvalidLength</code> now refers specifically to
the <em>number of valid symbols</em> being invalid (i.e. <code>len % 4
== 1</code>), rather than just the number of input bytes. This avoids
confusing scenarios when based on interpretation you could make a case
for either <code>InvalidLength</code> or <code>InvalidByte</code> being
appropriate.</li>
<li>Decoding is somewhat faster (5-10%)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="e144006974"><code>e144006</code></a>
v0.22.1</li>
<li><a
href="64cca59ddb"><code>64cca59</code></a>
Merge pull request <a
href="https://redirect.github.com/marshallpierce/rust-base64/issues/271">#271</a>
from JobanSD/patch-1</li>
<li><a
href="838355e0ac"><code>838355e</code></a>
Correct BinHex 4.0 alphabet according to specifications</li>
<li><a
href="bf15ccf30a"><code>bf15ccf</code></a>
Merge pull request <a
href="https://redirect.github.com/marshallpierce/rust-base64/issues/270">#270</a>
from marshallpierce/mp/clippy</li>
<li><a
href="fc6aabee8a"><code>fc6aabe</code></a>
Appease clippy</li>
<li><a
href="9a518a2d5d"><code>9a518a2</code></a>
Merge pull request <a
href="https://redirect.github.com/marshallpierce/rust-base64/issues/267">#267</a>
from bdura/patch-1</li>
<li><a
href="d96c80f242"><code>d96c80f</code></a>
Merge branch 'marshallpierce:master' into patch-1</li>
<li><a
href="5d70ba7576"><code>5d70ba7</code></a>
Merge pull request <a
href="https://redirect.github.com/marshallpierce/rust-base64/issues/269">#269</a>
from marshallpierce/mp/decode-precisely</li>
<li><a
href="efb6c006c7"><code>efb6c00</code></a>
Release notes</li>
<li><a
href="2b91084a31"><code>2b91084</code></a>
Add some tests to boost coverage</li>
<li>Additional commits viewable in <a
href="https://github.com/marshallpierce/rust-base64/compare/v0.21.7...v0.22.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=base64&package-manager=cargo&previous-version=0.21.7&new-version=0.22.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 17:44:42 +00:00
dependabot[bot]
3bd476eb0d
Bump tokio from 1.45.1 to 1.46.0 (#18628)
Bumps [tokio](https://github.com/tokio-rs/tokio) from 1.45.1 to 1.46.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tokio-rs/tokio/releases">tokio's
releases</a>.</em></p>
<blockquote>
<h2>Tokio v1.46.0</h2>
<h1>1.46.0 (July 2nd, 2025)</h1>
<h3>Fixed</h3>
<ul>
<li>net: fixed <code>TcpStream::shutdown</code> incorrectly returning an
error on macOS (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7290">#7290</a>)</li>
</ul>
<h3>Added</h3>
<ul>
<li>sync: <code>mpsc::OwnedPermit::{same_channel,
same_channel_as_sender}</code> methods (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7389">#7389</a>)</li>
<li>macros: <code>biased</code> option for <code>join!</code> and
<code>try_join!</code>, similar to <code>select!</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7307">#7307</a>)</li>
<li>net: support for cygwin (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7393">#7393</a>)</li>
<li>net: support <code>pipe::OpenOptions::read_write</code> on Android
(<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7426">#7426</a>)</li>
<li>net: add <code>Clone</code> implementation for
<code>net::unix::SocketAddr</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7422">#7422</a>)</li>
</ul>
<h3>Changed</h3>
<ul>
<li>runtime: eliminate unnecessary lfence while operating on
<code>queue::Local&lt;T&gt;</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7340">#7340</a>)</li>
<li>task: disallow blocking in <code>LocalSet::{poll,drop}</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7372">#7372</a>)</li>
</ul>
<h3>Unstable</h3>
<ul>
<li>runtime: add <code>TaskMeta::spawn_location</code> tracking where a
task was spawned (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7417">#7417</a>)</li>
<li>runtime: removed borrow from <code>LocalOptions</code> parameter to
<code>runtime::Builder::build_local</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7346">#7346</a>)</li>
</ul>
<h3>Documented</h3>
<ul>
<li>io: clarify behavior of seeking when <code>start_seek</code> is not
used (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7366">#7366</a>)</li>
<li>io: document cancellation safety of
<code>AsyncWriteExt::flush</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7364">#7364</a>)</li>
<li>net: fix docs for <code>recv_buffer_size</code> method (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7336">#7336</a>)</li>
<li>net: fix broken link of <code>RawFd</code> in <code>TcpSocket</code>
docs (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7416">#7416</a>)</li>
<li>net: update <code>AsRawFd</code> doc link to current Rust stdlib
location (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7429">#7429</a>)</li>
<li>readme: fix double period in reactor description (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7363">#7363</a>)</li>
<li>runtime: add doc note that <code>on_*_task_poll</code> is unstable
(<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7311">#7311</a>)</li>
<li>sync: update broadcast docs on allocation failure (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7352">#7352</a>)</li>
<li>time: add a missing panic scenario of <code>time::advance</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7394">#7394</a>)</li>
</ul>
<p><a
href="https://redirect.github.com/tokio-rs/tokio/issues/7290">#7290</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7290">tokio-rs/tokio#7290</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7307">#7307</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7307">tokio-rs/tokio#7307</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7311">#7311</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7311">tokio-rs/tokio#7311</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7336">#7336</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7336">tokio-rs/tokio#7336</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7340">#7340</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7340">tokio-rs/tokio#7340</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7346">#7346</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7346">tokio-rs/tokio#7346</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7352">#7352</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7352">tokio-rs/tokio#7352</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7363">#7363</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7363">tokio-rs/tokio#7363</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7364">#7364</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7364">tokio-rs/tokio#7364</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7366">#7366</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7366">tokio-rs/tokio#7366</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7372">#7372</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7372">tokio-rs/tokio#7372</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7389">#7389</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7389">tokio-rs/tokio#7389</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7393">#7393</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7393">tokio-rs/tokio#7393</a></p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3f1f268583"><code>3f1f268</code></a>
chore: prepare Tokio v1.46.0 (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7437">#7437</a>)</li>
<li><a
href="3e890cc017"><code>3e890cc</code></a>
rt(unstable): add spawn <code>Location</code> to <code>TaskMeta</code>
(<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7417">#7417</a>)</li>
<li><a
href="69290a6432"><code>69290a6</code></a>
net: derive <code>Clone</code> for <code>net::unix::SocketAddr</code>
(<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7422">#7422</a>)</li>
<li><a
href="e2b175848b"><code>e2b1758</code></a>
fuzz: cfg fuzz tests under cfg(test) (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7428">#7428</a>)</li>
<li><a
href="b7a75b5be3"><code>b7a75b5</code></a>
net: update <code>AsRawFd</code> doc link to current Rust stdlib
location (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7429">#7429</a>)</li>
<li><a
href="6b705b3053"><code>6b705b3</code></a>
net: allow <code>pipe::OpenOptions::read_write</code> on Android (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7426">#7426</a>)</li>
<li><a
href="3636fd018a"><code>3636fd0</code></a>
net: fix broken link of <code>RawFd</code> in <code>TcpSocket</code>
docs (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7416">#7416</a>)</li>
<li><a
href="2506c9fa99"><code>2506c9f</code></a>
benches: revert &quot;properly gate unix benches&quot; (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7412">#7412</a>)</li>
<li><a
href="b3a14483bf"><code>b3a1448</code></a>
sync: improve docs of <code>tokio_util::sync::CancellationToken</code>
(<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7408">#7408</a>)</li>
<li><a
href="013f323def"><code>013f323</code></a>
docs: add a missing panic scenario of <code>time::advance</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7394">#7394</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/tokio-rs/tokio/compare/tokio-1.45.1...tokio-1.46.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=tokio&package-manager=cargo&previous-version=1.45.1&new-version=1.46.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 17:44:22 +00:00
dependabot[bot]
717f67f3d3
Bump Swatinem/rust-cache from 2.7.8 to 2.8.0 (#18612)
Bumps [Swatinem/rust-cache](https://github.com/swatinem/rust-cache) from
2.7.8 to 2.8.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/swatinem/rust-cache/releases">Swatinem/rust-cache's
releases</a>.</em></p>
<blockquote>
<h2>v2.8.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Add cache-workspace-crates feature by <a
href="https://github.com/jbransen"><code>@​jbransen</code></a> in <a
href="https://redirect.github.com/Swatinem/rust-cache/pull/246">Swatinem/rust-cache#246</a></li>
<li>Feat: support warpbuild cache provider by <a
href="https://github.com/stegaBOB"><code>@​stegaBOB</code></a> in <a
href="https://redirect.github.com/Swatinem/rust-cache/pull/247">Swatinem/rust-cache#247</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/jbransen"><code>@​jbransen</code></a>
made their first contribution in <a
href="https://redirect.github.com/Swatinem/rust-cache/pull/246">Swatinem/rust-cache#246</a></li>
<li><a href="https://github.com/stegaBOB"><code>@​stegaBOB</code></a>
made their first contribution in <a
href="https://redirect.github.com/Swatinem/rust-cache/pull/247">Swatinem/rust-cache#247</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0">https://github.com/Swatinem/rust-cache/compare/v2.7.8...v2.8.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Swatinem/rust-cache/blob/master/CHANGELOG.md">Swatinem/rust-cache's
changelog</a>.</em></p>
<blockquote>
<h1>Changelog</h1>
<h2>2.8.0</h2>
<ul>
<li>Add support for <code>warpbuild</code> cache provider</li>
<li>Add new <code>cache-workspace-crates</code> feature</li>
</ul>
<h2>2.7.8</h2>
<ul>
<li>Include CPU arch in the cache key</li>
</ul>
<h2>2.7.7</h2>
<ul>
<li>Also cache <code>cargo install</code> metadata</li>
</ul>
<h2>2.7.6</h2>
<ul>
<li>Allow opting out of caching $CARGO_HOME/bin</li>
<li>Add runner OS in cache key</li>
<li>Adds an option to do lookup-only of the cache</li>
</ul>
<h2>2.7.5</h2>
<ul>
<li>Support Cargo.lock format cargo-lock v4</li>
<li>Only run macOsWorkaround() on macOS</li>
</ul>
<h2>2.7.3</h2>
<ul>
<li>Work around upstream problem that causes cache saving to hang for
minutes.</li>
</ul>
<h2>2.7.2</h2>
<ul>
<li>Only key by <code>Cargo.toml</code> and <code>Cargo.lock</code>
files of workspace members.</li>
</ul>
<h2>2.7.1</h2>
<ul>
<li>Update toml parser to fix parsing errors.</li>
</ul>
<h2>2.7.0</h2>
<ul>
<li>Properly cache <code>trybuild</code> tests.</li>
</ul>
<h2>2.6.2</h2>
<ul>
<li>Fix <code>toml</code> parsing.</li>
</ul>
<h2>2.6.1</h2>
<ul>
<li>Fix hash contributions of
<code>Cargo.lock</code>/<code>Cargo.toml</code> files.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="98c8021b55"><code>98c8021</code></a>
2.8.0</li>
<li><a
href="14d3bc39c4"><code>14d3bc3</code></a>
update Changelog</li>
<li><a
href="52ea1434f8"><code>52ea143</code></a>
support warpbuild cache provider (<a
href="https://redirect.github.com/swatinem/rust-cache/issues/247">#247</a>)</li>
<li><a
href="eaa85be6b1"><code>eaa85be</code></a>
Add cache-workspace-crates feature (<a
href="https://redirect.github.com/swatinem/rust-cache/issues/246">#246</a>)</li>
<li><a
href="901019c0f8"><code>901019c</code></a>
Update the test lockfiles</li>
<li>See full diff in <a
href="9d47c6ad4b...98c8021b55">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=Swatinem/rust-cache&package-manager=github_actions&previous-version=2.7.8&new-version=2.8.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 16:54:29 +00:00
dependabot[bot]
c2108948a3
Bump treq from 24.9.1 to 25.5.0 (#18610)
Bumps [treq](https://github.com/twisted/treq) from 24.9.1 to 25.5.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/twisted/treq/blob/trunk/CHANGELOG.rst">treq's
changelog</a>.</em></p>
<blockquote>
<h1>25.5.0 (2025-05-31)</h1>
<h2>Features</h2>
<ul>
<li>treq is packaged with Hatchling, and consequently no longer directly
depends on setuptools.
(<a href="https://github.com/twisted/treq/issues/388">#388</a>)</li>
</ul>
<h2>Improved Documentation</h2>
<ul>
<li>Update documentation to use <code>async</code>/<code>await</code>
syntax (<a href="https://github.com/twisted/treq/issues/409">#409</a>)</li>
</ul>
<h2>Deprecations and Removals</h2>
<ul>
<li>Support for Python 3.8, which has reached end of support, is
deprecated. This is the last release with support for Python 3.8.
(<a href="https://github.com/twisted/treq/issues/407">#407</a>)</li>
</ul>
</blockquote>
</details>
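
The 25.5.0 notes above mention updating treq's documentation to `async`/`await` syntax. A rough sketch of that style (the URL is a placeholder, and this assumes Twisted's reactor is driven via `react`):

```python
import treq
from twisted.internet.defer import ensureDeferred
from twisted.internet.task import react


async def fetch(reactor, url):
    # treq calls return Twisted Deferreds, which can be awaited directly
    # inside an async function.
    response = await treq.get(url)
    body = await response.text()
    print(response.code, len(body))


# ensureDeferred bridges the coroutine to a Deferred; react drives the reactor.
react(lambda reactor: ensureDeferred(fetch(reactor, "https://example.com/")))
```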
<details>
<summary>Commits</summary>
<ul>
<li><a
href="6869fa5d09"><code>6869fa5</code></a>
Merge pull request <a
href="https://redirect.github.com/twisted/treq/issues/410">#410</a> from
twisted/release-25.5.0</li>
<li><a
href="56266566cf"><code>5626656</code></a>
Test with Python 3.13 final</li>
<li><a
href="f10185e4da"><code>f10185e</code></a>
Generate the changelog</li>
<li><a
href="4b846664f1"><code>4b84666</code></a>
Version 25.5.0</li>
<li><a
href="72a4441f59"><code>72a4441</code></a>
Merge pull request <a
href="https://redirect.github.com/twisted/treq/issues/409">#409</a> from
twisted/rtd-shiny</li>
<li><a
href="0a814edd8a"><code>0a814ed</code></a>
Add changefragment</li>
<li><a
href="993cc47df5"><code>993cc47</code></a>
Fix changelog warnings</li>
<li><a
href="3992177456"><code>3992177</code></a>
Link to CookieJar</li>
<li><a
href="cff43d93b6"><code>cff43d9</code></a>
Update source_suffix conf</li>
<li><a
href="e39c8511b1"><code>e39c851</code></a>
async def print_response</li>
<li>Additional commits viewable in <a
href="https://github.com/twisted/treq/compare/treq-24.9.1...treq-25.5.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=treq&package-manager=pip&previous-version=24.9.1&new-version=25.5.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 16:51:43 +00:00
dependabot[bot]
cb4b5585a4
Bump prometheus-client from 0.21.0 to 0.22.1 (#18609)
Bumps [prometheus-client](https://github.com/prometheus/client_python)
from 0.21.0 to 0.22.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/prometheus/client_python/releases">prometheus-client's
releases</a>.</em></p>
<blockquote>
<h2>v0.22.1</h2>
<h2>What's Changed</h2>
<ul>
<li>BugFix: Skip validating and parsing comment lines early (<a
href="https://redirect.github.com/prometheus/client_python/issues/1108">#1108</a>)
by <a href="https://github.com/wissamir"><code>@​wissamir</code></a> in
<a
href="https://redirect.github.com/prometheus/client_python/pull/1109">prometheus/client_python#1109</a></li>
<li>Use License Expressions in pyproject.toml by <a
href="https://github.com/csmarchbanks"><code>@​csmarchbanks</code></a>
in <a
href="https://redirect.github.com/prometheus/client_python/pull/1111">prometheus/client_python#1111</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/prometheus/client_python/compare/v0.22.0...v0.22.1">https://github.com/prometheus/client_python/compare/v0.22.0...v0.22.1</a></p>
<h2>v0.22.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Add support for native histograms in OM parser by <a
href="https://github.com/vesari"><code>@​vesari</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1040">prometheus/client_python#1040</a></li>
<li>Add exemplar support to CounterMetricFamily [Fix <a
href="https://redirect.github.com/prometheus/client_python/issues/1062">#1062</a>]
by <a href="https://github.com/lod"><code>@​lod</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1063">prometheus/client_python#1063</a></li>
<li>Fix <code>write_to_textfile</code> leaves back temp files on errors
by <a href="https://github.com/ethanschen"><code>@​ethanschen</code></a>
in <a
href="https://redirect.github.com/prometheus/client_python/pull/1066">prometheus/client_python#1066</a></li>
<li>Support UTF-8 in metric creation, parsing, and exposition by <a
href="https://github.com/ywwg"><code>@​ywwg</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1070">prometheus/client_python#1070</a></li>
<li>Fix incorrect use of reentrant locks by <a
href="https://github.com/suligap"><code>@​suligap</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1076">prometheus/client_python#1076</a></li>
<li>Remove Python 3.8 support by <a
href="https://github.com/kajinamit"><code>@​kajinamit</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1075">prometheus/client_python#1075</a></li>
<li>Check if labelvalues is in _metrics before deletion in
MetricWrapperBase.remove() by <a
href="https://github.com/GlorifiedPig"><code>@​GlorifiedPig</code></a>
in <a
href="https://redirect.github.com/prometheus/client_python/pull/1077">prometheus/client_python#1077</a></li>
<li>Add support for Python 3.13 by <a
href="https://github.com/Pliner"><code>@​Pliner</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1080">prometheus/client_python#1080</a></li>
<li>Correct nh sample span structure and parsing by <a
href="https://github.com/vesari"><code>@​vesari</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1082">prometheus/client_python#1082</a></li>
<li>Migrate from setup.py to pyproject.toml by <a
href="https://github.com/csmarchbanks"><code>@​csmarchbanks</code></a>
in <a
href="https://redirect.github.com/prometheus/client_python/pull/1084">prometheus/client_python#1084</a></li>
<li>Changed pushgateway.md by <a
href="https://github.com/mallika-mur"><code>@​mallika-mur</code></a> in
<a
href="https://redirect.github.com/prometheus/client_python/pull/1083">prometheus/client_python#1083</a></li>
<li>Fix order-dependent flaky tests related to UTF-8 support by <a
href="https://github.com/dg98"><code>@​dg98</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1093">prometheus/client_python#1093</a></li>
<li>Update versions for docs Github actions by <a
href="https://github.com/csmarchbanks"><code>@​csmarchbanks</code></a>
in <a
href="https://redirect.github.com/prometheus/client_python/pull/1096">prometheus/client_python#1096</a></li>
<li>Documentation Updates by <a
href="https://github.com/ethanschen"><code>@​ethanschen</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1097">prometheus/client_python#1097</a></li>
<li>Add note on gauge.set_function not working with multiprocessing by
<a href="https://github.com/aapeliv"><code>@​aapeliv</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1098">prometheus/client_python#1098</a></li>
<li>Don't send an empty HTTP header for /favicon.ico by <a
href="https://github.com/noselasd"><code>@​noselasd</code></a> in <a
href="https://redirect.github.com/prometheus/client_python/pull/1101">prometheus/client_python#1101</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/prometheus/client_python/compare/v0.21.0...v0.22.0">https://github.com/prometheus/client_python/compare/v0.21.0...v0.22.0</a></p>
<h2>0.21.1 / 2024-12-03</h2>
<h2>What's Changed</h2>
<p>[BUGFIX] Revert incorrect use of reentrant locks. <a
href="https://redirect.github.com/prometheus/client_python/issues/1076">#1076</a></p>
</blockquote>
</details>
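
Among the 0.22.x changes listed above is the fix for `write_to_textfile` leaving temporary files behind on errors. A minimal usage sketch of that code path, assuming a writable output path such as `/tmp/batch_job.prom`:

```python
from prometheus_client import CollectorRegistry, Gauge, write_to_textfile

# A dedicated registry keeps the textfile output limited to these metrics.
registry = CollectorRegistry()
last_success = Gauge(
    "batch_last_success_unixtime",
    "Timestamp of the last successful batch run",
    registry=registry,
)
last_success.set_to_current_time()

# Writes the registry's metrics for the node_exporter textfile collector via a
# temp file and rename; 0.22.x fixes the case where a failed write could leave
# a stray temp file behind.
write_to_textfile("/tmp/batch_job.prom", registry)
```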
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d24220a6c4"><code>d24220a</code></a>
Release 0.22.1</li>
<li><a
href="f294cbbf1d"><code>f294cbb</code></a>
Use License Expressions in pyproject.toml (<a
href="https://redirect.github.com/prometheus/client_python/issues/1111">#1111</a>)</li>
<li><a
href="938b73e0bc"><code>938b73e</code></a>
BugFix: Skip validating and parsing comment lines early (<a
href="https://redirect.github.com/prometheus/client_python/issues/1108">#1108</a>)
(<a
href="https://redirect.github.com/prometheus/client_python/issues/1109">#1109</a>)</li>
<li><a
href="8dfa10e5ff"><code>8dfa10e</code></a>
Release 0.22.0</li>
<li><a
href="e3902ea45b"><code>e3902ea</code></a>
Don't send an empty HTTP header. (<a
href="https://redirect.github.com/prometheus/client_python/issues/1101">#1101</a>)</li>
<li><a
href="23ab8264ce"><code>23ab826</code></a>
Add note on gauge.set_function not working with mp, see <a
href="https://redirect.github.com/prometheus/client_python/issues/504">#504</a>
(<a
href="https://redirect.github.com/prometheus/client_python/issues/1098">#1098</a>)</li>
<li><a
href="c1ff3b28d3"><code>c1ff3b2</code></a>
Update docs (<a
href="https://redirect.github.com/prometheus/client_python/issues/1097">#1097</a>)</li>
<li><a
href="e3bfa1f101"><code>e3bfa1f</code></a>
Update versions for docs Github actions (<a
href="https://redirect.github.com/prometheus/client_python/issues/1096">#1096</a>)</li>
<li><a
href="de8bb4adf7"><code>de8bb4a</code></a>
Fix order-dependent flaky tests related to UTF-8 support (<a
href="https://redirect.github.com/prometheus/client_python/issues/1093">#1093</a>)</li>
<li><a
href="46eae7bae8"><code>46eae7b</code></a>
Changed pushgateway.md (<a
href="https://redirect.github.com/prometheus/client_python/issues/1083">#1083</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/prometheus/client_python/compare/v0.21.0...v0.22.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=prometheus-client&package-manager=pip&previous-version=0.21.0&new-version=0.22.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 16:39:13 +00:00
dependabot[bot]
a3ec2d3b3f
Bump pillow from 11.2.1 to 11.3.0 (#18624)
Bumps [pillow](https://github.com/python-pillow/Pillow) from 11.2.1 to
11.3.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/python-pillow/Pillow/releases">pillow's
releases</a>.</em></p>
<blockquote>
<h2>11.3.0</h2>
<p><a
href="https://pillow.readthedocs.io/en/stable/releasenotes/11.3.0.html">https://pillow.readthedocs.io/en/stable/releasenotes/11.3.0.html</a></p>
<h2>Deprecations</h2>
<ul>
<li>Deprecate fromarray mode argument <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9018">#9018</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Deprecate saving I mode images as PNG <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9023">#9023</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
</ul>
<h2>Documentation</h2>
<ul>
<li>Added release notes for <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9041">#9041</a>
<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9042">#9042</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Add release notes for <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8912">#8912</a>
and <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8969">#8969</a>
<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9019">#9019</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>ImageFont does not handle multiline text <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9000">#9000</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Updated Ubuntu CI targets <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8988">#8988</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update MinGW package names <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8987">#8987</a>
[<a href="https://github.com/H4M5TER"><code>@​H4M5TER</code></a>]</li>
<li>Updated docstring <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8943">#8943</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Mention that tobytes() with the raw encoder uses Pack.c <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8878">#8878</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Refactor docs <code>Makefile</code> <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8933">#8933</a>
[<a href="https://github.com/hugovk"><code>@​hugovk</code></a>]</li>
<li>Add template for quarterly release issue <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8932">#8932</a>
[<a
href="https://github.com/aclark4life"><code>@​aclark4life</code></a>]</li>
<li>Add list of third party plugins <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8910">#8910</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update redirected URL <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8919">#8919</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Docs: use sentence case for headers <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8914">#8914</a>
[<a href="https://github.com/hugovk"><code>@​hugovk</code></a>]</li>
<li>Docs: remove unused Makefile targets <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8917">#8917</a>
[<a href="https://github.com/hugovk"><code>@​hugovk</code></a>]</li>
<li>Remove indentation from lists <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8915">#8915</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Python 3.13 is tested on Arch <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8894">#8894</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Move XV Thumbnails to read only section <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8893">#8893</a>
[<a
href="https://github.com/aclark4life"><code>@​aclark4life</code></a>]</li>
<li>Updated macOS tested Pillow versions <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8890">#8890</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
</ul>
<h2>Dependencies</h2>
<ul>
<li>Add AVIF to wheels using only aomenc and dav1d AVIF codecs for
reduced size <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8858">#8858</a>
[<a href="https://github.com/fdintino"><code>@​fdintino</code></a>]</li>
<li>Use same AVIF URL when fetching dependency <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8871">#8871</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update dependency mypy to v1.16.1 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9026">#9026</a>
[@<a href="https://github.com/apps/renovate">renovate[bot]</a>]</li>
<li>Update libpng to 1.6.49 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9014">#9014</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update dependency cibuildwheel to v3 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9010">#9010</a>
[@<a href="https://github.com/apps/renovate">renovate[bot]</a>]</li>
<li>Updated libjpeg-turbo to 3.1.1 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9009">#9009</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update dependency mypy to v1.16.0 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8991">#8991</a>
[@<a href="https://github.com/apps/renovate">renovate[bot]</a>]</li>
<li>Updated libpng to 1.6.48 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8940">#8940</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Updated Ghostscript to 10.5.1 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8939">#8939</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Updated harfbuzz to 11.2.1 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8937">#8937</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Updated libavif to 1.3.0 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8949">#8949</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update dependency cibuildwheel to v2.23.3 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8931">#8931</a>
[@<a href="https://github.com/apps/renovate">renovate[bot]</a>]</li>
<li>Updated harfbuzz to 11.1.0 <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8904">#8904</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
</ul>
<h2>Testing</h2>
<ul>
<li>Add <code>match</code> parameter to <code>pytest.warns()</code> <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9038">#9038</a>
[<a href="https://github.com/hugovk"><code>@​hugovk</code></a>]</li>
<li>Increase pytest verbosity <a
href="https://redirect.github.com/python-pillow/Pillow/issues/9040">#9040</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Improve SgiImagePlugin test coverage <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8896">#8896</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
<li>Update ruff pre-commit ID <a
href="https://redirect.github.com/python-pillow/Pillow/issues/8994">#8994</a>
[<a
href="https://github.com/radarhere"><code>@​radarhere</code></a>]</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
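
One of the 11.3.0 deprecations listed above is the `mode` argument to `Image.fromarray`; the mode is instead inferred from the array's dtype and shape. A small sketch of the non-deprecated call, assuming NumPy is available (the output filename is arbitrary):

```python
import numpy as np
from PIL import Image

# 8-bit grayscale data: shape (H, W), dtype uint8.
arr = np.linspace(0, 255, 64 * 64, dtype=np.uint8).reshape(64, 64)

# Passing mode= here is deprecated as of 11.3.0; Pillow infers mode "L"
# from the uint8 2-D array on its own.
img = Image.fromarray(arr)
img.save("gradient.png")
```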
<details>
<summary>Commits</summary>
<ul>
<li><a
href="89f1f4626a"><code>89f1f46</code></a>
11.3.0 version bump</li>
<li><a
href="f2de251c76"><code>f2de251</code></a>
Updated check script paths (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9052">#9052</a>)</li>
<li><a
href="84855d11c8"><code>84855d1</code></a>
Raise FileNotFoundError when opening an empty path (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9048">#9048</a>)</li>
<li><a
href="204d11d4da"><code>204d11d</code></a>
Raise FileNotFoundError when opening an empty path</li>
<li><a
href="2b39f7581e"><code>2b39f75</code></a>
Handle IPTC TIFF tags with incorrect type (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/8925">#8925</a>)</li>
<li><a
href="e7a53ba19b"><code>e7a53ba</code></a>
Do not update palette for L mode GIF frame (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/8924">#8924</a>)</li>
<li><a
href="c22230b761"><code>c22230b</code></a>
Use save parameters as encoderinfo defaults (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9001">#9001</a>)</li>
<li><a
href="da10ed1cf3"><code>da10ed1</code></a>
Add support for iOS (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/9030">#9030</a>)</li>
<li><a
href="be2b4e7864"><code>be2b4e7</code></a>
Fix qtables and quality scaling (<a
href="https://redirect.github.com/python-pillow/Pillow/issues/8879">#8879</a>)</li>
<li><a
href="d4162f8505"><code>d4162f8</code></a>
Updated return type</li>
<li>Additional commits viewable in <a
href="https://github.com/python-pillow/Pillow/compare/11.2.1...11.3.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pillow&package-manager=pip&previous-version=11.2.1&new-version=11.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 18:29:37 +02:00
dependabot[bot]
81ca2923d1
Bump types-jsonschema from 4.23.0.20250516 to 4.24.0.20250528 (#18611)
Bumps
[types-jsonschema](https://github.com/typeshed-internal/stub_uploader)
from 4.23.0.20250516 to 4.24.0.20250528.
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/typeshed-internal/stub_uploader/commits">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=types-jsonschema&package-manager=pip&previous-version=4.23.0.20250516&new-version=4.24.0.20250528)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 18:25:50 +02:00
dependabot[bot]
2372f3e6b7
Bump sigstore/cosign-installer from 3.9.0 to 3.9.1 (#18608)
Bumps
[sigstore/cosign-installer](https://github.com/sigstore/cosign-installer)
from 3.9.0 to 3.9.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/sigstore/cosign-installer/releases">sigstore/cosign-installer's
releases</a>.</em></p>
<blockquote>
<h2>v3.9.1</h2>
<h2>What's Changed</h2>
<ul>
<li>default action install to use release v2.5.1 by <a
href="https://github.com/cpanato"><code>@​cpanato</code></a> in <a
href="https://redirect.github.com/sigstore/cosign-installer/pull/193">sigstore/cosign-installer#193</a></li>
<li>default cosign to v2.5.2 by <a
href="https://github.com/cpanato"><code>@​cpanato</code></a> in <a
href="https://redirect.github.com/sigstore/cosign-installer/pull/194">sigstore/cosign-installer#194</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/sigstore/cosign-installer/compare/v3.9.0...v3.9.1">https://github.com/sigstore/cosign-installer/compare/v3.9.0...v3.9.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="398d4b0eee"><code>398d4b0</code></a>
default cosign to v2.5.2 (<a
href="https://redirect.github.com/sigstore/cosign-installer/issues/194">#194</a>)</li>
<li><a
href="84f54a2bcd"><code>84f54a2</code></a>
default action install to use release v2.5.1 (<a
href="https://redirect.github.com/sigstore/cosign-installer/issues/193">#193</a>)</li>
<li>See full diff in <a
href="fb28c2b633...398d4b0eee">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=sigstore/cosign-installer&package-manager=github_actions&previous-version=3.9.0&new-version=3.9.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 18:22:41 +02:00
dependabot[bot]
82757144e9
Bump stefanzweifel/git-auto-commit-action from 5.2.0 to 6.0.1 (#18607)
Bumps
[stefanzweifel/git-auto-commit-action](https://github.com/stefanzweifel/git-auto-commit-action)
from 5.2.0 to 6.0.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/stefanzweifel/git-auto-commit-action/releases">stefanzweifel/git-auto-commit-action's
releases</a>.</em></p>
<blockquote>
<h2>v6.0.1</h2>
<h2>Fixed</h2>
<ul>
<li>Disable Check if Repo is in Detached State (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/379">#379</a>)
<a
href="https://github.com/@stefanzweifel"><code>@​stefanzweifel</code></a></li>
</ul>
<h2>v6.0.0</h2>
<h2>Added</h2>
<ul>
<li>Throw error early if repository is in a detached state (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/357">#357</a>)</li>
</ul>
<h2>Fixed</h2>
<ul>
<li>Fix PAT instructions with Dependabot (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/376">#376</a>)
<a
href="https://github.com/@Dreamsorcerer"><code>@​Dreamsorcerer</code></a></li>
</ul>
<h2>Removed</h2>
<ul>
<li>Remove support for <code>create_branch</code>,
<code>skip_checkout</code>, <code>skip_Fetch</code> (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/314">#314</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/stefanzweifel/git-auto-commit-action/blob/master/CHANGELOG.md">stefanzweifel/git-auto-commit-action's
changelog</a>.</em></p>
<blockquote>
<h1>Changelog</h1>
<p>All notable changes to this project will be documented in this
file.</p>
<p>The format is based on <a
href="http://keepachangelog.com/en/1.0.0/">Keep a Changelog</a>
and this project adheres to <a
href="http://semver.org/spec/v2.0.0.html">Semantic Versioning</a>.</p>
<h2><a
href="https://github.com/stefanzweifel/git-auto-commit-action/compare/v6.0.1...HEAD">Unreleased</a></h2>
<blockquote>
<p>TBD</p>
</blockquote>
<h2><a
href="https://github.com/stefanzweifel/git-auto-commit-action/compare/v6.0.0...v6.0.1">v6.0.1</a>
- 2025-06-11</h2>
<h3>Fixed</h3>
<ul>
<li>Disable Check if Repo is in Detached State (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/379">#379</a>)
<a
href="https://github.com/@stefanzweifel"><code>@​stefanzweifel</code></a></li>
</ul>
<h2><a
href="https://github.com/stefanzweifel/git-auto-commit-action/compare/v5.2.0...v6.0.0">v6.0.0</a>
- 2025-06-10</h2>
<h3>Added</h3>
<ul>
<li>Throw error early if repository is in a detached state (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/357">#357</a>)</li>
</ul>
<h3>Fixed</h3>
<ul>
<li>Fix PAT instructions with Dependabot (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/376">#376</a>)
<a
href="https://github.com/@Dreamsorcerer"><code>@​Dreamsorcerer</code></a></li>
</ul>
<h3>Removed</h3>
<ul>
<li>Remove support for <code>create_branch</code>,
<code>skip_checkout</code>, <code>skip_Fetch</code> (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/314">#314</a>)</li>
</ul>
<h2><a
href="https://github.com/stefanzweifel/git-auto-commit-action/compare/v5.1.0...v5.2.0">v5.2.0</a>
- 2025-04-19</h2>
<h3>Added</h3>
<ul>
<li>Add <code>create_git_tag_only</code> option to skip committing and
always create a git-tag. (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/364">#364</a>)
<a href="https://github.com/@zMynxx"><code>@​zMynxx</code></a></li>
<li>Add Test for <code>create_git_tag_only</code> feature (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/367">#367</a>)
<a
href="https://github.com/@stefanzweifel"><code>@​stefanzweifel</code></a></li>
</ul>
<h3>Fixed</h3>
<ul>
<li>docs: Update README.md per <a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/issues/354">#354</a>
(<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/361">#361</a>)
<a href="https://github.com/@rasa"><code>@​rasa</code></a></li>
</ul>
<h2><a
href="https://github.com/stefanzweifel/git-auto-commit-action/compare/v5.0.1...v5.1.0">v5.1.0</a>
- 2025-01-11</h2>
<h3>Changed</h3>
<ul>
<li>Include <code>github.actor_id</code> in default
<code>commit_author</code> (<a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/pull/354">#354</a>)
<a
href="https://github.com/@parkerbxyz"><code>@​parkerbxyz</code></a></li>
</ul>
<h3>Fixed</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="778341af66"><code>778341a</code></a>
Merge pull request <a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/issues/379">#379</a>
from stefanzweifel/disable-detached-state-check</li>
<li><a
href="33b203d92a"><code>33b203d</code></a>
Disable Check if Repo is in Detached State</li>
<li><a
href="a82d80a75f"><code>a82d80a</code></a>
Update CHANGELOG</li>
<li><a
href="3cc016cfc8"><code>3cc016c</code></a>
Merge pull request <a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/issues/375">#375</a>
from stefanzweifel/v6-next</li>
<li><a
href="ddb7ae4159"><code>ddb7ae4</code></a>
Merge pull request <a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/issues/376">#376</a>
from Dreamsorcerer/patch-1</li>
<li><a
href="b001e5f0ff"><code>b001e5f</code></a>
Apply suggestions from code review</li>
<li><a
href="6494dc61d3"><code>6494dc6</code></a>
Fix PAT instructions with Dependabot</li>
<li><a
href="76180511d9"><code>7618051</code></a>
Add deprecated inputs to fix unbound variable issue</li>
<li><a
href="ae114628ea"><code>ae11462</code></a>
Merge pull request <a
href="https://redirect.github.com/stefanzweifel/git-auto-commit-action/issues/371">#371</a>
from stefanzweifel/dependabot/npm_and_yarn/bats-1.12.0</li>
<li><a
href="3058f91afb"><code>3058f91</code></a>
Bump bats from 1.11.1 to 1.12.0</li>
<li>Additional commits viewable in <a
href="b863ae1933...778341af66">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=stefanzweifel/git-auto-commit-action&package-manager=github_actions&previous-version=5.2.0&new-version=6.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-07-02 18:21:51 +02:00
Andrew Ferrazzutti
2f9c9d5eba
Forbid locked users from using POST /login (#18594)
Discussed in the [Synapse Dev
room](https://matrix.to/#/!vcyiEtMVHIhWXcJAfl:sw1v.org/$K4UojQtvaSpxSe35TWFXtKWGoAuHwHFcKo8qn2lwxSs?via=matrix.org&via=element.io&via=envs.net)
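
As a rough sketch of the behaviour this adds (illustrative only, not Synapse's actual login handler; the `get_user_locked_status` lookup is a made-up stand-in for the real storage call), a locked account is rejected with a 401 and `M_USER_LOCKED` before any access token is issued:

```python
# Illustrative sketch only -- not Synapse's real code. The storage lookup name
# below is invented; the point is the 401 + M_USER_LOCKED rejection for locked
# accounts before login completes.
class LoginError(Exception):
    def __init__(self, code: int, msg: str, errcode: str) -> None:
        super().__init__(msg)
        self.code = code
        self.errcode = errcode


async def check_account_not_locked(store, user_id: str) -> None:
    """Reject the login attempt if the account has been locked by an admin."""
    if await store.get_user_locked_status(user_id):  # hypothetical storage call
        raise LoginError(401, "This account has been locked", errcode="M_USER_LOCKED")
```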

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog
  file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
  The entry should:
  - Be a short description of your change which makes sense to users.
    "Fixed a bug that prevented receiving messages from other servers."
    instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
  - Feel free to credit yourself, by adding a sentence "Contributed by
    @github_username." or "Contributed by [Your Name]." to the end of the
    entry.
* [x] [Code
  style](https://element-hq.github.io/synapse/latest/code_style.html) is
  correct (run the
  [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
2025-07-02 18:18:33 +02:00
V02460
6ddbb03612
Raise poetry-core version cap to 2.1.3 (#18575)
Request to raise the defensive version cap for poetry-core from 1.9.1 to
2.1.3.

My understanding is that the major version bump of poetry signals the
transition to standardized pyproject.toml metadata, but does not affect
backwards compatibility.

This is a subset of the changes in #18432

Fixes #18200
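
To make the cap concrete (a hedged sketch, not part of Synapse's build tooling; the helper below is invented for illustration), the change means a locally installed poetry-core anywhere up to 2.1.3 now satisfies the build requirement:

```python
# Illustrative only: check whether the installed poetry-core falls within the
# raised cap (<= 2.1.3). This is not how Synapse's packaging enforces the bound;
# the real constraint lives in pyproject.toml's [build-system] requires list.
from importlib.metadata import PackageNotFoundError, version


def poetry_core_within_cap(cap: tuple = (2, 1, 3)) -> bool:
    try:
        installed = tuple(int(part) for part in version("poetry-core").split(".")[:3])
    except (PackageNotFoundError, ValueError):
        return False
    return installed <= cap
```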

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [x] Pull request is based on the develop branch
* [x] Pull request includes a [changelog
  file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
  The entry should:
  - Be a short description of your change which makes sense to users.
    "Fixed a bug that prevented receiving messages from other servers."
    instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
  - Feel free to credit yourself, by adding a sentence "Contributed by
    @github_username." or "Contributed by [Your Name]." to the end of the
    entry.
* [x] [Code
  style](https://element-hq.github.io/synapse/latest/code_style.html) is
  correct (run the
  [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
2025-07-02 15:57:30 +00:00
Erik Johnston
cc8da2c5ed
Log the room ID we're purging state for (#18625)
So we can see what we're deleting.
2025-07-02 15:02:12 +01:00
reivilibre
c17fd947f3
Fix documentation of the Delete Room Admin API's status field. (#18519)
Fixes: #18502

---------

Signed-off-by: Olivier 'reivilibre' <oliverw@matrix.org>
2025-07-01 17:55:38 +01:00
Eric Eastwood
e0f8992ee3 Fix failing to save metrics because of incorrect label names
We would see this in the logs before:
```
Failed to save metrics! Usage: <ContextResourceUsage ...> Error: Incorrect label names
```
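
The "Incorrect label names" text is the Prometheus Python client complaining that `.labels()` was called with keyword label names that don't match the metric's declared label set. A small self-contained reproduction (the metric and label names here are illustrative, not Synapse's exact ones):

```python
# Reproduces the error class seen above with prometheus_client: calling .labels()
# with a keyword set that doesn't match the declared labelnames raises
# ValueError("Incorrect label names"). Metric/label names are illustrative.
from prometheus_client import Counter

block_counter = Counter(
    "example_block_count",
    "Number of times a measured block was entered",
    labelnames=["block_name", "server_name"],
)

# Matches the declared labels: fine.
block_counter.labels(block_name="send_event", server_name="hs.example.com").inc()

# Missing a declared label: raises ValueError("Incorrect label names").
try:
    block_counter.labels(block_name="send_event").inc()
except ValueError as exc:
    print(exc)
```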
2025-06-26 16:43:13 -05:00
Eric Eastwood
06f9af155b Add introduction comment 2025-06-26 16:19:35 -05:00
Eric Eastwood
5ad555cefc Add docstrings for block metrics 2025-06-26 16:18:37 -05:00
Eric Eastwood
652c34bda6 Better docstrings for _InFlightMetric -> _BlockInFlightMetric 2025-06-26 16:16:37 -05:00
Eric Eastwood
521c68cafe Add changelog 2025-06-26 16:13:39 -05:00
Eric Eastwood
c232ec7b3b Fix mypy complaining about unknown types by changing property order around
Fix mypy complaints

```
synapse/handlers/delayed_events.py:266: error: Cannot determine type of "validator"  [has-type]
synapse/handlers/delayed_events.py:267: error: Cannot determine type of "event_builder_factory"  [has-type]
```
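
Roughly the shape of the problem (a minimal, hedged sketch rather than the actual `delayed_events.py` code): when an attribute is read before the assignment mypy uses to infer its type, mypy cannot determine the type at the use site; reordering the definitions, as this commit does, resolves it.

```python
# Minimal illustration of the ordering issue, not the real Synapse code.
class Broken:
    def __init__(self) -> None:
        # mypy cannot yet infer the type of `validator` here, so it reports
        # something like: Cannot determine type of "validator"  [has-type]
        self._check = self.validator
        self.validator = str.isidentifier


class Fixed:
    def __init__(self) -> None:
        self.validator = str.isidentifier  # define first...
        self._check = self.validator  # ...then reference it; the type is known
```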
2025-06-26 16:09:34 -05:00
Eric Eastwood
c7d15dbcc7 Bulk refactor @measure_func decorator usage 2025-06-26 16:06:50 -05:00
Eric Eastwood
6731c4bbf0 Refactor Measure in WellKnownResolver 2025-06-26 16:06:50 -05:00
Eric Eastwood
d05b6ca4c1 Bulk refactor Measure(...) to add server_name 2025-06-26 16:06:50 -05:00
Eric Eastwood
65035b6098 Refactor @measure_func decorator to include server name
Update `@measure_func` docstring
2025-06-26 16:06:49 -05:00
Eric Eastwood
02a7668bb2 Add instance label to Measure 2025-06-26 15:53:24 -05:00
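
The real `Measure` / `@measure_func` helpers live in `synapse.util.metrics` and have different signatures; the following is only a rough sketch of the idea behind these commits, i.e. threading the homeserver's `server_name` through the measurement so block metrics can be labelled per homeserver:

```python
# Rough sketch only -- the real implementations are in synapse.util.metrics and
# differ in detail. It illustrates passing a server_name through a timing
# decorator so the recorded block metrics are scoped to a homeserver.
import functools
import time
from typing import Any, Awaitable, Callable


def measure_func(block_name: str) -> Callable[[Callable[..., Awaitable[Any]]], Callable[..., Awaitable[Any]]]:
    def decorator(func: Callable[..., Awaitable[Any]]) -> Callable[..., Awaitable[Any]]:
        @functools.wraps(func)
        async def measured(self: Any, *args: Any, **kwargs: Any) -> Any:
            start = time.monotonic()
            try:
                return await func(self, *args, **kwargs)
            finally:
                # `self.server_name` is assumed to be set on the handler; in the
                # real refactor the server name is passed to Measure explicitly.
                record_block(block_name, self.server_name, time.monotonic() - start)

        return measured

    return decorator


def record_block(block_name: str, server_name: str, seconds: float) -> None:
    # Stand-in for updating Prometheus metrics labelled by block and server name.
    print(f"{server_name} {block_name}: {seconds:.3f}s")
```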
54 changed files with 677 additions and 285 deletions

View File

@ -30,7 +30,7 @@ jobs:
run: docker buildx inspect
- name: Install Cosign
uses: sigstore/cosign-installer@fb28c2b6339dcd94da6e4cbcbc5e888961f6f8c3 # v3.9.0
uses: sigstore/cosign-installer@398d4b0eeef1380460a10c8013a76f728fb906ac # v3.9.1
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

View File

@ -6,6 +6,11 @@ name: Attempt to automatically fix linting errors
on:
workflow_dispatch:
env:
# We use nightly so that `fmt` correctly groups together imports, and
# clippy correctly fixes up the benchmarks.
RUST_VERSION: nightly-2025-06-24
jobs:
fixup:
name: Fix up
@ -16,13 +21,11 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@56f84321dbccf38fb67ce29ab63e4754056677e0 # master (rust 1.85.1)
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
# We use nightly so that `fmt` correctly groups together imports, and
# clippy correctly fixes up the benchmarks.
toolchain: nightly-2022-12-01
toolchain: ${{ env.RUST_VERSION }}
components: clippy, rustfmt
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Setup Poetry
uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
@ -44,6 +47,6 @@ jobs:
- run: cargo fmt
continue-on-error: true
- uses: stefanzweifel/git-auto-commit-action@b863ae1933cb653a53c021fe36dbb774e1fb9403 # v5.2.0
- uses: stefanzweifel/git-auto-commit-action@778341af668090896ca464160c2def5d1d1a3eb0 # v6.0.1
with:
commit_message: "Attempt to fix linting"

View File

@ -21,6 +21,9 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
RUST_VERSION: 1.87.0
jobs:
check_repo:
# Prevent this workflow from running on any fork of Synapse other than element-hq/synapse, as it is
@ -41,8 +44,10 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@fcf085fcb4b4b8f63f96906cd713eb52181b5ea4 # stable (rust 1.85.1)
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
# The dev dependencies aren't exposed in the wheel metadata (at least with current
# poetry-core versions), so we install with poetry.
@ -75,8 +80,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@fcf085fcb4b4b8f63f96906cd713eb52181b5ea4 # stable (rust 1.85.1)
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: sudo apt-get -qq install xmlsec1
- name: Set up PostgreSQL ${{ matrix.postgres-version }}
@ -148,8 +155,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@fcf085fcb4b4b8f63f96906cd713eb52181b5ea4 # stable (rust 1.85.1)
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Ensure sytest runs `pip install`
# Delete the lockfile so sytest will `pip install` rather than `poetry install`

View File

@ -11,6 +11,9 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
RUST_VERSION: 1.87.0
jobs:
# Job to detect what has changed so we don't run e.g. Rust checks on PRs that
# don't modify Rust code.
@ -85,8 +88,10 @@ jobs:
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
with:
python-version: "3.x"
@ -149,8 +154,10 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Setup Poetry
uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
@ -210,8 +217,10 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
with:
poetry-version: "2.1.1"
@ -227,10 +236,11 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@0d72692bcfbf448b1e2afa01a67f71b455a9dcec # 1.86.0
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
components: clippy
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo clippy -- -D warnings
@ -245,11 +255,11 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@56f84321dbccf38fb67ce29ab63e4754056677e0 # master (rust 1.85.1)
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: nightly-2025-04-23
components: clippy
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo clippy --all-features -- -D warnings
@ -262,12 +272,12 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@56f84321dbccf38fb67ce29ab63e4754056677e0 # master (rust 1.85.1)
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
# We use nightly so that it correctly groups together imports
toolchain: nightly-2025-04-23
components: rustfmt
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo fmt --check
@ -362,8 +372,10 @@ jobs:
postgres:${{ matrix.job.postgres-version }}
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
with:
@ -404,8 +416,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
# There aren't wheels for some of the older deps, so we need to install
# their build dependencies
@ -519,8 +533,10 @@ jobs:
run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Run SyTest
run: /bootstrap.sh synapse
@ -663,8 +679,10 @@ jobs:
path: synapse
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Prepare Complement's Prerequisites
run: synapse/.ci/scripts/setup_complement_prerequisites.sh
@ -695,8 +713,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@c1678930c21fb233e4987c4ae12158f9125e5762 # 1.81.0
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo test
@ -713,10 +733,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@56f84321dbccf38fb67ce29ab63e4754056677e0 # master (rust 1.85.1)
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: nightly-2022-12-01
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- run: cargo bench --no-run

View File

@ -20,6 +20,9 @@ concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
env:
RUST_VERSION: 1.87.0
jobs:
check_repo:
# Prevent this workflow from running on any fork of Synapse other than element-hq/synapse, as it is
@ -43,8 +46,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@fcf085fcb4b4b8f63f96906cd713eb52181b5ea4 # stable (rust 1.85.1)
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
with:
@ -69,8 +74,10 @@ jobs:
- run: sudo apt-get -qq install xmlsec1
- name: Install Rust
uses: dtolnay/rust-toolchain@fcf085fcb4b4b8f63f96906cd713eb52181b5ea4 # stable (rust 1.85.1)
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
with:
@ -113,8 +120,10 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Install Rust
uses: dtolnay/rust-toolchain@fcf085fcb4b4b8f63f96906cd713eb52181b5ea4 # stable (rust 1.85.1)
- uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8
uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
with:
toolchain: ${{ env.RUST_VERSION }}
- uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
- name: Patch dependencies
# Note: The poetry commands want to create a virtualenv in /src/.venv/,

31
Cargo.lock generated
View File

@ -65,12 +65,6 @@ dependencies = [
"windows-targets",
]
[[package]]
name = "base64"
version = "0.21.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d297deb1925b89f2ccc13d7635fa0714f12c87adce1c75356b39ca9b7178567"
[[package]]
name = "base64"
version = "0.22.1"
@ -371,7 +365,7 @@ version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b3314d5adb5d94bcdf56771f2e50dbbc80bb4bdf88967526706205ac9eff24eb"
dependencies = [
"base64 0.22.1",
"base64",
"bytes",
"headers-core",
"http",
@ -491,7 +485,7 @@ version = "0.1.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc2fdfdbff08affe55bb779f33b053aa1fe5dd5b54c257343c17edfa55711bdb"
dependencies = [
"base64 0.22.1",
"base64",
"bytes",
"futures-channel",
"futures-core",
@ -664,6 +658,17 @@ version = "2.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b248f5224d1d606005e02c97f5aa4e88eeb230488bcc03bc9ca4d7991399f2b5"
[[package]]
name = "io-uring"
version = "0.7.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b86e202f00093dcba4275d4636b93ef9dd75d025ae560d2521b45ea28ab49013"
dependencies = [
"bitflags",
"cfg-if",
"libc",
]
[[package]]
name = "ipnet"
version = "2.11.0"
@ -1059,7 +1064,7 @@ version = "0.12.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eabf4c97d9130e2bf606614eb937e86edac8292eaa6f422f995d7e8de1eb1813"
dependencies = [
"base64 0.22.1",
"base64",
"bytes",
"futures-core",
"futures-util",
@ -1333,7 +1338,7 @@ name = "synapse"
version = "0.1.0"
dependencies = [
"anyhow",
"base64 0.21.7",
"base64",
"blake2",
"bytes",
"futures",
@ -1429,15 +1434,17 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "tokio"
version = "1.45.1"
version = "1.46.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75ef51a33ef1da925cea3e4eb122833cb377c61439ca401b770f54902b806779"
checksum = "1140bb80481756a8cbe10541f37433b459c5aa1e727b4c020fbfebdc25bf3ec4"
dependencies = [
"backtrace",
"bytes",
"io-uring",
"libc",
"mio",
"pin-project-lite",
"slab",
"socket2",
"windows-sys 0.52.0",
]

1
changelog.d/18519.doc Normal file
View File

@ -0,0 +1 @@
Fix documentation of the Delete Room Admin API's status field.

1
changelog.d/18575.misc Normal file
View File

@ -0,0 +1 @@
Raise poetry-core version cap to 2.1.3.

1
changelog.d/18594.bugfix Normal file
View File

@ -0,0 +1 @@
Respond with 401 & `M_USER_LOCKED` when a locked user calls `POST /login`, as per the spec.

1
changelog.d/18596.misc Normal file
View File

@ -0,0 +1 @@
Update to Rust 1.87.0 in CI, and bump the pinned commit of the `dtolnay/rust-toolchain` GitHub Action to `b3b07ba8b418998c39fb20f53e8b695cdcc8de1b`.

1
changelog.d/18601.misc Normal file
View File

@ -0,0 +1 @@
Refactor `Measure` block metrics to be homeserver-scoped.

1
changelog.d/18625.misc Normal file
View File

@ -0,0 +1 @@
Log the room ID we're purging state for.

View File

@ -806,7 +806,7 @@ A response body like the following is returned:
}, {
"delete_id": "delete_id2",
"room_id": "!roomid:example.com",
"status": "purging",
"status": "active",
"shutdown_room": {
"kicked_users": [
"@foobar:example.com"
@ -843,7 +843,7 @@ A response body like the following is returned:
```json
{
"status": "purging",
"status": "active",
"delete_id": "bHkCNQpHqOaFhPtK",
"room_id": "!roomid:example.com",
"shutdown_room": {
@ -876,8 +876,8 @@ The following fields are returned in the JSON response body:
- `delete_id` - The ID for this purge
- `room_id` - The ID of the room being deleted
- `status` - The status will be one of:
- `shutting_down` - The process is removing users from the room.
- `purging` - The process is purging the room and event data from database.
- `scheduled` - The deletion is waiting to be started
- `active` - The process is purging the room and event data from database.
- `complete` - The process has completed successfully.
- `failed` - The process is aborted, an error has occurred.
- `error` - A string that shows an error message if `status` is `failed`.
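
For example, a client of the Admin API would now poll the delete-status endpoint until the purge leaves the `scheduled`/`active` states (a hedged sketch; the base URL and token are placeholders, and the path is taken from the surrounding documentation rather than tested here):

```python
# Sketch of polling the v2 delete-status endpoint and handling the corrected
# status values documented above. URL, token, and error handling are
# illustrative assumptions, not a tested client.
import json
import time
import urllib.request

BASE = "https://synapse.example.com"  # placeholder homeserver
TOKEN = "ADMIN_ACCESS_TOKEN"          # placeholder admin access token


def poll_delete_status(delete_id: str, interval: float = 5.0) -> dict:
    url = f"{BASE}/_synapse/admin/v2/rooms/delete_status/{delete_id}"
    while True:
        req = urllib.request.Request(url, headers={"Authorization": f"Bearer {TOKEN}"})
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        if body["status"] in ("complete", "failed"):
            return body
        # "scheduled" = waiting to start, "active" = purging room and event data
        time.sleep(interval)
```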

267
poetry.lock generated
View File

@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
# This file is automatically @generated by Poetry 2.1.1 and should not be changed by hand.
[[package]]
name = "annotated-types"
@ -39,7 +39,7 @@ description = "The ultimate Python library in building OAuth and OpenID Connect
optional = true
python-versions = ">=3.9"
groups = ["main"]
markers = "extra == \"oidc\" or extra == \"jwt\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"jwt\" or extra == \"oidc\""
files = [
{file = "authlib-1.5.2-py2.py3-none-any.whl", hash = "sha256:8804dd4402ac5e4a0435ac49e0b6e19e395357cfa632a3f624dcb4f6df13b4b1"},
{file = "authlib-1.5.2.tar.gz", hash = "sha256:fe85ec7e50c5f86f1e2603518bb3b4f632985eb4a355e52256530790e326c512"},
@ -450,7 +450,7 @@ description = "XML bomb protection for Python stdlib modules"
optional = true
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
groups = ["main"]
markers = "extra == \"saml2\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"saml2\""
files = [
{file = "defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61"},
{file = "defusedxml-0.7.1.tar.gz", hash = "sha256:1bb3032db185915b62d7c6209c5a8792be6a32ab2fedacc84e01b52c51aa3e69"},
@ -493,7 +493,7 @@ description = "XPath 1.0/2.0/3.0/3.1 parsers and selectors for ElementTree and l
optional = true
python-versions = ">=3.7"
groups = ["main"]
markers = "extra == \"saml2\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"saml2\""
files = [
{file = "elementpath-4.1.5-py3-none-any.whl", hash = "sha256:2ac1a2fb31eb22bbbf817f8cf6752f844513216263f0e3892c8e79782fe4bb55"},
{file = "elementpath-4.1.5.tar.gz", hash = "sha256:c2d6dc524b29ef751ecfc416b0627668119d8812441c555d7471da41d4bacb8d"},
@ -543,7 +543,7 @@ description = "Python wrapper for hiredis"
optional = true
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"redis\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"redis\""
files = [
{file = "hiredis-3.1.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:2892db9db21f0cf7cc298d09f85d3e1f6dc4c4c24463ab67f79bc7a006d51867"},
{file = "hiredis-3.1.0-cp310-cp310-macosx_10_15_x86_64.whl", hash = "sha256:93cfa6cc25ee2ceb0be81dc61eca9995160b9e16bdb7cca4a00607d57e998918"},
@ -889,7 +889,7 @@ description = "Jaeger Python OpenTracing Tracer implementation"
optional = true
python-versions = ">=3.7"
groups = ["main"]
markers = "extra == \"opentracing\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"opentracing\""
files = [
{file = "jaeger-client-4.8.0.tar.gz", hash = "sha256:3157836edab8e2c209bd2d6ae61113db36f7ee399e66b1dcbb715d87ab49bfe0"},
]
@ -1027,7 +1027,7 @@ description = "A strictly RFC 4510 conforming LDAP V3 pure Python client library
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"matrix-synapse-ldap3\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"matrix-synapse-ldap3\""
files = [
{file = "ldap3-2.9.1-py2.py3-none-any.whl", hash = "sha256:5869596fc4948797020d3f03b7939da938778a0f9e2009f7a072ccf92b8e8d70"},
{file = "ldap3-2.9.1.tar.gz", hash = "sha256:f3e7fc4718e3f09dda568b57100095e0ce58633bcabbed8667ce3f8fbaa4229f"},
@ -1043,7 +1043,7 @@ description = "Powerful and Pythonic XML processing library combining libxml2/li
optional = true
python-versions = ">=3.6"
groups = ["main"]
markers = "extra == \"url-preview\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"url-preview\""
files = [
{file = "lxml-5.4.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e7bc6df34d42322c5289e37e9971d6ed114e3776b45fa879f734bded9d1fea9c"},
{file = "lxml-5.4.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6854f8bd8a1536f8a1d9a3655e6354faa6406621cf857dc27b681b69860645c7"},
@ -1323,7 +1323,7 @@ description = "An LDAP3 auth provider for Synapse"
optional = true
python-versions = ">=3.7"
groups = ["main"]
markers = "extra == \"matrix-synapse-ldap3\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"matrix-synapse-ldap3\""
files = [
{file = "matrix-synapse-ldap3-0.3.0.tar.gz", hash = "sha256:8bb6517173164d4b9cc44f49de411d8cebdb2e705d5dd1ea1f38733c4a009e1d"},
{file = "matrix_synapse_ldap3-0.3.0-py3-none-any.whl", hash = "sha256:8b4d701f8702551e98cc1d8c20dbed532de5613584c08d0df22de376ba99159d"},
@ -1436,6 +1436,22 @@ files = [
{file = "msgpack-1.1.0.tar.gz", hash = "sha256:dd432ccc2c72b914e4cb77afce64aab761c1137cc698be3984eee260bcb2896e"},
]
[[package]]
name = "multipart"
version = "1.2.1"
description = "Parser for multipart/form-data"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "multipart-1.2.1-py3-none-any.whl", hash = "sha256:c03dc203bc2e67f6b46a599467ae0d87cf71d7530504b2c1ff4a9ea21d8b8c8c"},
{file = "multipart-1.2.1.tar.gz", hash = "sha256:829b909b67bc1ad1c6d4488fcdc6391c2847842b08323addf5200db88dbe9480"},
]
[package.extras]
dev = ["build", "pytest", "pytest-cov", "twine"]
docs = ["sphinx (>=8,<9)", "sphinx-autobuild"]
[[package]]
name = "mypy"
version = "1.13.0"
@ -1544,7 +1560,7 @@ description = "OpenTracing API for Python. See documentation at http://opentraci
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"opentracing\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"opentracing\""
files = [
{file = "opentracing-2.4.0.tar.gz", hash = "sha256:a173117e6ef580d55874734d1fa7ecb6f3655160b8b8974a2a1e98e5ec9c840d"},
]
@ -1593,114 +1609,119 @@ files = [
[[package]]
name = "pillow"
version = "11.2.1"
version = "11.3.0"
description = "Python Imaging Library (Fork)"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pillow-11.2.1-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:d57a75d53922fc20c165016a20d9c44f73305e67c351bbc60d1adaf662e74047"},
{file = "pillow-11.2.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:127bf6ac4a5b58b3d32fc8289656f77f80567d65660bc46f72c0d77e6600cc95"},
{file = "pillow-11.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b4ba4be812c7a40280629e55ae0b14a0aafa150dd6451297562e1764808bbe61"},
{file = "pillow-11.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c8bd62331e5032bc396a93609982a9ab6b411c05078a52f5fe3cc59234a3abd1"},
{file = "pillow-11.2.1-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:562d11134c97a62fe3af29581f083033179f7ff435f78392565a1ad2d1c2c45c"},
{file = "pillow-11.2.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:c97209e85b5be259994eb5b69ff50c5d20cca0f458ef9abd835e262d9d88b39d"},
{file = "pillow-11.2.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0c3e6d0f59171dfa2e25d7116217543310908dfa2770aa64b8f87605f8cacc97"},
{file = "pillow-11.2.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cc1c3bc53befb6096b84165956e886b1729634a799e9d6329a0c512ab651e579"},
{file = "pillow-11.2.1-cp310-cp310-win32.whl", hash = "sha256:312c77b7f07ab2139924d2639860e084ec2a13e72af54d4f08ac843a5fc9c79d"},
{file = "pillow-11.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:9bc7ae48b8057a611e5fe9f853baa88093b9a76303937449397899385da06fad"},
{file = "pillow-11.2.1-cp310-cp310-win_arm64.whl", hash = "sha256:2728567e249cdd939f6cc3d1f049595c66e4187f3c34078cbc0a7d21c47482d2"},
{file = "pillow-11.2.1-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:35ca289f712ccfc699508c4658a1d14652e8033e9b69839edf83cbdd0ba39e70"},
{file = "pillow-11.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e0409af9f829f87a2dfb7e259f78f317a5351f2045158be321fd135973fff7bf"},
{file = "pillow-11.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4e5c5edee874dce4f653dbe59db7c73a600119fbea8d31f53423586ee2aafd7"},
{file = "pillow-11.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b93a07e76d13bff9444f1a029e0af2964e654bfc2e2c2d46bfd080df5ad5f3d8"},
{file = "pillow-11.2.1-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:e6def7eed9e7fa90fde255afaf08060dc4b343bbe524a8f69bdd2a2f0018f600"},
{file = "pillow-11.2.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:8f4f3724c068be008c08257207210c138d5f3731af6c155a81c2b09a9eb3a788"},
{file = "pillow-11.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a0a6709b47019dff32e678bc12c63008311b82b9327613f534e496dacaefb71e"},
{file = "pillow-11.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f6b0c664ccb879109ee3ca702a9272d877f4fcd21e5eb63c26422fd6e415365e"},
{file = "pillow-11.2.1-cp311-cp311-win32.whl", hash = "sha256:cc5d875d56e49f112b6def6813c4e3d3036d269c008bf8aef72cd08d20ca6df6"},
{file = "pillow-11.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:0f5c7eda47bf8e3c8a283762cab94e496ba977a420868cb819159980b6709193"},
{file = "pillow-11.2.1-cp311-cp311-win_arm64.whl", hash = "sha256:4d375eb838755f2528ac8cbc926c3e31cc49ca4ad0cf79cff48b20e30634a4a7"},
{file = "pillow-11.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:78afba22027b4accef10dbd5eed84425930ba41b3ea0a86fa8d20baaf19d807f"},
{file = "pillow-11.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:78092232a4ab376a35d68c4e6d5e00dfd73454bd12b230420025fbe178ee3b0b"},
{file = "pillow-11.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25a5f306095c6780c52e6bbb6109624b95c5b18e40aab1c3041da3e9e0cd3e2d"},
{file = "pillow-11.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c7b29dbd4281923a2bfe562acb734cee96bbb129e96e6972d315ed9f232bef4"},
{file = "pillow-11.2.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:3e645b020f3209a0181a418bffe7b4a93171eef6c4ef6cc20980b30bebf17b7d"},
{file = "pillow-11.2.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:b2dbea1012ccb784a65349f57bbc93730b96e85b42e9bf7b01ef40443db720b4"},
{file = "pillow-11.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:da3104c57bbd72948d75f6a9389e6727d2ab6333c3617f0a89d72d4940aa0443"},
{file = "pillow-11.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:598174aef4589af795f66f9caab87ba4ff860ce08cd5bb447c6fc553ffee603c"},
{file = "pillow-11.2.1-cp312-cp312-win32.whl", hash = "sha256:1d535df14716e7f8776b9e7fee118576d65572b4aad3ed639be9e4fa88a1cad3"},
{file = "pillow-11.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:14e33b28bf17c7a38eede290f77db7c664e4eb01f7869e37fa98a5aa95978941"},
{file = "pillow-11.2.1-cp312-cp312-win_arm64.whl", hash = "sha256:21e1470ac9e5739ff880c211fc3af01e3ae505859392bf65458c224d0bf283eb"},
{file = "pillow-11.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:fdec757fea0b793056419bca3e9932eb2b0ceec90ef4813ea4c1e072c389eb28"},
{file = "pillow-11.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b0e130705d568e2f43a17bcbe74d90958e8a16263868a12c3e0d9c8162690830"},
{file = "pillow-11.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7bdb5e09068332578214cadd9c05e3d64d99e0e87591be22a324bdbc18925be0"},
{file = "pillow-11.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d189ba1bebfbc0c0e529159631ec72bb9e9bc041f01ec6d3233d6d82eb823bc1"},
{file = "pillow-11.2.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:191955c55d8a712fab8934a42bfefbf99dd0b5875078240943f913bb66d46d9f"},
{file = "pillow-11.2.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:ad275964d52e2243430472fc5d2c2334b4fc3ff9c16cb0a19254e25efa03a155"},
{file = "pillow-11.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:750f96efe0597382660d8b53e90dd1dd44568a8edb51cb7f9d5d918b80d4de14"},
{file = "pillow-11.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fe15238d3798788d00716637b3d4e7bb6bde18b26e5d08335a96e88564a36b6b"},
{file = "pillow-11.2.1-cp313-cp313-win32.whl", hash = "sha256:3fe735ced9a607fee4f481423a9c36701a39719252a9bb251679635f99d0f7d2"},
{file = "pillow-11.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:74ee3d7ecb3f3c05459ba95eed5efa28d6092d751ce9bf20e3e253a4e497e691"},
{file = "pillow-11.2.1-cp313-cp313-win_arm64.whl", hash = "sha256:5119225c622403afb4b44bad4c1ca6c1f98eed79db8d3bc6e4e160fc6339d66c"},
{file = "pillow-11.2.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:8ce2e8411c7aaef53e6bb29fe98f28cd4fbd9a1d9be2eeea434331aac0536b22"},
{file = "pillow-11.2.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:9ee66787e095127116d91dea2143db65c7bb1e232f617aa5957c0d9d2a3f23a7"},
{file = "pillow-11.2.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9622e3b6c1d8b551b6e6f21873bdcc55762b4b2126633014cea1803368a9aa16"},
{file = "pillow-11.2.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63b5dff3a68f371ea06025a1a6966c9a1e1ee452fc8020c2cd0ea41b83e9037b"},
{file = "pillow-11.2.1-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:31df6e2d3d8fc99f993fd253e97fae451a8db2e7207acf97859732273e108406"},
{file = "pillow-11.2.1-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:062b7a42d672c45a70fa1f8b43d1d38ff76b63421cbbe7f88146b39e8a558d91"},
{file = "pillow-11.2.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4eb92eca2711ef8be42fd3f67533765d9fd043b8c80db204f16c8ea62ee1a751"},
{file = "pillow-11.2.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f91ebf30830a48c825590aede79376cb40f110b387c17ee9bd59932c961044f9"},
{file = "pillow-11.2.1-cp313-cp313t-win32.whl", hash = "sha256:e0b55f27f584ed623221cfe995c912c61606be8513bfa0e07d2c674b4516d9dd"},
{file = "pillow-11.2.1-cp313-cp313t-win_amd64.whl", hash = "sha256:36d6b82164c39ce5482f649b437382c0fb2395eabc1e2b1702a6deb8ad647d6e"},
{file = "pillow-11.2.1-cp313-cp313t-win_arm64.whl", hash = "sha256:225c832a13326e34f212d2072982bb1adb210e0cc0b153e688743018c94a2681"},
{file = "pillow-11.2.1-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:7491cf8a79b8eb867d419648fff2f83cb0b3891c8b36da92cc7f1931d46108c8"},
{file = "pillow-11.2.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8b02d8f9cb83c52578a0b4beadba92e37d83a4ef11570a8688bbf43f4ca50909"},
{file = "pillow-11.2.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:014ca0050c85003620526b0ac1ac53f56fc93af128f7546623cc8e31875ab928"},
{file = "pillow-11.2.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3692b68c87096ac6308296d96354eddd25f98740c9d2ab54e1549d6c8aea9d79"},
{file = "pillow-11.2.1-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:f781dcb0bc9929adc77bad571b8621ecb1e4cdef86e940fe2e5b5ee24fd33b35"},
{file = "pillow-11.2.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:2b490402c96f907a166615e9a5afacf2519e28295f157ec3a2bb9bd57de638cb"},
{file = "pillow-11.2.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:dd6b20b93b3ccc9c1b597999209e4bc5cf2853f9ee66e3fc9a400a78733ffc9a"},
{file = "pillow-11.2.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:4b835d89c08a6c2ee7781b8dd0a30209a8012b5f09c0a665b65b0eb3560b6f36"},
{file = "pillow-11.2.1-cp39-cp39-win32.whl", hash = "sha256:b10428b3416d4f9c61f94b494681280be7686bda15898a3a9e08eb66a6d92d67"},
{file = "pillow-11.2.1-cp39-cp39-win_amd64.whl", hash = "sha256:6ebce70c3f486acf7591a3d73431fa504a4e18a9b97ff27f5f47b7368e4b9dd1"},
{file = "pillow-11.2.1-cp39-cp39-win_arm64.whl", hash = "sha256:c27476257b2fdcd7872d54cfd119b3a9ce4610fb85c8e32b70b42e3680a29a1e"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:9b7b0d4fd2635f54ad82785d56bc0d94f147096493a79985d0ab57aedd563156"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:aa442755e31c64037aa7c1cb186e0b369f8416c567381852c63444dd666fb772"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f0d3348c95b766f54b76116d53d4cb171b52992a1027e7ca50c81b43b9d9e363"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:85d27ea4c889342f7e35f6d56e7e1cb345632ad592e8c51b693d7b7556043ce0"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:bf2c33d6791c598142f00c9c4c7d47f6476731c31081331664eb26d6ab583e01"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e616e7154c37669fc1dfc14584f11e284e05d1c650e1c0f972f281c4ccc53193"},
{file = "pillow-11.2.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:39ad2e0f424394e3aebc40168845fee52df1394a4673a6ee512d840d14ab3013"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:80f1df8dbe9572b4b7abdfa17eb5d78dd620b1d55d9e25f834efdbee872d3aed"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:ea926cfbc3957090becbcbbb65ad177161a2ff2ad578b5a6ec9bb1e1cd78753c"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:738db0e0941ca0376804d4de6a782c005245264edaa253ffce24e5a15cbdc7bd"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9db98ab6565c69082ec9b0d4e40dd9f6181dab0dd236d26f7a50b8b9bfbd5076"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:036e53f4170e270ddb8797d4c590e6dd14d28e15c7da375c18978045f7e6c37b"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:14f73f7c291279bd65fda51ee87affd7c1e097709f7fdd0188957a16c264601f"},
{file = "pillow-11.2.1-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:208653868d5c9ecc2b327f9b9ef34e0e42a4cdd172c2988fd81d62d2bc9bc044"},
{file = "pillow-11.2.1.tar.gz", hash = "sha256:a64dd61998416367b7ef979b73d3a85853ba9bec4c2925f74e588879a58716b6"},
{file = "pillow-11.3.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:1b9c17fd4ace828b3003dfd1e30bff24863e0eb59b535e8f80194d9cc7ecf860"},
{file = "pillow-11.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:65dc69160114cdd0ca0f35cb434633c75e8e7fad4cf855177a05bf38678f73ad"},
{file = "pillow-11.3.0-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f1f182ebd2303acf8c380a54f615ec883322593320a9b00438eb842c1f37ae50"},
{file = "pillow-11.3.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4445fa62e15936a028672fd48c4c11a66d641d2c05726c7ec1f8ba6a572036ae"},
{file = "pillow-11.3.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:71f511f6b3b91dd543282477be45a033e4845a40278fa8dcdbfdb07109bf18f9"},
{file = "pillow-11.3.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:040a5b691b0713e1f6cbe222e0f4f74cd233421e105850ae3b3c0ceda520f42e"},
{file = "pillow-11.3.0-cp310-cp310-win32.whl", hash = "sha256:89bd777bc6624fe4115e9fac3352c79ed60f3bb18651420635f26e643e3dd1f6"},
{file = "pillow-11.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:19d2ff547c75b8e3ff46f4d9ef969a06c30ab2d4263a9e287733aa8b2429ce8f"},
{file = "pillow-11.3.0-cp310-cp310-win_arm64.whl", hash = "sha256:819931d25e57b513242859ce1876c58c59dc31587847bf74cfe06b2e0cb22d2f"},
{file = "pillow-11.3.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1cd110edf822773368b396281a2293aeb91c90a2db00d78ea43e7e861631b722"},
{file = "pillow-11.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9c412fddd1b77a75aa904615ebaa6001f169b26fd467b4be93aded278266b288"},
{file = "pillow-11.3.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:643f189248837533073c405ec2f0bb250ba54598cf80e8c1e043381a60632f58"},
{file = "pillow-11.3.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:106064daa23a745510dabce1d84f29137a37224831d88eb4ce94bb187b1d7e5f"},
{file = "pillow-11.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cd8ff254faf15591e724dc7c4ddb6bf4793efcbe13802a4ae3e863cd300b493e"},
{file = "pillow-11.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:932c754c2d51ad2b2271fd01c3d121daaa35e27efae2a616f77bf164bc0b3e94"},
{file = "pillow-11.3.0-cp311-cp311-win32.whl", hash = "sha256:b4b8f3efc8d530a1544e5962bd6b403d5f7fe8b9e08227c6b255f98ad82b4ba0"},
{file = "pillow-11.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:1a992e86b0dd7aeb1f053cd506508c0999d710a8f07b4c791c63843fc6a807ac"},
{file = "pillow-11.3.0-cp311-cp311-win_arm64.whl", hash = "sha256:30807c931ff7c095620fe04448e2c2fc673fcbb1ffe2a7da3fb39613489b1ddd"},
{file = "pillow-11.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdae223722da47b024b867c1ea0be64e0df702c5e0a60e27daad39bf960dd1e4"},
{file = "pillow-11.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:921bd305b10e82b4d1f5e802b6850677f965d8394203d182f078873851dada69"},
{file = "pillow-11.3.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97f07ed9f56a3b9b5f49d3661dc9607484e85c67e27f3e8be2c7d28ca032fec7"},
{file = "pillow-11.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:676b2815362456b5b3216b4fd5bd89d362100dc6f4945154ff172e206a22c024"},
{file = "pillow-11.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3e184b2f26ff146363dd07bde8b711833d7b0202e27d13540bfe2e35a323a809"},
{file = "pillow-11.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6be31e3fc9a621e071bc17bb7de63b85cbe0bfae91bb0363c893cbe67247780d"},
{file = "pillow-11.3.0-cp312-cp312-win32.whl", hash = "sha256:7b161756381f0918e05e7cb8a371fff367e807770f8fe92ecb20d905d0e1c149"},
{file = "pillow-11.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a6444696fce635783440b7f7a9fc24b3ad10a9ea3f0ab66c5905be1c19ccf17d"},
{file = "pillow-11.3.0-cp312-cp312-win_arm64.whl", hash = "sha256:2aceea54f957dd4448264f9bf40875da0415c83eb85f55069d89c0ed436e3542"},
{file = "pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:1c627742b539bba4309df89171356fcb3cc5a9178355b2727d1b74a6cf155fbd"},
{file = "pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:30b7c02f3899d10f13d7a48163c8969e4e653f8b43416d23d13d1bbfdc93b9f8"},
{file = "pillow-11.3.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:7859a4cc7c9295f5838015d8cc0a9c215b77e43d07a25e460f35cf516df8626f"},
{file = "pillow-11.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ec1ee50470b0d050984394423d96325b744d55c701a439d2bd66089bff963d3c"},
{file = "pillow-11.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7db51d222548ccfd274e4572fdbf3e810a5e66b00608862f947b163e613b67dd"},
{file = "pillow-11.3.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c37d8ba9411d6003bba9e518db0db0c58a680ab9fe5179f040b0463644bc9805"},
{file = "pillow-11.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:13f87d581e71d9189ab21fe0efb5a23e9f28552d5be6979e84001d3b8505abe8"},
{file = "pillow-11.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:023f6d2d11784a465f09fd09a34b150ea4672e85fb3d05931d89f373ab14abb2"},
{file = "pillow-11.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:45dfc51ac5975b938e9809451c51734124e73b04d0f0ac621649821a63852e7b"},
{file = "pillow-11.3.0-cp313-cp313-win32.whl", hash = "sha256:a4d336baed65d50d37b88ca5b60c0fa9d81e3a87d4a7930d3880d1624d5b31f3"},
{file = "pillow-11.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bce5c4fd0921f99d2e858dc4d4d64193407e1b99478bc5cacecba2311abde51"},
{file = "pillow-11.3.0-cp313-cp313-win_arm64.whl", hash = "sha256:1904e1264881f682f02b7f8167935cce37bc97db457f8e7849dc3a6a52b99580"},
{file = "pillow-11.3.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4c834a3921375c48ee6b9624061076bc0a32a60b5532b322cc0ea64e639dd50e"},
{file = "pillow-11.3.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5e05688ccef30ea69b9317a9ead994b93975104a677a36a8ed8106be9260aa6d"},
{file = "pillow-11.3.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f85acb69adf2aaee8b7da124efebbdb959a104db34d3a2cb0f3793dbae422a8"},
{file = "pillow-11.3.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:05f6ecbeff5005399bb48d198f098a9b4b6bdf27b8487c7f38ca16eeb070cd59"},
{file = "pillow-11.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a7bc6e6fd0395bc052f16b1a8670859964dbd7003bd0af2ff08342eb6e442cfe"},
{file = "pillow-11.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83e1b0161c9d148125083a35c1c5a89db5b7054834fd4387499e06552035236c"},
{file = "pillow-11.3.0-cp313-cp313t-win32.whl", hash = "sha256:2a3117c06b8fb646639dce83694f2f9eac405472713fcb1ae887469c0d4f6788"},
{file = "pillow-11.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:857844335c95bea93fb39e0fa2726b4d9d758850b34075a7e3ff4f4fa3aa3b31"},
{file = "pillow-11.3.0-cp313-cp313t-win_arm64.whl", hash = "sha256:8797edc41f3e8536ae4b10897ee2f637235c94f27404cac7297f7b607dd0716e"},
{file = "pillow-11.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d9da3df5f9ea2a89b81bb6087177fb1f4d1c7146d583a3fe5c672c0d94e55e12"},
{file = "pillow-11.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0b275ff9b04df7b640c59ec5a3cb113eefd3795a8df80bac69646ef699c6981a"},
{file = "pillow-11.3.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41742638139424703b4d01665b807c6468e23e699e8e90cffefe291c5832b027"},
{file = "pillow-11.3.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:93efb0b4de7e340d99057415c749175e24c8864302369e05914682ba642e5d77"},
{file = "pillow-11.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7966e38dcd0fa11ca390aed7c6f20454443581d758242023cf36fcb319b1a874"},
{file = "pillow-11.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:98a9afa7b9007c67ed84c57c9e0ad86a6000da96eaa638e4f8abe5b65ff83f0a"},
{file = "pillow-11.3.0-cp314-cp314-win32.whl", hash = "sha256:02a723e6bf909e7cea0dac1b0e0310be9d7650cd66222a5f1c571455c0a45214"},
{file = "pillow-11.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a418486160228f64dd9e9efcd132679b7a02a5f22c982c78b6fc7dab3fefb635"},
{file = "pillow-11.3.0-cp314-cp314-win_arm64.whl", hash = "sha256:155658efb5e044669c08896c0c44231c5e9abcaadbc5cd3648df2f7c0b96b9a6"},
{file = "pillow-11.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:59a03cdf019efbfeeed910bf79c7c93255c3d54bc45898ac2a4140071b02b4ae"},
{file = "pillow-11.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f8a5827f84d973d8636e9dc5764af4f0cf2318d26744b3d902931701b0d46653"},
{file = "pillow-11.3.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4c96f993ab8c98460cd0c001447bff6194403e8b1d7e149ade5f00594918128b"},
{file = "pillow-11.3.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41342b64afeba938edb034d122b2dda5db2139b9a4af999729ba8818e0056477"},
{file = "pillow-11.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:068d9c39a2d1b358eb9f245ce7ab1b5c3246c7c8c7d9ba58cfa5b43146c06e50"},
{file = "pillow-11.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a1bc6ba083b145187f648b667e05a2534ecc4b9f2784c2cbe3089e44868f2b9b"},
{file = "pillow-11.3.0-cp314-cp314t-win32.whl", hash = "sha256:118ca10c0d60b06d006be10a501fd6bbdfef559251ed31b794668ed569c87e12"},
{file = "pillow-11.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:8924748b688aa210d79883357d102cd64690e56b923a186f35a82cbc10f997db"},
{file = "pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa"},
{file = "pillow-11.3.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:48d254f8a4c776de343051023eb61ffe818299eeac478da55227d96e241de53f"},
{file = "pillow-11.3.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:7aee118e30a4cf54fdd873bd3a29de51e29105ab11f9aad8c32123f58c8f8081"},
{file = "pillow-11.3.0-cp39-cp39-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:092c80c76635f5ecb10f3f83d76716165c96f5229addbd1ec2bdbbda7d496e06"},
{file = "pillow-11.3.0-cp39-cp39-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cadc9e0ea0a2431124cde7e1697106471fc4c1da01530e679b2391c37d3fbb3a"},
{file = "pillow-11.3.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:6a418691000f2a418c9135a7cf0d797c1bb7d9a485e61fe8e7722845b95ef978"},
{file = "pillow-11.3.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:97afb3a00b65cc0804d1c7abddbf090a81eaac02768af58cbdcaaa0a931e0b6d"},
{file = "pillow-11.3.0-cp39-cp39-win32.whl", hash = "sha256:ea944117a7974ae78059fcc1800e5d3295172bb97035c0c1d9345fca1419da71"},
{file = "pillow-11.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:e5c5858ad8ec655450a7c7df532e9842cf8df7cc349df7225c60d5d348c8aada"},
{file = "pillow-11.3.0-cp39-cp39-win_arm64.whl", hash = "sha256:6abdbfd3aea42be05702a8dd98832329c167ee84400a1d1f61ab11437f1717eb"},
{file = "pillow-11.3.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:3cee80663f29e3843b68199b9d6f4f54bd1d4a6b59bdd91bceefc51238bcb967"},
{file = "pillow-11.3.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b5f56c3f344f2ccaf0dd875d3e180f631dc60a51b314295a3e681fe8cf851fbe"},
{file = "pillow-11.3.0-pp310-pypy310_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:527b37216b6ac3a12d7838dc3bd75208ec57c1c6d11ef01902266a5a0c14fc27"},
{file = "pillow-11.3.0-pp310-pypy310_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:be5463ac478b623b9dd3937afd7fb7ab3d79dd290a28e2b6df292dc75063eb8a"},
{file = "pillow-11.3.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:8dc70ca24c110503e16918a658b869019126ecfe03109b754c402daff12b3d9f"},
{file = "pillow-11.3.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7c8ec7a017ad1bd562f93dbd8505763e688d388cde6e4a010ae1486916e713e6"},
{file = "pillow-11.3.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:9ab6ae226de48019caa8074894544af5b53a117ccb9d3b3dcb2871464c829438"},
{file = "pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5418b53c0d59b3824d05e029669efa023bbef0f3e92e75ec8428f3799487f361"},
{file = "pillow-11.3.0-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:504b6f59505f08ae014f724b6207ff6222662aab5cc9542577fb084ed0676ac7"},
{file = "pillow-11.3.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:c84d689db21a1c397d001aa08241044aa2069e7587b398c8cc63020390b1c1b8"},
{file = "pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523"},
]
[package.extras]
docs = ["furo", "olefile", "sphinx (>=8.2)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"]
docs = ["furo", "olefile", "sphinx (>=8.2)", "sphinx-autobuild", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"]
fpx = ["olefile"]
mic = ["olefile"]
test-arrow = ["pyarrow"]
tests = ["check-manifest", "coverage (>=7.4.2)", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout", "trove-classifiers (>=2024.10.12)"]
tests = ["check-manifest", "coverage (>=7.4.2)", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout", "pytest-xdist", "trove-classifiers (>=2024.10.12)"]
typing = ["typing-extensions ; python_version < \"3.10\""]
xmp = ["defusedxml"]
[[package]]
name = "prometheus-client"
version = "0.21.0"
version = "0.22.1"
description = "Python client for the Prometheus monitoring system."
optional = false
python-versions = ">=3.8"
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "prometheus_client-0.21.0-py3-none-any.whl", hash = "sha256:4fa6b4dd0ac16d58bb587c04b1caae65b8c5043e85f778f42f5f632f6af2e166"},
{file = "prometheus_client-0.21.0.tar.gz", hash = "sha256:96c83c606b71ff2b0a433c98889d275f51ffec6c5e267de37c7a2b5c9aa9233e"},
{file = "prometheus_client-0.22.1-py3-none-any.whl", hash = "sha256:cca895342e308174341b2cbf99a56bef291fbc0ef7b9e5412a0f26d653ba7094"},
{file = "prometheus_client-0.22.1.tar.gz", hash = "sha256:190f1331e783cf21eb60bca559354e0a4d4378facecf78f5428c39b675d20d28"},
]
[package.extras]
@ -1713,7 +1734,7 @@ description = "psycopg2 - Python-PostgreSQL Database Adapter"
optional = true
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"postgres\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"postgres\""
files = [
{file = "psycopg2-2.9.10-cp310-cp310-win32.whl", hash = "sha256:5df2b672140f95adb453af93a7d669d7a7bf0a56bcd26f1502329166f4a61716"},
{file = "psycopg2-2.9.10-cp310-cp310-win_amd64.whl", hash = "sha256:c6f7b8561225f9e711a9c47087388a97fdc948211c10a4bccbf0ba68ab7b3b5a"},
@ -1721,6 +1742,7 @@ files = [
{file = "psycopg2-2.9.10-cp311-cp311-win_amd64.whl", hash = "sha256:0435034157049f6846e95103bd8f5a668788dd913a7c30162ca9503fdf542cb4"},
{file = "psycopg2-2.9.10-cp312-cp312-win32.whl", hash = "sha256:65a63d7ab0e067e2cdb3cf266de39663203d38d6a8ed97f5ca0cb315c73fe067"},
{file = "psycopg2-2.9.10-cp312-cp312-win_amd64.whl", hash = "sha256:4a579d6243da40a7b3182e0430493dbd55950c493d8c68f4eec0b302f6bbf20e"},
{file = "psycopg2-2.9.10-cp313-cp313-win_amd64.whl", hash = "sha256:91fd603a2155da8d0cfcdbf8ab24a2d54bca72795b90d2a3ed2b6da8d979dee2"},
{file = "psycopg2-2.9.10-cp39-cp39-win32.whl", hash = "sha256:9d5b3b94b79a844a986d029eee38998232451119ad653aea42bb9220a8c5066b"},
{file = "psycopg2-2.9.10-cp39-cp39-win_amd64.whl", hash = "sha256:88138c8dedcbfa96408023ea2b0c369eda40fe5d75002c0964c78f46f11fa442"},
{file = "psycopg2-2.9.10.tar.gz", hash = "sha256:12ec0b40b0273f95296233e8750441339298e6a572f7039da5b260e3c8b60e11"},
@ -1733,7 +1755,7 @@ description = ".. image:: https://travis-ci.org/chtd/psycopg2cffi.svg?branch=mas
optional = true
python-versions = "*"
groups = ["main"]
markers = "platform_python_implementation == \"PyPy\" and (extra == \"postgres\" or extra == \"all\")"
markers = "platform_python_implementation == \"PyPy\" and (extra == \"all\" or extra == \"postgres\")"
files = [
{file = "psycopg2cffi-2.9.0.tar.gz", hash = "sha256:7e272edcd837de3a1d12b62185eb85c45a19feda9e62fa1b120c54f9e8d35c52"},
]
@ -1749,7 +1771,7 @@ description = "A Simple library to enable psycopg2 compatability"
optional = true
python-versions = "*"
groups = ["main"]
markers = "platform_python_implementation == \"PyPy\" and (extra == \"postgres\" or extra == \"all\")"
markers = "platform_python_implementation == \"PyPy\" and (extra == \"all\" or extra == \"postgres\")"
files = [
{file = "psycopg2cffi-compat-1.1.tar.gz", hash = "sha256:d25e921748475522b33d13420aad5c2831c743227dc1f1f2585e0fdb5c914e05"},
]
@ -1972,7 +1994,7 @@ description = "Python extension wrapping the ICU C++ API"
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"user-search\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"user-search\""
files = [
{file = "PyICU-2.14.tar.gz", hash = "sha256:acc7eb92bd5c554ed577249c6978450a4feda0aa6f01470152b3a7b382a02132"},
]
@ -2021,7 +2043,7 @@ description = "A development tool to measure, monitor and analyze the memory beh
optional = true
python-versions = ">=3.6"
groups = ["main"]
markers = "extra == \"cache-memory\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"cache-memory\""
files = [
{file = "Pympler-1.0.1-py3-none-any.whl", hash = "sha256:d260dda9ae781e1eab6ea15bacb84015849833ba5555f141d2d9b7b7473b307d"},
{file = "Pympler-1.0.1.tar.gz", hash = "sha256:993f1a3599ca3f4fcd7160c7545ad06310c9e12f70174ae7ae8d4e25f6c5d3fa"},
@ -2081,7 +2103,7 @@ description = "Python implementation of SAML Version 2 Standard"
optional = true
python-versions = ">=3.9,<4.0"
groups = ["main"]
markers = "extra == \"saml2\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"saml2\""
files = [
{file = "pysaml2-7.5.0-py3-none-any.whl", hash = "sha256:bc6627cc344476a83c757f440a73fda1369f13b6fda1b4e16bca63ffbabb5318"},
{file = "pysaml2-7.5.0.tar.gz", hash = "sha256:f36871d4e5ee857c6b85532e942550d2cf90ea4ee943d75eb681044bbc4f54f7"},
@ -2106,7 +2128,7 @@ description = "Extensions to the standard Python datetime module"
optional = true
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
groups = ["main"]
markers = "extra == \"saml2\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"saml2\""
files = [
{file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
{file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
@ -2134,7 +2156,7 @@ description = "World timezone definitions, modern and historical"
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"saml2\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"saml2\""
files = [
{file = "pytz-2022.7.1-py2.py3-none-any.whl", hash = "sha256:78f4f37d8198e0627c5f1143240bb0206b8691d8d7ac6d78fee88b78733f8c4a"},
{file = "pytz-2022.7.1.tar.gz", hash = "sha256:01a0681c4b9684a28304615eba55d1ab31ae00bf68ec157ec3708a8182dbbcd0"},
@ -2498,7 +2520,7 @@ description = "Python client for Sentry (https://sentry.io)"
optional = true
python-versions = ">=3.6"
groups = ["main"]
markers = "extra == \"sentry\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"sentry\""
files = [
{file = "sentry_sdk-2.22.0-py2.py3-none-any.whl", hash = "sha256:3d791d631a6c97aad4da7074081a57073126c69487560c6f8bffcf586461de66"},
{file = "sentry_sdk-2.22.0.tar.gz", hash = "sha256:b4bf43bb38f547c84b2eadcefbe389b36ef75f3f38253d7a74d6b928c07ae944"},
@ -2686,7 +2708,7 @@ description = "Tornado IOLoop Backed Concurrent Futures"
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"opentracing\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"opentracing\""
files = [
{file = "threadloop-1.0.2-py2-none-any.whl", hash = "sha256:5c90dbefab6ffbdba26afb4829d2a9df8275d13ac7dc58dccb0e279992679599"},
{file = "threadloop-1.0.2.tar.gz", hash = "sha256:8b180aac31013de13c2ad5c834819771992d350267bddb854613ae77ef571944"},
@ -2702,7 +2724,7 @@ description = "Python bindings for the Apache Thrift RPC system"
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"opentracing\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"opentracing\""
files = [
{file = "thrift-0.16.0.tar.gz", hash = "sha256:2b5b6488fcded21f9d312aa23c9ff6a0195d0f6ae26ddbd5ad9e3e25dfc14408"},
]
@ -2764,7 +2786,7 @@ description = "Tornado is a Python web framework and asynchronous networking lib
optional = true
python-versions = ">=3.9"
groups = ["main"]
markers = "extra == \"opentracing\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"opentracing\""
files = [
{file = "tornado-6.5-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:f81067dad2e4443b015368b24e802d0083fecada4f0a4572fdb72fc06e54a9a6"},
{file = "tornado-6.5-cp39-abi3-macosx_10_9_x86_64.whl", hash = "sha256:9ac1cbe1db860b3cbb251e795c701c41d343f06a96049d6274e7c77559117e41"},
@ -2804,27 +2826,28 @@ dev = ["furo (>=2024.05.06)", "nox", "packaging", "sphinx (>=5)", "twisted"]
[[package]]
name = "treq"
version = "24.9.1"
version = "25.5.0"
description = "High-level Twisted HTTP Client API"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8.0"
groups = ["main"]
files = [
{file = "treq-24.9.1-py3-none-any.whl", hash = "sha256:eee4756fd9a857c77f180fd5202b52c518f2d3e2826dce28b89066c03bfc45d0"},
{file = "treq-24.9.1.tar.gz", hash = "sha256:15da7fc404f3e4ed59d0abe5f8eef4966fabbe618039a2a23bc7c15305cefea8"},
{file = "treq-25.5.0-py3-none-any.whl", hash = "sha256:e99d4e66cacaa1f0da82bb60b317d104c29dbd8ac0a75d7f657b348178d830f4"},
{file = "treq-25.5.0.tar.gz", hash = "sha256:25dde3a55ae85ec2f2c56332c99aef255ab14f997d0d00552ebff13538a9804a"},
]
[package.dependencies]
attrs = "*"
hyperlink = ">=21.0.0"
incremental = "*"
incremental = ">=24.7.2"
multipart = "*"
requests = ">=2.1.0"
Twisted = {version = ">=22.10.0", extras = ["tls"]}
twisted = {version = ">=22.10.0", extras = ["tls"]}
typing-extensions = ">=3.10.0"
[package.extras]
dev = ["httpbin (==0.7.0)", "pep8", "pyflakes", "werkzeug (==2.0.3)"]
docs = ["sphinx (<7.0.0)"]
docs = ["sphinx", "sphinx-rtd-theme"]
[[package]]
name = "twine"
@ -2900,7 +2923,7 @@ description = "non-blocking redis client for python"
optional = true
python-versions = "*"
groups = ["main"]
markers = "extra == \"redis\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"redis\""
files = [
{file = "txredisapi-1.4.11-py3-none-any.whl", hash = "sha256:ac64d7a9342b58edca13ef267d4fa7637c1aa63f8595e066801c1e8b56b22d0b"},
{file = "txredisapi-1.4.11.tar.gz", hash = "sha256:3eb1af99aefdefb59eb877b1dd08861efad60915e30ad5bf3d5bf6c5cedcdbc6"},
@ -2966,14 +2989,14 @@ files = [
[[package]]
name = "types-jsonschema"
version = "4.23.0.20250516"
version = "4.24.0.20250528"
description = "Typing stubs for jsonschema"
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "types_jsonschema-4.23.0.20250516-py3-none-any.whl", hash = "sha256:e7d0dd7db7e59e63c26e3230e26ffc64c4704cc5170dc21270b366a35ead1618"},
{file = "types_jsonschema-4.23.0.20250516.tar.gz", hash = "sha256:9ace09d9d35c4390a7251ccd7d833b92ccc189d24d1b347f26212afce361117e"},
{file = "types_jsonschema-4.24.0.20250528-py3-none-any.whl", hash = "sha256:6a906b5ff73ac11c8d1e0b6c30a9693e1e4e1ab56c56c932b3a7e081b86d187b"},
{file = "types_jsonschema-4.24.0.20250528.tar.gz", hash = "sha256:7e28c64e0ae7980eeb158105b20663fc6a6b8f81d5f86ea6614aa0014417bd1e"},
]
[package.dependencies]
@ -3243,7 +3266,7 @@ description = "An XML Schema validator and decoder"
optional = true
python-versions = ">=3.7"
groups = ["main"]
markers = "extra == \"saml2\" or extra == \"all\""
markers = "extra == \"all\" or extra == \"saml2\""
files = [
{file = "xmlschema-2.4.0-py3-none-any.whl", hash = "sha256:dc87be0caaa61f42649899189aab2fd8e0d567f2cf548433ba7b79278d231a4a"},
{file = "xmlschema-2.4.0.tar.gz", hash = "sha256:d74cd0c10866ac609e1ef94a5a69b018ad16e39077bc6393408b40c6babee793"},

View File

@ -374,7 +374,7 @@ tomli = ">=1.2.3"
# runtime errors caused by build system changes.
# We are happy to raise these upper bounds upon request,
# provided we check that it's safe to do so (i.e. that CI passes).
requires = ["poetry-core>=1.1.0,<=1.9.1", "setuptools_rust>=1.3,<=1.10.2"]
requires = ["poetry-core>=1.1.0,<=2.1.3", "setuptools_rust>=1.3,<=1.10.2"]
build-backend = "poetry.core.masonry.api"

View File

@ -29,6 +29,7 @@ from synapse.api.errors import (
InvalidClientTokenError,
MissingClientTokenError,
UnrecognizedRequestError,
UserLockedError,
)
from synapse.http.site import SynapseRequest
from synapse.logging.opentracing import active_span, force_tracing, start_active_span
@ -162,12 +163,7 @@ class InternalAuth(BaseAuth):
if not allow_locked and await self.store.get_user_locked_status(
requester.user.to_string()
):
raise AuthError(
401,
"User account has been locked",
errcode=Codes.USER_LOCKED,
additional_fields={"soft_logout": True},
)
raise UserLockedError()
# Deny the request if the user account has expired.
# This check is only done for regular users, not appservice ones.

View File

@ -306,6 +306,20 @@ class UserDeactivatedError(SynapseError):
)
class UserLockedError(SynapseError):
"""The error returned to the client when the user attempted to access an
authenticated endpoint, but the account has been locked.
"""
def __init__(self) -> None:
super().__init__(
code=HTTPStatus.UNAUTHORIZED,
msg="User account has been locked",
errcode=Codes.USER_LOCKED,
additional_fields={"soft_logout": True},
)
class FederationDeniedError(SynapseError):
"""An error raised when the server tries to federate with a server which
is not on its federation whitelist.
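
For reference, a rough sketch (not part of the diff) of the client-visible response once UserLockedError is raised; the status, errcode and soft_logout fields follow the constructor arguments above and the assertions in the admin tests further down:

# Illustrative only: approximate shape of the HTTP 401 body produced by
# UserLockedError. The errcode literal is whatever Codes.USER_LOCKED resolves to.
from synapse.api.errors import Codes

expected_status = 401  # HTTPStatus.UNAUTHORIZED
expected_body = {
    "errcode": Codes.USER_LOCKED,
    "error": "User account has been locked",  # the msg argument
    "soft_logout": True,                      # merged in via additional_fields
}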

View File

@ -156,7 +156,9 @@ class FederationRemoteSendQueue(AbstractFederationSender):
def _clear_queue_before_pos(self, position_to_delete: int) -> None:
"""Clear all the queues from before a given position"""
with Measure(self.clock, "send_queue._clear"):
with Measure(
self.clock, name="send_queue._clear", server_name=self.server_name
):
# Delete things out of presence maps
keys = self.presence_destinations.keys()
i = self.presence_destinations.bisect_left(position_to_delete)

View File

@ -657,7 +657,11 @@ class FederationSender(AbstractFederationSender):
logger.debug(
"Handling %i events in room %s", len(events), events[0].room_id
)
with Measure(self.clock, "handle_room_events"):
with Measure(
self.clock,
name="handle_room_events",
server_name=self.server_name,
):
for event in events:
await handle_event(event)

View File

@ -58,7 +58,7 @@ class TransactionManager:
"""
def __init__(self, hs: "synapse.server.HomeServer"):
self._server_name = hs.hostname
self.server_name = hs.hostname # nb must be called this for @measure_func
self.clock = hs.get_clock() # nb must be called this for @measure_func
self._store = hs.get_datastores().main
self._transaction_actions = TransactionActions(self._store)
@ -116,7 +116,7 @@ class TransactionManager:
transaction = Transaction(
origin_server_ts=int(self.clock.time_msec()),
transaction_id=txn_id,
origin=self._server_name,
origin=self.server_name,
destination=destination,
pdus=[p.get_pdu_json() for p in pdus],
edus=[edu.get_dict() for edu in edus],

View File

@ -73,6 +73,7 @@ events_processed_counter = Counter("synapse_handlers_appservice_events_processed
class ApplicationServicesHandler:
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self.store = hs.get_datastores().main
self.is_mine_id = hs.is_mine_id
self.appservice_api = hs.get_application_service_api()
@ -120,7 +121,9 @@ class ApplicationServicesHandler:
@wrap_as_background_process("notify_interested_services")
async def _notify_interested_services(self, max_token: RoomStreamToken) -> None:
with Measure(self.clock, "notify_interested_services"):
with Measure(
self.clock, name="notify_interested_services", server_name=self.server_name
):
self.is_processing = True
try:
upper_bound = -1
@ -329,7 +332,11 @@ class ApplicationServicesHandler:
users: Collection[Union[str, UserID]],
) -> None:
logger.debug("Checking interested services for %s", stream_key)
with Measure(self.clock, "notify_interested_services_ephemeral"):
with Measure(
self.clock,
name="notify_interested_services_ephemeral",
server_name=self.server_name,
):
for service in services:
if stream_key == StreamKeyType.TYPING:
# Note that we don't persist the token (via set_appservice_stream_type_pos)

View File

@ -54,6 +54,7 @@ logger = logging.getLogger(__name__)
class DelayedEventsHandler:
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self._store = hs.get_datastores().main
self._storage_controllers = hs.get_storage_controllers()
self._config = hs.config
@ -159,7 +160,9 @@ class DelayedEventsHandler:
# Loop round handling deltas until we're up to date
while True:
with Measure(self._clock, "delayed_events_delta"):
with Measure(
self._clock, name="delayed_events_delta", server_name=self.server_name
):
room_max_stream_ordering = self._store.get_room_max_stream_ordering()
if self._event_pos == room_max_stream_ordering:
return

View File

@ -526,6 +526,8 @@ class DeviceHandler(DeviceWorkerHandler):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self.server_name = hs.hostname # nb must be called this for @measure_func
self.clock = hs.get_clock() # nb must be called this for @measure_func
self.federation_sender = hs.get_federation_sender()
self._account_data_handler = hs.get_account_data_handler()
self._storage_controllers = hs.get_storage_controllers()
@ -1215,7 +1217,8 @@ class DeviceListUpdater(DeviceListWorkerUpdater):
def __init__(self, hs: "HomeServer", device_handler: DeviceHandler):
self.store = hs.get_datastores().main
self.federation = hs.get_federation_client()
self.clock = hs.get_clock()
self.server_name = hs.hostname # nb must be called this for @measure_func
self.clock = hs.get_clock() # nb must be called this for @measure_func
self.device_handler = device_handler
self._notifier = hs.get_notifier()

View File

@ -476,16 +476,16 @@ _DUMMY_EVENT_ROOM_EXCLUSION_EXPIRY = 7 * 24 * 60 * 60 * 1000
class EventCreationHandler:
def __init__(self, hs: "HomeServer"):
self.hs = hs
self.validator = EventValidator()
self.event_builder_factory = hs.get_event_builder_factory()
self.server_name = hs.hostname # nb must be called this for @measure_func
self.clock = hs.get_clock() # nb must be called this for @measure_func
self.auth_blocking = hs.get_auth_blocking()
self._event_auth_handler = hs.get_event_auth_handler()
self.store = hs.get_datastores().main
self._storage_controllers = hs.get_storage_controllers()
self.state = hs.get_state_handler()
self.clock = hs.get_clock()
self.validator = EventValidator()
self.profile_handler = hs.get_profile_handler()
self.event_builder_factory = hs.get_event_builder_factory()
self.server_name = hs.hostname
self.notifier = hs.get_notifier()
self.config = hs.config
self.require_membership_for_aliases = (

View File

@ -747,6 +747,7 @@ class WorkerPresenceHandler(BasePresenceHandler):
class PresenceHandler(BasePresenceHandler):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self.server_name = hs.hostname
self.wheel_timer: WheelTimer[str] = WheelTimer()
self.notifier = hs.get_notifier()
@ -941,7 +942,9 @@ class PresenceHandler(BasePresenceHandler):
now = self.clock.time_msec()
with Measure(self.clock, "presence_update_states"):
with Measure(
self.clock, name="presence_update_states", server_name=self.server_name
):
# NOTE: We purposefully don't await between now and when we've
# calculated what we want to do with the new states, to avoid races.
@ -1497,7 +1500,9 @@ class PresenceHandler(BasePresenceHandler):
async def _unsafe_process(self) -> None:
# Loop round handling deltas until we're up to date
while True:
with Measure(self.clock, "presence_delta"):
with Measure(
self.clock, name="presence_delta", server_name=self.server_name
):
room_max_stream_ordering = self.store.get_room_max_stream_ordering()
if self._event_pos == room_max_stream_ordering:
return
@ -1759,6 +1764,7 @@ class PresenceEventSource(EventSource[int, UserPresenceState]):
# Same with get_presence_router:
#
# AuthHandler -> Notifier -> PresenceEventSource -> ModuleApi -> AuthHandler
self.server_name = hs.hostname
self.get_presence_handler = hs.get_presence_handler
self.get_presence_router = hs.get_presence_router
self.clock = hs.get_clock()
@ -1792,7 +1798,9 @@ class PresenceEventSource(EventSource[int, UserPresenceState]):
user_id = user.to_string()
stream_change_cache = self.store.presence_stream_cache
with Measure(self.clock, "presence.get_new_events"):
with Measure(
self.clock, name="presence.get_new_events", server_name=self.server_name
):
if from_key is not None:
from_key = int(from_key)

View File

@ -329,6 +329,7 @@ class E2eeSyncResult:
class SyncHandler:
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self.hs_config = hs.config
self.store = hs.get_datastores().main
self.notifier = hs.get_notifier()
@ -710,7 +711,9 @@ class SyncHandler:
sync_config = sync_result_builder.sync_config
with Measure(self.clock, "ephemeral_by_room"):
with Measure(
self.clock, name="ephemeral_by_room", server_name=self.server_name
):
typing_key = since_token.typing_key if since_token else 0
room_ids = sync_result_builder.joined_room_ids
@ -783,7 +786,9 @@ class SyncHandler:
and current token to send down to clients.
newly_joined_room
"""
with Measure(self.clock, "load_filtered_recents"):
with Measure(
self.clock, name="load_filtered_recents", server_name=self.server_name
):
timeline_limit = sync_config.filter_collection.timeline_limit()
block_all_timeline = (
sync_config.filter_collection.blocks_all_room_timeline()
@ -1174,7 +1179,9 @@ class SyncHandler:
# updates even if they occurred logically before the previous event.
# TODO(mjark) Check for new redactions in the state events.
with Measure(self.clock, "compute_state_delta"):
with Measure(
self.clock, name="compute_state_delta", server_name=self.server_name
):
# The memberships needed for events in the timeline.
# Only calculated when `lazy_load_members` is on.
members_to_fetch: Optional[Set[str]] = None
@ -1791,7 +1798,9 @@ class SyncHandler:
# the DB.
return RoomNotifCounts.empty()
with Measure(self.clock, "unread_notifs_for_room_id"):
with Measure(
self.clock, name="unread_notifs_for_room_id", server_name=self.server_name
):
return await self.store.get_unread_event_push_actions_by_room_for_user(
room_id,
sync_config.user.to_string(),

View File

@ -503,6 +503,7 @@ class TypingWriterHandler(FollowerTypingHandler):
class TypingNotificationEventSource(EventSource[int, JsonMapping]):
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self._main_store = hs.get_datastores().main
self.clock = hs.get_clock()
# We can't call get_typing_handler here because there's a cycle:
@ -535,7 +536,9 @@ class TypingNotificationEventSource(EventSource[int, JsonMapping]):
appservice may be interested in.
* The latest known room serial.
"""
with Measure(self.clock, "typing.get_new_events_as"):
with Measure(
self.clock, name="typing.get_new_events_as", server_name=self.server_name
):
handler = self.get_typing_handler()
events = []
@ -571,7 +574,9 @@ class TypingNotificationEventSource(EventSource[int, JsonMapping]):
Find typing notifications for given rooms (> `from_token` and <= `to_token`)
"""
with Measure(self.clock, "typing.get_new_events"):
with Measure(
self.clock, name="typing.get_new_events", server_name=self.server_name
):
from_key = int(from_key)
handler = self.get_typing_handler()

View File

@ -237,7 +237,9 @@ class UserDirectoryHandler(StateDeltasHandler):
# Loop round handling deltas until we're up to date
while True:
with Measure(self.clock, "user_dir_delta"):
with Measure(
self.clock, name="user_dir_delta", server_name=self.server_name
):
room_max_stream_ordering = self.store.get_room_max_stream_ordering()
if self.pos == room_max_stream_ordering:
return

View File

@ -92,6 +92,7 @@ class MatrixFederationAgent:
def __init__(
self,
server_name: str,
reactor: ISynapseReactor,
tls_client_options_factory: Optional[FederationPolicyForHTTPS],
user_agent: bytes,
@ -100,6 +101,11 @@ class MatrixFederationAgent:
_srv_resolver: Optional[SrvResolver] = None,
_well_known_resolver: Optional[WellKnownResolver] = None,
):
"""
Args:
server_name: Our homeserver name (used to label metrics) (`hs.hostname`).
"""
# proxy_reactor is not blocklisting reactor
proxy_reactor = reactor
@ -127,6 +133,7 @@ class MatrixFederationAgent:
if _well_known_resolver is None:
_well_known_resolver = WellKnownResolver(
server_name,
reactor,
agent=BlocklistingAgentWrapper(
ProxyAgent(

View File

@ -91,12 +91,19 @@ class WellKnownResolver:
def __init__(
self,
server_name: str,
reactor: IReactorTime,
agent: IAgent,
user_agent: bytes,
well_known_cache: Optional[TTLCache[bytes, Optional[bytes]]] = None,
had_well_known_cache: Optional[TTLCache[bytes, bool]] = None,
):
"""
Args:
server_name: Our homeserver name (used to label metrics) (`hs.hostname`).
"""
self.server_name = server_name
self._reactor = reactor
self._clock = Clock(reactor)
@ -134,7 +141,13 @@ class WellKnownResolver:
# TODO: should we linearise so that we don't end up doing two .well-known
# requests for the same server in parallel?
try:
with Measure(self._clock, "get_well_known"):
with Measure(
self._clock,
name="get_well_known",
# This should be our homeserver where the code is running (used to
# label metrics)
server_name=self.server_name,
):
result: Optional[bytes]
cache_period: float

View File

@ -417,6 +417,7 @@ class MatrixFederationHttpClient:
if hs.get_instance_name() in outbound_federation_restricted_to:
# Talk to federation directly
federation_agent: IAgent = MatrixFederationAgent(
self.server_name,
self.reactor,
tls_client_options_factory,
user_agent.encode("ascii"),
@ -697,7 +698,11 @@ class MatrixFederationHttpClient:
outgoing_requests_counter.labels(request.method).inc()
try:
with Measure(self.clock, "outbound_request"):
with Measure(
self.clock,
name="outbound_request",
server_name=self.server_name,
):
# we don't want all the fancy cookie and redirect handling
# that treq.request gives: just use the raw Agent.

View File

@ -66,6 +66,19 @@ all_gauges: Dict[str, Collector] = {}
HAVE_PROC_SELF_STAT = os.path.exists("/proc/self/stat")
INSTANCE_LABEL_NAME = "instance"
"""
The standard Prometheus label name used to identify which server instance the metrics
came from.
In the case of a Synapse homeserver, this should be set to the homeserver name
(`hs.hostname`).
Normally, this would be set automatically by the Prometheus server scraping the data, but
since we support multiple instances of Synapse running in the same process and all
metrics are in a single global `REGISTRY`, we need to manually label any metrics.
"""
class _RegistryProxy:
@staticmethod
@ -192,7 +205,16 @@ class InFlightGauge(Generic[MetricsEntry], Collector):
same key.
Note that `callback` may be called on a separate thread.
Args:
key: A tuple of label values, which must match the order of the
`labels` given to the constructor.
callback
"""
assert len(key) == len(self.labels), (
f"Expected {len(self.labels)} labels in `key`, got {len(key)}: {key}"
)
with self._lock:
self._registrations.setdefault(key, set()).add(callback)
@ -201,7 +223,17 @@ class InFlightGauge(Generic[MetricsEntry], Collector):
key: Tuple[str, ...],
callback: Callable[[MetricsEntry], None],
) -> None:
"""Registers that we've exited a block with labels `key`."""
"""
Registers that we've exited a block with labels `key`.
Args:
key: A tuple of label values, which must match the order of the
`labels` given to the constructor.
callback
"""
assert len(key) == len(self.labels), (
f"Expected {len(self.labels)} labels in `key`, got {len(key)}: {key}"
)
with self._lock:
self._registrations.setdefault(key, set()).discard(callback)
@ -225,7 +257,7 @@ class InFlightGauge(Generic[MetricsEntry], Collector):
with self._lock:
callbacks = set(self._registrations[key])
in_flight.add_metric(key, len(callbacks))
in_flight.add_metric(labels=key, value=len(callbacks))
metrics = self._metrics_class()
metrics_by_key[key] = metrics
@ -239,7 +271,7 @@ class InFlightGauge(Generic[MetricsEntry], Collector):
"_".join([self.name, name]), "", labels=self.labels
)
for key, metrics in metrics_by_key.items():
gauge.add_metric(key, getattr(metrics, name))
gauge.add_metric(labels=key, value=getattr(metrics, name))
yield gauge
def _register_with_collector(self) -> None:
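
As a hedged illustration of the manual instance labelling described in the INSTANCE_LABEL_NAME docstring above (the metric name and label values below are hypothetical; only INSTANCE_LABEL_NAME and the prometheus_client Counter API are taken as given), two homeservers sharing one process and one global registry can keep their samples apart like this:

# Sketch only: a per-instance labelled counter, mirroring the pattern above.
from prometheus_client import Counter

from synapse.metrics import INSTANCE_LABEL_NAME

example_counter = Counter(
    "synapse_example_blocks",  # hypothetical metric name
    "",
    labelnames=["block_name", INSTANCE_LABEL_NAME],
)

# Each homeserver in the process labels its own samples with its hostname:
example_counter.labels(block_name="do_work", **{INSTANCE_LABEL_NAME: "hs1.example.com"}).inc()
example_counter.labels(block_name="do_work", **{INSTANCE_LABEL_NAME: "hs2.example.com"}).inc()
# The exposition output then contains lines along the lines of:
#   synapse_example_blocks_total{block_name="do_work",instance="hs1.example.com"} 1.0
#   synapse_example_blocks_total{block_name="do_work",instance="hs2.example.com"} 1.0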

View File

@ -31,6 +31,7 @@ IS_USER_ALLOWED_TO_UPLOAD_MEDIA_OF_SIZE_CALLBACK = Callable[[str, int], Awaitabl
class MediaRepositoryModuleApiCallbacks:
def __init__(self, hs: "HomeServer") -> None:
self.server_name = hs.hostname
self.clock = hs.get_clock()
self._get_media_config_for_user_callbacks: List[
GET_MEDIA_CONFIG_FOR_USER_CALLBACK
@ -57,7 +58,11 @@ class MediaRepositoryModuleApiCallbacks:
async def get_media_config_for_user(self, user_id: str) -> Optional[JsonDict]:
for callback in self._get_media_config_for_user_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res: Optional[JsonDict] = await delay_cancellation(callback(user_id))
if res:
return res
@ -68,7 +73,11 @@ class MediaRepositoryModuleApiCallbacks:
self, user_id: str, size: int
) -> bool:
for callback in self._is_user_allowed_to_upload_media_of_size_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res: bool = await delay_cancellation(callback(user_id, size))
if not res:
return res

View File

@ -43,6 +43,7 @@ GET_RATELIMIT_OVERRIDE_FOR_USER_CALLBACK = Callable[
class RatelimitModuleApiCallbacks:
def __init__(self, hs: "HomeServer") -> None:
self.server_name = hs.hostname
self.clock = hs.get_clock()
self._get_ratelimit_override_for_user_callbacks: List[
GET_RATELIMIT_OVERRIDE_FOR_USER_CALLBACK
@ -64,7 +65,11 @@ class RatelimitModuleApiCallbacks:
self, user_id: str, limiter_name: str
) -> Optional[RatelimitOverride]:
for callback in self._get_ratelimit_override_for_user_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res: Optional[RatelimitOverride] = await delay_cancellation(
callback(user_id, limiter_name)
)

View File

@ -356,6 +356,7 @@ class SpamCheckerModuleApiCallbacks:
NOT_SPAM: Literal["NOT_SPAM"] = "NOT_SPAM"
def __init__(self, hs: "synapse.server.HomeServer") -> None:
self.server_name = hs.hostname
self.clock = hs.get_clock()
self._check_event_for_spam_callbacks: List[CHECK_EVENT_FOR_SPAM_CALLBACK] = []
@ -490,7 +491,11 @@ class SpamCheckerModuleApiCallbacks:
generally discouraged as it doesn't support internationalization.
"""
for callback in self._check_event_for_spam_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(callback(event))
if res is False or res == self.NOT_SPAM:
# This spam-checker accepts the event.
@ -543,7 +548,11 @@ class SpamCheckerModuleApiCallbacks:
True if the event should be silently dropped
"""
for callback in self._should_drop_federated_event_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res: Union[bool, str] = await delay_cancellation(callback(event))
if res:
return res
@ -565,7 +574,11 @@ class SpamCheckerModuleApiCallbacks:
NOT_SPAM if the operation is permitted, [Codes, Dict] otherwise.
"""
for callback in self._user_may_join_room_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(callback(user_id, room_id, is_invited))
# Normalize return values to `Codes` or `"NOT_SPAM"`.
if res is True or res is self.NOT_SPAM:
@ -604,7 +617,11 @@ class SpamCheckerModuleApiCallbacks:
NOT_SPAM if the operation is permitted, Codes otherwise.
"""
for callback in self._user_may_invite_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(
callback(inviter_userid, invitee_userid, room_id)
)
@ -643,7 +660,11 @@ class SpamCheckerModuleApiCallbacks:
NOT_SPAM if the operation is permitted, Codes otherwise.
"""
for callback in self._federated_user_may_invite_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(callback(event))
# Normalize return values to `Codes` or `"NOT_SPAM"`.
if res is True or res is self.NOT_SPAM:
@ -686,7 +707,11 @@ class SpamCheckerModuleApiCallbacks:
NOT_SPAM if the operation is permitted, Codes otherwise.
"""
for callback in self._user_may_send_3pid_invite_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(
callback(inviter_userid, medium, address, room_id)
)
@ -722,7 +747,11 @@ class SpamCheckerModuleApiCallbacks:
room_config: The room creation configuration which is the body of the /createRoom request
"""
for callback in self._user_may_create_room_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
checker_args = inspect.signature(callback)
# Also ensure backwards compatibility with spam checker callbacks
# that don't expect the room_config argument.
@ -786,7 +815,11 @@ class SpamCheckerModuleApiCallbacks:
content: The content of the state event
"""
for callback in self._user_may_send_state_event_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
# We make a copy of the content to ensure that the spam checker cannot modify it.
res = await delay_cancellation(
callback(user_id, room_id, event_type, state_key, deepcopy(content))
@ -814,7 +847,11 @@ class SpamCheckerModuleApiCallbacks:
"""
for callback in self._user_may_create_room_alias_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(callback(userid, room_alias))
if res is True or res is self.NOT_SPAM:
continue
@ -847,7 +884,11 @@ class SpamCheckerModuleApiCallbacks:
room_id: The ID of the room that would be published
"""
for callback in self._user_may_publish_room_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(callback(userid, room_id))
if res is True or res is self.NOT_SPAM:
continue
@ -889,7 +930,11 @@ class SpamCheckerModuleApiCallbacks:
True if the user is spammy.
"""
for callback in self._check_username_for_spam_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
checker_args = inspect.signature(callback)
# Make a copy of the user profile object to ensure the spam checker cannot
# modify it.
@ -938,7 +983,11 @@ class SpamCheckerModuleApiCallbacks:
"""
for callback in self._check_registration_for_spam_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
behaviour = await delay_cancellation(
callback(email_threepid, username, request_info, auth_provider_id)
)
@ -980,7 +1029,11 @@ class SpamCheckerModuleApiCallbacks:
"""
for callback in self._check_media_file_for_spam_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(callback(file_wrapper, file_info))
# Normalize return values to `Codes` or `"NOT_SPAM"`.
if res is False or res is self.NOT_SPAM:
@ -1027,7 +1080,11 @@ class SpamCheckerModuleApiCallbacks:
"""
for callback in self._check_login_for_spam_callbacks:
with Measure(self.clock, f"{callback.__module__}.{callback.__qualname__}"):
with Measure(
self.clock,
name=f"{callback.__module__}.{callback.__qualname__}",
server_name=self.server_name,
):
res = await delay_cancellation(
callback(
user_id,

View File

@ -129,7 +129,8 @@ class BulkPushRuleEvaluator:
def __init__(self, hs: "HomeServer"):
self.hs = hs
self.store = hs.get_datastores().main
self.clock = hs.get_clock()
self.server_name = hs.hostname # nb must be called this for @measure_func
self.clock = hs.get_clock() # nb must be called this for @measure_func
self._event_auth_handler = hs.get_event_auth_handler()
self.should_calculate_push_rules = self.hs.config.push.enable_push

View File

@ -76,6 +76,7 @@ class ReplicationFederationSendEventsRestServlet(ReplicationEndpoint):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self.server_name = hs.hostname
self.store = hs.get_datastores().main
self._storage_controllers = hs.get_storage_controllers()
self.clock = hs.get_clock()
@ -122,7 +123,9 @@ class ReplicationFederationSendEventsRestServlet(ReplicationEndpoint):
async def _handle_request( # type: ignore[override]
self, request: Request, content: JsonDict
) -> Tuple[int, JsonDict]:
with Measure(self.clock, "repl_fed_send_events_parse"):
with Measure(
self.clock, name="repl_fed_send_events_parse", server_name=self.server_name
):
room_id = content["room_id"]
backfilled = content["backfilled"]

View File

@ -76,6 +76,7 @@ class ReplicationSendEventRestServlet(ReplicationEndpoint):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self.server_name = hs.hostname
self.event_creation_handler = hs.get_event_creation_handler()
self.store = hs.get_datastores().main
self._storage_controllers = hs.get_storage_controllers()
@ -121,7 +122,9 @@ class ReplicationSendEventRestServlet(ReplicationEndpoint):
async def _handle_request( # type: ignore[override]
self, request: Request, content: JsonDict, event_id: str
) -> Tuple[int, JsonDict]:
with Measure(self.clock, "repl_send_event_parse"):
with Measure(
self.clock, name="repl_send_event_parse", server_name=self.server_name
):
event_dict = content["event"]
room_ver = KNOWN_ROOM_VERSIONS[content["room_version"]]
internal_metadata = content["internal_metadata"]

View File

@ -77,6 +77,7 @@ class ReplicationSendEventsRestServlet(ReplicationEndpoint):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self.server_name = hs.hostname
self.event_creation_handler = hs.get_event_creation_handler()
self.store = hs.get_datastores().main
self._storage_controllers = hs.get_storage_controllers()
@ -122,7 +123,9 @@ class ReplicationSendEventsRestServlet(ReplicationEndpoint):
async def _handle_request( # type: ignore[override]
self, request: Request, payload: JsonDict
) -> Tuple[int, JsonDict]:
with Measure(self.clock, "repl_send_events_parse"):
with Measure(
self.clock, name="repl_send_events_parse", server_name=self.server_name
):
events_and_context = []
events = payload["events"]
rooms = set()

View File

@ -75,6 +75,7 @@ class ReplicationDataHandler:
"""
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self.store = hs.get_datastores().main
self.notifier = hs.get_notifier()
self._reactor = hs.get_reactor()
@ -342,7 +343,11 @@ class ReplicationDataHandler:
waiting_list.add((position, deferred))
# We measure here to get in flight counts and average waiting time.
with Measure(self._clock, "repl.wait_for_stream_position"):
with Measure(
self._clock,
name="repl.wait_for_stream_position",
server_name=self.server_name,
):
logger.info(
"Waiting for repl stream %r to reach %s (%s); currently at: %s",
stream_name,

View File

@ -78,6 +78,7 @@ class ReplicationStreamer:
"""
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self.store = hs.get_datastores().main
self.clock = hs.get_clock()
self.notifier = hs.get_notifier()
@ -155,7 +156,11 @@ class ReplicationStreamer:
while self.pending_updates:
self.pending_updates = False
with Measure(self.clock, "repl.stream.get_updates"):
with Measure(
self.clock,
name="repl.stream.get_updates",
server_name=self.server_name,
):
all_streams = self.streams
if self._replication_torture_level is not None:

View File

@ -42,6 +42,7 @@ from synapse.api.errors import (
NotApprovedError,
SynapseError,
UserDeactivatedError,
UserLockedError,
)
from synapse.api.ratelimiting import Ratelimiter
from synapse.api.urls import CLIENT_API_PREFIX
@ -313,7 +314,7 @@ class LoginRestServlet(RestServlet):
should_issue_refresh_token=should_issue_refresh_token,
# The user represented by an appservice's configured sender_localpart
# is not actually created in Synapse.
should_check_deactivated=qualified_user_id != appservice.sender,
should_check_deactivated_or_locked=qualified_user_id != appservice.sender,
request_info=request_info,
)
@ -367,7 +368,7 @@ class LoginRestServlet(RestServlet):
auth_provider_id: Optional[str] = None,
should_issue_refresh_token: bool = False,
auth_provider_session_id: Optional[str] = None,
should_check_deactivated: bool = True,
should_check_deactivated_or_locked: bool = True,
*,
request_info: RequestInfo,
) -> LoginResponse:
@ -389,8 +390,8 @@ class LoginRestServlet(RestServlet):
should_issue_refresh_token: True if this login should issue
a refresh token alongside the access token.
auth_provider_session_id: The session ID got during login from the SSO IdP.
should_check_deactivated: True if the user should be checked for
deactivation status before logging in.
should_check_deactivated_or_locked: True if the user should be checked for
deactivation or locked status before logging in.
This exists purely for appservice's configured sender_localpart
which doesn't have an associated user in the database.
@ -415,11 +416,14 @@ class LoginRestServlet(RestServlet):
)
user_id = canonical_uid
# If the account has been deactivated, do not proceed with the login.
if should_check_deactivated:
# If the account has been deactivated or locked, do not proceed with the login.
if should_check_deactivated_or_locked:
deactivated = await self._main_store.get_user_deactivated_status(user_id)
if deactivated:
raise UserDeactivatedError("This account has been deactivated")
locked = await self._main_store.get_user_locked_status(user_id)
if locked:
raise UserLockedError()
device_id = login_submission.get("device_id")

View File

@ -189,7 +189,8 @@ class StateHandler:
"""
def __init__(self, hs: "HomeServer"):
self.clock = hs.get_clock()
self.server_name = hs.hostname # nb must be called this for @measure_func
self.clock = hs.get_clock() # nb must be called this for @measure_func
self.store = hs.get_datastores().main
self._state_storage_controller = hs.get_storage_controllers().state
self.hs = hs
@ -631,6 +632,7 @@ class StateResolutionHandler:
"""
def __init__(self, hs: "HomeServer"):
self.server_name = hs.hostname
self.clock = hs.get_clock()
self.resolve_linearizer = Linearizer(name="state_resolve_lock")
@ -747,7 +749,9 @@ class StateResolutionHandler:
# which will be used as a cache key for future resolutions, but
# not get persisted.
with Measure(self.clock, "state.create_group_ids"):
with Measure(
self.clock, name="state.create_group_ids", server_name=self.server_name
):
cache = _make_state_cache_entry(new_state, state_groups_ids)
self._state_cache[group_names] = cache
@ -785,7 +789,9 @@ class StateResolutionHandler:
a map from (type, state_key) to event_id.
"""
try:
with Measure(self.clock, "state._resolve_events") as m:
with Measure(
self.clock, name="state._resolve_events", server_name=self.server_name
) as m:
room_version_obj = KNOWN_ROOM_VERSIONS[room_version]
if room_version_obj.state_res == StateResolutionVersions.V1:
return await v1.resolve_events_with_store(

View File

@ -55,6 +55,7 @@ class SQLBaseStore(metaclass=ABCMeta):
hs: "HomeServer",
):
self.hs = hs
self.server_name = hs.hostname
self._clock = hs.get_clock()
self.database_engine = database.engine
self.db_pool = database

View File

@ -337,6 +337,7 @@ class EventsPersistenceStorageController:
assert stores.persist_events
self.persist_events_store = stores.persist_events
self.server_name = hs.hostname
self._clock = hs.get_clock()
self._instance_name = hs.get_instance_name()
self.is_mine_id = hs.is_mine_id
@ -616,7 +617,11 @@ class EventsPersistenceStorageController:
state_delta_for_room = None
if not backfilled:
with Measure(self._clock, "_calculate_state_and_extrem"):
with Measure(
self._clock,
name="_calculate_state_and_extrem",
server_name=self.server_name,
):
# Work out the new "current state" for the room.
# We do this by working out what the new extremities are and then
# calculating the state from that.
@ -627,7 +632,11 @@ class EventsPersistenceStorageController:
room_id, chunk
)
with Measure(self._clock, "calculate_chain_cover_index_for_events"):
with Measure(
self._clock,
name="calculate_chain_cover_index_for_events",
server_name=self.server_name,
):
# We now calculate chain ID/sequence numbers for any state events we're
# persisting. We ignore out of band memberships as we're not in the room
# and won't have their auth chain (we'll fix it up later if we join the
@ -719,7 +728,11 @@ class EventsPersistenceStorageController:
break
logger.debug("Calculating state delta for room %s", room_id)
with Measure(self._clock, "persist_events.get_new_state_after_events"):
with Measure(
self._clock,
name="persist_events.get_new_state_after_events",
server_name=self.server_name,
):
res = await self._get_new_state_after_events(
room_id,
ev_ctx_rm,
@ -746,7 +759,11 @@ class EventsPersistenceStorageController:
# removed keys entirely.
delta = DeltaState([], delta_ids)
elif current_state is not None:
with Measure(self._clock, "persist_events.calculate_state_delta"):
with Measure(
self._clock,
name="persist_events.calculate_state_delta",
server_name=self.server_name,
):
delta = await self._calculate_state_delta(room_id, current_state)
if delta:

View File

@ -34,6 +34,7 @@ from synapse.metrics.background_process_metrics import wrap_as_background_proces
from synapse.storage.database import LoggingTransaction
from synapse.storage.databases import Databases
from synapse.types.storage import _BackgroundUpdates
from synapse.util.stringutils import shortstr
if TYPE_CHECKING:
from synapse.server import HomeServer
@ -167,6 +168,12 @@ class PurgeEventsStorageController:
break
(room_id, groups_to_sequences) = next_to_delete
logger.info(
"[purge] deleting state groups for room %s: %s",
room_id,
shortstr(groups_to_sequences.keys(), maxitems=10),
)
made_progress = await self._delete_state_groups(
room_id, groups_to_sequences
)

View File

@ -68,6 +68,7 @@ class StateStorageController:
"""
def __init__(self, hs: "HomeServer", stores: "Databases"):
self.server_name = hs.hostname
self._is_mine_id = hs.is_mine_id
self._clock = hs.get_clock()
self.stores = stores
@ -812,7 +813,9 @@ class StateStorageController:
state_group = object()
assert state_group is not None
with Measure(self._clock, "get_joined_hosts"):
with Measure(
self._clock, name="get_joined_hosts", server_name=self.server_name
):
return await self._get_joined_hosts(
room_id, state_group, state_entry=state_entry
)

View File

@ -1246,7 +1246,9 @@ class EventsWorkerStore(SQLBaseStore):
to event row. Note that it may well contain additional events that
were not part of this request.
"""
with Measure(self._clock, "_fetch_event_list"):
with Measure(
self._clock, name="_fetch_event_list", server_name=self.server_name
):
try:
events_to_fetch = {
event_id for events, _ in event_list for event_id in events

View File

@ -983,7 +983,11 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
`_get_user_ids_from_membership_event_ids` for any uncached events.
"""
with Measure(self._clock, "get_joined_user_ids_from_state"):
with Measure(
self._clock,
name="get_joined_user_ids_from_state",
server_name=self.server_name,
):
users_in_room = set()
member_event_ids = [
e_id for key, e_id in state.items() if key[0] == EventTypes.Member

View File

@ -41,49 +41,83 @@ from synapse.logging.context import (
LoggingContext,
current_context,
)
from synapse.metrics import InFlightGauge
from synapse.metrics import INSTANCE_LABEL_NAME, InFlightGauge
from synapse.util import Clock
logger = logging.getLogger(__name__)
block_counter = Counter("synapse_util_metrics_block_count", "", ["block_name"])
# Metrics to see the number of calls to, and how much time is spent in, various blocks of code.
#
block_counter = Counter(
"synapse_util_metrics_block_count",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""The number of times this block has been called."""
block_timer = Counter("synapse_util_metrics_block_time_seconds", "", ["block_name"])
block_timer = Counter(
"synapse_util_metrics_block_time_seconds",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""The cumulative time spent executing this block across all calls, in seconds."""
block_ru_utime = Counter(
"synapse_util_metrics_block_ru_utime_seconds", "", ["block_name"]
"synapse_util_metrics_block_ru_utime_seconds",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""Resource usage: user CPU time in seconds used in this block"""
block_ru_stime = Counter(
"synapse_util_metrics_block_ru_stime_seconds", "", ["block_name"]
"synapse_util_metrics_block_ru_stime_seconds",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""Resource usage: system CPU time in seconds used in this block"""
block_db_txn_count = Counter(
"synapse_util_metrics_block_db_txn_count", "", ["block_name"]
"synapse_util_metrics_block_db_txn_count",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""Number of database transactions completed in this block"""
# seconds spent waiting for db txns, excluding scheduling time, in this block
block_db_txn_duration = Counter(
"synapse_util_metrics_block_db_txn_duration_seconds", "", ["block_name"]
"synapse_util_metrics_block_db_txn_duration_seconds",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""Seconds spent waiting for database txns, excluding scheduling time, in this block"""
# seconds spent waiting for a db connection, in this block
block_db_sched_duration = Counter(
"synapse_util_metrics_block_db_sched_duration_seconds", "", ["block_name"]
"synapse_util_metrics_block_db_sched_duration_seconds",
"",
labelnames=["block_name", INSTANCE_LABEL_NAME],
)
"""Seconds spent waiting for a db connection, in this block"""
# This is dynamically created in InFlightGauge.__init__.
class _InFlightMetric(Protocol):
class _BlockInFlightMetric(Protocol):
"""
Sub-metrics used for the `InFlightGauge` for blocks.
"""
real_time_max: float
"""The longest observed duration of any single execution of this block, in seconds."""
real_time_sum: float
"""The cumulative time spent executing this block across all calls, in seconds."""
# Tracks the number of blocks currently active
in_flight: InFlightGauge[_InFlightMetric] = InFlightGauge(
in_flight: InFlightGauge[_BlockInFlightMetric] = InFlightGauge(
"synapse_util_metrics_block_in_flight",
"",
labels=["block_name"],
labels=["block_name", INSTANCE_LABEL_NAME],
# Matches the fields in the `_BlockInFlightMetric`
sub_metrics=["real_time_max", "real_time_sum"],
)
@ -92,8 +126,16 @@ P = ParamSpec("P")
R = TypeVar("R")
class HasClock(Protocol):
class HasClockAndServerName(Protocol):
clock: Clock
"""
Used to measure functions
"""
server_name: str
"""
The homeserver name that this Measure is associated with (used to label the metric)
(`hs.hostname`).
"""
def measure_func(
@ -101,8 +143,9 @@ def measure_func(
) -> Callable[[Callable[P, Awaitable[R]]], Callable[P, Awaitable[R]]]:
"""Decorate an async method with a `Measure` context manager.
The Measure is created using `self.clock`; it should only be used to decorate
methods in classes defining an instance-level `clock` attribute.
The Measure is created using `self.clock` and `self.server_name`; it should only be
used to decorate methods in classes defining instance-level `clock` and
`server_name` attributes.
Usage:
@ -116,16 +159,21 @@ def measure_func(
with Measure(...):
...
Args:
name: The name of the metric to report (the block name) (used to label the
metric). Defaults to the name of the decorated function.
"""
def wrapper(
func: Callable[Concatenate[HasClock, P], Awaitable[R]],
func: Callable[Concatenate[HasClockAndServerName, P], Awaitable[R]],
) -> Callable[P, Awaitable[R]]:
block_name = func.__name__ if name is None else name
@wraps(func)
async def measured_func(self: HasClock, *args: P.args, **kwargs: P.kwargs) -> R:
with Measure(self.clock, block_name):
async def measured_func(
self: HasClockAndServerName, *args: P.args, **kwargs: P.kwargs
) -> R:
with Measure(self.clock, name=block_name, server_name=self.server_name):
r = await func(self, *args, **kwargs)
return r
@ -142,19 +190,24 @@ class Measure:
__slots__ = [
"clock",
"name",
"server_name",
"_logging_context",
"start",
]
def __init__(self, clock: Clock, name: str) -> None:
def __init__(self, clock: Clock, *, name: str, server_name: str) -> None:
"""
Args:
clock: An object with a "time()" method, which returns the current
time in seconds.
name: The name of the metric to report.
name: The name of the metric to report (the block name) (used to label the
metric).
server_name: The homeserver name that this Measure is associated with (used to
label the metric) (`hs.hostname`).
"""
self.clock = clock
self.name = name
self.server_name = server_name
curr_context = current_context()
if not curr_context:
logger.warning(
@ -174,7 +227,7 @@ class Measure:
self.start = self.clock.time()
self._logging_context.__enter__()
in_flight.register((self.name,), self._update_in_flight)
in_flight.register((self.name, self.server_name), self._update_in_flight)
logger.debug("Entering block %s", self.name)
@ -194,19 +247,20 @@ class Measure:
duration = self.clock.time() - self.start
usage = self.get_resource_usage()
in_flight.unregister((self.name,), self._update_in_flight)
in_flight.unregister((self.name, self.server_name), self._update_in_flight)
self._logging_context.__exit__(exc_type, exc_val, exc_tb)
try:
block_counter.labels(self.name).inc()
block_timer.labels(self.name).inc(duration)
block_ru_utime.labels(self.name).inc(usage.ru_utime)
block_ru_stime.labels(self.name).inc(usage.ru_stime)
block_db_txn_count.labels(self.name).inc(usage.db_txn_count)
block_db_txn_duration.labels(self.name).inc(usage.db_txn_duration_sec)
block_db_sched_duration.labels(self.name).inc(usage.db_sched_duration_sec)
except ValueError:
logger.warning("Failed to save metrics! Usage: %s", usage)
labels = {"block_name": self.name, INSTANCE_LABEL_NAME: self.server_name}
block_counter.labels(**labels).inc()
block_timer.labels(**labels).inc(duration)
block_ru_utime.labels(**labels).inc(usage.ru_utime)
block_ru_stime.labels(**labels).inc(usage.ru_stime)
block_db_txn_count.labels(**labels).inc(usage.db_txn_count)
block_db_txn_duration.labels(**labels).inc(usage.db_txn_duration_sec)
block_db_sched_duration.labels(**labels).inc(usage.db_sched_duration_sec)
except ValueError as exc:
logger.warning("Failed to save metrics! Usage: %s Error: %s", usage, exc)
def get_resource_usage(self) -> ContextResourceUsage:
"""Get the resources used within this Measure block
@ -215,7 +269,7 @@ class Measure:
"""
return self._logging_context.get_resource_usage()
def _update_in_flight(self, metrics: _InFlightMetric) -> None:
def _update_in_flight(self, metrics: _BlockInFlightMetric) -> None:
"""Gets called when processing in flight metrics"""
assert self.start is not None
duration = self.clock.time() - self.start
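
Putting the new signature together, a minimal sketch of a class that satisfies the HasClockAndServerName protocol and uses both the decorator and the context manager (the class and block names are hypothetical; only the Measure and measure_func signatures come from this change):

# Sketch only, assuming a HomeServer instance `hs` is available.
from typing import TYPE_CHECKING

from synapse.util.metrics import Measure, measure_func

if TYPE_CHECKING:
    from synapse.server import HomeServer


class ExampleWorker:
    def __init__(self, hs: "HomeServer") -> None:
        # nb: both attributes must use exactly these names for @measure_func
        self.server_name = hs.hostname
        self.clock = hs.get_clock()

    @measure_func("example_do_work")
    async def do_work(self) -> None:
        # Samples are labelled with block_name="example_do_work" and
        # instance=self.server_name.
        ...

    async def do_other_work(self) -> None:
        with Measure(
            self.clock, name="example_other_work", server_name=self.server_name
        ):
            ...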

View File

@ -86,6 +86,7 @@ class TypingNotificationsTestCase(unittest.HomeserverTestCase):
self.mock_federation_client = AsyncMock(spec=["put_json"])
self.mock_federation_client.put_json.return_value = (200, "OK")
self.mock_federation_client.agent = MatrixFederationAgent(
"OUR_STUB_HOMESERVER_NAME",
reactor,
tls_client_options_factory=None,
user_agent=b"SynapseInTrialTest/0.0.0",

View File

@ -91,6 +91,7 @@ class MatrixFederationAgentTests(unittest.TestCase):
"test_cache", timer=self.reactor.seconds
)
self.well_known_resolver = WellKnownResolver(
"OUR_STUB_HOMESERVER_NAME",
self.reactor,
Agent(self.reactor, contextFactory=self.tls_factory),
b"test-agent",
@ -269,6 +270,7 @@ class MatrixFederationAgentTests(unittest.TestCase):
because it is created too early during setUp
"""
return MatrixFederationAgent(
"OUR_STUB_HOMESERVER_NAME",
reactor=cast(ISynapseReactor, self.reactor),
tls_client_options_factory=self.tls_factory,
user_agent=b"test-agent", # Note that this is unused since _well_known_resolver is provided.
@ -1011,6 +1013,7 @@ class MatrixFederationAgentTests(unittest.TestCase):
# Build a new agent and WellKnownResolver with a different tls factory
tls_factory = FederationPolicyForHTTPS(config)
agent = MatrixFederationAgent(
"OUR_STUB_HOMESERVER_NAME",
reactor=self.reactor,
tls_client_options_factory=tls_factory,
user_agent=b"test-agent", # This is unused since _well_known_resolver is passed below.
@ -1018,6 +1021,7 @@ class MatrixFederationAgentTests(unittest.TestCase):
ip_blocklist=IPSet(),
_srv_resolver=self.mock_resolver,
_well_known_resolver=WellKnownResolver(
"OUR_STUB_HOMESERVER_NAME",
cast(ISynapseReactor, self.reactor),
Agent(self.reactor, contextFactory=tls_factory),
b"test-agent",

View File

@ -68,6 +68,7 @@ class FederationSenderTestCase(BaseMultiWorkerStreamTestCase):
reactor, _ = get_clock()
self.matrix_federation_agent = MatrixFederationAgent(
"OUR_STUB_HOMESERVER_NAME",
reactor,
tls_client_options_factory=None,
user_agent=b"SynapseInTrialTest/0.0.0",

View File

@ -2846,6 +2846,16 @@ class UserRestTestCase(unittest.HomeserverTestCase):
self.assertEqual(Codes.USER_LOCKED, channel.json_body["errcode"])
self.assertTrue(channel.json_body["soft_logout"])
# User is not authorized to log in anymore
channel = self.make_request(
"POST",
"/_matrix/client/r0/login",
{"type": "m.login.password", "user": "user", "password": "pass"},
)
self.assertEqual(401, channel.code, msg=channel.json_body)
self.assertEqual(Codes.USER_LOCKED, channel.json_body["errcode"])
self.assertTrue(channel.json_body["soft_logout"])
@override_config({"user_directory": {"enabled": True, "search_all_users": True}})
def test_locked_user_not_in_user_dir(self) -> None:
# User is available in the user dir