mirror of
https://github.com/element-hq/synapse.git
synced 2025-07-14 00:00:37 -04:00
Compare commits
No commits in common. "develop" and "v1.48.0rc1" have entirely different histories.
@ -1,10 +0,0 @@
#!/bin/sh
set -xeu

# On 32-bit Linux platforms, we need libatomic1 to use rustup
if command -v yum &> /dev/null; then
    yum install -y libatomic
fi

# Install a Rust toolchain
curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain 1.82.0 -y --profile minimal
@ -1,91 +0,0 @@
{{- /*gotype: github.com/haveyoudebuggedit/gotestfmt/parser.Package*/ -}}
{{- /*
This template contains the format for an individual package. GitHub actions does not currently support nested groups so
we are creating a stylized header for each package.

This template is based on https://github.com/haveyoudebuggedit/gotestfmt/blob/f179b0e462a9dcf7101515d87eec4e4d7e58b92a/.gotestfmt/github/package.gotpl
which is under the Unlicense licence.
*/ -}}
{{- $settings := .Settings -}}
{{- if and (or (not $settings.HideSuccessfulPackages) (ne .Result "PASS")) (or (not $settings.HideEmptyPackages) (ne .Result "SKIP") (ne (len .TestCases) 0)) -}}
    {{- if eq .Result "PASS" -}}
        {{ "\033" }}[0;32m
    {{- else if eq .Result "SKIP" -}}
        {{ "\033" }}[0;33m
    {{- else -}}
        {{ "\033" }}[0;31m
    {{- end -}}
    📦 {{ .Name }}{{- "\033" }}[0m
    {{- with .Coverage -}}
        {{- "\033" -}}[0;37m ({{ . }}% coverage){{- "\033" -}}[0m
    {{- end -}}
    {{- "\n" -}}
    {{- with .Reason -}}
        {{- " " -}}🛑 {{ . -}}{{- "\n" -}}
    {{- end -}}
    {{- with .Output -}}
        {{- . -}}{{- "\n" -}}
    {{- end -}}
    {{- with .TestCases -}}
        {{- /* Passing tests are first */ -}}
        {{- range . -}}
            {{- if eq .Result "PASS" -}}
                ::group::{{ "\033" }}[0;32m✅{{ " " }}{{- .Name -}}
                {{- "\033" -}}[0;37m ({{if $settings.ShowTestStatus}}{{.Result}}; {{end}}{{ .Duration -}}
                {{- with .Coverage -}}
                    , coverage: {{ . }}%
                {{- end -}})
                {{- "\033" -}}[0m
                {{- "\n" -}}

                {{- with .Output -}}
                    {{- formatTestOutput . $settings -}}
                    {{- "\n" -}}
                {{- end -}}

                ::endgroup::{{- "\n" -}}
            {{- end -}}
        {{- end -}}

        {{- /* Then skipped tests are second */ -}}
        {{- range . -}}
            {{- if eq .Result "SKIP" -}}
                ::group::{{ "\033" }}[0;33m🚧{{ " " }}{{- .Name -}}
                {{- "\033" -}}[0;37m ({{if $settings.ShowTestStatus}}{{.Result}}; {{end}}{{ .Duration -}}
                {{- with .Coverage -}}
                    , coverage: {{ . }}%
                {{- end -}})
                {{- "\033" -}}[0m
                {{- "\n" -}}

                {{- with .Output -}}
                    {{- formatTestOutput . $settings -}}
                    {{- "\n" -}}
                {{- end -}}

                ::endgroup::{{- "\n" -}}
            {{- end -}}
        {{- end -}}

        {{- /* and failing tests are last */ -}}
        {{- range . -}}
            {{- if and (ne .Result "PASS") (ne .Result "SKIP") -}}
                ::group::{{ "\033" }}[0;31m❌{{ " " }}{{- .Name -}}
                {{- "\033" -}}[0;37m ({{if $settings.ShowTestStatus}}{{.Result}}; {{end}}{{ .Duration -}}
                {{- with .Coverage -}}
                    , coverage: {{ . }}%
                {{- end -}})
                {{- "\033" -}}[0m
                {{- "\n" -}}

                {{- with .Output -}}
                    {{- formatTestOutput . $settings -}}
                    {{- "\n" -}}
                {{- end -}}

                ::endgroup::{{- "\n" -}}
            {{- end -}}
        {{- end -}}
    {{- end -}}
    {{- "\n" -}}
{{- end -}}
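The template above builds its package headers from raw ANSI escape sequences (`{{ "\033" }}[0;32m` and friends). As a stdlib-only sketch of what those sequences do (the `colourise` helper is hypothetical, not part of the template), the same colouring logic looks like this in Python:

```python
# ANSI SGR escape sequences as used by the gotestfmt template above.
# "\033" is the ESC character; "[0;32m" etc. select a colour, "[0m" resets it.
GREEN = "\033[0;32m"   # PASS
YELLOW = "\033[0;33m"  # SKIP
RED = "\033[0;31m"     # anything else (failure)
GREY = "\033[0;37m"    # coverage annotations
RESET = "\033[0m"


def colourise(result: str, name: str) -> str:
    """Mimic the template's header colouring for a package result."""
    colour = {"PASS": GREEN, "SKIP": YELLOW}.get(result, RED)
    return f"{colour}📦 {name}{RESET}"


print(repr(colourise("PASS", "synapse/tests")))
```

The `::group::`/`::endgroup::` markers emitted around each test case are GitHub Actions workflow commands that produce collapsible log sections.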
@ -1,4 +0,0 @@
---
title: CI run against latest deps is failing
---
See https://github.com/{{env.GITHUB_REPOSITORY}}/actions/runs/{{env.GITHUB_RUN_ID}}
8
.ci/patch_for_twisted_trunk.sh
Executable file
@ -0,0 +1,8 @@
#!/bin/sh

# replaces the dependency on Twisted in `python_dependencies` with trunk.

set -e
cd "$(dirname "$0")"/..

sed -i -e 's#"Twisted.*"#"Twisted @ git+https://github.com/twisted/twisted"#' synapse/python_dependencies.py
@ -1,147 +0,0 @@
#!/usr/bin/env python
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2023 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# Originally licensed under the Apache License, Version 2.0:
# <http://www.apache.org/licenses/LICENSE-2.0>.
#
# [This file includes modifications made by New Vector Limited]
#
#

# Wraps `auditwheel repair` to first check if we're repairing a potentially abi3
# compatible wheel, if so rename the wheel before repairing it.

import argparse
import os
import subprocess
from typing import Optional
from zipfile import ZipFile

from packaging.tags import Tag
from packaging.utils import parse_wheel_filename
from packaging.version import Version


def check_is_abi3_compatible(wheel_file: str) -> None:
    """Check the contents of the built wheel for any `.so` files that are *not*
    abi3 compatible.
    """

    with ZipFile(wheel_file, "r") as wheel:
        for file in wheel.namelist():
            if not file.endswith(".so"):
                continue

            if not file.endswith(".abi3.so"):
                raise Exception(f"Found non-abi3 lib: {file}")


def cpython(wheel_file: str, name: str, version: Version, tag: Tag) -> str:
    """Replaces the cpython wheel file with a ABI3 compatible wheel"""

    if tag.abi == "abi3":
        # Nothing to do.
        return wheel_file

    check_is_abi3_compatible(wheel_file)

    # HACK: it seems that some older versions of pip will consider a wheel marked
    # as macosx_11_0 as incompatible with Big Sur. I haven't done the full archaeology
    # here; there are some clues in
    #     https://github.com/pantsbuild/pants/pull/12857
    #     https://github.com/pypa/pip/issues/9138
    #     https://github.com/pypa/packaging/pull/319
    # Empirically this seems to work, note that macOS 11 and 10.16 are the same,
    # both versions are valid for backwards compatibility.
    platform = tag.platform.replace("macosx_11_0", "macosx_10_16")
    abi3_tag = Tag(tag.interpreter, "abi3", platform)

    dirname = os.path.dirname(wheel_file)
    new_wheel_file = os.path.join(
        dirname,
        f"{name}-{version}-{abi3_tag}.whl",
    )

    os.rename(wheel_file, new_wheel_file)

    print("Renamed wheel to", new_wheel_file)

    return new_wheel_file


def main(wheel_file: str, dest_dir: str, archs: Optional[str]) -> None:
    """Entry point"""

    # Parse the wheel file name into its parts. Note that `parse_wheel_filename`
    # normalizes the package name (i.e. it converts matrix_synapse ->
    # matrix-synapse), which is not what we want.
    _, version, build, tags = parse_wheel_filename(os.path.basename(wheel_file))
    name = os.path.basename(wheel_file).split("-")[0]

    if len(tags) != 1:
        # We expect only a wheel file with only a single tag
        raise Exception(f"Unexpectedly found multiple tags: {tags}")

    tag = next(iter(tags))

    if build:
        # We don't use build tags in Synapse
        raise Exception(f"Unexpected build tag: {build}")

    # If the wheel is for cpython then convert it into an abi3 wheel.
    if tag.interpreter.startswith("cp"):
        wheel_file = cpython(wheel_file, name, version, tag)

    # Finally, repair the wheel.
    if archs is not None:
        # If we are given archs then we are on macos and need to use
        # `delocate-listdeps`.
        subprocess.run(["delocate-listdeps", wheel_file], check=True)
        subprocess.run(
            ["delocate-wheel", "--require-archs", archs, "-w", dest_dir, wheel_file],
            check=True,
        )
    else:
        subprocess.run(["auditwheel", "repair", "-w", dest_dir, wheel_file], check=True)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Tag wheel as abi3 and repair it.")

    parser.add_argument(
        "--wheel-dir",
        "-w",
        metavar="WHEEL_DIR",
        help="Directory to store delocated wheels",
        required=True,
    )

    parser.add_argument(
        "--require-archs",
        metavar="archs",
        default=None,
    )

    parser.add_argument(
        "wheel_file",
        metavar="WHEEL_FILE",
    )

    args = parser.parse_args()

    wheel_file = args.wheel_file
    wheel_dir = args.wheel_dir
    archs = args.require_archs

    main(wheel_file, wheel_dir, archs)
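The interesting step in the script above is the filename rewrite: a `cp39-cp39` wheel whose shared objects are all `.abi3.so` can be relabelled `cp39-abi3`, with the `macosx_11_0` platform tag downgraded to `macosx_10_16` for old-pip compatibility. A stdlib-only sketch of that rename (the filename and `abi3_filename` helper are hypothetical; the real script uses `packaging.utils.parse_wheel_filename` and `packaging.tags.Tag`):

```python
import os.path


def abi3_filename(wheel_file: str) -> str:
    """Rewrite a cpython wheel filename to an abi3 one, mirroring the
    platform fixup in the script above (macosx_11_0 -> macosx_10_16)."""
    base = os.path.basename(wheel_file)
    stem = base[: -len(".whl")]
    # Wheel filenames are name-version-interpreter-abi-platform (no build tag).
    name, version, interpreter, abi, platform = stem.split("-")
    platform = platform.replace("macosx_11_0", "macosx_10_16")
    return f"{name}-{version}-{interpreter}-abi3-{platform}.whl"


print(abi3_filename("matrix_synapse-1.99.0-cp39-cp39-macosx_11_0_x86_64.whl"))
# -> matrix_synapse-1.99.0-cp39-abi3-macosx_10_16_x86_64.whl
```

Note the script deliberately avoids the name normalization that `parse_wheel_filename` performs (`matrix_synapse` -> `matrix-synapse`) by re-reading the name from the raw filename.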
@ -1,151 +0,0 @@
#!/usr/bin/env python
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2023 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# Originally licensed under the Apache License, Version 2.0:
# <http://www.apache.org/licenses/LICENSE-2.0>.
#
# [This file includes modifications made by New Vector Limited]
#
#

# Calculate the trial jobs to run based on if we're in a PR or not.

import json
import os


def set_output(key: str, value: str):
    # See https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#setting-an-output-parameter
    with open(os.environ["GITHUB_OUTPUT"], "at") as f:
        print(f"{key}={value}", file=f)


IS_PR = os.environ["GITHUB_REF"].startswith("refs/pull/")

# First calculate the various trial jobs.
#
# For PRs, we only run each type of test with the oldest Python version supported (which
# is Python 3.9 right now)

trial_sqlite_tests = [
    {
        "python-version": "3.9",
        "database": "sqlite",
        "extras": "all",
    }
]

if not IS_PR:
    trial_sqlite_tests.extend(
        {
            "python-version": version,
            "database": "sqlite",
            "extras": "all",
        }
        for version in ("3.10", "3.11", "3.12", "3.13")
    )

trial_postgres_tests = [
    {
        "python-version": "3.9",
        "database": "postgres",
        "postgres-version": "13",
        "extras": "all",
    }
]

if not IS_PR:
    trial_postgres_tests.append(
        {
            "python-version": "3.13",
            "database": "postgres",
            "postgres-version": "17",
            "extras": "all",
        }
    )

trial_no_extra_tests = [
    {
        "python-version": "3.9",
        "database": "sqlite",
        "extras": "",
    }
]

print("::group::Calculated trial jobs")
print(
    json.dumps(
        trial_sqlite_tests + trial_postgres_tests + trial_no_extra_tests, indent=4
    )
)
print("::endgroup::")

test_matrix = json.dumps(
    trial_sqlite_tests + trial_postgres_tests + trial_no_extra_tests
)
set_output("trial_test_matrix", test_matrix)


# Next, calculate the various sytest jobs.
#
# For each type of test we only run on bullseye on PRs


sytest_tests = [
    {
        "sytest-tag": "bullseye",
    },
    {
        "sytest-tag": "bullseye",
        "postgres": "postgres",
    },
    {
        "sytest-tag": "bullseye",
        "postgres": "multi-postgres",
        "workers": "workers",
    },
    {
        "sytest-tag": "bullseye",
        "postgres": "multi-postgres",
        "workers": "workers",
        "reactor": "asyncio",
    },
]

if not IS_PR:
    sytest_tests.extend(
        [
            {
                "sytest-tag": "bullseye",
                "reactor": "asyncio",
            },
            {
                "sytest-tag": "bullseye",
                "postgres": "postgres",
                "reactor": "asyncio",
            },
            {
                "sytest-tag": "testing",
                "postgres": "postgres",
            },
        ]
    )


print("::group::Calculated sytest jobs")
print(json.dumps(sytest_tests, indent=4))
print("::endgroup::")

test_matrix = json.dumps(sytest_tests)
set_output("sytest_test_matrix", test_matrix)
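The `set_output` helper above communicates the calculated matrices back to the workflow by appending `key=value` lines to the file named by `$GITHUB_OUTPUT`, per the GitHub Actions workflow-commands mechanism. A self-contained sketch of the same mechanism, simulating the Actions environment with a throwaway temp file:

```python
import json
import os
import tempfile


def set_output(key: str, value: str) -> None:
    # Append to the file GitHub Actions designates via $GITHUB_OUTPUT.
    with open(os.environ["GITHUB_OUTPUT"], "at") as f:
        print(f"{key}={value}", file=f)


# Simulate an Actions run by pointing GITHUB_OUTPUT at a temporary file.
with tempfile.NamedTemporaryFile("r", suffix=".txt", delete=False) as tmp:
    os.environ["GITHUB_OUTPUT"] = tmp.name

set_output("trial_test_matrix", json.dumps([{"python-version": "3.9"}]))

with open(os.environ["GITHUB_OUTPUT"]) as f:
    print(f.read())
```

Downstream jobs then read the output as `${{ needs.<job>.outputs.trial_test_matrix }}` and feed it to a matrix strategy via `fromJson`.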
@ -1,23 +0,0 @@
#! /usr/bin/env python
import sys

if sys.version_info < (3, 11):
    raise RuntimeError("Requires at least Python 3.11, to import tomllib")

import tomllib

with open("poetry.lock", "rb") as f:
    lockfile = tomllib.load(f)

try:
    lock_version = lockfile["metadata"]["lock-version"]
    assert lock_version == "2.1"
except Exception:
    print(
        """\
Lockfile is not version 2.1. You probably need to upgrade poetry on your local box
and re-run `poetry lock`. See the Poetry cheat sheet at
https://element-hq.github.io/synapse/develop/development/dependencies.html
"""
    )
    raise
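The script above needs Python 3.11 because `tomllib` only landed in the stdlib then. To illustrate what it actually verifies, here is a rough regex-based sketch that pulls `lock-version` out of a lock-file fragment (a simplification over a hypothetical sample; the real script parses the TOML properly with `tomllib`):

```python
import re

# A minimal, hypothetical poetry.lock metadata fragment.
SAMPLE_LOCK = """\
[metadata]
lock-version = "2.1"
python-versions = "^3.9"
"""


def lock_version(lock_text: str) -> str:
    """Extract the lock-version value from a poetry.lock-style text."""
    m = re.search(r'^lock-version\s*=\s*"([^"]+)"', lock_text, re.MULTILINE)
    if m is None:
        raise ValueError("no lock-version found")
    return m.group(1)


print("lock-version:", lock_version(SAMPLE_LOCK))
```

The CI check fails the build (by re-raising) whenever the lockfile was written by a poetry old enough to emit a different lock-version.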
@ -1,25 +0,0 @@
#!/bin/bash
#
# Fetches a version of complement which best matches the current build.
#
# The tarball is unpacked into `./complement`.

set -e
mkdir -p complement

# Pick an appropriate version of complement. Depending on whether this is a PR or release,
# etc. we need to use different fallbacks:
#
#  1. First check if there's a similarly named branch (GITHUB_HEAD_REF
#     for pull requests, otherwise GITHUB_REF).
#  2. Attempt to use the base branch, e.g. when merging into release-vX.Y
#     (GITHUB_BASE_REF for pull requests).
#  3. Use the default complement branch ("HEAD").
for BRANCH_NAME in "$GITHUB_HEAD_REF" "$GITHUB_BASE_REF" "${GITHUB_REF#refs/heads/}" "HEAD"; do
  # Skip empty branch names and merge commits.
  if [[ -z "$BRANCH_NAME" || $BRANCH_NAME =~ ^refs/pull/.* ]]; then
    continue
  fi

  (wget -O - "https://github.com/matrix-org/complement/archive/$BRANCH_NAME.tar.gz" | tar -xz --strip-components=1 -C complement) && break
done
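The fallback loop above tries candidate ref names in priority order, skipping empty names and merge refs like `refs/pull/123/merge`. That selection order can be modelled in isolation (the `pick_branches` helper is hypothetical, purely to illustrate the shell logic):

```python
import re


def pick_branches(head_ref: str, base_ref: str, github_ref: str) -> list:
    """Return the candidate Complement branches to try, in order,
    mirroring the shell loop above."""
    candidates = [
        head_ref,                                 # GITHUB_HEAD_REF (pull requests)
        base_ref,                                 # GITHUB_BASE_REF (pull requests)
        re.sub(r"^refs/heads/", "", github_ref),  # ${GITHUB_REF#refs/heads/}
        "HEAD",                                   # default complement branch
    ]
    # Skip empty branch names and merge commits.
    return [c for c in candidates if c and not c.startswith("refs/pull/")]


print(pick_branches("", "", "refs/pull/123/merge"))  # -> ['HEAD']
print(pick_branches("my-feature", "develop", "refs/pull/123/merge"))
```

In the shell script, the first candidate whose tarball download succeeds wins (`&& break`); "HEAD" is last, so it only applies when no matching branch exists.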
@ -1,21 +0,0 @@
#!/bin/bash
#
# wraps `gotestfmt`, hiding output from successful packages unless
# all tests passed.

set -o pipefail
set -e

# tee the test results to a log, whilst also piping them into gotestfmt,
# telling it to hide successful results, so that we can clearly see
# unsuccessful results.
tee complement.log | gotestfmt -hide successful-packages

# gotestfmt will exit non-zero if there were any failures, so if we got to this
# point, we must have had a successful result.
echo "All tests successful; showing all test results"

# Pipe the test results back through gotestfmt, showing all results.
# The log file consists of JSON lines giving the test results, interspersed
# with regular stdout lines (including reports of downloaded packages).
grep '^{"Time":' complement.log | gotestfmt
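The final `grep '^{"Time":'` keeps only the `go test -json` records from the captured log and discards interleaved plain stdout. The same filtering expressed in Python, over hypothetical log lines:

```python
import json

# A hypothetical complement.log: go-test JSON records mixed with plain stdout.
log_lines = [
    "go: downloading github.com/gotesttools/gotestfmt/v2 v2.0.0",  # plain stdout
    '{"Time":"2023-01-01T00:00:00Z","Action":"pass","Package":"demo"}',
]

# Equivalent of: grep '^{"Time":' complement.log
json_records = [json.loads(line) for line in log_lines if line.startswith('{"Time":')]
print(json_records)
```

Filtering on the literal `{"Time":` prefix works because `go test -json` always emits `Time` as the first key of each event object.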
31
.ci/scripts/postgres_exec.py
Executable file
@ -0,0 +1,31 @@
#!/usr/bin/env python
# Copyright 2019 The Matrix.org Foundation C.I.C.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import sys

import psycopg2

# a very simple replacement for `psql`, to make up for the lack of the postgres client
# libraries in the synapse docker image.

# We use "postgres" as a database because it's bound to exist and the "synapse" one
# doesn't exist yet.
db_conn = psycopg2.connect(
    user="postgres", host="localhost", password="postgres", dbname="postgres"
)
db_conn.autocommit = True
cur = db_conn.cursor()
for c in sys.argv[1:]:
    cur.execute(c)
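The script above boils down to "connect with autocommit on, then execute each CLI argument as one SQL statement" (autocommit matters because `CREATE DATABASE` cannot run inside a transaction in postgres). The same pattern against the stdlib `sqlite3` module, as an analogy only since psycopg2 and a running postgres are not assumed here:

```python
import sqlite3

# sqlite3 analogue of the psycopg2 loop above: isolation_level=None puts the
# connection in autocommit mode, and each statement is executed in turn.
db_conn = sqlite3.connect(":memory:", isolation_level=None)
cur = db_conn.cursor()
statements = ["CREATE TABLE t (x INTEGER)", "INSERT INTO t VALUES (1)"]
for c in statements:
    cur.execute(c)

cur.execute("SELECT count(*) FROM t")
print(cur.fetchone()[0])
```

In CI the real script is invoked as, e.g., `.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"`, as the test scripts further down this page show.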
@ -1,36 +0,0 @@
#!/usr/bin/env bash
# this script is run by GitHub Actions in a plain `jammy` container; it
# - installs the minimal system requirements, and poetry;
# - patches the project definition file to refer to old versions only;
# - creates a venv with these old versions using poetry; and finally
# - invokes `trial` to run the tests with old deps.

set -ex

# Prevent virtualenv from auto-updating pip to an incompatible version
export VIRTUALENV_NO_DOWNLOAD=1

# TODO: in the future, we could use an implementation of
#   https://github.com/python-poetry/poetry/issues/3527
#   https://github.com/pypa/pip/issues/8085
# to select the lowest possible versions, rather than resorting to this sed script.

# Patch the project definitions in-place:
# - Replace all lower and tilde bounds with exact bounds
# - Replace all caret bounds---but not the one that defines the supported Python version!
# - Delete all lines referring to psycopg2 --- so no testing of postgres support.
# - Use pyopenssl 17.0, which is the oldest version that works with
#   a `cryptography` compiled against OpenSSL 1.1.
# - Omit systemd: we're not logging to journal here.

sed -i \
   -e "s/[~>]=/==/g" \
   -e '/^python = "^/!s/\^/==/g' \
   -e "/psycopg2/d" \
   -e 's/pyOpenSSL = "==16.0.0"/pyOpenSSL = "==17.0.0"/' \
   -e '/systemd/d' \
   pyproject.toml

echo "::group::Patched pyproject.toml"
cat pyproject.toml
echo "::endgroup::"
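The `sed` invocation above pins every dependency to its lowest declared bound. To see what each expression does, here is the same transformation with `re` over a hypothetical pyproject fragment (a sketch, not the real file):

```python
import re

# Hypothetical pyproject.toml dependency fragment.
fragment = """\
python = "^3.7"
frozendict = ">=1"
jsonschema = "~=3.0"
psycopg2 = ">=2.8"
attrs = "^21.0"
"""

lines = []
for line in fragment.splitlines():
    if "psycopg2" in line:  # -e "/psycopg2/d": drop postgres support entirely
        continue
    line = re.sub(r"[~>]=", "==", line)  # -e "s/[~>]=/==/g": pin >= and ~= bounds
    if not line.startswith('python = "^'):  # -e '/^python = "^/!s/\^/==/g'
        line = line.replace("^", "==")  # pin caret bounds, except the python one
    lines.append(line)

print("\n".join(lines))
```

Pinning a caret or tilde bound to `==` yields the lowest version the range permits, which is exactly what an "oldest supported deps" test run wants.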
@ -1,26 +0,0 @@
#!/bin/sh
#
# Common commands to set up Complement's prerequisites in a GitHub Actions CI run.
#
# Must be called after Synapse has been checked out to `synapse/`.
#
set -eu

alias block='{ set +x; } 2>/dev/null; func() { echo "::group::$*"; set -x; }; func'
alias endblock='{ set +x; } 2>/dev/null; func() { echo "::endgroup::"; set -x; }; func'

block Install Complement Dependencies
  sudo apt-get -qq update && sudo apt-get install -qqy libolm3 libolm-dev
  go install -v github.com/gotesttools/gotestfmt/v2/cmd/gotestfmt@latest
endblock

block Install custom gotestfmt template
  mkdir .gotestfmt/github -p
  cp synapse/.ci/complement_package.gotpl .gotestfmt/github/package.gotpl
endblock

block Check out Complement
  # Attempt to check out the same branch of Complement as the PR. If it
  # doesn't exist, fallback to HEAD.
  synapse/.ci/scripts/checkout_complement.sh
endblock
@ -2,30 +2,34 @@
 # Test for the export-data admin command against sqlite and postgres
-# Expects Synapse to have been already installed with `poetry install --extras postgres`.
-# Expects `poetry` to be available on the `PATH`.
 
 set -xe
 cd "$(dirname "$0")/../.."
 
+echo "--- Install dependencies"
+
+# Install dependencies for this test.
+pip install psycopg2
+
+# Install Synapse itself. This won't update any libraries.
+pip install -e .
+
 echo "--- Generate the signing key"
 
 # Generate the server's signing key.
-poetry run synapse_homeserver --generate-keys -c .ci/sqlite-config.yaml
+python -m synapse.app.homeserver --generate-keys -c .ci/sqlite-config.yaml
 
 echo "--- Prepare test database"
 
 # Make sure the SQLite3 database is using the latest schema and has no pending background update.
-poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
+scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
 
 # Run the export-data command on the sqlite test database
-poetry run python -m synapse.app.admin_cmd -c .ci/sqlite-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
+python -m synapse.app.admin_cmd -c .ci/sqlite-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
 --output-directory /tmp/export_data
 
 # Test that the output directory exists and contains the rooms directory
-dir_r="/tmp/export_data/rooms"
-dir_u="/tmp/export_data/user_data"
-if [ -d "$dir_r" ] && [ -d "$dir_u" ]; then
+dir="/tmp/export_data/rooms"
+if [ -d "$dir" ]; then
     echo "Command successful, this test passes"
 else
     echo "No output directories found, the command fails against a sqlite database."
@ -33,20 +37,19 @@ else
 fi
 
 # Create the PostgreSQL database.
-psql -c "CREATE DATABASE synapse"
+.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
 
 # Port the SQLite database to postgres so we can check command works against postgres
 echo "+++ Port SQLite3 database to postgres"
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
 
 # Run the export-data command on postgres database
-poetry run python -m synapse.app.admin_cmd -c .ci/postgres-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
+python -m synapse.app.admin_cmd -c .ci/postgres-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
 --output-directory /tmp/export_data2
 
 # Test that the output directory exists and contains the rooms directory
-dir_r2="/tmp/export_data2/rooms"
-dir_u2="/tmp/export_data2/user_data"
-if [ -d "$dir_r2" ] && [ -d "$dir_u2" ]; then
+dir2="/tmp/export_data2/rooms"
+if [ -d "$dir2" ]; then
     echo "Command successful, this test passes"
 else
     echo "No output directories found, the command fails against a postgres database."
16
.ci/scripts/test_old_deps.sh
Executable file
@ -0,0 +1,16 @@
#!/usr/bin/env bash

# this script is run by GitHub Actions in a plain `bionic` container; it installs the
# minimal requirements for tox and hands over to the py3-old tox environment.

set -ex

apt-get update
apt-get install -y python3 python3-dev python3-pip libxml2-dev libxslt-dev xmlsec1 zlib1g-dev tox

export LANG="C.UTF-8"

# Prevent virtualenv from auto-updating pip to an incompatible version
export VIRTUALENV_NO_DOWNLOAD=1

exec tox -e py3-old,combine
@ -1,37 +1,41 @@
 #!/usr/bin/env bash
 #
 # Test script for 'synapse_port_db'.
-# - configures synapse and a postgres server.
-# - runs the port script on a prepopulated test sqlite db. Checks that the
-#   return code is zero.
-# - reruns the port script on the same sqlite db, targeting the same postgres db.
-#   Checks that the return code is zero.
-# - runs the port script against a new sqlite db. Checks the return code is zero.
-#
-# Expects Synapse to have been already installed with `poetry install --extras postgres`.
-# Expects `poetry` to be available on the `PATH`.
+# - sets up synapse and deps
+# - runs the port script on a prepopulated test sqlite db
+# - also runs it against a new sqlite db
 
-set -xe -o pipefail
+set -xe
 cd "$(dirname "$0")/../.."
 
+echo "--- Install dependencies"
+
+# Install dependencies for this test.
+pip install psycopg2 coverage coverage-enable-subprocess
+
+# Install Synapse itself. This won't update any libraries.
+pip install -e .
+
 echo "--- Generate the signing key"
-poetry run synapse_homeserver --generate-keys -c .ci/sqlite-config.yaml
+
+# Generate the server's signing key.
+python -m synapse.app.homeserver --generate-keys -c .ci/sqlite-config.yaml
 
 echo "--- Prepare test database"
-# Make sure the SQLite3 database is using the latest schema and has no pending background updates.
-poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
+
+# Make sure the SQLite3 database is using the latest schema and has no pending background update.
+scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
 
 # Create the PostgreSQL database.
-psql -c "CREATE DATABASE synapse"
+.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
 
 echo "+++ Run synapse_port_db against test database"
-# TODO: this invocation of synapse_port_db (and others below) used to be prepended with `coverage run`,
-# but coverage seems unable to find the entrypoints installed by `pip install -e .`.
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+coverage run scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
 
 # We should be able to run twice against the same database.
 echo "+++ Run synapse_port_db a second time"
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+coverage run scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
 
 #####
 
@ -42,26 +46,12 @@ echo "--- Prepare empty SQLite database"
|
|||||||
# we do this by deleting the sqlite db, and then doing the same again.
|
# we do this by deleting the sqlite db, and then doing the same again.
|
||||||
rm .ci/test_db.db
|
rm .ci/test_db.db
|
||||||
|
|
||||||
poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
|
scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
|
||||||
|
|
||||||
# re-create the PostgreSQL database.
|
# re-create the PostgreSQL database.
|
||||||
psql \
|
.ci/scripts/postgres_exec.py \
|
||||||
-c "DROP DATABASE synapse" \
|
"DROP DATABASE synapse" \
|
||||||
-c "CREATE DATABASE synapse"
|
"CREATE DATABASE synapse"
|
||||||
|
|
||||||
echo "+++ Run synapse_port_db against empty database"
|
echo "+++ Run synapse_port_db against empty database"
|
||||||
poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
|
coverage run scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
|
||||||
|
|
||||||
echo "--- Create a brand new postgres database from schema"
|
|
||||||
cp .ci/postgres-config.yaml .ci/postgres-config-unported.yaml
|
|
||||||
sed -i -e 's/database: synapse/database: synapse_unported/' .ci/postgres-config-unported.yaml
|
|
||||||
psql -c "CREATE DATABASE synapse_unported"
|
|
||||||
poetry run update_synapse_database --database-config .ci/postgres-config-unported.yaml --run-background-updates
|
|
||||||
|
|
||||||
echo "+++ Comparing ported schema with unported schema"
|
|
||||||
# Ignore the tables that portdb creates. (Should it tidy them up when the porting is completed?)
|
|
||||||
psql synapse -c "DROP TABLE port_from_sqlite3;"
|
|
||||||
pg_dump --format=plain --schema-only --no-tablespaces --no-acl --no-owner synapse_unported > unported.sql
|
|
||||||
pg_dump --format=plain --schema-only --no-tablespaces --no-acl --no-owner synapse > ported.sql
|
|
||||||
# By default, `diff` returns zero if there are no changes and nonzero otherwise
|
|
||||||
diff -u unported.sql ported.sql | tee schema_diff
|
|
||||||
|
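The final step of the develop-side script leans on `diff`'s exit status: zero when the two schema dumps are identical, nonzero otherwise. A minimal sketch of that check, using two plain text files in place of real `pg_dump` output:

```shell
#!/bin/sh
# Two stand-in "dumps" in place of real pg_dump output.
printf 'CREATE TABLE a (x int);\n' > unported.sql
printf 'CREATE TABLE a (x int);\n' > ported.sql

# `diff -u` exits 0 when the dumps are identical and nonzero otherwise;
# the real script additionally pipes through `tee schema_diff` to keep a copy.
if diff -u unported.sql ported.sql > schema_diff; then
    echo "schemas match"
fi
```

Because the dumps here are identical, `schema_diff` ends up empty and the success branch runs.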
.dockerignore
@@ -3,16 +3,11 @@

 # things to include
 !docker
+!scripts
 !synapse
-!rust
+!MANIFEST.in
 !README.rst
-!pyproject.toml
+!setup.py
-!poetry.lock
+!synctl
-!Cargo.lock
-!Cargo.toml
-!build_rust.py
-
-rust/target
-synapse/*.so

 **/__pycache__
.editorconfig
@@ -4,7 +4,6 @@
 root = true

 # 4 space indentation
-[*.{py,pyi}]
+[*.py]
 indent_style = space
 indent_size = 4
-max_line_length = 88
.git-blame-ignore-revs
@@ -1,28 +1,8 @@
-# Commits in this file will be removed from GitHub blame results.
-#
-# To use this file locally, use:
-#   git blame --ignore-revs-file="path/to/.git-blame-ignore-revs" <files>
-#
-# or configure the `blame.ignoreRevsFile` option in your git config.
-#
-# If ignoring a pull request that was not squash merged, only the merge
-# commit needs to be put here. Child commits will be resolved from it.
-
-# Run black (https://github.com/matrix-org/synapse/pull/3679).
-8b3d9b6b199abb87246f982d5db356f1966db925
-
-# Black reformatting (https://github.com/matrix-org/synapse/pull/5482).
+# Black reformatting (#5482).
 32e7c9e7f20b57dd081023ac42d6931a8da9b3a3

-# Target Python 3.5 with black (https://github.com/matrix-org/synapse/pull/8664).
+# Target Python 3.5 with black (#8664).
 aff1eb7c671b0a3813407321d2702ec46c71fa56

-# Update black to 20.8b1 (https://github.com/matrix-org/synapse/pull/9381).
+# Update black to 20.8b1 (#9381).
 0a00b7ff14890987f09112a2ae696c61001e6cf1

-# Convert tests/rest/admin/test_room.py to unix file endings (https://github.com/matrix-org/synapse/pull/7953).
-c4268e3da64f1abb5b31deaeb5769adb6510c0a7
-
-# Update black to 23.1.0 (https://github.com/matrix-org/synapse/pull/15103)
-9bb2eac71962970d02842bca441f4bcdbbf93a11
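Develop's version of the file documents its own usage in the header comments. As a self-contained sketch of that mechanism, in a throwaway repository with hypothetical contents:

```shell
#!/bin/sh
set -e
# Build a throwaway repo with one real change and one reformat-only commit.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email dev@example.com
git config user.name dev
echo 'hello' > file.txt
git add file.txt && git commit -qm 'initial'
original=$(git rev-parse HEAD)

echo 'hello  ' > file.txt   # whitespace-only "reformat"
git commit -qam 'reformat'
git rev-parse HEAD > .git-blame-ignore-revs

# Option 1: pass the ignore file on a single blame invocation.
git blame --ignore-revs-file=.git-blame-ignore-revs file.txt

# Option 2: configure it once so every blame in this clone applies it.
git config blame.ignoreRevsFile .git-blame-ignore-revs
git blame file.txt
```

With the reformat commit ignored, blame attributes the line to the original commit instead of the cosmetic one.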
.github/CODEOWNERS (2 lines changed)
@@ -1,2 +1,2 @@
 # Automatically request reviews from the synapse-core team when a pull request comes in.
-* @element-hq/synapse-core
+* @matrix-org/synapse-core
.github/FUNDING.yml (new file, 4 lines)
@@ -0,0 +1,4 @@
+# One username per supported platform and one custom link
+patreon: matrixdotorg
+liberapay: matrixdotorg
+custom: https://paypal.me/matrixdotorg
.github/ISSUE_TEMPLATE.md (2 lines changed)
@@ -2,4 +2,4 @@
 (using a matrix.org account if necessary). We do not use GitHub issues for
 support.

-**If you want to report a security issue** please see https://element.io/security/security-disclosure-policy
+**If you want to report a security issue** please see https://matrix.org/security-disclosure-policy/
.github/ISSUE_TEMPLATE/BUG_REPORT.md (new file, 72 lines)
@@ -0,0 +1,72 @@
+---
+name: Bug report
+about: Create a report to help us improve
+
+---
+
+<!--
+
+**THIS IS NOT A SUPPORT CHANNEL!**
+**IF YOU HAVE SUPPORT QUESTIONS ABOUT RUNNING OR CONFIGURING YOUR OWN HOME SERVER**,
+please ask in **#synapse:matrix.org** (using a matrix.org account if necessary)
+
+If you want to report a security issue, please see https://matrix.org/security-disclosure-policy/
+
+This is a bug report template. By following the instructions below and
+filling out the sections with your information, you will help the us to get all
+the necessary data to fix your issue.
+
+You can also preview your report before submitting it. You may remove sections
+that aren't relevant to your particular case.
+
+Text between <!-- and --> marks will be invisible in the report.
+
+-->
+
+### Description
+
+<!-- Describe here the problem that you are experiencing -->
+
+### Steps to reproduce
+
+- list the steps
+- that reproduce the bug
+- using hyphens as bullet points
+
+<!--
+Describe how what happens differs from what you expected.
+
+If you can identify any relevant log snippets from _homeserver.log_, please include
+those (please be careful to remove any personal or private data). Please surround them with
+``` (three backticks, on a line on their own), so that they are formatted legibly.
+-->
+
+### Version information
+
+<!-- IMPORTANT: please answer the following questions, to help us narrow down the problem -->
+
+<!-- Was this issue identified on matrix.org or another homeserver? -->
+- **Homeserver**:
+
+If not matrix.org:
+
+<!--
+What version of Synapse is running?
+
+You can find the Synapse version with this command:
+
+    $ curl http://localhost:8008/_synapse/admin/v1/server_version
+
+(You may need to replace `localhost:8008` if Synapse is not configured to
+listen on that port.)
+-->
+- **Version**:
+
+- **Install method**:
+<!-- examples: package manager/git clone/pip -->
+
+- **Platform**:
+<!--
+Tell us about the environment in which your homeserver is operating
+distro, hardware, if it's running in a vm/container, etc.
+-->
.github/ISSUE_TEMPLATE/BUG_REPORT.yml (144 lines removed)
@@ -1,144 +0,0 @@
-name: Bug report
-description: Create a report to help us improve
-body:
-  - type: markdown
-    attributes:
-      value: |
-        **THIS IS NOT A SUPPORT CHANNEL!**
-        **IF YOU HAVE SUPPORT QUESTIONS ABOUT RUNNING OR CONFIGURING YOUR OWN HOME SERVER**, please ask in **[#synapse:matrix.org](https://matrix.to/#/#synapse:matrix.org)** (using a matrix.org account if necessary).
-
-        If you want to report a security issue, please see https://element.io/security/security-disclosure-policy
-
-        This is a bug report form. By following the instructions below and completing the sections with your information, you will help the us to get all the necessary data to fix your issue.
-
-        You can also preview your report before submitting it.
-  - type: textarea
-    id: description
-    attributes:
-      label: Description
-      description: Describe the problem that you are experiencing
-    validations:
-      required: true
-  - type: textarea
-    id: reproduction_steps
-    attributes:
-      label: Steps to reproduce
-      description: |
-        Describe the series of steps that leads you to the problem.
-
-        Describe how what happens differs from what you expected.
-      placeholder: Tell us what you see!
-      value: |
-        - list the steps
-        - that reproduce the bug
-        - using hyphens as bullet points
-    validations:
-      required: true
-  - type: markdown
-    attributes:
-      value: |
-        ---
-
-        **IMPORTANT**: please answer the following questions, to help us narrow down the problem.
-  - type: input
-    id: homeserver
-    attributes:
-      label: Homeserver
-      description: Which homeserver was this issue identified on? (matrix.org, another homeserver, etc)
-    validations:
-      required: true
-  - type: input
-    id: version
-    attributes:
-      label: Synapse Version
-      description: |
-        What version of Synapse is this homeserver running?
-
-        You can find the Synapse version by visiting https://yourserver.example.com/_matrix/federation/v1/version
-
-        or with this command:
-
-        ```
-        $ curl http://localhost:8008/_synapse/admin/v1/server_version
-        ```
-
-        (You may need to replace `localhost:8008` if Synapse is not configured to listen on that port.)
-    validations:
-      required: true
-  - type: dropdown
-    id: install_method
-    attributes:
-      label: Installation Method
-      options:
-        - Docker (matrixdotorg/synapse)
-        - Debian packages from packages.matrix.org
-        - pip (from PyPI)
-        - Other (please mention below)
-        - I don't know
-    validations:
-      required: true
-  - type: input
-    id: database
-    attributes:
-      label: Database
-      description: |
-        Are you using SQLite or PostgreSQL? What's the version of your database?
-
-        If PostgreSQL, please also answer the following:
-         - are you using a single PostgreSQL server
-           or [separate servers for `main` and `state`](https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html#databases)?
-         - have you previously ported from SQLite using the Synapse "portdb" script?
-         - have you previously restored from a backup?
-    validations:
-      required: true
-  - type: dropdown
-    id: workers
-    attributes:
-      label: Workers
-      description: |
-        Are you running a single Synapse process, or are you running
-        [2 or more workers](https://element-hq.github.io/synapse/latest/workers.html)?
-      options:
-        - Single process
-        - Multiple workers
-        - I don't know
-    validations:
-      required: true
-  - type: textarea
-    id: platform
-    attributes:
-      label: Platform
-      description: |
-        Tell us about the environment in which your homeserver is operating...
-        e.g. distro, hardware, if it's running in a vm/container, etc.
-    validations:
-      required: true
-  - type: textarea
-    id: config
-    attributes:
-      label: Configuration
-      description: |
-        Do you have any unusual config options turned on? If so, please provide details.
-
-        - Experimental or undocumented features
-        - [Presence](https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html#presence)
-        - [Message retention](https://element-hq.github.io/synapse/latest/message_retention_policies.html)
-        - [Synapse modules](https://element-hq.github.io/synapse/latest/modules/index.html)
-  - type: textarea
-    id: logs
-    attributes:
-      label: Relevant log output
-      description: |
-        Please copy and paste any relevant log output as text (not images), ideally at INFO or DEBUG log level.
-        This will be automatically formatted into code, so there is no need for backticks (`\``).
-
-        Please be careful to remove any personal or private data.
-
-        **Bug reports are usually impossible to diagnose without logging.**
-      render: shell
-    validations:
-      required: true
-  - type: textarea
-    id: anything_else
-    attributes:
-      label: Anything else that would be useful to know?
.github/PULL_REQUEST_TEMPLATE.md
@@ -1,12 +1,13 @@
 ### Pull Request Checklist

-<!-- Please read https://element-hq.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request -->
+<!-- Please read https://matrix-org.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request -->

 * [ ] Pull request is based on the develop branch
-* [ ] Pull request includes a [changelog file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should:
+* [ ] Pull request includes a [changelog file](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should:
   - Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
   - Use markdown where necessary, mostly for `code blocks`.
   - End with either a period (.) or an exclamation mark (!).
   - Start with a capital letter.
-  - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
-* [ ] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
+* [ ] Pull request includes a [sign off](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#sign-off)
+* [ ] [Code style](https://matrix-org.github.io/synapse/latest/code_style.html) is correct
+  (run the [linters](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
.github/dependabot.yml (23 lines removed)
@@ -1,23 +0,0 @@
-version: 2
-updates:
-  - # "pip" is the correct setting for poetry, per https://docs.github.com/en/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file#package-ecosystem
-    package-ecosystem: "pip"
-    directory: "/"
-    schedule:
-      interval: "weekly"
-
-  - package-ecosystem: "docker"
-    directory: "/docker"
-    schedule:
-      interval: "weekly"
-
-  - package-ecosystem: "github-actions"
-    directory: "/"
-    schedule:
-      interval: "weekly"
-
-  - package-ecosystem: "cargo"
-    directory: "/"
-    versioning-strategy: "lockfile-only"
-    schedule:
-      interval: "weekly"
.github/workflows/docker.yml
@@ -10,146 +10,66 @@ on:

 permissions:
   contents: read
-  packages: write
-  id-token: write # needed for signing the images with GitHub OIDC Token

 jobs:
   build:
-    name: Build and push image for ${{ matrix.platform }}
-    runs-on: ${{ matrix.runs_on }}
-    strategy:
-      matrix:
-        include:
-          - platform: linux/amd64
-            runs_on: ubuntu-24.04
-            suffix: linux-amd64
-          - platform: linux/arm64
-            runs_on: ubuntu-24.04-arm
-            suffix: linux-arm64
+    runs-on: ubuntu-latest
     steps:
+      - name: Set up QEMU
+        id: qemu
+        uses: docker/setup-qemu-action@v1
+        with:
+          platforms: arm64
+
       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+        uses: docker/setup-buildx-action@v1

-      - name: Checkout repository
-        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-
-      - name: Extract version from pyproject.toml
-        # Note: explicitly requesting bash will mean bash is invoked with `-eo pipefail`, see
-        # https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsshell
-        shell: bash
-        run: |
-          echo "SYNAPSE_VERSION=$(grep "^version" pyproject.toml | sed -E 's/version\s*=\s*["]([^"]*)["]/\1/')" >> $GITHUB_ENV
+      - name: Inspect builder
+        run: docker buildx inspect

       - name: Log in to DockerHub
-        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
+        uses: docker/login-action@v1
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}

-      - name: Log in to GHCR
-        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
-        with:
-          registry: ghcr.io
-          username: ${{ github.repository_owner }}
-          password: ${{ secrets.GITHUB_TOKEN }}
-
-      - name: Build and push by digest
-        id: build
-        uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
-        with:
-          push: true
-          labels: |
-            gitsha1=${{ github.sha }}
-            org.opencontainers.image.version=${{ env.SYNAPSE_VERSION }}
-          tags: |
-            docker.io/matrixdotorg/synapse
-            ghcr.io/element-hq/synapse
-          file: "docker/Dockerfile"
-          platforms: ${{ matrix.platform }}
-          outputs: type=image,push-by-digest=true,name-canonical=true,push=true
-
-      - name: Export digest
-        run: |
-          mkdir -p ${{ runner.temp }}/digests
-          digest="${{ steps.build.outputs.digest }}"
-          touch "${{ runner.temp }}/digests/${digest#sha256:}"
-
-      - name: Upload digest
-        uses: actions/upload-artifact@v4
-        with:
-          name: digests-${{ matrix.suffix }}
-          path: ${{ runner.temp }}/digests/*
-          if-no-files-found: error
-          retention-days: 1
-
-  merge:
-    name: Push merged images to ${{ matrix.repository }}
-    runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        repository:
-          - docker.io/matrixdotorg/synapse
-          - ghcr.io/element-hq/synapse
-
-    needs:
-      - build
-    steps:
-      - name: Download digests
-        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
-        with:
-          path: ${{ runner.temp }}/digests
-          pattern: digests-*
-          merge-multiple: true
-
-      - name: Log in to DockerHub
-        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
-        if: ${{ startsWith(matrix.repository, 'docker.io') }}
-        with:
-          username: ${{ secrets.DOCKERHUB_USERNAME }}
-          password: ${{ secrets.DOCKERHUB_TOKEN }}
-
-      - name: Log in to GHCR
-        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
-        if: ${{ startsWith(matrix.repository, 'ghcr.io') }}
-        with:
-          registry: ghcr.io
-          username: ${{ github.repository_owner }}
-          password: ${{ secrets.GITHUB_TOKEN }}
-
-      - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
-
-      - name: Install Cosign
-        uses: sigstore/cosign-installer@398d4b0eeef1380460a10c8013a76f728fb906ac # v3.9.1
-
       - name: Calculate docker image tag
-        uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v5.7.0
+        id: set-tag
+        run: |
+          case "${GITHUB_REF}" in
+              refs/heads/develop)
+                  tag=develop
+                  ;;
+              refs/heads/master|refs/heads/main)
+                  tag=latest
+                  ;;
+              refs/tags/*)
+                  tag=${GITHUB_REF#refs/tags/}
+                  ;;
+              *)
+                  tag=${GITHUB_SHA}
+                  ;;
+          esac
+          echo "::set-output name=tag::$tag"
+
+      # for release builds, we want to get the amd64 image out asap, so first
+      # we do an amd64-only build, before following up with a multiarch build.
+      - name: Build and push amd64
+        uses: docker/build-push-action@v2
+        if: "${{ startsWith(github.ref, 'refs/tags/v') }}"
         with:
-          images: ${{ matrix.repository }}
-          flavor: |
-            latest=false
-          tags: |
-            type=raw,value=develop,enable=${{ github.ref == 'refs/heads/develop' }}
-            type=raw,value=latest,enable=${{ github.ref == 'refs/heads/master' }}
-            type=raw,value=latest,enable=${{ github.ref == 'refs/heads/main' }}
-            type=pep440,pattern={{raw}}
-            type=sha
+          push: true
+          labels: "gitsha1=${{ github.sha }}"
+          tags: "matrixdotorg/synapse:${{ steps.set-tag.outputs.tag }}"
+          file: "docker/Dockerfile"
+          platforms: linux/amd64

-      - name: Create manifest list and push
-        working-directory: ${{ runner.temp }}/digests
-        env:
-          REPOSITORY: ${{ matrix.repository }}
-        run: |
-          docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
-            $(printf "$REPOSITORY@sha256:%s " *)
-
-      - name: Sign each manifest
-        env:
-          REPOSITORY: ${{ matrix.repository }}
-        run: |
-          DIGESTS=""
-          for TAG in $(echo "$DOCKER_METADATA_OUTPUT_JSON" | jq -r '.tags[]'); do
-            DIGEST="$(docker buildx imagetools inspect $TAG --format '{{json .Manifest}}' | jq -r '.digest')"
-            DIGESTS="$DIGESTS $REPOSITORY@$DIGEST"
-          done
-          cosign sign --yes $DIGESTS
+      - name: Build and push all platforms
+        uses: docker/build-push-action@v2
+        with:
+          push: true
+          labels: "gitsha1=${{ github.sha }}"
+          tags: "matrixdotorg/synapse:${{ steps.set-tag.outputs.tag }}"
+          file: "docker/Dockerfile"
+          platforms: linux/amd64,linux/arm64
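The removed `set-tag` step maps the triggering git ref onto a Docker tag with a `case` statement. That mapping is pure shell and easy to exercise in isolation (a sketch; `calculate_tag` is a hypothetical name, with the commit SHA passed in as the fallback argument):

```shell
#!/bin/sh
# Map a GitHub ref to the Docker tag for the image, mirroring the case
# statement in the old workflow step: branch refs get fixed names, tag
# refs keep the tag, and anything else falls back to the commit SHA.
calculate_tag() {
    case "$1" in
        refs/heads/develop)                tag=develop ;;
        refs/heads/master|refs/heads/main) tag=latest ;;
        refs/tags/*)                       tag=${1#refs/tags/} ;;
        *)                                 tag=$2 ;;
    esac
    echo "$tag"
}

calculate_tag refs/tags/v1.48.0rc1 abc123   # prints v1.48.0rc1
```

A release tag therefore produces an image like `matrixdotorg/synapse:v1.48.0rc1`, while an untagged push on an arbitrary ref is pinned to its SHA.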
.github/workflows/docs-pr-netlify.yaml (34 lines removed)
@@ -1,34 +0,0 @@
-name: Deploy documentation PR preview
-
-on:
-  workflow_run:
-    workflows: [ "Prepare documentation PR preview" ]
-    types:
-      - completed
-
-jobs:
-  netlify:
-    if: github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.event == 'pull_request'
-    runs-on: ubuntu-latest
-    steps:
-      # There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
-      # (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
-      - name: 📥 Download artifact
-        uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
-        with:
-          workflow: docs-pr.yaml
-          run_id: ${{ github.event.workflow_run.id }}
-          name: book
-          path: book
-
-      - name: 📤 Deploy to Netlify
-        uses: matrix-org/netlify-pr-preview@9805cd123fc9a7e421e35340a05e1ebc5dee46b5 # v3
-        with:
-          path: book
-          owner: ${{ github.event.workflow_run.head_repository.owner.login }}
-          branch: ${{ github.event.workflow_run.head_branch }}
-          revision: ${{ github.event.workflow_run.head_sha }}
-          token: ${{ secrets.NETLIFY_AUTH_TOKEN }}
-          site_id: ${{ secrets.NETLIFY_SITE_ID }}
-          desc: Documentation preview
-          deployment_env: PR Documentation Preview
.github/workflows/docs-pr.yaml (71 lines removed)
@@ -1,71 +0,0 @@
-name: Prepare documentation PR preview
-
-on:
-  pull_request:
-    paths:
-      - docs/**
-      - book.toml
-      - .github/workflows/docs-pr.yaml
-      - scripts-dev/schema_versions.py
-
-jobs:
-  pages:
-    name: GitHub Pages
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-        with:
-          # Fetch all history so that the schema_versions script works.
-          fetch-depth: 0
-
-      - name: Setup mdbook
-        uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
-        with:
-          mdbook-version: '0.4.17'
-
-      - name: Setup python
-        uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
-        with:
-          python-version: "3.x"
-
-      - run: "pip install 'packaging>=20.0' 'GitPython>=3.1.20'"
-
-      - name: Build the documentation
-        # mdbook will only create an index.html if we're including docs/README.md in SUMMARY.md.
-        # However, we're using docs/README.md for other purposes and need to pick a new page
-        # as the default. Let's opt for the welcome page instead.
-        run: |
-          mdbook build
-          cp book/welcome_and_overview.html book/index.html
-
-      - name: Upload Artifact
-        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
-        with:
-          name: book
-          path: book
-          # We'll only use this in a workflow_run, then we're done with it
-          retention-days: 1
-
-  link-check:
-    name: Check links in documentation
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-
-      - name: Setup mdbook
-        uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
-        with:
-          mdbook-version: '0.4.17'
-
-      - name: Setup htmltest
-        run: |
-          wget https://github.com/wjdp/htmltest/releases/download/v0.17.0/htmltest_0.17.0_linux_amd64.tar.gz
-          echo '775c597ee74899d6002cd2d93076f897f4ba68686bceabe2e5d72e84c57bc0fb  htmltest_0.17.0_linux_amd64.tar.gz' | sha256sum -c
-          tar zxf htmltest_0.17.0_linux_amd64.tar.gz
-
-      - name: Test links with htmltest
-        # Build the book with `./` as the site URL (to make checks on 404.html possible)
-        # Then run htmltest (without checking external links since that involves the network and is slow).
-        run: |
-          MDBOOK_OUTPUT__HTML__SITE_URL="./" mdbook build
-          ./htmltest book --skip-external
.github/workflows/docs.yaml
@@ -13,10 +13,25 @@ on:
   workflow_dispatch:

 jobs:
-  pre:
-    name: Calculate variables for GitHub Pages deployment
+  pages:
+    name: GitHub Pages
     runs-on: ubuntu-latest
     steps:
+      - uses: actions/checkout@v2
+
+      - name: Setup mdbook
+        uses: peaceiris/actions-mdbook@4b5ef36b314c2599664ca107bb8c02412548d79d # v1.1.14
+        with:
+          mdbook-version: '0.4.9'
+
+      - name: Build the documentation
+        # mdbook will only create an index.html if we're including docs/README.md in SUMMARY.md.
+        # However, we're using docs/README.md for other purposes and need to pick a new page
+        # as the default. Let's opt for the welcome page instead.
+        run: |
+          mdbook build
+          cp book/welcome_and_overview.html book/index.html
+
       # Figure out the target directory.
       #
       # The target directory depends on the name of the branch
@@ -39,61 +54,12 @@ jobs:
           esac

           # finally, set the 'branch-version' var.
-          echo "branch-version=$branch" >> "$GITHUB_OUTPUT"
-    outputs:
-      branch-version: ${{ steps.vars.outputs.branch-version }}
-
-################################################################################
-  pages-docs:
-    name: GitHub Pages
-    runs-on: ubuntu-latest
-    needs:
-      - pre
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-        with:
-          # Fetch all history so that the schema_versions script works.
-          fetch-depth: 0
-
-      - name: Setup mdbook
-        uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
-        with:
-          mdbook-version: '0.4.17'
-
-      - name: Set version of docs
-        run: echo 'window.SYNAPSE_VERSION = "${{ needs.pre.outputs.branch-version }}";' > ./docs/website_files/version.js
-
-      - name: Setup python
-        uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
-        with:
-          python-version: "3.x"
-
-      - run: "pip install 'packaging>=20.0' 'GitPython>=3.1.20'"
-
-      - name: Build the documentation
-        # mdbook will only create an index.html if we're including docs/README.md in SUMMARY.md.
-        # However, we're using docs/README.md for other purposes and need to pick a new page
-        # as the default. Let's opt for the welcome page instead.
-        run: |
-          mdbook build
-          cp book/welcome_and_overview.html book/index.html
-
-      - name: Prepare and publish schema files
-        run: |
-          sudo apt-get update && sudo apt-get install -y yq
-          mkdir -p book/schema
-          # Remove developer notice before publishing.
-          rm schema/v*/Do\ not\ edit\ files\ in\ this\ folder
-          # Copy schema files that are independent from current Synapse version.
-          cp -r -t book/schema schema/v*/
-          # Convert config schema from YAML source file to JSON.
-          yq < schema/synapse-config.schema.yaml \
-             > book/schema/synapse-config.schema.json
+          echo "::set-output name=branch-version::$branch"

       # Deploy to the target directory.
       - name: Deploy to gh pages
-        uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4.0.0
+        uses: peaceiris/actions-gh-pages@068dc23d9710f1ba62e86896f84735d869951305 # v3.8.0
         with:
           github_token: ${{ secrets.GITHUB_TOKEN }}
           publish_dir: ./book
destination_dir: ./${{ needs.pre.outputs.branch-version }}
|
destination_dir: ./${{ steps.vars.outputs.branch-version }}
|
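The hunk above shows the migration away from the deprecated `::set-output` workflow command. A minimal sketch of the difference, with `$GITHUB_OUTPUT` simulated by a temporary file since we are not on a runner:

```shell
# The runner normally provides $GITHUB_OUTPUT; simulate it here.
GITHUB_OUTPUT="$(mktemp)"

branch="v1.48"

# Deprecated form (the runner parsed this from stdout):
#   echo "::set-output name=branch-version::$branch"

# Current form: append key=value to the runner-provided output file.
echo "branch-version=$branch" >> "$GITHUB_OUTPUT"

result="$(cat "$GITHUB_OUTPUT")"
echo "$result"
```

Downstream steps then read the value as `steps.<id>.outputs.branch-version`, exactly as the deploy step does.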
.github/workflows/fix_lint.yaml (vendored) — @@ -1,52 +0,0 @@ (file exists on develop only):

# A helper workflow to automatically fixup any linting errors on a PR. Must be
# triggered manually.

name: Attempt to automatically fix linting errors

on:
  workflow_dispatch:

env:
  # We use nightly so that `fmt` correctly groups together imports, and
  # clippy correctly fixes up the benchmarks.
  RUST_VERSION: nightly-2025-06-24

jobs:
  fixup:
    name: Fix up
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
          components: clippy, rustfmt
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - name: Setup Poetry
        uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          install-project: "false"
          poetry-version: "2.1.1"

      - name: Run ruff check
        continue-on-error: true
        run: poetry run ruff check --fix .

      - name: Run ruff format
        continue-on-error: true
        run: poetry run ruff format --quiet .

      - run: cargo clippy --all-features --fix -- -D warnings
        continue-on-error: true

      - run: cargo fmt
        continue-on-error: true

      - uses: stefanzweifel/git-auto-commit-action@778341af668090896ca464160c2def5d1d1a3eb0 # v6.0.1
        with:
          commit_message: "Attempt to fix linting"
.github/workflows/latest_deps.yml (vendored) — 243 lines, file exists on develop only:

# People who are freshly `pip install`ing from PyPI will pull in the latest versions of
# dependencies which match the broad requirements. Since most CI runs are against
# the locked poetry environment, run specifically against the latest dependencies to
# know if there's an upcoming breaking change.
#
# As an overview this workflow:
# - checks out develop,
# - installs from source, pulling in the dependencies like a fresh `pip install` would, and
# - runs mypy and test suites in that checkout.
#
# Based on the twisted trunk CI job.

name: Latest dependencies

on:
  schedule:
    - cron: 0 7 * * *
  workflow_dispatch:

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

env:
  RUST_VERSION: 1.87.0

jobs:
  check_repo:
    # Prevent this workflow from running on any fork of Synapse other than element-hq/synapse, as it is
    # only useful to the Synapse core team.
    # All other workflow steps depend on this one, thus if 'should_run_workflow' is not 'true', the rest
    # of the workflow will be skipped as well.
    runs-on: ubuntu-latest
    outputs:
      should_run_workflow: ${{ steps.check_condition.outputs.should_run_workflow }}
    steps:
      - id: check_condition
        run: echo "should_run_workflow=${{ github.repository == 'element-hq/synapse' }}" >> "$GITHUB_OUTPUT"

  mypy:
    needs: check_repo
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      # The dev dependencies aren't exposed in the wheel metadata (at least with current
      # poetry-core versions), so we install with poetry.
      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          python-version: "3.x"
          poetry-version: "2.1.1"
          extras: "all"
      # Dump installed versions for debugging.
      - run: poetry run pip list > before.txt
      # Upgrade all runtime dependencies only. This is intended to mimic a fresh
      # `pip install matrix-synapse[all]` as closely as possible.
      - run: poetry update --without dev
      - run: poetry run pip list > after.txt && (diff -u before.txt after.txt || true)
      - name: Remove unhelpful options from mypy config
        run: sed -e '/warn_unused_ignores = True/d' -e '/warn_redundant_casts = True/d' -i mypy.ini
      - run: poetry run mypy

  trial:
    needs: check_repo
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - database: "sqlite"
          - database: "postgres"
            postgres-version: "14"

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - run: sudo apt-get -qq install xmlsec1
      - name: Set up PostgreSQL ${{ matrix.postgres-version }}
        if: ${{ matrix.postgres-version }}
        run: |
          docker run -d -p 5432:5432 \
            -e POSTGRES_PASSWORD=postgres \
            -e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
            postgres:${{ matrix.postgres-version }}
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - run: pip install .[all,test]
      - name: Await PostgreSQL
        if: ${{ matrix.postgres-version }}
        timeout-minutes: 2
        run: until pg_isready -h localhost; do sleep 1; done

      # We nuke the local copy, as we've installed synapse into the virtualenv
      # (rather than use an editable install, which we no longer support). If we
      # don't do this then python can't find the native lib.
      - run: rm -rf synapse/

      - run: python -m twisted.trial --jobs=2 tests
        env:
          SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
          SYNAPSE_POSTGRES_HOST: localhost
          SYNAPSE_POSTGRES_USER: postgres
          SYNAPSE_POSTGRES_PASSWORD: postgres
      - name: Dump logs
        # Logs are most useful when the command fails, always include them.
        if: ${{ always() }}
        # Note: Dumps to workflow logs instead of using actions/upload-artifact
        #       This keeps logs colocated with failing jobs
        #       It also ignores find's exit code; this is a best effort affair
        run: >-
          find _trial_temp -name '*.log'
          -exec echo "::group::{}" \;
          -exec cat {} \;
          -exec echo "::endgroup::" \;
          || true


  sytest:
    needs: check_repo
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest
    container:
      image: matrixdotorg/sytest-synapse:testing
      volumes:
        - ${{ github.workspace }}:/src
    strategy:
      fail-fast: false
      matrix:
        include:
          - sytest-tag: bullseye

          - sytest-tag: bullseye
            postgres: postgres
            workers: workers
            redis: redis
    env:
      POSTGRES: ${{ matrix.postgres && 1}}
      WORKERS: ${{ matrix.workers && 1 }}
      REDIS: ${{ matrix.redis && 1 }}
      BLACKLIST: ${{ matrix.workers && 'synapse-blacklist-with-workers' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - name: Ensure sytest runs `pip install`
        # Delete the lockfile so sytest will `pip install` rather than `poetry install`
        run: rm /src/poetry.lock
        working-directory: /src
      - name: Prepare test blacklist
        run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
      - name: Run SyTest
        run: /bootstrap.sh synapse
        working-directory: /src
      - name: Summarise results.tap
        if: ${{ always() }}
        run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
      - name: Upload SyTest logs
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        if: ${{ always() }}
        with:
          name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
          path: |
            /logs/results.tap
            /logs/**/*.log*


  complement:
    needs: check_repo
    if: "!failure() && !cancelled() && needs.check_repo.outputs.should_run_workflow == 'true'"
    runs-on: ubuntu-latest

    strategy:
      fail-fast: false
      matrix:
        include:
          - arrangement: monolith
            database: SQLite

          - arrangement: monolith
            database: Postgres

          - arrangement: workers
            database: Postgres

    steps:
      - name: Check out synapse codebase
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          path: synapse

      - name: Prepare Complement's Prerequisites
        run: synapse/.ci/scripts/setup_complement_prerequisites.sh

      - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          cache-dependency-path: complement/go.sum
          go-version-file: complement/go.mod

      - run: |
          set -o pipefail
          TEST_ONLY_IGNORE_POETRY_LOCKFILE=1 POSTGRES=${{ (matrix.database == 'Postgres') && 1 || '' }} WORKERS=${{ (matrix.arrangement == 'workers') && 1 || '' }} COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -json 2>&1 | synapse/.ci/scripts/gotestfmt
        shell: bash
        name: Run Complement Tests

  # Open an issue if the build fails, so we know about it.
  # Only do this if we're not experimenting with this action in a PR.
  open-issue:
    if: "failure() && github.event_name != 'push' && github.event_name != 'pull_request' && needs.check_repo.outputs.should_run_workflow == 'true'"
    needs:
      # TODO: should mypy be included here? It feels more brittle than the others.
      - mypy
      - trial
      - sytest
      - complement

    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: JasonEtco/create-an-issue@1b14a70e4d8dc185e5cc76d3bec9eab20257b2c5 # v2.9.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          update_existing: true
          filename: .ci/latest_deps_build_failed_issue_template.md
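The "Await PostgreSQL" step above polls a readiness probe until the container accepts connections, bounded by `timeout-minutes: 2`. A minimal sketch of the same pattern, bounded here by a retry counter and with `probe` standing in for `pg_isready -h localhost`:

```shell
# Stand-in for `pg_isready -h localhost`; succeeds immediately in this sketch.
probe() { true; }

attempts=0
until probe; do
  attempts=$((attempts + 1))
  if [ "$attempts" -ge 120 ]; then
    echo "timed out waiting for database" >&2
    exit 1
  fi
  sleep 1
done
echo "ready after $attempts retries"
```

The workflow version omits the counter because GitHub Actions kills the step itself once `timeout-minutes` elapses.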
.github/workflows/poetry_lockfile.yaml (vendored) — 24 lines, file exists on develop only:

on:
  push:
    branches: ["develop", "release-*"]
    paths:
      - poetry.lock
  pull_request:
    paths:
      - poetry.lock

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  check-sdists:
    name: "Check locked dependencies have sdists"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: '3.x'
      - run: pip install tomli
      - run: ./scripts-dev/check_locked_deps_have_sdists.py
.github/workflows/push_complement_image.yml (vendored) — 74 lines, file exists on develop only:

# This task does not run complement tests, see tests.yaml instead.
# This task does not build docker images for synapse for use on docker hub, see docker.yaml instead

name: Store complement-synapse image in ghcr.io
on:
  push:
    branches: [ "master" ]
  schedule:
    - cron: '0 5 * * *'
  workflow_dispatch:
    inputs:
      branch:
        required: true
        default: 'develop'
        type: choice
        options:
          - develop
          - master

# Only run this action once per pull request/branch; restart if a new commit arrives.
# C.f. https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#concurrency
# and https://docs.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#github-context
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  build:
    name: Build and push complement image
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - name: Checkout specific branch (debug build)
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        if: github.event_name == 'workflow_dispatch'
        with:
          ref: ${{ inputs.branch }}
      - name: Checkout clean copy of develop (scheduled build)
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        if: github.event_name == 'schedule'
        with:
          ref: develop
      - name: Checkout clean copy of master (on-push)
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        if: github.event_name == 'push'
        with:
          ref: master
      - name: Login to registry
        uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Work out labels for complement image
        id: meta
        uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804 # v5.7.0
        with:
          images: ghcr.io/${{ github.repository }}/complement-synapse
          tags: |
            type=schedule,pattern=nightly,enable=${{ github.event_name == 'schedule'}}
            type=raw,value=develop,enable=${{ github.event_name == 'schedule' || inputs.branch == 'develop' }}
            type=raw,value=latest,enable=${{ github.event_name == 'push' || inputs.branch == 'master' }}
            type=sha,format=long
      - name: Run scripts-dev/complement.sh to generate complement-synapse:latest image.
        run: scripts-dev/complement.sh --build-only
      - name: Tag and push generated image
        run: |
          for TAG in ${{ join(fromJson(steps.meta.outputs.json).tags, ' ') }}; do
            echo "tag and push $TAG"
            docker tag complement-synapse $TAG
            docker push $TAG
          done
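The final step above expands the tag list computed by docker/metadata-action (its JSON output's `.tags` array, joined with spaces by the workflow expression) and applies each tag to the locally built image. A runnable sketch of just the loop logic, with assumed example tags and the docker commands echoed rather than executed:

```shell
# Assumed example of what the joined metadata-action tag list looks like.
TAGS="ghcr.io/example/complement-synapse:develop ghcr.io/example/complement-synapse:nightly"

pushed=0
for TAG in $TAGS; do
  # In the workflow this is: docker tag complement-synapse $TAG && docker push $TAG
  echo "tag and push $TAG"
  pushed=$((pushed + 1))
done
echo "pushed=$pushed"
```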
.github/workflows/release-artifacts.yml (vendored) — 137 changed lines ("-" = develop, "+" = v1.48.0rc1):

@@ -4,16 +4,13 @@ name: Build release artifacts

 on:
   # we build on PRs and develop to (hopefully) get early warning
-  # of things breaking (but only build one set of debs). PRs skip
-  # building wheels on macOS & ARM.
+  # of things breaking (but only build one set of debs)
   pull_request:
   push:
-    branches: ["develop", "release-*"]
+    branches: ["develop"]

     # we do the full build on tags.
     tags: ["v*"]
-  merge_group:
-  workflow_dispatch:

 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
@@ -27,19 +24,16 @@ jobs:
     name: "Calculate list of debian distros"
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+      - uses: actions/checkout@v2
-      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
+      - uses: actions/setup-python@v2
-        with:
-          python-version: "3.x"
       - id: set-distros
         run: |
           # if we're running from a tag, get the full list of distros; otherwise just use debian:sid
-          # NOTE: inside the actual Dockerfile-dhvirtualenv, the image name is expanded into its full image path
           dists='["debian:sid"]'
           if [[ $GITHUB_REF == refs/tags/* ]]; then
-            dists=$(scripts-dev/build_debian_packages.py --show-dists-json)
+            dists=$(scripts-dev/build_debian_packages --show-dists-json)
           fi
-          echo "distros=$dists" >> "$GITHUB_OUTPUT"
+          echo "::set-output name=distros::$dists"
     # map the step outputs to job outputs
     outputs:
       distros: ${{ steps.set-distros.outputs.distros }}
@@ -55,18 +49,18 @@ jobs:

     steps:
       - name: Checkout
-        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+        uses: actions/checkout@v2
         with:
           path: src

       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+        uses: docker/setup-buildx-action@v1
         with:
           install: true

       - name: Set up docker layer caching
-        uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
+        uses: actions/cache@v2
         with:
           path: /tmp/.buildx-cache
           key: ${{ runner.os }}-buildx-${{ github.sha }}
@@ -74,15 +68,13 @@ jobs:
           ${{ runner.os }}-buildx-

       - name: Set up python
-        uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
+        uses: actions/setup-python@v2
-        with:
-          python-version: "3.x"

       - name: Build the packages
         # see https://github.com/docker/build-push-action/issues/252
         # for the cache magic here
         run: |
-          ./src/scripts-dev/build_debian_packages.py \
+          ./src/scripts-dev/build_debian_packages \
            --docker-build-arg=--cache-from=type=local,src=/tmp/.buildx-cache \
            --docker-build-arg=--cache-to=type=local,mode=max,dest=/tmp/.buildx-cache-new \
            --docker-build-arg=--progress=plain \
@@ -91,94 +83,25 @@ jobs:
           rm -rf /tmp/.buildx-cache
           mv /tmp/.buildx-cache-new /tmp/.buildx-cache

-      - name: Artifact name
-        id: artifact-name
-        # We can't have colons in the upload name of the artifact, so we convert
-        # e.g. `debian:sid` to `sid`.
-        env:
-          DISTRO: ${{ matrix.distro }}
-        run: |
-          echo "ARTIFACT_NAME=${DISTRO#*:}" >> "$GITHUB_OUTPUT"
-
       - name: Upload debs as artifacts
-        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
+        uses: actions/upload-artifact@v2
         with:
-          name: debs-${{ steps.artifact-name.outputs.ARTIFACT_NAME }}
+          name: debs
           path: debs/*

-  build-wheels:
-    name: Build wheels on ${{ matrix.os }}
-    runs-on: ${{ matrix.os }}
-    strategy:
-      matrix:
-        os:
-          - ubuntu-24.04
-          - ubuntu-24.04-arm
-          - macos-13 # This uses x86-64
-          - macos-14 # This uses arm64
-        # is_pr is a flag used to exclude certain jobs from the matrix on PRs.
-        # It is not read by the rest of the workflow.
-        is_pr:
-          - ${{ startsWith(github.ref, 'refs/pull/') }}
-
-        exclude:
-          # Don't build macos wheels on PR CI.
-          - is_pr: true
-            os: "macos-13"
-          - is_pr: true
-            os: "macos-14"
-          # Don't build aarch64 wheels on PR CI.
-          - is_pr: true
-            os: "ubuntu-24.04-arm"
-
-    steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
-
-      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
-        with:
-          # setup-python@v4 doesn't impose a default python version. Need to use 3.x
-          # here, because `python` on osx points to Python 2.7.
-          python-version: "3.x"
-
-      - name: Install cibuildwheel
-        run: python -m pip install cibuildwheel==3.0.0
-
-      - name: Only build a single wheel on PR
-        if: startsWith(github.ref, 'refs/pull/')
-        run: echo "CIBW_BUILD="cp39-manylinux_*"" >> $GITHUB_ENV
-
-      - name: Build wheels
-        run: python -m cibuildwheel --output-dir wheelhouse
-        env:
-          # Skip testing for platforms which various libraries don't have wheels
-          # for, and so need extra build deps.
-          CIBW_TEST_SKIP: pp3*-* *i686* *musl*
-
-      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
-        with:
-          name: Wheel-${{ matrix.os }}
-          path: ./wheelhouse/*.whl
-
   build-sdist:
-    name: Build sdist
+    name: "Build pypi distribution files"
     runs-on: ubuntu-latest
-    if: ${{ !startsWith(github.ref, 'refs/pull/') }}

     steps:
-      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
+      - uses: actions/checkout@v2
-      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
-        with:
-          python-version: "3.10"
-      - run: pip install build
-
-      - name: Build sdist
-        run: python -m build --sdist
-
-      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
-        with:
-          name: Sdist
-          path: dist/*.tar.gz
+      - uses: actions/setup-python@v2
+      - run: pip install wheel
+      - run: |
+          python setup.py sdist bdist_wheel
+      - uses: actions/upload-artifact@v2
+        with:
+          name: python-dist
+          path: dist/*

   # if it's a tag, create a release and attach the artifacts to it
   attach-assets:
@@ -186,28 +109,20 @@ jobs:
     if: ${{ !failure() && !cancelled() && startsWith(github.ref, 'refs/tags/') }}
     needs:
       - build-debs
-      - build-wheels
       - build-sdist
     runs-on: ubuntu-latest
     steps:
       - name: Download all workflow run artifacts
-        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
+        uses: actions/download-artifact@v2
       - name: Build a tarball for the debs
-        # We need to merge all the debs uploads into one folder, then compress
-        # that.
-        run: |
-          mkdir debs
-          mv debs*/* debs/
-          tar -cvJf debs.tar.xz debs
+        run: tar -cvJf debs.tar.xz debs
       - name: Attach to release
-        # Pinned to work around https://github.com/softprops/action-gh-release/issues/445
-        uses: softprops/action-gh-release@a929a66f232c1b11af63782948aa2210f981808a # PR#109
+        uses: softprops/action-gh-release@c95fe1489396fe8a9eb87c0abf8aa5b2ef267fda # v0.1.15
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         with:
          files: |
-            Sdist/*
-            Wheel*/*
+            python-dist/*
            debs.tar.xz
          # if it's not already published, keep the release as a draft.
          draft: true
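The develop-side "Artifact name" step uses POSIX parameter expansion to strip the registry prefix from a distro image name, since artifact names may not contain colons. A minimal sketch:

```shell
DISTRO="debian:sid"          # as set from matrix.distro in the workflow

# Drop everything up to and including the first ':' (shortest-prefix removal).
ARTIFACT_NAME="${DISTRO#*:}"

echo "$ARTIFACT_NAME"
```

The resulting name (`sid` here) is then used to make each matrix job's artifact upload unique (`debs-sid`, and so on).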
.github/workflows/schema.yaml (vendored) — 57 lines, file exists on develop only:

name: Schema

on:
  pull_request:
    paths:
      - schema/**
      - docs/usage/configuration/config_documentation.md
  push:
    branches: ["develop", "release-*"]
  workflow_dispatch:

jobs:
  validate-schema:
    name: Ensure Synapse config schema is valid
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - name: Install check-jsonschema
        run: pip install check-jsonschema==0.33.0

      - name: Validate meta schema
        run: check-jsonschema --check-metaschema schema/v*/meta.schema.json
      - name: Validate schema
        run: |-
          # Please bump on introduction of a new meta schema.
          LATEST_META_SCHEMA_VERSION=v1
          check-jsonschema \
            --schemafile="schema/$LATEST_META_SCHEMA_VERSION/meta.schema.json" \
            schema/synapse-config.schema.yaml
      - name: Validate default config
        # Populates the empty instance with default values and checks against the schema.
        run: |-
          echo "{}" | check-jsonschema \
            --fill-defaults --schemafile=schema/synapse-config.schema.yaml -

  check-doc-generation:
    name: Ensure generated documentation is up-to-date
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - name: Install PyYAML
        run: pip install PyYAML==6.0.2

      - name: Regenerate config documentation
        run: |
          scripts-dev/gen_config_documentation.py \
            schema/synapse-config.schema.yaml \
            > docs/usage/configuration/config_documentation.md
      - name: Error in case of any differences
        # Errors if there are now any modified files (untracked files are ignored).
        run: 'git diff --exit-code'
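The `check-doc-generation` job above relies on `git diff --exit-code` returning non-zero when any tracked file was modified by the regeneration step. A self-contained sketch of that drift check, run in a throwaway repository:

```shell
# Build a throwaway repo with one committed "generated" file.
repo="$(mktemp -d)"
cd "$repo"
git init -q
echo "generated" > doc.md
git add doc.md
git -c user.email=ci@example.invalid -c user.name=ci commit -q -m docs

# "Regenerate" the file; identical output leaves the tree clean,
# so `git diff --exit-code` succeeds and the CI job passes.
echo "generated" > doc.md
if git diff --exit-code --quiet; then status=clean; else status=drift; fi
echo "status=$status"
```

If the regenerated documentation differed from what was committed, the diff would be non-empty and the job would fail, forcing contributors to commit the regenerated file.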
762
.github/workflows/tests.yml
vendored
762
.github/workflows/tests.yml
vendored
@@ -4,392 +4,121 @@
on:
  push:
    branches: ["develop", "release-*"]
  pull_request:
  merge_group:
  workflow_dispatch:

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

env:
  RUST_VERSION: 1.87.0

jobs:
  # Job to detect what has changed so we don't run e.g. Rust checks on PRs that
  # don't modify Rust code.
  changes:
    runs-on: ubuntu-latest
    outputs:
      rust: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.rust }}
      trial: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.trial }}
      integration: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.integration }}
      linting: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.linting }}
      linting_readme: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.linting_readme }}
    steps:
      - uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
        id: filter
        # We only check on PRs
        if: startsWith(github.ref, 'refs/pull/')
        with:
          filters: |
            rust:
              - 'rust/**'
              - 'Cargo.toml'
              - 'Cargo.lock'
              - '.rustfmt.toml'
              - '.github/workflows/tests.yml'

            trial:
              - 'synapse/**'
              - 'tests/**'
              - 'rust/**'
              - '.ci/scripts/calculate_jobs.py'
              - 'Cargo.toml'
              - 'Cargo.lock'
              - 'pyproject.toml'
              - 'poetry.lock'
              - '.github/workflows/tests.yml'

            integration:
              - 'synapse/**'
              - 'rust/**'
              - 'docker/**'
              - 'Cargo.toml'
              - 'Cargo.lock'
              - 'pyproject.toml'
              - 'poetry.lock'
              - 'docker/**'
              - '.ci/**'
              - 'scripts-dev/complement.sh'
              - '.github/workflows/tests.yml'

            linting:
              - 'synapse/**'
              - 'docker/**'
              - 'tests/**'
              - 'scripts-dev/**'
              - 'contrib/**'
              - 'synmark/**'
              - 'stubs/**'
              - '.ci/**'
              - 'mypy.ini'
              - 'pyproject.toml'
              - 'poetry.lock'
              - '.github/workflows/tests.yml'

            linting_readme:
              - 'README.rst'
  check-sampleconfig:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.linting == 'true' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          python-version: "3.x"
          poetry-version: "2.1.1"
          extras: "all"
      - run: poetry run scripts-dev/generate_sample_config.sh --check
      - run: poetry run scripts-dev/config-lint.sh

  check-schema-delta:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.linting == 'true' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - run: "pip install 'click==8.1.1' 'GitPython>=3.1.20'"
      - run: scripts-dev/check_schema_delta.py --force-colors

  check-lockfile:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - run: .ci/scripts/check_lockfile.py

  lint:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.linting == 'true' }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Setup Poetry
        uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          poetry-version: "2.1.1"
          install-project: "false"

      - name: Run ruff check
        run: poetry run ruff check --output-format=github .

      - name: Run ruff format
        run: poetry run ruff format --check .

  lint-mypy:
    runs-on: ubuntu-latest
    name: Typechecking
    needs: changes
    if: ${{ needs.changes.outputs.linting == 'true' }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - name: Setup Poetry
        uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          # We want to make use of type hints in optional dependencies too.
          extras: all
          # We have seen odd mypy failures that were resolved when we started
          # installing the project again:
          # https://github.com/matrix-org/synapse/pull/15376#issuecomment-1498983775
          # To make CI green, err towards caution and install the project.
          install-project: "true"
          poetry-version: "2.1.1"

      # Cribbed from
      # https://github.com/AustinScola/mypy-cache-github-action/blob/85ea4f2972abed39b33bd02c36e341b28ca59213/src/restore.ts#L10-L17
      - name: Restore/persist mypy's cache
        uses: actions/cache@5a3ec84eff668545956fd18022155c47e93e2684 # v4.2.3
        with:
          path: |
            .mypy_cache
          key: mypy-cache-${{ github.context.sha }}
          restore-keys: mypy-cache-

      - name: Run mypy
        run: poetry run mypy

  lint-crlf:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - name: Check line endings
        run: scripts-dev/check_line_terminators.sh

  lint-newsfile:
    if: ${{ (github.base_ref == 'develop' || contains(github.base_ref, 'release-')) && github.actor != 'dependabot[bot]' }}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          ref: ${{ github.event.pull_request.head.sha }}
          fetch-depth: 0
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - run: "pip install 'towncrier>=18.6.0rc1'"
      - run: scripts-dev/check-newsfragment.sh
        env:
          PULL_REQUEST_NUMBER: ${{ github.event.number }}

  lint-pydantic:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.linting == 'true' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          ref: ${{ github.event.pull_request.head.sha }}
      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0
      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          poetry-version: "2.1.1"
          extras: "all"
      - run: poetry run scripts-dev/check_pydantic_models.py

  lint-clippy:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.rust == 'true' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          components: clippy
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - run: cargo clippy -- -D warnings

  # We also lint against a nightly rustc so that we can lint the benchmark
  # suite, which requires a nightly compiler.
  lint-clippy-nightly:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.rust == 'true' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: nightly-2025-04-23
          components: clippy
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - run: cargo clippy --all-features -- -D warnings

  lint-rustfmt:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.rust == 'true' }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          # We use nightly so that it correctly groups together imports
          toolchain: nightly-2025-04-23
          components: rustfmt
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - run: cargo fmt --check

  # This is to detect issues with the rst file, which can otherwise cause issues
  # when uploading packages to PyPi.
  lint-readme:
    runs-on: ubuntu-latest
    needs: changes
    if: ${{ needs.changes.outputs.linting_readme == 'true' }}
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - run: "pip install rstcheck"
      - run: "rstcheck --report-level=WARNING README.rst"

  # Dummy step to gate other tests on without repeating the whole list
  linting-done:
    if: ${{ !cancelled() }} # Run this even if prior jobs were skipped
    needs:
      - lint
      - lint-mypy
      - lint-crlf
      - lint-newsfile
      - lint-pydantic
      - check-sampleconfig
      - check-schema-delta
      - check-lockfile
      - lint-clippy
      - lint-clippy-nightly
      - lint-rustfmt
      - lint-readme
    runs-on: ubuntu-latest
    steps:
      - uses: matrix-org/done-action@3409aa904e8a2aaf2220f09bc954d3d0b0a2ee67 # v3
        with:
          needs: ${{ toJSON(needs) }}

          # Various bits are skipped if there were no applicable changes.
          skippable: |
            check-sampleconfig
            check-schema-delta
            lint
            lint-mypy
            lint-newsfile
            lint-pydantic
            lint-clippy
            lint-clippy-nightly
            lint-rustfmt
            lint-readme

  calculate-test-jobs:
    if: ${{ !cancelled() && !failure() }} # Allow previous steps to be skipped, but not fail
    needs: linting-done
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: "3.x"
      - id: get-matrix
        run: .ci/scripts/calculate_jobs.py
    outputs:
      trial_test_matrix: ${{ steps.get-matrix.outputs.trial_test_matrix }}
      sytest_test_matrix: ${{ steps.get-matrix.outputs.sytest_test_matrix }}

  trial:
    if: ${{ !cancelled() && !failure() && needs.changes.outputs.trial == 'true' }} # Allow previous steps to be skipped, but not fail
    needs:
      - calculate-test-jobs
      - changes
    runs-on: ubuntu-latest
    strategy:
      matrix:
        job: ${{ fromJson(needs.calculate-test-jobs.outputs.trial_test_matrix) }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - run: sudo apt-get -qq install xmlsec1
      - name: Set up PostgreSQL ${{ matrix.job.postgres-version }}
        if: ${{ matrix.job.postgres-version }}
        # 1. Mount postgres data files onto a tmpfs in-memory filesystem to reduce overhead of docker's overlayfs layer.
        # 2. Expose the unix socket for postgres. This removes latency of using docker-proxy for connections.
        run: |
          docker run -d -p 5432:5432 \
            --tmpfs /var/lib/postgres:rw,size=6144m \
            --mount 'type=bind,src=/var/run/postgresql,dst=/var/run/postgresql' \
            -e POSTGRES_PASSWORD=postgres \
            -e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
            postgres:${{ matrix.job.postgres-version }}

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          python-version: ${{ matrix.job.python-version }}
          poetry-version: "2.1.1"
          extras: ${{ matrix.job.extras }}
      - name: Await PostgreSQL
        if: ${{ matrix.job.postgres-version }}
        timeout-minutes: 2
        run: until pg_isready -h localhost; do sleep 1; done
      - run: poetry run trial --jobs=6 tests
        env:
          SYNAPSE_POSTGRES: ${{ matrix.job.database == 'postgres' || '' }}
          SYNAPSE_POSTGRES_HOST: /var/run/postgresql
          SYNAPSE_POSTGRES_USER: postgres
          SYNAPSE_POSTGRES_PASSWORD: postgres
      - name: Dump logs
@@ -406,51 +135,18 @@
          || true

  trial-olddeps:
    # Note: sqlite only; no postgres
    if: ${{ !cancelled() && !failure() && needs.changes.outputs.trial == 'true' }} # Allow previous steps to be skipped, but not fail
    needs:
      - linting-done
      - changes
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      # There aren't wheels for some of the older deps, so we need to install
      # their build dependencies
      - run: |
          sudo apt-get -qq update
          sudo apt-get -qq install build-essential libffi-dev python3-dev \
            libxml2-dev libxslt-dev xmlsec1 zlib1g-dev libjpeg-dev libwebp-dev

      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
        with:
          python-version: '3.9'

      - name: Prepare old deps
        if: steps.cache-poetry-old-deps.outputs.cache-hit != 'true'
        run: .ci/scripts/prepare_old_deps.sh

      # Note: we install using `pip` here, not poetry. `poetry install` ignores the
      # build-system section (https://github.com/python-poetry/poetry/issues/6154), but
      # we explicitly want to test that you can `pip install` using the oldest version
      # of poetry-core and setuptools-rust.
      - run: pip install .[all,test]

      # We nuke the local copy, as we've installed synapse into the virtualenv
      # (rather than use an editable install, which we no longer support). If we
      # don't do this then python can't find the native lib.
      - run: rm -rf synapse/

      # Sanity check we can import/run Synapse
      - run: python -m synapse.app.homeserver --help

      - run: python -m twisted.trial -j6 tests
      - name: Dump logs
        # Logs are most useful when the command fails, always include them.
        if: ${{ always() }}
@@ -466,27 +162,23 @@

  trial-pypy:
    # Very slow; only run if the branch name includes 'pypy'
    # Note: sqlite only; no postgres. Completely untested since poetry move.
    if: ${{ contains(github.ref, 'pypy') && !failure() && !cancelled() && needs.changes.outputs.trial == 'true' }}
    needs:
      - linting-done
      - changes
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["pypy-3.9"]
        extras: ["all"]

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      # Install libs necessary for PyPy to build binary wheels for dependencies
      - run: sudo apt-get -qq install xmlsec1 libxml2-dev libxslt-dev
      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          python-version: ${{ matrix.python-version }}
          poetry-version: "2.1.1"
          extras: ${{ matrix.extras }}
      - run: poetry run trial --jobs=2 tests
      - name: Dump logs
        # Logs are most useful when the command fails, always include them.
        if: ${{ always() }}
@@ -501,43 +193,51 @@
          || true

  sytest:
    if: ${{ !failure() && !cancelled() && needs.changes.outputs.integration == 'true' }}
    needs:
      - calculate-test-jobs
      - changes
    runs-on: ubuntu-latest
    container:
      image: matrixdotorg/sytest-synapse:${{ matrix.job.sytest-tag }}
      volumes:
        - ${{ github.workspace }}:/src
      env:
        # If this is a pull request to a release branch, use that branch as the default branch for sytest, else use develop
        # This works because the release script always creates a branch on the sytest repo with the same name as the release branch
        SYTEST_DEFAULT_BRANCH: ${{ startsWith(github.base_ref, 'release-') && github.base_ref || 'develop' }}
        SYTEST_BRANCH: ${{ github.head_ref }}
        POSTGRES: ${{ matrix.job.postgres && 1}}
        MULTI_POSTGRES: ${{ (matrix.job.postgres == 'multi-postgres') || '' }}
        ASYNCIO_REACTOR: ${{ (matrix.job.reactor == 'asyncio') || '' }}
        WORKERS: ${{ matrix.job.workers && 1 }}
        BLACKLIST: ${{ matrix.job.workers && 'synapse-blacklist-with-workers' }}
        TOP: ${{ github.workspace }}

    strategy:
      fail-fast: false
      matrix:
        job: ${{ fromJson(needs.calculate-test-jobs.outputs.sytest_test_matrix) }}

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - name: Prepare test blacklist
        run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - name: Run SyTest
        run: /bootstrap.sh synapse
        working-directory: /src
@@ -545,17 +245,17 @@
        if: ${{ always() }}
        run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
      - name: Upload SyTest logs
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        if: ${{ always() }}
        with:
          name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.job.*, ', ') }})
          path: |
            /logs/results.tap
            /logs/**/*.log*

  export-data:
    if: ${{ !failure() && !cancelled() && needs.changes.outputs.integration == 'true'}} # Allow previous steps to be skipped, but not fail
    needs: [linting-done, portdb, changes]
    runs-on: ubuntu-latest
    env:
      TOP: ${{ github.workspace }}
|
|||||||
--health-retries 5
|
--health-retries 5
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
|
- uses: actions/checkout@v2
|
||||||
- run: sudo apt-get -qq install xmlsec1 postgresql-client
|
- run: sudo apt-get -qq install xmlsec1
|
||||||
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
|
- uses: actions/setup-python@v2
|
||||||
with:
|
with:
|
||||||
poetry-version: "2.1.1"
|
python-version: "3.9"
|
||||||
extras: "postgres"
|
|
||||||
- run: .ci/scripts/test_export_data_command.sh
|
- run: .ci/scripts/test_export_data_command.sh
|
||||||
env:
|
|
||||||
PGHOST: localhost
|
|
||||||
PGUSER: postgres
|
|
||||||
PGPASSWORD: postgres
|
|
||||||
PGDATABASE: postgres
|
|
||||||
|
|
||||||
|
|
||||||
portdb:
|
portdb:
|
||||||
if: ${{ !failure() && !cancelled() && needs.changes.outputs.integration == 'true'}} # Allow previous steps to be skipped, but not fail
|
if: ${{ !failure() && !cancelled() }} # Allow previous steps to be skipped, but not fail
|
||||||
needs:
|
needs: linting-done
|
||||||
- linting-done
|
|
||||||
- changes
|
|
||||||
runs-on: ubuntu-latest
|
runs-on: ubuntu-latest
|
||||||
|
env:
|
||||||
|
TOP: ${{ github.workspace }}
|
||||||
strategy:
|
strategy:
|
||||||
matrix:
|
matrix:
|
||||||
include:
|
include:
|
||||||
- python-version: "3.9"
|
- python-version: "3.6"
|
||||||
postgres-version: "13"
|
postgres-version: "9.6"
|
||||||
|
|
||||||
- python-version: "3.13"
|
- python-version: "3.10"
|
||||||
postgres-version: "17"
|
postgres-version: "14"
|
||||||
|
|
||||||
services:
|
services:
|
||||||
postgres:
|
postgres:
|
||||||
@ -619,155 +312,106 @@ jobs:
|
|||||||
--health-retries 5
|
--health-retries 5
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
|
- uses: actions/checkout@v2
|
||||||
- name: Add PostgreSQL apt repository
|
- run: sudo apt-get -qq install xmlsec1
|
||||||
# We need a version of pg_dump that can handle the version of
|
- uses: actions/setup-python@v2
|
||||||
# PostgreSQL being tested against. The Ubuntu package repository lags
|
|
||||||
# behind new releases, so we have to use the PostreSQL apt repository.
|
|
||||||
# Steps taken from https://www.postgresql.org/download/linux/ubuntu/
|
|
||||||
run: |
|
|
||||||
sudo sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
|
|
||||||
wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
|
|
||||||
sudo apt-get update
|
|
||||||
- run: sudo apt-get -qq install xmlsec1 postgresql-client
|
|
||||||
- uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
|
|
||||||
with:
|
with:
|
||||||
python-version: ${{ matrix.python-version }}
|
python-version: ${{ matrix.python-version }}
|
||||||
poetry-version: "2.1.1"
|
|
||||||
extras: "postgres"
|
|
||||||
- run: .ci/scripts/test_synapse_port_db.sh
|
- run: .ci/scripts/test_synapse_port_db.sh
|
||||||
id: run_tester_script
|
|
||||||
env:
|
|
||||||
PGHOST: localhost
|
|
||||||
PGUSER: postgres
|
|
||||||
PGPASSWORD: postgres
|
|
||||||
PGDATABASE: postgres
|
|
||||||
- name: "Upload schema differences"
|
|
||||||
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
|
|
||||||
if: ${{ failure() && !cancelled() && steps.run_tester_script.outcome == 'failure' }}
|
|
||||||
with:
|
|
||||||
name: Schema dumps
|
|
||||||
path: |
|
|
||||||
unported.sql
|
|
||||||
ported.sql
|
|
||||||
schema_diff
|
|
||||||
|
|
||||||
  complement:
    if: "${{ !failure() && !cancelled() && needs.changes.outputs.integration == 'true' }}"
    needs:
      - linting-done
      - changes
    runs-on: ubuntu-latest

    strategy:
      fail-fast: false
      matrix:
        include:
          - arrangement: monolith
            database: SQLite

          - arrangement: monolith
            database: Postgres

          - arrangement: workers
            database: Postgres

    steps:
      - name: Checkout synapse codebase
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          path: synapse

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - name: Prepare Complement's Prerequisites
        run: synapse/.ci/scripts/setup_complement_prerequisites.sh

      - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          cache-dependency-path: complement/go.sum
          go-version-file: complement/go.mod

      # use p=1 concurrency as GHA boxes are underpowered and don't like running tons of synapses at once.
      - run: |
          set -o pipefail
          COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -p 1 -json 2>&1 | synapse/.ci/scripts/gotestfmt
        shell: bash
        name: Run Complement Tests
        env:
          POSTGRES: ${{ (matrix.database == 'Postgres') && 1 || '' }}
          WORKERS: ${{ (matrix.arrangement == 'workers') && 1 || '' }}
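In v1.48.0rc1 this job instead checked out Complement with a fallback chain of branch names: the PR's head ref, then its base ref, then the current ref, finally "master", skipping empty names and merge-commit refs. That selection logic can be exercised standalone; `fetch_branch` stands in for the real tarball download, and the example ref values are assumptions for illustration:

```shell
#!/bin/sh
# Example GHA-style ref values (normally injected by the runner).
GITHUB_HEAD_REF="refs/pull/123/merge"
GITHUB_BASE_REF=""
GITHUB_REF="refs/heads/develop"

# Stand-in for: wget -O - ".../complement/archive/$1.tar.gz" | tar -xz ...
# Pretend only "develop" and "master" exist upstream.
fetch_branch() {
    [ "$1" = "develop" ] || [ "$1" = "master" ]
}

for BRANCH_NAME in "$GITHUB_HEAD_REF" "$GITHUB_BASE_REF" "${GITHUB_REF#refs/heads/}" "master"; do
    # Skip empty branch names and merge commits (refs/pull/...).
    if [ -z "$BRANCH_NAME" ] || [ "${BRANCH_NAME#refs/pull/}" != "$BRANCH_NAME" ]; then
        continue
    fi
    fetch_branch "$BRANCH_NAME" && break
done
echo "chosen branch: $BRANCH_NAME"
```

With these values the pull-request ref is skipped, the empty base ref is skipped, and "develop" is the first candidate that fetches.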
  cargo-test:
    if: ${{ needs.changes.outputs.rust == 'true' }}
    runs-on: ubuntu-latest
    needs:
      - linting-done
      - changes

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - run: cargo test

  # We want to ensure that the cargo benchmarks still compile, which requires a
  # nightly compiler.
  cargo-bench:
    if: ${{ needs.changes.outputs.rust == 'true' }}
    runs-on: ubuntu-latest
    needs:
      - linting-done
      - changes

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: nightly-2022-12-01
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - run: cargo bench --no-run
  # a job which marks all the other jobs as complete, thus allowing PRs to be merged.
  tests-done:
    if: ${{ always() }}
    needs:
      - trial
      - trial-olddeps
      - sytest
      - export-data
      - portdb
      - complement
      - cargo-test
      - cargo-bench
      - linting-done
    runs-on: ubuntu-latest
    steps:
      - uses: matrix-org/done-action@3409aa904e8a2aaf2220f09bc954d3d0b0a2ee67 # v3
        with:
          needs: ${{ toJSON(needs) }}

          # Various bits are skipped if there were no applicable changes.
          # The newsfile lint may be skipped on non-PR builds.
          skippable: |
            trial
            trial-olddeps
            sytest
            portdb
            export-data
            complement
            lint-newsfile
            cargo-test
            cargo-bench
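Before `matrix-org/done-action`, v1.48.0rc1's tests-done job did this check by hand: a jq incantation dumped the `needs` context as "<job> <result>" lines, and a loop failed the build unless every job succeeded, tolerating a skipped newsfile lint. The loop can be exercised standalone with hard-coded sample results standing in for the jq output:

```shell
#!/bin/sh
# Sample "<job> <result>" lines; the real workflow derived these from the
# `needs` context with jq ('to_entries[] | [.key,.value.result] | join(" ")').
results="trial success
lint-newsfile skipped
sytest success"

rc=0
while read -r job result; do
    # The newsfile lint may be skipped on non-PR builds.
    if [ "$result" = "skipped" ] && [ "$job" = "lint-newsfile" ]; then
        continue
    fi
    if [ "$result" != "success" ]; then
        echo "Job $job returned $result"
        rc=1
    fi
done <<EOF
$results
EOF
echo "rc=$rc"
```

Note the here-document instead of a pipe: piping into `while` would run the loop in a subshell, losing the final value of `rc` (the original script made the same choice for the same reason).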
14  .github/workflows/triage-incoming.yml  vendored
@@ -1,14 +0,0 @@
name: Move new issues into the issue triage board

on:
  issues:
    types: [ opened ]

jobs:
  triage:
    uses: matrix-org/backend-meta/.github/workflows/triage-incoming.yml@18beaf3c8e536108bd04d18e6c3dc40ba3931e28 # v2.0.3
    with:
      project_id: 'PVT_kwDOAIB0Bs4AFDdZ'
      content_id: ${{ github.event.issue.node_id }}
    secrets:
      github_access_token: ${{ secrets.ELEMENT_BOT_TOKEN }}
44  .github/workflows/triage_labelled.yml  vendored
@@ -1,44 +0,0 @@
name: Move labelled issues to correct projects

on:
  issues:
    types: [ labeled ]

jobs:
  move_needs_info:
    name: Move X-Needs-Info on the triage board
    runs-on: ubuntu-latest
    if: >
      contains(github.event.issue.labels.*.name, 'X-Needs-Info')
    steps:
      - uses: actions/add-to-project@5b1a254a3546aef88e0a7724a77a623fa2e47c36 # main (v1.0.2 + 10 commits)
        id: add_project
        with:
          project-url: "https://github.com/orgs/matrix-org/projects/67"
          github-token: ${{ secrets.ELEMENT_BOT_TOKEN }}
      - name: Set status
        env:
          GITHUB_TOKEN: ${{ secrets.ELEMENT_BOT_TOKEN }}
        run: |
          gh api graphql -f query='
            mutation(
              $project: ID!
              $item: ID!
              $fieldid: ID!
              $columnid: String!
            ) {
              updateProjectV2ItemFieldValue(
                input: {
                  projectId: $project
                  itemId: $item
                  fieldId: $fieldid
                  value: {
                    singleSelectOptionId: $columnid
                  }
                }
              ) {
                projectV2Item {
                  id
                }
              }
            }' -f project="PVT_kwDOAIB0Bs4AFDdZ" -f item=${{ steps.add_project.outputs.itemId }} -f fieldid="PVTSSF_lADOAIB0Bs4AFDdZzgC6ZA4" -f columnid=ba22e43c --silent
174  .github/workflows/twisted_trunk.yml  vendored
@@ -5,90 +5,32 @@ on:
    - cron: 0 8 * * *

  workflow_dispatch:
    # NB: inputs are only present when this workflow is dispatched manually.
    # (The default below is the default field value in the form to trigger
    # a manual dispatch). Otherwise the inputs will evaluate to null.
    inputs:
      twisted_ref:
        description: Commit, branch or tag to checkout from upstream Twisted.
        required: false
        default: 'trunk'
        type: string


concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

env:
  RUST_VERSION: 1.87.0

jobs:
  check_repo:
    # Prevent this workflow from running on any fork of Synapse other than element-hq/synapse, as it is
    # only useful to the Synapse core team.
    # All other workflow steps depend on this one, thus if 'should_run_workflow' is not 'true', the rest
    # of the workflow will be skipped as well.
    if: github.repository == 'element-hq/synapse'
    runs-on: ubuntu-latest
    outputs:
      should_run_workflow: ${{ steps.check_condition.outputs.should_run_workflow }}
    steps:
      - id: check_condition
        run: echo "should_run_workflow=${{ github.repository == 'element-hq/synapse' }}" >> "$GITHUB_OUTPUT"

  mypy:
    needs: check_repo
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          python-version: "3.x"
          extras: "all"
          poetry-version: "2.1.1"
      - run: |
          poetry remove twisted
          poetry add --extras tls git+https://github.com/twisted/twisted.git#${{ inputs.twisted_ref || 'trunk' }}
          poetry install --no-interaction --extras "all test"
      - name: Remove unhelpful options from mypy config
        run: sed -e '/warn_unused_ignores = True/d' -e '/warn_redundant_casts = True/d' -i mypy.ini
      - run: poetry run mypy

  trial:
    needs: check_repo
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - run: sudo apt-get -qq install xmlsec1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
          python-version: "3.x"
          extras: "all test"
          poetry-version: "2.1.1"
      - run: |
          poetry remove twisted
          poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
          poetry install --no-interaction --extras "all test"
      - run: poetry run trial --jobs 2 tests

      - name: Dump logs
        # Logs are most useful when the command fails, always include them.
@@ -104,50 +46,25 @@ jobs:
        || true

  sytest:
    needs: check_repo
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest
    container:
      # We're using debian:bullseye because it uses Python 3.9 which is our minimum supported Python version.
      # This job is a canary to warn us about unreleased twisted changes that would cause problems for us if
      # they were to be released immediately. For simplicity's sake (and to save CI runners) we use the oldest
      # version, assuming that any incompatibilities on newer versions would also be present on the oldest.
      image: matrixdotorg/sytest-synapse:bullseye
      volumes:
        - ${{ github.workspace }}:/src

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2

      - name: Install Rust
        uses: dtolnay/rust-toolchain@b3b07ba8b418998c39fb20f53e8b695cdcc8de1b # master
        with:
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@98c8021b550208e191a6a3145459bfc9fb29c4c0 # v2.8.0

      - name: Patch dependencies
        # Note: The poetry commands want to create a virtualenv in /src/.venv/,
        # but the sytest-synapse container expects it to be in /venv/.
        # We symlink it before running poetry so that poetry actually
        # ends up installing to `/venv`.
        run: |
          ln -s -T /venv /src/.venv
          poetry remove twisted
          poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
          poetry install --no-interaction --extras "all test"
        working-directory: /src
      - name: Run SyTest
        run: /bootstrap.sh synapse
        working-directory: /src
        env:
          # Use offline mode to avoid reinstalling the pinned version of
          # twisted.
          OFFLINE: 1
      - name: Summarise results.tap
        if: ${{ always() }}
        run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
      - name: Upload SyTest logs
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        if: ${{ always() }}
        with:
          name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
@@ -155,70 +72,19 @@ jobs:
          /logs/results.tap
          /logs/**/*.log*

  complement:
    needs: check_repo
    if: "!failure() && !cancelled() && needs.check_repo.outputs.should_run_workflow == 'true'"
    runs-on: ubuntu-latest

    strategy:
      fail-fast: false
      matrix:
        include:
          - arrangement: monolith
            database: SQLite

          - arrangement: monolith
            database: Postgres

          - arrangement: workers
            database: Postgres

    steps:
      - name: Run actions/checkout@v4 for synapse
        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          path: synapse

      - name: Prepare Complement's Prerequisites
        run: synapse/.ci/scripts/setup_complement_prerequisites.sh

      - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5.5.0
        with:
          cache-dependency-path: complement/go.sum
          go-version-file: complement/go.mod

      # This step is specific to the 'Twisted trunk' test run:
      - name: Patch dependencies
        run: |
          set -x
          DEBIAN_FRONTEND=noninteractive sudo apt-get install -yqq python3 pipx
          pipx install poetry==2.1.1

          poetry remove -n twisted
          poetry add -n --extras tls git+https://github.com/twisted/twisted.git#trunk
          poetry lock
        working-directory: synapse

      - run: |
          set -o pipefail
          TEST_ONLY_SKIP_DEP_HASH_VERIFICATION=1 POSTGRES=${{ (matrix.database == 'Postgres') && 1 || '' }} WORKERS=${{ (matrix.arrangement == 'workers') && 1 || '' }} COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -json 2>&1 | synapse/.ci/scripts/gotestfmt
        shell: bash
        name: Run Complement Tests

  # open an issue if the build fails, so we know about it.
  open-issue:
    if: failure() && needs.check_repo.outputs.should_run_workflow == 'true'
    needs:
      - mypy
      - trial
      - sytest
      - complement

    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
      - uses: JasonEtco/create-an-issue@1b14a70e4d8dc185e5cc76d3bec9eab20257b2c5 # v2.9.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
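The "Remove unhelpful options from mypy config" step above edits mypy.ini in place with sed, deleting the two strictness options that produce noise when type-checking against an unreleased Twisted. The same deletion can be tried on a throwaway copy; the file path and contents here are made up for the demo:

```shell
#!/bin/sh
# Create a disposable mypy.ini with the two options the workflow strips.
cat > /tmp/mypy_demo.ini <<'EOF'
[mypy]
warn_unused_ignores = True
warn_redundant_casts = True
strict_equality = True
EOF

# Same sed invocation as the workflow step, pointed at the demo file:
# each -e deletes any line matching its pattern, -i edits in place.
sed -e '/warn_unused_ignores = True/d' -e '/warn_redundant_casts = True/d' -i /tmp/mypy_demo.ini
cat /tmp/mypy_demo.ini
```

Only the `[mypy]` header and `strict_equality = True` survive; unrelated options are untouched.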
|
29
.gitignore
vendored
29
.gitignore
vendored
@ -15,11 +15,6 @@ _trial_temp*/
|
|||||||
.DS_Store
|
.DS_Store
|
||||||
__pycache__/
|
__pycache__/
|
||||||
|
|
||||||
# We do want poetry, cargo and flake lockfiles.
|
|
||||||
!poetry.lock
|
|
||||||
!Cargo.lock
|
|
||||||
!flake.lock
|
|
||||||
|
|
||||||
# stuff that is likely to exist when you run a server locally
|
# stuff that is likely to exist when you run a server locally
|
||||||
/*.db
|
/*.db
|
||||||
/*.log
|
/*.log
|
||||||
@ -34,20 +29,11 @@ __pycache__/
|
|||||||
/logs
|
/logs
|
||||||
/media_store/
|
/media_store/
|
||||||
/uploads
|
/uploads
|
||||||
/homeserver-config-overrides.d
|
|
||||||
|
|
||||||
# For direnv users
|
|
||||||
/.envrc
|
|
||||||
.direnv/
|
|
||||||
|
|
||||||
# For nix/devenv users
|
|
||||||
.devenv/
|
|
||||||
|
|
||||||
# IDEs
|
# IDEs
|
||||||
/.idea/
|
/.idea/
|
||||||
/.ropeproject/
|
/.ropeproject/
|
||||||
/.vscode/
|
/.vscode/
|
||||||
/.zed/
|
|
||||||
|
|
||||||
# build products
|
# build products
|
||||||
!/.coveragerc
|
!/.coveragerc
|
||||||
@ -59,23 +45,8 @@ __pycache__/
|
|||||||
/coverage.*
|
/coverage.*
|
||||||
/dist/
|
/dist/
|
||||||
/docs/build/
|
/docs/build/
|
||||||
/dev-docs/_build/
|
|
||||||
/htmlcov
|
/htmlcov
|
||||||
/pip-wheel-metadata/
|
/pip-wheel-metadata/
|
||||||
|
|
||||||
# docs
|
# docs
|
||||||
book/
|
book/
|
||||||
|
|
||||||
# complement
|
|
||||||
/complement-*
|
|
||||||
/main.tar.gz
|
|
||||||
|
|
||||||
# rust
|
|
||||||
/target/
|
|
||||||
/synapse/*.so
|
|
||||||
|
|
||||||
# Poetry will create a setup.py, which we don't want to include.
|
|
||||||
/setup.py
|
|
||||||
|
|
||||||
# Don't include users' poetry configs
|
|
||||||
/poetry.toml
|
|
||||||
@@ -1 +0,0 @@
group_imports = "StdExternalCrate"

9464  CHANGES.md
File diff suppressed because it is too large. Load Diff
@@ -1,3 +1,3 @@
# Welcome to Synapse

Please see the [contributors' guide](https://element-hq.github.io/synapse/latest/development/contributing_guide.html) in our rendered documentation.
(v1.48.0rc1 links to matrix-org.github.io instead.)

2122  Cargo.lock  generated
File diff suppressed because it is too large. Load Diff
@ -1,6 +0,0 @@
|
|||||||
# We make the whole Synapse folder a workspace so that we can run `cargo`
|
|
||||||
# commands from the root (rather than having to cd into rust/).
|
|
||||||
|
|
||||||
[workspace]
|
|
||||||
members = ["rust"]
|
|
||||||
resolver = "2"
|
|
@ -1,7 +1,7 @@
|
|||||||
# Installation Instructions
|
# Installation Instructions
|
||||||
|
|
||||||
This document has moved to the
|
This document has moved to the
|
||||||
[Synapse documentation website](https://element-hq.github.io/synapse/latest/setup/installation.html).
|
[Synapse documentation website](https://matrix-org.github.io/synapse/latest/setup/installation.html).
|
||||||
Please update your links.
|
Please update your links.
|
||||||
|
|
||||||
The markdown source is available in [docs/setup/installation.md](docs/setup/installation.md).
|
The markdown source is available in [docs/setup/installation.md](docs/setup/installation.md).
|
||||||
|
177  LICENSE  Normal file
@@ -0,0 +1,177 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
|
||||||
|
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||||
|
or other liability obligations and/or rights consistent with this
|
||||||
|
License. However, in accepting such obligations, You may act only
|
||||||
|
on Your own behalf and on Your sole responsibility, not on behalf
|
||||||
|
of any other Contributor, and only if You agree to indemnify,
|
||||||
|
defend, and hold each Contributor harmless for any liability
|
||||||
|
incurred by, or claims asserted against, such Contributor by reason
|
||||||
|
of your accepting any such warranty or additional liability.
|
||||||
|
|
||||||
|
END OF TERMS AND CONDITIONS
|
LICENSE-AGPL-3.0
@ -1,661 +0,0 @@
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.

A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.

The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.

An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.

The precise terms and conditions for copying, distribution and
modification follow.

TERMS AND CONDITIONS

0. Definitions.

"This License" refers to version 3 of the GNU Affero General Public License.

"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.

An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

1. Source Code.

The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.

The Corresponding Source for a work in source code form is that
same work.

2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.

3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
|
|
||||||
work and works based on it.
|
|
||||||
|
|
||||||
A patent license is "discriminatory" if it does not include within
|
|
||||||
the scope of its coverage, prohibits the exercise of, or is
|
|
||||||
conditioned on the non-exercise of one or more of the rights that are
|
|
||||||
specifically granted under this License. You may not convey a covered
|
|
||||||
work if you are a party to an arrangement with a third party that is
|
|
||||||
in the business of distributing software, under which you make payment
|
|
||||||
to the third party based on the extent of your activity of conveying
|
|
||||||
the work, and under which the third party grants, to any of the
|
|
||||||
parties who would receive the covered work from you, a discriminatory
|
|
||||||
patent license (a) in connection with copies of the covered work
|
|
||||||
conveyed by you (or copies made from those copies), or (b) primarily
|
|
||||||
for and in connection with specific products or compilations that
|
|
||||||
contain the covered work, unless you entered into that arrangement,
|
|
||||||
or that patent license was granted, prior to 28 March 2007.
|
|
||||||
|
|
||||||
Nothing in this License shall be construed as excluding or limiting
|
|
||||||
any implied license or other defenses to infringement that may
|
|
||||||
otherwise be available to you under applicable patent law.
|
|
||||||
|
|
||||||
12. No Surrender of Others' Freedom.
|
|
||||||
|
|
||||||
If conditions are imposed on you (whether by court order, agreement or
|
|
||||||
otherwise) that contradict the conditions of this License, they do not
|
|
||||||
excuse you from the conditions of this License. If you cannot convey a
|
|
||||||
covered work so as to satisfy simultaneously your obligations under this
|
|
||||||
License and any other pertinent obligations, then as a consequence you may
|
|
||||||
not convey it at all. For example, if you agree to terms that obligate you
|
|
||||||
to collect a royalty for further conveying from those to whom you convey
|
|
||||||
the Program, the only way you could satisfy both those terms and this
|
|
||||||
License would be to refrain entirely from conveying the Program.
|
|
||||||
|
|
||||||
13. Remote Network Interaction; Use with the GNU General Public License.
|
|
||||||
|
|
||||||
Notwithstanding any other provision of this License, if you modify the
|
|
||||||
Program, your modified version must prominently offer all users
|
|
||||||
interacting with it remotely through a computer network (if your version
|
|
||||||
supports such interaction) an opportunity to receive the Corresponding
|
|
||||||
Source of your version by providing access to the Corresponding Source
|
|
||||||
from a network server at no charge, through some standard or customary
|
|
||||||
means of facilitating copying of software. This Corresponding Source
|
|
||||||
shall include the Corresponding Source for any work covered by version 3
|
|
||||||
of the GNU General Public License that is incorporated pursuant to the
|
|
||||||
following paragraph.
|
|
||||||
|
|
||||||
Notwithstanding any other provision of this License, you have
|
|
||||||
permission to link or combine any covered work with a work licensed
|
|
||||||
under version 3 of the GNU General Public License into a single
|
|
||||||
combined work, and to convey the resulting work. The terms of this
|
|
||||||
License will continue to apply to the part which is the covered work,
|
|
||||||
but the work with which it is combined will remain governed by version
|
|
||||||
3 of the GNU General Public License.
|
|
||||||
|
|
||||||
14. Revised Versions of this License.
|
|
||||||
|
|
||||||
The Free Software Foundation may publish revised and/or new versions of
|
|
||||||
the GNU Affero General Public License from time to time. Such new versions
|
|
||||||
will be similar in spirit to the present version, but may differ in detail to
|
|
||||||
address new problems or concerns.
|
|
||||||
|
|
||||||
Each version is given a distinguishing version number. If the
|
|
||||||
Program specifies that a certain numbered version of the GNU Affero General
|
|
||||||
Public License "or any later version" applies to it, you have the
|
|
||||||
option of following the terms and conditions either of that numbered
|
|
||||||
version or of any later version published by the Free Software
|
|
||||||
Foundation. If the Program does not specify a version number of the
|
|
||||||
GNU Affero General Public License, you may choose any version ever published
|
|
||||||
by the Free Software Foundation.
|
|
||||||
|
|
||||||
If the Program specifies that a proxy can decide which future
|
|
||||||
versions of the GNU Affero General Public License can be used, that proxy's
|
|
||||||
public statement of acceptance of a version permanently authorizes you
|
|
||||||
to choose that version for the Program.
|
|
||||||
|
|
||||||
Later license versions may give you additional or different
|
|
||||||
permissions. However, no additional obligations are imposed on any
|
|
||||||
author or copyright holder as a result of your choosing to follow a
|
|
||||||
later version.
|
|
||||||
|
|
||||||
15. Disclaimer of Warranty.
|
|
||||||
|
|
||||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
|
||||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
|
||||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
|
||||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
|
||||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
|
||||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
|
||||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
|
||||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
|
||||||
|
|
||||||
16. Limitation of Liability.
|
|
||||||
|
|
||||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
|
||||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
|
||||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
|
||||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
|
||||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
|
||||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
|
||||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
|
||||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
|
||||||
SUCH DAMAGES.
|
|
||||||
|
|
||||||
17. Interpretation of Sections 15 and 16.
|
|
||||||
|
|
||||||
If the disclaimer of warranty and limitation of liability provided
|
|
||||||
above cannot be given local legal effect according to their terms,
|
|
||||||
reviewing courts shall apply local law that most closely approximates
|
|
||||||
an absolute waiver of all civil liability in connection with the
|
|
||||||
Program, unless a warranty or assumption of liability accompanies a
|
|
||||||
copy of the Program in return for a fee.
|
|
||||||
|
|
||||||
END OF TERMS AND CONDITIONS
|
|
||||||
|
|
||||||
How to Apply These Terms to Your New Programs
|
|
||||||
|
|
||||||
If you develop a new program, and you want it to be of the greatest
|
|
||||||
possible use to the public, the best way to achieve this is to make it
|
|
||||||
free software which everyone can redistribute and change under these terms.
|
|
||||||
|
|
||||||
To do so, attach the following notices to the program. It is safest
|
|
||||||
to attach them to the start of each source file to most effectively
|
|
||||||
state the exclusion of warranty; and each file should have at least
|
|
||||||
the "copyright" line and a pointer to where the full notice is found.
|
|
||||||
|
|
||||||
<one line to give the program's name and a brief idea of what it does.>
|
|
||||||
Copyright (C) <year> <name of author>
|
|
||||||
|
|
||||||
This program is free software: you can redistribute it and/or modify
|
|
||||||
it under the terms of the GNU Affero General Public License as published by
|
|
||||||
the Free Software Foundation, either version 3 of the License, or
|
|
||||||
(at your option) any later version.
|
|
||||||
|
|
||||||
This program is distributed in the hope that it will be useful,
|
|
||||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
|
||||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
|
||||||
GNU Affero General Public License for more details.
|
|
||||||
|
|
||||||
You should have received a copy of the GNU Affero General Public License
|
|
||||||
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
|
||||||
|
|
||||||
Also add information on how to contact you by electronic and paper mail.
|
|
||||||
|
|
||||||
If your software can interact with users remotely through a computer
|
|
||||||
network, you should also make sure that it provides a way for users to
|
|
||||||
get its source. For example, if your program is a web application, its
|
|
||||||
interface could display a "Source" link that leads users to an archive
|
|
||||||
of the code. There are many ways you could offer source, and different
|
|
||||||
solutions will be better for different programs; see section 13 for the
|
|
||||||
specific requirements.
|
|
||||||
|
|
||||||
You should also get your employer (if you work as a programmer) or school,
|
|
||||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
|
||||||
For more information on this, and how to apply and follow the GNU AGPL, see
|
|
||||||
<https://www.gnu.org/licenses/>.
|
|
@ -1,6 +0,0 @@
|
|||||||
Licensees holding a valid commercial license with Element may use this
|
|
||||||
software in accordance with the terms contained in a written agreement
|
|
||||||
between you and Element.
|
|
||||||
|
|
||||||
To purchase a commercial license please contact our sales team at
|
|
||||||
licensing@element.io
|
|
MANIFEST.in (new file, 56 lines)
@@ -0,0 +1,56 @@
include synctl
include LICENSE
include VERSION
include *.rst
include *.md
include demo/README
include demo/demo.tls.dh
include demo/*.py
include demo/*.sh

include synapse/py.typed
recursive-include synapse/storage *.sql
recursive-include synapse/storage *.sql.postgres
recursive-include synapse/storage *.sql.sqlite
recursive-include synapse/storage *.py
recursive-include synapse/storage *.txt
recursive-include synapse/storage *.md

recursive-include docs *
recursive-include scripts *
recursive-include scripts-dev *
recursive-include synapse *.pyi
recursive-include tests *.py
recursive-include tests *.pem
recursive-include tests *.p8
recursive-include tests *.crt
recursive-include tests *.key

recursive-include synapse/res *
recursive-include synapse/static *.css
recursive-include synapse/static *.gif
recursive-include synapse/static *.html
recursive-include synapse/static *.js

exclude .codecov.yml
exclude .coveragerc
exclude .dockerignore
exclude .editorconfig
exclude Dockerfile
exclude mypy.ini
exclude sytest-blacklist
exclude test_postgresql.sh

include book.toml
include pyproject.toml
recursive-include changelog.d *

prune .circleci
prune .github
prune .ci
prune contrib
prune debian
prune demo/etc
prune docker
prune snap
prune stubs
README.rst (526 lines changed)
@@ -1,85 +1,153 @@
-.. image:: ./docs/element_logo_white_bg.svg
-   :height: 60px
-
-**Element Synapse - Matrix homeserver implementation**
-
-|support| |development| |documentation| |license| |pypi| |python|
-
-Synapse is an open source `Matrix <https://matrix.org>`__ homeserver
-implementation, written and maintained by `Element <https://element.io>`_.
-`Matrix <https://github.com/matrix-org>`__ is the open standard for
-secure and interoperable real time communications. You can directly run
-and manage the source code in this repository, available under an AGPL
-license (or alternatively under a commercial license from Element).
-There is no support provided by Element unless you have a
-subscription from Element.
-
-Subscription
-============
-
-For those that need an enterprise-ready solution, Element
-Server Suite (ESS) is `available via subscription <https://element.io/pricing>`_.
-ESS builds on Synapse to offer a complete Matrix-based backend including the full
-`Admin Console product <https://element.io/enterprise-functionality/admin-console>`_,
-giving admins the power to easily manage an organization-wide
-deployment. It includes advanced identity management, auditing,
-moderation and data retention options as well as Long Term Support and
-SLAs. ESS can be used to support any Matrix-based frontend client.
+=========================================================================
+Synapse |support| |development| |documentation| |license| |pypi| |python|
+=========================================================================
 
 .. contents::
 
-🛠️ Installing and configuration
-===============================
-
-The Synapse documentation describes `how to install Synapse <https://element-hq.github.io/synapse/latest/setup/installation.html>`_. We recommend using
-`Docker images <https://element-hq.github.io/synapse/latest/setup/installation.html#docker-images-and-ansible-playbooks>`_ or `Debian packages from Matrix.org
-<https://element-hq.github.io/synapse/latest/setup/installation.html#matrixorg-packages>`_.
+Introduction
+============
+
+Matrix is an ambitious new ecosystem for open federated Instant Messaging and
+VoIP. The basics you need to know to get up and running are:
+
+- Everything in Matrix happens in a room. Rooms are distributed and do not
+  exist on any single server. Rooms can be located using convenience aliases
+  like ``#matrix:matrix.org`` or ``#test:localhost:8448``.
+
+- Matrix user IDs look like ``@matthew:matrix.org`` (although in the future
+  you will normally refer to yourself and others using a third party identifier
+  (3PID): email address, phone number, etc rather than manipulating Matrix user IDs)
+
+The overall architecture is::
+
+    client <----> homeserver <=====================> homeserver <----> client
+           https://somewhere.org/_matrix      https://elsewhere.net/_matrix
+
+``#matrix:matrix.org`` is the official support room for Matrix, and can be
+accessed by any client from https://matrix.org/docs/projects/try-matrix-now.html or
+via IRC bridge at irc://irc.libera.chat/matrix.
+
+Synapse is currently in rapid development, but as of version 0.5 we believe it
+is sufficiently stable to be run as an internet-facing service for real usage!
+
+About Matrix
+============
+
+Matrix specifies a set of pragmatic RESTful HTTP JSON APIs as an open standard,
+which handle:
+
+- Creating and managing fully distributed chat rooms with no
+  single points of control or failure
+- Eventually-consistent cryptographically secure synchronisation of room
+  state across a global open network of federated servers and services
+- Sending and receiving extensible messages in a room with (optional)
+  end-to-end encryption
+- Inviting, joining, leaving, kicking, banning room members
+- Managing user accounts (registration, login, logout)
+- Using 3rd Party IDs (3PIDs) such as email addresses, phone numbers,
+  Facebook accounts to authenticate, identify and discover users on Matrix.
+- Placing 1:1 VoIP and Video calls
+
+These APIs are intended to be implemented on a wide range of servers, services
+and clients, letting developers build messaging and VoIP functionality on top
+of the entirely open Matrix ecosystem rather than using closed or proprietary
+solutions. The hope is for Matrix to act as the building blocks for a new
+generation of fully open and interoperable messaging and VoIP apps for the
+internet.
+
+Synapse is a Matrix "homeserver" implementation developed by the matrix.org core
+team, written in Python 3/Twisted.
+
+In Matrix, every user runs one or more Matrix clients, which connect through to
+a Matrix homeserver. The homeserver stores all their personal chat history and
+user account information - much as a mail client connects through to an
+IMAP/SMTP server. Just like email, you can either run your own Matrix
+homeserver and control and own your own communications and history or use one
+hosted by someone else (e.g. matrix.org) - there is no single point of control
+or mandatory service provider in Matrix, unlike WhatsApp, Facebook, Hangouts,
+etc.
+
+We'd like to invite you to join #matrix:matrix.org (via
+https://matrix.org/docs/projects/try-matrix-now.html), run a homeserver, take a look
+at the `Matrix spec <https://matrix.org/docs/spec>`_, and experiment with the
+`APIs <https://matrix.org/docs/api>`_ and `Client SDKs
+<https://matrix.org/docs/projects/try-matrix-now.html#client-sdks>`_.
+
+Thanks for using Matrix!
+
+Support
+=======
+
+For support installing or managing Synapse, please join |room|_ (from a matrix.org
+account if necessary) and ask questions there. We do not use GitHub issues for
+support requests, only for bug reports and feature requests.
+
+Synapse's documentation is `nicely rendered on GitHub Pages <https://matrix-org.github.io/synapse>`_,
+with its source available in |docs|_.
+
+.. |room| replace:: ``#synapse:matrix.org``
+.. _room: https://matrix.to/#/#synapse:matrix.org
+
+.. |docs| replace:: ``docs``
+.. _docs: docs
+
+Synapse Installation
+====================
 
 .. _federation:
 
-Synapse has a variety of `config options
-<https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html>`_
-which can be used to customise its behaviour after installation.
-There are additional details on how to `configure Synapse for federation here
-<https://element-hq.github.io/synapse/latest/federate.html>`_.
-
-.. _reverse-proxy:
-
-Using a reverse proxy with Synapse
-----------------------------------
-
-It is recommended to put a reverse proxy such as
-`nginx <https://nginx.org/en/docs/http/ngx_http_proxy_module.html>`_,
-`Apache <https://httpd.apache.org/docs/current/mod/mod_proxy_http.html>`_,
-`Caddy <https://caddyserver.com/docs/quick-starts/reverse-proxy>`_,
-`HAProxy <https://www.haproxy.org/>`_ or
-`relayd <https://man.openbsd.org/relayd.8>`_ in front of Synapse. One advantage of
-doing so is that it means that you can expose the default https port (443) to
-Matrix clients without needing to run Synapse with root privileges.
-For information on configuring one, see `the reverse proxy docs
-<https://element-hq.github.io/synapse/latest/reverse_proxy.html>`_.
-
-Upgrading an existing Synapse
------------------------------
-
-The instructions for upgrading Synapse are in `the upgrade notes`_.
-Please check these instructions as upgrading may require extra steps for some
-versions of Synapse.
-
-.. _the upgrade notes: https://element-hq.github.io/synapse/develop/upgrade.html
-
-Platform dependencies
----------------------
-
-Synapse uses a number of platform dependencies such as Python and PostgreSQL,
-and aims to follow supported upstream versions. See the
-`deprecation policy <https://element-hq.github.io/synapse/latest/deprecation_policy.html>`_
-for more details.
+* For details on how to install synapse, see
+  `Installation Instructions <https://matrix-org.github.io/synapse/latest/setup/installation.html>`_.
+* For specific details on how to configure Synapse for federation see `docs/federate.md <docs/federate.md>`_
+
+Connecting to Synapse from a client
+===================================
+
+The easiest way to try out your new Synapse installation is by connecting to it
+from a web client.
+
+Unless you are running a test instance of Synapse on your local machine, in
+general, you will need to enable TLS support before you can successfully
+connect from a client: see
+`TLS certificates <https://matrix-org.github.io/synapse/latest/setup/installation.html#tls-certificates>`_.
+
+An easy way to get started is to login or register via Element at
+https://app.element.io/#/login or https://app.element.io/#/register respectively.
+You will need to change the server you are logging into from ``matrix.org``
+and instead specify a Homeserver URL of ``https://<server_name>:8448``
+(or just ``https://<server_name>`` if you are using a reverse proxy).
+If you prefer to use another client, refer to our
+`client breakdown <https://matrix.org/docs/projects/clients-matrix>`_.
+
+If all goes well you should at least be able to log in, create a room, and
+start sending messages.
+
+.. _`client-user-reg`:
+
+Registering a new user from a client
+------------------------------------
+
+By default, registration of new users via Matrix clients is disabled. To enable
+it, specify ``enable_registration: true`` in ``homeserver.yaml``. (It is then
+recommended to also set up CAPTCHA - see `<docs/CAPTCHA_SETUP.md>`_.)
+
+Once ``enable_registration`` is set to ``true``, it is possible to register a
+user via a Matrix client.
+
+Your new user name will be formed partly from the ``server_name``, and partly
+from a localpart you specify when you create the account. Your name will take
+the form of::
+
+    @localpart:my.domain.name
+
+(pronounced "at localpart on my dot domain dot name").
+
+As when logging in, you will need to specify a "Custom server". Specify your
+desired ``localpart`` in the 'User name' box.
 
 Security note
--------------
+=============
 
 Matrix serves raw, user-supplied data in some APIs -- specifically the `content
 repository endpoints`_.
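Both versions of the registration section above explain that a new user name is formed from a localpart plus the homeserver's ``server_name``, giving ``@localpart:my.domain.name``. A tiny illustrative helper (the function name is ours; the full identifier grammar is defined by the Matrix specification, not by this sketch):

```python
def make_user_id(localpart: str, server_name: str) -> str:
    """Form a Matrix user ID from a localpart and the homeserver's
    server_name, e.g. make_user_id("alice", "my.domain.name")."""
    return f"@{localpart}:{server_name}"


print(make_user_id("alice", "my.domain.name"))  # @alice:my.domain.name
```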
@@ -119,91 +187,33 @@ Following this advice ensures that even if an XSS is found in Synapse, the
 impact to other applications will be minimal.
 
 
-🧪 Testing a new installation
+Upgrading an existing Synapse
 =============================
 
-The easiest way to try out your new Synapse installation is by connecting to it
-from a web client.
-
-Unless you are running a test instance of Synapse on your local machine, in
-general, you will need to enable TLS support before you can successfully
-connect from a client: see
-`TLS certificates <https://element-hq.github.io/synapse/latest/setup/installation.html#tls-certificates>`_.
-
-An easy way to get started is to login or register via Element at
-https://app.element.io/#/login or https://app.element.io/#/register respectively.
-You will need to change the server you are logging into from ``matrix.org``
-and instead specify a Homeserver URL of ``https://<server_name>:8448``
-(or just ``https://<server_name>`` if you are using a reverse proxy).
-If you prefer to use another client, refer to our
-`client breakdown <https://matrix.org/ecosystem/clients/>`_.
-
-If all goes well you should at least be able to log in, create a room, and
-start sending messages.
-
-.. _`client-user-reg`:
-
-Registering a new user from a client
-------------------------------------
-
-By default, registration of new users via Matrix clients is disabled. To enable
-it:
-
-1. In the
-   `registration config section <https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html#registration>`_
-   set ``enable_registration: true`` in ``homeserver.yaml``.
-2. Then **either**:
-
-   a. set up a `CAPTCHA <https://element-hq.github.io/synapse/latest/CAPTCHA_SETUP.html>`_, or
-   b. set ``enable_registration_without_verification: true`` in ``homeserver.yaml``.
-
-We **strongly** recommend using a CAPTCHA, particularly if your homeserver is exposed to
-the public internet. Without it, anyone can freely register accounts on your homeserver.
-This can be exploited by attackers to create spambots targeting the rest of the Matrix
-federation.
-
-Your new user name will be formed partly from the ``server_name``, and partly
-from a localpart you specify when you create the account. Your name will take
-the form of::
-
-    @localpart:my.domain.name
-
-(pronounced "at localpart on my dot domain dot name").
-
-As when logging in, you will need to specify a "Custom server". Specify your
-desired ``localpart`` in the 'User name' box.
-
-🎯 Troubleshooting and support
-==============================
-
-🚀 Professional support
------------------------
-
-Enterprise quality support for Synapse including SLAs is available as part of an
-`Element Server Suite (ESS) <https://element.io/pricing>`_ subscription.
-
-If you are an existing ESS subscriber then you can raise a `support request <https://ems.element.io/support>`_
-and access the `knowledge base <https://ems-docs.element.io>`_.
-
-🤝 Community support
---------------------
-
-The `Admin FAQ <https://element-hq.github.io/synapse/latest/usage/administration/admin_faq.html>`_
-includes tips on dealing with some common problems. For more details, see
-`Synapse's wider documentation <https://element-hq.github.io/synapse/latest/>`_.
-
-For additional support installing or managing Synapse, please ask in the community
-support room |room|_ (from a matrix.org account if necessary). We do not use GitHub
-issues for support requests, only for bug reports and feature requests.
-
-.. |room| replace:: ``#synapse:matrix.org``
-.. _room: https://matrix.to/#/#synapse:matrix.org
-
-.. |docs| replace:: ``docs``
-.. _docs: docs
-
-🪪 Identity Servers
-===================
+The instructions for upgrading synapse are in `the upgrade notes`_.
+Please check these instructions as upgrading may require extra steps for some
+versions of synapse.
+
+.. _the upgrade notes: https://matrix-org.github.io/synapse/develop/upgrade.html
+
+.. _reverse-proxy:
+
+Using a reverse proxy with Synapse
+==================================
+
+It is recommended to put a reverse proxy such as
+`nginx <https://nginx.org/en/docs/http/ngx_http_proxy_module.html>`_,
+`Apache <https://httpd.apache.org/docs/current/mod/mod_proxy_http.html>`_,
+`Caddy <https://caddyserver.com/docs/quick-starts/reverse-proxy>`_,
+`HAProxy <https://www.haproxy.org/>`_ or
+`relayd <https://man.openbsd.org/relayd.8>`_ in front of Synapse. One advantage of
+doing so is that it means that you can expose the default https port (443) to
+Matrix clients without needing to run Synapse with root privileges.
+
+For information on configuring one, see `<docs/reverse_proxy.md>`_.
+
+Identity Servers
+================
 
 Identity servers have the job of mapping email addresses and other 3rd Party
 IDs (3PIDs) to Matrix user IDs, as well as verifying the ownership of 3PIDs
@ -232,43 +242,221 @@ an email address with your account, or send an invite to another user via their
|
|||||||
email address.
|
email address.
|
||||||
|
|
||||||
|
|
||||||
Password reset
==============

Users can reset their password through their client. Alternatively, a server admin
can reset a user's password using the `admin API <docs/admin_api/user_admin_api.rst#reset-password>`_
or by directly editing the database as shown below.

First calculate the hash of the new password::

    $ ~/synapse/env/bin/hash_password
    Password:
    Confirm password:
    $2a$12$xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Then update the ``users`` table in the database::

    UPDATE users SET password_hash='$2a$12$xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
        WHERE name='@test:test.com';

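The admin API route for password resets can also be driven from a short script. Below is a minimal standard-library sketch: the endpoint path and body mirror the admin API linked above, while the base URL and token are placeholders you must supply:

```python
import json
import urllib.parse
import urllib.request

def build_reset_password_request(base_url, user_id, new_password, admin_token):
    # POST /_synapse/admin/v1/reset_password/<user_id> with a JSON body.
    url = f"{base_url}/_synapse/admin/v1/reset_password/{urllib.parse.quote(user_id)}"
    body = json.dumps({"new_password": new_password, "logout_devices": True}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {admin_token}",
            "Content-Type": "application/json",
        },
    )

req = build_reset_password_request(
    "https://my.domain.name", "@test:test.com", "new-password", "<access token>"
)
# Send with urllib.request.urlopen(req) once the placeholders are filled in.
```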
🛠️ Development
==============

We welcome contributions to Synapse from the community!
The best place to get started is our
`guide for contributors <https://element-hq.github.io/synapse/latest/development/contributing_guide.html>`_.
This is part of our larger `documentation <https://element-hq.github.io/synapse/latest>`_, which includes
information for Synapse developers as well as Synapse administrators.
Developers might be particularly interested in:

* `Synapse's database schema <https://element-hq.github.io/synapse/latest/development/database_schema.html>`_,
* `notes on Synapse's implementation details <https://element-hq.github.io/synapse/latest/development/internal_documentation/index.html>`_, and
* `how we use git <https://element-hq.github.io/synapse/latest/development/git.html>`_.

Alongside all that, join our developer community on Matrix:
`#synapse-dev:matrix.org <https://matrix.to/#/#synapse-dev:matrix.org>`_, featuring real humans!

Copyright and Licensing
=======================

| Copyright 2014-2017 OpenMarket Ltd
| Copyright 2017 Vector Creations Ltd
| Copyright 2017-2025 New Vector Ltd
|

This software is dual-licensed by New Vector Ltd (Element). It can be used either:

(1) for free under the terms of the GNU Affero General Public License (as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version); OR

(2) under the terms of a paid-for Element Commercial License agreement between you and Element (the terms of which may vary depending on what you and Element have agreed to).

Unless required by applicable law or agreed to in writing, software distributed under the Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.

Quick start
-----------

Before setting up a development environment for synapse, make sure you have the
system dependencies (such as the python header files) installed - see
`Platform-specific prerequisites <https://matrix-org.github.io/synapse/latest/setup/installation.html#platform-specific-prerequisites>`_.

To check out a synapse for development, clone the git repo into a working
directory of your choice::

    git clone https://github.com/matrix-org/synapse.git
    cd synapse

Synapse has a number of external dependencies that are easiest
to install using pip and a virtualenv::

    python3 -m venv ./env
    source ./env/bin/activate
    pip install -e ".[all,dev]"

This will run a process of downloading and installing all the needed
dependencies into a virtual env. If any dependencies fail to install,
try installing the failing modules individually::

    pip install -e "module-name"

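If it is unclear whether the virtualenv is actually active in a given shell, one quick check (an illustrative standard-library snippet, not part of Synapse) is to compare the interpreter prefixes:

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv, sys.prefix points at the environment while
    # sys.base_prefix still points at the system installation.
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("virtualenv active:", in_virtualenv())
```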
We recommend using the demo which starts 3 federated instances running on ports `8080` - `8082`::

    ./demo/start.sh

(to stop, you can use `./demo/stop.sh`)

If you just want to start a single instance of the app and run it directly::

    # Create the homeserver.yaml config once
    python -m synapse.app.homeserver \
        --server-name my.domain.name \
        --config-path homeserver.yaml \
        --generate-config \
        --report-stats=[yes|no]

    # Start the app
    python -m synapse.app.homeserver --config-path homeserver.yaml

Running the unit tests
----------------------

After getting up and running, you may wish to run Synapse's unit tests to
check that everything is installed correctly::

    trial tests

This should end with a 'PASSED' result (note that exact numbers will
differ)::

    Ran 1337 tests in 716.064s

    PASSED (skips=15, successes=1322)

For more tips on running the unit tests, like running a specific test or
to see the logging output, see the `CONTRIBUTING doc <CONTRIBUTING.md#run-the-unit-tests>`_.


Running the Integration Tests
-----------------------------

Synapse is accompanied by `SyTest <https://github.com/matrix-org/sytest>`_,
a Matrix homeserver integration testing suite, which uses HTTP requests to
access the API as a Matrix client would. It is able to run Synapse directly from
the source tree, so installation of the server is not required.

Testing with SyTest is recommended for verifying that changes related to the
Client-Server API are functioning correctly. See the `SyTest installation
instructions <https://github.com/matrix-org/sytest#installing>`_ for details.


Platform dependencies
=====================

Synapse uses a number of platform dependencies such as Python and PostgreSQL,
and aims to follow supported upstream versions. See the
`<docs/deprecation_policy.md>`_ document for more details.


Troubleshooting
===============

Need help? Join our community support room on Matrix:
`#synapse:matrix.org <https://matrix.to/#/#synapse:matrix.org>`_

Running out of File Handles
---------------------------

If synapse runs out of file handles, it typically fails badly - live-locking
at 100% CPU, and/or failing to accept new TCP connections (blocking the
connecting client). Matrix currently can legitimately use a lot of file handles,
thanks to busy rooms like #matrix:matrix.org containing hundreds of participating
servers. The first time a server talks in a room it will try to connect
simultaneously to all participating servers, which could exhaust the available
file descriptors between DNS queries & HTTPS sockets, especially if DNS is slow
to respond. (We need to improve the routing algorithm used to be better than
full mesh, but as of March 2019 this hasn't happened yet).

If you hit this failure mode, we recommend increasing the maximum number of
open file handles to be at least 4096 (assuming a default of 1024 or 256).
This is typically done by editing ``/etc/security/limits.conf``

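To see what limit a running process actually ended up with after such a change, you can query it from Python's standard ``resource`` module. This is a Unix-only diagnostic sketch, not part of Synapse:

```python
import resource

def nofile_limits():
    """Return the (soft, hard) open-file limits for this process."""
    return resource.getrlimit(resource.RLIMIT_NOFILE)

soft, hard = nofile_limits()
print(f"soft={soft} hard={hard}")
if soft != resource.RLIM_INFINITY and soft < 4096:
    print("soft limit is below the 4096 suggested above")
```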
Separately, Synapse may leak file handles if inbound HTTP requests get stuck
during processing - e.g. blocked behind a lock or talking to a remote server etc.
This is best diagnosed by matching up the 'Received request' and 'Processed request'
log lines and looking for any 'Processed request' lines which take more than
a few seconds to execute. Please let us know at #synapse:matrix.org if
you see this failure mode so we can help debug it, however.

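Matching up the 'Received request' and 'Processed request' lines can be automated. The sketch below pairs log lines by their request tag and reports slow requests; the regex is an assumption for demonstration and may need adjusting to your exact logging configuration:

```python
import re
from datetime import datetime

LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d+).*- (?P<tag>\w+-\d+) - "
    r"(?P<kind>Received|Processed) request"
)

def find_slow_requests(lines, threshold_s=5.0):
    """Pair request log lines by tag; return (tag, seconds) pairs over threshold."""
    started, slow = {}, []
    for line in lines:
        m = LINE.search(line)
        if not m:
            continue
        ts = datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S,%f")
        if m["kind"] == "Received":
            started[m["tag"]] = ts
        elif m["tag"] in started:
            elapsed = (ts - started.pop(m["tag"])).total_seconds()
            if elapsed > threshold_s:
                slow.append((m["tag"], elapsed))
    return slow
```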
Help!! Synapse is slow and eats all my RAM/CPU!
-----------------------------------------------

First, ensure you are running the latest version of Synapse, using Python 3
with a PostgreSQL database.

Synapse's architecture is quite RAM hungry currently - we deliberately
cache a lot of recent room data and metadata in RAM in order to speed up
common requests. We'll improve this in the future, but for now the easiest
way to reduce the RAM usage (at the risk of slowing things down)
is to set the almost-undocumented ``SYNAPSE_CACHE_FACTOR`` environment
variable. The default is 0.5, which can be decreased to reduce RAM usage
in memory constrained environments, or increased if performance starts to
degrade.

However, degraded performance due to a low cache factor, common on
machines with slow disks, often leads to explosions in memory use due to
backlogged requests. In this case, reducing the cache factor will make
things worse. Instead, try increasing it drastically. 2.0 is a good
starting value.

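In a wrapper or supervisor script, the override can be set and read back like so. This is only a sketch of the environment-variable convention described above; Synapse's own config code does the real parsing:

```python
import os

def get_cache_factor(default: float = 0.5) -> float:
    """Read SYNAPSE_CACHE_FACTOR from the environment, defaulting to 0.5."""
    raw = os.environ.get("SYNAPSE_CACHE_FACTOR")
    return float(raw) if raw is not None else default

os.environ["SYNAPSE_CACHE_FACTOR"] = "2.0"  # e.g. for slow-disk machines
print(get_cache_factor())
```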
Using `libjemalloc <http://jemalloc.net/>`_ can also yield a significant
improvement in overall memory use, and especially in terms of giving back
RAM to the OS. To use it, the library must simply be put in the
LD_PRELOAD environment variable when launching Synapse. On Debian, this
can be done by installing the ``libjemalloc1`` package and adding this
line to ``/etc/default/matrix-synapse``::

    LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.1

This can make a significant difference on Python 2.7 - it's unclear how
much of an improvement it provides on Python 3.x.

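Whether the preload actually took effect can be verified from inside the process. A Linux-only diagnostic sketch (not part of Synapse) that scans the process's memory maps:

```python
def jemalloc_loaded() -> bool:
    """Check /proc/self/maps for a mapped jemalloc shared library (Linux only)."""
    try:
        with open("/proc/self/maps") as maps:
            return any("jemalloc" in line for line in maps)
    except OSError:
        # Not Linux, or /proc unavailable.
        return False

print("jemalloc loaded:", jemalloc_loaded())
```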
If you're encountering high CPU use by the Synapse process itself, you
may be affected by a bug with presence tracking that leads to a
massive excess of outgoing federation requests (see `discussion
<https://github.com/matrix-org/synapse/issues/3971>`_). If metrics
indicate that your server is also issuing far more outgoing federation
requests than can be accounted for by your users' activity, this is a
likely cause. The misbehavior can be worked around by setting
the following in the Synapse config file:

.. code-block:: yaml

   presence:
     enabled: false

People can't accept room invitations from me
--------------------------------------------

The typical failure mode here is that you send an invitation to someone
to join a room or direct chat, but when they go to accept it, they get an
error (typically along the lines of "Invalid signature"). They might see
something like the following in their logs::

    2019-09-11 19:32:04,271 - synapse.federation.transport.server - 288 - WARNING - GET-11752 - authenticate_request failed: 401: Invalid signature for server <server> with key ed25519:a_EqML: Unable to verify signature for <server>

This is normally caused by a misconfiguration in your reverse-proxy. See
`<docs/reverse_proxy.md>`_ and double-check that your settings are correct.

.. |support| image:: https://img.shields.io/badge/matrix-community%20support-success
   :alt: (get community support in #synapse:matrix.org)
   :target: https://matrix.to/#/#synapse:matrix.org

.. |development| image:: https://img.shields.io/matrix/synapse-dev:matrix.org?label=development&logo=matrix
@ -277,9 +465,9 @@ Unless required by applicable law or agreed to in writing, software distributed

.. |documentation| image:: https://img.shields.io/badge/documentation-%E2%9C%93-success
   :alt: (Rendered documentation on GitHub Pages)
   :target: https://element-hq.github.io/synapse/latest/

.. |license| image:: https://img.shields.io/github/license/element-hq/synapse
   :alt: (check license in LICENSE file)
   :target: LICENSE

@ -1,7 +1,7 @@
Upgrading Synapse
=================

This document has moved to the `Synapse documentation website <https://element-hq.github.io/synapse/latest/upgrade>`_.
Please update your links.

The markdown source is available in `docs/upgrade.md <docs/upgrade.md>`_.

book.toml
@ -16,14 +16,14 @@ create-missing = false

[output.html]
# The URL visitors will be directed to when they try to edit a page
edit-url-template = "https://github.com/element-hq/synapse/edit/develop/{path}"

# Remove the numbers that appear before each item in the sidebar, as they can
# get quite messy as we nest deeper
no-section-label = true

# The source code URL of the repository
git-repository-url = "https://github.com/element-hq/synapse"

# The path that the docs are hosted on
site-url = "/synapse/"
@ -34,14 +34,6 @@ additional-css = [
    "docs/website_files/table-of-contents.css",
    "docs/website_files/remove-nav-buttons.css",
    "docs/website_files/indent-section-headers.css",
    "docs/website_files/version-picker.css",
]
additional-js = [
    "docs/website_files/table-of-contents.js",
    "docs/website_files/version-picker.js",
    "docs/website_files/version.js",
]
theme = "docs/website_files/theme"

[preprocessor.schema_versions]
command = "./scripts-dev/schema_versions.py"

@ -1,42 +0,0 @@
# A build script for poetry that adds the rust extension.

import itertools
import os
from typing import Any, Dict

from packaging.specifiers import SpecifierSet
from setuptools_rust import Binding, RustExtension


def build(setup_kwargs: Dict[str, Any]) -> None:
    original_project_dir = os.path.dirname(os.path.realpath(__file__))
    cargo_toml_path = os.path.join(original_project_dir, "rust", "Cargo.toml")

    extension = RustExtension(
        target="synapse.synapse_rust",
        path=cargo_toml_path,
        binding=Binding.PyO3,
        # This flag is a no-op in the latest versions. Instead, we need to
        # specify this in the `bdist_wheel` config below.
        py_limited_api=True,
        # We force always building in release mode, as we can't tell the
        # difference between using `poetry` in development vs production.
        debug=False,
    )
    setup_kwargs.setdefault("rust_extensions", []).append(extension)
    setup_kwargs["zip_safe"] = False

    # We lookup the minimum supported python version by looking at
    # `python_requires` (e.g. ">=3.9.0,<4.0.0") and finding the first python
    # version that matches. We then convert that into the `py_limited_api` form,
    # e.g. cp39 for python 3.9.
    py_limited_api: str
    python_bounds = SpecifierSet(setup_kwargs["python_requires"])
    for minor_version in itertools.count(start=8):
        if f"3.{minor_version}.0" in python_bounds:
            py_limited_api = f"cp3{minor_version}"
            break

    setup_kwargs.setdefault("options", {}).setdefault("bdist_wheel", {})[
        "py_limited_api"
    ] = py_limited_api

@ -1 +0,0 @@
Add plain-text handling for rich-text topics as per [MSC3765](https://github.com/matrix-org/matrix-spec-proposals/pull/3765).

@ -1 +0,0 @@
Add experimental support for [MSC4277](https://github.com/matrix-org/matrix-spec-proposals/pull/4277).

@ -1 +0,0 @@
Add ability to limit the amount uploaded by a user in a given time period.

@ -1 +0,0 @@
Allow user registrations to be done on workers.

@ -1 +0,0 @@
Remove unnecessary HTTP replication calls.

@ -1 +0,0 @@
Unbreak "Latest dependencies" workflow by using the `--without dev` poetry option instead of removed `--no-dev`.

@ -1 +0,0 @@
Use `markdown-it-py` instead of `commonmark` in the release script.

@ -1 +0,0 @@
Add doc comment explaining that config files are shallowly merged.

@ -1 +0,0 @@
Minor speed up of insertion into `stream_positions` table.

@ -1,30 +1,21 @@
#!/usr/bin/env python

#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright 2014-2016 OpenMarket Ltd
# Copyright (C) 2023 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# Originally licensed under the Apache License, Version 2.0:
# <http://www.apache.org/licenses/LICENSE-2.0>.
#
# [This file includes modifications made by New Vector Limited]
#

""" Starts a synapse client console. """

import argparse
import binascii
import cmd
import getpass
import json
@ -35,8 +26,9 @@ import urllib
from http import TwistedHttpClient
from typing import Optional

import urlparse
from signedjson.key import NACL_ED25519, decode_verify_key_bytes
from signedjson.sign import SignatureVerifyException, verify_signed_json

from twisted.internet import defer, reactor, threads
@ -49,6 +41,7 @@ TRUSTED_ID_SERVERS = ["localhost:8001"]


class SynapseCmd(cmd.Cmd):
    """Basic synapse command-line processor.

    This processes commands from the user and calls the relevant HTTP methods.
@ -245,7 +238,7 @@ class SynapseCmd(cmd.Cmd):

        if "flows" not in json_res:
            print("Failed to find any login flows.")
            return False

        flow = json_res["flows"][0]  # assume first is the one we want.
        if "type" not in flow or "m.login.password" != flow["type"] or "stages" in flow:
@ -254,8 +247,8 @@ class SynapseCmd(cmd.Cmd):
                "Unable to login via the command line client. Please visit "
                "%s to login." % fallback_url
            )
            return False
        return True

    def do_emailrequest(self, line):
        """Requests the association of a third party identifier
@ -427,8 +420,8 @@ class SynapseCmd(cmd.Cmd):
        pubKey = None
        pubKeyObj = yield self.http_client.do_request("GET", url)
        if "public_key" in pubKeyObj:
            pubKey = decode_verify_key_bytes(
                NACL_ED25519, binascii.unhexlify(pubKeyObj["public_key"])
            )
        else:
            print("No public key found in pubkey response!")
@ -777,7 +770,7 @@ def main(server_url, identity_server_url, username, token, config_path):
    global CONFIG_JSON
    CONFIG_JSON = config_path  # bit cheeky, but just overwrite the global
    try:
        with open(config_path) as config:
            syn_cmd.config = json.load(config)
            try:
                http_client.verbose = "on" == syn_cmd.config["verbose"]
|
@ -1,23 +1,16 @@
|
|||||||
#
|
|
||||||
# This file is licensed under the Affero General Public License (AGPL) version 3.
|
|
||||||
#
|
|
||||||
# Copyright 2014-2016 OpenMarket Ltd
|
# Copyright 2014-2016 OpenMarket Ltd
|
||||||
# Copyright (C) 2023 New Vector, Ltd
|
|
||||||
#
|
#
|
||||||
# This program is free software: you can redistribute it and/or modify
|
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||||
# it under the terms of the GNU Affero General Public License as
|
# you may not use this file except in compliance with the License.
|
||||||
# published by the Free Software Foundation, either version 3 of the
|
# You may obtain a copy of the License at
|
||||||
# License, or (at your option) any later version.
|
|
||||||
#
|
|
||||||
# See the GNU Affero General Public License for more details:
|
|
||||||
# <https://www.gnu.org/licenses/agpl-3.0.html>.
|
|
||||||
#
|
|
||||||
# Originally licensed under the Apache License, Version 2.0:
|
|
||||||
# <http://www.apache.org/licenses/LICENSE-2.0>.
|
|
||||||
#
|
|
||||||
# [This file includes modifications made by New Vector Limited]
|
|
||||||
#
|
#
|
||||||
|
# http://www.apache.org/licenses/LICENSE-2.0
|
||||||
#
|
#
|
||||||
|
# Unless required by applicable law or agreed to in writing, software
|
||||||
|
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||||
|
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||||
|
# See the License for the specific language governing permissions and
|
||||||
|
# limitations under the License.
|
||||||
|
|
||||||
import json
|
import json
|
||||||
import urllib
|
import urllib
|
||||||
@ -44,6 +37,7 @@ class HttpClient:
|
|||||||
Deferred: Succeeds when we get a 2xx HTTP response. The result
|
Deferred: Succeeds when we get a 2xx HTTP response. The result
|
||||||
will be the decoded JSON body.
|
will be the decoded JSON body.
|
||||||
"""
|
"""
|
||||||
|
pass
|
||||||
|
|
||||||
def get_json(self, url, args=None):
|
def get_json(self, url, args=None):
|
||||||
"""Gets some json from the given host homeserver and path
|
"""Gets some json from the given host homeserver and path
|
||||||
@ -59,6 +53,7 @@ class HttpClient:
|
|||||||
Deferred: Succeeds when we get a 2xx HTTP response. The result
|
Deferred: Succeeds when we get a 2xx HTTP response. The result
|
||||||
will be the decoded JSON body.
|
will be the decoded JSON body.
|
||||||
"""
|
"""
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
class TwistedHttpClient(HttpClient):
|
class TwistedHttpClient(HttpClient):
|
||||||
@ -78,7 +73,7 @@ class TwistedHttpClient(HttpClient):
|
|||||||
url, data, headers_dict={"Content-Type": ["application/json"]}
|
url, data, headers_dict={"Content-Type": ["application/json"]}
|
||||||
)
|
)
|
||||||
body = yield readBody(response)
|
body = yield readBody(response)
|
||||||
return response.code, body
|
defer.returnValue((response.code, body))
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def get_json(self, url, args=None):
|
def get_json(self, url, args=None):
|
||||||
@ -88,7 +83,7 @@ class TwistedHttpClient(HttpClient):
|
|||||||
url = "%s?%s" % (url, qs)
|
url = "%s?%s" % (url, qs)
|
||||||
response = yield self._create_get_request(url)
|
response = yield self._create_get_request(url)
|
||||||
body = yield readBody(response)
|
body = yield readBody(response)
|
||||||
return json.loads(body)
|
defer.returnValue(json.loads(body))
|
||||||
|
|
||||||
def _create_put_request(self, url, json_data, headers_dict: Optional[dict] = None):
|
def _create_put_request(self, url, json_data, headers_dict: Optional[dict] = None):
|
||||||
"""Wrapper of _create_request to issue a PUT request"""
|
"""Wrapper of _create_request to issue a PUT request"""
|
||||||
@ -134,7 +129,7 @@ class TwistedHttpClient(HttpClient):
|
|||||||
response = yield self._create_request(method, url)
|
response = yield self._create_request(method, url)
|
||||||
|
|
||||||
body = yield readBody(response)
|
body = yield readBody(response)
|
||||||
return json.loads(body)
|
defer.returnValue(json.loads(body))
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def _create_request(
|
def _create_request(
|
||||||
@ -173,7 +168,7 @@ class TwistedHttpClient(HttpClient):
|
|||||||
if self.verbose:
|
if self.verbose:
|
||||||
print("Status %s %s" % (response.code, response.phrase))
|
print("Status %s %s" % (response.code, response.phrase))
|
||||||
print(pformat(list(response.headers.getAllRawHeaders())))
|
print(pformat(list(response.headers.getAllRawHeaders())))
|
||||||
return response
|
defer.returnValue(response)
|
||||||
|
|
||||||
def sleep(self, seconds):
|
def sleep(self, seconds):
|
||||||
d = defer.Deferred()
|
d = defer.Deferred()
|
||||||
|
@ -1,28 +0,0 @@
# Schema symlinks

This directory contains symlinks to the latest dump of the postgres full schema. This is useful to have, as it allows IDEs to understand our schema and provide autocomplete, linters, inspections, etc.

In particular, the DataGrip functionality in IntelliJ's products seems to only consider files called `*.sql` when defining a schema from DDL; `*.sql.postgres` will be ignored. To get around this we symlink those files to ones ending in `.sql`. We've chosen to ignore the `.sql.sqlite` schema dumps here, as they're not intended for production use (and are much quicker to test against).

## Example



## Caveats

- Doesn't include temporary tables created ad-hoc by Synapse.
- Postgres only. IDEs will likely be confused by SQLite-specific queries.
- Will not include migrations created after the latest schema dump.
- Symlinks might confuse checkouts on Windows systems.

## Instructions

### JetBrains IDEs with DataGrip plugin

- View -> Tool Windows -> Database
- `+` Icon -> DDL Data Source
- Pick a name, e.g. `Synapse schema dump`
- Under sources, click `+`.
- Add an entry with Path pointing to this directory, and dialect set to PostgreSQL.
- OK, and OK.
- IDE should now be aware of the schema.
- Try control-clicking on a table name in a bit of SQL e.g. in `_get_forgotten_rooms_for_user_txn`.
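The symlinking scheme the README describes (exposing a `*.sql.postgres` dump under a plain `*.sql` name so DDL-based tooling picks it up) can be sketched as follows; paths and file contents are illustrative only:

```python
import os
import tempfile

# Sketch of the symlink trick: the IDE only reads files named *.sql,
# so we point a *.sql symlink at the real *.sql.postgres dump.
d = tempfile.mkdtemp()
dump = os.path.join(d, "full.sql.postgres")
with open(dump, "w") as f:
    f.write("CREATE TABLE example (id BIGINT PRIMARY KEY);\n")

link = os.path.join(d, "full.sql")
os.symlink(dump, link)  # the IDE now sees a regular *.sql file

with open(link) as f:
    contents = f.read()  # reading the link yields the dump's DDL
```

As the caveats note, this relies on symlink support, which is why Windows checkouts may be confused.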
@ -1 +0,0 @@
../../synapse/storage/schema/common/full_schemas/72/full.sql.postgres

Binary file not shown.
Before Width: | Height: | Size: 13 KiB
@ -1 +0,0 @@
../../synapse/storage/schema/main/full_schemas/72/full.sql.postgres
@ -1 +0,0 @@
../../synapse/storage/schema/common/schema_version.sql
@ -1 +0,0 @@
../../synapse/storage/schema/state/full_schemas/72/full.sql.postgres
@ -30,6 +30,3 @@ docker-compose up -d
 ### More information

 For more information on required environment variables and mounts, see the main docker documentation at [/docker/README.md](../../docker/README.md)
-
-**For a more comprehensive Docker Compose example showcasing a full Matrix 2.0 stack, please see
-https://github.com/element-hq/element-docker-demo**
@ -14,7 +14,6 @@ services:
     # failure
     restart: unless-stopped
     # See the readme for a full documentation of the environment settings
-    # NOTE: You must edit homeserver.yaml to use postgres, it defaults to sqlite
     environment:
       - SYNAPSE_CONFIG_PATH=/data/homeserver.yaml
     volumes:
@ -51,13 +50,13 @@ services:
       - traefik.http.routers.https-synapse.tls.certResolver=le-ssl

   db:
-    image: docker.io/postgres:15-alpine
+    image: docker.io/postgres:12-alpine
     # Change that password, of course!
     environment:
       - POSTGRES_USER=synapse
       - POSTGRES_PASSWORD=changeme
       # ensure the database gets created correctly
-      # https://element-hq.github.io/synapse/latest/postgres.html#set-up-database
+      # https://matrix-org.github.io/synapse/latest/postgres.html#set-up-database
       - POSTGRES_INITDB_ARGS=--encoding=UTF-8 --lc-collate=C --lc-ctype=C
     volumes:
       # You may store the database tables in a local folder..
@ -1,119 +0,0 @@
# Setting up Synapse with Workers using Docker Compose

This directory describes how to deploy and manage Synapse and workers via [Docker Compose](https://docs.docker.com/compose/).

Example worker configuration files can be found [here](workers).

All examples and snippets assume that your Synapse service is called `synapse` in your Docker Compose file.

An example Docker Compose file can be found [here](docker-compose.yaml).

**For a more comprehensive Docker Compose example, showcasing a full Matrix 2.0 stack (originally based on this
docker-compose.yaml), please see https://github.com/element-hq/element-docker-demo**

## Worker Service Examples in Docker Compose

In order to start the Synapse container as a worker, you must specify an `entrypoint` that loads both the `homeserver.yaml` and the configuration for the worker (`synapse-generic-worker-1.yaml` in the example below). You must also include the worker type in the environment variable `SYNAPSE_WORKER`, or alternatively pass `-m synapse.app.generic_worker` as part of the `entrypoint` after `"/start.py", "run"`.

### Generic Worker Example

```yaml
synapse-generic-worker-1:
  image: matrixdotorg/synapse:latest
  container_name: synapse-generic-worker-1
  restart: unless-stopped
  entrypoint: ["/start.py", "run", "--config-path=/data/homeserver.yaml", "--config-path=/data/workers/synapse-generic-worker-1.yaml"]
  healthcheck:
    test: ["CMD-SHELL", "curl -fSs http://localhost:8081/health || exit 1"]
    start_period: "5s"
    interval: "15s"
    timeout: "5s"
  volumes:
    - ${VOLUME_PATH}/data:/data:rw # Replace VOLUME_PATH with the path to your Synapse volume
  environment:
    SYNAPSE_WORKER: synapse.app.generic_worker
  # Expose port if required so your reverse proxy can send requests to this worker
  # Port configuration will depend on how the http listener is defined in the worker configuration file
  ports:
    - 8081:8081
  depends_on:
    - synapse
```

### Federation Sender Example

Please note: The federation sender does not receive REST API calls so no exposed ports are required.

```yaml
synapse-federation-sender-1:
  image: matrixdotorg/synapse:latest
  container_name: synapse-federation-sender-1
  restart: unless-stopped
  entrypoint: ["/start.py", "run", "--config-path=/data/homeserver.yaml", "--config-path=/data/workers/synapse-federation-sender-1.yaml"]
  healthcheck:
    disable: true
  volumes:
    - ${VOLUME_PATH}/data:/data:rw # Replace VOLUME_PATH with the path to your Synapse volume
  environment:
    SYNAPSE_WORKER: synapse.app.federation_sender
  depends_on:
    - synapse
```

## `homeserver.yaml` Configuration

### Enable Redis

Locate the `redis` section of your `homeserver.yaml` and enable and configure it:

```yaml
redis:
  enabled: true
  host: redis
  port: 6379
  # dbid: <redis_logical_db_id>
  # password: <secret_password>
  # use_tls: True
  # certificate_file: <path_to_certificate>
  # private_key_file: <path_to_private_key>
  # ca_file: <path_to_ca_certificate>
```

This assumes that your Redis service is called `redis` in your Docker Compose file.

### Add a replication Listener

Locate the `listeners` section of your `homeserver.yaml` and add the following replication listener:

```yaml
listeners:
  # Other listeners

  - port: 9093
    type: http
    resources:
      - names: [replication]
```

This listener is used by the workers for replication and is referred to in worker config files using the following settings:

```yaml
worker_replication_host: synapse
worker_replication_http_port: 9093
```

### Configure Federation Senders

This section is applicable if you are using Federation senders (synapse.app.federation_sender). Locate the `send_federation` and `federation_sender_instances` settings in your `homeserver.yaml` and configure them:

```yaml
# This will disable federation sending on the main Synapse instance
send_federation: false

federation_sender_instances:
  - synapse-federation-sender-1 # The worker_name setting in your federation sender worker configuration file
```

## Other Worker types

Using the concepts shown here it is possible to create other worker types in Docker Compose. See the [Workers](https://element-hq.github.io/synapse/latest/workers.html#available-worker-applications) documentation for a list of available workers.
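The replication wiring the README describes (a `replication` listener in `homeserver.yaml`, mirrored by `worker_replication_host`/`worker_replication_http_port` in each worker file) has to agree on the port number. A hypothetical sanity-check helper, not part of Synapse, that sketches the relationship:

```python
# The homeserver's replication listener, as in the README's `listeners` snippet.
homeserver_listeners = [
    {"port": 9093, "type": "http", "resources": [{"names": ["replication"]}]},
]

# The worker-side settings that must point back at that listener.
worker_config = {
    "worker_replication_host": "synapse",
    "worker_replication_http_port": 9093,
}


def replication_listener_matches(listeners, worker):
    """Return True if some listener serves `replication` on the worker's port."""
    for listener in listeners:
        names = [
            name
            for resource in listener.get("resources", [])
            for name in resource.get("names", [])
        ]
        if "replication" in names and listener["port"] == worker["worker_replication_http_port"]:
            return True
    return False
```

A mismatch here (worker pointing at a port with no `replication` resource) is a common source of workers failing to talk to the main process.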
@ -1,77 +0,0 @@
networks:
  backend:

services:
  postgres:
    image: postgres:latest
    restart: unless-stopped
    volumes:
      - ${VOLUME_PATH}/var/lib/postgresql/data:/var/lib/postgresql/data:rw
    networks:
      - backend
    environment:
      POSTGRES_DB: synapse
      POSTGRES_USER: synapse_user
      POSTGRES_PASSWORD: postgres
      POSTGRES_INITDB_ARGS: --encoding=UTF8 --locale=C

  redis:
    image: redis:latest
    restart: unless-stopped
    networks:
      - backend

  synapse:
    image: matrixdotorg/synapse:latest
    container_name: synapse
    restart: unless-stopped
    volumes:
      - ${VOLUME_PATH}/data:/data:rw
    ports:
      - 8008:8008
    networks:
      - backend
    environment:
      SYNAPSE_CONFIG_DIR: /data
      SYNAPSE_CONFIG_PATH: /data/homeserver.yaml
    depends_on:
      - postgres

  synapse-generic-worker-1:
    image: matrixdotorg/synapse:latest
    container_name: synapse-generic-worker-1
    restart: unless-stopped
    entrypoint: ["/start.py", "run", "--config-path=/data/homeserver.yaml", "--config-path=/data/workers/synapse-generic-worker-1.yaml"]
    healthcheck:
      test: ["CMD-SHELL", "curl -fSs http://localhost:8081/health || exit 1"]
      start_period: "5s"
      interval: "15s"
      timeout: "5s"
    networks:
      - backend
    volumes:
      - ${VOLUME_PATH}/data:/data:rw # Replace VOLUME_PATH with the path to your Synapse volume
    environment:
      SYNAPSE_WORKER: synapse.app.generic_worker
    # Expose port if required so your reverse proxy can send requests to this worker
    # Port configuration will depend on how the http listener is defined in the worker configuration file
    ports:
      - 8081:8081
    depends_on:
      - synapse

  synapse-federation-sender-1:
    image: matrixdotorg/synapse:latest
    container_name: synapse-federation-sender-1
    restart: unless-stopped
    entrypoint: ["/start.py", "run", "--config-path=/data/homeserver.yaml", "--config-path=/data/workers/synapse-federation-sender-1.yaml"]
    healthcheck:
      disable: true
    networks:
      - backend
    volumes:
      - ${VOLUME_PATH}/data:/data:rw # Replace VOLUME_PATH with the path to your Synapse volume
    environment:
      SYNAPSE_WORKER: synapse.app.federation_sender
    depends_on:
      - synapse
@ -1,8 +0,0 @@
worker_app: synapse.app.federation_sender
worker_name: synapse-federation-sender-1

# The replication listener on the main synapse process.
worker_replication_host: synapse
worker_replication_http_port: 9093

worker_log_config: /data/federation_sender.log.config
@ -1,15 +0,0 @@
worker_app: synapse.app.generic_worker
worker_name: synapse-generic-worker-1

# The replication listener on the main synapse process.
worker_replication_host: synapse
worker_replication_http_port: 9093

worker_listeners:
  - type: http
    port: 8081
    x_forwarded: true
    resources:
      - names: [client, federation]

worker_log_config: /data/worker.log.config
165 contrib/experiments/cursesio.py Normal file
@ -0,0 +1,165 @@
# Copyright 2014-2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import curses
import curses.wrapper
from curses.ascii import isprint

from twisted.internet import reactor


class CursesStdIO:
    def __init__(self, stdscr, callback=None):
        self.statusText = "Synapse test app -"
        self.searchText = ""
        self.stdscr = stdscr

        self.logLine = ""

        self.callback = callback

        self._setup()

    def _setup(self):
        self.stdscr.nodelay(1)  # Make non blocking

        self.rows, self.cols = self.stdscr.getmaxyx()
        self.lines = []

        curses.use_default_colors()

        self.paintStatus(self.statusText)
        self.stdscr.refresh()

    def set_callback(self, callback):
        self.callback = callback

    def fileno(self):
        """We want to select on FD 0"""
        return 0

    def connectionLost(self, reason):
        self.close()

    def print_line(self, text):
        """add a line to the internal list of lines"""

        self.lines.append(text)
        self.redraw()

    def print_log(self, text):
        self.logLine = text
        self.redraw()

    def redraw(self):
        """method for redisplaying lines based on internal list of lines"""

        self.stdscr.clear()
        self.paintStatus(self.statusText)
        i = 0
        index = len(self.lines) - 1
        while i < (self.rows - 3) and index >= 0:
            self.stdscr.addstr(self.rows - 3 - i, 0, self.lines[index], curses.A_NORMAL)
            i = i + 1
            index = index - 1

        self.printLogLine(self.logLine)

        self.stdscr.refresh()

    def paintStatus(self, text):
        if len(text) > self.cols:
            raise RuntimeError("TextTooLongError")

        self.stdscr.addstr(
            self.rows - 2, 0, text + " " * (self.cols - len(text)), curses.A_STANDOUT
        )

    def printLogLine(self, text):
        self.stdscr.addstr(
            0, 0, text + " " * (self.cols - len(text)), curses.A_STANDOUT
        )

    def doRead(self):
        """Input is ready!"""
        curses.noecho()
        c = self.stdscr.getch()  # read a character

        if c == curses.KEY_BACKSPACE:
            self.searchText = self.searchText[:-1]

        elif c == curses.KEY_ENTER or c == 10:
            text = self.searchText
            self.searchText = ""

            self.print_line(">> %s" % text)

            try:
                if self.callback:
                    self.callback.on_line(text)
            except Exception as e:
                self.print_line(str(e))

            self.stdscr.refresh()

        elif isprint(c):
            if len(self.searchText) == self.cols - 2:
                return
            self.searchText = self.searchText + chr(c)

        self.stdscr.addstr(
            self.rows - 1,
            0,
            self.searchText + (" " * (self.cols - len(self.searchText) - 2)),
        )

        self.paintStatus(self.statusText + " %d" % len(self.searchText))
        self.stdscr.move(self.rows - 1, len(self.searchText))
        self.stdscr.refresh()

    def logPrefix(self):
        return "CursesStdIO"

    def close(self):
        """clean up"""

        curses.nocbreak()
        self.stdscr.keypad(0)
        curses.echo()
        curses.endwin()


class Callback:
    def __init__(self, stdio):
        self.stdio = stdio

    def on_line(self, text):
        self.stdio.print_line(text)


def main(stdscr):
    screen = CursesStdIO(stdscr)  # create Screen object

    callback = Callback(screen)

    screen.set_callback(callback)

    stdscr.refresh()
    reactor.addReader(screen)
    reactor.run()
    screen.close()


if __name__ == "__main__":
    curses.wrapper(main)
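`CursesStdIO` can be handed to `reactor.addReader` because it implements the reader shape Twisted expects: `fileno()` names the file descriptor to watch, and `doRead()` is invoked when it becomes readable. The same duck-typed pattern works with plain `select` from the standard library; a minimal sketch using a pipe instead of stdin (names are illustrative):

```python
import os
import select


class PipeReader:
    """Minimal object following the fileno()/doRead() reader pattern."""

    def __init__(self):
        self.read_fd, self.write_fd = os.pipe()
        self.received = b""

    def fileno(self):
        # Tells the event loop which file descriptor to watch.
        return self.read_fd

    def doRead(self):
        # Called once the descriptor is readable.
        self.received += os.read(self.read_fd, 4096)


reader = PipeReader()
os.write(reader.write_fd, b"hello")

# select() accepts any object with a fileno() method, just like Twisted.
ready, _, _ = select.select([reader], [], [], 1.0)
for r in ready:
    r.doRead()
```

In the contrib script the descriptor is FD 0 (stdin), so keystrokes wake the reactor, which then calls `doRead()` to pull characters via curses.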
367 contrib/experiments/test_messaging.py Normal file
@ -0,0 +1,367 @@
# Copyright 2014-2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


""" This is an example of using the server to server implementation to do a
basic chat style thing. It accepts commands from stdin and outputs to stdout.

It assumes that ucids are of the form <user>@<domain>, and uses <domain> as
the address of the remote home server to hit.

Usage:
    python test_messaging.py <port>

Currently assumes the local address is localhost:<port>

"""


import argparse
import curses.wrapper
import json
import logging
import os
import re

import cursesio

from twisted.internet import defer, reactor
from twisted.python import log

from synapse.app.homeserver import SynapseHomeServer
from synapse.federation import ReplicationHandler
from synapse.federation.units import Pdu
from synapse.util import origin_from_ucid

# from synapse.logging.utils import log_function


logger = logging.getLogger("example")


def excpetion_errback(failure):
    logging.exception(failure)


class InputOutput:
    """This is responsible for basic I/O so that a user can interact with
    the example app.
    """

    def __init__(self, screen, user):
        self.screen = screen
        self.user = user

    def set_home_server(self, server):
        self.server = server

    def on_line(self, line):
        """This is where we process commands."""

        try:
            m = re.match(r"^join (\S+)$", line)
            if m:
                # The `sender` wants to join a room.
                (room_name,) = m.groups()
                self.print_line("%s joining %s" % (self.user, room_name))
                self.server.join_room(room_name, self.user, self.user)
                # self.print_line("OK.")
                return

            m = re.match(r"^invite (\S+) (\S+)$", line)
            if m:
                # `sender` wants to invite someone to a room
                room_name, invitee = m.groups()
                self.print_line("%s invited to %s" % (invitee, room_name))
                self.server.invite_to_room(room_name, self.user, invitee)
                # self.print_line("OK.")
                return

            m = re.match(r"^send (\S+) (.*)$", line)
            if m:
                # `sender` wants to message a room
                room_name, body = m.groups()
                self.print_line("%s send to %s" % (self.user, room_name))
                self.server.send_message(room_name, self.user, body)
                # self.print_line("OK.")
                return

            m = re.match(r"^backfill (\S+)$", line)
            if m:
                # we want to backfill a room
                (room_name,) = m.groups()
                self.print_line("backfill %s" % room_name)
                self.server.backfill(room_name)
                return

            self.print_line("Unrecognized command")

        except Exception as e:
            logger.exception(e)

    def print_line(self, text):
        self.screen.print_line(text)

    def print_log(self, text):
        self.screen.print_log(text)


class IOLoggerHandler(logging.Handler):
    def __init__(self, io):
        logging.Handler.__init__(self)
        self.io = io

    def emit(self, record):
        if record.levelno < logging.WARN:
            return

        msg = self.format(record)
        self.io.print_log(msg)


class Room:
    """Used to store (in memory) the current membership state of a room, and
    which home servers we should send PDUs associated with the room to.
    """

    def __init__(self, room_name):
        self.room_name = room_name
        self.invited = set()
        self.participants = set()
        self.servers = set()

        self.oldest_server = None

        self.have_got_metadata = False

    def add_participant(self, participant):
        """Someone has joined the room"""
        self.participants.add(participant)
        self.invited.discard(participant)

        server = origin_from_ucid(participant)
        self.servers.add(server)

        if not self.oldest_server:
            self.oldest_server = server

    def add_invited(self, invitee):
        """Someone has been invited to the room"""
        self.invited.add(invitee)
        self.servers.add(origin_from_ucid(invitee))


class HomeServer(ReplicationHandler):
    """A very basic home server implementation that allows people to join a
    room and then invite other people.
    """

    def __init__(self, server_name, replication_layer, output):
        self.server_name = server_name
        self.replication_layer = replication_layer
        self.replication_layer.set_handler(self)

        self.joined_rooms = {}

        self.output = output

    def on_receive_pdu(self, pdu):
        """We just received a PDU"""
        pdu_type = pdu.pdu_type

        if pdu_type == "sy.room.message":
            self._on_message(pdu)
        elif pdu_type == "sy.room.member" and "membership" in pdu.content:
            if pdu.content["membership"] == "join":
                self._on_join(pdu.context, pdu.state_key)
            elif pdu.content["membership"] == "invite":
                self._on_invite(pdu.origin, pdu.context, pdu.state_key)
        else:
            self.output.print_line(
                "#%s (unrec) %s = %s"
                % (pdu.context, pdu.pdu_type, json.dumps(pdu.content))
            )

    def _on_message(self, pdu):
        """We received a message"""
        self.output.print_line(
            "#%s %s %s" % (pdu.context, pdu.content["sender"], pdu.content["body"])
        )

    def _on_join(self, context, joinee):
        """Someone has joined a room, either a remote user or a local user"""
        room = self._get_or_create_room(context)
        room.add_participant(joinee)

        self.output.print_line("#%s %s %s" % (context, joinee, "*** JOINED"))

    def _on_invite(self, origin, context, invitee):
        """Someone has been invited"""
        room = self._get_or_create_room(context)
        room.add_invited(invitee)

        self.output.print_line("#%s %s %s" % (context, invitee, "*** INVITED"))

        if not room.have_got_metadata and origin is not self.server_name:
            logger.debug("Get room state")
            self.replication_layer.get_state_for_context(origin, context)
            room.have_got_metadata = True

    @defer.inlineCallbacks
    def send_message(self, room_name, sender, body):
        """Send a message to a room!"""
        destinations = yield self.get_servers_for_context(room_name)

        try:
            yield self.replication_layer.send_pdu(
                Pdu.create_new(
                    context=room_name,
                    pdu_type="sy.room.message",
                    content={"sender": sender, "body": body},
                    origin=self.server_name,
                    destinations=destinations,
                )
            )
        except Exception as e:
            logger.exception(e)

    @defer.inlineCallbacks
    def join_room(self, room_name, sender, joinee):
        """Join a room!"""
        self._on_join(room_name, joinee)

        destinations = yield self.get_servers_for_context(room_name)

        try:
            pdu = Pdu.create_new(
                context=room_name,
                pdu_type="sy.room.member",
                is_state=True,
                state_key=joinee,
                content={"membership": "join"},
                origin=self.server_name,
                destinations=destinations,
            )
            yield self.replication_layer.send_pdu(pdu)
        except Exception as e:
            logger.exception(e)

    @defer.inlineCallbacks
    def invite_to_room(self, room_name, sender, invitee):
        """Invite someone to a room!"""
        self._on_invite(self.server_name, room_name, invitee)

        destinations = yield self.get_servers_for_context(room_name)

        try:
            yield self.replication_layer.send_pdu(
                Pdu.create_new(
                    context=room_name,
                    is_state=True,
                    pdu_type="sy.room.member",
                    state_key=invitee,
                    content={"membership": "invite"},
                    origin=self.server_name,
                    destinations=destinations,
                )
            )
        except Exception as e:
            logger.exception(e)

    def backfill(self, room_name, limit=5):
        room = self.joined_rooms.get(room_name)

        if not room:
            return

        dest = room.oldest_server

        return self.replication_layer.backfill(dest, room_name, limit)

    def _get_room_remote_servers(self, room_name):
        return list(self.joined_rooms.setdefault(room_name).servers)

    def _get_or_create_room(self, room_name):
        return self.joined_rooms.setdefault(room_name, Room(room_name))

    def get_servers_for_context(self, context):
        return defer.succeed(
            self.joined_rooms.setdefault(context, Room(context)).servers
        )


def main(stdscr):
    parser = argparse.ArgumentParser()
    parser.add_argument("user", type=str)
    parser.add_argument("-v", "--verbose", action="count")
    args = parser.parse_args()

    user = args.user
    server_name = origin_from_ucid(user)

    # Set up logging

    root_logger = logging.getLogger()

    formatter = logging.Formatter(
        "%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(message)s"
    )
    if not os.path.exists("logs"):
        os.makedirs("logs")
    fh = logging.FileHandler("logs/%s" % user)
    fh.setFormatter(formatter)

    root_logger.addHandler(fh)
    root_logger.setLevel(logging.DEBUG)

    # Hack: The only way to get it to stop logging to sys.stderr :(
    log.theLogPublisher.observers = []
    observer = log.PythonLoggingObserver()
    observer.start()

    # Set up synapse server

    curses_stdio = cursesio.CursesStdIO(stdscr)
    input_output = InputOutput(curses_stdio, user)

    curses_stdio.set_callback(input_output)

    app_hs = SynapseHomeServer(server_name, db_name="dbs/%s" % user)
    replication = app_hs.get_replication_layer()

    hs = HomeServer(server_name, replication, curses_stdio)

    input_output.set_home_server(hs)

    # Add input_output logger
    io_logger = IOLoggerHandler(input_output)
    io_logger.setFormatter(formatter)
    root_logger.addHandler(io_logger)

    # Start!

    try:
        port = int(server_name.split(":")[1])
    except Exception:
        port = 12345

    app_hs.get_http_server().start_listening(port)

    reactor.addReader(curses_stdio)

    reactor.run()


if __name__ == "__main__":
    curses.wrapper(main)
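The `InputOutput.on_line` method above dispatches stdin commands by matching a series of anchored regexes. The dispatch logic can be isolated and exercised on its own; a self-contained sketch using the same patterns (the `parse_command` helper is illustrative, not part of the script):

```python
import re

# The command patterns from InputOutput.on_line, tried in order.
COMMANDS = [
    (r"^join (\S+)$", "join"),
    (r"^invite (\S+) (\S+)$", "invite"),
    (r"^send (\S+) (.*)$", "send"),
    (r"^backfill (\S+)$", "backfill"),
]


def parse_command(line):
    """Return (command, *args) for a recognized line, or None."""
    for pattern, name in COMMANDS:
        m = re.match(pattern, line)
        if m:
            return (name,) + m.groups()
    return None
```

Note that `send` uses `(.*)` for its final group, so message bodies may contain spaces, while room names and user ids are single `\S+` tokens.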
@ -1,6 +1,6 @@
 # Using the Synapse Grafana dashboard

 0. Set up Prometheus and Grafana. Out of scope for this readme. Useful documentation about using Grafana with Prometheus: http://docs.grafana.org/features/datasources/prometheus/
-1. Have your Prometheus scrape your Synapse. https://element-hq.github.io/synapse/latest/metrics-howto.html
+1. Have your Prometheus scrape your Synapse. https://matrix-org.github.io/synapse/latest/metrics-howto.html
 2. Import dashboard into Grafana. Download `synapse.json`. Import it to Grafana and select the correct Prometheus datasource. http://docs.grafana.org/reference/export_import/
 3. Set up required recording rules. [contrib/prometheus](../prometheus)
File diff suppressed because it is too large
@@ -1,43 +1,31 @@
-#
-# This file is licensed under the Affero General Public License (AGPL) version 3.
-#
-# Copyright 2014-2016 OpenMarket Ltd
-# Copyright (C) 2023 New Vector, Ltd
-#
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as
-# published by the Free Software Foundation, either version 3 of the
-# License, or (at your option) any later version.
-#
-# See the GNU Affero General Public License for more details:
-# <https://www.gnu.org/licenses/agpl-3.0.html>.
-#
-# Originally licensed under the Apache License, Version 2.0:
-# <http://www.apache.org/licenses/LICENSE-2.0>.
-#
-# [This file includes modifications made by New Vector Limited]
-#
-#
 
 import argparse
+import cgi
 import datetime
-import html
 import json
-import urllib.request
-from typing import List
 
 import pydot
+import urllib2
 
+# Copyright 2014-2016 OpenMarket Ltd
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 
 
-def make_name(pdu_id: str, origin: str) -> str:
-    return f"{pdu_id}@{origin}"
+def make_name(pdu_id, origin):
+    return "%s@%s" % (pdu_id, origin)
 
 
-def make_graph(pdus: List[dict], filename_prefix: str) -> None:
-    """
-    Generate a dot and SVG file for a graph of events in the room based on the
-    topological ordering by querying a homeserver.
-    """
+def make_graph(pdus, room, filename_prefix):
     pdu_map = {}
     node_map = {}
 
@@ -45,10 +33,6 @@ def make_graph(pdus: List[dict], filename_prefix: str) -> None:
     colors = {"red", "green", "blue", "yellow", "purple"}
 
     for pdu in pdus:
-        # TODO: The "origin" field has since been removed from events generated
-        # by Synapse. We should consider removing it here as well but since this
-        # is part of `contrib/`, it is left for the community to revise and ensure things
-        # still work correctly.
        origins.add(pdu.get("origin"))
 
     color_map = {color: color for color in colors if color in origins}
@@ -89,7 +73,7 @@ def make_graph(pdus: List[dict], filename_prefix: str) -> None:
             "name": name,
             "type": pdu.get("pdu_type"),
             "state_key": pdu.get("state_key"),
-            "content": html.escape(json.dumps(pdu.get("content")), quote=True),
+            "content": cgi.escape(json.dumps(pdu.get("content")), quote=True),
             "time": t,
             "depth": pdu.get("depth"),
         }
@@ -127,10 +111,10 @@ def make_graph(pdus: List[dict], filename_prefix: str) -> None:
     graph.write_svg("%s.svg" % filename_prefix, prog="dot")
 
 
-def get_pdus(host: str, room: str) -> List[dict]:
+def get_pdus(host, room):
     transaction = json.loads(
-        urllib.request.urlopen(
-            f"http://{host}/_matrix/federation/v1/context/{room}/"
+        urllib2.urlopen(
+            "http://%s/_matrix/federation/v1/context/%s/" % (host, room)
         ).read()
     )
 
@@ -157,4 +141,4 @@ if __name__ == "__main__":
 
     pdus = get_pdus(host, room)
 
-    make_graph(pdus, prefix)
+    make_graph(pdus, room, prefix)
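The graph.py hunks above swap the removed `cgi.escape` helper for `html.escape` when embedding event content in dot node labels. A minimal, self-contained sketch of the modern call (the sample payload is illustrative):

```python
import html
import json

# html.escape is the stdlib replacement for cgi.escape (removed in Python 3.8).
# quote=True additionally escapes double quotes, which matters inside the
# HTML-like dot labels built by make_graph.
content = {"body": '<b>hello</b> & "bye"'}
escaped = html.escape(json.dumps(content), quote=True)
print(escaped)
```

Note that `html.escape` escapes `&` first, so already-escaped entities are not double-handled out of order.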
@@ -1,51 +1,35 @@
-#
-# This file is licensed under the Affero General Public License (AGPL) version 3.
-#
 # Copyright 2014-2016 OpenMarket Ltd
-# Copyright (C) 2023 New Vector, Ltd
 #
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as
-# published by the Free Software Foundation, either version 3 of the
-# License, or (at your option) any later version.
-#
-# See the GNU Affero General Public License for more details:
-# <https://www.gnu.org/licenses/agpl-3.0.html>.
-#
-# Originally licensed under the Apache License, Version 2.0:
-# <http://www.apache.org/licenses/LICENSE-2.0>.
-#
-# [This file includes modifications made by New Vector Limited]
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
 #
+# http://www.apache.org/licenses/LICENSE-2.0
 #
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 
 import argparse
+import cgi
 import datetime
-import html
 import json
 import sqlite3
 
 import pydot
 
-from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
-from synapse.events import make_event_from_dict
+from synapse.events import FrozenEvent
 from synapse.util.frozenutils import unfreeze
 
 
-def make_graph(db_name: str, room_id: str, file_prefix: str, limit: int) -> None:
-    """
-    Generate a dot and SVG file for a graph of events in the room based on the
-    topological ordering by reading from a Synapse SQLite database.
-    """
+def make_graph(db_name, room_id, file_prefix, limit):
     conn = sqlite3.connect(db_name)
 
-    sql = "SELECT room_version FROM rooms WHERE room_id = ?"
-    c = conn.execute(sql, (room_id,))
-    room_version = KNOWN_ROOM_VERSIONS[c.fetchone()[0]]
-
     sql = (
-        "SELECT json, internal_metadata FROM event_json as j "
+        "SELECT json FROM event_json as j "
         "INNER JOIN events as e ON e.event_id = j.event_id "
         "WHERE j.room_id = ?"
     )
@@ -59,10 +43,7 @@ def make_graph(db_name: str, room_id: str, file_prefix: str, limit: int) -> None
 
     c = conn.execute(sql, args)
 
-    events = [
-        make_event_from_dict(json.loads(e[0]), room_version, json.loads(e[1]))
-        for e in c.fetchall()
-    ]
+    events = [FrozenEvent(json.loads(e[0])) for e in c.fetchall()]
 
     events.sort(key=lambda e: e.depth)
 
@@ -103,7 +84,7 @@ def make_graph(db_name: str, room_id: str, file_prefix: str, limit: int) -> None
             "name": event.event_id,
             "type": event.type,
             "state_key": event.get("state_key", None),
-            "content": html.escape(content, quote=True),
+            "content": cgi.escape(content, quote=True),
             "time": t,
             "depth": event.depth,
             "state_group": state_group,
@@ -115,11 +96,11 @@ def make_graph(db_name: str, room_id: str, file_prefix: str, limit: int) -> None
         graph.add_node(node)
 
     for event in events:
-        for prev_id in event.prev_event_ids():
+        for prev_id, _ in event.prev_events:
             try:
                 end_node = node_map[prev_id]
             except Exception:
-                end_node = pydot.Node(name=prev_id, label=f"<<b>{prev_id}</b>>")
+                end_node = pydot.Node(name=prev_id, label="<<b>%s</b>>" % (prev_id,))
 
                 node_map[prev_id] = end_node
                 graph.add_node(end_node)
@@ -131,7 +112,7 @@ def make_graph(db_name: str, room_id: str, file_prefix: str, limit: int) -> None
         if len(event_ids) <= 1:
             continue
 
-        cluster = pydot.Cluster(str(group), label=f"<State Group: {str(group)}>")
+        cluster = pydot.Cluster(str(group), label="<State Group: %s>" % (str(group),))
 
         for event_id in event_ids:
            cluster.add_node(node_map[event_id])
@@ -145,7 +126,7 @@ def make_graph(db_name: str, room_id: str, file_prefix: str, limit: int) -> None
 if __name__ == "__main__":
     parser = argparse.ArgumentParser(
         description="Generate a PDU graph for a given room by talking "
-        "to the given Synapse SQLite file to get the list of PDUs. \n"
+        "to the given homeserver to get the list of PDUs. \n"
        "Requires pydot."
     )
     parser.add_argument(
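graph2.py filters `event_json` rows with a parameterized query (`conn.execute(sql, args)`). A network-free sketch of the same `?` placeholder pattern against a throwaway in-memory database (the table shape and rows are illustrative, not Synapse's real schema):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event_json (room_id TEXT, json TEXT)")
conn.execute(
    "INSERT INTO event_json VALUES (?, ?)",
    ("!room:example.org", json.dumps({"depth": 1})),
)

# "?" placeholders let sqlite3 handle quoting/escaping, as in graph2.py's
# room_id filter, instead of interpolating values into the SQL string.
c = conn.execute(
    "SELECT json FROM event_json WHERE room_id = ?", ("!room:example.org",)
)
events = [json.loads(row[0]) for row in c.fetchall()]
print(events)
```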
@@ -1,53 +1,36 @@
-#
-# This file is licensed under the Affero General Public License (AGPL) version 3.
-#
-# Copyright 2016 OpenMarket Ltd
-# Copyright (C) 2023 New Vector, Ltd
-#
-# This program is free software: you can redistribute it and/or modify
-# it under the terms of the GNU Affero General Public License as
-# published by the Free Software Foundation, either version 3 of the
-# License, or (at your option) any later version.
-#
-# See the GNU Affero General Public License for more details:
-# <https://www.gnu.org/licenses/agpl-3.0.html>.
-#
-# Originally licensed under the Apache License, Version 2.0:
-# <http://www.apache.org/licenses/LICENSE-2.0>.
-#
-# [This file includes modifications made by New Vector Limited]
-#
-#
 
 import argparse
+import cgi
 import datetime
-import html
-import json
 
 import pydot
+import simplejson as json
 
-from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
-from synapse.events import make_event_from_dict
+from synapse.events import FrozenEvent
 from synapse.util.frozenutils import unfreeze
 
+# Copyright 2016 OpenMarket Ltd
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
 
-def make_graph(file_name: str, file_prefix: str, limit: int) -> None:
-    """
-    Generate a dot and SVG file for a graph of events in the room based on the
-    topological ordering by reading line-delimited JSON from a file.
-    """
+
+def make_graph(file_name, room_id, file_prefix, limit):
     print("Reading lines")
     with open(file_name) as f:
         lines = f.readlines()
 
     print("Read lines")
 
-    # Figure out the room version, assume the first line is the create event.
-    room_version = KNOWN_ROOM_VERSIONS[
-        json.loads(lines[0]).get("content", {}).get("room_version")
-    ]
-
-    events = [make_event_from_dict(json.loads(line), room_version) for line in lines]
+    events = [FrozenEvent(json.loads(line)) for line in lines]
 
     print("Loaded events.")
 
@@ -83,8 +66,8 @@ def make_graph(file_name: str, file_prefix: str, limit: int) -> None:
         content.append(
             "<b>%s</b>: %s,"
             % (
-                html.escape(key, quote=True).encode("ascii", "xmlcharrefreplace"),
-                html.escape(value, quote=True).encode("ascii", "xmlcharrefreplace"),
+                cgi.escape(key, quote=True).encode("ascii", "xmlcharrefreplace"),
+                cgi.escape(value, quote=True).encode("ascii", "xmlcharrefreplace"),
             )
         )
 
@@ -118,11 +101,11 @@ def make_graph(file_name: str, file_prefix: str, limit: int) -> None:
     print("Created Nodes")
 
     for event in events:
-        for prev_id in event.prev_event_ids():
+        for prev_id, _ in event.prev_events:
             try:
                 end_node = node_map[prev_id]
             except Exception:
-                end_node = pydot.Node(name=prev_id, label=f"<<b>{prev_id}</b>>")
+                end_node = pydot.Node(name=prev_id, label="<<b>%s</b>>" % (prev_id,))
 
                 node_map[prev_id] = end_node
                 graph.add_node(end_node)
@@ -156,7 +139,8 @@ if __name__ == "__main__":
     )
     parser.add_argument("-l", "--limit", help="Only retrieve the last N events.")
     parser.add_argument("event_file")
+    parser.add_argument("room")
 
     args = parser.parse_args()
 
-    make_graph(args.event_file, args.prefix, args.limit)
+    make_graph(args.event_file, args.room, args.prefix, args.limit)
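The graph3.py hunks register `--limit` and the `event_file` positional, and one side drops the `room` positional. A minimal argparse sketch of the resulting CLI; the `--prefix` flag is an assumption (it is read as `args.prefix` but its registration falls outside this chunk):

```python
import argparse

parser = argparse.ArgumentParser(
    description="Generate a PDU graph from a file of line-delimited JSON events. "
    "Requires pydot."
)
# --prefix is assumed here; only --limit and event_file appear in the hunks above.
parser.add_argument("-p", "--prefix", default="pdu_graph")
parser.add_argument("-l", "--limit", help="Only retrieve the last N events.")
parser.add_argument("event_file")

# argparse stores optional values as strings unless a type= is given.
args = parser.parse_args(["-l", "10", "events.json"])
print(args.event_file, args.limit, args.prefix)
```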
contrib/jitsimeetbridge/jitsimeetbridge.py (new file, 295 lines)
@@ -0,0 +1,295 @@
#!/usr/bin/env python

"""
This is an attempt at bridging matrix clients into a Jitis meet room via Matrix
video call. It uses hard-coded xml strings overg XMPP BOSH. It can display one
of the streams from the Jitsi bridge until the second lot of SDP comes down and
we set the remote SDP at which point the stream ends. Our video never gets to
the bridge.

Requires:
npm install jquery jsdom
"""
import json
import subprocess
import time

import gevent
import grequests
from BeautifulSoup import BeautifulSoup

ACCESS_TOKEN = ""

MATRIXBASE = "https://matrix.org/_matrix/client/api/v1/"
MYUSERNAME = "@davetest:matrix.org"

HTTPBIND = "https://meet.jit.si/http-bind"
# HTTPBIND = 'https://jitsi.vuc.me/http-bind'
# ROOMNAME = "matrix"
ROOMNAME = "pibble"

HOST = "guest.jit.si"
# HOST="jitsi.vuc.me"

TURNSERVER = "turn.guest.jit.si"
# TURNSERVER="turn.jitsi.vuc.me"

ROOMDOMAIN = "meet.jit.si"
# ROOMDOMAIN="conference.jitsi.vuc.me"


class TrivialMatrixClient:
    def __init__(self, access_token):
        self.token = None
        self.access_token = access_token

    def getEvent(self):
        while True:
            url = (
                MATRIXBASE
                + "events?access_token="
                + self.access_token
                + "&timeout=60000"
            )
            if self.token:
                url += "&from=" + self.token
            req = grequests.get(url)
            resps = grequests.map([req])
            obj = json.loads(resps[0].content)
            print("incoming from matrix", obj)
            if "end" not in obj:
                continue
            self.token = obj["end"]
            if len(obj["chunk"]):
                return obj["chunk"][0]

    def joinRoom(self, roomId):
        url = MATRIXBASE + "rooms/" + roomId + "/join?access_token=" + self.access_token
        print(url)
        headers = {"Content-Type": "application/json"}
        req = grequests.post(url, headers=headers, data="{}")
        resps = grequests.map([req])
        obj = json.loads(resps[0].content)
        print("response: ", obj)

    def sendEvent(self, roomId, evType, event):
        url = (
            MATRIXBASE
            + "rooms/"
            + roomId
            + "/send/"
            + evType
            + "?access_token="
            + self.access_token
        )
        print(url)
        print(json.dumps(event))
        headers = {"Content-Type": "application/json"}
        req = grequests.post(url, headers=headers, data=json.dumps(event))
        resps = grequests.map([req])
        obj = json.loads(resps[0].content)
        print("response: ", obj)


xmppClients = {}


def matrixLoop():
    while True:
        ev = matrixCli.getEvent()
        print(ev)
        if ev["type"] == "m.room.member":
            print("membership event")
            if ev["membership"] == "invite" and ev["state_key"] == MYUSERNAME:
                roomId = ev["room_id"]
                print("joining room %s" % (roomId))
                matrixCli.joinRoom(roomId)
        elif ev["type"] == "m.room.message":
            if ev["room_id"] in xmppClients:
                print("already have a bridge for that user, ignoring")
                continue
            print("got message, connecting")
            xmppClients[ev["room_id"]] = TrivialXmppClient(ev["room_id"], ev["user_id"])
            gevent.spawn(xmppClients[ev["room_id"]].xmppLoop)
        elif ev["type"] == "m.call.invite":
            print("Incoming call")
            # sdp = ev['content']['offer']['sdp']
            # print "sdp: %s" % (sdp)
            # xmppClients[ev['room_id']] = TrivialXmppClient(ev['room_id'], ev['user_id'])
            # gevent.spawn(xmppClients[ev['room_id']].xmppLoop)
        elif ev["type"] == "m.call.answer":
            print("Call answered")
            sdp = ev["content"]["answer"]["sdp"]
            if ev["room_id"] not in xmppClients:
                print("We didn't have a call for that room")
                continue
            # should probably check call ID too
            xmppCli = xmppClients[ev["room_id"]]
            xmppCli.sendAnswer(sdp)
        elif ev["type"] == "m.call.hangup":
            if ev["room_id"] in xmppClients:
                xmppClients[ev["room_id"]].stop()
                del xmppClients[ev["room_id"]]


class TrivialXmppClient:
    def __init__(self, matrixRoom, userId):
        self.rid = 0
        self.matrixRoom = matrixRoom
        self.userId = userId
        self.running = True

    def stop(self):
        self.running = False

    def nextRid(self):
        self.rid += 1
        return "%d" % (self.rid)

    def sendIq(self, xml):
        fullXml = (
            "<body rid='%s' xmlns='http://jabber.org/protocol/httpbind' sid='%s'>%s</body>"
            % (self.nextRid(), self.sid, xml)
        )
        # print "\t>>>%s" % (fullXml)
        return self.xmppPoke(fullXml)

    def xmppPoke(self, xml):
        headers = {"Content-Type": "application/xml"}
        req = grequests.post(HTTPBIND, verify=False, headers=headers, data=xml)
        resps = grequests.map([req])
        obj = BeautifulSoup(resps[0].content)
        return obj

    def sendAnswer(self, answer):
        print("sdp from matrix client", answer)
        p = subprocess.Popen(
            ["node", "unjingle/unjingle.js", "--sdp"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
        )
        jingle, out_err = p.communicate(answer)
        jingle = jingle % {
            "tojid": self.callfrom,
            "action": "session-accept",
            "initiator": self.callfrom,
            "responder": self.jid,
            "sid": self.callsid,
        }
        print("answer jingle from sdp", jingle)
        res = self.sendIq(jingle)
        print("reply from answer: ", res)

        self.ssrcs = {}
        jingleSoup = BeautifulSoup(jingle)
        for cont in jingleSoup.iq.jingle.findAll("content"):
            if cont.description:
                self.ssrcs[cont["name"]] = cont.description["ssrc"]
        print("my ssrcs:", self.ssrcs)

        gevent.joinall([gevent.spawn(self.advertiseSsrcs)])

    def advertiseSsrcs(self):
        time.sleep(7)
        print("SSRC spammer started")
        while self.running:
            ssrcMsg = "<presence to='%(tojid)s' xmlns='jabber:client'><x xmlns='http://jabber.org/protocol/muc'/><c xmlns='http://jabber.org/protocol/caps' hash='sha-1' node='http://jitsi.org/jitsimeet' ver='0WkSdhFnAUxrz4ImQQLdB80GFlE='/><nick xmlns='http://jabber.org/protocol/nick'>%(nick)s</nick><stats xmlns='http://jitsi.org/jitmeet/stats'><stat name='bitrate_download' value='175'/><stat name='bitrate_upload' value='176'/><stat name='packetLoss_total' value='0'/><stat name='packetLoss_download' value='0'/><stat name='packetLoss_upload' value='0'/></stats><media xmlns='http://estos.de/ns/mjs'><source type='audio' ssrc='%(assrc)s' direction='sendre'/><source type='video' ssrc='%(vssrc)s' direction='sendre'/></media></presence>" % {
                "tojid": "%s@%s/%s" % (ROOMNAME, ROOMDOMAIN, self.shortJid),
                "nick": self.userId,
                "assrc": self.ssrcs["audio"],
                "vssrc": self.ssrcs["video"],
            }
            res = self.sendIq(ssrcMsg)
            print("reply from ssrc announce: ", res)
            time.sleep(10)

    def xmppLoop(self):
        self.matrixCallId = time.time()
        res = self.xmppPoke(
            "<body rid='%s' xmlns='http://jabber.org/protocol/httpbind' to='%s' xml:lang='en' wait='60' hold='1' content='text/xml; charset=utf-8' ver='1.6' xmpp:version='1.0' xmlns:xmpp='urn:xmpp:xbosh'/>"
            % (self.nextRid(), HOST)
        )

        print(res)
        self.sid = res.body["sid"]
        print("sid %s" % (self.sid))

        res = self.sendIq(
            "<auth xmlns='urn:ietf:params:xml:ns:xmpp-sasl' mechanism='ANONYMOUS'/>"
        )

        res = self.xmppPoke(
            "<body rid='%s' xmlns='http://jabber.org/protocol/httpbind' sid='%s' to='%s' xml:lang='en' xmpp:restart='true' xmlns:xmpp='urn:xmpp:xbosh'/>"
            % (self.nextRid(), self.sid, HOST)
        )

        res = self.sendIq(
            "<iq type='set' id='_bind_auth_2' xmlns='jabber:client'><bind xmlns='urn:ietf:params:xml:ns:xmpp-bind'/></iq>"
        )
        print(res)

        self.jid = res.body.iq.bind.jid.string
        print("jid: %s" % (self.jid))
        self.shortJid = self.jid.split("-")[0]

        res = self.sendIq(
            "<iq type='set' id='_session_auth_2' xmlns='jabber:client'><session xmlns='urn:ietf:params:xml:ns:xmpp-session'/></iq>"
        )

        # randomthing = res.body.iq['to']
        # whatsitpart = randomthing.split('-')[0]

        # print "other random bind thing: %s" % (randomthing)

        # advertise preence to the jitsi room, with our nick
        res = self.sendIq(
            "<iq type='get' to='%s' xmlns='jabber:client' id='1:sendIQ'><services xmlns='urn:xmpp:extdisco:1'><service host='%s'/></services></iq><presence to='%s@%s/d98f6c40' xmlns='jabber:client'><x xmlns='http://jabber.org/protocol/muc'/><c xmlns='http://jabber.org/protocol/caps' hash='sha-1' node='http://jitsi.org/jitsimeet' ver='0WkSdhFnAUxrz4ImQQLdB80GFlE='/><nick xmlns='http://jabber.org/protocol/nick'>%s</nick></presence>"
            % (HOST, TURNSERVER, ROOMNAME, ROOMDOMAIN, self.userId)
        )
        self.muc = {"users": []}
        for p in res.body.findAll("presence"):
            u = {}
            u["shortJid"] = p["from"].split("/")[1]
            if p.c and p.c.nick:
                u["nick"] = p.c.nick.string
            self.muc["users"].append(u)
        print("muc: ", self.muc)

        # wait for stuff
        while True:
            print("waiting...")
            res = self.sendIq("")
            print("got from stream: ", res)
            if res.body.iq:
                jingles = res.body.iq.findAll("jingle")
                if len(jingles):
                    self.callfrom = res.body.iq["from"]
                    self.handleInvite(jingles[0])
            elif "type" in res.body and res.body["type"] == "terminate":
                self.running = False
                del xmppClients[self.matrixRoom]
                return

    def handleInvite(self, jingle):
        self.initiator = jingle["initiator"]
        self.callsid = jingle["sid"]
        p = subprocess.Popen(
            ["node", "unjingle/unjingle.js", "--jingle"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
        )
        print("raw jingle invite", str(jingle))
        sdp, out_err = p.communicate(str(jingle))
        print("transformed remote offer sdp", sdp)
        inviteEvent = {
            "offer": {"type": "offer", "sdp": sdp},
            "call_id": self.matrixCallId,
            "version": 0,
            "lifetime": 30000,
        }
        matrixCli.sendEvent(self.matrixRoom, "m.call.invite", inviteEvent)


matrixCli = TrivialMatrixClient(ACCESS_TOKEN)  # Undefined name

gevent.joinall([gevent.spawn(matrixLoop)])
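The `getEvent` long-poll in the bridge above threads an `end` pagination token back into the next `/events` request. A network-free sketch of that token-threading pattern, with canned responses standing in for the homeserver:

```python
# Canned responses stand in for the /events API; no real network calls here.
fake_responses = [
    {"end": "t1", "chunk": []},  # empty poll: remember the new token, keep going
    {"end": "t2", "chunk": [{"type": "m.room.message"}]},
]

token = None
events = []
for obj in fake_responses:
    # a real client would append "&from=" + token to the request URL here
    if "end" not in obj:
        continue
    token = obj["end"]
    if obj["chunk"]:
        events.append(obj["chunk"][0])

print(token, events)
```

Carrying the token forward is what makes each poll resume where the last one left off instead of replaying old events.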
contrib/jitsimeetbridge/syweb-jitsi-conference.patch (new file, 188 lines)
@@ -0,0 +1,188 @@
diff --git a/syweb/webclient/app/components/matrix/matrix-call.js b/syweb/webclient/app/components/matrix/matrix-call.js
index 9fbfff0..dc68077 100644
--- a/syweb/webclient/app/components/matrix/matrix-call.js
+++ b/syweb/webclient/app/components/matrix/matrix-call.js
@@ -16,6 +16,45 @@ limitations under the License.
 
 'use strict';
 
+
+function sendKeyframe(pc) {
+    console.log('sendkeyframe', pc.iceConnectionState);
+    if (pc.iceConnectionState !== 'connected') return; // safe...
+    pc.setRemoteDescription(
+        pc.remoteDescription,
+        function () {
+            pc.createAnswer(
+                function (modifiedAnswer) {
+                    pc.setLocalDescription(
+                        modifiedAnswer,
+                        function () {
+                            // noop
+                        },
+                        function (error) {
+                            console.log('triggerKeyframe setLocalDescription failed', error);
+                            messageHandler.showError();
+                        }
+                    );
+                },
+                function (error) {
+                    console.log('triggerKeyframe createAnswer failed', error);
+                    messageHandler.showError();
+                }
+            );
+        },
+        function (error) {
+            console.log('triggerKeyframe setRemoteDescription failed', error);
+            messageHandler.showError();
+        }
+    );
+}
+
+
+
+
+
+
+
 var forAllVideoTracksOnStream = function(s, f) {
     var tracks = s.getVideoTracks();
     for (var i = 0; i < tracks.length; i++) {
@@ -83,7 +122,7 @@ angular.module('MatrixCall', [])
     }
 
     // FIXME: we should prevent any calls from being placed or accepted before this has finished
-    MatrixCall.getTurnServer();
+    //MatrixCall.getTurnServer();
 
     MatrixCall.CALL_TIMEOUT = 60000;
     MatrixCall.FALLBACK_STUN_SERVER = 'stun:stun.l.google.com:19302';
@@ -132,6 +171,22 @@ angular.module('MatrixCall', [])
         pc.onsignalingstatechange = function() { self.onSignallingStateChanged(); };
         pc.onicecandidate = function(c) { self.gotLocalIceCandidate(c); };
         pc.onaddstream = function(s) { self.onAddStream(s); };
+
+        var datachan = pc.createDataChannel('RTCDataChannel', {
+            reliable: false
+        });
+        console.log("data chan: "+datachan);
+        datachan.onopen = function() {
+            console.log("data channel open");
+        };
+        datachan.onmessage = function() {
+            console.log("data channel message");
+        };
+        pc.ondatachannel = function(event) {
+            console.log("have data channel");
+            event.channel.binaryType = 'blob';
+        };
+
         return pc;
     }
 
@@ -200,6 +255,12 @@ angular.module('MatrixCall', [])
         }, this.msg.lifetime - event.age);
     };
 
+    MatrixCall.prototype.receivedInvite = function(event) {
+        console.log("Got second invite for call "+this.call_id);
+        this.peerConn.setRemoteDescription(new RTCSessionDescription(this.msg.offer), this.onSetRemoteDescriptionSuccess, this.onSetRemoteDescriptionError);
+    };
+
+
     // perverse as it may seem, sometimes we want to instantiate a call with a hangup message
     // (because when getting the state of the room on load, events come in reverse order and
     // we want to remember that a call has been hung up)
@@ -349,7 +410,7 @@ angular.module('MatrixCall', [])
             'mandatory': {
                 'OfferToReceiveAudio': true,
                 'OfferToReceiveVideo': this.type == 'video'
-            },
+            }
         };
         this.peerConn.createAnswer(function(d) { self.createdAnswer(d); }, function(e) {}, constraints);
         // This can't be in an apply() because it's called by a predecessor call under glare conditions :(
@@ -359,8 +420,20 @@ angular.module('MatrixCall', [])
     MatrixCall.prototype.gotLocalIceCandidate = function(event) {
         if (event.candidate) {
             console.log("Got local ICE "+event.candidate.sdpMid+" candidate: "+event.candidate.candidate);
-            this.sendCandidate(event.candidate);
-        }
+            //this.sendCandidate(event.candidate);
+        } else {
+            console.log("have all candidates, sending answer");
+            var content = {
+                version: 0,
+                call_id: this.call_id,
+                answer: this.peerConn.localDescription
+            };
+            this.sendEventWithRetry('m.call.answer', content);
+            var self = this;
+            $rootScope.$apply(function() {
+                self.state = 'connecting';
+            });
+        }
     }
 
     MatrixCall.prototype.gotRemoteIceCandidate = function(cand) {
@@ -418,15 +491,6 @@ angular.module('MatrixCall', [])
         console.log("Created answer: "+description);
         var self = this;
         this.peerConn.setLocalDescription(description, function() {
-            var content = {
-                version: 0,
-                call_id: self.call_id,
-                answer: self.peerConn.localDescription
-            };
-            self.sendEventWithRetry('m.call.answer', content);
-            $rootScope.$apply(function() {
-                self.state = 'connecting';
-            });
         }, function() { console.log("Error setting local description!"); } );
     };
 
@@ -448,6 +512,9 @@ angular.module('MatrixCall', [])
         $rootScope.$apply(function() {
             self.state = 'connected';
             self.didConnect = true;
|
+ /*$timeout(function() {
|
||||||
|
+ sendKeyframe(self.peerConn);
|
||||||
|
+ }, 1000);*/
|
||||||
|
});
|
||||||
|
} else if (this.peerConn.iceConnectionState == 'failed') {
|
||||||
|
this.hangup('ice_failed');
|
||||||
|
@@ -518,6 +585,7 @@ angular.module('MatrixCall', [])
|
||||||
|
|
||||||
|
MatrixCall.prototype.onRemoteStreamEnded = function(event) {
|
||||||
|
console.log("Remote stream ended");
|
||||||
|
+ return;
|
||||||
|
var self = this;
|
||||||
|
$rootScope.$apply(function() {
|
||||||
|
self.state = 'ended';
|
||||||
|
diff --git a/syweb/webclient/app/components/matrix/matrix-phone-service.js b/syweb/webclient/app/components/matrix/matrix-phone-service.js
|
||||||
|
index 55dbbf5..272fa27 100644
|
||||||
|
--- a/syweb/webclient/app/components/matrix/matrix-phone-service.js
|
||||||
|
+++ b/syweb/webclient/app/components/matrix/matrix-phone-service.js
|
||||||
|
@@ -48,6 +48,13 @@ angular.module('matrixPhoneService', [])
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
+ // do we already have an entry for this call ID?
|
||||||
|
+ var existingEntry = matrixPhoneService.allCalls[msg.call_id];
|
||||||
|
+ if (existingEntry) {
|
||||||
|
+ existingEntry.receivedInvite(msg);
|
||||||
|
+ return;
|
||||||
|
+ }
|
||||||
|
+
|
||||||
|
var call = undefined;
|
||||||
|
if (!isLive) {
|
||||||
|
// if this event wasn't live then this call may already be over
|
||||||
|
@@ -108,7 +115,7 @@ angular.module('matrixPhoneService', [])
|
||||||
|
call.hangup();
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
- $rootScope.$broadcast(matrixPhoneService.INCOMING_CALL_EVENT, call);
|
||||||
|
+ $rootScope.$broadcast(matrixPhoneService.INCOMING_CALL_EVENT, call);
|
||||||
|
}
|
||||||
|
} else if (event.type == 'm.call.answer') {
|
||||||
|
var call = matrixPhoneService.allCalls[msg.call_id];
|
contrib/jitsimeetbridge/unjingle/strophe.jingle.sdp.js (new file, 712 additions)
@@ -0,0 +1,712 @@
/* jshint -W117 */
// SDP STUFF
function SDP(sdp) {
    this.media = sdp.split('\r\nm=');
    for (var i = 1; i < this.media.length; i++) {
        this.media[i] = 'm=' + this.media[i];
        if (i != this.media.length - 1) {
            this.media[i] += '\r\n';
        }
    }
    this.session = this.media.shift() + '\r\n';
    this.raw = this.session + this.media.join('');
}

exports.SDP = SDP;

var jsdom = require("jsdom");
var window = jsdom.jsdom().parentWindow;
var $ = require('jquery')(window);

var SDPUtil = require('./strophe.jingle.sdp.util.js').SDPUtil;

/**
 * Returns map of MediaChannel mapped per channel idx.
 */
SDP.prototype.getMediaSsrcMap = function() {
    var self = this;
    var media_ssrcs = {};
    for (channelNum = 0; channelNum < self.media.length; channelNum++) {
        modified = true;
        tmp = SDPUtil.find_lines(self.media[channelNum], 'a=ssrc:');
        var type = SDPUtil.parse_mid(SDPUtil.find_line(self.media[channelNum], 'a=mid:'));
        var channel = new MediaChannel(channelNum, type);
        media_ssrcs[channelNum] = channel;
        tmp.forEach(function (line) {
            var linessrc = line.substring(7).split(' ')[0];
            // allocate new ChannelSsrc
            if(!channel.ssrcs[linessrc]) {
                channel.ssrcs[linessrc] = new ChannelSsrc(linessrc, type);
            }
            channel.ssrcs[linessrc].lines.push(line);
        });
        tmp = SDPUtil.find_lines(self.media[channelNum], 'a=ssrc-group:');
        tmp.forEach(function(line){
            var semantics = line.substr(0, idx).substr(13);
            var ssrcs = line.substr(14 + semantics.length).split(' ');
            if (ssrcs.length != 0) {
                var ssrcGroup = new ChannelSsrcGroup(semantics, ssrcs);
                channel.ssrcGroups.push(ssrcGroup);
            }
        });
    }
    return media_ssrcs;
};
/**
 * Returns <tt>true</tt> if this SDP contains given SSRC.
 * @param ssrc the ssrc to check.
 * @returns {boolean} <tt>true</tt> if this SDP contains given SSRC.
 */
SDP.prototype.containsSSRC = function(ssrc) {
    var channels = this.getMediaSsrcMap();
    var contains = false;
    Object.keys(channels).forEach(function(chNumber){
        var channel = channels[chNumber];
        //console.log("Check", channel, ssrc);
        if(Object.keys(channel.ssrcs).indexOf(ssrc) != -1){
            contains = true;
        }
    });
    return contains;
};

/**
 * Returns map of MediaChannel that contains only media not contained in <tt>otherSdp</tt>. Mapped by channel idx.
 * @param otherSdp the other SDP to check ssrc with.
 */
SDP.prototype.getNewMedia = function(otherSdp) {

    // this could be useful in Array.prototype.
    function arrayEquals(array) {
        // if the other array is a falsy value, return
        if (!array)
            return false;

        // compare lengths - can save a lot of time
        if (this.length != array.length)
            return false;

        for (var i = 0, l=this.length; i < l; i++) {
            // Check if we have nested arrays
            if (this[i] instanceof Array && array[i] instanceof Array) {
                // recurse into the nested arrays
                if (!this[i].equals(array[i]))
                    return false;
            }
            else if (this[i] != array[i]) {
                // Warning - two different object instances will never be equal: {x:20} != {x:20}
                return false;
            }
        }
        return true;
    }

    var myMedia = this.getMediaSsrcMap();
    var othersMedia = otherSdp.getMediaSsrcMap();
    var newMedia = {};
    Object.keys(othersMedia).forEach(function(channelNum) {
        var myChannel = myMedia[channelNum];
        var othersChannel = othersMedia[channelNum];
        if(!myChannel && othersChannel) {
            // Add whole channel
            newMedia[channelNum] = othersChannel;
            return;
        }
        // Look for new ssrcs accross the channel
        Object.keys(othersChannel.ssrcs).forEach(function(ssrc) {
            if(Object.keys(myChannel.ssrcs).indexOf(ssrc) === -1) {
                // Allocate channel if we've found ssrc that doesn't exist in our channel
                if(!newMedia[channelNum]){
                    newMedia[channelNum] = new MediaChannel(othersChannel.chNumber, othersChannel.mediaType);
                }
                newMedia[channelNum].ssrcs[ssrc] = othersChannel.ssrcs[ssrc];
            }
        });

        // Look for new ssrc groups across the channels
        othersChannel.ssrcGroups.forEach(function(otherSsrcGroup){

            // try to match the other ssrc-group with an ssrc-group of ours
            var matched = false;
            for (var i = 0; i < myChannel.ssrcGroups.length; i++) {
                var mySsrcGroup = myChannel.ssrcGroups[i];
                if (otherSsrcGroup.semantics == mySsrcGroup.semantics
                    && arrayEquals.apply(otherSsrcGroup.ssrcs, [mySsrcGroup.ssrcs])) {

                    matched = true;
                    break;
                }
            }

            if (!matched) {
                // Allocate channel if we've found an ssrc-group that doesn't
                // exist in our channel

                if(!newMedia[channelNum]){
                    newMedia[channelNum] = new MediaChannel(othersChannel.chNumber, othersChannel.mediaType);
                }
                newMedia[channelNum].ssrcGroups.push(otherSsrcGroup);
            }
        });
    });
    return newMedia;
};

// remove iSAC and CN from SDP
SDP.prototype.mangle = function () {
    var i, j, mline, lines, rtpmap, newdesc;
    for (i = 0; i < this.media.length; i++) {
        lines = this.media[i].split('\r\n');
        lines.pop(); // remove empty last element
        mline = SDPUtil.parse_mline(lines.shift());
        if (mline.media != 'audio')
            continue;
        newdesc = '';
        mline.fmt.length = 0;
        for (j = 0; j < lines.length; j++) {
            if (lines[j].substr(0, 9) == 'a=rtpmap:') {
                rtpmap = SDPUtil.parse_rtpmap(lines[j]);
                if (rtpmap.name == 'CN' || rtpmap.name == 'ISAC')
                    continue;
                mline.fmt.push(rtpmap.id);
                newdesc += lines[j] + '\r\n';
            } else {
                newdesc += lines[j] + '\r\n';
            }
        }
        this.media[i] = SDPUtil.build_mline(mline) + '\r\n';
        this.media[i] += newdesc;
    }
    this.raw = this.session + this.media.join('');
};

// remove lines matching prefix from session section
SDP.prototype.removeSessionLines = function(prefix) {
    var self = this;
    var lines = SDPUtil.find_lines(this.session, prefix);
    lines.forEach(function(line) {
        self.session = self.session.replace(line + '\r\n', '');
    });
    this.raw = this.session + this.media.join('');
    return lines;
}
// remove lines matching prefix from a media section specified by mediaindex
// TODO: non-numeric mediaindex could match mid
SDP.prototype.removeMediaLines = function(mediaindex, prefix) {
    var self = this;
    var lines = SDPUtil.find_lines(this.media[mediaindex], prefix);
    lines.forEach(function(line) {
        self.media[mediaindex] = self.media[mediaindex].replace(line + '\r\n', '');
    });
    this.raw = this.session + this.media.join('');
    return lines;
}

// add content's to a jingle element
SDP.prototype.toJingle = function (elem, thecreator) {
    var i, j, k, mline, ssrc, rtpmap, tmp, line, lines;
    var self = this;
    // new bundle plan
    if (SDPUtil.find_line(this.session, 'a=group:')) {
        lines = SDPUtil.find_lines(this.session, 'a=group:');
        for (i = 0; i < lines.length; i++) {
            tmp = lines[i].split(' ');
            var semantics = tmp.shift().substr(8);
            elem.c('group', {xmlns: 'urn:xmpp:jingle:apps:grouping:0', semantics:semantics});
            for (j = 0; j < tmp.length; j++) {
                elem.c('content', {name: tmp[j]}).up();
            }
            elem.up();
        }
    }
    // old bundle plan, to be removed
    var bundle = [];
    if (SDPUtil.find_line(this.session, 'a=group:BUNDLE')) {
        bundle = SDPUtil.find_line(this.session, 'a=group:BUNDLE ').split(' ');
        bundle.shift();
    }
    for (i = 0; i < this.media.length; i++) {
        mline = SDPUtil.parse_mline(this.media[i].split('\r\n')[0]);
        if (!(mline.media === 'audio' ||
              mline.media === 'video' ||
              mline.media === 'application'))
        {
            continue;
        }
        if (SDPUtil.find_line(this.media[i], 'a=ssrc:')) {
            ssrc = SDPUtil.find_line(this.media[i], 'a=ssrc:').substring(7).split(' ')[0]; // take the first
        } else {
            ssrc = false;
        }

        elem.c('content', {creator: thecreator, name: mline.media});
        if (SDPUtil.find_line(this.media[i], 'a=mid:')) {
            // prefer identifier from a=mid if present
            var mid = SDPUtil.parse_mid(SDPUtil.find_line(this.media[i], 'a=mid:'));
            elem.attrs({ name: mid });

            // old BUNDLE plan, to be removed
            if (bundle.indexOf(mid) !== -1) {
                elem.c('bundle', {xmlns: 'http://estos.de/ns/bundle'}).up();
                bundle.splice(bundle.indexOf(mid), 1);
            }
        }

        if (SDPUtil.find_line(this.media[i], 'a=rtpmap:').length)
        {
            elem.c('description',
                   {xmlns: 'urn:xmpp:jingle:apps:rtp:1',
                    media: mline.media });
            if (ssrc) {
                elem.attrs({ssrc: ssrc});
            }
            for (j = 0; j < mline.fmt.length; j++) {
                rtpmap = SDPUtil.find_line(this.media[i], 'a=rtpmap:' + mline.fmt[j]);
                elem.c('payload-type', SDPUtil.parse_rtpmap(rtpmap));
                // put any 'a=fmtp:' + mline.fmt[j] lines into <param name=foo value=bar/>
                if (SDPUtil.find_line(this.media[i], 'a=fmtp:' + mline.fmt[j])) {
                    tmp = SDPUtil.parse_fmtp(SDPUtil.find_line(this.media[i], 'a=fmtp:' + mline.fmt[j]));
                    for (k = 0; k < tmp.length; k++) {
                        elem.c('parameter', tmp[k]).up();
                    }
                }
                this.RtcpFbToJingle(i, elem, mline.fmt[j]); // XEP-0293 -- map a=rtcp-fb

                elem.up();
            }
            if (SDPUtil.find_line(this.media[i], 'a=crypto:', this.session)) {
                elem.c('encryption', {required: 1});
                var crypto = SDPUtil.find_lines(this.media[i], 'a=crypto:', this.session);
                crypto.forEach(function(line) {
                    elem.c('crypto', SDPUtil.parse_crypto(line)).up();
                });
                elem.up(); // end of encryption
            }

            if (ssrc) {
                // new style mapping
                elem.c('source', { ssrc: ssrc, xmlns: 'urn:xmpp:jingle:apps:rtp:ssma:0' });
                // FIXME: group by ssrc and support multiple different ssrcs
                var ssrclines = SDPUtil.find_lines(this.media[i], 'a=ssrc:');
                ssrclines.forEach(function(line) {
                    idx = line.indexOf(' ');
                    var linessrc = line.substr(0, idx).substr(7);
                    if (linessrc != ssrc) {
                        elem.up();
                        ssrc = linessrc;
                        elem.c('source', { ssrc: ssrc, xmlns: 'urn:xmpp:jingle:apps:rtp:ssma:0' });
                    }
                    var kv = line.substr(idx + 1);
                    elem.c('parameter');
                    if (kv.indexOf(':') == -1) {
                        elem.attrs({ name: kv });
                    } else {
                        elem.attrs({ name: kv.split(':', 2)[0] });
                        elem.attrs({ value: kv.split(':', 2)[1] });
                    }
                    elem.up();
                });
                elem.up();

                // old proprietary mapping, to be removed at some point
                tmp = SDPUtil.parse_ssrc(this.media[i]);
                tmp.xmlns = 'http://estos.de/ns/ssrc';
                tmp.ssrc = ssrc;
                elem.c('ssrc', tmp).up(); // ssrc is part of description

                // XEP-0339 handle ssrc-group attributes
                var ssrc_group_lines = SDPUtil.find_lines(this.media[i], 'a=ssrc-group:');
                ssrc_group_lines.forEach(function(line) {
                    idx = line.indexOf(' ');
                    var semantics = line.substr(0, idx).substr(13);
                    var ssrcs = line.substr(14 + semantics.length).split(' ');
                    if (ssrcs.length != 0) {
                        elem.c('ssrc-group', { semantics: semantics, xmlns: 'urn:xmpp:jingle:apps:rtp:ssma:0' });
                        ssrcs.forEach(function(ssrc) {
                            elem.c('source', { ssrc: ssrc })
                                .up();
                        });
                        elem.up();
                    }
                });
            }

            if (SDPUtil.find_line(this.media[i], 'a=rtcp-mux')) {
                elem.c('rtcp-mux').up();
            }

            // XEP-0293 -- map a=rtcp-fb:*
            this.RtcpFbToJingle(i, elem, '*');

            // XEP-0294
            if (SDPUtil.find_line(this.media[i], 'a=extmap:')) {
                lines = SDPUtil.find_lines(this.media[i], 'a=extmap:');
                for (j = 0; j < lines.length; j++) {
                    tmp = SDPUtil.parse_extmap(lines[j]);
                    elem.c('rtp-hdrext', { xmlns: 'urn:xmpp:jingle:apps:rtp:rtp-hdrext:0',
                                           uri: tmp.uri,
                                           id: tmp.value });
                    if (tmp.hasOwnProperty('direction')) {
                        switch (tmp.direction) {
                        case 'sendonly':
                            elem.attrs({senders: 'responder'});
                            break;
                        case 'recvonly':
                            elem.attrs({senders: 'initiator'});
                            break;
                        case 'sendrecv':
                            elem.attrs({senders: 'both'});
                            break;
                        case 'inactive':
                            elem.attrs({senders: 'none'});
                            break;
                        }
                    }
                    // TODO: handle params
                    elem.up();
                }
            }
            elem.up(); // end of description
        }

        // map ice-ufrag/pwd, dtls fingerprint, candidates
        this.TransportToJingle(i, elem);

        if (SDPUtil.find_line(this.media[i], 'a=sendrecv', this.session)) {
            elem.attrs({senders: 'both'});
        } else if (SDPUtil.find_line(this.media[i], 'a=sendonly', this.session)) {
            elem.attrs({senders: 'initiator'});
        } else if (SDPUtil.find_line(this.media[i], 'a=recvonly', this.session)) {
            elem.attrs({senders: 'responder'});
        } else if (SDPUtil.find_line(this.media[i], 'a=inactive', this.session)) {
            elem.attrs({senders: 'none'});
        }
        if (mline.port == '0') {
            // estos hack to reject an m-line
            elem.attrs({senders: 'rejected'});
        }
        elem.up(); // end of content
    }
    elem.up();
    return elem;
};

SDP.prototype.TransportToJingle = function (mediaindex, elem) {
    var i = mediaindex;
    var tmp;
    var self = this;
    elem.c('transport');

    // XEP-0343 DTLS/SCTP
    if (SDPUtil.find_line(this.media[mediaindex], 'a=sctpmap:').length)
    {
        var sctpmap = SDPUtil.find_line(
            this.media[i], 'a=sctpmap:', self.session);
        if (sctpmap)
        {
            var sctpAttrs = SDPUtil.parse_sctpmap(sctpmap);
            elem.c('sctpmap',
                   {
                       xmlns: 'urn:xmpp:jingle:transports:dtls-sctp:1',
                       number: sctpAttrs[0], /* SCTP port */
                       protocol: sctpAttrs[1], /* protocol */
                   });
            // Optional stream count attribute
            if (sctpAttrs.length > 2)
                elem.attrs({ streams: sctpAttrs[2]});
            elem.up();
        }
    }
    // XEP-0320
    var fingerprints = SDPUtil.find_lines(this.media[mediaindex], 'a=fingerprint:', this.session);
    fingerprints.forEach(function(line) {
        tmp = SDPUtil.parse_fingerprint(line);
        tmp.xmlns = 'urn:xmpp:jingle:apps:dtls:0';
        elem.c('fingerprint').t(tmp.fingerprint);
        delete tmp.fingerprint;
        line = SDPUtil.find_line(self.media[mediaindex], 'a=setup:', self.session);
        if (line) {
            tmp.setup = line.substr(8);
        }
        elem.attrs(tmp);
        elem.up(); // end of fingerprint
    });
    tmp = SDPUtil.iceparams(this.media[mediaindex], this.session);
    if (tmp) {
        tmp.xmlns = 'urn:xmpp:jingle:transports:ice-udp:1';
        elem.attrs(tmp);
        // XEP-0176
        if (SDPUtil.find_line(this.media[mediaindex], 'a=candidate:', this.session)) { // add any a=candidate lines
            var lines = SDPUtil.find_lines(this.media[mediaindex], 'a=candidate:', this.session);
            lines.forEach(function (line) {
                elem.c('candidate', SDPUtil.candidateToJingle(line)).up();
            });
        }
    }
    elem.up(); // end of transport
}

SDP.prototype.RtcpFbToJingle = function (mediaindex, elem, payloadtype) { // XEP-0293
    var lines = SDPUtil.find_lines(this.media[mediaindex], 'a=rtcp-fb:' + payloadtype);
    lines.forEach(function (line) {
        var tmp = SDPUtil.parse_rtcpfb(line);
        if (tmp.type == 'trr-int') {
            elem.c('rtcp-fb-trr-int', {xmlns: 'urn:xmpp:jingle:apps:rtp:rtcp-fb:0', value: tmp.params[0]});
            elem.up();
        } else {
            elem.c('rtcp-fb', {xmlns: 'urn:xmpp:jingle:apps:rtp:rtcp-fb:0', type: tmp.type});
            if (tmp.params.length > 0) {
                elem.attrs({'subtype': tmp.params[0]});
            }
            elem.up();
        }
    });
};

SDP.prototype.RtcpFbFromJingle = function (elem, payloadtype) { // XEP-0293
    var media = '';
    var tmp = elem.find('>rtcp-fb-trr-int[xmlns="urn:xmpp:jingle:apps:rtp:rtcp-fb:0"]');
    if (tmp.length) {
        media += 'a=rtcp-fb:' + '*' + ' ' + 'trr-int' + ' ';
        if (tmp.attr('value')) {
            media += tmp.attr('value');
        } else {
            media += '0';
        }
        media += '\r\n';
    }
    tmp = elem.find('>rtcp-fb[xmlns="urn:xmpp:jingle:apps:rtp:rtcp-fb:0"]');
    tmp.each(function () {
        media += 'a=rtcp-fb:' + payloadtype + ' ' + $(this).attr('type');
        if ($(this).attr('subtype')) {
            media += ' ' + $(this).attr('subtype');
        }
        media += '\r\n';
    });
    return media;
};

// construct an SDP from a jingle stanza
SDP.prototype.fromJingle = function (jingle) {
    var self = this;
    this.raw = 'v=0\r\n' +
        'o=- ' + '1923518516' + ' 2 IN IP4 0.0.0.0\r\n' +// FIXME
        's=-\r\n' +
        't=0 0\r\n';
    // http://tools.ietf.org/html/draft-ietf-mmusic-sdp-bundle-negotiation-04#section-8
    if ($(jingle).find('>group[xmlns="urn:xmpp:jingle:apps:grouping:0"]').length) {
        $(jingle).find('>group[xmlns="urn:xmpp:jingle:apps:grouping:0"]').each(function (idx, group) {
            var contents = $(group).find('>content').map(function (idx, content) {
                return content.getAttribute('name');
            }).get();
            if (contents.length > 0) {
                self.raw += 'a=group:' + (group.getAttribute('semantics') || group.getAttribute('type')) + ' ' + contents.join(' ') + '\r\n';
            }
        });
    } else if ($(jingle).find('>group[xmlns="urn:ietf:rfc:5888"]').length) {
        // temporary namespace, not to be used. to be removed soon.
        $(jingle).find('>group[xmlns="urn:ietf:rfc:5888"]').each(function (idx, group) {
            var contents = $(group).find('>content').map(function (idx, content) {
                return content.getAttribute('name');
            }).get();
            if (group.getAttribute('type') !== null && contents.length > 0) {
                self.raw += 'a=group:' + group.getAttribute('type') + ' ' + contents.join(' ') + '\r\n';
            }
        });
    } else {
        // for backward compability, to be removed soon
        // assume all contents are in the same bundle group, can be improved upon later
        var bundle = $(jingle).find('>content').filter(function (idx, content) {
            //elem.c('bundle', {xmlns:'http://estos.de/ns/bundle'});
            return $(content).find('>bundle').length > 0;
        }).map(function (idx, content) {
            return content.getAttribute('name');
        }).get();
        if (bundle.length) {
            this.raw += 'a=group:BUNDLE ' + bundle.join(' ') + '\r\n';
        }
    }

    this.session = this.raw;
    jingle.find('>content').each(function () {
        var m = self.jingle2media($(this));
        self.media.push(m);
    });

    // reconstruct msid-semantic -- apparently not necessary
    /*
    var msid = SDPUtil.parse_ssrc(this.raw);
    if (msid.hasOwnProperty('mslabel')) {
        this.session += "a=msid-semantic: WMS " + msid.mslabel + "\r\n";
    }
    */

    this.raw = this.session + this.media.join('');
};

// translate a jingle content element into an an SDP media part
SDP.prototype.jingle2media = function (content) {
    var media = '',
        desc = content.find('description'),
        ssrc = desc.attr('ssrc'),
        self = this,
        tmp;
    var sctp = content.find(
        '>transport>sctpmap[xmlns="urn:xmpp:jingle:transports:dtls-sctp:1"]');

    tmp = { media: desc.attr('media') };
    tmp.port = '1';
    if (content.attr('senders') == 'rejected') {
        // estos hack to reject an m-line.
        tmp.port = '0';
    }
    if (content.find('>transport>fingerprint').length || desc.find('encryption').length) {
        if (sctp.length)
            tmp.proto = 'DTLS/SCTP';
        else
            tmp.proto = 'RTP/SAVPF';
    } else {
        tmp.proto = 'RTP/AVPF';
    }
    if (!sctp.length)
    {
        tmp.fmt = desc.find('payload-type').map(
            function () { return this.getAttribute('id'); }).get();
        media += SDPUtil.build_mline(tmp) + '\r\n';
    }
    else
    {
        media += 'm=application 1 DTLS/SCTP ' + sctp.attr('number') + '\r\n';
        media += 'a=sctpmap:' + sctp.attr('number') +
            ' ' + sctp.attr('protocol');

        var streamCount = sctp.attr('streams');
        if (streamCount)
            media += ' ' + streamCount + '\r\n';
        else
            media += '\r\n';
    }

    media += 'c=IN IP4 0.0.0.0\r\n';
    if (!sctp.length)
        media += 'a=rtcp:1 IN IP4 0.0.0.0\r\n';
    //tmp = content.find('>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]');
    tmp = content.find('>bundle>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]');
    //console.log('transports: '+content.find('>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]').length);
    //console.log('bundle.transports: '+content.find('>bundle>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]').length);
    //console.log("tmp fingerprint: "+tmp.find('>fingerprint').innerHTML);
    if (tmp.length) {
        if (tmp.attr('ufrag')) {
            media += SDPUtil.build_iceufrag(tmp.attr('ufrag')) + '\r\n';
        }
        if (tmp.attr('pwd')) {
            media += SDPUtil.build_icepwd(tmp.attr('pwd')) + '\r\n';
        }
        tmp.find('>fingerprint').each(function () {
            // FIXME: check namespace at some point
            media += 'a=fingerprint:' + this.getAttribute('hash');
            media += ' ' + $(this).text();
            media += '\r\n';
            //console.log("mline "+media);
            if (this.getAttribute('setup')) {
                media += 'a=setup:' + this.getAttribute('setup') + '\r\n';
            }
        });
    }
    switch (content.attr('senders')) {
    case 'initiator':
        media += 'a=sendonly\r\n';
        break;
    case 'responder':
        media += 'a=recvonly\r\n';
        break;
    case 'none':
        media += 'a=inactive\r\n';
        break;
    case 'both':
        media += 'a=sendrecv\r\n';
        break;
    }
    media += 'a=mid:' + content.attr('name') + '\r\n';
    /*if (content.attr('name') == 'video') {
        media += 'a=x-google-flag:conference' + '\r\n';
    }*/

    // <description><rtcp-mux/></description>
    // see http://code.google.com/p/libjingle/issues/detail?id=309 -- no spec though
    // and http://mail.jabber.org/pipermail/jingle/2011-December/001761.html
    if (desc.find('rtcp-mux').length) {
        media += 'a=rtcp-mux\r\n';
    }

    if (desc.find('encryption').length) {
        desc.find('encryption>crypto').each(function () {
            media += 'a=crypto:' + this.getAttribute('tag');
            media += ' ' + this.getAttribute('crypto-suite');
            media += ' ' + this.getAttribute('key-params');
            if (this.getAttribute('session-params')) {
                media += ' ' + this.getAttribute('session-params');
            }
            media += '\r\n';
        });
    }
    desc.find('payload-type').each(function () {
        media += SDPUtil.build_rtpmap(this) + '\r\n';
        if ($(this).find('>parameter').length) {
            media += 'a=fmtp:' + this.getAttribute('id') + ' ';
            media += $(this).find('parameter').map(function () { return (this.getAttribute('name') ? (this.getAttribute('name') + '=') : '') + this.getAttribute('value'); }).get().join('; ');
            media += '\r\n';
        }
        // xep-0293
        media += self.RtcpFbFromJingle($(this), this.getAttribute('id'));
    });

    // xep-0293
    media += self.RtcpFbFromJingle(desc, '*');

    // xep-0294
    tmp = desc.find('>rtp-hdrext[xmlns="urn:xmpp:jingle:apps:rtp:rtp-hdrext:0"]');
    tmp.each(function () {
        media += 'a=extmap:' + this.getAttribute('id') + ' ' + this.getAttribute('uri') + '\r\n';
    });

    content.find('>bundle>transport[xmlns="urn:xmpp:jingle:transports:ice-udp:1"]>candidate').each(function () {
        media += SDPUtil.candidateFromJingle(this);
    });

    // XEP-0339 handle ssrc-group attributes
    tmp = content.find('description>ssrc-group[xmlns="urn:xmpp:jingle:apps:rtp:ssma:0"]').each(function() {
        var semantics = this.getAttribute('semantics');
        var ssrcs = $(this).find('>source').map(function() {
            return this.getAttribute('ssrc');
        }).get();

        if (ssrcs.length != 0) {
            media += 'a=ssrc-group:' + semantics + ' ' + ssrcs.join(' ') + '\r\n';
        }
    });

    tmp = content.find('description>source[xmlns="urn:xmpp:jingle:apps:rtp:ssma:0"]');
    tmp.each(function () {
        var ssrc = this.getAttribute('ssrc');
        $(this).find('>parameter').each(function () {
            media += 'a=ssrc:' + ssrc + ' ' + this.getAttribute('name');
            if (this.getAttribute('value') && this.getAttribute('value').length)
                media += ':' + this.getAttribute('value');
            media += '\r\n';
        });
    });

    if (tmp.length === 0) {
        // fallback to proprietary mapping of a=ssrc lines
        tmp = content.find('description>ssrc[xmlns="http://estos.de/ns/ssrc"]');
        if (tmp.length) {
            media += 'a=ssrc:' + ssrc + ' cname:' + tmp.attr('cname') + '\r\n';
            media += 'a=ssrc:' + ssrc + ' msid:' + tmp.attr('msid') + '\r\n';
            media += 'a=ssrc:' + ssrc + ' mslabel:' + tmp.attr('mslabel') + '\r\n';
            media += 'a=ssrc:' + ssrc + ' label:' + tmp.attr('label') + '\r\n';
        }
    }
    return media;
};
408	contrib/jitsimeetbridge/unjingle/strophe.jingle.sdp.util.js	Normal file
@@ -0,0 +1,408 @@
/**
 * Contains utility classes used in SDP class.
 */

/**
 * Class holds a=ssrc lines and media type a=mid
 * @param ssrc synchronization source identifier number (a=ssrc lines from SDP)
 * @param type media type, e.g. "audio" or "video" (a=mid from SDP)
 * @constructor
 */
function ChannelSsrc(ssrc, type) {
    this.ssrc = ssrc;
    this.type = type;
    this.lines = [];
}

/**
 * Class holds a=ssrc-group: lines
 * @param semantics
 * @param ssrcs
 * @constructor
 */
function ChannelSsrcGroup(semantics, ssrcs, line) {
    this.semantics = semantics;
    this.ssrcs = ssrcs;
}

/**
 * Helper class that represents a media channel. It is a container for ChannelSsrc and holds the channel index and media type.
 * @param channelNumber channel index in the SDP media array.
 * @param mediaType media type (a=mid)
 * @constructor
 */
function MediaChannel(channelNumber, mediaType) {
    /**
     * SDP channel number
     * @type {*}
     */
    this.chNumber = channelNumber;
    /**
     * Channel media type (a=mid)
     * @type {*}
     */
    this.mediaType = mediaType;
    /**
     * The map of ssrc numbers to ChannelSsrc objects.
     */
    this.ssrcs = {};

    /**
     * The array of ChannelSsrcGroup objects.
     * @type {Array}
     */
    this.ssrcGroups = [];
}

SDPUtil = {
    iceparams: function (mediadesc, sessiondesc) {
        var data = null;
        if (SDPUtil.find_line(mediadesc, 'a=ice-ufrag:', sessiondesc) &&
            SDPUtil.find_line(mediadesc, 'a=ice-pwd:', sessiondesc)) {
            data = {
                ufrag: SDPUtil.parse_iceufrag(SDPUtil.find_line(mediadesc, 'a=ice-ufrag:', sessiondesc)),
                pwd: SDPUtil.parse_icepwd(SDPUtil.find_line(mediadesc, 'a=ice-pwd:', sessiondesc))
            };
        }
        return data;
    },
    parse_iceufrag: function (line) {
        return line.substring(12);
    },
    build_iceufrag: function (frag) {
        return 'a=ice-ufrag:' + frag;
    },
    parse_icepwd: function (line) {
        return line.substring(10);
    },
    build_icepwd: function (pwd) {
        return 'a=ice-pwd:' + pwd;
    },
    parse_mid: function (line) {
        return line.substring(6);
    },
    parse_mline: function (line) {
        var parts = line.substring(2).split(' '),
            data = {};
        data.media = parts.shift();
        data.port = parts.shift();
        data.proto = parts.shift();
        if (parts[parts.length - 1] === '') { // trailing whitespace
            parts.pop();
        }
        data.fmt = parts;
        return data;
    },
    build_mline: function (mline) {
        return 'm=' + mline.media + ' ' + mline.port + ' ' + mline.proto + ' ' + mline.fmt.join(' ');
    },
    parse_rtpmap: function (line) {
        var parts = line.substring(9).split(' '),
            data = {};
        data.id = parts.shift();
        parts = parts[0].split('/');
        data.name = parts.shift();
        data.clockrate = parts.shift();
        data.channels = parts.length ? parts.shift() : '1';
        return data;
    },
    /**
     * Parses an SDP line "a=sctpmap:..." and extracts the SCTP port from it.
     * @param line e.g. "a=sctpmap:5000 webrtc-datachannel"
     * @returns [SCTP port number, protocol, streams]
     */
    parse_sctpmap: function (line) {
        var parts = line.substring(10).split(' ');
        var sctpPort = parts[0];
        var protocol = parts[1];
        // Stream count is optional
        var streamCount = parts.length > 2 ? parts[2] : null;
        return [sctpPort, protocol, streamCount];
    },
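The parsers above are plain string splitters over single SDP lines. A minimal standalone sketch (mirroring the `parse_rtpmap` and `parse_sctpmap` logic; the local function names are illustrative, not part of the file) shows the shapes they return:

```javascript
// Standalone sketches of the parse_rtpmap / parse_sctpmap logic above,
// applied to typical SDP lines.
function parseRtpmap(line) {
    var parts = line.substring(9).split(' '), // drop the "a=rtpmap:" prefix
        data = {};
    data.id = parts.shift();
    parts = parts[0].split('/');              // "name/clockrate[/channels]"
    data.name = parts.shift();
    data.clockrate = parts.shift();
    data.channels = parts.length ? parts.shift() : '1';
    return data;
}
function parseSctpmap(line) {
    var parts = line.substring(10).split(' '); // drop the "a=sctpmap:" prefix
    return [parts[0], parts[1], parts.length > 2 ? parts[2] : null];
}

var rtpmap = parseRtpmap('a=rtpmap:111 opus/48000/2');
console.log(rtpmap); // { id: '111', name: 'opus', clockrate: '48000', channels: '2' }
var sctpmap = parseSctpmap('a=sctpmap:5000 webrtc-datachannel 1024');
console.log(sctpmap); // [ '5000', 'webrtc-datachannel', '1024' ]
```

Note that everything stays a string; no numeric conversion is performed.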
    build_rtpmap: function (el) {
        var line = 'a=rtpmap:' + el.getAttribute('id') + ' ' + el.getAttribute('name') + '/' + el.getAttribute('clockrate');
        if (el.getAttribute('channels') && el.getAttribute('channels') != '1') {
            line += '/' + el.getAttribute('channels');
        }
        return line;
    },
    parse_crypto: function (line) {
        var parts = line.substring(9).split(' '),
            data = {};
        data.tag = parts.shift();
        data['crypto-suite'] = parts.shift();
        data['key-params'] = parts.shift();
        if (parts.length) {
            data['session-params'] = parts.join(' ');
        }
        return data;
    },
    parse_fingerprint: function (line) { // RFC 4572
        var parts = line.substring(14).split(' '),
            data = {};
        data.hash = parts.shift();
        data.fingerprint = parts.shift();
        // TODO assert that fingerprint satisfies 2UHEX *(":" 2UHEX) ?
        return data;
    },
    parse_fmtp: function (line) {
        var parts = line.split(' '),
            i, key, value,
            data = [];
        parts.shift();
        parts = parts.join(' ').split(';');
        for (i = 0; i < parts.length; i++) {
            key = parts[i].split('=')[0];
            while (key.length && key[0] == ' ') {
                key = key.substring(1);
            }
            value = parts[i].split('=')[1];
            if (key && value) {
                data.push({name: key, value: value});
            } else if (key) {
                // RFC 4733 (DTMF) style parameters
                data.push({name: '', value: key});
            }
        }
        return data;
    },
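A standalone sketch of the `parse_fmtp` logic above, run on a typical Opus fmtp line (the local function name is illustrative):

```javascript
// Mirrors parse_fmtp: split off the "a=fmtp:<id>" prefix, then parse
// ";"-separated key=value pairs, trimming leading spaces from keys.
function parseFmtp(line) {
    var parts = line.split(' '), i, key, value, data = [];
    parts.shift();                        // drop "a=fmtp:<id>"
    parts = parts.join(' ').split(';');   // "k=v; k=v" pairs
    for (i = 0; i < parts.length; i++) {
        key = parts[i].split('=')[0];
        while (key.length && key[0] == ' ') { key = key.substring(1); }
        value = parts[i].split('=')[1];
        if (key && value) {
            data.push({name: key, value: value});
        } else if (key) {
            data.push({name: '', value: key}); // RFC 4733 style bare values
        }
    }
    return data;
}
var params = parseFmtp('a=fmtp:111 minptime=10; useinbandfec=1');
console.log(params);
// [ { name: 'minptime', value: '10' }, { name: 'useinbandfec', value: '1' } ]
```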
    parse_icecandidate: function (line) {
        var candidate = {},
            elems = line.split(' ');
        candidate.foundation = elems[0].substring(12);
        candidate.component = elems[1];
        candidate.protocol = elems[2].toLowerCase();
        candidate.priority = elems[3];
        candidate.ip = elems[4];
        candidate.port = elems[5];
        // elems[6] => "typ"
        candidate.type = elems[7];
        candidate.generation = 0; // default value, may be overwritten below
        for (var i = 8; i < elems.length; i += 2) {
            switch (elems[i]) {
            case 'raddr':
                candidate['rel-addr'] = elems[i + 1];
                break;
            case 'rport':
                candidate['rel-port'] = elems[i + 1];
                break;
            case 'generation':
                candidate.generation = elems[i + 1];
                break;
            case 'tcptype':
                candidate.tcptype = elems[i + 1];
                break;
            default: // TODO
                console.log('parse_icecandidate not translating "' + elems[i] + '" = "' + elems[i + 1] + '"');
            }
        }
        candidate.network = '1';
        candidate.id = Math.random().toString(36).substr(2, 10); // not applicable to SDP -- FIXME: should be unique, not just random
        return candidate;
    },
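The fixed positions in the candidate line drive the parser above. A trimmed standalone sketch for a typical host candidate (the function name is illustrative; the random `id` and the trailing key/value loop are omitted for brevity):

```javascript
// Mirrors the positional part of parse_icecandidate:
// "a=candidate:<foundation> <component> <proto> <priority> <ip> <port> typ <type>"
function parseCandidate(line) {
    var elems = line.split(' ');
    return {
        foundation: elems[0].substring(12), // strip "a=candidate:"
        component: elems[1],
        protocol: elems[2].toLowerCase(),
        priority: elems[3],
        ip: elems[4],
        port: elems[5],
        type: elems[7] // elems[6] is the literal "typ"
    };
}
var cand = parseCandidate('a=candidate:2979166662 1 udp 2113937151 192.168.2.100 57698 typ host');
console.log(cand.protocol, cand.ip, cand.port, cand.type); // udp 192.168.2.100 57698 host
```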
    build_icecandidate: function (cand) {
        var line = ['a=candidate:' + cand.foundation, cand.component, cand.protocol, cand.priority, cand.ip, cand.port, 'typ', cand.type].join(' ');
        line += ' ';
        switch (cand.type) {
        case 'srflx':
        case 'prflx':
        case 'relay':
            if (cand.hasOwnProperty('rel-addr') && cand.hasOwnProperty('rel-port')) {
                line += 'raddr';
                line += ' ';
                line += cand['rel-addr'];
                line += ' ';
                line += 'rport';
                line += ' ';
                line += cand['rel-port'];
                line += ' ';
            }
            break;
        }
        if (cand.hasOwnProperty('tcptype')) {
            line += 'tcptype';
            line += ' ';
            line += cand.tcptype;
            line += ' ';
        }
        line += 'generation';
        line += ' ';
        line += cand.hasOwnProperty('generation') ? cand.generation : '0';
        return line;
    },
    parse_ssrc: function (desc) {
        // proprietary mapping of a=ssrc lines
        // TODO: see "Jingle RTP Source Description" by Juberti and P. Thatcher on google docs
        // and parse according to that
        var lines = desc.split('\r\n'),
            data = {};
        for (var i = 0; i < lines.length; i++) {
            if (lines[i].substring(0, 7) == 'a=ssrc:') {
                var idx = lines[i].indexOf(' ');
                data[lines[i].substr(idx + 1).split(':', 2)[0]] = lines[i].substr(idx + 1).split(':', 2)[1];
            }
        }
        return data;
    },
    parse_rtcpfb: function (line) {
        var parts = line.substr(10).split(' ');
        var data = {};
        data.pt = parts.shift();
        data.type = parts.shift();
        data.params = parts;
        return data;
    },
    parse_extmap: function (line) {
        var parts = line.substr(9).split(' ');
        var data = {};
        data.value = parts.shift();
        if (data.value.indexOf('/') != -1) {
            data.direction = data.value.substr(data.value.indexOf('/') + 1);
            data.value = data.value.substr(0, data.value.indexOf('/'));
        } else {
            data.direction = 'both';
        }
        data.uri = parts.shift();
        data.params = parts;
        return data;
    },
    find_line: function (haystack, needle, sessionpart) {
        var lines = haystack.split('\r\n');
        for (var i = 0; i < lines.length; i++) {
            if (lines[i].substring(0, needle.length) == needle) {
                return lines[i];
            }
        }
        if (!sessionpart) {
            return false;
        }
        // search session part
        lines = sessionpart.split('\r\n');
        for (var j = 0; j < lines.length; j++) {
            if (lines[j].substring(0, needle.length) == needle) {
                return lines[j];
            }
        }
        return false;
    },
    find_lines: function (haystack, needle, sessionpart) {
        var lines = haystack.split('\r\n'),
            needles = [];
        for (var i = 0; i < lines.length; i++) {
            if (lines[i].substring(0, needle.length) == needle) {
                needles.push(lines[i]);
            }
        }
        if (needles.length || !sessionpart) {
            return needles;
        }
        // search session part
        lines = sessionpart.split('\r\n');
        for (var j = 0; j < lines.length; j++) {
            if (lines[j].substring(0, needle.length) == needle) {
                needles.push(lines[j]);
            }
        }
        return needles;
    },
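The lookup helpers above are prefix scans over CRLF-separated SDP, falling back to the session part when the media part has no match. A standalone sketch (illustrative local name, same logic as `find_line`):

```javascript
// Mirrors find_line: return the first line starting with `needle` in the
// media part, else in the optional session part, else false.
function findLine(haystack, needle, sessionpart) {
    var lines = haystack.split('\r\n');
    for (var i = 0; i < lines.length; i++) {
        if (lines[i].substring(0, needle.length) == needle) {
            return lines[i];
        }
    }
    if (!sessionpart) {
        return false;
    }
    lines = sessionpart.split('\r\n');
    for (var j = 0; j < lines.length; j++) {
        if (lines[j].substring(0, needle.length) == needle) {
            return lines[j];
        }
    }
    return false;
}
var mediapart = 'm=audio 9 RTP/SAVPF 111\r\na=mid:audio';
var sessionpart = 'v=0\r\na=ice-ufrag:abcd';
console.log(findLine(mediapart, 'a=mid:'));                    // a=mid:audio
console.log(findLine(mediapart, 'a=ice-ufrag:', sessionpart)); // a=ice-ufrag:abcd
console.log(findLine(mediapart, 'a=ice-pwd:'));                // false
```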
    candidateToJingle: function (line) {
        // a=candidate:2979166662 1 udp 2113937151 192.168.2.100 57698 typ host generation 0
        // <candidate component=... foundation=... generation=... id=... ip=... network=... port=... priority=... protocol=... type=.../>
        if (line.indexOf('candidate:') === 0) {
            line = 'a=' + line;
        } else if (line.substring(0, 12) != 'a=candidate:') {
            console.log('parseCandidate called with a line that is not a candidate line');
            console.log(line);
            return null;
        }
        if (line.substring(line.length - 2) == '\r\n') { // chomp it
            line = line.substring(0, line.length - 2);
        }
        var candidate = {},
            elems = line.split(' '),
            i;
        if (elems[6] != 'typ') {
            console.log('did not find typ in the right place');
            console.log(line);
            return null;
        }
        candidate.foundation = elems[0].substring(12);
        candidate.component = elems[1];
        candidate.protocol = elems[2].toLowerCase();
        candidate.priority = elems[3];
        candidate.ip = elems[4];
        candidate.port = elems[5];
        // elems[6] => "typ"
        candidate.type = elems[7];

        candidate.generation = '0'; // default, may be overwritten below
        for (i = 8; i < elems.length; i += 2) {
            switch (elems[i]) {
            case 'raddr':
                candidate['rel-addr'] = elems[i + 1];
                break;
            case 'rport':
                candidate['rel-port'] = elems[i + 1];
                break;
            case 'generation':
                candidate.generation = elems[i + 1];
                break;
            case 'tcptype':
                candidate.tcptype = elems[i + 1];
                break;
            default: // TODO
                console.log('not translating "' + elems[i] + '" = "' + elems[i + 1] + '"');
            }
        }
        candidate.network = '1';
        candidate.id = Math.random().toString(36).substr(2, 10); // not applicable to SDP -- FIXME: should be unique, not just random
        return candidate;
    },
    candidateFromJingle: function (cand) {
        var line = 'a=candidate:';
        line += cand.getAttribute('foundation');
        line += ' ';
        line += cand.getAttribute('component');
        line += ' ';
        line += cand.getAttribute('protocol'); //.toUpperCase(); // chrome M23 doesn't like this
        line += ' ';
        line += cand.getAttribute('priority');
        line += ' ';
        line += cand.getAttribute('ip');
        line += ' ';
        line += cand.getAttribute('port');
        line += ' ';
        line += 'typ';
        line += ' ' + cand.getAttribute('type');
        line += ' ';
        switch (cand.getAttribute('type')) {
        case 'srflx':
        case 'prflx':
        case 'relay':
            if (cand.getAttribute('rel-addr') && cand.getAttribute('rel-port')) {
                line += 'raddr';
                line += ' ';
                line += cand.getAttribute('rel-addr');
                line += ' ';
                line += 'rport';
                line += ' ';
                line += cand.getAttribute('rel-port');
                line += ' ';
            }
            break;
        }
        if (cand.getAttribute('protocol').toLowerCase() == 'tcp') {
            line += 'tcptype';
            line += ' ';
            line += cand.getAttribute('tcptype');
            line += ' ';
        }
        line += 'generation';
        line += ' ';
        line += cand.getAttribute('generation') || '0';
        return line + '\r\n';
    }
};

exports.SDPUtil = SDPUtil;
254	contrib/jitsimeetbridge/unjingle/strophe/XMLHttpRequest.js	Normal file
@@ -0,0 +1,254 @@
/**
 * Wrapper for built-in http.js to emulate the browser XMLHttpRequest object.
 *
 * This can be used with JS designed for browsers to improve reuse of code and
 * allow the use of existing libraries.
 *
 * Usage: include("XMLHttpRequest.js") and use XMLHttpRequest per W3C specs.
 *
 * @todo SSL Support
 * @author Dan DeFelippi <dan@driverdan.com>
 * @license MIT
 */

var Url = require("url"),
    sys = require("util");

exports.XMLHttpRequest = function() {
    /**
     * Private variables
     */
    var self = this;
    var http = require('http');
    var https = require('https');

    // Holds http.js objects
    var client;
    var request;
    var response;

    // Request settings
    var settings = {};

    // Set some default headers
    var defaultHeaders = {
        "User-Agent": "node.js",
        "Accept": "*/*"
    };

    var headers = defaultHeaders;

    /**
     * Constants
     */
    this.UNSENT = 0;
    this.OPENED = 1;
    this.HEADERS_RECEIVED = 2;
    this.LOADING = 3;
    this.DONE = 4;

    /**
     * Public vars
     */
    // Current state
    this.readyState = this.UNSENT;

    // default ready state change handler in case one is not set or is set late
    this.onreadystatechange = function() {};

    // Result & response
    this.responseText = "";
    this.responseXML = "";
    this.status = null;
    this.statusText = null;

    /**
     * Open the connection. Currently supports local server requests.
     *
     * @param string method Connection method (e.g. GET, POST)
     * @param string url URL for the connection.
     * @param boolean async Asynchronous connection. Default is true.
     * @param string user Username for basic authentication (optional)
     * @param string password Password for basic authentication (optional)
     */
    this.open = function(method, url, async, user, password) {
        settings = {
            "method": method,
            "url": url,
            "async": async || null,
            "user": user || null,
            "password": password || null
        };

        this.abort();

        setState(this.OPENED);
    };

    /**
     * Sets a header for the request.
     *
     * @param string header Header name
     * @param string value Header value
     */
    this.setRequestHeader = function(header, value) {
        headers[header] = value;
    };

    /**
     * Gets a header from the server response.
     *
     * @param string header Name of header to get.
     * @return string Text of the header or null if it doesn't exist.
     */
    this.getResponseHeader = function(header) {
        if (this.readyState > this.OPENED && response.headers[header]) {
            return header + ": " + response.headers[header];
        }

        return null;
    };

    /**
     * Gets all the response headers.
     *
     * @return string
     */
    this.getAllResponseHeaders = function() {
        if (this.readyState < this.HEADERS_RECEIVED) {
            throw "INVALID_STATE_ERR: Headers have not been received.";
        }
        var result = "";

        for (var i in response.headers) {
            result += i + ": " + response.headers[i] + "\r\n";
        }
        return result.substr(0, result.length - 2);
    };

    /**
     * Sends the request to the server.
     *
     * @param string data Optional data to send as request body.
     */
    this.send = function(data) {
        if (this.readyState != this.OPENED) {
            throw "INVALID_STATE_ERR: connection must be opened before send() is called";
        }

        var ssl = false;
        var url = Url.parse(settings.url);
        var host;

        // Determine the server
        switch (url.protocol) {
        case 'https:':
            ssl = true;
            // SSL & non-SSL both need host, no break here.
        case 'http:':
            host = url.hostname;
            break;

        case undefined:
        case '':
            host = "localhost";
            break;

        default:
            throw "Protocol not supported.";
        }

        // Default to port 80. If accessing localhost on another port be sure
        // to use http://localhost:port/path
        var port = url.port || (ssl ? 443 : 80);
        // Add query string if one is used
        var uri = url.pathname + (url.search ? url.search : '');

        // Set the Host header or the server may reject the request
        this.setRequestHeader("Host", host);

        // Set content length header
        if (settings.method == "GET" || settings.method == "HEAD") {
            data = null;
        } else if (data) {
            this.setRequestHeader("Content-Length", Buffer.byteLength(data));

            if (!headers["Content-Type"]) {
                this.setRequestHeader("Content-Type", "text/plain;charset=UTF-8");
            }
        }

        // Use the proper protocol
        var doRequest = ssl ? https.request : http.request;

        var options = {
            host: host,
            port: port,
            path: uri,
            method: settings.method,
            headers: headers,
            agent: false
        };

        var req = doRequest(options, function(res) {
            response = res;
            response.setEncoding("utf8");

            setState(self.HEADERS_RECEIVED);
            self.status = response.statusCode;

            response.on('data', function(chunk) {
                // Make sure there's some data
                if (chunk) {
                    self.responseText += chunk;
                }
                setState(self.LOADING);
            });

            response.on('end', function() {
                setState(self.DONE);
            });

            response.on('error', function(error) {
                self.handleError(error);
            });
        }).on('error', function(error) {
            self.handleError(error);
        });

        req.setHeader("Connection", "Close");

        // Node 0.4 and later won't accept empty data. Make sure it's needed.
        if (data) {
            req.write(data);
        }

        req.end();
    };

    this.handleError = function(error) {
        this.status = 503;
        this.statusText = error;
        this.responseText = error.stack;
        setState(this.DONE);
    };

    /**
     * Aborts a request.
     */
    this.abort = function() {
        headers = defaultHeaders;
        this.readyState = this.UNSENT;
        this.responseText = "";
        this.responseXML = "";
    };

    /**
     * Changes readyState and calls onreadystatechange.
     *
     * @param int state New state
     */
    var setState = function(state) {
        self.readyState = state;
        self.onreadystatechange();
    };
};
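The core of the wrapper above is the private `setState` helper driving the W3C `readyState`/`onreadystatechange` state machine. A minimal offline sketch of that pattern (names `MiniXhr` and `_finish` are illustrative helpers, not part of the real API; state values follow the XHR spec):

```javascript
// Minimal sketch of the readyState / onreadystatechange pattern the
// wrapper above emulates, with no network I/O.
function MiniXhr() {
    var self = this;
    this.UNSENT = 0; this.OPENED = 1; this.HEADERS_RECEIVED = 2;
    this.LOADING = 3; this.DONE = 4;
    this.readyState = this.UNSENT;
    this.onreadystatechange = function () {};
    // Private state-change helper, mirroring setState above.
    var setState = function (state) {
        self.readyState = state;
        self.onreadystatechange();
    };
    this.open = function () { setState(this.OPENED); };
    this._finish = function () { setState(this.DONE); }; // test-only stand-in for a completed request
}

var seen = [];
var xhr = new MiniXhr();
xhr.onreadystatechange = function () { seen.push(xhr.readyState); };
xhr.open();
xhr._finish();
console.log(seen); // [1, 4]
```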
83	contrib/jitsimeetbridge/unjingle/strophe/base64.js	Normal file
@@ -0,0 +1,83 @@
// This code was written by Tyler Akins and has been placed in the
// public domain. It would be nice if you left this header intact.
// Base64 code from Tyler Akins -- http://rumkin.com

var Base64 = (function () {
    var keyStr = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";

    var obj = {
        /**
         * Encodes a string in base64
         * @param {String} input The string to encode in base64.
         */
        encode: function (input) {
            var output = "";
            var chr1, chr2, chr3;
            var enc1, enc2, enc3, enc4;
            var i = 0;

            do {
                chr1 = input.charCodeAt(i++);
                chr2 = input.charCodeAt(i++);
                chr3 = input.charCodeAt(i++);

                enc1 = chr1 >> 2;
                enc2 = ((chr1 & 3) << 4) | (chr2 >> 4);
                enc3 = ((chr2 & 15) << 2) | (chr3 >> 6);
                enc4 = chr3 & 63;

                if (isNaN(chr2)) {
                    enc3 = enc4 = 64;
                } else if (isNaN(chr3)) {
                    enc4 = 64;
                }

                output = output + keyStr.charAt(enc1) + keyStr.charAt(enc2) +
                    keyStr.charAt(enc3) + keyStr.charAt(enc4);
            } while (i < input.length);

            return output;
        },

        /**
         * Decodes a base64 string.
         * @param {String} input The string to decode.
         */
        decode: function (input) {
            var output = "";
            var chr1, chr2, chr3;
            var enc1, enc2, enc3, enc4;
            var i = 0;

            // remove all characters that are not A-Z, a-z, 0-9, +, /, or =
            input = input.replace(/[^A-Za-z0-9\+\/\=]/g, '');

            do {
                enc1 = keyStr.indexOf(input.charAt(i++));
                enc2 = keyStr.indexOf(input.charAt(i++));
                enc3 = keyStr.indexOf(input.charAt(i++));
                enc4 = keyStr.indexOf(input.charAt(i++));

                chr1 = (enc1 << 2) | (enc2 >> 4);
                chr2 = ((enc2 & 15) << 4) | (enc3 >> 2);
                chr3 = ((enc3 & 3) << 6) | enc4;

                output = output + String.fromCharCode(chr1);

                if (enc3 != 64) {
                    output = output + String.fromCharCode(chr2);
                }
                if (enc4 != 64) {
                    output = output + String.fromCharCode(chr3);
                }
            } while (i < input.length);

            return output;
        }
    };

    return obj;
})();

// Nodify
exports.Base64 = Base64;
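The encoder above packs three input bytes into four 6-bit table indices, padding with `=` when the input runs short. A standalone sketch of that encode path, checked against known base64 vectors (the local function name is illustrative):

```javascript
// Mirrors the Base64.encode loop above: three chars in, four table chars
// out, with index 64 ("=") used for padding on short final groups.
var keyStr = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
function b64encode(input) {
    var output = '', i = 0, chr1, chr2, chr3, enc1, enc2, enc3, enc4;
    do {
        chr1 = input.charCodeAt(i++);
        chr2 = input.charCodeAt(i++);
        chr3 = input.charCodeAt(i++);

        enc1 = chr1 >> 2;
        enc2 = ((chr1 & 3) << 4) | (chr2 >> 4);
        enc3 = ((chr2 & 15) << 2) | (chr3 >> 6);
        enc4 = chr3 & 63;

        if (isNaN(chr2)) {
            enc3 = enc4 = 64; // only one input byte: pad twice
        } else if (isNaN(chr3)) {
            enc4 = 64;        // two input bytes: pad once
        }

        output += keyStr.charAt(enc1) + keyStr.charAt(enc2) +
                  keyStr.charAt(enc3) + keyStr.charAt(enc4);
    } while (i < input.length);
    return output;
}
console.log(b64encode('Hi'));  // SGk=
console.log(b64encode('Man')); // TWFu
```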
279	contrib/jitsimeetbridge/unjingle/strophe/md5.js	Normal file
@@ -0,0 +1,279 @@
/*
 * A JavaScript implementation of the RSA Data Security, Inc. MD5 Message
 * Digest Algorithm, as defined in RFC 1321.
 * Version 2.1 Copyright (C) Paul Johnston 1999 - 2002.
 * Other contributors: Greg Holt, Andrew Kepert, Ydnar, Lostinet
 * Distributed under the BSD License
 * See http://pajhome.org.uk/crypt/md5 for more info.
 */

var MD5 = (function () {
    /*
     * Configurable variables. You may need to tweak these to be compatible with
     * the server-side, but the defaults work in most cases.
     */
    var hexcase = 0; /* hex output format. 0 - lowercase; 1 - uppercase */
    var b64pad = ""; /* base-64 pad character. "=" for strict RFC compliance */
    var chrsz = 8;   /* bits per input character. 8 - ASCII; 16 - Unicode */

    /*
     * Add integers, wrapping at 2^32. This uses 16-bit operations internally
     * to work around bugs in some JS interpreters.
     */
    var safe_add = function (x, y) {
        var lsw = (x & 0xFFFF) + (y & 0xFFFF);
        var msw = (x >> 16) + (y >> 16) + (lsw >> 16);
        return (msw << 16) | (lsw & 0xFFFF);
    };
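`safe_add` performs 32-bit wrapping addition in two 16-bit halves, so overflow behaves like unsigned 32-bit arithmetic even on engines whose `+` would otherwise lose precision. A standalone copy demonstrating the wrap:

```javascript
// Same 16-bit-halves wrapping addition as safe_add above.
function safe_add(x, y) {
    var lsw = (x & 0xFFFF) + (y & 0xFFFF);            // low 16 bits plus carry
    var msw = (x >> 16) + (y >> 16) + (lsw >> 16);    // high 16 bits plus carry-in
    return (msw << 16) | (lsw & 0xFFFF);
}
console.log(safe_add(1, 2));            // 3
console.log(safe_add(0x7FFFFFFF, 1));   // -2147483648 (wraps like a signed 32-bit int)
```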
    /*
     * Bitwise rotate a 32-bit number to the left.
     */
    var bit_rol = function (num, cnt) {
        return (num << cnt) | (num >>> (32 - cnt));
    };

    /*
     * Convert a string to an array of little-endian words
     * If chrsz is ASCII, characters >255 have their hi-byte silently ignored.
     */
    var str2binl = function (str) {
        var bin = [];
        var mask = (1 << chrsz) - 1;
        for (var i = 0; i < str.length * chrsz; i += chrsz) {
            bin[i >> 5] |= (str.charCodeAt(i / chrsz) & mask) << (i % 32);
        }
        return bin;
    };

    /*
     * Convert an array of little-endian words to a string
     */
    var binl2str = function (bin) {
        var str = "";
        var mask = (1 << chrsz) - 1;
        for (var i = 0; i < bin.length * 32; i += chrsz) {
            str += String.fromCharCode((bin[i >> 5] >>> (i % 32)) & mask);
        }
        return str;
    };

    /*
     * Convert an array of little-endian words to a hex string.
     */
    var binl2hex = function (binarray) {
        var hex_tab = hexcase ? "0123456789ABCDEF" : "0123456789abcdef";
        var str = "";
        for (var i = 0; i < binarray.length * 4; i++) {
            str += hex_tab.charAt((binarray[i >> 2] >> ((i % 4) * 8 + 4)) & 0xF) +
                hex_tab.charAt((binarray[i >> 2] >> ((i % 4) * 8)) & 0xF);
        }
        return str;
    };

    /*
     * Convert an array of little-endian words to a base-64 string
     */
    var binl2b64 = function (binarray) {
        var tab = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
        var str = "";
        var triplet, j;
        for (var i = 0; i < binarray.length * 4; i += 3) {
            triplet = (((binarray[i >> 2] >> 8 * (i % 4)) & 0xFF) << 16) |
                (((binarray[i + 1 >> 2] >> 8 * ((i + 1) % 4)) & 0xFF) << 8) |
                ((binarray[i + 2 >> 2] >> 8 * ((i + 2) % 4)) & 0xFF);
            for (j = 0; j < 4; j++) {
                if (i * 8 + j * 6 > binarray.length * 32) { str += b64pad; }
                else { str += tab.charAt((triplet >> 6 * (3 - j)) & 0x3F); }
            }
        }
        return str;
    };

    /*
     * These functions implement the four basic operations the algorithm uses.
     */
    var md5_cmn = function (q, a, b, x, s, t) {
        return safe_add(bit_rol(safe_add(safe_add(a, q), safe_add(x, t)), s), b);
    };

    var md5_ff = function (a, b, c, d, x, s, t) {
        return md5_cmn((b & c) | ((~b) & d), a, b, x, s, t);
    };

    var md5_gg = function (a, b, c, d, x, s, t) {
        return md5_cmn((b & d) | (c & (~d)), a, b, x, s, t);
    };

    var md5_hh = function (a, b, c, d, x, s, t) {
        return md5_cmn(b ^ c ^ d, a, b, x, s, t);
    };

    var md5_ii = function (a, b, c, d, x, s, t) {
        return md5_cmn(c ^ (b | (~d)), a, b, x, s, t);
    };

    /*
     * Calculate the MD5 of an array of little-endian words, and a bit length
     */
    var core_md5 = function (x, len) {
        /* append padding */
        x[len >> 5] |= 0x80 << ((len) % 32);
        x[(((len + 64) >>> 9) << 4) + 14] = len;

        var a = 1732584193;
        var b = -271733879;
        var c = -1732584194;
        var d = 271733878;

        var olda, oldb, oldc, oldd;
        for (var i = 0; i < x.length; i += 16) {
            olda = a;
            oldb = b;
            oldc = c;
            oldd = d;

            a = md5_ff(a, b, c, d, x[i + 0], 7, -680876936);
            d = md5_ff(d, a, b, c, x[i + 1], 12, -389564586);
            c = md5_ff(c, d, a, b, x[i + 2], 17, 606105819);
            b = md5_ff(b, c, d, a, x[i + 3], 22, -1044525330);
            a = md5_ff(a, b, c, d, x[i + 4], 7, -176418897);
            d = md5_ff(d, a, b, c, x[i + 5], 12, 1200080426);
            c = md5_ff(c, d, a, b, x[i + 6], 17, -1473231341);
            b = md5_ff(b, c, d, a, x[i + 7], 22, -45705983);
            a = md5_ff(a, b, c, d, x[i + 8], 7, 1770035416);
            d = md5_ff(d, a, b, c, x[i + 9], 12, -1958414417);
            c = md5_ff(c, d, a, b, x[i + 10], 17, -42063);
            b = md5_ff(b, c, d, a, x[i + 11], 22, -1990404162);
            a = md5_ff(a, b, c, d, x[i + 12], 7, 1804603682);
            d = md5_ff(d, a, b, c, x[i + 13], 12, -40341101);
            c = md5_ff(c, d, a, b, x[i + 14], 17, -1502002290);
            b = md5_ff(b, c, d, a, x[i + 15], 22, 1236535329);

            a = md5_gg(a, b, c, d, x[i + 1], 5, -165796510);
            d = md5_gg(d, a, b, c, x[i + 6], 9, -1069501632);
|
||||||
|
c = md5_gg(c, d, a, b, x[i+11], 14, 643717713);
|
||||||
|
b = md5_gg(b, c, d, a, x[i+ 0], 20, -373897302);
|
||||||
|
a = md5_gg(a, b, c, d, x[i+ 5], 5 , -701558691);
|
||||||
|
d = md5_gg(d, a, b, c, x[i+10], 9 , 38016083);
|
||||||
|
c = md5_gg(c, d, a, b, x[i+15], 14, -660478335);
|
||||||
|
b = md5_gg(b, c, d, a, x[i+ 4], 20, -405537848);
|
||||||
|
a = md5_gg(a, b, c, d, x[i+ 9], 5 , 568446438);
|
||||||
|
d = md5_gg(d, a, b, c, x[i+14], 9 , -1019803690);
|
||||||
|
c = md5_gg(c, d, a, b, x[i+ 3], 14, -187363961);
|
||||||
|
b = md5_gg(b, c, d, a, x[i+ 8], 20, 1163531501);
|
||||||
|
a = md5_gg(a, b, c, d, x[i+13], 5 , -1444681467);
|
||||||
|
d = md5_gg(d, a, b, c, x[i+ 2], 9 , -51403784);
|
||||||
|
c = md5_gg(c, d, a, b, x[i+ 7], 14, 1735328473);
|
||||||
|
b = md5_gg(b, c, d, a, x[i+12], 20, -1926607734);
|
||||||
|
|
||||||
|
a = md5_hh(a, b, c, d, x[i+ 5], 4 , -378558);
|
||||||
|
d = md5_hh(d, a, b, c, x[i+ 8], 11, -2022574463);
|
||||||
|
c = md5_hh(c, d, a, b, x[i+11], 16, 1839030562);
|
||||||
|
b = md5_hh(b, c, d, a, x[i+14], 23, -35309556);
|
||||||
|
a = md5_hh(a, b, c, d, x[i+ 1], 4 , -1530992060);
|
||||||
|
d = md5_hh(d, a, b, c, x[i+ 4], 11, 1272893353);
|
||||||
|
c = md5_hh(c, d, a, b, x[i+ 7], 16, -155497632);
|
||||||
|
b = md5_hh(b, c, d, a, x[i+10], 23, -1094730640);
|
||||||
|
a = md5_hh(a, b, c, d, x[i+13], 4 , 681279174);
|
||||||
|
d = md5_hh(d, a, b, c, x[i+ 0], 11, -358537222);
|
||||||
|
c = md5_hh(c, d, a, b, x[i+ 3], 16, -722521979);
|
||||||
|
b = md5_hh(b, c, d, a, x[i+ 6], 23, 76029189);
|
||||||
|
a = md5_hh(a, b, c, d, x[i+ 9], 4 , -640364487);
|
||||||
|
d = md5_hh(d, a, b, c, x[i+12], 11, -421815835);
|
||||||
|
c = md5_hh(c, d, a, b, x[i+15], 16, 530742520);
|
||||||
|
b = md5_hh(b, c, d, a, x[i+ 2], 23, -995338651);
|
||||||
|
|
||||||
|
a = md5_ii(a, b, c, d, x[i+ 0], 6 , -198630844);
|
||||||
|
d = md5_ii(d, a, b, c, x[i+ 7], 10, 1126891415);
|
||||||
|
c = md5_ii(c, d, a, b, x[i+14], 15, -1416354905);
|
||||||
|
b = md5_ii(b, c, d, a, x[i+ 5], 21, -57434055);
|
||||||
|
a = md5_ii(a, b, c, d, x[i+12], 6 , 1700485571);
|
||||||
|
d = md5_ii(d, a, b, c, x[i+ 3], 10, -1894986606);
|
||||||
|
c = md5_ii(c, d, a, b, x[i+10], 15, -1051523);
|
||||||
|
b = md5_ii(b, c, d, a, x[i+ 1], 21, -2054922799);
|
||||||
|
a = md5_ii(a, b, c, d, x[i+ 8], 6 , 1873313359);
|
||||||
|
d = md5_ii(d, a, b, c, x[i+15], 10, -30611744);
|
||||||
|
c = md5_ii(c, d, a, b, x[i+ 6], 15, -1560198380);
|
||||||
|
b = md5_ii(b, c, d, a, x[i+13], 21, 1309151649);
|
||||||
|
a = md5_ii(a, b, c, d, x[i+ 4], 6 , -145523070);
|
||||||
|
d = md5_ii(d, a, b, c, x[i+11], 10, -1120210379);
|
||||||
|
c = md5_ii(c, d, a, b, x[i+ 2], 15, 718787259);
|
||||||
|
b = md5_ii(b, c, d, a, x[i+ 9], 21, -343485551);
|
||||||
|
|
||||||
|
a = safe_add(a, olda);
|
||||||
|
b = safe_add(b, oldb);
|
||||||
|
c = safe_add(c, oldc);
|
||||||
|
d = safe_add(d, oldd);
|
||||||
|
}
|
||||||
|
return [a, b, c, d];
|
||||||
|
};
|
||||||
|
|
||||||
|
|
||||||
|
/*
|
||||||
|
* Calculate the HMAC-MD5, of a key and some data
|
||||||
|
*/
|
||||||
|
var core_hmac_md5 = function (key, data) {
|
||||||
|
var bkey = str2binl(key);
|
||||||
|
if(bkey.length > 16) { bkey = core_md5(bkey, key.length * chrsz); }
|
||||||
|
|
||||||
|
var ipad = new Array(16), opad = new Array(16);
|
||||||
|
for(var i = 0; i < 16; i++)
|
||||||
|
{
|
||||||
|
ipad[i] = bkey[i] ^ 0x36363636;
|
||||||
|
opad[i] = bkey[i] ^ 0x5C5C5C5C;
|
||||||
|
}
|
||||||
|
|
||||||
|
var hash = core_md5(ipad.concat(str2binl(data)), 512 + data.length * chrsz);
|
||||||
|
return core_md5(opad.concat(hash), 512 + 128);
|
||||||
|
};
|
||||||
|
|
||||||
|
var obj = {
|
||||||
|
/*
|
||||||
|
* These are the functions you'll usually want to call.
|
||||||
|
* They take string arguments and return either hex or base-64 encoded
|
||||||
|
* strings.
|
||||||
|
*/
|
||||||
|
hexdigest: function (s) {
|
||||||
|
return binl2hex(core_md5(str2binl(s), s.length * chrsz));
|
||||||
|
},
|
||||||
|
|
||||||
|
b64digest: function (s) {
|
||||||
|
return binl2b64(core_md5(str2binl(s), s.length * chrsz));
|
||||||
|
},
|
||||||
|
|
||||||
|
hash: function (s) {
|
||||||
|
return binl2str(core_md5(str2binl(s), s.length * chrsz));
|
||||||
|
},
|
||||||
|
|
||||||
|
hmac_hexdigest: function (key, data) {
|
||||||
|
return binl2hex(core_hmac_md5(key, data));
|
||||||
|
},
|
||||||
|
|
||||||
|
hmac_b64digest: function (key, data) {
|
||||||
|
return binl2b64(core_hmac_md5(key, data));
|
||||||
|
},
|
||||||
|
|
||||||
|
hmac_hash: function (key, data) {
|
||||||
|
return binl2str(core_hmac_md5(key, data));
|
||||||
|
},
|
||||||
|
|
||||||
|
/*
|
||||||
|
* Perform a simple self-test to see if the VM is working
|
||||||
|
*/
|
||||||
|
test: function () {
|
||||||
|
return MD5.hexdigest("abc") === "900150983cd24fb0d6963f7d28e17f72";
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
return obj;
|
||||||
|
})();
|
||||||
|
|
||||||
|
// Nodify
|
||||||
|
exports.MD5 = MD5;
|
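As a quick sanity check, the little-endian word-to-hex conversion above can be exercised in isolation. This standalone copy assumes `hexcase = 0` (lowercase output), which is the usual default for this library:

```javascript
// Standalone copy of binl2hex: each 32-bit word holds four bytes,
// least-significant byte first, and each byte becomes two hex digits.
var hexcase = 0; // assumed default: lowercase hex
var binl2hex = function (binarray) {
    var hex_tab = hexcase ? "0123456789ABCDEF" : "0123456789abcdef";
    var str = "";
    for (var i = 0; i < binarray.length * 4; i++) {
        str += hex_tab.charAt((binarray[i>>2] >> ((i%4)*8+4)) & 0xF) +
               hex_tab.charAt((binarray[i>>2] >> ((i%4)*8  )) & 0xF);
    }
    return str;
};

// 0x64636261 is the string "abcd" packed little-endian ('a' = 0x61 in
// the low byte), so the hex string comes out in string order.
console.log(binl2hex([0x64636261])); // "61626364"
```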
3256 contrib/jitsimeetbridge/unjingle/strophe/strophe.js (Normal file; diff suppressed because it is too large)

48 contrib/jitsimeetbridge/unjingle/unjingle.js (Normal file)
@ -0,0 +1,48 @@
var strophe = require("./strophe/strophe.js").Strophe;

var Strophe = strophe.Strophe;
var $iq = strophe.$iq;
var $msg = strophe.$msg;
var $build = strophe.$build;
var $pres = strophe.$pres;

var jsdom = require("jsdom");
var window = jsdom.jsdom().parentWindow;
var $ = require('jquery')(window);

var stropheJingle = require("./strophe.jingle.sdp.js");

var input = '';

process.stdin.on('readable', function() {
    var chunk = process.stdin.read();
    if (chunk !== null) {
        input += chunk;
    }
});

process.stdin.on('end', function() {
    if (process.argv[2] == '--jingle') {
        var elem = $(input);
        // app does:
        // sess.setRemoteDescription($(iq).find('>jingle'), 'offer');
        //console.log(elem.find('>content'));
        var sdp = new stropheJingle.SDP('');
        sdp.fromJingle(elem);
        console.log(sdp.raw);
    } else if (process.argv[2] == '--sdp') {
        var sdp = new stropheJingle.SDP(input);
        var accept = $iq({to: '%(tojid)s',
                          type: 'set'})
            .c('jingle', {xmlns: 'urn:xmpp:jingle:1',
                          //action: 'session-accept',
                          action: '%(action)s',
                          initiator: '%(initiator)s',
                          responder: '%(responder)s',
                          sid: '%(sid)s' });
        sdp.toJingle(accept, 'responder');
        console.log(Strophe.serialize(accept));
    }
});
@ -1,47 +0,0 @@
# `lnav` config for Synapse logs

[lnav](https://lnav.org/) is a log-viewing tool. It is particularly useful when
you need to interleave multiple log files, or for exploring a large log file
with regex filters. The downside is that it is not as ubiquitous as tools like
`less`, `grep`, etc.

This directory contains an `lnav` [log format definition](
https://docs.lnav.org/en/v0.10.1/formats.html#defining-a-new-format
) for Synapse logs as
emitted by Synapse with the default [logging configuration](
https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html#log_config
). It supports lnav 0.10.1 because that's what's packaged by my distribution.

This should allow lnav:

- to interpret timestamps, allowing log interleaving;
- to interpret log severity levels, allowing colouring by log level(!!!);
- to interpret request IDs, allowing you to skip through a specific request; and
- to highlight room, event and user IDs in logs.

See also https://gist.github.com/benje/e2ab750b0a81d11920d83af637d289f7 for a
similar example.

## Example

[asciicast](https://asciinema.org/a/556133)

## Tips

- `lnav -i /path/to/synapse/checkout/contrib/lnav/synapse-log-format.json`
- `lnav my_synapse_log_file` or `lnav synapse_log_files.*`, etc.
- `lnav --help` for CLI help.

Within lnav itself:

- `?` for help within lnav itself.
- `q` to quit.
- `/` to search a-la `less` and `vim`, then `n` and `N` to continue searching
  down and up.
- Use `o` and `O` to skip through logs based on the request ID (`POST-1234`, or
  else the value of the [`request_id_header`](
  https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html?highlight=request_id_header#listeners
  ) header). This may get confused if the same request ID is repeated among
  multiple files or process restarts.
- ???
- Profit
@ -1,67 +0,0 @@
{
  "$schema": "https://lnav.org/schemas/format-v1.schema.json",
  "synapse": {
    "title": "Synapse logs",
    "description": "Logs output by Synapse, a Matrix homeserver, under its default logging config.",
    "regex": {
      "log": {
        "pattern": ".*(?<timestamp>\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d{3}) - (?<logger>.+) - (?<lineno>\\d+) - (?<level>\\w+) - (?<context>.+) - (?<body>.*)"
      }
    },
    "json": false,
    "timestamp-field": "timestamp",
    "timestamp-format": [
      "%Y-%m-%d %H:%M:%S,%L"
    ],
    "level-field": "level",
    "body-field": "body",
    "opid-field": "context",
    "level": {
      "critical": "CRITICAL",
      "error": "ERROR",
      "warning": "WARNING",
      "info": "INFO",
      "debug": "DEBUG"
    },
    "sample": [
      {
        "line": "my-matrix-server-generic-worker-4 | 2023-01-27 09:47:09,818 - synapse.replication.tcp.client - 381 - ERROR - PUT-32992 - Timed out waiting for stream receipts",
        "level": "error"
      },
      {
        "line": "my-matrix-server-federation-sender-1 | 2023-01-25 20:56:20,995 - synapse.http.matrixfederationclient - 709 - WARNING - federation_transaction_transmission_loop-3 - {PUT-O-3} [example.com] Request failed: PUT matrix-federation://example.com/_matrix/federation/v1/send/1674680155797: HttpResponseException('403: Forbidden')",
        "level": "warning"
      },
      {
        "line": "my-matrix-server | 2023-01-25 20:55:54,433 - synapse.storage.databases - 66 - INFO - main - [database config 'master']: Checking database server",
        "level": "info"
      },
      {
        "line": "my-matrix-server | 2023-01-26 15:08:40,447 - synapse.access.http.8008 - 460 - INFO - PUT-74929 - 0.0.0.0 - 8008 - {@alice:example.com} Processed request: 0.011sec/0.000sec (0.000sec, 0.000sec) (0.001sec/0.008sec/3) 2B 200 \"PUT /_matrix/client/r0/user/%40alice%3Atexample.com/account_data/im.vector.setting.breadcrumbs HTTP/1.0\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Element/1.11.20 Chrome/108.0.5359.179 Electron/22.0.3 Safari/537.36\" [0 dbevts]",
        "level": "info"
      }
    ],
    "highlights": {
      "user_id": {
        "pattern": "(@|%40)[^:% ]+(:|%3A)[\\[\\]0-9a-zA-Z.\\-:]+(:\\d{1,5})?(?<!:)",
        "underline": true
      },
      "room_id": {
        "pattern": "(!|%21)[^:% ]+(:|%3A)[\\[\\]0-9a-zA-Z.\\-:]+(:\\d{1,5})?(?<!:)",
        "underline": true
      },
      "room_alias": {
        "pattern": "(#|%23)[^:% ]+(:|%3A)[\\[\\]0-9a-zA-Z.\\-:]+(:\\d{1,5})?(?<!:)",
        "underline": true
      },
      "event_id_v1_v2": {
        "pattern": "(\\$|%25)[^:% ]+(:|%3A)[\\[\\]0-9a-zA-Z.\\-:]+(:\\d{1,5})?(?<!:)",
        "underline": true
      },
      "event_id_v3_plus": {
        "pattern": "(\\$|%25)([A-Za-z0-9+/_]|-){43}",
        "underline": true
      }
    }
  }
}
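As a quick check that the log pattern above does what its samples claim, the same regex can be run against one of the sample lines (named capture groups work in modern JavaScript, so the pattern transfers directly):

```javascript
// Sketch: exercise the lnav "log" pattern against one of its own sample lines.
const pattern = /.*(?<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) - (?<logger>.+) - (?<lineno>\d+) - (?<level>\w+) - (?<context>.+) - (?<body>.*)/;
const line = "my-matrix-server | 2023-01-25 20:55:54,433 - synapse.storage.databases - 66 - INFO - main - [database config 'master']: Checking database server";

const m = line.match(pattern);
console.log(m.groups.logger);   // "synapse.storage.databases"
console.log(m.groups.level);    // "INFO"
console.log(m.groups.context);  // "main" (the request ID / opid field)
```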
@ -34,7 +34,7 @@ Add a new job to the main prometheus.yml file:
 ```
 
 An example of a Prometheus configuration with workers can be found in
-[metrics-howto.md](https://element-hq.github.io/synapse/latest/metrics-howto.html).
+[metrics-howto.md](https://matrix-org.github.io/synapse/latest/metrics-howto.html).
 
 To use `synapse.rules` add
 
@ -92,6 +92,22 @@ new PromConsole.Graph({
 })
 </script>
 
+<h3>Pending calls per tick</h3>
+<div id="reactor_pending_calls"></div>
+<script>
+new PromConsole.Graph({
+  node: document.querySelector("#reactor_pending_calls"),
+  expr: "rate(python_twisted_reactor_pending_calls_sum[30s]) / rate(python_twisted_reactor_pending_calls_count[30s])",
+  name: "[[job]]-[[index]]",
+  min: 0,
+  renderer: "line",
+  height: 150,
+  yAxisFormatter: PromConsole.NumberFormatter.humanize,
+  yHoverFormatter: PromConsole.NumberFormatter.humanize,
+  yTitle: "Pending Calls"
+})
+</script>
+
 <h1>Storage</h1>
 
 <h3>Queries</h3>
21 contrib/prometheus/synapse-v1.rules (Normal file)
@ -0,0 +1,21 @@
synapse_federation_transaction_queue_pendingEdus:total = sum(synapse_federation_transaction_queue_pendingEdus or absent(synapse_federation_transaction_queue_pendingEdus)*0)
synapse_federation_transaction_queue_pendingPdus:total = sum(synapse_federation_transaction_queue_pendingPdus or absent(synapse_federation_transaction_queue_pendingPdus)*0)

synapse_http_server_request_count:method{servlet=""} = sum(synapse_http_server_request_count) by (method)
synapse_http_server_request_count:servlet{method=""} = sum(synapse_http_server_request_count) by (servlet)

synapse_http_server_request_count:total{servlet=""} = sum(synapse_http_server_request_count:by_method) by (servlet)

synapse_cache:hit_ratio_5m = rate(synapse_util_caches_cache:hits[5m]) / rate(synapse_util_caches_cache:total[5m])
synapse_cache:hit_ratio_30s = rate(synapse_util_caches_cache:hits[30s]) / rate(synapse_util_caches_cache:total[30s])

synapse_federation_client_sent{type="EDU"} = synapse_federation_client_sent_edus + 0
synapse_federation_client_sent{type="PDU"} = synapse_federation_client_sent_pdu_destinations:count + 0
synapse_federation_client_sent{type="Query"} = sum(synapse_federation_client_sent_queries) by (job)

synapse_federation_server_received{type="EDU"} = synapse_federation_server_received_edus + 0
synapse_federation_server_received{type="PDU"} = synapse_federation_server_received_pdus + 0
synapse_federation_server_received{type="Query"} = sum(synapse_federation_server_received_queries) by (job)

synapse_federation_transaction_queue_pending{type="EDU"} = synapse_federation_transaction_queue_pending_edus + 0
synapse_federation_transaction_queue_pending{type="PDU"} = synapse_federation_transaction_queue_pending_pdus + 0
@ -1,20 +1,37 @@
 groups:
 - name: synapse
   rules:
-
-  ###
-  ### Prometheus Console Only
-  ### The following rules are only needed if you use the Prometheus Console
-  ### in contrib/prometheus/consoles/synapse.html
-  ###
+  - record: "synapse_federation_transaction_queue_pendingEdus:total"
+    expr: "sum(synapse_federation_transaction_queue_pendingEdus or absent(synapse_federation_transaction_queue_pendingEdus)*0)"
+  - record: "synapse_federation_transaction_queue_pendingPdus:total"
+    expr: "sum(synapse_federation_transaction_queue_pendingPdus or absent(synapse_federation_transaction_queue_pendingPdus)*0)"
+  - record: 'synapse_http_server_request_count:method'
+    labels:
+      servlet: ""
+    expr: "sum(synapse_http_server_request_count) by (method)"
+  - record: 'synapse_http_server_request_count:servlet'
+    labels:
+      method: ""
+    expr: 'sum(synapse_http_server_request_count) by (servlet)'
+
+  - record: 'synapse_http_server_request_count:total'
+    labels:
+      servlet: ""
+    expr: 'sum(synapse_http_server_request_count:by_method) by (servlet)'
+
+  - record: 'synapse_cache:hit_ratio_5m'
+    expr: 'rate(synapse_util_caches_cache:hits[5m]) / rate(synapse_util_caches_cache:total[5m])'
+  - record: 'synapse_cache:hit_ratio_30s'
+    expr: 'rate(synapse_util_caches_cache:hits[30s]) / rate(synapse_util_caches_cache:total[30s])'
+
   - record: 'synapse_federation_client_sent'
     labels:
       type: "EDU"
-    expr: 'synapse_federation_client_sent_edus_total + 0'
+    expr: 'synapse_federation_client_sent_edus + 0'
   - record: 'synapse_federation_client_sent'
     labels:
       type: "PDU"
-    expr: 'synapse_federation_client_sent_pdu_destinations_count_total + 0'
+    expr: 'synapse_federation_client_sent_pdu_destinations:count + 0'
   - record: 'synapse_federation_client_sent'
     labels:
       type: "Query"
@ -23,11 +40,11 @@ groups:
   - record: 'synapse_federation_server_received'
     labels:
       type: "EDU"
-    expr: 'synapse_federation_server_received_edus_total + 0'
+    expr: 'synapse_federation_server_received_edus + 0'
   - record: 'synapse_federation_server_received'
     labels:
       type: "PDU"
-    expr: 'synapse_federation_server_received_pdus_total + 0'
+    expr: 'synapse_federation_server_received_pdus + 0'
   - record: 'synapse_federation_server_received'
     labels:
       type: "Query"
@ -41,34 +58,21 @@ groups:
     labels:
       type: "PDU"
     expr: 'synapse_federation_transaction_queue_pending_pdus + 0'
-  ###
-  ### End of 'Prometheus Console Only' rules block
-  ###
-
-
-  ###
-  ### Grafana Only
-  ### The following rules are only needed if you use the Grafana dashboard
-  ### in contrib/grafana/synapse.json
-  ###
   - record: synapse_storage_events_persisted_by_source_type
-    expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep_total{origin_type="remote"})
+    expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_type="remote"})
     labels:
       type: remote
   - record: synapse_storage_events_persisted_by_source_type
-    expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep_total{origin_entity="*client*",origin_type="local"})
+    expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_entity="*client*",origin_type="local"})
     labels:
       type: local
   - record: synapse_storage_events_persisted_by_source_type
-    expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep_total{origin_entity!="*client*",origin_type="local"})
+    expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_entity!="*client*",origin_type="local"})
     labels:
       type: bridges
 
   - record: synapse_storage_events_persisted_by_event_type
-    expr: sum without(origin_entity, origin_type) (synapse_storage_events_persisted_events_sep_total)
+    expr: sum without(origin_entity, origin_type) (synapse_storage_events_persisted_events_sep)
 
   - record: synapse_storage_events_persisted_by_origin
-    expr: sum without(type) (synapse_storage_events_persisted_events_sep_total)
+    expr: sum without(type) (synapse_storage_events_persisted_events_sep)
-  ###
-  ### End of 'Grafana Only' rules block
-  ###
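Recording rules like the ones above only take effect once Prometheus is told to load the file. A minimal `prometheus.yml` excerpt might look like the following; the file path is an assumption, not taken from this diff, so adjust it to wherever the rules file is installed:

```yaml
# Hypothetical excerpt: load the Synapse recording rules shown above.
# The path under rule_files is an assumption; point it at your copy.
rule_files:
  - "contrib/prometheus/synapse-v2.rules"
```

Prometheus validates rule files at startup; `promtool check rules <file>` can be used to lint them beforehand.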
@ -4,7 +4,7 @@ Purge history API examples
 # `purge_history.sh`
 
 A bash file, that uses the
-[purge history API](https://element-hq.github.io/synapse/latest/admin_api/purge_history_api.html)
+[purge history API](https://matrix-org.github.io/synapse/latest/admin_api/purge_history_api.html)
 to purge all messages in a list of rooms up to a certain event. You can select a
 timeframe or a number of messages that you want to keep in the room.
 
@ -14,5 +14,5 @@ the script.
 # `purge_remote_media.sh`
 
 A bash file, that uses the
-[purge history API](https://element-hq.github.io/synapse/latest/admin_api/purge_history_api.html)
+[purge history API](https://matrix-org.github.io/synapse/latest/admin_api/purge_history_api.html)
 to purge all old cached remote media.
Some files were not shown because too many files have changed in this diff.