Copy_from_upstream: no subprocess call & update_cbom fix for CI. (#1412)

* Refactor update_cbom and update_docs_from_yaml so that copy_from_upstream can import them instead of invoking them as subprocesses.
Works around an issue in GitPython that caused update_cbom to fail in GitHub CI.

* Updates after running copy_from_upstream.
Basil Hess 2023-03-06 15:54:43 +01:00 committed by GitHub
parent 92b84c47c9
commit 4c7ced218a
12 changed files with 386 additions and 356 deletions

View File

@@ -6,7 +6,7 @@
 - **Authors' website**: https://classic.mceliece.org
 - **Specification version**: SUPERCOP-20191221.
 - **Primary Source**<a name="primary-source"></a>:
-  - **Source**: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615
+  - **Source**: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b
   - **Implementation license (SPDX-Identifier)**: Public domain
 - **Ancestors of primary source**:
   - SUPERCOP-20191221 "vec" and "avx" implementations

View File

@@ -372,4 +372,4 @@ parameter-sets:
 auxiliary-submitters: []
 primary-upstream:
   spdx-license-identifier: Public domain
-  source: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615
+  source: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b

View File

@@ -6,7 +6,7 @@
 - **Authors' website**: https://pqc-hqc.org/
 - **Specification version**: NIST Round 3 submission.
 - **Primary Source**<a name="primary-source"></a>:
-  - **Source**: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615
+  - **Source**: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b
   - **Implementation license (SPDX-Identifier)**: Public domain
 - **Ancestors of primary source**:
   - https://github.com/jschanck/package-pqclean/tree/29f79e72/hqc, which takes it from:

View File

@@ -125,4 +125,4 @@ parameter-sets:
     upstream: primary-upstream
 primary-upstream:
   spdx-license-identifier: Public domain
-  source: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615
+  source: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b

View File

@@ -11,7 +11,7 @@
 - **Implementation license (SPDX-Identifier)**: CC0-1.0 or Apache-2.0
 - **Optimized Implementation sources**: https://github.com/pq-crystals/kyber/commit/518de2414a85052bb91349bcbcc347f391292d5b with copy_from_upstream patches
   - **pqclean-aarch64**:<a name="pqclean-aarch64"></a>
-    - **Source**: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615 with copy_from_upstream patches
+    - **Source**: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b with copy_from_upstream patches
     - **Implementation license (SPDX-Identifier)**: CC0-1.0

View File

@@ -22,7 +22,7 @@ primary-upstream:
   spdx-license-identifier: CC0-1.0 or Apache-2.0
 optimized-upstreams:
   pqclean-aarch64:
-    source: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615 with copy_from_upstream patches
+    source: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b with copy_from_upstream patches
     spdx-license-identifier: CC0-1.0
 parameter-sets:

View File

@@ -11,7 +11,7 @@
 - **Implementation license (SPDX-Identifier)**: CC0-1.0 or Apache-2.0
 - **Optimized Implementation sources**: https://github.com/pq-crystals/dilithium/commit/3e9b9f1412f6c7435dbeb4e10692ea58f181ee51 with copy_from_upstream patches
   - **pqclean-aarch64**:<a name="pqclean-aarch64"></a>
-    - **Source**: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615 with copy_from_upstream patches
+    - **Source**: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b with copy_from_upstream patches
     - **Implementation license (SPDX-Identifier)**: CC0-1.0

View File

@@ -20,7 +20,7 @@ primary-upstream:
   spdx-license-identifier: CC0-1.0 or Apache-2.0
 optimized-upstreams:
   pqclean-aarch64:
-    source: https://github.com/PQClean/PQClean/commit/33bceb17eb06a40fbdc72251f533734e8d869615 with copy_from_upstream patches
+    source: https://github.com/PQClean/PQClean/commit/245c95cd1ec326f8f38e26cb17a28832701ba17b with copy_from_upstream patches
     spdx-license-identifier: CC0-1.0
 parameter-sets:

View File

@@ -616,10 +616,11 @@ def copy_from_upstream():
     update_upstream_alg_docs.do_it(os.environ['LIBOQS_DIR'])
-    # Not in love with using sub process to call a python script, but this is the easiest solution for
-    # automatically calling this script in its current state.
-    shell(["python3", os.environ['LIBOQS_DIR'] + "/scripts/update_docs_from_yaml.py", "--liboqs-root", os.environ['LIBOQS_DIR']])
-    shell(["python3", os.environ['LIBOQS_DIR'] + "/scripts/update_cbom.py", "--liboqs-root", os.environ['LIBOQS_DIR']])
+    sys.path.insert(1, os.path.join(os.environ['LIBOQS_DIR'], 'scripts'))
+    import update_docs_from_yaml
+    import update_cbom
+    update_docs_from_yaml.do_it(os.environ['LIBOQS_DIR'])
+    update_cbom.update_cbom_if_algs_not_changed(os.environ['LIBOQS_DIR'], "git")

 def verify_from_upstream():
     instructions = load_instructions()
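
The hunk above is the core of the change: the two subprocess invocations are replaced by in-process imports. This only works because both scripts (diffed below) now wrap their module-level work in functions and parse CLI arguments only under a __main__ guard. A minimal sketch of the pattern, assuming LIBOQS_DIR points at a liboqs checkout:

import os
import sys

root = os.environ['LIBOQS_DIR']

# Make <root>/scripts importable, then call the scripts' entry points
# directly instead of spawning "python3 <script> --liboqs-root <root>".
sys.path.insert(1, os.path.join(root, 'scripts'))
import update_docs_from_yaml
import update_cbom

update_docs_from_yaml.do_it(root)
update_cbom.update_cbom_if_algs_not_changed(root, "git")

Importing would have been unsafe before this commit: both scripts did their work at module level and ran argparse against sys.argv as a side effect of import.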

View File

@@ -17,10 +17,6 @@ import copy
 cbom_json_file = "cbom.json"

-parser = argparse.ArgumentParser()
-parser.add_argument("--liboqs-root", default=".")
-parser.add_argument("--liboqs-version", default="git")
-args = parser.parse_args()

 def load_yaml(filename, encoding='utf-8'):
     with open(filename, mode='r', encoding=encoding) as fh:
@@ -130,67 +126,97 @@ def add_cbom_component(out, kem_yaml, parameter_set):
component_cpy['bom-ref'] : dep
})
def build_cbom(liboqs_root, liboqs_version):
## Add KEM components
for kem_yaml_path in sorted(glob.glob(os.path.join(liboqs_root, 'docs', 'algorithms', 'kem', '*.yml'))):
kem_yaml = load_yaml(kem_yaml_path)
kem_yamls.append(kem_yaml)
kem_name = os.path.splitext(os.path.basename(kem_yaml_path))[0]
name = kem_yaml['name']
for parameter_set in kem_yaml['parameter-sets']:
add_cbom_component(None, kem_yaml, parameter_set)
## Add Sig components
for sig_yaml_path in sorted(glob.glob(os.path.join(liboqs_root, 'docs', 'algorithms', 'sig', '*.yml'))):
sig_yaml = load_yaml(sig_yaml_path)
sig_yamls.append(sig_yaml)
sig_name = os.path.splitext(os.path.basename(sig_yaml_path))[0]
for parameter_set in sig_yaml['parameter-sets']:
add_cbom_component(None, sig_yaml, parameter_set)
## Add KEM components
for kem_yaml_path in sorted(glob.glob(os.path.join(args.liboqs_root, 'docs', 'algorithms', 'kem', '*.yml'))):
kem_yaml = load_yaml(kem_yaml_path)
kem_yamls.append(kem_yaml)
kem_name = os.path.splitext(os.path.basename(kem_yaml_path))[0]
name = kem_yaml['name']
for parameter_set in kem_yaml['parameter-sets']:
add_cbom_component(None, kem_yaml, parameter_set)
## liboqs component
liboqs_component = {}
version = liboqs_version
if version == "git":
repo = git.Repo(search_parent_directories=True, odbt=git.GitDB)
version = repo.head.object.hexsha
liboqs_component['type'] = "library"
liboqs_component['bom-ref'] = "pkg:github/open-quantum-safe/liboqs@" + version
liboqs_component['name'] = "liboqs"
liboqs_component['version'] = version
## Add Sig components
for sig_yaml_path in sorted(glob.glob(os.path.join(args.liboqs_root, 'docs', 'algorithms', 'sig', '*.yml'))):
sig_yaml = load_yaml(sig_yaml_path)
sig_yamls.append(sig_yaml)
sig_name = os.path.splitext(os.path.basename(sig_yaml_path))[0]
for parameter_set in sig_yaml['parameter-sets']:
add_cbom_component(None, sig_yaml, parameter_set)
cbom_components.insert(0, liboqs_component)
## liboqs component
liboqs_component = {}
version = args.liboqs_version
if version == "git":
repo = git.Repo(search_parent_directories=True)
version = repo.head.object.hexsha
liboqs_component['type'] = "library"
liboqs_component['bom-ref'] = "pkg:github/open-quantum-safe/liboqs@" + version
liboqs_component['name'] = "liboqs"
liboqs_component['version'] = version
metadata = {}
metadata['timestamp'] = datetime.now().isoformat()
metadata['component'] = liboqs_component
cbom_components.insert(0, liboqs_component)
## Dependencies
metadata = {}
metadata['timestamp'] = datetime.now().isoformat()
metadata['component'] = liboqs_component
## Dependencies
dependencies = []
dependencies.append({
"ref": liboqs_component['bom-ref'],
"dependsOn": bom_algs_bomrefs,
"dependencyType": "implements"
})
for usedep in bom_algs_use_dependencies.keys():
dependencies = []
dependencies.append({
"ref": usedep,
"dependsOn": bom_algs_use_dependencies[usedep],
"dependencyType": "uses"
"ref": liboqs_component['bom-ref'],
"dependsOn": bom_algs_bomrefs,
"dependencyType": "implements"
})
for usedep in bom_algs_use_dependencies.keys():
dependencies.append({
"ref": usedep,
"dependsOn": bom_algs_use_dependencies[usedep],
"dependencyType": "uses"
})
## CBOM
cbom = {}
cbom['bomFormat'] = "CBOM"
cbom['specVersion'] = "1.4-cbom-1.0"
cbom['serialNumber'] = "urn:uuid:" + str(uuid.uuid4())
cbom['version'] = 1
cbom['metadata'] = metadata
cbom['components'] = cbom_components + [common_crypto_component_aes, common_crypto_component_sha3]
cbom['dependencies'] = dependencies
## CBOM
cbom = {}
cbom['bomFormat'] = "CBOM"
cbom['specVersion'] = "1.4-cbom-1.0"
cbom['serialNumber'] = "urn:uuid:" + str(uuid.uuid4())
cbom['version'] = 1
cbom['metadata'] = metadata
cbom['components'] = cbom_components + [common_crypto_component_aes, common_crypto_component_sha3]
cbom['dependencies'] = dependencies
return cbom
with open(os.path.join(args.liboqs_root, 'docs', cbom_json_file), mode='w', encoding='utf-8') as out_md:
out_md.write(json.dumps(cbom, indent=2))
def algorithms_changed(cbom, cbom_path):
if os.path.isfile(cbom_path):
with open(cbom_path, mode='r', encoding='utf-8') as c:
existing_cbom = json.load(c)
existing_cbom['serialNumber'] = cbom['serialNumber']
existing_cbom['metadata']['timestamp'] = cbom['metadata']['timestamp']
existing_cbom['metadata']['component']['bom-ref'] = cbom['metadata']['component']['bom-ref']
existing_cbom['metadata']['component']['version'] = cbom['metadata']['component']['version']
existing_cbom['components'][0]['bom-ref'] = cbom['components'][0]['bom-ref']
existing_cbom['components'][0]['version'] = cbom['components'][0]['version']
existing_cbom['dependencies'][0]['ref'] = cbom['dependencies'][0]['ref']
update_cbom = existing_cbom != cbom
c.close()
return update_cbom
else:
return True
def update_cbom_if_algs_not_changed(liboqs_root, liboqs_version):
cbom_path = os.path.join(liboqs_root, 'docs', cbom_json_file)
cbom = build_cbom(liboqs_root, liboqs_version)
if algorithms_changed(cbom, cbom_path):
with open(cbom_path, mode='w', encoding='utf-8') as out_md:
out_md.write(json.dumps(cbom, indent=2))
out_md.close()
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument("--liboqs-root", default=".")
parser.add_argument("--liboqs-version", default="git")
args = parser.parse_args()
update_cbom_if_algs_not_changed(args.liboqs_root, args.liboqs_version)
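
Two details of the rewritten update_cbom.py are easy to miss. First, build_cbom() now opens the repository with GitPython's pure-Python GitDB object-database backend (odbt=git.GitDB); this is the workaround for the GitPython issue named in the commit message. Second, algorithms_changed() compares the freshly built CBOM against the one on disk only after copying the always-volatile fields across, so cbom.json is rewritten only when algorithm content really changed. A condensed sketch of both ideas; the helper name cboms_differ is chosen here for illustration:

import json
import git  # GitPython

# Workaround for the GitPython issue that made this script fail in
# GitHub CI: open the repo with the pure-Python GitDB backend.
repo = git.Repo(search_parent_directories=True, odbt=git.GitDB)
version = repo.head.object.hexsha  # becomes the liboqs component version

def cboms_differ(new_cbom, existing_cbom):
    # Deep-copy the existing CBOM, then overwrite the fields that change
    # on every run (serial number, timestamp, git-derived versions and
    # bom-refs) so the comparison only sees real content differences.
    old = json.loads(json.dumps(existing_cbom))
    old['serialNumber'] = new_cbom['serialNumber']
    old['metadata']['timestamp'] = new_cbom['metadata']['timestamp']
    old['metadata']['component']['bom-ref'] = new_cbom['metadata']['component']['bom-ref']
    old['metadata']['component']['version'] = new_cbom['metadata']['component']['version']
    old['components'][0]['bom-ref'] = new_cbom['components'][0]['bom-ref']
    old['components'][0]['version'] = new_cbom['components'][0]['version']
    old['dependencies'][0]['ref'] = new_cbom['dependencies'][0]['ref']
    return old != new_cbom

Despite its name, update_cbom_if_algs_not_changed() writes cbom.json exactly when this comparison reports a difference, or when no cbom.json exists yet.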

View File

@@ -7,10 +7,6 @@ import tabulate
 import yaml
 import os

-parser = argparse.ArgumentParser()
-parser.add_argument("--liboqs-root", default=".")
-args = parser.parse_args()

 def load_yaml(filename, encoding='utf-8'):
     with open(filename, mode='r', encoding=encoding) as fh:
         return yaml.safe_load(fh.read())
@@ -25,323 +21,330 @@ sig_yamls = []
########################################
# Update the KEM markdown documentation.
########################################
for kem_yaml_path in sorted(glob.glob(os.path.join(args.liboqs_root, 'docs', 'algorithms', 'kem', '*.yml'))):
kem_yaml = load_yaml(kem_yaml_path)
kem_yamls.append(kem_yaml)
kem_name = os.path.splitext(os.path.basename(kem_yaml_path))[0]
print('Updating {}/{}.md'.format(os.path.dirname(kem_yaml_path), kem_name))
def do_it(liboqs_root):
for kem_yaml_path in sorted(glob.glob(os.path.join(liboqs_root, 'docs', 'algorithms', 'kem', '*.yml'))):
kem_yaml = load_yaml(kem_yaml_path)
kem_yamls.append(kem_yaml)
kem_name = os.path.splitext(os.path.basename(kem_yaml_path))[0]
print('Updating {}/{}.md'.format(os.path.dirname(kem_yaml_path), kem_name))
with open(os.path.join(args.liboqs_root, 'docs', 'algorithms', 'kem', '{}.md'.format(kem_name)), mode='w', encoding='utf-8') as out_md:
out_md.write('# {}\n\n'.format(kem_yaml['name']))
out_md.write('- **Algorithm type**: Key encapsulation mechanism.\n')
out_md.write('- **Main cryptographic assumption**: {}.\n'.format(kem_yaml['crypto-assumption']))
out_md.write('- **Principal submitters**: {}.\n'.format(', '.join(kem_yaml['principal-submitters'])))
if 'auxiliary-submitters' in kem_yaml and kem_yaml['auxiliary-submitters']:
out_md.write('- **Auxiliary submitters**: {}.\n'.format(', '.join(kem_yaml['auxiliary-submitters'])))
out_md.write('- **Authors\' website**: {}\n'.format(kem_yaml['website']))
out_md.write('- **Specification version**: {}.\n'.format(kem_yaml['spec-version']))
with open(os.path.join(liboqs_root, 'docs', 'algorithms', 'kem', '{}.md'.format(kem_name)), mode='w', encoding='utf-8') as out_md:
out_md.write('# {}\n\n'.format(kem_yaml['name']))
out_md.write('- **Algorithm type**: Key encapsulation mechanism.\n')
out_md.write('- **Main cryptographic assumption**: {}.\n'.format(kem_yaml['crypto-assumption']))
out_md.write('- **Principal submitters**: {}.\n'.format(', '.join(kem_yaml['principal-submitters'])))
if 'auxiliary-submitters' in kem_yaml and kem_yaml['auxiliary-submitters']:
out_md.write('- **Auxiliary submitters**: {}.\n'.format(', '.join(kem_yaml['auxiliary-submitters'])))
out_md.write('- **Authors\' website**: {}\n'.format(kem_yaml['website']))
out_md.write('- **Specification version**: {}.\n'.format(kem_yaml['spec-version']))
out_md.write('- **Primary Source**<a name="primary-source"></a>:\n')
out_md.write(' - **Source**: {}\n'.format(kem_yaml['primary-upstream']['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(kem_yaml['primary-upstream']['spdx-license-identifier']))
if 'optimized-upstreams' in kem_yaml:
out_md.write('- **Optimized Implementation sources**: {}\n'.format(kem_yaml['primary-upstream']['source']))
for opt_upstream in kem_yaml['optimized-upstreams']:
out_md.write(' - **{}**:<a name="{}"></a>\n'.format(opt_upstream, opt_upstream))
out_md.write(' - **Source**: {}\n'.format(kem_yaml['optimized-upstreams'][opt_upstream]['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(kem_yaml['optimized-upstreams'][opt_upstream]['spdx-license-identifier']))
if 'upstream-ancestors' in kem_yaml:
out_md.write('- **Ancestors of primary source**:\n')
for url in kem_yaml['upstream-ancestors'][:-1]:
out_md.write(' - {}, which takes it from:\n'.format(url))
out_md.write(' - {}\n'.format(kem_yaml['upstream-ancestors'][-1]))
else:
out_md.write('\n')
if 'advisories' in kem_yaml:
out_md.write('\n## Advisories\n\n')
for advisory in kem_yaml['advisories']:
out_md.write('- {}\n'.format(advisory))
out_md.write('\n## Parameter set summary\n\n')
table = [['Parameter set',
'Security model',
'Claimed NIST Level',
'Public key size (bytes)',
'Secret key size (bytes)',
'Ciphertext size (bytes)',
'Shared secret size (bytes)']]
for parameter_set in kem_yaml['parameter-sets']:
table.append([parameter_set['name'],
parameter_set['claimed-security'],
parameter_set['claimed-nist-level'],
parameter_set['length-public-key'],
parameter_set['length-secret-key'],
parameter_set['length-ciphertext'],
parameter_set['length-shared-secret']])
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
for index, parameter_set in enumerate(kem_yaml['parameter-sets']):
out_md.write('\n## {} implementation characteristics\n\n'.format(parameter_set['name'].replace("_", "\_")))
table_header = ['Implementation source',
'Identifier in upstream',
'Supported architecture(s)',
'Supported operating system(s)',
'CPU extension(s) used',
'No branching-on-secrets claimed?',
'No branching-on-secrets checked by valgrind?']
if index == 0:
table_header.append('Large stack usage?‡')
out_md.write('- **Primary Source**<a name="primary-source"></a>:\n')
out_md.write(' - **Source**: {}\n'.format(kem_yaml['primary-upstream']['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(kem_yaml['primary-upstream']['spdx-license-identifier']))
if 'optimized-upstreams' in kem_yaml:
out_md.write('- **Optimized Implementation sources**: {}\n'.format(kem_yaml['primary-upstream']['source']))
for opt_upstream in kem_yaml['optimized-upstreams']:
out_md.write(' - **{}**:<a name="{}"></a>\n'.format(opt_upstream, opt_upstream))
out_md.write(' - **Source**: {}\n'.format(kem_yaml['optimized-upstreams'][opt_upstream]['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(kem_yaml['optimized-upstreams'][opt_upstream]['spdx-license-identifier']))
if 'upstream-ancestors' in kem_yaml:
out_md.write('- **Ancestors of primary source**:\n')
for url in kem_yaml['upstream-ancestors'][:-1]:
out_md.write(' - {}, which takes it from:\n'.format(url))
out_md.write(' - {}\n'.format(kem_yaml['upstream-ancestors'][-1]))
else:
table_header.append('Large stack usage?')
out_md.write('\n')
table = [table_header]
for impl in parameter_set['implementations']:
# todo, automate linking this?
# if all platforms are supported, assuming not optimized and is primary upstream
if impl['supported-platforms'] == 'all':
table.append(['[Primary Source](#primary-source)',
impl['upstream-id'].replace('_', '\_'),
'All',
'All',
'None',
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
else:
for platform in impl['supported-platforms']:
if 'operating_systems' not in platform:
platform['operating_systems'] = ['All']
op_systems = ','.join(platform['operating_systems'])
if 'required_flags' in platform and platform['required_flags']:
flags = ','.join(flag.upper() for flag in platform['required_flags'])
else:
flags = 'None'
if impl['upstream'] == 'primary-upstream':
name = 'Primary Source'
anchor = 'primary-source'
else:
name = impl['upstream']
anchor = impl['upstream']
upstream_name = '[{}](#{})'.format(name, anchor)
table.append([upstream_name,
impl['upstream-id'].replace('_', '\_'),
platform['architecture'].replace('_', '\_'),
op_systems,
flags,
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
if 'advisories' in kem_yaml:
out_md.write('\n## Advisories\n\n')
for advisory in kem_yaml['advisories']:
out_md.write('- {}\n'.format(advisory))
out_md.write('\n## Parameter set summary\n\n')
table = [['Parameter set',
'Security model',
'Claimed NIST Level',
'Public key size (bytes)',
'Secret key size (bytes)',
'Ciphertext size (bytes)',
'Shared secret size (bytes)']]
for parameter_set in kem_yaml['parameter-sets']:
table.append([parameter_set['name'],
parameter_set['claimed-security'],
parameter_set['claimed-nist-level'],
parameter_set['length-public-key'],
parameter_set['length-secret-key'],
parameter_set['length-ciphertext'],
parameter_set['length-shared-secret']])
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
if 'implementations-switch-on-runtime-cpu-features' in parameter_set:
out_md.write('\nAre implementations chosen based on runtime CPU feature detection? **{}**.\n'.format('Yes' if parameter_set['implementations-switch-on-runtime-cpu-features'] else 'No'))
if index == 0:
out_md.write('\n ‡For an explanation of what this denotes, consult the [Explanation of Terms](#explanation-of-terms) section at the end of this file.\n')
out_md.write('\n## Explanation of Terms\n\n')
out_md.write('- **Large Stack Usage**: Implementations identified as having such may cause failures when running in threads or in constrained environments.')
##############################################
# Update the signature markdown documentation.
##############################################
for sig_yaml_path in sorted(glob.glob(os.path.join(args.liboqs_root, 'docs', 'algorithms', 'sig', '*.yml'))):
sig_yaml = load_yaml(sig_yaml_path)
sig_yamls.append(sig_yaml)
sig_name = os.path.splitext(os.path.basename(sig_yaml_path))[0]
print('Updating {}/{}.md'.format(os.path.dirname(sig_yaml_path), sig_name))
with open(os.path.join(args.liboqs_root, 'docs', 'algorithms', 'sig', '{}.md'.format(sig_name)), mode='w', encoding='utf-8') as out_md:
out_md.write('# {}\n\n'.format(sig_yaml['name']))
out_md.write('- **Algorithm type**: Digital signature scheme.\n')
out_md.write('- **Main cryptographic assumption**: {}.\n'.format(sig_yaml['crypto-assumption']))
out_md.write('- **Principal submitters**: {}.\n'.format(', '.join(sig_yaml['principal-submitters'])))
if 'auxiliary-submitters' in sig_yaml and sig_yaml['auxiliary-submitters']:
out_md.write('- **Auxiliary submitters**: {}.\n'.format(', '.join(sig_yaml['auxiliary-submitters'])))
out_md.write('- **Authors\' website**: {}\n'.format(sig_yaml['website']))
out_md.write('- **Specification version**: {}.\n'.format(sig_yaml['spec-version']))
out_md.write('- **Primary Source**<a name="primary-source"></a>:\n')
out_md.write(' - **Source**: {}\n'.format(sig_yaml['primary-upstream']['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(sig_yaml['primary-upstream']['spdx-license-identifier']))
if 'optimized-upstreams' in sig_yaml:
out_md.write('- **Optimized Implementation sources**: {}\n'.format(sig_yaml['primary-upstream']['source']))
for opt_upstream in sig_yaml['optimized-upstreams']:
out_md.write(' - **{}**:<a name="{}"></a>\n'.format(opt_upstream, opt_upstream))
out_md.write(' - **Source**: {}\n'.format(sig_yaml['optimized-upstreams'][opt_upstream]['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(sig_yaml['optimized-upstreams'][opt_upstream]['spdx-license-identifier']))
if 'upstream-ancestors' in sig_yaml:
out_md.write(', which takes it from:\n')
for url in sig_yaml['upstream-ancestors'][:-1]:
out_md.write(' - {}, which takes it from:\n'.format(url))
out_md.write(' - {}\n'.format(sig_yaml['upstream-ancestors'][-1]))
else:
out_md.write('\n')
if 'advisories' in sig_yaml:
out_md.write('\n## Advisories\n\n')
for advisory in sig_yaml['advisories']:
out_md.write('- {}\n'.format(advisory))
out_md.write('\n## Parameter set summary\n\n')
table = [['Parameter set',
'Security model',
'Claimed NIST Level',
'Public key size (bytes)',
'Secret key size (bytes)',
'Signature size (bytes)']]
for parameter_set in sig_yaml['parameter-sets']:
table.append([parameter_set['name'].replace('_', '\_'),
parameter_set['claimed-security'],
parameter_set['claimed-nist-level'],
parameter_set['length-public-key'],
parameter_set['length-secret-key'],
parameter_set['length-signature']])
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
for index, parameter_set in enumerate(sig_yaml['parameter-sets']):
out_md.write('\n## {} implementation characteristics\n\n'.format(parameter_set['name'].replace("_", "\_")))
table_header = ['Implementation source',
'Identifier in upstream',
'Supported architecture(s)',
'Supported operating system(s)',
'CPU extension(s) used',
'No branching-on-secrets claimed?',
'No branching-on-secrets checked by valgrind?']
if index == 0:
table_header.append('Large stack usage?‡')
else:
table_header.append('Large stack usage?')
table = [table_header]
for impl in parameter_set['implementations']:
# todo, automate linking this?
# if all platforms are supported, assuming not optimized and is primary upstream
if impl['supported-platforms'] == 'all':
table.append(['[Primary Source](#primary-source)',
impl['upstream-id'].replace('_', '\_'),
'All',
'All',
'None',
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
for index, parameter_set in enumerate(kem_yaml['parameter-sets']):
out_md.write('\n## {} implementation characteristics\n\n'.format(parameter_set['name'].replace("_", "\_")))
table_header = ['Implementation source',
'Identifier in upstream',
'Supported architecture(s)',
'Supported operating system(s)',
'CPU extension(s) used',
'No branching-on-secrets claimed?',
'No branching-on-secrets checked by valgrind?']
if index == 0:
table_header.append('Large stack usage?‡')
else:
for platform in impl['supported-platforms']:
if 'operating_systems' not in platform:
platform['operating_systems'] = ['All']
op_systems = ','.join(platform['operating_systems'])
if 'required_flags' in platform and platform['required_flags']:
flags = ','.join(flag.upper() for flag in platform['required_flags'])
else:
flags = 'None'
if impl['upstream'] == 'primary-upstream':
name = 'Primary Source'
anchor = 'primary-source'
else:
name = impl['upstream']
anchor = impl['upstream']
upstream_name = '[{}](#{})'.format(name, anchor)
table.append([upstream_name,
table_header.append('Large stack usage?')
table = [table_header]
for impl in parameter_set['implementations']:
# todo, automate linking this?
# if all platforms are supported, assuming not optimized and is primary upstream
if impl['supported-platforms'] == 'all':
table.append(['[Primary Source](#primary-source)',
impl['upstream-id'].replace('_', '\_'),
platform['architecture'].replace('_', '\_'),
op_systems,
flags,
'All',
'All',
'None',
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
else:
for platform in impl['supported-platforms']:
if 'operating_systems' not in platform:
platform['operating_systems'] = ['All']
op_systems = ','.join(platform['operating_systems'])
if 'required_flags' in platform and platform['required_flags']:
flags = ','.join(flag.upper() for flag in platform['required_flags'])
else:
flags = 'None'
if impl['upstream'] == 'primary-upstream':
name = 'Primary Source'
anchor = 'primary-source'
else:
name = impl['upstream']
anchor = impl['upstream']
upstream_name = '[{}](#{})'.format(name, anchor)
table.append([upstream_name,
impl['upstream-id'].replace('_', '\_'),
platform['architecture'].replace('_', '\_'),
op_systems,
flags,
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
if 'implementations-switch-on-runtime-cpu-features' in parameter_set:
out_md.write('\nAre implementations chosen based on runtime CPU feature detection? **{}**.\n'.format('Yes' if parameter_set['implementations-switch-on-runtime-cpu-features'] else 'No'))
if index == 0:
out_md.write('\n ‡For an explanation of what this denotes, consult the [Explanation of Terms](#explanation-of-terms) section at the end of this file.\n')
out_md.write('\n## Explanation of Terms\n\n')
out_md.write('- **Large Stack Usage**: Implementations identified as having such may cause failures when running in threads or in constrained environments.')
##############################################
# Update the signature markdown documentation.
##############################################
for sig_yaml_path in sorted(glob.glob(os.path.join(liboqs_root, 'docs', 'algorithms', 'sig', '*.yml'))):
sig_yaml = load_yaml(sig_yaml_path)
sig_yamls.append(sig_yaml)
sig_name = os.path.splitext(os.path.basename(sig_yaml_path))[0]
print('Updating {}/{}.md'.format(os.path.dirname(sig_yaml_path), sig_name))
with open(os.path.join(liboqs_root, 'docs', 'algorithms', 'sig', '{}.md'.format(sig_name)), mode='w', encoding='utf-8') as out_md:
out_md.write('# {}\n\n'.format(sig_yaml['name']))
out_md.write('- **Algorithm type**: Digital signature scheme.\n')
out_md.write('- **Main cryptographic assumption**: {}.\n'.format(sig_yaml['crypto-assumption']))
out_md.write('- **Principal submitters**: {}.\n'.format(', '.join(sig_yaml['principal-submitters'])))
if 'auxiliary-submitters' in sig_yaml and sig_yaml['auxiliary-submitters']:
out_md.write('- **Auxiliary submitters**: {}.\n'.format(', '.join(sig_yaml['auxiliary-submitters'])))
out_md.write('- **Authors\' website**: {}\n'.format(sig_yaml['website']))
out_md.write('- **Specification version**: {}.\n'.format(sig_yaml['spec-version']))
out_md.write('- **Primary Source**<a name="primary-source"></a>:\n')
out_md.write(' - **Source**: {}\n'.format(sig_yaml['primary-upstream']['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(sig_yaml['primary-upstream']['spdx-license-identifier']))
if 'optimized-upstreams' in sig_yaml:
out_md.write('- **Optimized Implementation sources**: {}\n'.format(sig_yaml['primary-upstream']['source']))
for opt_upstream in sig_yaml['optimized-upstreams']:
out_md.write(' - **{}**:<a name="{}"></a>\n'.format(opt_upstream, opt_upstream))
out_md.write(' - **Source**: {}\n'.format(sig_yaml['optimized-upstreams'][opt_upstream]['source']))
out_md.write(' - **Implementation license (SPDX-Identifier)**: {}\n'.format(sig_yaml['optimized-upstreams'][opt_upstream]['spdx-license-identifier']))
if 'upstream-ancestors' in sig_yaml:
out_md.write(', which takes it from:\n')
for url in sig_yaml['upstream-ancestors'][:-1]:
out_md.write(' - {}, which takes it from:\n'.format(url))
out_md.write(' - {}\n'.format(sig_yaml['upstream-ancestors'][-1]))
else:
out_md.write('\n')
if 'advisories' in sig_yaml:
out_md.write('\n## Advisories\n\n')
for advisory in sig_yaml['advisories']:
out_md.write('- {}\n'.format(advisory))
out_md.write('\n## Parameter set summary\n\n')
table = [['Parameter set',
'Security model',
'Claimed NIST Level',
'Public key size (bytes)',
'Secret key size (bytes)',
'Signature size (bytes)']]
for parameter_set in sig_yaml['parameter-sets']:
table.append([parameter_set['name'].replace('_', '\_'),
parameter_set['claimed-security'],
parameter_set['claimed-nist-level'],
parameter_set['length-public-key'],
parameter_set['length-secret-key'],
parameter_set['length-signature']])
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
if 'implementations-switch-on-runtime-cpu-features' in parameter_set:
out_md.write('\nAre implementations chosen based on runtime CPU feature detection? **{}**.\n'.format('Yes' if parameter_set['implementations-switch-on-runtime-cpu-features'] else 'No'))
if index == 0:
out_md.write('\n ‡For an explanation of what this denotes, consult the [Explanation of Terms](#explanation-of-terms) section at the end of this file.\n')
for index, parameter_set in enumerate(sig_yaml['parameter-sets']):
out_md.write('\n## {} implementation characteristics\n\n'.format(parameter_set['name'].replace("_", "\_")))
table_header = ['Implementation source',
'Identifier in upstream',
'Supported architecture(s)',
'Supported operating system(s)',
'CPU extension(s) used',
'No branching-on-secrets claimed?',
'No branching-on-secrets checked by valgrind?']
if index == 0:
table_header.append('Large stack usage?‡')
else:
table_header.append('Large stack usage?')
out_md.write('\n## Explanation of Terms\n\n')
out_md.write('- **Large Stack Usage**: Implementations identified as having such may cause failures when running in threads or in constrained environments.')
table = [table_header]
for impl in parameter_set['implementations']:
# todo, automate linking this?
# if all platforms are supported, assuming not optimized and is primary upstream
if impl['supported-platforms'] == 'all':
table.append(['[Primary Source](#primary-source)',
impl['upstream-id'].replace('_', '\_'),
'All',
'All',
'None',
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
else:
for platform in impl['supported-platforms']:
if 'operating_systems' not in platform:
platform['operating_systems'] = ['All']
op_systems = ','.join(platform['operating_systems'])
if 'required_flags' in platform and platform['required_flags']:
flags = ','.join(flag.upper() for flag in platform['required_flags'])
else:
flags = 'None'
if impl['upstream'] == 'primary-upstream':
name = 'Primary Source'
anchor = 'primary-source'
else:
name = impl['upstream']
anchor = impl['upstream']
upstream_name = '[{}](#{})'.format(name, anchor)
table.append([upstream_name,
impl['upstream-id'].replace('_', '\_'),
platform['architecture'].replace('_', '\_'),
op_systems,
flags,
impl['no-secret-dependent-branching-claimed'],
impl['no-secret-dependent-branching-checked-by-valgrind'],
impl['large-stack-usage']])
out_md.write(tabulate.tabulate(table, tablefmt="pipe", headers="firstrow", colalign=("center",)))
out_md.write('\n')
if 'implementations-switch-on-runtime-cpu-features' in parameter_set:
out_md.write('\nAre implementations chosen based on runtime CPU feature detection? **{}**.\n'.format('Yes' if parameter_set['implementations-switch-on-runtime-cpu-features'] else 'No'))
if index == 0:
out_md.write('\n ‡For an explanation of what this denotes, consult the [Explanation of Terms](#explanation-of-terms) section at the end of this file.\n')
out_md.write('\n## Explanation of Terms\n\n')
out_md.write('- **Large Stack Usage**: Implementations identified as having such may cause failures when running in threads or in constrained environments.')
####################
# Update the README.
####################
print("Updating README.md")
####################
# Update the README.
####################
print("Updating README.md")
readme_path = os.path.join(args.liboqs_root, 'README.md')
start_identifier_tmpl = '<!--- OQS_TEMPLATE_FRAGMENT_LIST_{}_START -->'
end_identifier_tmpl = '<!--- OQS_TEMPLATE_FRAGMENT_LIST_{}_END -->'
readme_path = os.path.join(liboqs_root, 'README.md')
start_identifier_tmpl = '<!--- OQS_TEMPLATE_FRAGMENT_LIST_{}_START -->'
end_identifier_tmpl = '<!--- OQS_TEMPLATE_FRAGMENT_LIST_{}_END -->'
# KEMS
readme_contents = file_get_contents(readme_path)
# KEMS
readme_contents = file_get_contents(readme_path)
identifier_start = start_identifier_tmpl.format('KEXS')
identifier_end = end_identifier_tmpl.format('KEXS')
identifier_start = start_identifier_tmpl.format('KEXS')
identifier_end = end_identifier_tmpl.format('KEXS')
preamble = readme_contents[:readme_contents.find(identifier_start)]
postamble = readme_contents[readme_contents.find(identifier_end):]
preamble = readme_contents[:readme_contents.find(identifier_start)]
postamble = readme_contents[readme_contents.find(identifier_end):]
with open(readme_path, mode='w', encoding='utf-8') as readme:
readme.write(preamble + identifier_start + '\n')
with open(readme_path, mode='w', encoding='utf-8') as readme:
readme.write(preamble + identifier_start + '\n')
for kem_yaml in kem_yamls:
parameter_sets = kem_yaml['parameter-sets']
if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
readme.write('- **{}**: {}'.format(kem_yaml['name'], parameter_sets[0]['name']))
else:
readme.write('- **{}**: {}'.format(kem_yaml['name'], parameter_sets[0]['name']))
for parameter_set in parameter_sets[1:]:
if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
readme.write(', {}'.format(parameter_set['name']))
for kem_yaml in kem_yamls:
parameter_sets = kem_yaml['parameter-sets']
if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
readme.write('- **{}**: {}'.format(kem_yaml['name'], parameter_sets[0]['name']))
else:
readme.write(', {}'.format(parameter_set['name']))
readme.write('\n')
readme.write('- **{}**: {}'.format(kem_yaml['name'], parameter_sets[0]['name']))
for parameter_set in parameter_sets[1:]:
if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
readme.write(', {}'.format(parameter_set['name']))
else:
readme.write(', {}'.format(parameter_set['name']))
readme.write('\n')
readme.write(postamble)
readme.write(postamble)
# Signatures
readme_contents = file_get_contents(readme_path)
# Signatures
readme_contents = file_get_contents(readme_path)
identifier_start = start_identifier_tmpl.format('SIGS')
identifier_end = end_identifier_tmpl.format('SIGS')
identifier_start = start_identifier_tmpl.format('SIGS')
identifier_end = end_identifier_tmpl.format('SIGS')
preamble = readme_contents[:readme_contents.find(identifier_start)]
postamble = readme_contents[readme_contents.find(identifier_end):]
preamble = readme_contents[:readme_contents.find(identifier_start)]
postamble = readme_contents[readme_contents.find(identifier_end):]
with open(readme_path, mode='w', encoding='utf-8') as readme:
readme.write(preamble + identifier_start + '\n')
with open(readme_path, mode='w', encoding='utf-8') as readme:
readme.write(preamble + identifier_start + '\n')
for sig_yaml in sig_yamls[:-1]: # SPHINCS is last in this sorted list and requires special handling.
parameter_sets = sig_yaml['parameter-sets']
if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
readme.write('- **{}**: {}'.format(sig_yaml['name'], parameter_sets[0]['name'].replace('_','\_')))
else:
readme.write('- **{}**: {}'.format(sig_yaml['name'], parameter_sets[0]['name'].replace('_','\_')))
for parameter_set in parameter_sets[1:]:
if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
for sig_yaml in sig_yamls[:-1]: # SPHINCS is last in this sorted list and requires special handling.
parameter_sets = sig_yaml['parameter-sets']
if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
readme.write('- **{}**: {}'.format(sig_yaml['name'], parameter_sets[0]['name'].replace('_','\_')))
else:
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
readme.write('\n')
readme.write('- **{}**: {}'.format(sig_yaml['name'], parameter_sets[0]['name'].replace('_','\_')))
for parameter_set in parameter_sets[1:]:
if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
else:
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
readme.write('\n')
sphincs_yml = sig_yamls[-1]
for hash_func in ['Haraka', 'SHA256', 'SHAKE256']:
parameter_sets = [pset for pset in sphincs_yml['parameter-sets'] if hash_func in pset['name']]
if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
readme.write('- **SPHINCS+-{}**: {}'.format(hash_func, parameter_sets[0]['name'].replace('_','\_')))
else:
readme.write('- **SPHINCS+-{}**: {}'.format(hash_func, parameter_sets[0]['name'].replace('_','\_')))
for parameter_set in parameter_sets[1:]:
if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
sphincs_yml = sig_yamls[-1]
for hash_func in ['Haraka', 'SHA256', 'SHAKE256']:
parameter_sets = [pset for pset in sphincs_yml['parameter-sets'] if hash_func in pset['name']]
if any(impl['large-stack-usage'] for impl in parameter_sets[0]['implementations']):
readme.write('- **SPHINCS+-{}**: {}'.format(hash_func, parameter_sets[0]['name'].replace('_','\_')))
else:
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
readme.write('\n')
readme.write('- **SPHINCS+-{}**: {}'.format(hash_func, parameter_sets[0]['name'].replace('_','\_')))
for parameter_set in parameter_sets[1:]:
if any(impl['large-stack-usage'] for impl in parameter_set['implementations']):
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
else:
readme.write(', {}'.format(parameter_set['name'].replace('_', '\_')))
readme.write('\n')
readme.write(postamble)
readme.write(postamble)
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("--liboqs-root", default=".")
args = parser.parse_args()
do_it(args.liboqs_root)
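
The update_docs_from_yaml.py change is purely structural: the module-level body moves, unchanged apart from indentation and args.liboqs_root becoming a parameter, into do_it(liboqs_root), and argument parsing now happens only when the script is executed directly. The resulting skeleton, with the bodies elided:

import argparse

def do_it(liboqs_root):
    # Regenerate the per-algorithm markdown and the README algorithm
    # lists under liboqs_root/docs; same logic as before, parameterized
    # on liboqs_root instead of reading a module-level args object.
    ...

if __name__ == "__main__":
    # CLI parsing lives here, so `import update_docs_from_yaml` (as
    # copy_from_upstream.py now does) has no side effects.
    parser = argparse.ArgumentParser()
    parser.add_argument("--liboqs-root", default=".")
    args = parser.parse_args()
    do_it(args.liboqs_root)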

View File

@@ -32,4 +32,4 @@
"Kyber768": "89e82a5bf2d4ddb2c6444e10409e6d9ca65dafbca67d1a0db2c9b54920a29172",
"Kyber768-90s": "68bf2e3914c0b4e053cefc67dd9f10f567946da5720f0b453b347610c3cc2c0a",
"sntrup761": "afc42c3a5b10f4ef69654250097ebda9b9564570f4086744b24a6daf2bd1f89a"
}
}
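
The final file maps parameter-set names to SHA-256 digests and is regenerated here as part of the upstream bump. Assuming the map is flat (name to hex digest) and given a hypothetical kat_path_for() helper that locates the KAT file for a parameter set, a checker for such a map could look like:

import hashlib
import json

def check_kat_digests(json_path, kat_path_for):
    # Return (name, expected, actual) tuples for every entry whose KAT
    # file no longer hashes to the recorded digest.
    with open(json_path, encoding='utf-8') as fh:
        expected = json.load(fh)
    failures = []
    for name, digest in expected.items():
        with open(kat_path_for(name), 'rb') as kat:
            actual = hashlib.sha256(kat.read()).hexdigest()
        if actual != digest:
            failures.append((name, digest, actual))
    return failures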